WO2016009688A1 - System, Machine, Control Method, and Program - Google Patents
System, Machine, Control Method, and Program
- Publication number
- WO2016009688A1 (PCT/JP2015/061542)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- information
- work
- agricultural machine
- target
- image
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01B—SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
- A01B69/00—Steering of agricultural machines or implements; Guiding agricultural machines or implements on a desired track
- A01B69/001—Steering by means of optical assistance, e.g. television cameras
-
- A—HUMAN NECESSITIES
- A01—AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
- A01B—SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
- A01B69/00—Steering of agricultural machines or implements; Guiding agricultural machines or implements on a desired track
- A01B69/007—Steering or guiding of agricultural vehicles, e.g. steering of the tractor to keep the plough in the furrow
- A01B69/008—Steering or guiding of agricultural vehicles, e.g. steering of the tractor to keep the plough in the furrow automatic
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/28—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for polarising
- G02B27/283—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 for polarising used for beam splitting or combining
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/243—Image signal generators using stereoscopic image cameras using three or more 2D image sensors
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/10—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths
- H04N23/11—Cameras or camera modules comprising electronic image sensors; Control thereof for generating image signals from different wavelengths for generating image signals from visible and infrared light wavelengths
Definitions
- the present invention relates to a system, a machine, a control method, and a program.
- In Patent Document 1, a crop-sensing head of a reflection-based sensor having a light source and a detector is attached to a vehicle, and crop data are collected by driving the vehicle so that the sensing head passes near the crops. Based on the crop data, the amount of substances (for example, fertilizer, seeds, nutrients, water, or chemicals) required by the crops is determined, and the amount of substance dispensed from a sprayer attached to the vehicle is adjusted. The application rate is adjusted by changing the speed of the vehicle: to increase the applied amount, the vehicle speed is reduced, and to decrease it, the vehicle speed is increased. This speed adjustment is performed automatically.
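The inverse relationship between vehicle speed and applied amount described above can be sketched as follows (a minimal illustration; the function and parameter names are hypothetical and not taken from the cited document):

```python
def application_rate_per_area(flow_rate_l_per_min, speed_m_per_min, boom_width_m):
    """Applied amount per unit area (L/m^2) for a sprayer with a fixed flow rate."""
    area_covered_per_min = speed_m_per_min * boom_width_m  # ground area covered per minute (m^2)
    return flow_rate_l_per_min / area_covered_per_min

# Halving the vehicle speed doubles the amount applied per unit area.
slow = application_rate_per_area(10.0, 50.0, 5.0)   # 0.04 L/m^2
fast = application_rate_per_area(10.0, 100.0, 5.0)  # 0.02 L/m^2
assert abs(slow - 2 * fast) < 1e-12
```

This is why the scheme in Patent Document 1 can modulate dosage with no change to the sprayer itself: for a fixed flow rate, dose per area depends only on ground speed and boom width.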
- an object of the present invention is to provide an apparatus that can improve the efficiency of the entire system.
- the system of the present invention includes a first operating device that performs an operation on a first target; at least one sensor that acquires analog information from the first target; and a control device that identifies the first target based on at least one type of first digital information derived from the analog information acquired by the sensor, and that controls the operation of the first operating device on the identified first target based on at least one type of second digital information different from the first digital information.
- the efficiency of the entire system can be improved.
- FIG. 1 is a diagram schematically showing the system configuration in an embodiment of the invention.
- It is a functional block diagram showing the functions of an FPGA mounted on the stereo camera device in an embodiment of the invention. It is a schematic diagram explaining the principle of measuring distance with the stereo camera device in an embodiment of the invention. It is a reference image of FIG. 11A in an embodiment of the invention.
- It is a diagram explaining the process of computing the cost (degree of matching, i.e., dissimilarity or similarity) over a designated range of the comparison image with respect to a certain area (a predetermined reference pixel) of the reference image, using the stereo camera device in an embodiment of the invention.
- FIG. 37B shows the agricultural machine that performs the leveling operation in an embodiment of the invention.
- It is a schematic diagram, observed from above, of the relationship between the rotation of the laser radar device that performs laser irradiation and the position of the laser light-receiving device on the agricultural machine performing leveling work in an embodiment of the invention.
- It is a schematic diagram showing a part of the system configuration in an embodiment of the invention.
- FIG. 55 is a flowchart showing the continuation of the process of the flowchart shown in FIG. 54 in an embodiment of the present invention. It is a diagram showing an example of distance and magnitude information.
- This embodiment relates to traveling machines that perform work while moving, or after moving, such as agricultural machines and construction machines, as well as flying machines, ships, submersibles, robots, and other moving bodies. FIG. 1 shows an example of a system for performing desired work by directly or indirectly controlling such a moving body.
- Although this embodiment can be applied to the various moving bodies described above, the basic configuration and operation are described here for an agricultural machine, for which the contents of movement and work are intuitively easy to understand.
- FIG. 1 shows a configuration of a system 1501 in a farm field to which the present embodiment is applied.
- The overall system 1500 of this embodiment consists of the system 1501 in FIG. 1 and the information communication system 1502 in FIG. 2. In the following, the system 1501 or the information communication system 1502 may be described individually, and in some cases the overall system 1500 is described as a whole.
- In the field, there are a tractor as an agricultural machine 100, crops 350, a field monitoring device 500 using an omnidirectional camera, a state monitoring device 550 using a multispectral camera (or a colorimetric camera), a laser light-receiving position feedback device (laser light-receiving device) 610 for leveling the field, and the like.
- The broken lines in the figure indicate transmission and reception of information by wireless communication; the agricultural machine 100, the field monitoring device 500, the state monitoring device 550, and the like form a wireless communication network.
- This wireless communication is connected to the wireless access point 700 of the information communication system 1502 shown in FIG.
- The agricultural machine 100, the field monitoring device 500, the state monitoring device 550, and the like operate in cooperation with the information communication system 1502, so that manual operation is kept to a minimum, such as initial setting, after which the agricultural machine 100 can be moved and operated by automatic control. As a result, work efficiency can be improved.
- the operation of the information communication system 1502 using these wireless communications will be described in detail later.
- Although FIG. 1 shows an outdoor field, the embodiment is not restricted to this; cases where crops are grown in a greenhouse or otherwise indoors are also included.
- the entire system 1500 in this embodiment is constructed by the system 1501 in this field and the information communication system 1502 described below.
- the overall system 1500 uses these machines and devices to perform efficient farming operations with minimal manual operation.
- A reference sign followed by a hyphen and a number denotes a component that performs basically the same or a similar function as the component denoted by the reference sign alone, but differs in configuration. When the hyphen and number are omitted, the description applies to all machines and devices denoted by the bare reference sign as well as to those with a hyphen and number.
- FIG. 2 shows a configuration of an information communication system to which this embodiment is applied.
- the information communication system 1502 includes a wireless access point 700, the Internet 702, a server 704, a database 706, a database 708, a user terminal 710, and a user terminal 712.
- the wireless access point 700, the server 704, and the databases 706 and 708 are connected to the Internet 702 by wire, but are not limited thereto and may be wirelessly connected. Further, the user terminals 710 and 712 may be directly connected to the Internet 702 by wire or wirelessly, or may be connected via the wireless access point 700 or other relays.
- the wireless access point 700 is an outdoor long-distance wireless LAN access point and includes a directional antenna 701 in order to perform information communication with machines and devices in the field. When information does not arrive from a specific direction, an omnidirectional antenna may be used instead of the directional antenna 701.
- the wireless access point 700 is a router type, and has a routing function and a network address translation (NAT) function. With the routing function, an optimal route can be selected when transmitting a packet to a target host on the TCP/IP network.
- the NAT function enables a router or gateway at the boundary between two TCP/IP networks to automatically translate IP addresses and transfer data between them. With these functions, efficient information communication can be performed with the server 704 and the like.
- the wireless standard conforms to the standard IEEE 802.11 series, but is not limited thereto.
- a W-CDMA (UMTS) method, a CDMA2000 1X method, a Long Term Evolution (LTE) method, or another method used in mobile communication systems may be used.
- the server 704 includes a CPU 7041, a ROM 7042, a RAM 7043, a solid-state drive (SSD) 7044, and an interface (I/F) 7045. Note that a hard disk may be provided in addition to or in place of the SSD 7044.
- the CPU 7041 is a main body that executes a program in the server 704.
- the ROM 7042 records contents to be processed by the CPU 7041 immediately after the power is turned on and a minimum necessary instruction group.
- the RAM 7043 is a memory for temporarily storing data processed by the CPU 7041.
- the server 704 functions as a control device that controls the agricultural machine 100, various field monitoring devices 500 and 555, and the state monitoring device 550.
- the server 704 performs information communication with the agricultural machine 100, the field monitoring device 500, the state monitoring device 550, and the like illustrated in FIG. 1 via the wireless access point 700.
- the server 704 also performs information communication with the databases 706 and 708 and the user terminals 710 and 712. Operations performed by the server 704 will be described later.
- the operations executed by the server 704 are carried out by the CPU 7041, which reads the program stored in the SSD 7044 into the RAM 7043 and executes it based on the data read out there.
- the program stored in the SSD 7044 can be updated.
- the program may be stored in a portable recording medium such as a CD-ROM, DVD-ROM, SD card, or USB memory. In that case, the server 704 reads the program from the medium and executes it.
- the server 704 is connected to the Internet 702 via an interface.
- the server 704 determines, from position information acquired from the agricultural machine 100, the user terminals 710 and 712, and the like, whether these devices are located in a specific area such as a field or an information-communication-related facility.
- authentication processing is performed with the agricultural machine 100 and the user terminals 710 and 712, and the entire system 1500 of this embodiment can be used only when the authentication is successful.
- the information communicated in the overall system 1500 is encrypted; a key for decryption is provided only when authentication succeeds, making meaningful information communication possible.
- if authentication fails, the information cannot be decrypted, meaningful communication cannot be performed, and the entire system 1500 cannot be used. In this way, the safety of the overall system 1500 is increased. Further, even if the agricultural machine 100 is stolen, it cannot be used without authentication, which helps prevent theft. Note that the authentication process may be performed regardless of whether a device using the information communication system 1502 is located in a specific area.
- the authentication may be performed by inputting the user's ID or password, as in the present embodiment; in the case of the agricultural machine 100, the field monitoring device 500, the state monitoring device 550, and the like, a unique ID held by the machine or device may be used. If security need not be taken into consideration, the authentication, encryption, and decryption processing is unnecessary.
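The combined location check and credential check described above can be sketched as follows (a simplified illustration only: the bounding-box test, the device registry, and all names are assumptions, not the actual implementation in this publication):

```python
def inside_area(lat, lon, area):
    """Axis-aligned bounding-box test: is the reported position inside the area?"""
    lat_min, lat_max, lon_min, lon_max = area
    return lat_min <= lat <= lat_max and lon_min <= lon <= lon_max

def grant_access(device_id, lat, lon, credential, registry, field_area):
    """Allow system use only when the device is in the specific area AND authenticates."""
    if not inside_area(lat, lon, field_area):
        return False
    return registry.get(device_id) == credential

field = (35.00, 35.01, 139.00, 139.02)   # hypothetical field bounds (lat/lon)
registry = {"agri-100": "device-key"}    # hypothetical unique device IDs and keys

assert grant_access("agri-100", 35.005, 139.01, "device-key", registry, field)
assert not grant_access("agri-100", 36.000, 139.01, "device-key", registry, field)  # outside area
assert not grant_access("agri-100", 35.005, 139.01, "wrong-key", registry, field)   # bad credential
```

In a real deployment the credential check would of course use proper cryptographic authentication rather than string comparison; the sketch only shows the gating logic.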
- the server 704 also performs billing processing (invoice issuance), described later.
- a plurality of servers may share the processing performed by the server 704 described so far or described later.
- for example, the processing can be divided among a management server, a recognition/analysis server, a billing management server, and the like of the overall system 1500.
- the server 704 monitors whether a fault such as a breakdown occurs in the agricultural machine 100, the field monitoring devices 500 and 555, or the state monitoring device 550. When a fault is detected, the server 704 automatically notifies the provider of the overall system 1500 including the information communication system 1502, or the service provider of the overall system 1500, and the user terminals 710 and 712. When the agricultural machine 100 or the like detects a fault itself, it may notify the server 704 without waiting for a query from the server 704. Because the overall system 1500 can thus respond to faults, a service provider or the like can quickly grasp the situation and take countermeasures when a failure occurs.
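The server-side fault monitoring just described combines two signals: explicit fault reports and devices that stop responding to queries. A minimal sketch, with hypothetical data shapes and device IDs:

```python
def detect_faults(statuses, last_seen, now, timeout_s=300):
    """Return IDs of devices that reported a fault or have gone silent.

    statuses:  {device_id: "ok" or "fault"} from the latest query responses.
    last_seen: {device_id: unix time of the last response received}.
    """
    faulty = {d for d, s in statuses.items() if s == "fault"}
    silent = {d for d, t in last_seen.items() if now - t > timeout_s}
    return sorted(faulty | silent)

now = 1_000_000
statuses = {"agri-100": "ok", "monitor-500": "fault", "monitor-550": "ok"}
last_seen = {"agri-100": now - 10, "monitor-500": now - 10, "monitor-550": now - 600}

# monitor-500 reported a fault; monitor-550 has not responded within the timeout.
assert detect_faults(statuses, last_seen, now) == ["monitor-500", "monitor-550"]
```

The timeout path covers the case mentioned above where the machine itself cannot wait for, or answer, a query from the server.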
- the database 706 stores various data in order to perform this recognition processing accurately and quickly.
- the server 704 performs recognition processing described later using data stored in the database 706.
- Data stored in the database 706 are mainly image data (standard patterns used for the recognition processing, etc.), information indicating the attributes and types of the image data, and information indicating the correspondence between each type and the agricultural machine 100. Image data and the data indicating their attributes and types are stored in association with each other.
- the database 706 may also store content data providing information via the Internet 702. In this case too, the image data and the data indicating attributes and types are associated with each other. The more such data is stored, the more accurate the recognition processing becomes.
- the database 708 is a storage place for information transmitted from the field, such as from the agricultural machine 100, the field monitoring device 500, and the state monitoring device 550. Examples include work start, interruption, and end times; information on places where work is required; work positions and dates, such as places where fertilizer was applied; the normalized difference vegetation index (NDVI) described later; and pest information. Storing such information in a database and analyzing the accumulated data make it possible to improve the efficiency of future farming.
- For example, specific tendencies can be derived for the growing conditions of crops and their shipping times, and based on these tendencies it is possible to determine, for example, how much fertilizer to apply and whether crops of the target quality can be obtained at the desired time. In particular, since the harvest time can be predicted from the value of the normalized difference vegetation index (NDVI), it is desirable to accumulate abundant information from the crops grown in the field.
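The normalized difference vegetation index mentioned here is conventionally computed from near-infrared (NIR) and red reflectance, such as the bands a multispectral camera like the state monitoring device 550 could capture, as NDVI = (NIR − RED) / (NIR + RED). A minimal sketch (the reflectance values are illustrative, not from this publication):

```python
def ndvi(nir, red):
    """Normalized difference vegetation index from NIR and red reflectance."""
    if nir + red == 0:
        return 0.0  # avoid division by zero for a completely dark pixel
    return (nir - red) / (nir + red)

# Healthy vegetation reflects strongly in the near infrared and absorbs red
# (chlorophyll absorption), giving values near 1; bare soil sits near 0.
assert round(ndvi(0.50, 0.08), 3) == 0.724
assert ndvi(0.20, 0.20) == 0.0
```

Tracking this value per plant or per field region over time is what allows growth to be monitored and harvest time to be predicted, as described above.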
- the database 708 also stores shipping information and inventory information from the market. For example, identifiable information such as a wireless tag or a bar code is attached to each shipped crop (package). Whenever the package is moved or stored, from shipment through the market, the type of crop is acquired from the identification information, and that information, together with the identification location and identification time, is sequentially stored in the database 708. The identification information is read by a system having a wireless tag reader, a barcode reader, or the like, and is stored in the database 708 via the Internet 702 together with the information needed to track the crop, such as the identification time and identification location.
- the user (using the user terminals 710 and 712) and the server 704 in the present embodiment can thus track the movement of crops and determine the demand situation for each crop. That is, because crops preferred by consumers are low in stock or move quickly, the server 704 (or the user via the user terminals 710 and 712) can identify such crops by analyzing the information stored in the database 708. The server 704 can then control the agricultural machine 100 and the like to supply fertilizer, water, and carbon dioxide so as to promote growth and enable early harvesting, so that crops preferred by consumers can be shipped quickly.
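One way such demand analysis could be sketched from the tracking records in database 708 is to measure, per crop type, how long packages sit at market before being sold; short shelf times indicate fast-moving, consumer-preferred crops. The record format and location labels below are illustrative assumptions:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical tracking records as they would accumulate in database 708:
# (crop_type, identification_location, identification_time)
records = [
    ("tomato", "market", datetime(2015, 7, 1, 8)),
    ("tomato", "sold",   datetime(2015, 7, 1, 14)),
    ("radish", "market", datetime(2015, 7, 1, 8)),
    ("radish", "sold",   datetime(2015, 7, 3, 9)),
]

def avg_shelf_hours(records):
    """Average hours from arrival at market until sale, per crop type."""
    arrived = {}
    totals, counts = defaultdict(float), defaultdict(int)
    for crop, loc, t in sorted(records, key=lambda r: r[2]):
        if loc == "market":
            arrived[crop] = t
        elif loc == "sold" and crop in arrived:
            totals[crop] += (t - arrived.pop(crop)).total_seconds() / 3600
            counts[crop] += 1
    return {c: totals[c] / counts[c] for c in totals}

# Crops with short shelf time move quickly, i.e. consumer demand is high.
print(avg_shelf_hours(records))  # {'tomato': 6.0, 'radish': 49.0}
```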
- the server 704 responds to instructions from the user terminals 710 and 712 and predicts crop output using conditions such as the normalized vegetation index NDVI, the degree of water stress, the degree of watering and fertilization, sunshine duration, temperature, and humidity, which will be described later.
- Multivariate analysis can be performed using the conditions under which the plants are actually grown (growing conditions), the degree of plant growth under those conditions, the harvest time, and the yield. The more such data are accumulated, the more accurately the output (harvest time and yield) can be predicted.
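In the simplest single-predictor special case of such an analysis, one could fit a least-squares line relating a growing condition (say, peak NDVI) to days until harvest, and use it to predict harvest time for a new field. The data values and the one-variable simplification are illustrative assumptions:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b; a single-predictor
    special case of the multivariate analysis described above."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

# Hypothetical accumulated data: peak NDVI vs. days until harvest.
ndvi_vals = [0.60, 0.70, 0.80, 0.90]
days      = [40,   30,   20,   10]

a, b = fit_line(ndvi_vals, days)
print(round(a * 0.75 + b))  # predicted days to harvest at NDVI 0.75: 25
```

With more accumulated samples and more predictors (water stress, sunshine duration, temperature, humidity), the same least-squares idea extends to the full multivariate case.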
- the above growing conditions are obtained from the agricultural machine 100 in the field, the field monitoring devices 500 and 555, the state monitoring device 550, content information related to the environment provided via the Internet (such as weather information), or user input; the server 704 obtains any one of these or a combination thereof.
- the predicted output is transmitted to the user terminals 710 and 712 and displayed.
- the output prediction information is also an information good (Information Goods) that can be sold independently to other users and customers through telecommunication lines such as the Internet, or by providing a recording medium on which the prediction information is recorded.
- each database 706 and 708 may exist in the server 704.
- each database may be configured by partitioning the area of an SSD.
- at least one of the database 706 and the database 708 may be connected to the server 704 in a wired or wireless manner without using the Internet 702. By doing in this way, since it is not necessary to perform communication via the Internet, it is possible to speed up the processing that needs to access the database.
- the user terminal 710 is a tablet-type computer. The user terminal 712 is a mobile-type computer that can be used anywhere, such as a smartphone. These terminals have a function of receiving Global Positioning System (GPS) signals from four satellites and specifying the current position.
- since the absolute positions of the field monitoring devices 500 and 555, the state monitoring device 550, etc. are known, the current position may instead be specified by receiving signals from three or more of those devices and using the attenuation or reception delay of those signals.
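Given ranges to three devices with known positions (e.g. delay multiplied by propagation speed), the 2D position follows from classic trilateration: subtracting the circle equations pairwise yields a 2x2 linear system. This is a generic sketch of that technique, not the patent's specific procedure:

```python
import math

def trilaterate(beacons, distances):
    """Solve a 2D position from three beacons at known absolute positions
    and the measured ranges to them. Subtracting the circle equations
    (x-xi)^2 + (y-yi)^2 = ri^2 pairwise gives a linear 2x2 system,
    solved here by Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = distances
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Hypothetical monitoring-device positions and a terminal at (3, 4):
beacons = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
truth = (3.0, 4.0)
ranges = [math.dist(b, truth) for b in beacons]
print(trilaterate(beacons, ranges))  # approximately (3.0, 4.0)
```

In practice ranges derived from attenuation or delay are noisy, so more than three devices and a least-squares solution would be used.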
- the user terminals 710 and 712 are not limited to tablet-type and mobile-type computers; they may also be desktop-type computers, computers built into some other device, or wearable computers such as watches and glasses.
- These user terminals 710 and 712 can send instructions to the agricultural machine 100 in the field, the field monitoring devices 500 and 555, the state monitoring device 550, and the like via the server 704. For example, a work start command or the like is transmitted to the agricultural machine 100. Further, the user terminals 710 and 712 can acquire notifications and information from the agricultural machine 100 in the field, the field monitoring device 500, and the state monitoring device 550 via the server 704. For example, the user terminals 710 and 712 can display images acquired by the agricultural machine 100, the farm field monitoring device 500, and the state monitoring device 550.
- the server 704 monitors the exchange of information between the user terminals 710 and 712 and the agricultural machine 100, the field monitoring device 500, and the state monitoring device 550, and records them in the database 706 and the database 708.
- the user terminals 710 and 712 can also communicate directly with the agricultural machine 100, the field monitoring devices 500 and 555, and the state monitoring device 550 without going through the server 704.
- the information communication system 1502 of this embodiment is a so-called cloud type system that exchanges information via the Internet 702, but is not limited to this.
- a dedicated communication network may be constructed in the user's facility or the like, and information may be exchanged only over the dedicated network, or over a combination of the dedicated network and the Internet. This enables high-speed information transmission.
- the agricultural machine 100 may include the function of the server 704 and the processing performed by the server 704. Thereby, the processing speed of the work by the agricultural machine 100 can be further increased.
- the overall system 1500 includes the system 1501 in an agricultural field as shown in FIG. 1 and the information communication system 1502 as shown in FIG. 2, but the server 704 and the databases 706 and 708 in the information communication system 1502 described above may be incorporated into the agricultural machine 100 or the field monitoring device 500 in the field system 1501. [Description of agricultural machinery and equipment] Next, with reference to FIGS. 3 to 31, the agricultural machine of this embodiment, the various sensor devices provided in the agricultural machine, and the devices installed in the field will be described.
- FIG. 3 is a diagram mainly showing the appearance of the agricultural machine 100A.
- reference numerals are attached to each part shown in the figure
- this agricultural machine 100A is shown as a tractor
- however, the agricultural machine of this embodiment may be another agricultural machine, for example a rice transplanter, a combine, a binder, a forage crop machine, a robot pesticide spreader, a mobile sprinkler, or a produce harvesting robot
- any machine that performs work while moving, such as a flying vehicle for farm work, may also be used.
- the agricultural machine 100A includes a prime mover 102A, a transmission device 104, a work device 106A, a support device 108, a stereo camera device 110, a laser radar device 112, a multispectral camera device 113, a wireless communication antenna 114, a manual operation unit 116, a control device 118A, a GPS antenna 120, a steering device 122, a pair of illumination lamps 124, a set of ultrasonic sonar devices 126, a set of front wheels 128, and rear wheels 130.
- the prime mover 102A is located inside the agricultural machine 100A and refers to a prime mover such as an engine (internal combustion engine) or a part that receives energy.
- the internal combustion engine is a diesel engine, and light oil is used as a fuel.
- the present invention is not limited to this, and a gasoline engine using gasoline as a fuel or a diesel engine using heavy oil as a fuel may be used.
- the speed of the reciprocating motion of the piston in the cylinder is changed according to the operation of the accelerator pedal in the manual operation unit 116 and the control signal from the control device 118A.
- a generator for charging a battery 224 described later is also provided.
- when a motor is used as the prime mover 102A, the traveling speed of the agricultural machine is changed by changing the rotation speed of the motor.
- the prime mover 102A may be a hybrid prime mover combining an electric motor and an internal combustion engine. Furthermore, the prime mover 102A may be an engine that uses hydrogen as a fuel, an engine that generates power using a fuel cell, or the like.
- the transmission device 104 is a part that transmits or converts received energy such as a belt, a chain, or a gear (the transmission device 104 is an example of an operation device). That is, it is a device that transmits power generated by a power generation source (such as an internal combustion engine or a motor) of the prime mover 102A to each part of the agricultural machine 100. Details of the transmission device 104 will be described later.
- the working device 106 is a part that operates to carry out a desired task, such as a plow, a seeder, a planting device, a fertilizer applicator, or a carbon dioxide generator.
- Reference numeral 106A denotes a tilling device having a plurality of tilling claws.
- the work device 106A to be pulled by the agricultural machine 100A is different for each work. (Working device 106 is an example of an operating device)
- the support device 108 is a portion that holds the prime mover 102A, the transmission device 104, and the work device 106A in place.
- the stereo camera device 110 is an image sensor device that includes two optical systems and an image sensor, and mainly acquires a stereo image for distance measurement.
- the stereo camera device 110 is used to detect an obstacle or a work target in the traveling direction of the agricultural machine 100, and to detect the distance to a measurement target or the size of a target. It plays a major role in the automatic driving of the agricultural machine 100A (distance (including parallax) and size are examples of the second digital information or the fourth digital information).
- the stereo camera device 110 is installed in the vicinity of the top of the agricultural machine 100A so as to be rotatable with respect to the vertical axis.
- the stereo camera device 110 is rotated manually or controlled by the control device 118A.
- the installation position is not limited to the vicinity of the head, and for example, it may be installed at a place where it is easy to look around the agricultural machine 100A, such as on the roof where the wireless communication antenna 114 and the GPS antenna 120 are installed.
- the rotation is not limited to the rotation of only one axis, and may be a rotation that can be rotated about a plurality of axes so that an image of a desired position and angle can be obtained.
- the rotation can be performed manually or controlled by the control device 118A.
- the configuration of the stereo camera device 110 will be described in detail later.
- a polarizing filter may be installed on the light-receiving side of the image sensors (image sensors 13a and 13b) of the stereo camera device 110 so that polarized images of S-polarized light and P-polarized light can be acquired. If the stereo camera device 110 is such a polarization stereo camera device, things that are difficult to distinguish with a normal camera, such as ice or frost on the field, can easily be detected with high contrast. If there is no need to measure distance, a polarizing camera device with a single image sensor may be installed in the agricultural machine 100 instead of the stereo camera device 110.
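One common way polarization imaging achieves this contrast is to compute a per-pixel degree of polarization from the S and P intensity images; this particular metric is an illustrative assumption rather than something the text specifies:

```python
def degree_of_polarization(s: float, p: float) -> float:
    """Per-pixel degree of polarization from S- and P-polarized
    intensities. Smooth specular surfaces such as ice, frost, or
    standing water polarize reflected light strongly, so this value
    highlights them even when ordinary intensity images look uniform."""
    total = s + p
    return 0.0 if total == 0 else abs(s - p) / total

print(degree_of_polarization(0.9, 0.1))  # about 0.8: strongly polarized
print(degree_of_polarization(0.5, 0.5))  # 0.0: unpolarized (e.g. soil)
```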
- the laser radar device 112 in the present embodiment is a sensor device that outputs a laser of a predetermined wavelength while scanning it in two dimensions, and recognizes the distance to an object from the light reflected by it. It is also called a LIDAR (Light Detection And Ranging) device or a laser range finder device.
- the laser scanning can be one-dimensional.
- the laser radar device 112 is installed above the multispectral camera device 113 so as to be rotatable with respect to the vertical axis.
- the installation location is not limited to above the multispectral camera device 113.
- in the agricultural machine 100C described later, it is rotatably installed on the roof.
- the rotation is not limited to the rotation of only one axis, and may be capable of rotating about a plurality of axes so that the laser can be emitted and incident at a desired position and angle. These rotating operations are performed manually or controlled by the control device 118A.
- the configuration and operation of the laser radar device 112 will be described in detail later.
- the multispectral camera device 113 is an imaging sensor device that acquires spectral information from an object, and can acquire a crop growth situation and the like.
- the multispectral camera device 113 is installed so as to be rotatable with respect to the vertical axis, and includes a laser radar device 112 in the vicinity thereof.
- the rotation is not limited to the rotation of only one axis, and may be a rotation that can be rotated about a plurality of axes so that an image of a desired position and angle can be obtained. These rotating operations are performed manually or controlled by the control device 118A. If spectral information is not obtained using the reflection of laser light from the laser radar device 112, it is not necessary to provide it in the vicinity of the laser radar device 112.
- the wireless communication antenna 114 is an antenna for transmitting / receiving information to / from other agricultural machines 100, the field monitoring device 500, the state monitoring device 550, the wireless access point 700, and the like by wireless communication. It is attached to the roof of the agricultural machine 100A so as to easily receive a radio signal.
- the wireless communication antenna 114 can also perform wireless relay.
- the manual operation unit 116 is a part for manually operating the agricultural machine 100A.
- the control device 118A exchanges information with the prime mover 102A, the transmission device 104, the work device 106A, the stereo camera device 110, the laser radar device 112, the wireless communication antenna 114, the manual operation unit 116, the steering device 122, etc., and controls the agricultural machine 100A. The control device 118A can identify the work device 106A by exchanging information with the work device 106.
- the control device 118A is installed inside the agricultural machine 100A.
- the control device 118A is also electrically connected to, and controls, the illumination lamps 124, a geomagnetic sensor that can detect the traveling direction of the agricultural machine 100, a warning whistle that intimidates targets with sound, and the like.
- the control device 118A can communicate with the server 704 and the user terminals 710 and 712 via the wireless communication antenna 114.
- the control device 118A includes a CPU, RAM, ROM, memory, and the like, and the CPU executes control processing based on a program stored in the memory.
- the GPS antenna 120 is an antenna for receiving GPS signals from four satellites in order to recognize the absolute position of the agricultural machine 100. It is installed on the roof of the agricultural machine 100A so that GPS signals can easily be received. Since the agricultural machine 100A can specify its position using GPS satellites, even if the agricultural machine 100A is stolen, its position can be specified as long as a network environment is available, and the machine 100A can easily be found.
- the GPS antenna 120 receives a radio signal from three or more devices such as the field monitoring device 500 and the state monitoring device 550 whose absolute position is known instead of or together with the GPS signal. There may be. In this case, the current absolute position may be specified from the attenuation of the received signal or the time or delay time required from transmission to reception. This is particularly effective when it is difficult to acquire GPS signals, such as when the field is indoors.
- the steering device 122 includes a steering handle, a steering gear box, a tie rod that connects both front wheels, an arm, and the like, and is a device that turns an agricultural machine.
- the direction of the front wheels is changed in accordance with the operation of the steering wheel or a control signal from the control device 118.
- the illuminating lamp 124 is a light that brightens the front of the agricultural machine 100A for night illumination or threat of light to the object.
- the ultrasonic sonar device 126 is a sensor device that recognizes a distance to an object by applying an elastic wave (sound wave) to the object and measuring a time until the reflected wave is detected. It is mainly used for distance measurement with an obstacle in a blind spot where the stereo camera device 110 cannot be captured.
- the ultrasonic information measured by the ultrasonic sonar device 126 is an example of second digital information and fourth digital information.
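The ranging principle behind the ultrasonic sonar device is simple enough to state as a formula: the sound wave travels out and back, so the one-way distance is the propagation speed times half the round-trip time. A minimal sketch (the constant and function name are illustrative):

```python
SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees C

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to an obstacle from the ultrasonic round-trip time:
    the elastic wave travels out and back, so distance = v * t / 2."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

print(echo_distance_m(0.01))  # a 10 ms echo: about 1.715 m
```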
- the front wheels 128 support the movement of the agricultural machine 100A and turn it when the steering device 122 is operated.
- the rear wheel 130 is a portion where the power generated by the power generation source of the prime mover 102A is finally transmitted in the transmission device 104, and the agricultural machine 100A moves back and forth as they rotate.
- the agricultural machine (tractor) 100A in this embodiment is provided with the stereo camera device 110, the laser radar device 112, the multispectral camera device 113, and the ultrasonic sonar devices 126 as sensor devices for acquiring information from outside the agricultural machine 100A, but not all of them need to be provided; the sensor devices to be used may be installed according to the work to be performed.
- sensors other than these sensor devices, for example an infrared sensor, a temperature sensor, and a humidity sensor, may also be provided.
- Information acquired by these sensors is transmitted to the server 704.
- the server 704 stores such information in the database 708 and uses it for prediction of harvest time and the like.
- FIG. 4 shows another agricultural machine 100B.
- 100B is also a tractor.
- the difference from 100A is that there is no manual operation unit 116 in 100B. That is, 100B is an agricultural machine that performs work by remote control or automatic control.
- this agricultural machine 100B has the stereo camera apparatus 110 installed in front, back, left, and right, and can run and work based on images captured by them. For this reason, compared with the agricultural machine 100A, it can be said that the automatic operation and the remote operation are facilitated.
- an eave is provided above each stereo camera device 110 installed at the front and rear of the agricultural machine 100B, preventing the stereo camera device 110 from being soiled by rain or snow.
- the control device 118B built in the agricultural machine 100B does not need to be connected to the manual operation unit 116 in the agricultural machine 100A.
- the amount of information that must be processed by the control device 118B increases. Therefore, a CPU with higher performance than that of the control device 118A, or a plurality of CPUs, is installed.
- the configuration required for manual operation such as a steering handle and a steering gear box is omitted from the steering device 122 of the agricultural machine 100A.
- the wireless communication antenna 114 (and the control device 118B) functions as a wireless access point. Thereby, the agricultural machine 100 can be used as a relay point for wireless communication, and an area where wireless communication is possible can be expanded.
- the work device 106 may be integrated with the agricultural machine 100. Moreover, the agricultural machine 100 may connect a plurality of work devices 106 in order to perform a plurality of types of work.
- FIG. 5 is a diagram for explaining the transmission device 104 of FIG. 3 or 4 in detail.
- the transmission device 104 serves as a means for moving the agricultural machine 100 and the work device 106.
- a solid line indicates transmission of kinetic energy
- a broken line indicates transmission of an electric signal
- an alternate long and short dash line indicates an electric supply line.
- FIG. 5 shows an example of a rear wheel two-wheel drive in which the power generation source of the prime mover 102 is an internal combustion engine (engine).
- An example in which the power generation source of the prime mover 102 is an electric motor is shown in FIG.
- the drive system is not limited to two-wheel drive, and may be four-wheel drive.
- the transmission device 104 includes a rear wheel 130, a main clutch 202, a transmission device 204, a differential device 206, brake devices 208 and 214, final reduction devices 210 and 216, a PTO (Power Take Off) transmission device 220, a PTO shaft 222, a battery. 224.
- the main clutch 202 is a device that engages and disengages transmission of the power generated by the engine. It is operated when stopping traveling or changing speed while the engine is running.
- the main clutch 202 can be used to connect and disconnect the power of the traveling device and the PTO at the same time. However, the main clutch 202 may be used to connect and disconnect the traveling clutch and the PTO clutch with separate pedals and levers.
- the transmission device 204 is a device that converts engine power into a rotational speed and torque according to the running and working state. It is also necessary for reversing the tractor and for keeping the agricultural machine 100 stopped while the engine is rotating.
- the differential device 206 is a device that rotates the left and right wheels at different speeds, making it easier for the agricultural machine 100 to turn and eliminating wheel slip.
- the brake devices 208 and 214 are devices that absorb kinetic energy to reduce the traveling speed or stop traveling, either when the brake pedal is depressed or in accordance with a control signal from the control device 118.
- the final reduction gears 210 and 216 are devices for further reducing the rotational speed reduced by the bevel gears of the transmission device 204 and the differential device 206 to increase the driving force of the axle.
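The relationship between a reduction stage and driving force is the standard ideal-gear one: output speed falls by the gear ratio while torque rises by the same factor. A minimal sketch with illustrative numbers (losses ignored):

```python
def through_reduction(speed_rpm: float, torque_nm: float, ratio: float):
    """Ideal gear reduction: output speed drops by the ratio while
    output torque rises by the same factor (losses ignored)."""
    return speed_rpm / ratio, torque_nm * ratio

# E.g. a 10:1 transmission stage followed by a 4:1 final reduction.
speed, torque = through_reduction(2000.0, 100.0, 10.0)
speed, torque = through_reduction(speed, torque, 4.0)
print(speed, torque)  # 50.0 rpm, 4000.0 N*m at the axle
```

This is why the final reduction gears can increase the driving force of the axle after the transmission and differential have already lowered the rotational speed.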
- the PTO transmission 220 changes the speed of the power take-off, which extracts part of the engine power.
- the PTO shaft 222 is a drive shaft for taking out part of the engine power, and is used as a power source for the work device 106.
- the battery 224 stores electricity as chemical energy. By taking it out again as electric energy, it becomes a power source for engine ignition, start-up, illumination lamp 124, control device 118, and the like.
- Electricity can be supplied from the battery 224 to devices that can be controlled by the control device 118. Then, the apparatus is controlled by converting electrical energy into kinetic energy.
- the prime mover 102 controls the fuel supply amount and timing based on the control signal from the control device, changes the reciprocating motion of the piston, and adjusts the operation speed.
- in the transmission device 204, electric energy drives an actuator that switches gears based on the control signal, controlling shifting and reversing.
- in the brake devices 208 and 214, an actuator is driven based on the control signal to apply the brakes, decelerating or stopping travel.
- FIG. 6 shows details of the transmission device 104-2 for moving the agricultural machine 100 using the prime mover 102-2 as a driving force.
- the solid line indicates kinetic energy transmission
- the broken line indicates electrical signal transmission
- the alternate long and short dash line indicates an electricity supply line.
- This example is also an example of rear wheel two-wheel drive, but may be four-wheel drive.
- the prime mover 102-2 is a power unit including a motor controller and an electric motor (motor). Since this transmission device 104-2 controls the rotation speed and rotation direction with the motor, the transmission device 204 described with reference to FIG. 5 is basically unnecessary, but the transmission device 204 may be provided for smoother running.
- Battery 224-2 includes a converter and a battery.
- the converter converts AC voltage into DC voltage. Compared with the battery 224 of FIG.
- the battery 224 may be configured by combining a plurality of small batteries.
- the battery 224-2 is charged from the external power source 226.
- the external power source 226 is not strictly part of the transmission device 104, but it is an essential element for an agricultural machine 100 driven by an electric motor. Since the external power source 226 employs non-contact power transmission technology, the battery 224-2 can be charged without connecting an electric wire. Note that the battery 224-2 may also be charged by a contact-type connection using an outlet or the like.
- the work device 106 in FIG. 6 is operated by electric energy from the work-device power source 228 instead of by the PTO shaft 222 as shown in FIG. 5.
- the work device 106 may be operated using the PTO transmission 220 or the PTO shaft 222 as in FIG.
- the conventional working device 106 used in the transmission device 104 of FIG. 5 can be used as it is.
- due to the motor's characteristics, torque can be increased even when the rotation speed is low (that is, when moving at low speed), which suits agricultural machines that, compared with automobiles, work at low speeds.
- the battery 224-2 can be automatically charged as will be described later, it is possible to efficiently perform a series of farm work without manpower.
- the transmission device 104-2 may be driven by an in-wheel motor system in which the motor is inserted into the wheel.
- FIG. 7 shows the appearance of the stereo camera device 110.
- the stereo camera device 110 captures a certain area and generates image data that can be transmitted to the control device 118 of the agricultural machine 100, the server 704, and the user terminals 710 and 712, and acquires the distance information (or parallax value information) from the stereo camera device 110 to each point in the captured image.
- this distance information (or parallax value information) can also be transmitted to the control device 118 and the like.
- the stereo camera device 110 can perform distance measurement using a Semi-Global Matching (SGM) method.
- the stereo camera device 110 includes a main body 2 and a pair of cylindrical imaging devices 10 a and 10 b provided in the main body 2.
- the stereo camera device 110 is rotatably attached to the agricultural machine 100 by a pillar having a rotation axis. This rotation operation is performed manually or controlled by the control device 118.
- FIG. 8 shows the overall hardware configuration of the stereo camera device 110.
- the stereo camera device 110 includes an imaging device 10a, an imaging device 10b, a signal conversion device 20a, a signal conversion device 20b, and an image processing device 30.
- the imaging device 10a generates an analog signal (an example of analog information) representing an image by imaging a front scene, and includes an imaging lens 11a, an aperture 12a, and an image sensor 13a.
- the imaging lens 11a is an optical element that forms an object image by refracting light passing through the imaging lens 11a.
- the diaphragm 12a adjusts the amount of light input to the image sensor 13a described later by blocking part of the light that has passed through the imaging lens 11a.
- the image sensor 13a is a semiconductor element that converts light input from the imaging lens 11a and the aperture 12a into an electrical analog image signal, and is realized by a Charge Coupled Device (CCD) or a Complementary Metal Oxide Semiconductor (CMOS).
- the signal conversion device 20a converts an analog signal representing a captured image into digital image data (digital image information; an example of the first digital information or the third digital information).
- It includes a correlated double sampling (CDS) circuit 21a, an auto gain control (AGC) 22a, an analog-digital converter (ADC) 23a, and a frame memory 24a.
- the CDS 21a removes noise from the analog image signal converted by the image sensor 13a by correlated double sampling.
- the AGC 22a performs gain control for controlling the intensity of the analog image signal from which noise has been removed by the CDS 21a.
- the ADC 23a converts an analog image signal whose gain is controlled by the AGC 22a into digital image data.
- the frame memory 24a stores the image data (reference image) converted by the ADC 23a.
- the signal conversion device 20b acquires image data from an analog image signal converted by the imaging device 10b having the imaging lens 11b, the diaphragm 12b, and the image sensor 13b, and includes a CDS 21b, an AGC 22b, an ADC 23b, and a frame. It has a memory 24b. Note that the CDS 21b, AGC 22b, ADC 23b, and frame memory 24b have the same configuration as the CDS 21a, AGC 22a, ADC 23a, and frame memory 24a, respectively, and thus description thereof is omitted. However, the comparison image is stored in the frame memory 24b.
- the image processing device 30 is a device for processing the image data converted by the signal conversion device 20a and the signal conversion device 20b.
- the image processing apparatus 30 includes an FPGA (Field Programmable Gate Array) 31, a CPU (Central Processing Unit) 32, a ROM (Read Only Memory) 33, a RAM (Random Access Memory) 34, and an I/F (interface) 35.
- a bus line 39 such as an address bus or a data bus is provided for electrically connecting the above components 31 to 35 as shown in FIG.
- the FPGA 31 is an integrated circuit whose configuration can be set by the purchaser or designer after manufacture, and here, processing for calculating the parallax d in the image represented by the image data is performed.
- the CPU 32 controls each function of the stereo camera device 110.
- the ROM 33 stores an image processing program executed by the CPU 32 to control each function of the parallax value deriving device.
- the RAM 34 is used as a work area for the CPU 32.
- the I / F 35 is an interface for connecting to the control device 118 of the agricultural machine 100.
- the above-described image processing program is a file in an installable or executable format, and may be recorded and distributed on a computer-readable recording medium. This recording medium is a CD-ROM, an SD card, or the like.
- FIG. 9 shows a hardware configuration of a main part of the stereo camera device 110.
- the FPGA 31 includes a cost (degree of matching, i.e., dissimilarity or similarity) calculation unit 310, a cost synthesis unit 320, and a parallax value deriving unit 330. These are part of the FPGA's circuitry, but the same processing may instead be performed by executing an image processing program stored in the ROM 33.
- the cost calculation unit 310 calculates the cost value C of each corresponding pixel candidate based on the luminance value of a reference pixel in the reference image Ia and the luminance values of a plurality of corresponding pixel candidates on the epipolar line in the comparison image Ib with respect to that reference pixel.
- the cost synthesis unit 320 synthesizes the cost values of the corresponding pixel candidates for one reference pixel calculated by the cost calculation unit 310 with the cost values of the corresponding pixel candidates for other reference pixels, and outputs the synthesis cost value Ls. Specifically, the path cost values Lr for the radial directions are added based on (Formula 4) described later; this is the process that finally calculates the synthesis cost value Ls.
- the parallax value deriving unit 330 derives the parallax value Δ based on the position of the reference pixel in the reference image and the position of the corresponding pixel in the comparison image that minimizes the synthesis cost value Ls after synthesis by the cost synthesis unit 320, and outputs the parallax image Ic, which shows the parallax value at each pixel.
- From the parallax value, the distance Z can be calculated. The process for obtaining the distance Z may be performed by the parallax value deriving unit 330, or by the CPU 32 or the server 704. In this way, the stereo camera device 110 can obtain distance information (or parallax value information) to each point of the captured image using the parallax of the captured images.
- a reference image or a comparison image, that is, an image obtained from one image sensor 13a or 13b, can also be used in the same manner as an image captured by an ordinary monocular camera; that is, one of the two images can be used on its own.
- the stereo camera device 110 may install a polarizing filter 40 on the light receiving surfaces of the image sensors 13a and 13b.
- the polarizing filter 40 is a Sub-Wavelength Structure (SWS) polarizing filter.
- the polarizing filter 40 has a structure in which polarizer regions that transmit only light of the S-polarized component and polarizer regions that transmit only light of the P-polarized component are alternately arranged.
- the size of one polarizer region is the same as the size of one pixel of the light receiving elements of the image sensors 13a and 13b, and the polarizing filter 40 is installed so that each polarizer region is on each pixel.
- When the stereo camera device 110 is configured in this way, an image containing only the S-polarized component and an image containing only the P-polarized component are obtained by generating images separately from the light reception signals of the light transmitted through each type of polarizer region. Each image is an example of second digital information and fourth digital information. Since the stereo camera uses two image sensors, two images with only the S-polarized component and two images with only the P-polarized component are obtained, so the parallax value (and hence the distance) can still be derived.
- By obtaining a polarization image, for example, it becomes easy to detect differences in the surface orientation of a black subject. This is because the polarization state of light from the subject differs depending on the surface orientation of the subject.
- the polarization image makes it easy to detect the presence or absence of a transparent subject. This is because when light passes through a transparent subject, the transmittance changes according to the polarization state of the light. That is, if a polarization camera is used, a high-contrast image can be obtained, and information that cannot be obtained with a luminance image can be obtained.
- By making the stereo camera device 110 a polarization stereo camera device, in addition to information such as frost and pests, it is possible, for example, to detect a frozen road surface and measure its distance, or to accurately detect ridges in a field and measure their distance. Furthermore, obtaining a polarization image with the polarizing filter facilitates detection of plant structure.
- the overall system 1500 can use this information to grasp the growth status of the plant and discriminate the type of plant (for example, whether it is a crop or a weed). If distance information is not required, only the polarization image information obtained from either one of the image sensors 13a or 13b may be used as in the case of a monocular camera. Of course, when it is desired to check the exact size using the distance information, information obtained from the two image sensors is used.
- the images captured by the imaging device 10a and the imaging device 10b shown in FIG. 10 are set as a reference image Ia and a comparative image Ib, respectively.
- In FIG. 10, it is assumed that the imaging device 10a and the imaging device 10b are installed in parallel equiposition.
- an S point on the object E in the three-dimensional space is mapped to a position on the same horizontal line of the imaging device 10a and the imaging device 10b. That is, the S point in each image is imaged at a point Sa (x, y) in the reference image Ia and a point Sb (x ′, y ′) in the comparative image Ib.
- Δ = x′ − x (Formula 1)
- When the distance between the point Sa(x, y) in the reference image Ia and the foot of the perpendicular drawn from the imaging lens 11a to the imaging surface is set to Δa, and the corresponding distance for the point Sb(x′, y′) in the comparison image Ib is set to Δb, the parallax value is Δ = Δa + Δb.
- Using the parallax value Δ, the distance Z between the imaging devices 10a and 10b and the object E can be derived.
- the distance Z is a distance from a plane including the focal position of the imaging lens 11a and the focal position of the imaging lens 11b to the specific point S on the object E.
- Using the focal length f of the imaging lens 11a and the imaging lens 11b, the baseline length B, which is the length between the imaging lens 11a and the imaging lens 11b, and the parallax value Δ, (Formula 2) calculates the distance Z.
- Z = (B × f) / Δ (Formula 2)
- The larger the parallax value Δ, the smaller the distance Z; the smaller the parallax value Δ, the larger the distance Z.
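The relationship in (Formula 2) can be sketched as a short calculation. The baseline, focal length, and parallax values below are illustrative only, not the actual parameters of the stereo camera device 110; focal length and parallax are taken in the same units (pixels) so that Z comes out in metres.

```python
# Sketch of (Formula 2): Z = (B x f) / delta for a parallel stereo pair.
# All numeric values are illustrative, not device parameters.

def distance_from_parallax(baseline_m: float, focal_px: float, parallax_px: float) -> float:
    """Z = (B * f) / delta  -- (Formula 2)."""
    if parallax_px <= 0:
        raise ValueError("parallax must be positive")
    return baseline_m * focal_px / parallax_px

# The larger the parallax, the smaller the distance, and vice versa.
near = distance_from_parallax(0.1, 1000.0, 50.0)  # large parallax
far = distance_from_parallax(0.1, 1000.0, 5.0)    # small parallax
print(round(near, 3), round(far, 3))  # -> 2.0 20.0
```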
- FIG. 11A is a reference image; FIG. 11B is a conceptual diagram showing, as a comparison target, a parallax image obtained from FIG. 11A by the edge detection method; and FIG. 11C is a conceptual diagram showing a parallax image obtained from FIG. 11A by the SGM method. The reference image is an image in which objects are indicated by luminance.
- the parallax image by the edge detection method is an image derived from the edge detection method, and is an image showing the parallax value of the edge portion of the reference image.
- the parallax image by application of the SGM method is an image derived from the reference image by the application technology of the SGM method, and is an image showing the parallax value at each coordinate of the reference image.
- the difference in parallax value is shown by the shade of color: the darker the color, the smaller the parallax value, that is, the longer the distance.
- the SGM method is a method for appropriately deriving the above-described parallax value even for an object having weak texture, deriving a parallax image such as that illustrated in FIG. 11C.
- the edge parallax image shown in FIG. 11B is derived based on the reference image shown in FIG. 11A.
- compared with the parallax image by the edge detection method, the parallax image by the SGM method can represent detailed information such as regions with weak texture, so more detailed distance measurement can be performed.
- This SGM method does not immediately derive a parallax value after calculating a cost value, which is a dissimilarity; rather, after calculating the cost value, it further calculates a synthesis cost value (a synthesized dissimilarity, Synthesis Cost) to derive the parallax value, finally deriving a parallax image (here, a parallax image by the SGM method) indicating the parallax value at every pixel.
- In the edge detection method, the cost value is calculated in the same way as in the SGM method; however, unlike the SGM method, only the parallax values of edge portions are derived, without calculating the synthesis cost value.
- FIG. 12A is a conceptual diagram showing a reference pixel in the reference image; FIG. 12B is a conceptual diagram of calculating the cost while sequentially shifting the corresponding pixel candidates in the comparison image with respect to the reference pixel in FIG. 12A; and FIG. 13 is a graph showing the cost value for each shift amount.
- d is the shift amount between the reference pixel p and the corresponding pixel candidate q; in this embodiment, it is expressed in units of pixels. That is, as shown in FIG. 12, the corresponding pixel candidate q(x + d, y) is sequentially shifted one pixel at a time within a predetermined range (for example, 0 < d < 25), and the cost value C(p, d), the dissimilarity between the luminance value of the corresponding pixel candidate q(x + d, y) and that of the reference pixel p(x, y), is calculated for each shift.
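This per-pixel cost calculation can be sketched as follows; the single-pixel absolute luminance difference used as the dissimilarity, and the sample rows, are illustrative choices, not the device's actual cost function.

```python
# Sketch of the cost calculation: for a reference pixel p(x, y), shift the
# candidate q(x + d, y) along the epipolar line of the comparison image and
# record a dissimilarity C(p, d) for each shift amount d.

def cost_values(ref_row, cmp_row, x, d_max=25):
    """Return {d: C(p, d)} for shifts 0 <= d < d_max (clipped to the row)."""
    costs = {}
    for d in range(d_max):
        if x + d >= len(cmp_row):
            break
        costs[d] = abs(ref_row[x] - cmp_row[x + d])  # luminance dissimilarity
    return costs

ref_row = [10, 20, 30, 40, 50]
cmp_row = [30, 10, 20, 30, 40]   # the same scene shifted by one pixel
c = cost_values(ref_row, cmp_row, x=1, d_max=4)
best_d = min(c, key=c.get)       # shift with the minimum cost value
print(best_d)  # -> 1
```

As in FIG. 13, the shift amount at which the cost value is smallest gives the corresponding pixel and hence the parallax.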
- FIG. 14 is a conceptual diagram for deriving a synthesis cost value.
- FIG. 15 is a graph showing a combined cost value for each parallax value.
- the calculation of the synthesis cost value in the present embodiment is a method peculiar to the SGM method: the synthesis cost value Ls(p, d) is calculated by aggregating the cost values obtained when the pixels around the predetermined reference pixel p(x, y) are themselves taken as reference pixels into the cost value C(p, d) at the reference pixel p(x, y).
- Lr(p, d) = C(p, d) + min{ Lr(p − r, d), Lr(p − r, d − 1) + P1, Lr(p − r, d + 1) + P1, Lrmin(p − r) + P2 } (Formula 3)
- r indicates a direction of aggregation, and Lr is applied recursively as shown in (Formula 3).
- Lr(p, d) is obtained by adding, to the cost value C of the reference pixel p(x, y), the minimum of the path cost values Lr of the pixels along the r direction shown in the figure. To obtain Lr, computation starts from the pixel at the far end in the r direction from the reference pixel p(x, y) and proceeds along the r direction.
- In this way, path cost values Lr in eight directions, Lr0, Lr45, Lr90, Lr135, Lr180, Lr225, Lr270, and Lr315, are obtained, and finally the synthesis cost value Ls is obtained based on (Formula 4).
- Ls(p, d) = ΣLr(p, d) (Formula 4)
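(Formula 3) and (Formula 4) can be sketched on a toy cost volume. For brevity, a 1-D row of pixels and two aggregation directions (left-to-right and right-to-left) stand in for the eight directions in the text, and the penalties P1 and P2 are illustrative values.

```python
# Sketch of (Formula 3) and (Formula 4): aggregate per-pixel cost values
# C(p, d) into path cost values Lr along each direction, then sum the
# directions to obtain the synthesis cost Ls.

def path_costs(C, P1=1.0, P2=4.0, reverse=False):
    """C: list over pixels of lists over shifts d. Returns Lr per (Formula 3)."""
    n, D = len(C), len(C[0])
    order = range(n - 1, -1, -1) if reverse else range(n)
    Lr = [[0] * D for _ in range(n)]
    prev = None
    for i in order:
        for d in range(D):
            if prev is None:                       # far end of the path
                Lr[i][d] = C[i][d]
            else:
                cands = [prev[d], min(prev) + P2]  # Lr(p-r, d), Lrmin(p-r)+P2
                if d > 0:
                    cands.append(prev[d - 1] + P1) # Lr(p-r, d-1)+P1
                if d < D - 1:
                    cands.append(prev[d + 1] + P1) # Lr(p-r, d+1)+P1
                Lr[i][d] = C[i][d] + min(cands)
        prev = Lr[i]
    return Lr

def synthesis_cost(C):
    """Ls(p, d) = sum over directions of Lr(p, d)  -- (Formula 4)."""
    fwd, bwd = path_costs(C), path_costs(C, reverse=True)
    return [[f + b for f, b in zip(fr, br)] for fr, br in zip(fwd, bwd)]

C = [[3, 1, 4], [2, 0, 5], [3, 1, 6]]  # toy cost volume: 3 pixels x 3 shifts
Ls = synthesis_cost(C)
disparities = [row.index(min(row)) for row in Ls]  # d minimizing Ls per pixel
print(disparities)  # -> [1, 1, 1]
```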
- Since the SGM method takes more processing time than the edge detection method, when processing speed matters more than distance-measurement accuracy, the distance may be measured by the edge detection method instead. In this case, the processing of the cost synthesis unit 320 in FIG. 9 is not performed, and the parallax value deriving unit 330 derives only the parallax values of edge portions from the minimum cost value.
- the distance that can be measured by the stereo camera device 110 in this embodiment is up to 105 m, with an error of several centimeters.
- Moreover, the size and length of an object can be determined. That is, since the ROM 33 of the stereo camera device 110 stores a table of the relationship between distance and the size and length per pixel, the CPU 32 can also specify the size and length of an object. The ROM 33 may store, instead of a table, a relational expression between distance and size per pixel. Further, instead of processing within the stereo camera device 110, the server 704 or the control device 118 of the agricultural machine 100, which hold the data necessary for the calculation such as the table, may calculate the size and length of the object.
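How such a distance-to-size relationship could be computed can be sketched as below; the pixel pitch and focal length are illustrative assumptions, not the values actually stored in the ROM 33.

```python
# Sketch of the size lookup: with the distance Z known, the physical size per
# pixel follows from the pixel pitch and focal length, so an object spanning
# N pixels has size N * (Z * pitch / f). The constants are illustrative.

def size_per_pixel(distance_m: float, pixel_pitch_m: float, focal_m: float) -> float:
    return distance_m * pixel_pitch_m / focal_m

def object_size(pixels: int, distance_m: float,
                pixel_pitch_m: float = 4.0e-6, focal_m: float = 8.0e-3) -> float:
    return pixels * size_per_pixel(distance_m, pixel_pitch_m, focal_m)

# An object 200 pixels wide at 10 m, with a 4 um pitch and 8 mm focal length:
width_m = object_size(200, 10.0)
print(round(width_m, 6))  # -> 1.0 (metres)
```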
- FIG. 16 shows the configuration of the laser radar device 112.
- the shape information obtained by the laser radar device 112 is an example of first digital information and third digital information.
- the distance information by the laser radar device 112 is an example of second digital information and fourth digital information.
- the laser radar device 112 irradiates the target with pulsed laser light, measures the round-trip time t of the reflected pulsed laser light, and calculates the distance L to the irradiation point using (Equation 5).
- L = c × t / 2 (Equation 5)
- c is the speed of light.
- Since the laser radar device 112 can scan the laser beam in two-dimensional directions, the direction to each point on the object can be obtained, and the shape of the object can also be measured.
- the laser radar device 112 includes, in a main body 50, a laser diode driving circuit 51, a laser diode 52, a light projecting lens 53, two reflection mirrors 68 and 70, a swing motor 54, a polygon mirror 55, a light receiving lens 56, a photodiode 58, an amplifier circuit 60, a time interval counter 61, a motor control circuit 62, a controller 64, and a laser beam emission/incidence window 66.
- the laser diode drive circuit 51 generates a pulse signal that is input to the laser diode 52.
- the laser diode 52 emits pulsed laser light.
- the light projecting lens 53 converts the pulsed laser light oscillated from the laser diode 52 into parallel light.
- the parallel light has its traveling direction changed by the reflection mirrors 68 and 70, and then enters the polygon mirror 55, which is rotated at a constant speed around the θ axis 55a by the motor control circuit 62 controlled by the controller 64.
- the polygon mirror 55 is also swung around the φ axis 54a by the swing motor 54, which oscillates at a predetermined speed under the motor control circuit 62 controlled by the controller 64.
- the laser light incident on the polygon mirror 55 is scanned in a two-dimensional direction, and is irradiated onto the object via the laser light emission incident window 66.
- the controller 64 obtains a signal from a level (not shown) and can issue commands to the motor control circuit 62 so that the laser beam is always emitted in the horizontal direction, operating the swing motor 54 and controlling the rotation around the θ axis.
- the pulsed laser light reflected from the object is collected by the light receiving lens 56 via the polygon mirror 55, received by the photodiode 58, and converted into an electric signal.
- the converted electrical signal is amplified by the amplifier circuit 60, and the time interval between the start pulse synchronized with the pulse oscillation timing of the laser diode 52 and the stop pulse output from the amplifier circuit 60 is measured by the time interval counter 61.
- the controller 64 treats the measured round-trip time t, the rotation angle θ, and the swing angle φ of the polygon mirror 55 as polar coordinate system data (t, θ, φ), and obtains the shape of the object by converting the polar coordinate data into three-dimensional space data (X, Y, Z) with the installation position of the laser radar device 112 as the origin.
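The conversion from (t, θ, φ) to Cartesian coordinates can be sketched as follows; the axis convention and sample values are illustrative assumptions, not the device's actual calibration.

```python
# Sketch of (Equation 5) plus the polar-to-Cartesian conversion: round-trip
# time t gives the range L = c * t / 2, and the rotation angle theta and swing
# angle phi place the point in 3-D space with the laser radar as the origin.

import math

C_LIGHT = 299_792_458.0  # speed of light [m/s]

def polar_to_xyz(t, theta_deg, phi_deg):
    L = C_LIGHT * t / 2.0                                  # (Equation 5)
    theta, phi = math.radians(theta_deg), math.radians(phi_deg)
    x = L * math.cos(phi) * math.cos(theta)
    y = L * math.cos(phi) * math.sin(theta)
    z = L * math.sin(phi)
    return x, y, z

# A pulse returning after ~400 ns corresponds to a target about 60 m away,
# which matches the horizontal measurement range quoted in the text.
x, y, z = polar_to_xyz(4.0e-7, 0.0, 0.0)
print(round(x, 2))  # -> 59.96
```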
- Information obtained by the controller 64 can be transmitted to the control device 118 of the agricultural machine 100, the server 704, and the user terminals 710 and 712.
- the horizontal field angle is about 60 degrees
- the vertical field angle is about 30 degrees
- the measurement range is about 60 m on the horizontal plane. Note that the measurement range varies depending on the type of the laser diode 52 and the output voltage of the laser diode drive circuit 51.
- It is also possible not to swing the polygon mirror 55 with the swing motor 54; in that case, laser beam scanning in a one-dimensional direction is performed by rotating the polygon mirror 55 around the θ axis.
- the laser diode 52 to be used is selected according to the work purpose. For example, when it is desired to measure the activity of a plant from the state of its leaves by an active method in combination with the multispectral camera device 113, a laser diode 52 that can emit laser light in the visible red region with a wavelength around 660 nm is used (a method for examining the activity of a plant using the multispectral camera device 113 will be described later). When the laser radar device 112 is used in combination with the multispectral camera device 113 in this way, the laser radar device 112 is placed close to the multispectral camera device 113; on the other hand, when working with the laser radar device 112 alone, there is no need to place them close to each other. Instead of using the swing motor 54 and the polygon mirror 55, laser light scanning may be performed by a Micro Electro Mechanical Systems (MEMS) mirror device capable of scanning in two dimensions.
- Alternatively, the laser radar device 112 may be configured as a non-scanning laser radar device that deflects the laser light with a fixed optical element such as a grating, without moving a mirror to scan the laser. If a non-scanning laser radar device is used in this way, the number of driven parts can be reduced, so failures can be reduced even when vertical shaking is severe during movement.
- the laser radar device 112 is rotatably installed at a position close to the multispectral camera device 113. This rotation operation is performed manually or controlled by the control device 118.
- FIG. 17 shows the appearance of the multispectral camera apparatus 113.
- Spectral information obtained by the multispectral camera device 113 is an example of second digital information and fourth digital information.
- the multispectral camera device 113 is a camera device that can obtain a captured image and a spectral reflectance in the captured image.
- This multispectral camera device 113 is suitable for detecting the state of plants over a certain range (an area, a plane) at one time, contactlessly and non-destructively, rather than at a single point.
- the multispectral camera device 113 has a main body 400 and a lens barrel 402.
- the multispectral camera device 113 is rotatably installed on the agricultural machine 100. This rotation operation is performed manually or controlled by the control device 118. As a result, the reflected light of the object in various directions with respect to the periphery of the agricultural machine 100 can be imaged, and the growth status such as the plant activity, the length between branches, and the size of the leaves can be grasped.
- FIG. 18 is a diagram showing the configuration of the multispectral camera apparatus 113.
- the left is a front view, and the right is a cross-sectional view seen from the side.
- the main body 400 includes a microlens array 414, a light receiving element array 416, an FPGA 418, and a spectral reflectance calculation unit 420.
- the lens barrel 402 includes a light emitting diode (LED) 404, a main lens 408, a diaphragm 409, a filter 410, and a condenser lens 412.
- the microlens array 414 is an optical element in which a plurality of small lenses are arranged in a two-dimensional direction.
- the light receiving element array 416 is a monochrome sensor having a plurality of light receiving elements (hereinafter also referred to as "pixels"), with no color filter mounted on any of them.
- the light receiving element array 416 is a sensor that converts optical information into electrical information.
- the FPGA 418 is a spectral image generation unit that generates a plurality of types of spectral images based on electrical information that is spectral information output from the light receiving element array 416.
- the spectral reflectance calculation unit 420 includes a semiconductor element such as a CPU, a ROM, and a RAM, and calculates a spectral reflectance for each pixel from the spectral image generated by the FPGA 418.
- the output from the multispectral camera device 113 is a plurality of types of spectral images generated by the FPGA 418 and the spectral reflectance of each pixel of those spectral images. These pieces of information are transmitted to the control device 118 of the agricultural machine 100, the server 704, the user terminals 710 and 712, the control unit of the state monitoring device 550, and the like.
- a plurality of LEDs 404, serving as light sources, are embedded at equal intervals in the distal end portion of the lens barrel 402. Using LEDs as the light source makes the device less susceptible to the shooting environment, so stable spectral information can be obtained.
- the main lens 408 is a lens that guides reflected light from the object 406 to the filter 410 via the diaphragm 409.
- the stop 409 is a shield used to adjust the amount of light passing therethrough.
- the filter 410 has a spatially continuous change in spectral transmittance. That is, the filter 410 has a plurality of spectral characteristics. The directionality of the continuity of the spectral transmittance of the filter 410 is not limited as long as it is within one plane.
- Within the plane perpendicular to the optical axis of the main lens 408, the continuity may run in the vertical direction of the right diagram of FIG. 18, in the direction perpendicular thereto, or in an obliquely intersecting direction.
- the condenser lens 412 is a lens for guiding the light that has passed through the filter 410 to the microlens array 414.
- the light beam incident on the main lens 408 is a target for spectral reflectance measurement.
- the light beam incident on the main lens 408 is a collection of innumerable light beams, and each light beam passes through a different position of the stop 409.
- the reflected light collected by the main lens 408 is incident on the filter 410 after the amount of light passing through the diaphragm 409 is adjusted.
- In FIG. 18, the diaphragm 409 is located on the filter 410, but the arrangement is not limited thereto.
- Each light beam incident on the filter 410 passes through a filter having a different spectral transmittance.
- the light beam that has passed through the filter 410 is collected by the condenser lens 412 and once forms an image near the microlens array 414.
- the microlens array 414 is installed so that a plurality of microlenses (small lenses) are arranged in a direction orthogonal to the optical axis of the main lens 408.
- Each light beam that has once formed an image reaches a different position on the light receiving element array 416 via the microlens array 414. That is, since the position on the light receiving surface of the light receiving element array corresponds to the position on the filter 410 through which the light beam passed, the spectral reflectance at a certain point on the object 406 can be measured simultaneously.
- FIG. 19 is a front view of the filter 410 and the diaphragm 409 used in this embodiment.
- the filter 410 is arranged such that its lower part has a spectral transmittance peak at short wavelengths and its upper part at long wavelengths.
- the photographed image is an array of small circles as shown in FIG.
- the reason for forming a circle is that the shape of the stop 409 of the main lens 408 is a circle.
- Each small circle is referred to herein as a “macro pixel”.
- Each macro pixel is formed immediately below each small lens (micro lens) constituting the micro lens array 414. The diameter of the macro pixel and the diameter of the microlens are almost the same.
- the light beam that has passed through the lower portion of the filter 410 reaches the upper portion of the macro pixel, and the light beam that has passed through the upper portion of the filter 410 reaches the lower portion.
- the filter 410 is arranged so that the lower part of the filter 410 has a short wavelength and the upper part has a spectral transmittance of a long wavelength, a short wavelength light beam reaches the upper part of the macro pixel and a long wavelength light ray reaches the lower part.
- the FPGA 418 generates a spectral image from spectral information from a pixel to which a light beam for each wavelength reaches. Thereby, a plurality of spectral images for a desired wavelength are obtained.
- the spectral reflectance calculation unit 420 calculates an average value for each row of each macro pixel, and obtains the spectral reflectance by a calculation that takes into account the spectral intensity of the illumination such as the LEDs 404, the spectral transmittance of the main lens 408 and the condenser lens 412, the spectral transmittance of the filter 410, and the spectral sensitivity of the light receiving element array 416.
- FIG. 21 shows an enlarged view of a macro pixel.
- a case where one macro pixel is 19 × 19 pixels is considered.
- the spectral reflectance at a certain point of the object 406 is obtained from this one macro pixel.
- Data obtained from the multispectral camera device 113 is an output value from the light receiving element, and the output value corresponds to the amount of light incident on the light receiving element.
- the amount of light is determined by the spectral intensity of the illumination such as the LEDs 404, the spectral reflectance of the object 406, the spectral transmittance of the optical system (the main lens 408, the condenser lens 412, and so on), the spectral transmittance of the filter 410, and the spectral sensitivity of the light receiving element array 416.
- As the output value here, a value obtained by dividing the sum of the output values of the 19 pixels in the bottom row of FIG. 21 by the area over which the macro pixel is formed is used. The area over which the macro pixel is formed is the area reached by light rays, that is, everything other than the area filled in black in FIG. 21. This normalizes the output value of each row.
- By the above procedure, the relative value of the reflectance at wavelength λ can be obtained; absolute values must be calibrated separately.
- the spectral intensity of illumination such as the LED 404, the spectral transmittance of the main lens 408 and the condenser lens 412, the spectral transmittance of the filter 410, the spectral sensitivity of the light receiving element array 416, and the area of each row of macro pixels are known at the time of design.
- the reflectance at 19 wavelengths can be obtained by applying the above processing to each row of macro pixels.
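The per-row processing above can be sketched as follows; the sample outputs, row areas, and system response are illustrative, not calibration data for the multispectral camera device 113.

```python
# Sketch of the macro-pixel processing: for each row (= each wavelength), sum
# the pixel outputs, normalize by the row area actually reached by light, and
# divide out the known spectral response of the system (illumination x optics
# x filter x sensor) to obtain a relative spectral reflectance.

def relative_reflectance(row_outputs, row_areas, system_response):
    """One value per row, as in the 19-wavelength example in the text."""
    result = []
    for outputs, area, response in zip(row_outputs, row_areas, system_response):
        normalized = sum(outputs) / area      # normalize the row output
        result.append(normalized / response)  # remove the system response
    return result

row_outputs = [[10, 12, 11], [20, 22, 21], [30, 31, 32]]  # 3 rows of 3 pixels
row_areas = [3.0, 3.0, 3.0]
system_response = [1.0, 2.0, 3.0]  # illumination * optics * filter * sensor
values = relative_reflectance(row_outputs, row_areas, system_response)
print([round(v, 2) for v in values])  # -> [11.0, 10.5, 10.33]
```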
- the horizontal axis is the wavelength
- the vertical axis is the relative value of the spectral reflectance.
- the above is the process for one macro pixel.
- the filter 410, which enables two-dimensional spectral reflectance measurement, can be produced by vapor-depositing a thin film on a transparent substrate such as optical glass so that the thickness changes in a wedge shape.
- In this embodiment, the thin-film material on the long wavelength side is niobium pentoxide, and the material on the short wavelength side is tantalum pentoxide.
- the thickness of the thin film is several tens to several hundreds nm. A thinner film thickness corresponds to a short wavelength and a thicker film corresponds to a long wavelength. Since the thickness of the thin film changes in a wedge shape (stepless), the spectral transmittance also changes continuously.
- the condition where transmitted light is strengthened corresponds to the peak wavelength of the spectral transmittance.
- the thickness of the transparent substrate only needs to be sufficient to hold the filter; since some lenses are designed with the lens close to the aperture position, it is, for example, about 0.5 mm.
- the filter 430 in FIG. 23A has a configuration divided for each transmission band. That is, the filter 430 includes a filter 430a corresponding to the wavelength range from 400 nm to 500 nm, a filter 430b corresponding to the wavelength range from 500 nm to 600 nm, and a filter 430c corresponding to the wavelength range from 600 nm to 700 nm. A filter whose spectral transmittance continuously changes in the ultraviolet region or the infrared region may also be used.
- Each of the filters 430a, 430b, and 430c is a filter whose spectral transmittance changes spatially and continuously.
- the wavelengths increase from the top to the bottom in the figure.
- the orientations of the filters 430a, 430b, and 430c in the longitudinal direction need not be unified. In short, it is sufficient that there is a region where the spectral transmittance continuously changes, and the directionality is not relevant.
- The filters 430a, 430b, and 430c are not limited to the above configuration; it suffices that their wavelength ranges differ from one another at least in part.
- Each transmission band is an example, and is not limited to these values.
- the shape of the aperture 409 may be formed into a polygon such as a square or other desired shape.
- FIG. 24 shows a typical spectral reflectance spectrum for a plant leaf.
- a solid line 2401 is a spectrum of normal leaves (having high plant activity), and a broken line 2402 is a spectrum of dead leaves (having low plant activity).
- In the visible red region (and shorter wavelengths) 2404 near 660 nm, the reflectance is low because of absorption by chlorophyll, a component of chloroplasts.
- the reflectance is high from 700 nm to 1100 nm in the near infrared region 2405.
- From the reflectance R in the visible red region and the reflectance IR in the near infrared region, the normalized vegetation index NDVI = (IR − R) / (IR + R) (Equation 6) is obtained as an index of plant activity.
- the normalized vegetation index NDVI can be obtained over the entire imaging region. That is, like the filter 440 of FIG. 23B, a filter 440a corresponding to the 660 nm wavelength region in the visible red region 2404 and a filter 440b corresponding to the 770 nm wavelength region in the near infrared region 2405 are adopted as the filter of the multispectral camera device 113 of this embodiment. Note that, for the near infrared region 2405, a filter corresponding to a wavelength region of 785 nm or 900 nm may be employed as the filter 440b.
- 785 nm is a wavelength that can be easily obtained with a laser diode (LD).
- Half of the LEDs 404 irradiate light with high intensity around a wavelength of 660 nm, and the other half irradiate high-intensity light around a wavelength of 770 nm.
- the target plant is irradiated with LED light by the multispectral camera device 113, and reflected light is imaged. Then, a spectral image at a wavelength of 660 nm and a spectral image at a wavelength of 770 nm are obtained with the FPGA 418.
- the spectral reflectance calculation unit 420 obtains the spectral reflectance at a desired position or region in those spectral images. Further, the CPU in the spectral reflectance calculation unit 420 applies (Equation 6) to obtain the normalized vegetation index NDVI. Note that, instead of within the multispectral camera device 113, the control device 118 of the agricultural machine 100 or the server 704, having acquired the spectral image and spectral reflectance information, may apply (Equation 6) to obtain the normalized vegetation index NDVI. The normalized vegetation index NDVI for each crop is sent to the database 708 and accumulated.
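The NDVI calculation of (Equation 6) can be sketched as follows, using the standard definition from the red and near-infrared reflectances; the sample reflectance values are illustrative.

```python
# Sketch of (Equation 6): the normalized vegetation index computed from the
# spectral reflectance at 660 nm (visible red) and 770 nm (near infrared).

def ndvi(r_nir: float, r_red: float) -> float:
    """NDVI = (IR - R) / (IR + R)  -- (Equation 6)."""
    return (r_nir - r_red) / (r_nir + r_red)

# A highly active leaf reflects little red light and much near-infrared
# light, so its NDVI approaches 1; a dead leaf has a much lower NDVI.
print(round(ndvi(0.50, 0.05), 2))  # active leaf -> 0.82
print(round(ndvi(0.40, 0.30), 2))  # dead leaf   -> 0.14
```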
- the harvest time can be accurately predicted by daily observation of the normalized vegetation index NDVI.
- It is not necessarily best to harvest when the normalized vegetation index NDVI is at its maximum (when plant activity is highest). Since the maximum value of the normalized vegetation index NDVI and the date on which it is reached differ for each crop, the range of the normalized vegetation index NDVI at which harvesting is desired is determined for each plant. This determination can be performed by the server 704 or the user terminals 710 and 712 using the normalized vegetation index NDVI data stored in the database 708.
- For example, if the range of the normalized vegetation index NDVI at which to harvest is determined to be 0.5 to 0.55 from the degree of variation, crops whose normalized vegetation index NDVI obtained by the multispectral camera device 113 or the like falls within that range may be harvested.
- the harvest time can be predicted by statistically obtaining the trend of daily change of the normalized vegetation index NDVI for each crop from the accumulated data.
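The harvest decision described above can be sketched as follows. The 0.50 to 0.55 window is the example range from the text; in practice it would be determined per crop from the accumulated data in the database 708, and the trend extrapolation here is a deliberately simple (assumed) linear model:

```python
def ready_to_harvest(ndvi_value: float, low: float = 0.50, high: float = 0.55) -> bool:
    """Harvest-window test; 0.50-0.55 is the example range from the text."""
    return low <= ndvi_value <= high

def days_until_window(history: list, low: float = 0.50, rate_days: int = 7) -> float:
    """Crude trend extrapolation from the last `rate_days` of daily NDVI
    observations, assuming a locally linear daily change (an assumption,
    not the patent's statistical method)."""
    recent = history[-rate_days:]
    daily_change = (recent[-1] - recent[0]) / (len(recent) - 1)
    if daily_change <= 0 or recent[-1] >= low:
        return 0.0
    return (low - recent[-1]) / daily_change

print(ready_to_harvest(0.52))  # True
```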
- the quality (sugar content) of a product (fruit) or the like can be determined from the color.
- in this case, the filter 430 of FIG. 23A, divided into transmission bands of 400 nm to 500 nm (430a), 500 nm to 600 nm (430b), and 600 nm to 700 nm (430c), is used together with a color sensor in which RGB color filters are arranged in a Bayer array over the light-receiving elements (pixels).
- This RGB color filter has a spectrum peak (maximum value) near 470 nm for B (blue), near 540 nm for G (green), and near 620 nm for R (red).
- the spectral characteristics of the filters (430a, 430b, and 430c) constituting the filter 430 and those of the RGB filters in the color sensor are different.
- light is transmitted only where the spectral transmission regions of the paired filters overlap. Therefore, in the present embodiment, substantially six types of spectral information are acquired. With six types of spectral information, the natural spectrum can be measured with high accuracy, and the captured color can be accurately recognized.
- This multispectral camera device constitutes a colorimetric camera device capable of measuring visible light with high accuracy. For example, for a fruit whose sugar content increases as it ripens and becomes red like a certain kind of strawberry, the spectral reflectance in the visible red region in the spectral image of the whole fruit can be obtained by the multispectral camera device (colorimetric camera device) 113. Therefore, the sugar content can be evaluated.
- the spectral reflectance in the near infrared region can be measured by the multispectral camera device 113, and the sugar content can be evaluated from the spectrum distribution.
- the amount of water contained in the green leaves of the plant can be measured in a non-destructive, non-contact manner. When a plant is deficient in water, the plant is under water stress and the spectral characteristics of the green-leaf surface change; the amount of water is measured by capturing this change. As shown in FIG. 24, there is a region (the red edge) where the reflectance rises sharply from the visible red region to the near-infrared region. It is known that when the plant is under water stress, this region shifts toward shorter wavelengths, i.e., toward the blue (left) side (a blue shift). The dotted line 2403 in FIG. 24 shows the blue shift under water stress.
- the multispectral camera device 113 measures the reflectance at a plurality of wavelengths in a region where the reflectance suddenly increases from the visible red region to the near infrared region.
- a spectral filter that can handle the wavelength range is provided.
- the spectral filter may be one whose transmission changes continuously from the visible red region to the near-infrared region, like the filter 410, or one that selects and transmits desired wavelengths (for example, 715 nm and 740 nm).
- water stress can be detected by measuring the reflectance at desired wavelengths in the region where the reflectance rises sharply from the visible red region to the near-infrared region, and comparing it with a reference reflectance (for example, the spectral reflectance at each wavelength in a state where no water stress is applied).
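The comparison just described can be sketched as follows. The wavelengths (715 nm and 740 nm) are the examples from the text; the reference reflectances and the detection threshold are illustrative assumptions:

```python
# Sketch of water-stress detection on the red edge. A blue shift moves
# the rising edge to shorter wavelengths, so the reflectance measured at
# these fixed wavelengths falls below the unstressed reference.
REFERENCE = {715: 0.30, 740: 0.45}  # hypothetical unstressed reflectances

def water_stressed(measured: dict, threshold: float = 0.05) -> bool:
    """Compare measured reflectances against the reference; the
    threshold value is an illustrative assumption."""
    drops = [REFERENCE[wl] - r for wl, r in measured.items()]
    return max(drops) > threshold

print(water_stressed({715: 0.22, 740: 0.41}))  # True: 715 nm dropped by 0.08
```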
- as the LED 404, an LED that can output light of a desired wavelength in the region where the reflectance rises sharply from the visible red region to the near-infrared region may be installed and used; alternatively, instead of irradiation by the LED 404, the reflectance may be measured using sunlight.
- in that case, the spectral reflectance at the plurality of wavelengths is obtained by comparing the sunlight reflected by the plants with the sunlight reflected by a standard white plate installed in the field or on the agricultural machine 100.
- the spectral reflectance to be measured is not limited to two wavelengths; three or more wavelengths may be measured in order to improve accuracy.
- with the multispectral camera device 113, it is possible to perform non-destructive, non-contact, and quick measurement of the plant to be measured.
- the laser radar device 112 may irradiate the plant with laser light having a desired wavelength, and the multispectral camera device 113 may capture the reflected light.
- the laser radar device 112 can measure the distance to the measurement position. For this reason, for example, the length of the stem between the branches and the size of the leaf are specified or estimated from the spectral image captured by the multispectral camera device 113 and the distance information detected by the laser radar device 112. In the present embodiment, this identification (or estimation) process is performed by the server 704.
- the server 704 recognizes leaves, branches, stems, and the like in the spectral image by a recognition process described later.
- as for the length between branches, for example, if the length is 1000 pixels at a distance of 50 cm, it is specified (or estimated) as about 5.3 cm. Alternatively, if a leaf occupies 230,000 pixels at a distance of 50 cm, its area is specified (or estimated) as 100 square centimeters.
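The pixel-to-size conversion above can be sketched with a pinhole-camera model. The focal length in pixels is not given in the text; the value below is a hypothetical one chosen so that 1000 pixels at 50 cm corresponds to about 5.3 cm, matching the text's example:

```python
# Sketch of size estimation combining pixel extent from the spectral
# image with distance measured by the laser radar device 112.
FOCAL_PX = 9434.0  # hypothetical focal length of the camera, in pixels

def length_cm(pixels: float, distance_cm: float) -> float:
    """Physical extent = pixel extent * distance / focal length (pinhole model)."""
    return pixels * distance_cm / FOCAL_PX

def area_cm2(pixel_count: float, distance_cm: float) -> float:
    """Each pixel subtends (distance / focal)^2 of area on the object plane."""
    return pixel_count * (distance_cm / FOCAL_PX) ** 2

print(round(length_cm(1000, 50), 1))  # 5.3, matching the text's example
```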
- two multispectral camera devices (colorimetric camera devices) 113 may be combined to measure distance based on the same principle as the stereo camera device 110 described above. Thereby, an image of the object, spectral information, and distance information (parallax value information) can be acquired in a single imaging operation.
- the multispectral camera device 113 measures the near-infrared spectral reflectance of the soil, and the state of the soil can be grasped using differences in the absorption spectra of the nutrients (nitrogen, phosphoric acid, potassium) necessary for growing the crop.
- the overall system 1500 can adjust the balance of fertilizer and the like according to the grasped soil condition, and can efficiently manage the soil finely.
- FIG. 25 shows a state monitoring device 550 using this multispectral camera device 113.
- the state monitoring device 550 is a device for quickly measuring the activity of the crops, soil, and the like in a wide field.
- the state monitoring device 550 includes the multispectral camera device 113; a holding unit 450 that holds the multispectral camera device 113 rotatably about a horizontal axis; a rotation stage 452 that holds the holding unit 450 rotatably about a vertical axis; a storage unit 454A storing a storage battery and a control unit that performs input/output control and communication control of information from the multispectral camera device 113 as well as rotation control of the holding unit 450 and the rotation stage 452; a wireless antenna 458 through which the control unit in the storage unit 454A communicates wirelessly with the agricultural machine 100, the server 704, and the user terminals 710 and 712; a transparent cover 462, made of glass or the like, that protects the multispectral camera device 113 from the surrounding environment; and a pillar 460 that supports the state monitoring device 550 at a high position.
- the state monitoring device 550 operates using electrical energy stored in the storage battery.
- the cover 462 need not be made of glass as long as it is transparent; it may be made of a resin such as acrylic.
- a solar panel 456 is installed above the cover 462, and a storage unit 454A is installed below the cover 462.
- a multispectral camera device 113, a holding unit 450, and a rotation stage 452 are installed in the cover 462.
- the state monitoring device 550 images crops in the surrounding area based on information transmitted from the user terminals 710 and 712, the server 704, and the agricultural machine 100, and checks the plant activity of these crops.
- sunlight reflected by the crops may be imaged without using the LED 404.
- the control unit and the wireless antenna 458 also function as a wireless access point and can perform wireless relay. Thereby, the wireless communication area can be expanded.
- the control unit transmits a signal for specifying the position of the agricultural machine 100 via the wireless antenna 458 according to an instruction from any of the agricultural machine 100, the server 704, and the user terminals 710 and 712.
- the agricultural machine 100 can specify its current position from the intensity (or attenuation) or reception-time differences of signals transmitted from a total of three or more state monitoring devices 550 or field monitoring devices 500.
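The position fix from three monitoring devices can be sketched as trilateration, assuming the distances have already been derived from the signal attenuation or reception-time differences (that conversion is not shown). The circle equations are linearized into a 2x2 system:

```python
import math

def trilaterate(anchors, distances):
    """Estimate (x, y) from three anchor positions and distances to them
    by subtracting the first circle equation from the other two."""
    (x1, y1), d1 = anchors[0], distances[0]
    (x2, y2), d2 = anchors[1], distances[1]
    (x3, y3), d3 = anchors[2], distances[2]
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21  # zero only if the anchors are collinear
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)

# Hypothetical monitoring-device positions (in meters) and a test point.
anchors = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
true_pos = (30.0, 40.0)
dists = [math.hypot(true_pos[0] - ax, true_pos[1] - ay) for ax, ay in anchors]
print(trilaterate(anchors, dists))  # ≈ (30.0, 40.0)
```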
- an angled reflector may be installed inside or outside the cover 462 in the upper part of the state monitoring device 550. Thereby, the downward region that would otherwise be a blind spot due to the rotation stage 452 or the storage unit 454A can also be monitored.
- the state monitoring device 550 may be used as a monitoring device for monitoring an object (for example, soil) having a characteristic in which the spectral reflectance varies depending on the wavelength, in addition to the purpose of monitoring the state of the crop in the field.
- leaves themselves, leaf surfaces, and the like may be discolored by the effects of pests, frost, and so on; the state monitoring device 550 can also detect a plant or region in which such discoloration has occurred.
- FIG. 26 shows a field monitoring device 500 using the omnidirectional camera device 501.
- the omnidirectional camera device 501 is an example of a sensor.
- the omnidirectional camera device 501 can image 360 degrees around the camera in a single imaging operation; by installing it in the field, it is possible to monitor not only the field but also, for example, the weather from the sky image. Moreover, with this field monitoring device 500, the amount of sunshine and the like can be evaluated over a wide area.
- structures denoted by the same reference numerals as in FIG. 25 perform the same functions as those described with reference to FIG. 25. Reference numeral 454B is a storage unit that, like that of the state monitoring device 550, stores a storage battery and a control unit; this control unit differs from that of the state monitoring device 550 in that it performs input/output control of information from the omnidirectional camera device 501 instead of the multispectral camera device 113, and performs no rotation control.
- an angled reflector may be installed inside or outside the cover 462 in the upper part of the field monitoring device 500, facing the optical systems A and B of the omnidirectional camera device 501. Thereby, the downward region that would otherwise be a blind spot due to the storage unit 454B and the like can also be monitored.
- the field monitoring device 500 using the omnidirectional camera device 501 can be used as, for example, a monitoring camera device in addition to the purpose of monitoring the field.
- FIG. 27 is a front external view of the omnidirectional camera device 501.
- This camera has two optical systems A and B having a fisheye (wide angle) lens and a main body 502.
- FIG. 28 is a diagram showing an optical system of the omnidirectional camera apparatus 501.
- portions indicated by reference signs A and B indicate an imaging optical system.
- Each of the two imaging optical systems A and B includes a wide-angle lens having an angle of view wider than 180 degrees and an imaging element IA or IB that captures the image formed by the wide-angle lens. The imaging optical system A is composed of a front group of lenses LA1 to LA3, a right-angle prism PA forming a reflecting surface, and a rear group of lenses LA4 to LA7.
- An aperture stop SA is disposed on the object side of the lens LA4.
- the imaging optical system B includes a front group composed of lenses LB1 to LB3, a right-angle prism PB that constitutes a reflecting surface, and a rear group composed of lenses LB4 to LB7.
- An aperture stop SB is disposed on the object side of the lens LB4.
- the lenses LA1 to LA3 constituting the front group of the imaging optical system A are, in order from the object side, a negative meniscus lens (LA1) made of a glass material, a negative lens (LA2) made of a plastic material, and a negative meniscus lens (LA3) made of a glass material.
- the lenses LA4 to LA7 constituting the rear group are, in order from the object side, a biconvex lens (LA4) made of a glass material, a cemented lens of a biconvex lens (LA5) and a biconcave lens (LA6) made of a glass material, and a biconvex lens (LA7) made of a plastic material.
- the lenses LB1 to LB3 constituting the front group of the imaging optical system B are, in order from the object side, a negative meniscus lens (LB1) made of a glass material, a negative lens (LB2) made of a plastic material, and a negative meniscus lens (LB3) made of a glass material.
- the lenses LB4 to LB7 constituting the rear group are, in order from the object side, a biconvex lens (LB4) made of a glass material, a cemented lens of a biconvex lens (LB5) and a biconcave lens (LB6) made of a glass material, and a biconvex lens (LB7) made of a plastic material.
- the negative lenses LA2 and LB2 made of plastic in the front groups and the biconvex lenses LA7 and LB7 made of plastic in the rear groups are both aspherical; the other lenses, made of glass, are spherical lenses.
- the position of the front principal point in each wide-angle lens is set between the second lenses LA2 and LB2 and the third lenses LA3 and LB3.
- the distance between the intersection between the optical axis of the front group and the reflecting surface and the front principal point is d1 in FIG.
- under the conditions, the distance d is reduced. As d decreases, the distance between the lens LA3 (LB3) and the prism PA (PB) becomes narrow, and the restriction on the lens thickness needed to secure the refractive power required of the lens LA3 (LB3) becomes severe. If the lower limit of condition (1) is not met, the lens LA3 (LB3) cannot be processed to the desired thickness and shape, or the processing becomes difficult.
- the imaging optical systems A and B are as close to each other as possible in the horizontal direction in the drawing in order to reduce the size of the omnidirectional camera device 501.
- an increase in the parameter d/f means that the distance d between the intersection of the optical axis of the front group with the reflecting surface and the front principal point becomes large, which in turn means that the front group becomes large.
- Such an increase in the size of the front group makes it difficult to reduce the size of the omnidirectional camera device 501.
- the imaging optical systems A and B are arranged in a state where the slopes of the prisms PA and PB are close to each other in FIG.
- Condition (3), nd > 1.8, stipulates that the material of the prisms PA and PB should have a refractive index with respect to the d-line, nd, greater than 1.8. Since the prisms PA and PB internally reflect the light from the front group toward the rear group, the optical path of the imaging light beam passes through the prism.
- because of the high refractive index, the optical path length in the prism becomes longer than its physical length, and the distance over which the light beam can be bent is increased.
- the optical path length between the front group and the rear group in the front group, the prism, and the rear group can be made longer than the mechanical optical path length, and the wide-angle lens can be configured compactly. Further, by arranging the prisms PA and PB near the aperture stops SA and SB, a small prism can be used, and the distance between the wide-angle lenses can be reduced.
- the prisms PA and PB are disposed between the front group and the rear group.
- the front group of the wide-angle lens has a function of taking in light rays having a wide angle of view of 180 degrees or more, and the rear group effectively functions for image formation for aberration correction.
- By arranging the prisms as described above, the system is less susceptible to prism misalignment and manufacturing tolerances.
- the omnidirectional camera device 501 includes the imaging optical systems A and B, the imaging elements IA and IB, an image processing unit 504, an imaging control unit 506, a CPU 510, a ROM 512, a static random access memory (SRAM) 514, a dynamic random access memory (DRAM) 516, an operation unit 518, a network I/F 520, and a communication unit 522.
- each of the imaging elements IA and IB includes an image sensor, such as a CMOS or CCD sensor, that converts the optical image formed by the wide-angle lens into electrical image data and outputs it; a timing generation circuit that generates the horizontal and vertical synchronization signals, the pixel clock, and the like of this image sensor; and a register group in which various commands and parameters necessary for the operation of the image sensor are set.
- the image sensors IA and IB are each connected to the image processing unit 504 via a parallel I / F bus. Further, the image pickup devices IA and IB are connected to the image pickup control unit 506 through a serial I / F bus (I2C bus or the like).
- the image processing unit 504 and the imaging control unit 506 are connected to the CPU 510 via the bus 508. Further, ROM 512, SRAM 514, DRAM 516, operation unit 518, network I / F 520, and communication unit 522 are also connected to the bus 508.
- the image processing unit 504 captures the image data output from the image sensors IA and IB through the parallel I / F bus, performs predetermined processing on the respective image data, and then combines these image data. The data of the equirectangular image as shown in FIG. 30C is created.
- the imaging control unit 506 generally acts as a master device, with the imaging elements IA and IB as slave devices, and sets commands in the register groups of the imaging elements IA and IB using a serial I/F bus such as an I2C bus. The necessary commands and the like are received from the CPU 510.
- the imaging control unit 506 also takes in the status data of the register groups of the imaging elements IA and IB using the serial I / F bus and sends them to the CPU 510. Further, the imaging control unit 506 instructs the imaging devices IA and IB to output image data at the timing when the shutter button of the operation unit 518 is pressed. In the field monitoring device 500, the operation unit 518 is omitted, and an image is taken based on an instruction from the control unit stored in the storage unit 454 connected to the network I / F 520.
- the imaging control unit 506 also functions as a synchronization control unit that synchronizes the output timing of the image data of the imaging elements IA and IB in cooperation with the CPU 510.
- the CPU 510 controls the entire operation of the omnidirectional camera device 501 and executes necessary processes.
- the ROM 512 stores various programs for the CPU 510.
- the SRAM 514 and the DRAM 516 are work memories, and store programs executed by the CPU 510, data being processed, and the like.
- the DRAM 516 stores image data being processed by the image processing unit 504 and processed equirectangular image data.
- the operation unit 518 is a general term for various operation buttons, a power switch, a shutter button, a touch panel having both display and operation functions, and the like. The user can input various shooting modes and shooting conditions by operating the operation buttons.
- the network I / F 520 is a general term for an interface circuit (USB I / F, etc.) with an external medium such as an SD card or a USB memory, or a personal computer. Further, the network I / F 520 may be a network interface regardless of wireless or wired.
- the equirectangular image data stored in the DRAM 516 is transmitted to the control unit of the storage unit 454 via the network I/F 520.
- the communication unit 522 has a short-range wireless technology.
- the control unit in the storage unit 454 may have a communication function and communicate with the communication unit 522, but the communication unit 522 can be omitted when the omnidirectional camera device 501 is used in the field monitoring device 500.
- FIG. 30A is a hemispheric image (front side) photographed by the omnidirectional camera device 501
- FIG. 30B is a hemispheric image (rear side) photographed by the omnidirectional camera device 501
- FIG. 30C is an image represented by equirectangular projection (referred to as an "equirectangular image").
- FIG. 30 shows an example in which a building is imaged. As shown in FIG. 30A, the image obtained by the imaging element IA is a hemispherical image (front side) curved by the imaging optical system A. As shown in FIG. 30B, the image obtained by the imaging element IB is a hemispherical image (rear side) curved by the imaging optical system B.
- the hemispherical image (front side) and the hemispherical image (rear side), inverted by 180 degrees, are combined by the image processing unit 504 of the omnidirectional camera device 501, and an equirectangular image as shown in FIG. 30C is created.
- in the image processing unit 504, first, the connection position is detected: the shift amount between a reference image and a comparison image is calculated for each area by pattern-matching processing. Next, distortion correction is performed by geometric transformation: the connection-position detection result is converted into the omnidirectional image format, taking lens characteristics into account. Finally, the two images are blended to generate a single spherical image.
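The geometric-transformation step can be illustrated by the inverse mapping from an output equirectangular pixel to a source fisheye pixel. This sketch assumes an equidistant fisheye model (r = f·θ) and illustrative image sizes; the actual conversion also incorporates the measured lens characteristics:

```python
import math

W, H = 1024, 512   # output equirectangular size (illustrative)
FISHEYE_R = 256    # fisheye radius in pixels at 90 degrees off-axis (assumed)

def equirect_to_fisheye(px, py):
    """Map an equirectangular pixel to (camera, u, v) fisheye coordinates."""
    lon = (px / W) * 2 * math.pi - math.pi      # longitude, -pi..pi
    lat = math.pi / 2 - (py / H) * math.pi      # latitude, +pi/2..-pi/2
    # Unit direction vector; camera A looks along +x, camera B along -x.
    x = math.cos(lat) * math.cos(lon)
    y = math.cos(lat) * math.sin(lon)
    z = math.sin(lat)
    cam = 'A' if x >= 0 else 'B'
    xc = x if cam == 'A' else -x
    theta = math.acos(max(-1.0, min(1.0, xc)))  # angle from the optical axis
    r = FISHEYE_R * theta / (math.pi / 2)       # equidistant projection r = f*theta
    phi = math.atan2(z, y)
    return cam, r * math.cos(phi), r * math.sin(phi)

print(equirect_to_fisheye(W // 2, H // 2))  # ('A', 0.0, 0.0): image center on axis A
```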
- a part of the omnidirectional camera device 501 of the field monitoring devices 500 and 555 installed in the field may be a night vision camera for nighttime monitoring.
- in that case, high-sensitivity light-receiving elements are used as the imaging elements IA and IB, the field is illuminated with near-infrared light, and the reflected light is imaged to obtain a monochrome-mode image.
- the omnidirectional camera device 501 may include a polarizing filter (such as an SWS polarizing filter) disposed on the light-receiving side of the imaging elements IA and IB, and detect images by S-polarized and P-polarized light. In this case, the omnidirectional camera device 501 can also acquire high-contrast images. This improves the detectability of subjects that are difficult to detect with a non-polarization camera device: subjects whose polarization state varies with surface direction (such as black subjects), or subjects whose transmittance changes according to the polarization state of the light (such as transparent subjects).
- FIG. 31 shows another example of the field monitoring device.
- in the field monitoring device 555, the solar panel 456 and the wireless antenna 458 are supported above the pillar 470 without contacting the transparent cover 462.
- Other configurations are the same as those of the field monitoring device 500. In this way, the solar panel does not get in the way even when an image of a slightly upward view is desired.
- a state monitoring device may also be configured by replacing the omnidirectional camera device 501 of FIG. 31 with the multispectral camera device 113, the holding unit 450, and the rotation stage 452 shown in FIG. 25, together with a controller for controlling them.
- an angled reflector may be installed inside or outside the upper part of the cover 462. Thereby, the downward region that would otherwise be a blind spot due to the storage unit 454B and the like can also be monitored.
- a plurality of the field monitoring devices 500 and 555 and the state monitoring devices 550 are installed in the field. However, when the field is small enough that a single device can monitor it, only one may be used.
- the farm field monitoring devices 500 and 555 and the state monitoring device 550 are examples of sensors.
- the operation of the overall system 1500 in this embodiment will be described with reference to FIGS.
- the operation of the overall system 1500 is such that the agricultural machine 100, the server 704, the user terminals 710 and 712, the other field monitoring devices 500 and 555, the state monitoring device 550, the databases 706 and 708, and the like operate in cooperation.
- in this operation, the agricultural machine 100 is neither boarded nor operated manually; that is, the agricultural machine 100 operates and works automatically.
- the operations shown in the flowcharts of these drawings are representative. Other operations and detailed operations have already been described in text, or will be described in the future.
- precisely speaking, the operations of the user terminals 710 and 712 are performed by their control processors or CPUs in accordance with programs and/or user instructions, but for simplicity of explanation they are described as operations performed collectively by the user terminals 710 and 712. Likewise, the operations of the other devices (such as those denoted by reference numerals 110, 112, 113, 500, 550, and 555) and of the databases 706 and 708, whether already described or described later, are precisely operations performed by the control processor or CPU in accordance with a program stored in the device or database; for simplicity of explanation, they are described as operations performed by those devices and databases.
- FIGS. 32 and 33 are flowcharts for explaining the initial settings performed by the server 704, the user terminals 710 and 712, and the agricultural machine 100 in order for the agricultural machine 100 to move and work in the field. The description will be made with reference to these drawings. Basically, operations performed by the agricultural machine 100 are shown on the left, operations performed by the server 704 in the center, and operations performed by the user terminals 710 and 712 on the right, although some figures show the operations of only one or two of them.
- when initial setting is started (S100), the server 704 inquires of the user terminal to transmit data for identifying the field, that is, position information (longitude and latitude, and altitude if possible) or/and the shape of the field (S102). Note that the start of the initial setting operation in step S100 is executed by an instruction from the user terminal.
- the user terminal 710 or 712 transmits to the server 704 the data necessary for specifying the field, input in a form answering the inquiry (position information of the corners and edges of the field (longitude and latitude, and altitude if possible) or/and the shape of the field, etc.) (S104).
- map information possessed by the server 704, the database 706, or the Internet or another external system may be acquired, and the data may be input using the map information. For example, a map in which position information such as latitude/longitude (and altitude) is associated with each point is displayed on the screen of the user terminal 710 or 712, the field is specified by the user surrounding or tracing it on the map, and the position information obtained from the designated area may then be transmitted to the server 704.
- the data for specifying the field may be set in advance by the provider of the overall system 1500.
- the server 704 receives the information for specifying the field sent from the user terminal 710 or 712, specifies the field where work can be performed, attaches identification information such as a name, and stores it in the SSD in the server 704 (S106). The field-specifying information is also stored in the database 708 together with the identification information. With this stored information, work can be performed in the same field in the future without further input from the user.
- the server 704 makes an inquiry to the user terminals 710 and 712 so as to transmit information for specifying a place to perform work in the field (S108).
- the user terminal 710 or 712 transmits to the server 704 the data necessary for specifying the work location, input in the form of answering the inquiry (position information of the corners and edges of the work area (longitude and latitude, and altitude if possible), or/and the work-area shape information, work start position, work end position, headland, etc.) (S110).
- map information included in the server 704 the database 706, or the Internet or other external system may be acquired, and data input may be performed using the map information.
- for example, a map showing at least the field, with position information such as latitude/longitude (and altitude) associated with each point, is displayed on the screen of the user terminal 710 or 712, and the user specifies the work location by tracing from the work start position to the work end position on the map, or by designating the work location and specifying its start and end positions. The position information obtained from the designated area may then be transmitted to the server 704.
- the server 704 receives the information for identifying the work location sent from the user terminal 710 or 712, identifies the location where the work is to be performed, attaches identification information such as a name, and stores it in the SSD within the server 704 (S112). Information for specifying the work location is also stored in the database 708 with identification information. If the work location information is stored, the user does not need to input it again when the same or a different work is performed at the same work location in the future. Note that the server 704 may specify a work location based on information from the field monitoring devices 500 and 555 and the state monitoring device 550.
- the server 704 inquires of the user terminal 710 or 712 about the type of work (cultivation, soil crushing, leveling, rice planting, fertilization, sowing, transplanting, harvesting, weeding, pesticide application/spraying, watering, mowing, etc.), the agricultural machine that performs the work, and the operation method (inner turning, outer turning, outer-winding plowing, inner-winding plowing, one-way plowing, sequential plowing, vertical plowing, diagonal plowing, etc.) or the operation route (S114).
- the user terminal 710 or 712 transmits to the server 704 the type of work input by the user, the agricultural machine performing the work, the operation method, or the operation route (S116). At this time, it is possible to change the type of work for each part in the operation route, or to specify that the work is not performed in a certain part.
- the operation route may also be specified by acquiring map information possessed by the server 704, the database 706, or the Internet or another external system, and inputting the data using that map information. For example, a map indicating at least the work location, with position information such as latitude/longitude (and altitude) associated with each point, is displayed on the screen of the user terminal 710 or 712, and the user specifies the operation route by tracing, in order, the route from the work start position to the work end position. Furthermore, it is possible to specify that the work is to be changed, or not performed, on part of the route.
- the server 704 receives the information sent from the user terminal 710 or 712 identifying the type of work, the agricultural machine that performs the work, and the operation method or operation route, identifies them, attaches identification information such as a name, and stores them in the SSD in the server 704 (S118). The information identifying them is also stored in the database 708 with its identification information.
- the server 704 collects the information specified in S106, S112, and S118 as work data, and transmits it to the user terminals 710 and 712 to confirm whether they are correct (S120). At this time, if the specified data is changed from the past, the data stored in the SSD and the database 708 is rewritten.
- the user terminal 710 or 712 returns to the server 704 a confirmation of whether the received work data is to be changed (S122).
- the server 704 determines whether there is any change in the work data based on the confirmation information transmitted from the user terminal 710 or 712 (S124).
- if there is a change, the server 704 prompts the user terminals 710 and 712 to input the data to be changed (S126).
- the user terminal 710 or 712 selects the items to be changed from among the data specifying the field, the data specifying the work location, the type of work, the agricultural machine, the operation method, and the data specifying the operation route, and transmits the changes to the server 704 (S128).
- the server 704 returns to the process of S120 and continues the subsequent processing.
- if the server 704 determines in the process of S124 that there is no change, the work data is transmitted to the agricultural machine 100 (S130).
- the agricultural machine 100 determines whether the recognized work device 106 can execute the type of work transmitted from the server 704 (S132).
- for example, when the work device 106 connected to the agricultural machine 100 is a fertilizer applicator but the work type sent from the server is sowing, the work device 106 cannot execute that work type. In such a case, a negative determination is made in S132, and the agricultural machine 100 transmits to the server 704 error information for prompting the user to change at least one of the work type and the work device to be connected, or to connect a work device (S134).
- upon receiving such error information from the agricultural machine 100, the server 704 prompts the user terminals 710 and 712 to change the work type, change the connected work device, or connect a work device (S136).
- the user terminals 710 and 712 receive this notification and change the work type or change or connect the work device to be connected (S138).
- the changed work type is transmitted to the server 704, and the process returns to step S130, where the work data including the changed work type information is sent back to the agricultural machine 100.
- the agricultural machine 100 determines whether the change or connection has been performed (S140).
- if the change or connection has not been performed, the process remains at step S140.
- during this time, the agricultural machine 100 may send, via the server 704, a notification calling the user's attention to the user terminal.
- once the change or connection has been performed, the process returns to step S132.
- in this way, appropriate work can be performed by the agricultural machine 100, and problems caused by work automation, such as sowing being performed where watering should be performed, can be prevented.
- if it is determined in step S132 that the work device connected to the agricultural machine 100 can execute the received type of work, a notification of completion of the initial setting is transmitted to the server 704 (S142).
- when the server 704 receives the initial setting completion notification, it registers the contents of the initial setting (the finally set work data) and the initial setting completion date and time in the SSD or the database 708. Further, a notification of completion of the initial setting is sent to the user terminals 710 and 712 (S144).
- the user terminals 710 and 712 receive the initial setting completion notification (S146) and end the initial setting work (S148).
- note that each of the processes of S102, S110, S116, S122, S128, and S130 may also be performed from the manual operation unit 116 of the agricultural machine 100.
- in that case, the server 704 makes its inquiries to the agricultural machine 100 as well.
- server 704 may change the order of inquiries in S104, S110, and S116, or may make an inquiry in combination or collectively.
- alternatively, the item to be changed, as described in S128, and the changed data may be sent to the server 704.
- the agricultural machine 100 may transmit a notification of completion of the initial setting to both the server 704 and the user terminals 710 and 712.
- FIG. 34 shows a rough operation from the start of the work to the completion of the work (movement to the storage position).
- steps S162, S170, and S180 are processes defined separately in the description using FIG. 35A and the like.
- the work start (S150) starts when the user terminal 710 or 712 transmits a work start instruction to the server 704 (S152).
- the agricultural machine 100 may start work with instructions from the field monitoring devices 500 and 555 and the state monitoring device 550 in the field.
- when the server 704 receives the work start instruction, it stores the information and the reception time (year/month/day/time) in the database 708, and instructs the initially set agricultural machine 100 to start work (S154).
- the agricultural machine 100 that has received the work start instruction first confirms its current position (latitude, longitude) (S156). If the agricultural machine 100 has performed work with the overall system 1500 in the past and has not moved since, this confirmation can be performed by acquiring the storage position information recorded in the database 708.
- position confirmation is performed by differential GPS (DGPS) positioning, which is one of relative positioning methods. This is a method of improving accuracy by correcting an error in a GPS measurement result by using an FM broadcast radio wave transmitted from a reference station whose position is known.
- the GPS measurement is performed at the reference station, and the difference between the actual position and the position calculated by the GPS is transmitted by terrestrial waves, thereby correcting the measurement result based on the signal from the satellite.
- ordinary GPS positioning receives GPS signals from four satellites, measures the distance to each satellite from the radio-wave propagation time given the known satellite positions, and obtains the latitude and longitude as the intersection of arcs equidistant from the satellites. That is, the code transmitted by each satellite is analyzed, the distance between the satellite and the agricultural machine 100 is determined from the time the radio wave was transmitted and the time it was received by the GPS antenna 120, and the position of the agricultural machine 100 is specified from the positional relationship among the satellites. Since the accuracy is still poor, with an error of about 20 m, the error is suppressed to about 5 m by correcting it with the FM ground wave.
- positioning using GPS is not limited to the DGPS method; the Real Time Kinematic GPS (RTK-GPS) method, which measures the distance from the reference station to the satellite using the number of carrier cycles and the phase and can keep the error to a few centimeters, or an Internet GPS method that distributes correction information over the Internet, may also be used.
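- the DGPS correction described above can be sketched as follows. This is an illustrative Python sketch, not part of the embodiment: the function name and the (latitude, longitude) message format of the reference station are assumptions, and a real DGPS receiver applies corrections to pseudoranges rather than to the final fix.

```python
def dgps_correct(measured, reference_true, reference_measured):
    """Apply a position-domain differential GPS correction.

    The reference station knows its true position; the difference between
    that and its own GPS fix estimates the common error, which is then
    subtracted from the agricultural machine's fix. All positions are
    (latitude, longitude) tuples in degrees.
    """
    err_lat = reference_measured[0] - reference_true[0]
    err_lon = reference_measured[1] - reference_true[1]
    return (measured[0] - err_lat, measured[1] - err_lon)

# Example: the reference station's fix is offset from its true position;
# the same offset is assumed to affect the machine's fix.
fix = dgps_correct((35.1002, 139.4999),   # machine's raw GPS fix
                   (35.0000, 139.5000),   # reference station, true
                   (35.0002, 139.4999))   # reference station, measured
```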
- the position may be specified by using a plurality of field monitoring devices 500 and 555 and a state monitoring device 550 whose positions in the field are known. In this method, a specific position specifying signal is transmitted from any one of the agricultural field monitoring devices 500 and 555 and the state monitoring device 550 and received by the wireless communication antenna 114 of the agricultural machine 100.
- the distance between the monitoring device and the agricultural machine 100 is obtained from the intensity (amplitude) or attenuation rate of the received signal at this time. Alternatively, the distance may be obtained by measuring the arrival time of the signal. By measuring the distance from the three or more farm field monitoring devices 500 and 555 and the state monitoring device 550, the intersection of these circular arcs is obtained and the position is specified.
- alternatively, the position of the agricultural machine 100 may be specified from its positional relationship with a plurality of markers of known position appearing in the images captured by the field monitoring devices 500 and 555 and the state monitoring device 550.
- alternatively, the stereo camera device 110 may measure the distances to objects of known position at three or more points, obtain the intersection of the arcs defined by those distances, and thereby specify the position. This method is limited to cases where objects of known position at three or more points appear in one captured image. That is, if the distances from three points of known position are known, the agricultural machine 100 or the server 704 can calculate the position.
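- the arc-intersection calculation from three points of known position can be sketched as follows (illustrative Python; the coordinates are an assumed local X-Y frame in metres, and measurement noise is ignored):

```python
def trilaterate(p1, d1, p2, d2, p3, d3):
    """Locate a point from its distances to three known 2-D points.

    Subtracting the circle equation of p1 from those of p2 and p3 gives
    two linear equations in (x, y), solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

with noisy real measurements, more than three distances and a least-squares fit would be used instead.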
- when the GPS signal cannot be acquired, as in greenhouse cultivation, the current position is specified by a method that does not rely on the GPS signal.
- the current position to be confirmed may be expressed not only by longitude and latitude but also as a position (X, Y) in some coordinate system, or as an azimuth and a distance from a known point. Further, altitude information measured using a GPS signal or an altimeter may also be used as information representing the current position.
- the agricultural machine 100 confirms the direction of forward or reverse (S158).
- the direction is confirmed by a geomagnetic sensor installed in the agricultural machine 100.
- alternatively, the agricultural machine 100 may be moved slightly forward or backward, position information obtained in the same manner as in step S156, and the azimuth of forward or backward movement specified from the relationship with the position identified in step S156.
- the agricultural machine 100 moves forward using the stereo camera device 110 after confirming that there is no obstacle up to the position to move forward in the traveling path.
- in the case of an engine drive, the control device 118 ignites the engine in the prime mover 102 to move the pistons, puts the transmission 204 into first speed, and engages the main clutch 202 to transmit the power generated by the engine to the rear wheels 130, advancing the agricultural machine 100.
- to shift up, the control device 118 disengages the main clutch 202 and shifts the transmission 204 to second and then third speed.
- the controller 118 transmits the kinetic energy to the rear wheel 130 by rotating the motor in the prime mover 102-2 in the forward direction, and advances the agricultural machine 100.
- to move backward, the control device 118 disengages the main clutch 202, puts the transmission 204 into reverse, and then engages the main clutch 202.
- the agricultural machine 100 moves backward by reversing the rotation direction of the motor.
- the overall system 1500 measures the azimuth before the movement and grasps the traveling direction, so that it is possible to prevent the agricultural machine 100 from traveling in the wrong direction.
- the agricultural machine 100 calculates a route from the current position to the position where the work starts (S160). At this time, a route corresponding to the type of work is calculated. For example, when the type of work is harvest, the shortest route that does not enter the work location is specified. If there is such a work location between the current position and the work start position, a detour route is calculated and specified. This is because if it enters the work area before harvesting and proceeds, it may destroy the crop to be harvested. This is particularly effective when a small crop that cannot be recognized as a crop from an image captured using the stereo camera device 110 is cultivated.
- the calculation of the shortest path may be performed by the server 704 instead of the agricultural machine 100.
- in that case, the server 704 examines the other work data for the field stored in the database 708, confirms the state of the other regions, and calculates the route. For example, when crops are cultivated in other regions, the shortest route on which the agricultural machine 100 does not enter those regions, or the shortest route requiring the minimum entry, is derived and specified. In this way, the influence on crops grown in other regions can be prevented or minimized. Thereafter, the server 704 transmits the derived route to the agricultural machine 100.
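- as a minimal sketch of such route derivation, the field can be modelled as a grid in which cells occupied by crops are blocked, and the shortest route found by breadth-first search (illustrative Python; the grid model and function name are assumptions, and the embodiment does not prescribe a particular search algorithm):

```python
from collections import deque

def shortest_route(grid, start, goal):
    """Breadth-first search for the shortest 4-neighbour route on a field
    grid. Cells marked 1 (e.g. regions where crops are being cultivated)
    are never entered; returns the route as a list of (row, col) cells,
    or None when the goal cannot be reached without entering them.
    """
    rows, cols = len(grid), len(grid[0])
    prev = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:
            route = []
            while cell is not None:      # walk back to the start
                route.append(cell)
                cell = prev[cell]
            return route[::-1]
        r, c = cell
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols \
                    and grid[nr][nc] == 0 and nxt not in prev:
                prev[nxt] = cell
                queue.append(nxt)
    return None
```

allowing "minimum entry" routes would amount to giving blocked cells a high traversal cost and switching to a weighted search such as Dijkstra's algorithm.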
- the agricultural machine 100 moves to the work start position along the specified route (S162). This movement process is defined in detail in the description using FIGS. 35A to 37.
- the agricultural machine 100 transmits to the server 704 that it has reached the work start position (S164).
- upon receiving this signal, the server 704 records the work start date and time in the database 708 (S166). As a result, a work log can be kept automatically and used for billing processing. Further, the user terminal 710 or 712 is notified of the start of work.
- the agricultural machine 100 may transmit the notification that the work start position has been reached not only to the server 704 but also to the user terminals 710 and 712.
- the user terminals 710 and 712 receive the notification of the start of work and the start time, so that the user can recognize when the work has started (S168).
- the agricultural machine 100 transmits to the server 704 that the work is finished (S172).
- upon receiving this signal, the server 704 records the work end date and time in the database 708 (S174). As a result, the work log can be accumulated automatically and used for billing processing. Further, the user terminals 710 and 712 are notified of the end of work.
- the agricultural machine 100 may transmit the notification that it has finished the work not only to the server 704 but also to the user terminals 710 and 712.
- the user terminals 710 and 712 receive the notification of the end of work and the end time, so that the user can recognize when the work is finished (S176).
- the agricultural machine 100 calculates a route to the storage position of the agricultural machine 100 itself (S178).
- the agricultural machine 100 derives the shortest route to the storage position that, as far as possible, does not cross the region in which the work has been performed.
- This route calculation may be performed by the server 704.
- the details of the route calculation are substantially the same as those described in the step S160, but a route that does not enter the finished area such as leveling is calculated.
- the agricultural machine 100 moves along the route to the storage position (S180). This movement process is defined in the description using FIGS. 35A to 37.
- the stereo camera device 110 is used to capture an image of the traveling direction and confirm the traveling direction (S202). In this embodiment, this confirmation is performed in the agricultural machine 100.
- FIG. 35B shows a reference image among images captured by the stereo camera device 110 in the process of S202. Ranging is performed within the range captured by the stereo camera device 110.
- by scanning (checking) the parallax-value information or distance information of the pixels, a portion where the distance (or parallax value; the same applies hereinafter) stops changing continuously within a certain range (a portion where the distance changes by more than a certain amount between adjacent pixels) is found, indicating the boundary between the ground and an object (for example, h1 in the figure); and/or, scanning the pixels upward, a portion where the distance suddenly changes greatly between the inside and outside of the object and then again changes continuously within a certain range is found, indicating the boundary between the object and the background (for example, h2 in the figure).
- the reason for determining whether the distance changes are within a certain range is to prevent irregularities such as ridges in the field from being recognized as obstacles. For this reason, the certain range is set to a value corresponding to a general ridge height.
- on open land, the distance measured by the stereo camera device 110 becomes continuously longer with distance from the agricultural machine 100 (that is, toward upper positions in the image); even if the land is slightly inclined, the distance still increases continuously.
- in the region of the captured image where an object is imaged, however, as the ranging position moves upward, the change in distance becomes smaller than the continuous change observed up to the object, hardly changes at all, or the distance becomes shorter (the first case corresponds to an object having a surface inclined away from the agricultural machine 100 in its traveling direction, the second to an object having a surface substantially perpendicular to the traveling direction, and the last to an object having a surface inclined toward the agricultural machine 100).
- although an obstacle was taken here as an example of a factor that hinders movement, the method is not limited to this; a case where the gradient in the traveling direction is too steep, or where the path is depressed and a hole has opened, is also an impediment. In these cases as well, the determination is made from the rate of change of the measured distance, in the same manner as for an obstacle.
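- the boundary detection described above (h1, h2 in FIG. 35B) can be sketched as a scan of one pixel column's distance values (illustrative Python; the threshold scheme and the numeric values are assumptions chosen to mirror the ridge-height tolerance described above):

```python
def find_boundaries(distances, ground_step, tol):
    """Scan one pixel column's distance values bottom-to-top and return
    the indices where the profile departs from the smooth increase
    expected on open ground (approximately ground_step per pixel).

    A step much smaller than ground_step marks a ground/object boundary
    (h1 in FIG. 35B); a step much larger marks the object/background
    boundary (h2). tol absorbs irregularities up to a general ridge
    height, so field ridges are not flagged as obstacles.
    """
    marks = []
    for i in range(1, len(distances)):
        step = distances[i] - distances[i - 1]
        if abs(step - ground_step) > tol:
            marks.append(i)
    return marks

# Open ground grows smoothly; at an upright obstacle the distance stops
# growing (h1), and past its top it jumps to the background (h2).
# Values in metres, illustrative only.
column = [2.0, 2.5, 3.0, 3.5, 3.6, 3.6, 9.0, 9.5]
marks = find_boundaries(column, ground_step=0.5, tol=0.3)
```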
- the reference image and the parallax value information or the distance information may be transmitted to the server 704 and confirmed by the server 704.
- the server 704 or the agricultural machine 100 may perform recognition processing (S208), which will be described later, to determine the presence or absence of an obstacle, but processing time is required for the recognition work.
- the ultrasonic sonar device 126 can be used to check whether there is an obstacle. When an obstacle is confirmed with the ultrasonic sonar device 126, the agricultural machine 100 is moved backward in a direction in which no obstacle is detected, and the operation is continued.
- following the confirmation processing in S202, it is determined whether the obstacle on the route is large enough to be grasped (S204). "Able to be grasped" here means that recognition processing can be performed in the subsequent step (S208).
- This process will be described with reference to FIG. 35B.
- in the height direction, the obstacle O extends at most between h1 and h2.
- in the width direction, the obstacle O extends at most between w1 and w2.
- in the determination, if there are at least a predetermined number of pixels between h1 and h2 and between w1 and w2, it is determined that image recognition processing is possible, and the process proceeds to step S206. The determination may also be made in the height direction or the horizontal direction only. Further, when at least one of h2, w1, and w2 cannot be obtained (that is, when the obstacle O is estimated to be too large for h2, w1, and w2 to be measured), it is determined that the obstacle is large enough for image recognition processing, and the process proceeds to step S206.
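- the S204 size determination can be sketched as a pixel-count check (illustrative Python; the treatment of unmeasurable coordinates as None and the threshold value are assumptions):

```python
def large_enough(h1, h2, w1, w2, min_pixels):
    """Decide whether the obstacle spans enough pixels for recognition
    (the S204 determination). h1/h2 and w1/w2 are the pixel coordinates
    bounding the obstacle; a coordinate is None when it could not be
    measured, i.e. the obstacle overflows the frame, which is treated
    as large enough.
    """
    if h2 is None or w1 is None or w2 is None:
        return True
    return (h2 - h1) >= min_pixels or (w2 - w1) >= min_pixels
```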
- otherwise, the agricultural machine 100 moves forward, and after a predetermined time has elapsed, that is, when it has approached the obstacle somewhat (via steps S224, S226, S228, S230, S232, and S202), the determination is performed again.
- if it is determined in step S202 that there is no obstacle in the traveling direction, the determination in step S204 is negative and the process proceeds to step S224.
- the reference image and the parallax value information or the distance information may be transmitted to the server 704, and the server 704 may make the determination in S204.
- S206 may be omitted and the process of S208 may be performed.
- the agricultural machine 100 may perform obstacle recognition by performing recognition processing (S208) described later.
- if the obstacle can be grasped, the control device 118 operates the brake devices 208 and 214 to stop the agricultural machine temporarily. Then, the agricultural machine 100 transmits the image (reference image) captured by the stereo camera device 110 to the server 704 (S206). Note that when the agricultural machine 100 is already stopped, the control device 118 does not perform the braking operation.
- when the server 704 receives the image, it performs image recognition processing (S208).
- the server 704 performs recognition processing according to the following procedure. First, the server 704 performs image recognition through correction processing on a received image, then feature amount extraction processing, and identification processing by comparison with a standard pattern.
- the correction process is a process for reducing distortion and noise of the received image.
- the correction includes noise removal, smoothing, sharpening, two-dimensional filtering processing, binarization for facilitating feature amount extraction, and thinning processing for extracting a skeleton line of a graphic to be recognized.
- the server 704 performs normalization processing (image enlargement / reduction, rotation, movement, density conversion) for accurately performing post-processing pattern matching.
- the feature amount extraction process is a process for obtaining a feature parameter that is a parameter that faithfully represents the feature of an image and obtaining a feature pattern that is a shape of the feature parameter.
- the server 704 performs edge extraction that extracts a discontinuous portion of the image as an edge. That is, the server 704 extracts density change points and divides the image into several continuous regions. This edge extraction is obtained by connecting a sequence of points that are interrupted by the extended trace method and then performing a second derivative process. Note that the server 704 may perform region extraction or texture extraction by region division without using edge extraction or together with edge extraction. Subsequently, the server 704 performs identification processing by comparing the standard pattern with the feature pattern.
- if the feature pattern resembles a certain standard pattern, the server 704 regards the image as belonging to the same category as that standard pattern.
- the server 704 performs pattern matching using a standard pattern stored in the database 706 and detects whether there are identical or similar ones.
- alternatively, identification may be performed using a statistical identification method or a structural identification method. If the image can be identified in this way, it is recognized; if it cannot be identified, it cannot be recognized (S210).
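- the pipeline of correction, feature extraction, and identification against standard patterns can be sketched as follows (illustrative Python; the binarization threshold, the edge-density feature, and the distance-based matching are simplified stand-ins for the processing described above, not the embodiment's actual algorithms):

```python
def binarize(image, threshold):
    """Correction step: binarization to ease feature extraction.
    image is a list of rows of grey values."""
    return [[1 if px >= threshold else 0 for px in row] for row in image]

def feature_pattern(image):
    """Feature extraction: edge density per row, i.e. the fraction of
    horizontally adjacent pixel pairs whose values differ."""
    return [sum(1 for a, b in zip(row, row[1:]) if a != b) / (len(row) - 1)
            for row in image]

def identify(feature, standards, max_distance):
    """Identification: nearest standard pattern by Euclidean distance.
    Returns the matching label, or None when nothing is close enough
    (the image then cannot be recognized, S210)."""
    best, best_d = None, max_distance
    for label, std in standards.items():
        d = sum((f - s) ** 2 for f, s in zip(feature, std)) ** 0.5
        if d < best_d:
            best, best_d = label, d
    return best
```

registering a new standard pattern (S216) would simply add another label/pattern pair to the `standards` dictionary.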
- if the image cannot be recognized, a notification is transmitted to the user terminals 710 and 712 prompting the user (system user) to input the type of the obstacle and the corresponding response (S212).
- the user uses the user terminal 710 or 712 to transmit to the server 704 the type of the obstacle (for example, a natural object such as a rock, a tree, or an animal such as a kangaroo or cow, or an artificial object such as a fence or gate) and the corresponding response (avoid, ignore) (S214).
- the server 704 associates the information with the image and its feature pattern, and registers the information in the database 706 as a new standard pattern (S216). This makes it possible to recognize when recognizing an image similar to the current obstacle in the future. Further, the server 704 records information obtained from the user in the database 708. This makes it possible to automatically perform discounts when charging.
- if the information of S214 cannot be obtained from the user terminals 710 and 712 within a certain time, the server 704 again prompts the user terminals 710 and 712 for input. If there is still no answer before a certain time elapses, the server 704 registers the type of the obstacle as "unknown obstacle" and its response as "avoid" in the database 706, in consideration of safety. When the type of the obstacle and the response are sent from the user terminals 710 and 712 later, the server 704 rewrites the registration information in the database 706 with the information from the user terminals.
- in this way, the agricultural machine 100 can take action according to the recognition result, and can also respond based on information from the user. The server 704 then determines whether the response is "avoid" (S218). If it is determined that the obstacle is to be avoided, the position, direction, and turning angle (azimuth) of each turn are specified (S220). The first turning position is in front of the obstacle, but the distance at which a turn is possible differs depending on the type of the working device 106 connected to the agricultural machine 100. When a working device 106 with which turning is difficult is attached, the turn is started at a position well before the obstacle.
- otherwise, the vehicle may go straight to the vicinity of the obstacle.
- the turning direction is basically a direction that can be moved to the target position in the shortest time.
- if the edge of the obstacle can be grasped by recognizing the image transmitted in step S206, the machine turns toward the end that is the shorter distance away, so that the detour is shortened. If this cannot be determined from the image transmitted in step S206, the stereo camera device 110 of the agricultural machine 100 may be rotated by a predetermined angle to the left and right of the traveling direction, and the images captured then transmitted so that the distance to each end can be recognized.
- the turning angle or the direction after turning is set so that the route becomes the shortest.
- when this is not possible, a turning position, direction, and angle that make a larger detour are set.
- the turning positions, directions, and angles for the second and subsequent turns are specified by estimating the type (from which the size can be specified) or the size of the recognized obstacle.
- all the turning positions, directions, and angles up to the specified destination are transmitted from the server 704 to the agricultural machine 100, and the agricultural machine 100 that has received them updates its route information using that information (S222).
- if the server 704 determines that the response of the agricultural machine 100 is not "avoid", that is, that the obstacle is to be ignored, the process proceeds to step S223. For example, when the obstacle, such as weeds, does not hinder the progress of the agricultural machine 100 and there is no problem even if the machine travels over it, the obstacle is ignored and the process proceeds.
- the agricultural machine 100 checks the remaining fuel.
- the agricultural machine 100 is driven by an electric motor, the remaining battery level is confirmed.
- time measurement is started with a time clock in the control device 118.
- the agricultural machine 100 determines whether a predetermined period (for example, 3 seconds) has elapsed from the step S223 (S226). This is performed by using a time clock inside the control device 118.
- if the predetermined period has not elapsed, the process returns to S223.
- if the predetermined period has elapsed, the current position is confirmed and the current position information is transmitted to the server 704 (S228).
- the confirmation of the current position is as described in step S156.
- the server 704 stores the current position information in the database 708 together with the date and time at that time (S229). Thereby, the position of the agricultural machine 100 can be grasped almost in real time at predetermined time intervals.
- the agricultural machine 100 determines whether or not the target position (S162: work start position, S170: work end position, S180: storage position) has been reached (S230). This determination is made based on whether the current position obtained in S228 matches the target position. The match determination may allow a certain tolerance according to the position-specifying accuracy; that is, if the longitude and latitude of the current position are within a certain range of the target position, it may be determined that they match.
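- the tolerance-based match of S230 can be sketched with the haversine great-circle distance (illustrative Python; the embodiment does not prescribe this formula, and the tolerance would be chosen to match the position-specifying accuracy):

```python
import math

def reached(current, target, tolerance_m):
    """S230-style determination: the machine is considered to have
    reached the target when the great-circle distance between its fix
    and the target (both (lat, lon) in degrees) is within the given
    tolerance. Haversine formula with a 6371 km Earth radius."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*current, *target))
    a = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371000 * 2 * math.asin(math.sqrt(a)) <= tolerance_m
```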
- if the target position has been reached, the agricultural machine 100 ends this movement process (S236).
- otherwise, the agricultural machine 100 determines whether or not the current position is a turning position (S232). Here too, a certain tolerance may be allowed rather than requiring an exact match. If it is determined that the agricultural machine 100 is not at a turning position, the process returns to step S202. On the other hand, when it is determined that the position is a turning position, the turn is performed based on the route information (S234). In the turning operation, the control device 118 of the agricultural machine 100 operates the main clutch 202 and the transmission 204 to put the gear into first speed, operates the brake devices 208 and 214 to apply the brakes, and temporarily stops or decelerates.
- control device 118 operates the steering device 122 to make a turn by moving forward or backward while turning the rudder.
- the turning operation may be performed without shifting or stopping. Thereafter, the process returns to S202.
- FIG. 37A shows the operation of S224 in the case of S162 and S180, that is, when the operation reaches the predetermined position without performing work.
- the agricultural machine 100 proceeds when moving to a work start position or a storage position, that is, when only a simple movement without work is performed (S252).
- the agricultural machine 100 proceeds while operating the steering device 122 and the like so as to return to the original route.
- for example, by using the image captured by the stereo camera device 110 to correct the trajectory when the machine strays from the ridge, and to adjust small errors that cannot be confirmed by GPS or another position-grasping or position-confirmation system, it is possible to travel along an accurate route.
- processing for returning the trajectory of the agricultural machine 100 to the path or adjusting a minute positional deviation is performed in the same manner. This progress may be not only forward but also backward. Moreover, you may decelerate when approaching a work start position, a storage position, or a turning position.
- the control device 118 operates the main clutch 202, the transmission device 204, and the brake devices 208 and 214, and the agricultural machine 100 performs a deceleration operation.
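- the processing that returns the trajectory to the route can be sketched as a proportional steering correction (illustrative Python; the gains, units, and clamp are assumptions, not values from the embodiment):

```python
def steering_correction(lateral_offset_m, heading_error_deg,
                        k_offset=8.0, k_heading=0.5, max_angle=30.0):
    """Return a steering angle (degrees, positive = steer left) that
    brings the machine back onto its route.

    lateral_offset_m is the signed distance from the planned path
    (positive = machine is to the right of it), heading_error_deg the
    signed difference between planned and actual heading. The gains
    and the clamp are illustrative; a real controller would be tuned
    to the machine's speed and steering geometry.
    """
    angle = k_offset * lateral_offset_m + k_heading * heading_error_deg
    return max(-max_angle, min(max_angle, angle))
```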
- FIG. 37B is a flowchart showing the processing operation of S224 in the case of S170, where the work target is not individual crops and continuous (or intermittent) work is performed while proceeding.
- Such operations include sowing, leveling, plowing, normal watering, and fertilization.
- the agricultural machine 100 performs a predetermined work while proceeding (S262).
- This work is the work set in FIG. 32 or FIG.
- This work is performed continuously or uniformly within the set work location (region), regardless of individual conditions such as crops, soil, or work position. It may also be performed intermittently.
- the work is usually performed by the work device 106 of the agricultural machine 100.
- the work resource is, for example, the remaining amount of fertilizer if the type of work is fertilization, the remaining amount of water if watering, and the remaining amount of seed if sowing.
- the operation proceeds to the operation shown in FIG.
- The first cycle of travel and work is then completed (S266), and the process proceeds to the subsequent travel and work steps.
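- The work-resource confirmation of S264 can be sketched as a threshold check per work type; the units, threshold values, and names below are assumptions for illustration only, as this description does not specify them.

```python
def resources_sufficient(work_type, remaining, thresholds=None):
    """Work-resource check of S264: remaining fertilizer for fertilization,
    remaining water for watering, remaining seed for sowing.
    Thresholds and units are assumed for illustration."""
    thresholds = thresholds or {"fertilization": 5.0,   # kg of fertilizer
                                "watering": 10.0,       # litres of water
                                "sowing": 1.0}          # kg of seed
    return remaining >= thresholds[work_type]
```

When this check fails, the flow would leave the normal cycle rather than proceed to S266.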
- <Leveling work> As an example of the process shown in FIG. 37B, field leveling using the laser radar device 112 will be described with reference to FIGS. 38 to 42 (in this example, the work-resource confirmation step S264 is unnecessary and is therefore omitted).
- The leveling operation requires a laser light receiving device (610 in the example of FIG. 39) or a laser irradiation device (618 in the example of FIG. 42) installed in the field, but the basic automatic travel and automatic operation are the same as described so far.
- Since this laser radar device 112 can irradiate a laser beam over a horizontal angle of view of 60 degrees, leveling work can be performed more efficiently than with a normal laser (a laser leveler). Further, by rotating the laser radar device 112, the number of times the installation position of the laser light receiving device must be changed can be reduced.
- FIG. 38 shows an agricultural machine 100C provided with a work device 106C for leveling (leveling) the field.
- the configuration of this agricultural machine 100C is basically the same as that of the agricultural machine 100A, but differs from the agricultural machine 100A in that the laser radar device 112 (and the multispectral camera device 113) is installed on the roof.
- Based on the level signal, the controller 64 commands the swing motor 54 of the laser radar device 112 to control rotation about the α axis so that the laser is irradiated horizontally.
- This working device 106C has a leveling plate 600 for carrying and spreading soil, side plates 602 that prevent soil from spilling off the sides of the leveling plate 600, spring tines 604 that crush and soften the surface layer and prevent over-compaction, a spiral roller 606 for clod crushing and firming, and an electric cylinder that raises and lowers the leveling plate 600 and the like in response to commands from the control device 118 of the agricultural machine 100C.
- the working device 106C may include a control processor that exchanges signals with the control device 118 of the agricultural machine 100 and operates the electric cylinder to control the vertical movement of the leveling plate 600 and the like.
- The cylinder that raises and lowers the leveling plate 600 and the like may be a hydraulic, pneumatic, or water-pressure cylinder.
- FIG. 39 shows an example of a state in which leveling work is performed using this agricultural machine 100C.
- For the leveling work, a laser light receiving device 610 provided with a laser receiver 612, a wireless communication antenna 614, and a control processor is used.
- This laser light receiving device 610 is installed on the ridge of the field.
- the laser receiver 612 is installed so that the light receiving surface thereof is parallel to the vertical direction.
- The laser receiver 612 has a plurality of light receiving elements arranged in the vertical and horizontal directions, and the position, including the height, at which the laser is received can be determined from which element detected the light.
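- The conversion from the vertical element that detected the beam to a height deviation can be sketched as follows; the 10 mm element pitch and the names are assumptions for illustration, as this description does not give the receiver's geometry.

```python
def height_deviation_mm(hit_row, standard_row, element_pitch_mm=10.0):
    """Deviation of the laser hit from the standard height, given the index
    of the vertical light-receiving element that detected the beam.
    Positive: beam received above standard, i.e. the machine stands on
    raised ground. The 10 mm element pitch is an assumed value."""
    return (hit_row - standard_row) * element_pitch_mm
```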
- reference numeral 620 indicates the leveled area
- reference numeral 630 indicates the area before leveling
- Reference numeral 640 indicates the state of the laser beam emitted from the laser radar device 112. This laser light enters one of the light receiving elements of the laser receiver 612.
- a broken line in the figure indicates that the wireless communication antenna 114 of the agricultural machine 100C and the wireless communication antenna 614 of the laser light receiving device 610 are performing wireless communication.
- Based on the received height information, the agricultural machine 100C determines whether it is at a position higher than, lower than, or equal to the reference.
- The outline of the operation in such a system 1501 is as follows. The agricultural machine 100C uses the laser radar device 112 to irradiate the laser beam while scanning it one-dimensionally toward the laser receiver 612 of the laser light receiving device 610.
- The position (height) information received by the laser light receiving device 610 is acquired by its control processor and transmitted wirelessly to the agricultural machine 100C using the wireless communication antenna 614. Based on the received information, the agricultural machine 100C raises and lowers the leveling plate 600 and the like of the work device 106C while traveling, and thereby levels the field.
- Here, the laser radar device 112 is installed on the roof of the agricultural machine 100C, but it may instead be installed on the work device 106C. If installed on the working device 106C, the leveling plate 600 and the like can be raised and lowered using the laser light receiving position of the laser receiver 612 with less timing lag, so more precise leveling can be performed. In this case, the laser radar device 112 must be installed at a position high enough that the laser beam is not blocked by the pillars or roof of the agricultural machine 100C, and must be controlled so that the beam remains horizontal during operation.
- FIG. 40 shows the details of the step S262 in the case where the leveling operation is performed.
- First, the laser light receiving device 610 is installed on the ridge so that the standard light-receiving height of the laser receiver 612 corresponds to the state in which the leveling plate 600 is at the reference position (the average height of the field after leveling).
- the agricultural machine 100C sets the leveling plate 600 and the like to a height at the reference position, and irradiates the laser beam from the laser radar device 112 (S302). This laser light is incident on one of the light receiving elements of the laser receiver 612 of the laser receiving device 610.
- the received light signal is input to the control processor of the laser light receiving device 610.
- The control processor identifies which light receiving element, at which position, detected the light, and transmits the light-receiving position information to the agricultural machine 100C using the wireless communication antenna 614.
- From the received information, the agricultural machine 100C determines whether the laser was received at a position higher than the standard position of the laser receiver 612 (S304).
- Receiving light at a position higher than the standard indicates that the ground is raised at the machine's current point. In this case, the raised ground must be flattened by lowering the leveling plate 600 of the working device 106C. Therefore, if it is determined in step S304 that light was received above the standard position, the agricultural machine 100C advances slightly (S306), and before the leveling plate 600 and the like reach the position irradiated with the laser beam,
- the agricultural machine 100C gives a command to the working device 106C and lowers the flat plate 600 and the like (S308).
- Note that the advance of S306 may be omitted.
- When the light receiving position of the laser receiver 612 is considerably higher than the standard position, the leveling plate 600 and the like are lowered considerably; when it is only slightly higher, they are lowered slightly. This operation is based on the information the agricultural machine 100C receives from the laser light receiving device 610; that is, the raising/lowering amount of the leveling plate 600 and the like can be adjusted according to the light receiving position of the laser beam.
- the agricultural machine 100C is advanced (S316), and the work area is leveled.
- If it is determined in step S304 that the light was not received at a high position, it is determined from the received information whether the light was received at a low position (S310).
- Receiving light at a low position means that, at the time of laser irradiation, the agricultural machine 100C stands lower than the land to be leveled. In this case, the agricultural machine 100C causes the working device 106C to raise the leveling plate 600 and the like, adjusting the amount of soil the plate carries.
- Therefore, when it is determined in step S310 that the light was received at a position lower than the standard position, the machine advances slightly (S312), and before the leveling plate 600 and the like reach the position irradiated with the laser beam,
- the agricultural machine 100C gives a command to the working device 106C and raises the flat plate 600 and the like (S314).
- the advance of S312 may be omitted.
- When the light receiving position of the laser receiver 612 is considerably lower than the standard position, the leveling plate 600 is raised considerably; when it is only slightly lower, it is raised slightly.
- If it is not determined in step S310 that the light was received at a position lower than the standard, the agricultural machine 100C was at the reference height at the time of laser irradiation, so the height of the leveling plate 600 and the like is not changed and the machine proceeds as it is (S316).
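- The branch of S304 through S316 can be summarized as a single decision function; the deadband, gain, and names below are illustrative assumptions, and the proportional movement mirrors the rule that a larger offset from the standard position gives a larger raising/lowering amount.

```python
def blade_command(deviation_mm, deadband_mm=2.0, gain=1.0):
    """One decision cycle of FIG. 40 (S304-S316).
    deviation_mm > 0: laser received above standard -> ground is raised
    -> lower the leveling plate (S308); deviation_mm < 0 -> raise it (S314);
    within the deadband -> leave the plate unchanged (S316).
    Deadband and gain are assumed values for illustration."""
    if abs(deviation_mm) <= deadband_mm:
        return ("hold", 0.0)
    if deviation_mm > 0:
        return ("lower", gain * deviation_mm)
    return ("raise", -gain * deviation_mm)
```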
- FIG. 41 is a view of the agricultural machine 100C and the laser receiving device 610 observed from above.
- the arrows in the figure indicate the traveling direction of the agricultural machine 100C.
- the working device 106C connected to the agricultural machine 100C is omitted.
- FIG. 41A shows a case where the laser light receiving device 610 is present in the traveling direction of the agricultural machine 100C.
- In this case, the laser irradiation by the laser radar device 112 may simply be performed in the traveling direction, so that the beam is received on the field-side face of the laser receiver 612 facing the area being worked.
- FIG. 41B shows a case where the laser light receiving device 610 is not in the traveling direction of the agricultural machine 100C.
- In this case, if the laser were irradiated in the traveling direction of the agricultural machine 100C, the laser light receiving device 610 would be positioned where it could not receive the laser beam.
- Therefore, the agricultural machine 100C rotates the laser radar device 112 so that the laser beam can be received by the laser receiver 612.
- When the traveling direction of the agricultural machine 100C is opposite to the laser light receiving device 610 side, the agricultural machine 100C rotates the laser radar device 112 further (for example, 180 degrees relative to FIG. 41A) and irradiates the laser beam. Whether the laser radar device 112 needs to be rotated is determined from the laser beam receiving position on the laser receiver 612.
- For example, the laser radar device 112 is rotated when light is received only on the side face of the receiver, not on the face toward the field side being worked. Alternatively, when light is received at a specific position, that information may be used to trigger the rotation. Note that the rotation does not change the horizontal position of the laser or its irradiation angle with respect to the horizontal plane.
- When rotation is determined to be necessary, the agricultural machine 100C rotates the laser radar device 112 (S320).
- The rotation angle at this time is a preset angle, but it is not limited thereto; the angle may be changed according to the light receiving position.
- Alternatively, the laser receiver 612 may be irradiated constantly or periodically, and the agricultural machine 100C, receiving feedback on the light receiving position, rotates the laser radar device 112 so that the laser beam continues to reach the laser receiver 612.
- If it is determined in step S318 that rotation is unnecessary, or after rotation is performed in step S320, the work (first cycle) ends (S322). As shown in FIGS. 34, 35A, 36, and 37B, the work in FIG. 40 is repeated from the work start position to the work end position, so the entire work site can be leveled efficiently. The processes of S318 and S320 may also be performed before S316; in that case, the processes following S308 and S314, and the process following a negative determination in S310, lead to S318.
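- The rotation of S318/S320 amounts to re-pointing the laser radar at the receiver as the machine's heading changes, for example after the 180-degree row change of FIG. 41. A minimal sketch follows, assuming planar (x, y) coordinates and a heading measured from the +x axis; these conventions and the function name are not specified in this description.

```python
import math

def lidar_pan_deg(machine_pos, machine_heading_deg, receiver_pos):
    """Pan angle, relative to the machine's heading, that points the laser
    radar at the receiver. After a 180-degree row change the returned
    angle flips sign, which is the rotation S320 must apply.
    Coordinates are (x, y); heading 0 means facing along +x."""
    dx = receiver_pos[0] - machine_pos[0]
    dy = receiver_pos[1] - machine_pos[1]
    bearing = math.degrees(math.atan2(dy, dx))
    # Normalize the relative angle to [-180, 180)
    return (bearing - machine_heading_deg + 180.0) % 360.0 - 180.0
```

For a receiver directly to the machine's left, the result is +90 degrees; after the machine turns around, it becomes -90 degrees.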
- FIG. 42 shows another example in which leveling work is performed using the agricultural machine 100. Irradiation of the laser beam used for leveling work is performed from a laser irradiation device 618 provided with a laser radar device 112-2.
- The laser irradiation device 618 includes a stage 622 for electrically rotating the laser radar device 112-2 in the horizontal direction, a tripod of adjustable length and angle that supports the stage 622, an antenna 624 for wireless communication with the agricultural machine 100, and a control processor. Since the laser radar device 112-2 of the laser irradiation device 618 does not need to be rotated around the α axis, the mechanism for that purpose (swing motor 54) can be omitted.
- This laser irradiation device 618 is installed on the ridge with its tripod so that the laser light is emitted horizontally.
- the agricultural machine 100 connects the working device 106C-2 having the laser receiving device 616 to perform leveling work.
- a laser receiver 612-2 is installed above a pole extending in the vertical direction.
- the laser receiver 612-2 includes a plurality of light receiving elements in the vertical direction and the horizontal direction. A plurality of horizontal light receiving elements are arranged over the circumference of the laser receiver 612-2.
- the laser beam height information detected by the laser receiving device 616 is input to the control device 118 of the agricultural machine 100.
- the working device 106C-2 includes an electric cylinder that moves up and down the leveling plate or the like in response to a command from the control device 118 of the agricultural machine 100 in accordance with the position of the laser beam received by the laser receiver 612-2.
- the laser receiving device 616 may include a control processor that operates the electric cylinder in accordance with the position of the detected light receiving element to move the leveling plate and the like up and down.
- The broken line in the figure indicates wireless communication between the laser irradiation device 618 and the agricultural machine 100.
- The leveling operation is performed as follows. First, as an initial setting, the tripod of the laser irradiation device 618 installed on the ridge is adjusted so that the laser beam emitted from the laser radar device 112-2 is horizontal, and so that the beam strikes the standard position of the laser receiver 612-2 while the leveling plate of the working device 106C-2 is set at the reference height on reference land. Next, the laser beam is emitted toward the laser receiver 612-2.
- The agricultural machine 100 determines from the position of the light-receiving element whether the light was received above or below the standard position. If the light was received above the standard position, the land is raised, so the agricultural machine 100 lowers the leveling plate to level the land. Conversely, if the light was received below the standard position, the land is lower than the standard, so the agricultural machine 100 raises the leveling plate. The amount by which the plate is raised or lowered depends on how far the light receiving position is from the standard position: the farther away, the larger the movement.
- The agricultural machine 100 also determines which light receiving element detected light in the horizontal direction of the laser receiver 612-2, and whether the laser irradiation angle of the laser radar device 112-2 must be changed according to that position. If no change is needed, no communication takes place. If it is determined that the angle must be changed so that the beam can still be received during the next operation, the agricultural machine 100 transmits to the laser irradiation device 618 information for rotating its stage 622 according to the light receiving position. The control processor of the laser irradiation device 618 rotates the stage 622 by the indicated angle based on the received information. Thus, wherever the agricultural machine 100 is working within the work area, the laser beam can be received at any time.
- S302 is an operation performed by the laser irradiation device 618, not the agricultural machine 100.
- the rotation of the laser radar device in S320 is performed by the laser irradiation device 618, not the agricultural machine 100.
- the ⁇ axis in the laser radar device 112 or 112-2 may be rotated by a predetermined angle from the angle of horizontal laser beam irradiation.
- <Individual work> Overall work efficiency can be improved by determining, for each work target such as an individual crop, whether work is necessary and performing the work only when it is. Referring to FIGS. 43 to 47, the operation when work is performed individually according to the condition of the crops and the like, by the overall system 1500 comprising the in-field system 1501 and the information communication system 1502 described in FIG. 2, will be described. Individual work includes fertilization, sowing, transplanting, harvesting, weeding, pesticide application/spraying, watering, and mowing; here, fertilization is used as the main example. The operation of the system 1501 can also be applied to individual per-crop work other than fertilization.
- FIG. 43 is a diagram illustrating a state in which fertilization work is performed using the overall system 1500 in the present embodiment.
- The agricultural machine 100 is traveling automatically through a field in which normally growing crops 350 (high plant activity, solid lines) and poorly growing crops 360 (low activity, dotted lines) are cultivated, performing work (fertilization) only on the poorly growing crops 360.
- In the field, a state monitoring device 550 provided with a multispectral camera (or colorimetric camera) for monitoring the growth state of plants from a high position is installed and communicates wirelessly with the agricultural machine 100.
- In the overall system 1500, there are a plurality of signs 370 for specifying predetermined positions in the images captured by the field monitoring devices 500 and 555, the state monitoring device 550, and the like.
- The signs 370 are given different numbers, letters, colors, patterns, figures, or shapes (or combinations thereof), and the positions where they are installed are known to the system 1501.
- Here, the work location is specified based on data from the user terminals 710 and 712, but the server 704 can also specify the work location based on information from the state monitoring device 550. This operation will be described later with reference to FIG. 46.
- FIG. 43 shows that the poorly growing crops 360 are concentrated in the locations enclosed by circles.
- FIG. 44 is a diagram illustrating a working state of the agricultural machine 100 in the system 1501.
- a fertilizer application device (fertilizer) 106D is connected to the main body.
- the agricultural machine 100 confirms the growth status of the crops 350 and 360 with the multispectral camera device 113, and sprays the fertilizer 802 in the vicinity of the crop 360 determined to be poor growth according to the confirmation result.
- the agricultural machine 100 transmits the fertilization information or the plant status information using the wireless communication antenna 114 and relays it to the server 704 via the field monitoring device 555 (indicated by a dotted line in the figure). Note that the agricultural machine 100 may transmit to the server 704 without relaying the agricultural field monitoring device 555 or the like when the wireless access point 700 exists within a wireless reachable range. Further, information may be transmitted via a relay such as another wireless access point.
- FIG. 45 shows a main part of a working device (fertilizer) 106D for supplying fertilizer to plants.
- FIG. 45A shows the appearance of the fertilizer applicator, while omitting the connection portion with the main body of the agricultural machine 100.
- 45B is a cross-sectional view taken along the plane indicated by the dotted line 4501 in this figure.
- This fertilizer applicator 106D includes a metal casing 800, a fertilizer feeder 810, a hammering drum 812, a spraying unit 804, an infrared sensor 806, and a fertilizer inlet 808, and holds the fertilizer 802 in its main body.
- the fertilizer 802 is input into the housing 800 from the fertilizer input port.
- The fertilizer feeder 810 and the hammering drum 812 are driven by the PTO shaft 222, controlled by instructions from the control device 118 of the agricultural machine 100, or by current from the power source 228, so that the fertilizer 802 is fed out and broken up.
- the fertilizer 802 is sprayed from the spraying port.
- the infrared sensor 806 detects the remaining amount of fertilizer. Information on the detected fertilizer remaining amount is transmitted to the control device 118.
- The spraying unit 804 can be flexibly bent, and the spraying port can be set to the right, left, or rear with respect to the traveling direction of the agricultural machine 100.
- FIG. 46 is a flowchart showing an operation for the server 704 to specify a work location based on information from the state monitoring device 550 or the like. Based on this figure, the operation in which the server 704 specifies a work location using information from the state monitoring device 550 and the like will be described. By performing this process, it becomes possible to specify a work location in advance in a wide area, and work efficiency and speed can be improved.
- When the processing starts (S330), the server 704 first acquires image information obtained by capturing an area of the field with the state monitoring device 550, together with additional information other than the image information (information indicating the growth state of plants (such as NDVI), pest information, frost information, discoloration information due to pests, soil information, sugar content information, and the like) (S332). Note that the server 704 may obtain the information indicating the growth state of plants (NDVI or the like) by acquiring, for example, unprocessed spectral reflectance information from the state monitoring device 550 and processing it itself.
- Next, the server 704 detects, from these images and the additional information, locations where there are plants requiring work such as watering, fertilization, or weeding (S334). This is done, for example, by identifying portions of the spectral image where NDVI is at or below a predetermined value. Besides NDVI, portions where the spectral reflectance in a spectral image at a visible red wavelength (for example, 660 nm) is at or below a predetermined value may be identified. In addition, the server 704 may detect work-required locations using information from the field monitoring devices 500 and 555 instead of, or together with, the state monitoring device 550.
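- The NDVI-based detection of S334 can be sketched as follows. NDVI is computed from near-infrared and visible-red reflectance; the 0.4 threshold is an illustrative assumption, since this description only refers to "a predetermined value".

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index from two reflectances."""
    return (nir - red) / (nir + red)

def work_required_mask(nir_band, red_band, threshold=0.4):
    """Flag pixels whose NDVI is at or below the threshold as candidate
    work locations (S334). Bands are 2-D lists of reflectances;
    the 0.4 threshold is illustrative only."""
    return [[ndvi(n, r) <= threshold for n, r in zip(nir_row, red_row)]
            for nir_row, red_row in zip(nir_band, red_band)]
```

A healthy pixel (high NIR, low red) yields a high NDVI and is not flagged; a stressed pixel with similar NIR and red reflectance is flagged.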
- Next, the server 704 identifies the position of each detected location (S336). To do this, the server 704 first performs image recognition processing in the same manner as in step S208 and recognizes the plurality of signs 370 present in the captured image. The server 704 determines the positions of these recognized signs, and then identifies the position of the work-required location from the positional relationship between the recognized signs and that location.
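- The positioning of S336 from recognized signs 370 can be sketched, in its simplest one-dimensional form, as interpolation between two markers with known field coordinates; a real implementation would use more markers and a full image-to-field transform. All names below are illustrative assumptions.

```python
def locate_from_markers(marker_a_xy, marker_b_xy, a_px, b_px, spot_px):
    """Estimate the field position of a detected spot from two recognized
    signs with known field coordinates (S336), by linearly interpolating
    its image coordinate between the two markers. Assumes the spot lies
    near the line joining the markers."""
    t = (spot_px - a_px) / (b_px - a_px)
    ax, ay = marker_a_xy
    bx, by = marker_b_xy
    return (ax + t * (bx - ax), ay + t * (by - ay))
```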
- the server 704 transmits the specified position as the work location to the agricultural machine 100 and the user terminals 710 and 712 (S338), and ends the process (S340).
- the agricultural machine 100 does not have to measure the growth state and other states of the crops 350 and 360 over the entire workable region, and can work efficiently. For this reason, work time can be shortened. The user can also grasp the place where the work is necessary.
- FIG. 47 explains in detail the process of S224 in FIG. 35A in the case of performing work individually for each object.
- the agricultural machine 100 acquires image information and additional information and transmits them to the server 704 (S352). At this time, the progress of the agricultural machine 100 may be temporarily stopped.
- In this step, image information captured by the stereo camera device 110 and/or the multispectral camera device 113, which are the image sensing devices of the agricultural machine 100, is transmitted together with additional information other than that image information.
- The additional information may be, for example, the parallax values or distance information from the stereo camera device 110, or the spectral reflectance for each wavelength, or information calculated from those spectral reflectances, from the multispectral camera device 113.
- Additional information other than image information, such as distance and shape information obtained by the laser radar device 112, may also be used.
- Furthermore, images captured by the state monitoring device 550 or the field monitoring devices 500 and 555 at the request of the agricultural machine 100, the server 704, or the user terminals 710 and 712, and information obtained from sources other than the image sensing devices (stereo camera device 110, multispectral camera device 113, omnidirectional camera device 501), for example temperature and humidity from an environmental monitoring device, or weather forecasts and sunshine hours acquired via the Internet 702, may also be transmitted to the server 704.
- the step S352 may be performed by the field monitoring devices 500 and 555, the state monitoring device 550, and / or the server 704 other than the agricultural machine 100.
- the information acquired and transmitted in this step varies depending on what work the agricultural machine 100 performs.
- For example, the spectral images and spectral reflectance information captured by the multispectral camera device 113 are suitable for examining the growth state of the target plant and/or the state of the soil (they may be acquired either by the multispectral camera device 113 of the agricultural machine 100 or by that of the state monitoring device 550).
- the polarization image captured by the polarization stereo camera device and the distance information to the object are suitable for accurately discriminating the type of plant.
- When a harvesting robot is used as the work device 106, image information captured by the stereo camera device 110 and information on the distance to the fruit are required in order to reliably move the fruit-cutting and holding arm to a predetermined position around the fruit (alternatively, the distance information may be measured by the laser radar device 112 instead of the stereo camera device 110; since the laser radar device 112 is located near the multispectral camera device 113, the position of the target fruit can be detected with high accuracy).
- To determine the necessity of watering, weather forecasts and precipitation forecasts acquired via the Internet 702 are used.
- This processing can be applied not only to the work described here but also to fertilization, sowing, transplanting, harvesting, weeding, pesticide application/spraying, watering, mowing, and other work performed for each unit such as an individual crop.
- In each case, the image information and the additional information other than the image information appropriate to the work are acquired.
- the server 704 that received the information analyzes the information, generates information necessary for the work, and transmits it to the agricultural machine 100 (S353).
- This information analysis also changes depending on the work to be performed.
- For example, to determine the necessity of fertilization, the server 704 performs image recognition in the same manner as described for step S208 of FIG. 35A and checks the NDVI (plant activity) of each recognized leaf, or examines the spectral distribution of the soil, and thereby determines whether fertilizer needs to be sprayed. In weeding work, the server performs image recognition (similar to the process of S208) and determines whether or not each object is a weed.
- the process of S353 may be performed by the agricultural machine 100.
- the agricultural machine 100 determines whether to perform work (S354).
- When it is determined in step S354 that the agricultural machine 100 should perform the work, the machine proceeds until the work target is within the working area of the work device 106 (S356).
- Next, the agricultural machine 100 determines whether or not the workable area of the work device 106 has reached the vicinity of the work target, such as a crop that needs work (S358). This determination is made based on whether or not the machine has moved the known length from the point where the image was taken to the work area of the work device, either by measuring the distance ahead with the stereo camera device 110 or by calculating the distance from the number of rotations of the wheels. The reason is that, depending on the contents of the work, even a slight shift in position may hinder the work or cause it to fail.
- the server 704 identifies the position of the agricultural machine 100 from the plurality of signs 370 and calculates the distance to be moved. The distance information is transmitted to the agricultural machine 100, and the agricultural machine 100 is moved by that distance.
- If the position specified in S156 can be specified accurately enough that the work is not hindered, the determination may be made by comparing the positions specified before and after the movement.
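- The wheel-rotation distance estimate mentioned in S358 amounts to multiplying the rotation count by the wheel circumference; a sketch (the wheel diameter and slip-free rolling are assumptions):

```python
import math

def distance_from_rotations(rotations, wheel_diameter_m):
    """Travelled distance estimated from the wheel rotation count,
    assuming rolling without slip: d = n * pi * D."""
    return rotations * math.pi * wheel_diameter_m
```

- For example, ten rotations of a 0.5 m diameter wheel correspond to roughly 15.7 m of travel.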
- When an image sensing device (stereo camera device 110, multispectral camera device 113, omnidirectional camera device 501) or the laser radar device 112 is in the vicinity of the work device 106, the work can be performed at the point where the image information is acquired. In that case, the step S356 is omitted.
- The determination of S358 and the progress of S356 are repeated until the work target is within the workable area of the work device 106. If the machine has moved past the work target, it may move in reverse in the step S356.
- When the agricultural machine 100 determines that the workable area of the work device 106 has reached the vicinity of the work target, such as a crop that requires work, the agricultural machine 100 performs the work (S359).
- the agricultural machine 100 transmits work information such as work position (work target), work content, success / failure of work, and work time to the server 704 (S360).
- The work information to be transmitted varies depending on the type of work. For example, if the work is harvesting, the work position, the work content, the success or failure of the work, and the work time are transmitted. If the work is water sprinkling, the work position, work content, sprinkling amount, and work time are transmitted.
- the server 704 stores the information in the database 708 (S361).
- Since the overall system 1500 accumulates the work performed for each work position and work target, this information can be used for billing processing, for identifying the work conditions suitable for a work target by comparing work conditions with growth conditions, or stored as data for future harvest predictions.
- The server 704 aggregates the work information for each work target over a certain period (for example, a month, a year, or the number of days since the start of cultivation), and can provide the totaled work information (work contents) to the user terminals 710 and 712.
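- The per-period aggregation of work information could look like the following sketch; the record schema (target, period, content, amount) is an assumption made for illustration.

```python
from collections import defaultdict

def aggregate_work(records):
    """Sum the worked amount for each (work target, period, work content)
    so the totals can be reported to the user terminals."""
    totals = defaultdict(float)
    for r in records:
        totals[(r['target'], r['period'], r['content'])] += r['amount']
    return dict(totals)
```

- For example, two watering records for the same field and month are merged into a single total for that field, month, and work content.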
- Since this work information is an information good, it can be traded commercially on its own and may be sold, for example, to a system user.
- the agricultural machine 100 determines whether the work in the area designated as the work location by the user terminals 710 and 712 (example in FIG. 32) or the state monitoring device 550 (example in FIG. 46) has been completed. This determination is made by comparing the final position of the work location with the current position.
- the method for specifying the position is the same as the description of the step S156 in FIG. 34, and the comparison is the same as the step S230 in FIG.
- the agricultural machine 100 determines whether there is a next work location (S364). This is determined based on information transmitted from the server 704 (S130 in FIG. 33 and S338 in FIG. 46).
- the agricultural machine 100 calculates a route to the work location (S366).
- the shortest path is calculated by the same method as in steps S160 and S178 in FIG.
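- The patent does not fix a particular shortest-path algorithm; over a waypoint graph of the field, Dijkstra's algorithm is a common choice. A sketch with illustrative node names:

```python
import heapq

def shortest_path(graph, start, goal):
    """Dijkstra's algorithm on a weighted adjacency dict
    {node: {neighbor: cost}}; returns (total cost, node list)."""
    queue = [(0.0, start, [start])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in visited:
            continue
        visited.add(node)
        for neighbor, weight in graph.get(node, {}).items():
            if neighbor not in visited:
                heapq.heappush(queue, (cost + weight, neighbor, path + [neighbor]))
    return float('inf'), []
```

- On a three-node graph where A-B-C is cheaper than the direct A-C edge, the function returns the two-hop route.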
- A work resource is a resource necessary to perform work: for example, fertilizer or water if the work is fertilizing or watering, space for storing the harvest if harvesting, seeds if sowing, and so on.
- This S370 is also performed as a subsequent process when a negative determination is made in S362 and S364. The operation of the system 1501 when the agricultural machine 100 determines that there are not enough work resources will be described in detail with reference to FIG.
- After confirming the work resources, the agricultural machine 100 proceeds toward the next work position, or toward the next work area (work location) in the same work area (S370), and finishes the work of the first cycle (S372).
- the progress of S370 may be not only forward but also backward.
- the process in S370 is performed up to a position where an image of an object for the next individual determination can be acquired. This position may be grasped in advance by the agricultural machine 100 or the server 704, or may be specified based on an image acquired by the stereo camera device 110, the multispectral camera device 113, or the like of the agricultural machine 100.
- FIG. 48 is a diagram illustrating a process in which the electrically driven agricultural machine 100B interrupts the operation and performs battery charging when the remaining battery level is low.
- the agricultural machine 100B is an electric drive type agricultural machine and includes a transmission device 104-2 shown in FIG.
- A seeding device is provided as the working device 106B, and seeds are sown along a predetermined route at the work location.
- a broken line in the figure indicates a route on which the agricultural machine 100B has worked, and sowing is performed in a region indicated by a dotted line.
- the interruption position is stored, and the distance D to the nearest marker 370 is measured and stored by the stereo camera device 110. This distance D is used to accurately specify the position where the work is resumed after the battery is charged.
- The agricultural machine 100B goes to the external power source 226 (non-contact power transmission device) for battery charging along the path indicated by the solid line. Then, after the charging is completed, it moves back to the work interruption position and resumes the work from the accurate interruption position.
- In this way, fuel, batteries, and work resources can be replenished under automatic control as much as possible, and work efficiency can be improved.
- FIG. 49 shows the above operation in detail.
- FIG. 49 is an operation flowchart for when the work (or movement) is interrupted. FIG. 49 also describes the processing positioned as the work of step S224 in FIG. 35A (another work performed when the work is interrupted).
- This operation flow is started (S400) when the agricultural machine 100 recognizes, in S368, that work resources such as fertilizer and seeds have fallen below a predetermined amount.
- First, the agricultural machine 100 determines whether or not the intended operation (charging, work resource filling or securing) has been completed (S402). Immediately after the start of this process, it is determined that the operation has not yet been completed, so it is subsequently determined whether the work interruption information, such as the work interruption date and time, the interruption position, and the distance to a specified object (the information of S416), or the travel interruption information of S418, has already been transmitted to the server 704 (S405). Immediately after the start, it is determined that no transmission has been made, so it is subsequently determined whether the agricultural machine 100 is performing any work (e.g., fertilization, leveling, harvesting, etc.) (S408).
- If the agricultural machine 100 determines that it is performing some work, the agricultural machine 100 interrupts the work (S410). That is, the agricultural machine 100 stops traveling by means of the control device 118 and also stops the work by the work device 106. Subsequently, the agricultural machine 100 confirms the current position using the same method as described in the process of S156 in FIG. 34 and stores it in the control device 118 (S412).
- Subsequently, the distance from the sign 370 or another specific object is measured using the stereo camera device 110 and stored in the control device (S414). To do this, it is first necessary to recognize the object. For this recognition, there are methods such as identifying an object ahead in the same manner as the obstacle identification described in the steps S202 and S204 in FIG. 35A, or detecting a specific object by the image recognition process described in the step S208 in FIG. 35A. The former can be performed more quickly than the latter, but its accuracy is worse. Subsequently, the distance from a specified position on the target object is measured. This position is set to the center of the object, but is not limited to this, and may be, for example, a location on an edge.
- FIG. 50 shows an example of a captured image including this object and the measured distance.
- the object may be an artificial object such as a sign 370 (this sign indicates “12”) as shown in FIG. 50A, or a natural object such as a tree 372 as shown in FIG. 50B.
- FIG. 50A shows that the distance from the agricultural machine 100 (stereo camera device 110) to the sign 370 (center portion indicated by a black circle) is 17.2 m.
- FIG. 50B shows that the distance from the agricultural machine 100 (stereo camera device 110) to the tree 372 (the tip of the branch) is 19.0 m.
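- Distances such as the 17.2 m and 19.0 m above come from stereo triangulation: with a rectified stereo pair, depth is Z = f * B / d for focal length f (pixels), baseline B (meters), and disparity d (pixels). A sketch, with illustrative parameter values:

```python
def stereo_distance(focal_px, baseline_m, disparity_px):
    """Depth of a matched point from a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("non-positive disparity: no valid match")
    return focal_px * baseline_m / disparity_px
```

- For example, with a 1000 px focal length and a 0.2 m baseline, a 10 px disparity corresponds to a 20 m distance.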
- Note that the distances to two or more points may be measured in an image captured by one stereo camera device 110, or by two or more stereo camera devices 110. Thereby, the start position alignment of the agricultural machine 100 at the time of work resumption can be performed more accurately. Subsequently, work interruption information such as the work interruption date and time, the interruption position, and the distance to the identified object is transmitted to the server 704 (S416).
- Upon receiving these, the server 704 sends the work interruption information to the user terminals 710 and 712. Thereby, the user can grasp the interruption of the work.
- the agricultural machine 100 calculates a route to a position where a target operation (charging, work resource filling, securing) is performed (S419).
- This route calculation may be performed by the server 704.
- The movement destination is given by the server 704, in which those positions are stored, or may be designated by the user via the user terminals 710 and 712. This calculation is the same as that described in steps S160 and S178 in FIG. 34, and basically the shortest path is calculated.
- If the machine is not working, the current position is confirmed by the same method as in the step S412 to obtain the current position information, and the travel interruption information is transmitted to the server (S418); the route from the current position to the position where the target operation is performed is then calculated in the same manner as described above (S419).
- the server 704 may be made to perform route calculation.
- Next, the agricultural machine 100 proceeds along the route (S420). This progress includes both forward and reverse travel. When it is determined in the step S405 that the agricultural machine 100 has already transmitted the work interruption information, the process likewise proceeds to the step S420.
- Next, the agricultural machine 100 determines whether or not it has arrived at the position where the target operation is performed (S422). This determination is made by the method described in the steps S230 and S232. Note that the determination in S422 may be performed not for each predetermined time (for example, 1 second) but for each predetermined distance (for example, 50 cm). If it is determined that the machine has not arrived, the process for interrupting the work is temporarily terminated, and the process of S400 is resumed through the series of operations shown in FIGS. 35A and 36. On the other hand, when it is determined in the step S422 that the agricultural machine 100 has arrived at the position where the target operation is performed, the target operation is performed (S424).
- This operation may be performed automatically unattended as in the system 1501 shown in FIG. 48, or may be performed manually.
- light oil or gasoline is manually supplied to the fuel tank of the agricultural machine 100, or fertilizer is replenished to the working device.
- This operation is monitored in S426 until the target operation is completed.
- The agricultural machine 100 determines that the target operation is complete when a sensor detects its completion, for example when a predetermined amount of battery charging is completed, fertilizer has been supplied in a predetermined amount or more, or the crops have been removed from the crop storage space.
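- The completion determination of S426 reduces to comparing sensor readings against per-operation thresholds; a sketch (the operation names and threshold values are illustrative assumptions):

```python
def target_operation_complete(operation, sensors):
    """Decide from sensor readings whether the target operation has finished."""
    if operation == 'charge':
        return sensors['battery_level'] >= 0.95    # predetermined charge level
    if operation == 'refill_fertilizer':
        return sensors['fertilizer_kg'] >= 20.0    # predetermined refill amount
    if operation == 'unload_harvest':
        return sensors['storage_kg'] == 0.0        # crops removed from storage
    raise ValueError(f"unknown operation: {operation}")
```

- The monitoring loop of S426 would simply call this check each cycle until it returns true.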
- FIG. 51 shows a state immediately after the battery 224-2 of the agricultural machine 100B is charged by the external power source (non-contact power transmission device) 226.
- When the target operation is completed, the agricultural machine 100 transmits operation end information, such as the target operation end date and time and the target operation content (how much battery charge, fuel, fertilizer, etc. has been supplied), to the server 704, and further calculates the route to the work interruption position, the work start position, or the storage position (S427).
- The server 704 transmits the operation end information to the user terminals 710 and 712 and stores the information in the database 708.
- the route calculation method is the same as the method described so far (such as S419), and may be performed by the server 704.
- the agricultural machine 100 proceeds (forward or reverse) (S430).
- the process starts toward the work interruption position in order to resume the work.
- Note that the process of S430 is also performed when, after completion of the target operation, it is determined in the process of S402 in the operation loop of FIG. 49 that the agricultural machine 100 has completed the target operation.
- the agricultural machine 100 determines whether the work has been interrupted (S431). If it is not determined that the work has been interrupted, the process proceeds to S444, and this process is temporarily terminated.
- the agricultural machine 100 specifies an object corresponding to the object measured in S414 by using the stereo camera device 110 in the same manner as in S414, and measures the distance to it (S434).
- Then, the distance to the object measured in S434 is compared with the distance measured and stored in S414 to determine whether the two distances are equal (S436).
- When one stereo camera device 110 has measured the distances to a plurality of points in S414, or a plurality of stereo camera devices 110 have measured the distances to a plurality of points, all the corresponding positions are also measured in S434, and it is determined whether all of them match. Note that the matching accuracy is determined according to the distance measurement accuracy of the stereo camera device 110.
- If the distances do not match, the control device 118 of the agricultural machine 100 operates the steering device 122 and the transmission device 104 to move the agricultural machine 100 forward, backward, left, or right so that the distances match (S438). When the distances match after one or more distance measurements, it is determined that the work interruption position and the work resumption position coincide with each other, work resumption information is transmitted to the server 704 together with the work resumption date and time (S440), and the work is resumed (S442). The server 704 transmits the work resumption information and the start time to the user terminals 710 and 712, and further stores the information in the database 708.
- Note that when the agricultural machine 100 determines in the step S436 that the distances are equal, the process proceeds directly to S440.
- By the above operation, the work interruption position and the restart position can be made to coincide even when there is a slight error in the position information. Accordingly, for example, the work can be performed without duplication and without gaps where the work is not performed.
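- The realignment loop of S434 to S438 can be sketched as follows: compare the landmark distances stored at interruption with those measured at resumption and, if they differ, return a coarse move hint. The tolerance stands in for the stereo camera's ranging accuracy and is an assumption.

```python
def realignment_step(stored_distances, measured_distances, tol_m=0.05):
    """Return None when every landmark distance matches within tolerance
    (positions coincide, S440); otherwise a hint for the S438 adjustment."""
    diffs = [m - s for s, m in zip(stored_distances, measured_distances)]
    if all(abs(d) <= tol_m for d in diffs):
        return None
    mean = sum(diffs) / len(diffs)
    return 'backward' if mean < 0 else 'forward'
```

- The controller would repeat measure-and-move cycles until the function returns None, at which point the work resumes.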
- FIG. 52 shows how, when an abnormal event is detected through the field monitoring device 500 (in this example, an abnormality source 1000, mainly a so-called harmful animal, that may damage crops), the automatically controlled agricultural machine 100 is used to observe the abnormality source 1000 and deal with it. A broken line in the figure indicates transmission / reception of information by wireless communication.
- the present invention is not limited to this, and information exchange via the server 704 may be performed.
- the content of the abnormality is not limited to the invading pests in the field, but includes all abnormal situations that occur due to human or natural forces. For example, a fire or stranger intrusion.
- FIG. 53 is a diagram for explaining the operation of the agricultural machine 100 of FIG. 52 in more detail, and is an observation of the farm field of FIG. 52 from above. As shown in the figure, the agricultural machine 100 performs an operation of approaching the abnormal source 1000 through the shortest path while avoiding the region where the crop 910 is cultivated.
- FIGS. 54 and 55 show the operation of the overall system 1500 of this embodiment when this abnormal event occurs; the operations of the server 704 and the agricultural machine 100 are mainly described.
- FIG. 54 illustrates operations from occurrence of an abnormal event to completion of handling of the abnormality.
- FIG. 55 explains details of the operation of S422 in the process of S502 of FIG. 54 (the same operation as S400 to S444 of FIG. 49).
- The flow at the time of occurrence of an abnormal event is started when an abnormality is detected in an image captured by the field monitoring device 500 (S450, S452). This is executed by transmitting an image captured by the field monitoring apparatus 500 to the server 704 and analyzing the image on the server 704.
- The server 704 performs image recognition in the same manner as in the step S208 in FIG. 35A, and detects that an abnormality has occurred, for example, when something other than the agricultural machine 100, the crops 910, or a system user appears in the image captured by the field monitoring device 500. An abnormality may also be detected from images captured by the other state monitoring devices 550 and field monitoring devices 555.
- When the server 704 detects this abnormality, it transmits to the user terminals 710 and 712 information indicating that the abnormality has been detected, an image of the abnormal state, its date and time, and the like (S454). Information similar to that transmitted to the user terminals 710 and 712 is also stored in the database 708.
- the server 704 determines whether or not the processing for the abnormality that has occurred is completed (S456). Immediately after the occurrence of an abnormal event, the process is not normally completed, and the process proceeds to the next recognition process in S458.
- In S458, image recognition is performed in the same manner as in the step S208 in FIG. 35A. This recognition process is performed by obtaining the feature amount of the abnormal part and comparing it with standard patterns stored in the database 706 or the like.
- the server 704 determines that the process is completed by receiving the information. This is an operation that includes both cases.
- Next, the server 704 determines whether or not the abnormal content has been recognized (S460). If the content of the abnormality can be recognized, an operation corresponding to the content of the abnormality is performed (S462). This operation is defined for each abnormality content: for example, the abnormality is ignored, the agricultural machine 100 located at the nearest position approaches and threatens it with a horn or the like, or water is sprayed at the abnormality. Thereafter, the process returns to the process of S456, and when the server 704 determines that the processing for the abnormality has been completed, the processing for the abnormal event is terminated (S474).
- If the abnormal content cannot be recognized, the server 704 specifies the location of the abnormal area from the plurality of signs 370, whose positions within the image captured by the field monitoring apparatus 500 are known (S464).
- Next, the server 704 determines whether an instruction has already been given to the agricultural machine 100 (S468). If it is determined that an instruction has not yet been given, the server 704 uses the information stored in the database 708 to identify the agricultural machine 100 closest to the abnormal location (abnormal area), and issues an instruction to confirm the abnormal content (S470). Alternatively, the agricultural machine 100 with the shortest path to the abnormal location may be identified.
- the server 704 transmits the location information of the abnormal location to the agricultural machine 100 that has been specified and instructed to confirm the abnormal content (S472).
- On the other hand, if it is determined in the step S468 that an instruction for confirming the abnormal content has already been issued to the agricultural machine 100, the server 704 proceeds to the step S472 and transmits the position information of the abnormal point specified in the step S464. Then, the server 704 returns to the process of S456.
- the agricultural machine 100 operates as follows. S500, S501, and S502 described here show the process of S224 in FIG. 35A when an abnormal event occurs.
- The agricultural machine 100 receives the instruction of the step S470 (only at the beginning, to interrupt the work and check the content of the abnormality) and the location information of the abnormal part of S472 (S500). Then, the agricultural machine 100 recalculates the travel route according to the position information of the abnormal part (S501). Since the abnormal part does not always stay in a fixed place, as in the case where a pest invades, the agricultural machine 100 receives the position information of the abnormal part each time in the process of S500.
- In this way, the agricultural machine 100 updates its travel route.
- the agricultural machine 100 interrupts the original work and performs the process at the time of the work interruption defined in S400 to S444 (S502). Then, when the process of S502 ends, the process proceeds to S226.
- In the above description, the processing at the time of occurrence of an abnormality starts based on an image captured by the field monitoring apparatus 500 or the like, but it may also start when the abnormality is detected by the stereo camera apparatus 110 or the multispectral camera apparatus 113 installed in the agricultural machine 100. In this case, the agricultural machine 100 that captured the abnormal content performs the processing.
- FIG. 55 illustrates in detail the operation of step S422 (determination as to whether or not the target position has been reached) when an abnormal event occurs in step S502. This process is executed by cooperation between the agricultural machine 100 and the server 704.
- If no abnormal content is detected, the process proceeds to the step S444. On the other hand, when abnormal content is detected, the agricultural machine 100 stops moving (S552). Then, the image including the abnormal content captured by the stereo camera device 110, the distance information, the current position, and the like are transmitted to the server 704 (S554).
- the server 704 performs image recognition processing on the abnormal content (S558). For the image recognition process, the same method as in step S208 of FIG. 35A is used. Then, the server 704 determines whether this image recognition has been completed (S560).
- If the image recognition has not been completed, it is determined from the distance information sent from the agricultural machine 100 whether the agricultural machine 100 is within a predetermined distance (for example, within 3.5 m) of the abnormal content (S562). That is, it is determined whether or not the machine is close enough for the abnormal content to be recognized. This determination may also be made based on whether or not the length between the edges identified by the image recognition processing (that is, the size of the object) is greater than or equal to a predetermined value.
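- The nearness test of S562 combines a distance threshold with an apparent-size criterion; a sketch (the 3.5 m figure comes from the text, while the pixel span threshold is an assumption):

```python
def close_enough(distance_m, edge_span_px, max_distance_m=3.5, min_span_px=80):
    """True when the machine is near enough for recognition: within the
    predetermined distance, or the object's edge-to-edge span in the image
    is at least a predetermined size."""
    return distance_m <= max_distance_m or edge_span_px >= min_span_px
```

- Either criterion alone suffices, since a large apparent size implies the object fills enough of the frame to be recognized.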
- If it is determined in the step S562 that the machine is within the predetermined distance, a notification that an unrecognizable abnormality has occurred is transmitted to the user terminals 710 and 712 together with the captured image (S566). An example of an image displayed on the user terminals 710 and 712 at this time is shown in FIG. 56.
- FIG. 56 shows an image taken by the stereo camera device 110 from the position where the agricultural machine 100 is located (at the lower right in the preceding figure).
- The user terminals 710 and 712 display the abnormal content (abnormality source) 1000 in the image captured by the stereo camera device 110, together with additional information such as the distance 1103 to the abnormality source 1000 and the size 1101 of the abnormal content.
- The user recognizes the abnormal content by looking at the image and the displayed additional information on the user terminals 710 and 712, identifies the abnormal content, and selects a countermeasure (target operation). Then, when these are transmitted to the server 704 using the user terminal 710 or 712, the server 704 determines that the machine has arrived at the target operation position and further designates the target operation (S568).
- At this time, the server 704 registers the abnormal content, related images, feature patterns, and countermeasures in the database 706. As a result, it becomes possible to cope with the occurrence of the same or similar abnormal content in the future. Also when it is determined in the process of S560 that the content can be recognized, the server 704 determines that the machine has arrived at the position of the target operation, and designates the target operation according to the recognized abnormal content (for example, approach the abnormal content further, intimidate it by illuminating the illumination lamp 124, ignore it, proceed toward the abnormal content, or discharge water at it) (S568). The server 704 transmits the arrival judgment and the target operation to the agricultural machine 100.
- If it is determined in the step S562 that the machine is not within the predetermined distance, the server 704 does not determine arrival (S564).
- the server 704 transmits this non-arrival determination to the agricultural machine 100. As a result, the agricultural machine 100 further approaches the abnormal content.
- In the step S422, the agricultural machine 100, having received the judgment result of the process of S568 or S564 from the server 704, determines whether the server 704 has judged that it has arrived at the position where the target operation is performed.
- the overall system 1500 can efficiently perform appropriate processing for the abnormality.
- Although the example has been described in which the recognition processing of images acquired by the stereo camera device 110, the multispectral camera device 113, and the like installed in the agricultural machine 100, the field monitoring devices 500 and 555, and the state monitoring device 550, and other image processing, are mainly performed by the server 704, the present invention is not limited thereto, and the image processing may be performed by the agricultural machine 100, the camera device, the field monitoring device 500, or the like. By doing so, the amount of data communicated wirelessly can be reduced, the amount of communication data can be suppressed in the system as a whole, and the performance of the overall system 1500 can be improved (processing time can be shortened).
- In addition, the amount of electricity used by the agricultural machine 100 and the like can generally be suppressed; in particular, when an electrically driven agricultural machine is used, it becomes possible to work for a long time on a single charge.
- each agricultural machine may be operated by interlocking a plurality of agricultural machines 100 by wireless communication or wired communication.
- the first agricultural machine plows the field, and the next agricultural machine performs fertilization and sowing.
- the leading agricultural machine basically performs the operation described so far, and the subsequent agricultural machine performs work in accordance with an instruction from the leading agricultural machine.
- FIGS. 57 and 58 show another example of the agricultural machine 100 in the present embodiment.
- FIG. 57 shows a state of watering work by a mobile sprinkler
- FIG. 58 shows a state of fertilizer application work by a helicopter (quad copter).
- the technique shown in FIG. 57 is center pivot irrigation using a sprinkler 850 as the agricultural machine 100.
- The sprinkler 850 is a tower 854 with a triangular (truss) structure in which a plurality of aluminum water spray pipes 856 are connected; the tower 854 is moved on wheels 852 while sprinkling water.
- Water spray ports 858, and electronic valves 860 for controlling the supply of water to the water spray ports 858, are provided at various locations along the water spray pipes 856. It is more efficient to spray water near the crops 350 and 360 to prevent loss due to evaporation. For this reason, drop-type water spray ports 858 branched downward from the water spray pipes 856 are used, but the configuration is not limited to this.
- the sprinkler 850 moves in a circle around one end. Then, the pumped-up groundwater is supplied from the central side.
- This sprinkler 850 is also provided with a GPS receiver, a radio communication antenna and a control device in the same manner as the agricultural machine 100.
- This control device also controls the opening and closing of each electronic valve 860. And it connects with the information communication system 1502 shown in FIG.
- water is sprayed only from the water spout 858 passing near the top of the crop 360 having low plant activity.
- the sprinkler 850 itself may be provided with the multispectral camera device 113 and the like, and the server or the sprinkler 850 may determine the plant activity based on the spectral image, spectral information, and the like obtained therefrom, and control the watering.
- In this way, water can be used more efficiently.
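- Selective watering then amounts to opening only the valves whose spray ports pass over low-activity crops; a sketch (the activity threshold and the per-port data layout are assumptions):

```python
def valves_to_open(activity_by_port, threshold=0.5):
    """Given plant activity (e.g. NDVI) under each spray port, return the
    ports whose electronic valves 860 should open."""
    return [port for port, activity in sorted(activity_by_port.items())
            if activity < threshold]
```

- For example, with activities {1: 0.8, 2: 0.3, 3: 0.45}, only the valves at ports 2 and 3 open.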
- pest information may be acquired from the field monitoring device 555 provided with a polarizing filter, and pesticide spraying may be performed only on crops in which the pest exists.
- the polarization camera device may be installed on the sprinkler 850, and the server 704 or the sprinkler 850 may detect a pest from the polarization image obtained therefrom.
- The sprinkler 850 operates as described with reference to FIGS. 32, 33, 34, 35A, 36, 37, 46, and 47. However, since the sprinkler 850 has a predetermined movement route, complicated route calculation is not required. In addition, confirmation of the direction of travel in FIG. 35A (S202), route changes based on that determination (S208 to S222), and turning operations (S232, S234) are unnecessary and can be omitted. Further, since it is not necessary to resume from the exact work interruption position when the work is interrupted, operations for positioning the agricultural machine, such as S414, S434, and S438 in FIG. 49, can be omitted.
- The agricultural machine is not limited to the center-pivot method; for example, parallel-movement (lateral-move) irrigation may be performed.
- The entire system 1500 can perform the work only on objects that require it, such as watering, so that resources can be used efficiently.
- FIG. 58 shows a state where a helicopter (quad copter) 1100 is used as the agricultural machine 100 and the liquid fertilizer 802B is sprayed.
- This helicopter 1100 includes four rotor heads 1102 installed near the tips of arms extending from the helicopter 1100 main body, and four rotors 1104 rotatably connected to the rotor heads 1102; it flies by turning these rotors.
- The helicopter 1100 includes at least a GPS antenna 120, a wireless communication antenna 114, a stereo camera device 110, a multispectral camera device 113, and a control device 118C that controls the helicopter 1100, including rotation control of the rotors 1104.
- The stereo camera device 110 is installed on the helicopter 1100 so that the control device 118C can rotate it about an axis orthogonal to the vertical axis while the helicopter 1100 flies horizontally. The helicopter 1100 can thus point the stereo camera device 110 at the ground to check the state of crops and the like, or measure the distance between the ground surface and the helicopter 1100 to determine its altitude.
- the altitude is an example of the second digital information or the fourth digital information.
- The helicopter 1100 can also direct the stereo camera device 110 in the traveling direction to check whether obstacles are present there (for example, artificial objects such as the field monitoring devices 500 and 555 and the state monitoring device 550, or natural objects such as tall trees other than crops).
- the altitude may be measured by an altimeter that identifies the altitude that is flying based on the change in atmospheric pressure.
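The stereo distance measurement mentioned above follows the standard stereo relation Z = f·B/d; a minimal sketch (the focal length, baseline, and disparity values are illustrative, not taken from the specification):

```python
def altitude_from_disparity(focal_px, baseline_m, disparity_px):
    """Altitude above ground from stereo disparity: Z = f * B / d.

    focal_px: focal length in pixels, baseline_m: camera baseline in
    meters, disparity_px: measured disparity of the ground in pixels.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# e.g. f = 1000 px, B = 0.2 m, d = 40 px -> altitude 5.0 m
alt = altitude_from_disparity(1000, 0.2, 40)
```

Note that small disparities (high altitude) make the estimate increasingly sensitive to pixel error, which is why a barometric altimeter is a sensible alternative at height.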
- This helicopter 1100 detects the current position by a GPS signal or the like by the method described so far, and performs wireless communication with the information communication system 1502 of FIG.
- The helicopter 1100 or the server 704 grasps the state of the plants, such as the plant activity, based on the spectral images and spectral reflectance captured by the multispectral camera device 113, and only when the activity is less than a predetermined value does the helicopter 1100 operate the work device 106E to spread the fertilizer 802B on the crop 360. Of course, the helicopter 1100 may perform different work (for example, watering or agricultural chemical spraying) with the same or a different work device 106 based on this or other information.
- The helicopter 1100 basically operates according to the same flow as described with reference to FIGS. 32, 33, 34, 35A, 36, 37, 46, 47, 54, and 55. However, the altitude for flying is also set in S114 to S116 in FIG. Further, since the helicopter can fly over crops and the like, its route can be calculated more easily than the traveling route of the agricultural machine 100.
- In these operations, progress is made by flight.
- the agricultural machinery used is not limited to helicopters (quad copters).
- a multi-copter such as an octocopter having eight rotors or the like, a balloon type, an airplane type, or a glider type may be used.
- Remote operation means that the user operates the agricultural machine 100 using the user terminals 710 and 712.
- remote operation includes a case where the user operates the user terminals 710 and 712 while riding on the agricultural machine 100 or operates in the vicinity of the agricultural machine 100 or the like.
- an image captured by the agricultural machine 100 or the like and additional information may be displayed on the screens of the user terminals 710 and 712 via the server 704.
- The additional information is distance information or the like.
- this image is transmitted as video (moving image) information.
- a video server 705 specially handling video information is provided separately in the information communication system 1502 shown in FIG. 2 so as to be connected to the Internet 702 (see FIG. 59).
- The compression rate of the video information communicated can be adjusted according to the status of the communication line; the video is transmitted and received as data compliant with H.264 SVC.
- The agricultural machine 100 and the like may also transmit data in a format other than H.264 SVC, for example one compliant with H.265.
- The agricultural machine 100 or the like may transmit still image information in JPEG format, PNG format, or the like, continuously or intermittently, instead of video information.
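The compression-rate adjustment described above can be sketched as a layer-selection rule over H.264 SVC scalable layers (the bitrates and the selection policy are illustrative assumptions; the SVC standard defines the layer structure, not this policy):

```python
def select_layers(layer_bitrates_kbps, available_kbps):
    """Return how many SVC layers (base layer first) fit the line.

    Layers are added in order until the cumulative bitrate would
    exceed the measured available bandwidth; the base layer is
    always kept so video never stops entirely.
    """
    total = 0
    count = 0
    for rate in layer_bitrates_kbps:
        if total + rate > available_kbps:
            break
        total += rate
        count += 1
    return max(count, 1)

# Base 300 kbps plus two enhancement layers (500, 1200 kbps):
n = select_layers([300, 500, 1200], 900)
# n -> 2: a 900 kbps link carries the base and first enhancement layer
```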
- Examples of screens displayed on the user terminals 710 and 712 are images captured by the stereo camera device 110 as shown in FIGS.
- a spectral image captured by the multispectral camera device 113 (for example, an image in which spectral reflectance is expressed by luminance or color shading) may be displayed.
- the user operates the agricultural machine 100 by instructing the traveling direction, turning, and speed, and can also perform work using the work device 106.
- one of the operation of the agricultural machine 100 and the work by the work device 106 may be automatically performed, and the user may perform only one of them.
- images captured by the farm field monitoring devices 500 and 555 and the state monitoring device 550 can be displayed together, and for example, the position of the farm machine 100 or the like performing remote operation can be displayed.
- the operation management server 707 issues an operation command to the agricultural machine 100 or the like based on information input to the user terminals 710 and 712, and remotely controls the agricultural machine 100 or the like.
- Information input on the user terminals 710 and 712 may be manually input by a touch panel, keyboard, or mouse operation, or may be voice or a gesture.
- the operation management server 707 performs recognition using a program for recognizing the information, and transmits an operation command corresponding to the recognition result to the agricultural machine 100 or the like.
- images transmitted from a plurality of agricultural machines 100 or the like and images captured by a plurality of image sensors can be displayed on the screens of the user terminals 710 and 712 at once or by switching.
- The system user can thus have the machine move and work while information in addition to the image captured by the agricultural machine 100 is displayed on the user terminals 710 and 712.
- the remote operation is basically executed as a mode different from the automatic control mode described with reference to FIGS. 32 to 56, but the remote operation can also be performed during the automatic control.
- images captured by the agricultural machine 100 are always sent to the user terminals 710 and 712 via the video server 705.
- the remote operation from the user terminal 710 or 712 can be realized by performing an operation in which an instruction from the user is interrupted to the operation described with reference to FIGS. In this case, when there is no instruction from the user, the overall system 1500 can return to the automatic control as described with reference to FIGS. 32 to 56 and execute the processing operation.
- the server 704 also performs accounting processing (billing processing).
- By collecting system usage fees appropriately, the system provider can continue to operate the system, develop new services, and improve current ones; the challenge is to do this accurately and efficiently.
- the form of billing agreed between the system provider and the user at the start of system use is registered in the database 708.
- the server 704 transmits, to the user terminals 710 and 712 periodically (for example, monthly), billing for each of the charging forms I to III registered in the database 708, either individually or in combination.
- Pay-as-you-go billing forms are based on: (i) the work type, (ii) the work time, (iii) the size of the work site, (iv) the agricultural machine that performed the work, (v) analysis by the server 704, (vi) harvest date forecasting, (vii) market demand acquisition, and (viii) the amount of information communicated in the system, individually and/or in combination.
- The information for (i) to (viii) (or the information from which it is generated) is recorded in the database 708 by the server 704 as described above. For example, for the combination of (i) and (ii), the server 704 generates a total fee of $100 from the work type (harvesting: $5/hour) and the work time (20 hours); for the combination of (i) and (iii), it generates a total fee of $200 from the work type (land preparation: $0.2/square meter) and the size of the work site (1000 square meters).
- The work contents are the work type, the work time, the size of the work place, the agricultural machine that performed the work, and so on. The predetermined period is, for example, one month.
- Likewise, the server 704 can generate a total fee of $50 from the number of times (5 times) the harvest date (vi) was predicted, at $10 per prediction.
- These items (i) to (viii) are calculated by the server 704 based on the information registered in the database 708 for each work, and the charge is billed to the user terminals 710 and 712 at regular intervals (for example, every half year).
- The time during which the work was interrupted is subtracted, and the calculation is based on the actual working time.
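Using the example rates from the text, the pay-as-you-go calculation could be sketched as follows (the function name and rate table are hypothetical; the dollar figures match the examples above):

```python
RATES = {
    "harvest_per_hour": 5.0,        # (i)+(ii): $5 per hour of harvesting
    "land_prep_per_sqm": 0.2,       # (i)+(iii): $0.2 per square meter
    "harvest_date_forecast": 10.0,  # (vi): $10 per prediction
}

def metered_fee(work_hours=0.0, prep_sqm=0.0, forecasts=0):
    """Total pay-as-you-go fee for one billing period."""
    return (work_hours * RATES["harvest_per_hour"]
            + prep_sqm * RATES["land_prep_per_sqm"]
            + forecasts * RATES["harvest_date_forecast"])

# 20 hours of harvesting -> $100; 1000 m2 of land preparation -> $200;
# 5 harvest-date forecasts -> $50.
```

In the full system the quantities (hours, area, prediction counts) would be read from the records in the database 708 rather than passed in directly.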
- The overall system 1500 also provides success-fee billing forms: (i) charging a certain percentage (for example, 20%) of the sales of crops harvested using the overall system 1500; (ii) charging a certain percentage (for example, 50%) of the increase in sales obtained by growing crops with the overall system 1500; or (iii) setting the price for the charges of (i) and (ii) by taking the market price of the harvested crops into account (for example, increasing the ratios of (i) and (ii) when the market price rises above a certain level relative to a reference price, and decreasing them when the price falls sharply). The information for calculating (i) to (iii) is recorded in the database 708. The server 704 calculates these charges from the data stored in the database 708 and bills the user terminals 710 and 712 at regular intervals (for example, every half year).
- the fee may be discounted when the user satisfies a certain condition.
- For example, a discount of $3 per occasion can be given, up to a predetermined number of occasions (10 times/month).
- the predetermined amount may be the upper limit.
- Since this information is recorded in the database 708, the server 704 refers to it and applies the discount.
- In this way, the provider of the overall system 1500 can acquire data needed for its efficient future operation, and the user can receive a discount on the system usage fee.
- In this case, the system usage fee can be made lower than in the case of automatic control.
- The fee is set higher in order of the value provided by the overall system 1500, that is, in the order of automatic control, remote operation, and manual operation.
- manual operation using the manual operation unit 116 of the agricultural machine 100 can be made free of charge.
- The server 704 obtains such discount information from the data stored in the databases 706 and 708 and in the SSD in the server 704, calculates the discounted fee, and bills it to the user terminals 710 and 712.
- The server 704 can bill these flat-rate charges, pay-as-you-go charges, and success-fee charges individually or in combination, applying the discounts described above. Since the entire system 1500 can automatically acquire and collect information from the start of work to its completion, and further from harvesting to the retail of the agricultural products, it can perform accurate and efficient billing processing.
- The user of the overall system 1500 can make an electronic payment using a credit card, a debit card, or other electronic money via the user terminals 710 and 712 or the like, or can pay by bank transfer. If the server 704 cannot confirm payment within a predetermined period after billing the user terminals 710 and 712, it sends a reminder to the user terminals 710 and 712 or by another means such as mail. If payment cannot be confirmed within a predetermined period from the sending of the reminder, the server 704 prevents the user from using part or all of the entire system 1500. In this way, use of the entire system 1500 by a user who does not pay the usage fee can be restricted.
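The reminder-and-restriction flow above can be sketched as a small decision function (the day counts and names are assumptions; the text only says "a predetermined date" for both deadlines):

```python
REMIND_AFTER_DAYS = 30   # assumed deadline after billing
SUSPEND_AFTER_DAYS = 30  # assumed deadline, counted from the reminder

def billing_action(days_since_charge, paid, days_since_reminder=None):
    """Decide the server 704's next step for one unpaid/paid bill."""
    if paid:
        return "none"
    if days_since_reminder is not None and days_since_reminder > SUSPEND_AFTER_DAYS:
        return "restrict_use"   # block part or all of system 1500
    if days_since_reminder is None and days_since_charge > REMIND_AFTER_DAYS:
        return "send_reminder"  # to user terminal 710/712 or by mail
    return "wait"

# Unpaid for 40 days, no reminder yet -> send a reminder.
action = billing_action(40, False)
# action -> "send_reminder"
```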
- FIG. 60 shows a construction work machine (road roller) 1200 as another example of the moving body (working body) in the application embodiment of the present embodiment.
- This construction work machine 1200 has wheels (rollers) 2000 that are heavy and have a large ground contact area.
- the construction work machine 1200 travels while applying pressure to the entire surface of the road by the weight of the wheels, and hardens the soft ground.
- the construction machine 1200 further includes a prime mover 102D by an internal combustion engine, a transmission device 104D, a support device 108D, a stereo camera device 110, a wireless communication antenna 114, a manual operation unit 116, a control device 118D, a GPS antenna 120, a steering device 116, A pair of illumination lamps 124D, a set of ultrasonic sonar devices 126D, and a set of rear wheels 130B are provided.
- This construction work machine 1200 is connected to an information communication system 1502 equivalent to that shown in FIG. 2 or 59 by wireless communication.
- Based on information acquired by the stereo camera device 110, the construction work machine 1200 detects obstacles ahead, detects ground irregularities by distance measurement, and can work only on areas with a certain degree of irregularity that have not yet been compacted.
- the application of the overall system 1500 that performs automatic control of the moving body described in the present embodiment is not limited to construction work machines, and can be applied to devices and machines that perform movement and work. That is, the overall system 1500 of this embodiment can be applied to a system in which movement is performed based on a plurality of types of information, and work can also be performed based on a plurality of types of information (for example, electromagnetic waves having different frequencies).
- the movement is basically controlled so as to follow a predetermined route or a corrected route while observing position information.
- Route determination and trajectory correction during movement are performed, and the image information and distance information (or parallax values) acquired by the stereo camera device 110 are also used to correct positional deviations while moving.
- the laser radar device 112 may be used to move while confirming the shape and distance in the traveling direction.
- the work is mainly performed based on surface information acquired by a camera device including a lens and an image sensor and information related to the surface information. For example, in the case of a stereo camera device, a captured image (surface information) and distance information (related information) in the captured image are obtained.
- In the case of a multispectral camera device, the captured image (surface information) and the spectral reflectance information (related information) in the captured image are obtained.
- the captured spectral image (surface information), spectral reflectance information in the captured image, and distance (related information) in the image are obtained.
- With a combination of a polarization camera and a laser radar device, or with a polarization stereo camera device, a high-contrast polarized captured image (surface information) and the distances (related information) in the captured image are obtained.
- With a laser radar that can scan a laser in two dimensions, the shape information (surface information) of an object and the distance information (related information) within that shape are obtained; that is, the shape information is the surface information and the distance information is the related information.
- Movement and work may be controlled by the captured image, distance information, and spectral reflectance information (a combination of a stereo camera device and a multispectral camera device), or by the captured image, distance information, and polarization image information (a polarization stereo camera).
- the overall system 1500 can perform work using these complex information.
- The movement and operation of the machine are controlled using electromagnetic waves or light (images) and information related to the images. The electromagnetic waves may include terahertz waves; elastic waves (sound waves), information superimposed on those waves, and other environmental information may also be received and used by the system to control movement and work.
- A captured image is likely to be blurred if it is taken while moving. This can be addressed by slowing the moving speed (traveling speed, flight speed, diving speed, etc.) at the time of imaging, increasing the shutter speed, or installing a shake correction mechanism in the lens or imaging sensor. The image may also be corrected using a plurality of captured images.
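The trade-off described here can be quantified: the smear in pixels is the distance traveled during the exposure divided by the ground footprint of one pixel. A minimal sketch with illustrative numbers (not from the specification):

```python
def blur_pixels(speed_mps, exposure_s, ground_sample_m):
    """Pixels of motion blur for a camera moving at speed_mps.

    ground_sample_m: real-world size covered by one pixel (meters).
    """
    return speed_mps * exposure_s / ground_sample_m

# At 2 m/s with a 1/100 s shutter and 1 cm/pixel resolution, the smear
# is 2 pixels; halving either the speed or the exposure halves it.
b = blur_pixels(2.0, 0.01, 0.01)
```

This makes explicit why either slowing the vehicle or shortening the exposure (faster shutter) reduces blur proportionally.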
- The description so far has basically covered moving the moving body and the working body along a route, but the present invention is not limited to this. That is, when a user specifies a work area or work location with the user terminals 710 and 712, the moving body or working body moves autonomously within that area or location, or works while moving, while grasping the environment using various image sensors (camera devices), laser devices (laser radar devices), ultrasonic sonar devices, and wireless communication.
- the moving body and the working body have complex algorithms for autonomous control, and the control device controls machines such as the moving body and the working body based on the algorithm.
- the control device in this case may be provided inside the machine or may be provided outside to control the machine by wireless communication. This algorithm performs autonomous control related to movement when moving, and performs autonomous control related to operation of an object when performing work.
- Since the information communicated in the overall system 1500 is itself an information good with its own value, it is basically managed securely as described above.
- This machine includes all machines such as agricultural machines, construction machines, and flying machines (the same applies hereinafter).
- Examples of the plurality of types of information for controlling movement not based on manual operation include information for wireless position specification, image information and distance information acquired by a stereo camera device or another distance measuring device, and image information acquired by camera devices such as a monitoring camera installed at a fixed place together with distance information based on that image information.
- The image information includes spectral image information, distance information, reflectance information (spectral reflectance information), and polarization image information acquired by an image sensor, as well as shape information, distance information, and the like acquired by a laser device.
- Both the plurality of types of information for controlling movement not based on manual operation and the plurality of types of information for controlling work not based on manual operation include (surface) information expressed in at least two dimensions.
- one of the information for performing the movement may be image information or information on the shape.
- one piece of information for performing the work may be image information or information on a shape.
- the movement of the machine and the operation on the object that are not performed by manual operation are usually performed alternately or simultaneously (the same applies hereinafter).
- The route is specified by information such as information for wireless position identification, distance information acquired by a stereo camera device or another distance measuring device, or distance information based on image information acquired by a camera device, such as a monitoring camera, that can be installed at a specific location.
- the machine described in (2) can perform the operation based on a plurality of types of information. In the machine of (2), one of the information for performing the movement may be image information.
- a machine that performs work on a plurality of objects without manual operation, and that determines execution of the work for each object according to the state of each object acquired by the machine.
- the execution of work includes not only whether or not work is performed, but also includes the degree of work (amount of watering or fertilizer). For example, based on the activity of plants for each crop, the spectral reflectance for a specific wavelength, and the presence or absence of pests, automatic operations are performed on them (watering, fertilization, pesticide application, etc.). Also, for example, the ground is solidified according to the road surface condition in each area by automatic control.
- a control device that controls the movement of the machine and the work by the machine without manual operation, and performs the movement and the work of the machine based on image information acquired by the machine and information related to the image information. It is a control device to control.
- the control device may be a part of the machine, or may be different from the machine (for example, a server).
- a control device that controls the movement of the machine regardless of manual operation, and that controls the movement of the machine based on image information obtained by a device other than the machine and information related to the image information.
- the control device may be a part of the machine or may be different from the machine (for example, a server).
- a control device that controls work by a machine without manual operation, and that determines execution of the work according to the state of each target acquired by the machine.
- the execution of work includes not only whether or not work is performed, but also includes the degree of work (amount of watering or fertilizer).
- A machine provided with a device for leveling the ground surface and a device for irradiating light with a width at a predetermined angle, wherein the light is received by a light receiver separate from the machine, and the leveling device is controlled according to the position at which the light is received.
- a system comprising a work machine having a device for leveling the ground surface and a light receiving device for receiving light, and a light irradiation device for irradiating light with a predetermined angle and width.
- the work machine is a system that controls a device for leveling according to the position at which the light irradiated by the light irradiation device is received.
- a system including a machine that performs at least one of movement and work on an object without manual operation, and a control device that acquires information by wireless communication obtained from the machine and controls the machine, Information obtained from the machine is input to the control device via a plurality of wireless relays, and information output from the control device is input to the machine via a plurality of wireless relays.
- a machine that performs movement and work on an object regardless of manual operation, and stops the operation when an abnormality is detected, and performs an operation for the abnormality according to the content of the abnormality.
- a system including a machine that can move and a terminal that displays an image captured by the machine and allows a user to operate the machine, and also displays information related to the image on the terminal. It is.
- a system including a machine that performs a work and a control device that issues an instruction to cause the machine to perform the work using information acquired by the machine, the target of the work and the content of the work It is a system to memorize.
- the control device can use the stored information as information for determining the contents of future work.
- a machine that performs work on a target without manual operation acquires information for each target according to the type of work, and determines whether to perform work based on the acquired information. For example, if the type of work is watering, use a multispectral camera to obtain the state of the target crop (plant activity, etc.) and based on that, decide whether to water the target crop. In the case of pest removal, it is determined whether or not pests are attached to the surface of the target crop from the image captured by the polarization camera, and the pesticide is sprayed only when the pests are attached.
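The per-work-type decision described above could be sketched as a dispatch on the work type (the function name, state keys, and threshold are hypothetical; the sensor-to-work pairing follows the text: multispectral camera for watering, polarization camera for pest removal):

```python
ACTIVITY_THRESHOLD = 0.5  # assumed cutoff for "low plant activity"

def should_work(work_type, target_state):
    """Decide, for one target, whether to perform the given work.

    target_state: dict of sensor readings for that target (assumed keys).
    """
    if work_type == "watering":
        # multispectral camera -> plant activity; water only stressed crops
        return target_state["plant_activity"] < ACTIVITY_THRESHOLD
    if work_type == "pest_removal":
        # polarization camera -> pest detected on the crop surface
        return target_state["pest_detected"]
    raise ValueError(f"unknown work type: {work_type}")

# A stressed crop gets water; a pest-free crop gets no pesticide.
need_water = should_work("watering", {"plant_activity": 0.3})
# need_water -> True
```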
- a control device that gives an instruction to perform a work on a machine that can perform a work on a target without a manual operation, and when the work is performed, a usage fee corresponding to the work is charged. It is a control device to calculate.
- A control device that controls movement of a machine that can move without manual operation, and that, when there is an obstacle that cannot be recognized in the traveling direction of the machine, causes the machine to approach the obstacle more closely and then performs recognition processing.
- Visible light is an example of a visible electromagnetic wave, and an image is acquired from it. The invisible electromagnetic wave is a radio wave or invisible light.
- A method of performing at least one of movement and work without manual operation: acquiring a plurality of electromagnetic waves having different frequencies, recognizing information contained in the acquired electromagnetic waves, and performing at least one of movement and work based on the recognized information.
- This electromagnetic wave includes light (visible light and invisible light) and radio waves.
- a machine having an imaging device, which performs at least one of movement and work not based on manual operation based on information other than the image included in each micro area in the captured image.
- a machine that moves by specifying a position based on a received signal without using a manual operation, and has a distance information acquisition unit that acquires information about a distance to a certain point. The machine corrects the movement based on information about the acquired distance.
- (23) A machine that moves along a predetermined route, and that, upon detecting a reason for changing the route, changes the route and moves along the changed route.
- A machine comprising an obstacle detection means for detecting an obstacle in the traveling direction and a recognition means for recognizing the obstacle when the obstacle detection means detects one, and that changes its behavior depending on whether the recognition means can recognize the obstacle.
- a system comprising a device for specifying a work area and a machine for performing the work without manual operation, wherein the machine moves to the area specified by the device, and the area It is a system that determines whether or not work is necessary for each object in the system and performs work only on the object that is determined to be necessary.
- The first sensor device acquires an image and distance information within the acquired image, and the second sensor device acquires an image and color information within the acquired image. The machine controls its movement based on the distance information while performing work on the object based on the color information; alternatively, the object can be worked on based on both the distance and color information.
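The division of labor between the two sensor devices could look like this sketch (the command names, safety threshold, and the "greenness" rule are illustrative assumptions, not from the specification):

```python
SAFE_DISTANCE_M = 1.0  # assumed minimum clearance ahead

def control_step(distance_ahead_m, target_color_rgb):
    """One control cycle: distance drives movement, color drives work."""
    move = "advance" if distance_ahead_m > SAFE_DISTANCE_M else "stop"
    r, g, b = target_color_rgb
    # e.g. work (fertilize) only on targets that are not dominantly green
    do_work = g < max(r, b)
    return move, do_work

# Plenty of clearance, yellowing plant -> keep moving, perform the work.
cmd = control_step(3.0, (180, 150, 60))
# cmd -> ("advance", True)
```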
- A machine including a working device that performs work on an object and an information device that converts phenomena of the object into information, and a control device having a recognition unit that recognizes the phenomena digitized by the information device and a determination unit that makes determinations according to the phenomena recognized by the recognition unit, wherein the working device of the machine performs work on the object according to the determination result of the control device, without manual operation.
- Examples of the information device include the various camera devices and sensor devices such as the laser radar device described in the present embodiment. In particular, if a sensor device that can capture information of two or more dimensions is used, the range of phenomena that can be grasped is expanded.
- the machine includes an acquisition device that acquires at least a part of the object.
- the control device is a system that obtains tracking information of an object acquired by the acquisition device and controls work on the object of the machine based on the tracking information.
- an example of the acquisition device is a harvesting device of the working device 106 described in the present embodiment.
- Fruits that are part of a crop, and crops themselves, are harvested. Even after the fruits and crops are shipped, they can be tracked using bar code information and the like, and the shipping status and supply/demand status can be analyzed. This enables feedback into the cultivation work for the same crop.
- A machine that performs movement and work without manual operation using a non-contact sensor device, wherein the non-contact sensor device is a camera device that can acquire image information and information associated with the image information.
- A mobile work machine that performs work while moving, or that repeatedly moves and works, comprising a work information generation device that generates information related to the work and a transmission device that transmits the generated information to a control device that accumulates the received information and identifies the work contents in a predetermined period based on the accumulated information. In this case, the information related to the work may include information related to the start and end of the work, and may also include information related to work interruption and resumption.
- The machine may further include a state specifying unit that specifies the state of the work target and a determination unit that determines whether or not to perform work on the work target according to the state specified by the state specifying unit; the information regarding the work then includes the determination result by the determination unit.
- A system including a machine that performs work on a work target without manual operation and a control device that specifies the work contents based on information acquired from the machine, wherein the machine moves, performs the work, and sends information about the work to the control device, and the control device accumulates the received information, specifies the work contents in a predetermined period based on the accumulated information, and presents them.
- the information on the work may include information on the start and end of the work.
- the information related to the work may include information related to work interruption and resumption.
- The control device may present the work contents without specifying them for periods during which the work was interrupted.
- The work target may be a crop.
- the control device may be a system that analyzes the harvest of the crop based on the accumulated information and presents the analysis result. Further, the analysis by the control device may be performed by using environmental information acquired from outside the system in addition to information on the work.
- a work information production method including a step of generating information related to the work, a step of accumulating the information generated in that step, a step of specifying the work content in a predetermined period based on the accumulated information, and a step of outputting the specified work content.
- a computer-readable recording medium recording a program for executing the step of identifying the work content in a predetermined period based on the accumulated information.
- a system comprising a detection device capable of detecting information for specifying the state of a crop and a management device that acquires the information detected by the detection device, wherein the detection device includes a sensor that detects, without contact, information for specifying the state of the crop, and a transmission means that transmits the information detected by the sensor to the management device, and
- the management device comprises a specifying unit that specifies the state of the crop based on the information transmitted from the detection device, and a prediction unit that predicts the crop yield based on the result from the specifying unit.
- the detection device includes a moving means for moving, and can detect information for judging the state of the crop while moving through the area where the crop is cultivated.
- the detection device includes a control means that operates the moving means without manual operation.
- a recording device that records the results from the specifying unit is provided, and the prediction unit predicts the crop yield based on the result recorded in the recording device and results recorded in the recording device in the past.
- the transmission means of the detection device may also transmit, to the management device, information related to the work performed on the crop whose state is specified, and the prediction unit may be a system that predicts the crop yield using the information related to the work.
- a prediction data production method comprising a prediction step of predicting the crop harvest based on the specified state of the crop, and a step of outputting the result predicted in the prediction step.
- a recording step of recording the state of the crop specified in the specifying step may be provided, and the prediction step may predict the harvest based on the state of the crop specified in the specifying step and states of the crop recorded in the past.
- a computer-readable recording medium storing a program for executing the prediction process and a process of displaying the result predicted in the prediction process.
- a machine that moves and works without manual operation, comprising a moving means that moves along a predetermined route, and a recognizing means that recognizes a work target while being moved by the moving means, wherein
- when the determination unit determines that the work is to be performed, the machine may move near the work target by the moving means and then perform the work on the work target.
- the machine may further comprise a measuring means that measures the remaining amount of fuel or electric power for movement; when the remaining amount falls below, or is about to fall below, a predetermined value, the machine interrupts the work, moves to the fuel or power supply position, and, after the supply is completed, returns to the position where the work was interrupted and resumes the work. Furthermore, a confirmation means that confirms the remaining amount of work resources for performing the work may be provided; when that remaining amount falls below, or is about to fall below, a predetermined value, the machine likewise interrupts the work, moves to the supply position, and, after the supply is completed, returns to the position where the work was interrupted and resumes the work.
- a system including the machine according to (40) and a specifying device that specifies a region in which work targets requiring work exist, wherein the specifying device comprises
- a wide-area recognition means that recognizes a plurality of work targets in a region wider than the region the machine can recognize, and a wide-area determination means that determines whether the plurality of work targets recognized by the wide-area recognition means include work targets that require work, and
- the machine moves to the region in which the work targets requiring work exist and starts recognition by the recognizing means.
- a system comprising a machine that moves and works without manual operation and a control device that controls the work, the machine comprising a moving means that moves along a predetermined route, a
- detecting means that detects a work target while being moved by the moving means, a transmitting means that transmits information on the work target detected by the detecting means to the control device, and a
- working means that performs work on the work target in response to a command from the control device, and the control device comprising a receiving means that receives the information transmitted from the machine, a
- recognizing means that recognizes the work target using the information on the work target received by the receiving means, a detecting means that detects the state of the work target recognized by the recognizing means, a
- determination means that determines, based on the detection result of the detecting means, whether or not to perform work on the work target, and
- a control means that commands the machine to perform the work on the work target when the determination means determines that the work is to be performed, and commands the machine, when the determination means determines that the work is not to be performed, to move by the moving means and detect the next work target by the detecting means without performing the work.
- the system may further include a specifying device that specifies a region in which work targets requiring work exist, the specifying device comprising a wide-area detection means that detects a plurality of work targets in a region wider than the range the machine can detect and a transmission unit that transmits the detected information on the plurality of work targets to the control device, and the control device comprising a reception unit that receives the information transmitted from the specifying device and a wide-area judging means that judges, based on the information on the plurality of work targets received by the reception unit, whether the plurality of work targets include work targets that require work; when the wide-area judging means judges that work is required, the machine moves by the moving unit to the region in which the work targets requiring work exist and starts detection by the detection unit.
- a control method comprising a moving step of moving the work device along a predetermined route, a recognition step of recognizing a work target with the work device while moving it, a detection step of detecting the state of the recognized work target, a determination step of determining, based on the result of the detection step, whether or not to perform work on the work target, a work step of performing work on the work target with the work device when the determination step determines that the work is to be performed, and a step of recognizing the next work target while moving in the moving step, without performing the work step, when the determination step determines that the work is not to be performed.
- a machine comprising a current position acquisition device that acquires the current position, a sensor device that acquires image information, a transmission device that moves the machine by transmitting power generated by a power source, and a control device that controls movement by the transmission device,
- wherein the control device controls movement by the transmission device based on the current position acquired by the current position acquisition device, and corrects the movement based on the image information acquired by the sensor device.
- the correction of the movement may be performed based on information about distance obtained from the image information.
- the information about distance may be information about the distance to the ground surface.
- the control device may recognize an object present in the traveling direction of the machine from the image information and correct the movement according to the recognition result. If the object cannot be recognized, the control device may prompt the user to specify the type of the object, or may correct the movement so as to avoid the object.
- a system comprising a machine having a current position acquisition device that acquires the current position, a sensor device that acquires image information, and a transmission device that moves by transmitting power generated from a power source, and
- a control device that controls the movement of the machine, wherein the machine transmits the current position acquired by the current position acquisition device and the image information acquired by the sensor device to the control device, and the control device controls the movement of the machine based on the acquired current position and corrects the movement of the machine based on the acquired image information.
- a method of moving a machine without manual operation, comprising a current position acquisition step of acquiring the current position, an image information acquisition step of acquiring image information, a step of moving the machine by transmitting power generated from a power source based on the current position acquired in the current position acquisition step, and a step of correcting the movement based on the image information acquired in the image information acquisition step.
- a program that causes a computer of a machine that moves without manual operation to execute a current position acquisition step of acquiring the current position, an image information acquisition step of acquiring image information,
- a step of performing the movement, by transmitting power generated from a power source, based on the current position acquired in the current position acquisition step, and a step of correcting the movement based on the image information acquired in the image information acquisition step.
- a computer-readable recording medium on which the program is recorded.
Landscapes
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Mechanical Engineering (AREA)
- Soil Sciences (AREA)
- Environmental Sciences (AREA)
- Signal Processing (AREA)
- Multimedia (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Optics & Photonics (AREA)
- Management, Administration, Business Operations System, And Electronic Commerce (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
Description
100A, 100C Tractor
110 Stereo camera device
112, 112-2 Laser radar device
113 Multispectral camera device
310 Cost calculation unit
320 Cost aggregation unit
330 Disparity value derivation unit
350, 360 Crops
500, 555 Field monitoring device
501 Spherical camera device
550 State monitoring device
704 Server
705 Video server
706, 708 Database
707 Operation management server
710, 712 User terminal
850 Sprinkler
1100 Helicopter
1500 Overall system
1501 System
1502 Information communication system
<System configuration in the field>
In agricultural work in fields, improving the efficiency of moving agricultural machines such as tractors and of the work performed with them is a recognized problem, and it is desirable to carry both out under automatic control with as little human intervention as possible. FIG. 1 shows the configuration of a system 1501 in a field to which this embodiment is applied. The overall system 1500 of this embodiment is the combination of the system 1501 of FIG. 1 and the information communication system 1502 of FIG. 2. In the following, a description given for the overall system 1500 may in fact apply to the system 1501 or the information communication system 1502, and a description given for the system 1501 or the information communication system 1502 may apply to the overall system 1500.
FIG. 2 shows the configuration of the information communication system to which this embodiment is applied. The information communication system 1502 includes a wireless access point 700, the Internet 702, a server 704, a database 706, a database 708, a user terminal 710, and a user terminal 712.
[Agricultural machines and devices]
Next, the agricultural machines of this embodiment, the various sensor devices mounted on the agricultural machines and the like, and the devices installed in the field are described with reference to FIGS. 3 to 31.
The agricultural machine, one component of the overall system 1500, travels automatically based on instructions from the server 704 so as to work efficiently, and can automatically perform work on work targets such as crops (an example of a first target) and soil (an example of a second target). FIG. 3 mainly shows the appearance of an agricultural machine 100A. Where the same reference numerals are used in other drawings, they denote components with the same functions, and their description may be omitted.
The support device 108 holds the motor 102A, the transmission 104, and the work device 106A in their respective places.
FIG. 5 explains the transmission 104 of FIG. 3 or FIG. 4 in detail. The transmission 104 is the means by which the agricultural machine 100 and the work device 106 are moved. In the figure, solid lines indicate the transfer of kinetic energy, broken lines the transfer of electrical signals, and dash-dotted lines electricity supply lines. FIG. 5 shows an example in which the power source of the motor 102 is an internal combustion engine and the two rear wheels are driven; an example in which the power source of the motor 102 is an electric motor is shown in FIG. 6. The drive system is not limited to two-wheel drive and may be four-wheel drive.
The motor 102-2 is a power unit including a motor controller and an electric motor. Since the transmission 104-2 controls the rotation speed and rotation direction through the motor, the gearbox 204 described with FIG. 5 is basically unnecessary, but a gearbox 204 may be provided for smoother travel.
With electric-motor drive, the motor characteristics allow high torque even at low rotation speeds (that is, when moving slowly), which suits agricultural machines, which work at lower speeds than automobiles. In addition, since the battery 224-2 can be recharged automatically as described later, a series of agricultural tasks can be carried out efficiently without human labor. The transmission 104-2 may also be driven by an in-wheel motor system, with the motors built into the wheels.
A. Configuration of the stereo camera device
FIG. 7 shows the appearance of the stereo camera device 110. The stereo camera device 110 images a certain area and generates image data that can be transmitted to the control device 118 of the agricultural machine 100, the server 704, and the user terminals 710 and 712; in addition, it acquires distance information (or disparity information) from the stereo camera device 110 to each point in the captured image. The distance information (or disparity information) can, of course, also be transmitted to the control device 118 and the like. The stereo camera device 110 can perform ranging based on the Semi-Global Matching (SGM) method.
Next, the ranging method of the stereo camera device 110, and in particular the method of obtaining disparity values with the SGM method, is described. First, an outline of SGM-based ranging is given with reference to FIGS. 10 to 15.
Δ = x′ − x (Equation 1)
Here, in a case such as FIG. 10, if Δa is the distance between the point Sa(x, y) in the reference image Ia and the foot of the perpendicular dropped from the imaging lens 11a onto the imaging surface, and Δb is the distance between the point Sb(x′, y′) in the comparison image Ib and the foot of the perpendicular dropped from the imaging lens 11b onto the imaging surface, then the disparity value is Δ = Δa + Δb.
Z = (B × f) / Δ (Equation 2)
From (Equation 2), the larger the disparity value Δ, the smaller the distance Z, and the smaller the disparity value Δ, the larger the distance Z.
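The relation in (Equation 2) can be checked numerically. The following is a minimal sketch in Python; the baseline, focal length, and disparity values are hypothetical and not taken from the stereo camera device 110:

```python
def depth_from_disparity(baseline_m: float, focal_px: float, disparity_px: float) -> float:
    """Z = (B * f) / delta (Equation 2): distance grows as disparity shrinks."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return baseline_m * focal_px / disparity_px

# Hypothetical stereo rig: 30 cm baseline, 800 px focal length.
near = depth_from_disparity(0.3, 800.0, 40.0)  # large disparity -> near object (6 m)
far = depth_from_disparity(0.3, 800.0, 4.0)    # small disparity -> far object (60 m)
```

The units follow directly from the formula: with the baseline in metres and the focal length and disparity both in pixels, the pixel units cancel and Z comes out in metres.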
Lr(p,d) = C(p,d) + min{Lr(p−r,d), Lr(p−r,d−1)+P1, Lr(p−r,d+1)+P1, Lrmin(p−r)+P2} (Equation 3)
Here, r denotes an aggregation direction, and min{} is the minimum function. As shown in Equation 3, Lr is applied recursively. P1 and P2 are fixed parameters determined in advance by experiment, chosen so that pixels farther from the reference pixel p(x, y) have less influence on the path cost Lr; for example, P1 = 48 and P2 = 96.
As shown in (Equation 3), Lr(p, d) is obtained by adding, to the cost value C at the reference pixel p(x, y), the minimum of the path cost values Lr of the pixels along direction r shown in FIG. 14. To obtain Lr for each pixel along direction r, Lr is first computed at the pixel at the far end of direction r from the reference pixel p(x, y), and then along direction r toward it. Then, as shown in FIG. 14, the eight path costs Lr0, Lr45, Lr90, Lr135, Lr180, Lr225, Lr270 and Lr315 are obtained, and the aggregated cost Ls is finally obtained from (Equation 4).
Ls(p,d) = ΣLr (Equation 4)
The aggregated cost Ls(p, d) computed in this way can be plotted for each shift amount d, as shown in FIG. 15. In FIG. 15, the aggregated cost Ls takes its minimum at shift amount d = 3, so the disparity value is calculated as Δ = 3.
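The recursion of (Equation 3) and the winner-take-all step of FIG. 15 can be illustrated along a single aggregation direction. This is a sketch for one horizontal scanline only, not the device's implementation; the cost array is made up, and a full SGM run would repeat this for all eight directions and sum the results per (Equation 4):

```python
import numpy as np

def sgm_path_cost(cost: np.ndarray, p1: float = 48.0, p2: float = 96.0) -> np.ndarray:
    """Aggregate matching cost along one scanline direction per (Equation 3).

    cost: array of shape (width, num_disparities) holding C(p, d).
    Returns Lr with the same shape.
    """
    w, d = cost.shape
    lr = np.zeros_like(cost, dtype=float)
    lr[0] = cost[0]  # recursion starts at the far end of the path
    for x in range(1, w):
        prev = lr[x - 1]
        prev_min = prev.min()  # Lrmin(p - r)
        for disp in range(d):
            candidates = [prev[disp], prev_min + p2]
            if disp > 0:
                candidates.append(prev[disp - 1] + p1)
            if disp < d - 1:
                candidates.append(prev[disp + 1] + p1)
            lr[x, disp] = cost[x, disp] + min(candidates)
    return lr

# Hypothetical cost volume: matching is cheapest at shift d = 3 everywhere.
cost = np.full((5, 8), 10.0)
cost[:, 3] = 0.0
lr = sgm_path_cost(cost)
best_d = int(lr[-1].argmin())  # winner-take-all, as in FIG. 15
```

With this input, the aggregated cost stays lowest at d = 3, so the winner-take-all step recovers Δ = 3, mirroring the example in FIG. 15.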
FIG. 16 shows the configuration of the laser radar device 112. Shape information from the laser radar device 112 is an example of the first digital information and the third digital information, and distance information from the laser radar device 112 is an example of the second digital information and the fourth digital information. The laser radar device 112 irradiates an object with pulsed laser light, measures the return time t of the reflected pulse, and calculates the distance L to the irradiated point using (Equation 5).
L = c × t / 2 (Equation 5), where c is the speed of light
Furthermore, since the laser radar device 112 can scan the laser beam in two dimensions, it can determine the bearing to each point on the object and thereby also measure the object's shape.
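The time-of-flight relation of (Equation 5) can be written as a one-line helper; the timing value below is illustrative, not a specification of the laser radar device 112:

```python
def tof_distance_m(round_trip_s: float, c: float = 299_792_458.0) -> float:
    """L = c * t / 2 (Equation 5): halve the out-and-back travel time of the pulse."""
    return c * round_trip_s / 2.0

# A pulse reflected from a target 15 m away returns after roughly 100 ns.
t_15m = 2 * 15.0 / 299_792_458.0
distance = tof_distance_m(t_15m)
```

The division by two reflects that the measured time covers both the outgoing and the returning leg of the pulse.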
FIG. 17 shows the appearance of the multispectral camera device 113. Spectral information from the multispectral camera device 113 is an example of the second digital information and the fourth digital information. The multispectral camera device 113 is a camera that obtains a captured image and the spectral reflectance within that image. It is suited to detecting, without contact or destruction, the state of plants over a range (a region or surface) rather than at a single point, all at once. The multispectral camera device 113 has a body 400 and a lens barrel 402, and is mounted rotatably on the agricultural machine 100; the rotation is performed manually or under control of the control device 118. The camera can thereby image reflected light from objects in various directions around the agricultural machine 100 and grasp growth conditions such as plant activity, the distance between branches, and leaf size.
NDVI = (IR − R) / (IR + R) (Equation 6)
The normalized difference vegetation index (NDVI) normally takes values from −1 to +1, and the larger the value, the higher the plant activity. With the multispectral camera device 113, the NDVI can in principle be obtained over the entire imaged area. Specifically, as with the filter 440 of FIG. 23B, a filter 440a for the 660 nm band in the visible red region 2404 and a filter 440b for the 770 nm band in the near-infrared region 2405 are adopted as the filters of the multispectral camera device 113 of this embodiment. For the near-infrared region 2405, a filter for the 785 nm or 900 nm band may instead be used as the filter 440b; 785 nm is a wavelength easily obtained with a laser diode (LD). Half of the LEDs 404 emit light with high intensity near 660 nm, and the other half near 770 nm. With this configuration, the multispectral camera device 113 illuminates the target plants with the LED light and images the reflected light. The FPGA 418 then produces a spectral image at 660 nm and a spectral image at 770 nm, and the spectral reflectance calculator 420 obtains the spectral reflectance at a desired position or region in these spectral images. The CPU in the spectral reflectance calculator 420 then applies (Equation 6) to obtain the NDVI. Alternatively, rather than inside the multispectral camera device 113, the control device 118 of the agricultural machine 100 or the server 704 that received the spectral images and spectral reflectance information may apply (Equation 6) to obtain the NDVI. The NDVI of each crop is sent to and accumulated in the database 708. Growth conditions may also be assessed using only the spectral reflectance at a visible red wavelength (for example 660 nm) in region 2404 rather than the NDVI, because the change in spectral reflectance with plant activity is large in this region. This makes it possible to grasp growth conditions while omitting the reflectance measurement in the near-infrared region 2405 and the NDVI calculation, speeding up processing and decisions. On the other hand, computing the NDVI yields normalized, more precise information on growth conditions (plant activity).
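(Equation 6) applied per pixel over a pair of spectral images can be sketched as follows; the reflectance values are made up for illustration, not output of the FPGA 418 or the spectral reflectance calculator 420:

```python
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray, eps: float = 1e-9) -> np.ndarray:
    """NDVI = (IR - R) / (IR + R) per pixel (Equation 6); eps avoids division by zero."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)

# Hypothetical reflectances: active vegetation reflects strongly near 770 nm
# and absorbs near 660 nm, so its NDVI approaches +1.
nir_band = np.array([[0.50, 0.40], [0.05, 0.30]])  # 770 nm spectral image
red_band = np.array([[0.08, 0.07], [0.05, 0.25]])  # 660 nm spectral image
v = ndvi(nir_band, red_band)
```

The top row (high near-infrared, low red) comes out strongly positive, while a pixel with equal reflectance in both bands, such as bare soil or water here, comes out near zero.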
When managing a vast field, it is desirable to quickly recognize the state of crops, such as growth conditions, over a wide area. FIG. 25 shows a state monitoring device 550 using the multispectral camera device 113. The state monitoring device 550 is a device for quickly measuring, over a wide area, crop activity, soil conditions and the like in a field. It comprises: the multispectral camera device 113; a holder 450 that holds the multispectral camera device 113 rotatably about a horizontal axis; a rotary stage 452 that holds the holder 450 rotatably about a vertical axis; a solar panel 456 consisting of interconnected solar cells that convert solar energy into electrical energy; a housing 454A containing a storage battery that stores the electricity generated by the solar panel 456 and a controller that handles commands to the multispectral camera device 113, input/output and communication control of information from the multispectral camera device 113, and rotation control of the holder 450 and the rotary stage 452; a wireless antenna 458 connected to the controller in the housing 454A for wireless communication with the agricultural machine 100, the server 704, and the user terminals 710 and 712; a transparent glass cover 462 that protects the multispectral camera device 113 and the other components from the surrounding environment; and a post 460 that supports the state monitoring device 550 at an elevated position. The communication, various controls, and imaging of the state monitoring device 550 operate on the electrical energy stored in the storage battery; when that energy is insufficient, or when no storage battery is used, external power may be used. The cover 462 need not be glass as long as it is transparent, and may for example be a resin such as acrylic. The solar panel 456 is installed above the cover 462 and the housing 454A below it, and the multispectral camera device 113, the holder 450, and the rotary stage 452 are installed inside the cover 462. Based on information sent from the user terminals 710 and 712, the server 704, or the agricultural machine 100, the state monitoring device 550 images the crops in the surrounding area and examines their plant activity. For this imaging, reflected sunlight may be captured without using the LEDs 404. The controller and the wireless antenna 458 also function as a wireless access point and can relay wireless communication, widening the area in which wireless communication is possible. Furthermore, on instruction from the agricultural machine 100, the server 704, or the user terminals 710 and 712, the controller transmits, via the wireless antenna 458, a signal for identifying the position of the agricultural machine 100. The agricultural machine 100 can determine its current position from the received signal strengths (or attenuation) or reception time differences of signals transmitted from a total of three or more state monitoring devices 550 or field monitoring devices 500.
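The final step above, fixing the machine's position from three or more monitoring posts, can be illustrated with a least-squares trilateration sketch. The post coordinates and distances below are hypothetical; a real system would first convert received signal strengths or time differences into distance estimates:

```python
import numpy as np

def trilaterate(anchors, distances):
    """Estimate a 2-D position from distances to three or more fixed anchors.

    Subtracting the first circle equation from the others cancels the quadratic
    terms, leaving an ordinary least-squares problem A x = b.
    """
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    x0, y0 = anchors[0]
    a_rows, b_rows = [], []
    for (xi, yi), di in zip(anchors[1:], d[1:]):
        a_rows.append([2 * (xi - x0), 2 * (yi - y0)])
        b_rows.append(d[0] ** 2 - di ** 2 + xi ** 2 - x0 ** 2 + yi ** 2 - y0 ** 2)
    sol, *_ = np.linalg.lstsq(np.array(a_rows), np.array(b_rows), rcond=None)
    return sol

# Three hypothetical monitoring posts at known field coordinates (metres).
posts = [(0.0, 0.0), (100.0, 0.0), (0.0, 100.0)]
true_pos = np.array([30.0, 40.0])
dists = [np.linalg.norm(true_pos - np.array(p)) for p in posts]
est = trilaterate(posts, dists)
```

With exact distances the estimate recovers the true position; with noisy distances from more than three posts, the same least-squares formulation averages the error out.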
An ordinary camera can image only one direction at a time. To monitor the entire surroundings with such a camera, operations such as rotating it are required, which enlarges the monitoring device, adds the cost of a rotation mechanism, and, because of the moving parts, generally makes failure more likely. For monitoring a vast field, it is therefore desirable to image as wide a range as possible in a single shot. FIG. 26 shows a field monitoring device 500 using a spherical camera device 501. The spherical camera device 501 is an example of a sensor. It can image 360 degrees around the camera in a single shot, so when installed in a field it can monitor not only the field itself but also, for example, the weather from images of the sky. The field monitoring device 500 also allows evaluation of sunlight exposure and the like over a wide area. In FIG. 26, components with the same reference numerals as in FIG. 25 perform the same functions as described for FIG. 25, so their description is omitted here. Reference numeral 454B denotes a housing containing a storage battery and a controller, as in the state monitoring device 550; this controller differs from that of the state monitoring device 550 in that it issues instructions to, and controls the input/output of information from, the spherical camera device 501 rather than the multispectral camera device 113, and performs no rotation control.
The spherical camera device 501 of this embodiment is described with reference to FIGS. 27 to 30. FIG. 27 is a front external view of the spherical camera device 501. The camera has two optical systems A and B with fisheye (wide-angle) lenses, and a body 502.
FIG. 28 shows the optical systems of the spherical camera device 501. In FIG. 28, the parts labeled A and B are imaging optical systems. Each of the two imaging optical systems A and B consists of a wide-angle lens with an angle of view wider than 180 degrees and an imaging element IA, IB that captures the image formed by that lens. Imaging optical system A consists of a front group of lenses LA1 to LA3, a right-angle prism PA forming a reflecting surface, and a rear group of lenses LA4 to LA7, with an aperture stop SA placed on the object side of lens LA4. Imaging optical system B consists of a front group of lenses LB1 to LB3, a right-angle prism PB forming a reflecting surface, and a rear group of lenses LB4 to LB7, with an aperture stop SB placed on the object side of lens LB4.
Condition (1): 7.0 < d/f < 9.0
is satisfied.
To explain the significance of Condition (1): a smaller value of the parameter d/f means either that the focal length f of the whole system becomes longer or that the distance d between the intersection of the front group's optical axis with the reflecting surface and the front principal point becomes smaller. If f becomes larger, the total lens length along the optical axis of the wide-angle lens becomes longer, so if f is set to an appropriate value for compactness, a small d/f under that condition means a small d. As d becomes smaller, the gap between lens LA3 (LB3) and prism PA (PB) narrows, and the constraint on the lens thickness needed to secure the required refractive power of LA3 (LB3) becomes severe. Below the lower limit of Condition (1), the desired thickness and shape of lens LA3 (LB3) can no longer be manufactured, or becomes difficult to manufacture. In FIG. 28, bringing the imaging optical systems A and B as close together as possible in the horizontal direction of the figure serves the goal of miniaturizing the spherical camera device 501. Since the reflecting surfaces are the slopes of the right-angle prisms PA and PB, bringing these slopes as close together as possible is effective for this miniaturization. In Condition (1), a larger d/f means a larger distance d between the intersection of the front group's optical axis with the reflecting surface and the front principal point, which means the front group becomes larger; such enlargement of the front group makes miniaturization of the spherical camera device 501 difficult. In that case, one way to absorb the resulting enlargement of the spherical camera device 501 is to shift the imaging optical systems A and B vertically in FIG. 28 while keeping the slopes of prisms PA and PB close together. If this is done, however, the optical axes of the front groups of the two wide-angle lenses are offset vertically in FIG. 28, and if this offset exceeds a certain degree the influence of parallax becomes large. Enlargement of the front group can be tolerated while the influence of parallax is effectively suppressed only when the parameter d/f is smaller than the upper limit of Condition (1). What regulates, for the spherical camera device 501, the condition on the ratio d/f of the above distance d to the focal length f is
Condition (4): 16 ≤ (d1+d2)/f < 21
If the value falls below the lower limit of Condition (4), the reflecting surfaces of prisms PA and PB interfere with each other, and if it exceeds the upper limit, the influence of parallax can no longer be ignored; within this range the influence of parallax is kept suppressed.
Condition (3): nd ≥ 1.8
requires that the material of prisms PA and PB have a refractive index nd for the d-line greater than 1.8. Since prisms PA and PB internally reflect the light from the front group toward the rear group, the imaging light path passes through the prism. If the prism material has a high refractive index satisfying Condition (3), the optical path length inside the prism becomes longer than the physical path length, extending the distance over which rays can be bent. The optical path length between the front and rear groups can thus be made longer than the mechanical path length, allowing the wide-angle lens to be built compactly. Also, placing prisms PA and PB near the aperture stops SA and SB allows small prisms to be used, reducing the spacing between the wide-angle lenses. The prisms PA and PB are placed between the front and rear groups: the front group of the wide-angle lens takes in rays over a wide angle of view of 180 degrees or more, while the rear group works effectively for aberration-corrected imaging. Arranging the prisms in this way makes the system less susceptible to prism misalignment and manufacturing tolerances.
Next, the configuration of the spherical camera device 501 of this embodiment is shown with FIG. 29. As shown in FIG. 29, the spherical camera device 501 comprises the imaging optical systems A and B, the imaging elements IA and IB, an image processing unit 504, an imaging control unit 506, a CPU 510, a ROM 512, a Static Random Access Memory (SRAM) 514, a Dynamic Random Access Memory (DRAM) 516, an operation unit 518, a network I/F 520, and a communication unit 522. Each imaging element IA, IB has an image sensor such as a CMOS or CCD sensor that converts the optical image formed by the wide-angle lens into electrical image data, a timing generation circuit that generates the horizontal and vertical synchronization signals and the pixel clock for that image sensor, and a group of registers in which the various commands and parameters needed for its operation are set. The imaging elements IA and IB are each connected to the image processing unit 504 by a parallel I/F bus, and to the imaging control unit 506 by a serial I/F bus (such as an I2C bus). The image processing unit 504 and the imaging control unit 506 are connected to the CPU 510 via a bus 508, to which the ROM 512, SRAM 514, DRAM 516, operation unit 518, network I/F 520, and communication unit 522 are also connected. The image processing unit 504 takes in the image data output from the imaging elements IA and IB through the parallel I/F buses, applies predetermined processing to each, and then combines them to create equirectangular image data as shown in FIG. 30C. The imaging control unit 506, generally acting as the master device with the imaging elements IA and IB as slave devices, sets commands and the like in their register groups over the serial I/F bus such as I2C, receiving the necessary commands from the CPU 510. It also reads status data and the like from the register groups of the imaging elements IA and IB over the same serial I/F bus and passes them to the CPU 510. The imaging control unit 506 further instructs the imaging elements IA and IB to output image data at the moment the shutter button of the operation unit 518 is pressed. In the field monitoring device 500, the operation unit 518 is omitted, and imaging is performed based on instructions from the controller stored in the housing 454, connected via the network I/F 520.
FIG. 31 shows another example of a field monitoring device. Unlike the field monitoring device 500 described above, this field monitoring device 555 installs the solar panel 456 and the wireless antenna 458 above, via a post 470, without contact with the transparent cover 462. The rest of the configuration is the same as that of the field monitoring device 500. This keeps the solar panel out of the way even when images slightly above the horizontal are wanted. In the field monitoring device 555 of FIG. 31, the spherical camera device 501 may also be replaced by the multispectral camera device 113, the holder 450, and the rotary stage 452 shown in FIG. 25, together with a controller that controls them, to form a state monitoring device 550.
A plurality of field monitoring devices 500 and 555 and state monitoring devices 550 are installed in the field, but if the field is small enough to be monitored with one device, a single device suffices. The field monitoring devices 500 and 555 and the state monitoring device 550 are examples of sensors.
To have the agricultural machine 100 move and work without manual operation, the work locations, work content and so on must be set before the operation is executed. FIGS. 32 and 33 are flowcharts explaining the initial settings performed on the server 704, the user terminals 710 and 712, and the agricultural machine 100 so that the agricultural machine 100 can move and work in the field. The description follows these figures. In principle, the left column shows operations performed by the agricultural machine 100, the center column those performed by the server 704, and the right column those performed by the user terminals 710 and 712, although in some figures the operations are described as being performed by only one or two of them.
Next, the operations from normal work start to work end are described with reference to FIGS. 34 to 56. In the automatic control of machines that move and work, not only in agriculture, the machine must be moved to the work start position, perform the work, and, after the work is completed, be moved to its storage position. FIG. 34 shows the general operations from work start to work completion (movement to the storage position). In the figure, steps S162, S170 and S180 are processes defined separately in the descriptions using FIG. 35A and other figures.
When the agricultural machine 100 reaches the work start position, it notifies the server 704 that it has reached the work start position (S164).
As an example of the processing shown in FIG. 37B, field leveling work using the laser radar device 112 is described with reference to FIGS. 38 to 42 (the work resource confirmation step S264 is unnecessary in this example and is omitted). In addition to the overall system 1500 consisting of the field system 1501 of FIG. 1 and the information communication system 1502 of FIG. 2 described so far, leveling work requires special equipment in the field such as a laser receiver (610 in the example of FIG. 39) or a laser emitter (618 in the example of FIG. 42), but the basic operations of automatic driving and automatic work are the same as those already described. Since the laser radar device 112 can emit laser light over a horizontal angle of view of 60 degrees, it enables efficient leveling with less effort than ordinary laser-guided grading (using a laser leveler). Rotating the laser radar device 112 further reduces the number of times the laser receiver must be repositioned.
This process is described with reference to FIG. 41.
By judging the necessity of work for each work target, such as each crop, and working on a target only when necessary, the overall work can be made more efficient. With reference to FIGS. 43 to 47, the operations for work performed individually according to the condition of crops and the like, which can be carried out under automatic control by the overall system 1500 consisting of the field system 1501 and the information communication system 1502 described with FIG. 2, are now explained. Work performed individually includes fertilization, sowing, transplanting, harvesting, weeding, pesticide spraying, watering, and mowing; here, fertilization is mainly taken as the example. The operation of the system 1501 can also be applied to individual work other than fertilization performed on each crop or other target.
When moving and working under automatic control, it is desirable to handle automatically the cases in which the movement or work is interrupted. In particular, it is desirable to predict an interruption, such as becoming unable to move due to running out of fuel, before it occurs, and to take measures before recovery becomes impossible without manual intervention. With reference to FIGS. 48 to 51, the work interruption processing performed when the fuel (battery) level of the agricultural machine 100 runs low, or when work resources such as fertilizer run low, is described. Special cases in which work is interrupted for reasons other than fuel, battery, or work resources, such as when some abnormality is detected by the field monitoring devices 500 and 555, are described in detail with reference to FIGS. 52 to 56.
The larger the field, the harder it is to deal manually with abnormalities, for example driving away harmful animals that have intruded. It is therefore desirable to be able to respond to abnormalities automatically even in such cases. The operations when an abnormality is detected in the field are described with reference to FIGS. 52 to 56. FIG. 52 shows how, when an abnormal event is detected through the field monitoring device 500 (in this example, detection of an abnormality source 1000 that may damage crops, mainly a so-called harmful animal), the automatically controlled agricultural machine 100 is sent to observe the abnormality source 1000 and deal with it. The broken lines in the figure indicate the transmission and reception of information by wireless communication. In this example, information is exchanged between the field monitoring device 500 and the agricultural machine 100, but the exchange is not limited to this and may take place via the server 704. The abnormality is not limited to a harmful animal intruding into the field; it includes all abnormal situations caused by human or natural forces, for example fire or trespassing by strangers.
Proceeding from step S204 or S223 to step S224, the agricultural machine 100 receives the instruction of step S470 (to interrupt the work only the first time and confirm the content of the abnormality) and the position information of the abnormal location from S472 (S500). The agricultural machine 100 then recalculates its route according to the position information of the abnormal location (S501). Since the abnormal location does not always stay in one place, as when a harmful animal has intruded, the agricultural machine 100 receives the position information of the abnormal location each time in step S500, and when the position has changed, updates its route in step S501. The agricultural machine 100 interrupts its original work and performs the work interruption processing defined in S400 to S444 (S502). When step S502 ends, the flow proceeds to step S226.
The agricultural machine 100 described so far has mainly been a tractor, but FIGS. 57 and 58 show other examples of the agricultural machine 100 in this embodiment. FIG. 57 shows watering work by a mobile sprinkler, and FIG. 58 shows fertilizer spraying work by a helicopter (quadcopter).
The technique shown in FIG. 57 is center-pivot irrigation using a sprinkler 850 as the agricultural machine 100. The sprinkler 850 carries a series of connected aluminum water pipes 856 on towers 854 with a triangular (truss) structure, and waters while the towers 854 are moved on wheels 852. The water pipes 856 have sprinkler outlets 858 at various points and electronic valves 860 that control the water supply to each outlet 858. It is more efficient to spray water close to the crops 350 and 360, preventing loss by evaporation; for this reason drop-type outlets 858 branching downward from the pipes 856 are used, although the outlets are not limited to this type. The sprinkler 850 moves so as to describe a circle around one fixed end, and pumped groundwater is supplied from that central side.
FIG. 58 shows liquid fertilizer 802B being sprayed using a helicopter (quadcopter) 1100 as the agricultural machine 100. The helicopter 1100 has four rotor heads 1102 installed near the tips of arms extending from its body and four rotors 1104 rotatably attached to those rotor heads 1102, and flies by rotating the rotors 1104. The helicopter 1100 further comprises at least a GPS antenna 120, a wireless communication antenna 114, a control device 118C that controls the helicopter 1100 including the rotation of the rotors 1104, the stereo camera device 110, the multispectral camera device 113, a work device 106E that sprays agricultural chemicals under the control of the control device 118C, and landing gear 1106 that touches the ground or other surface at the landing point. The stereo camera device 110 is mounted on the helicopter 1100 so that the control device 118C can rotate it in a direction orthogonal to the vertical axis while the helicopter 1100 is in level flight. By pointing the stereo camera device 110 at the ground, the helicopter 1100 can check the condition of crops and the like, and can measure the distance between the ground and the helicopter 1100 to determine its altitude. The altitude is an example of the second digital information or the fourth digital information.
The examples so far are ones in which the overall system 1500 of this embodiment moves and operates the agricultural machine 100, the sprinkler 850, the helicopter 1100 and so on under automatic control without manual operation. On the other hand, some system users want to move and operate the agricultural machine 100 while watching with their own eyes; in particular, a user may want to carry out personally precise work or fine movement control that is difficult to achieve under automatic control. By applying the overall system 1500, a system user can also operate the agricultural machine 100 and other machines by remote control. FIG. 59 shows an example of the information communication system 1502 for performing this remote operation. Remote operation means the user operating the agricultural machine 100 using the user terminals 710 and 712; it also includes cases where the user operates the user terminal 710 or 712 while riding on the agricultural machine 100, or operates it near the agricultural machine 100.
As described above, the server 704 (or a billing management server; the same applies hereinafter) also performs billing (invoicing) processing. By reliably collecting system usage fees, the system provider can continue its business, develop new services, and improve current ones, so performing billing automatically, accurately and efficiently by technical means becomes an objective. There are various billing methods, and the user of the overall system 1500 of this embodiment can choose among them. Flat-rate billing forms include:
I. a usage fee for the information communication system 1502 shown in FIG. 2 or FIG. 59;
II. a rental fee for the field system 1501 shown in FIG. 1 (the field monitoring devices 500 and 555, the state monitoring device 550, the agricultural machine 100, etc.; for example, $100 per device per month and $200 per agricultural machine per month); and
III. a rental fee for the land (field) (for example, $15 per square meter).
FIG. 60 shows a construction machine (road roller) 1200 as another example of a moving body (working body) in an applied embodiment. The construction machine 1200 has heavy wheels (rollers) 2000 with a large contact area, and travels while performing the work of compacting soft ground by applying pressure over the road surface with the weight of those wheels. The construction machine 1200 further comprises a motor 102D driven by an internal combustion engine, a transmission 104D, a support device 108D, the stereo camera device 110, a wireless communication antenna 114, a manual operation unit 116, a control device 118D, a GPS antenna 120, a steering device 116, a pair of lamps 124D, a set of ultrasonic sonar devices 126D, and a set of rear wheels 130B. The construction machine 1200 is connected by wireless communication to an information communication system 1502 equivalent to that shown in FIG. 2 or FIG. 59.
The embodiment and applied examples described above include at least the following inventions.
(1) A machine that moves and works on a target without manual operation, performing the movement based on a plurality of types of information and performing the work based on a plurality of types of information. This machine includes all kinds of machines, such as agricultural machines, construction machines, and flying machines (the same applies hereinafter). The plurality of types of information for controlling movement without manual operation (that is, based on automatic control) includes information for wireless position determination, image and distance information acquired by a stereo camera device or other ranging device, and image information acquired by a camera device such as a monitoring camera installed at some location, together with distance information based on that image information. The plurality of types of information for controlling work without manual operation (based on automatic control) includes image information acquired by an imaging element, spectral image information, distance information, reflectance information (spectral reflectance information), polarization image information, and shape and distance information acquired by a laser device. Thus, both the information for controlling movement without manual operation and the information for controlling work without manual operation include information expressed in at least two dimensions (over a surface). In the machine of (1), one piece of the information for performing the movement may be image information, or information about shape; furthermore, one piece of the information for performing the work may be image information, or information about shape. When such a machine has a stereo camera device, the distance information it acquires can be used to control both movement and work. The movement without manual operation and the work on the target performed by this machine are normally carried out alternately or simultaneously (the same applies hereinafter).
A machine comprising a work information generation device that generates information related to the work, and a transmission device that transmits the information generated by the work information generation device to a control device that accumulates the received information and specifies the work content in a predetermined period based on the accumulated information.
In this case, the information related to the work may include information on the start and end of the work. Furthermore, the information related to the work may include information on the interruption and resumption of the work.
The control device is a system that accumulates the received information related to the work, specifies the work content in a predetermined period based on the accumulated information, and presents it.
In this case, the machine may be one in which the correction of the movement is performed based on information about distance obtained from the image information. This distance information may be information about the distance to the ground surface. Furthermore, the control device may recognize an object present in the traveling direction of the machine from the image information and correct the movement according to the recognition result. And,
if it cannot recognize the object, the control device may prompt the user to specify the type of the object, or may correct the movement so as to avoid the object.
The agricultural machines and field systems above have been described through embodiments, but the present invention is not limited to these embodiments, and various modifications and improvements are possible within the scope of the present invention.
Claims (16)
- A first operation device that performs an operation on a first target;
at least one sensor that acquires analog information from the first target; and
a control device that, among a plurality of types of digital information on the first target obtained from the analog information acquired by the sensor, specifies the first target based on at least one type of first digital information, and controls the operation performed by the first operation device on the specified first target based on at least one type of second digital information different from the first digital information: a system comprising the above. - The system according to claim 1, wherein the first operation device is a transmission device that transmits power generated by a power source in order to move.
- The sensor is a stereo camera comprising two imaging elements that capture light reflected from the first target, and
the control device acquires digital image information of the first target as the first digital information and information on the distance to the first target as the second digital information, specifies the first target from the digital image information of the first target, and controls movement by the transmission device with respect to the specified first target based on the information on the distance to the first target: the system according to claim 2. - The second digital information is information that specifies the state of the first target,
the first operation device is a work device that performs work on the first target, and
the control device controls the work performed by the work device on the specified first target based on the information specifying the state of the first target: the system according to claim 1. - The sensor acquires information by separating light reflected from the first target into a P-polarized component and an S-polarized component,
the information specifying the state of the first target is information obtained from at least one of digital image information based on the P-polarized component of the first target and digital image information based on the S-polarized component of the first target, and
the control device performs the work by the work device on the specified first target based on the information obtained from at least one of those pieces of digital image information: the system according to claim 4. - The sensor is a multispectral camera that spectrally separates and acquires light reflected from the first target,
the information specifying the state of the first target is information on the spectral reflectance of the first target, and
the control device decides whether to perform the work by the work device on the specified first target based on the information on the spectral reflectance of the first target: the system according to claim 4. - A second operation device that performs an operation on a second target different from the first target is provided, and
the control device specifies the second target based on at least one type of third digital information among the plurality of types of digital information, and controls the operation performed by the second operation device on the specified second target based on at least one type of fourth digital information different from the third digital information: the system according to claim 2 or 3. - The fourth digital information is information that specifies the state of the second target,
the second operation device is a work device that performs work on the second target, and
the control device controls the work performed by the work device on the specified second target based on the information specifying the state of the second target: the system according to claim 7. - The sensor acquires information by separating light reflected from the second target into a P-polarized component and an S-polarized component,
the information specifying the state of the second target is information obtained from at least one of digital image information based on the P-polarized component of the second target and digital image information based on the S-polarized component of the second target, and
the control device performs the work by the work device on the specified second target based on the information obtained from at least one of those pieces of digital image information: the system according to claim 8. - The sensor is a multispectral camera that spectrally separates and acquires light reflected from the first target,
the information specifying the state of the second target is information on the spectral reflectance of the second target, and
the control device decides whether to perform the work by the work device on the specified second target based on the information on the spectral reflectance of the second target: the system according to claim 8. - A machine comprising the system according to any one of claims 2, 3, 7, 8, 9 and 10 and moving by means of the transmission device, wherein
the control device controls the movement by the transmission device so as to move while avoiding the specified first target. - A machine comprising the system according to any one of claims 4 to 6, wherein
the control device controls the work of the work device so that work corresponding to the state of the first target is performed on the specified first target. - A machine comprising the system according to any one of claims 8 to 10, wherein
the control device controls the work of the work device so that work corresponding to the state of the second target is performed on the specified second target. - A control method for controlling an operation corresponding to a target based on information acquired by a sensor, comprising:
a step of acquiring a plurality of types of digital information on the target from analog information on the target acquired by the sensor; and
a step of specifying the target based on at least one type of digital information among the plurality of types of digital information on the target acquired in the preceding step, and controlling the operation on the specified target based on at least one other type of digital information. - A program for causing a computer to execute:
a step of acquiring a plurality of types of digital information on a target from analog information on the target acquired by a sensor; and
a step of specifying the target based on at least one type of digital information among the plurality of types of digital information on the target acquired in the preceding step, and controlling the operation on the specified target based on at least one other type of digital information. - A computer-readable recording medium on which the program according to claim 15 is recorded.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201580037887.4A CN106687877A (zh) | 2014-07-16 | 2015-04-15 | 系统,机械,控制方法和程序 |
EP15822571.4A EP3171241A4 (en) | 2014-07-16 | 2015-04-15 | System, machine, control method, and program |
JP2016534298A JP6344473B2 (ja) | 2014-07-16 | 2015-04-15 | システム、機械、制御方法、プログラム |
US15/405,663 US20170131718A1 (en) | 2014-07-16 | 2017-01-13 | System, machine, and control method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014-146163 | 2014-07-16 | ||
JP2014146163 | 2014-07-16 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/405,663 Continuation US20170131718A1 (en) | 2014-07-16 | 2017-01-13 | System, machine, and control method |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016009688A1 true WO2016009688A1 (ja) | 2016-01-21 |
Family
ID=55078193
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/061542 WO2016009688A1 (ja) | 2014-07-16 | 2015-04-15 | システム、機械、制御方法、プログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US20170131718A1 (ja) |
EP (1) | EP3171241A4 (ja) |
JP (2) | JP6344473B2 (ja) |
CN (1) | CN106687877A (ja) |
WO (1) | WO2016009688A1 (ja) |
Cited By (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN106843062A (zh) * | 2017-03-08 | 2017-06-13 | 江苏大学 | Intelligent variable-rate fertilization control system and control method
JP2017158532A (ja) * | 2016-03-03 | 2017-09-14 | 株式会社リコー | Agricultural work device and control method for an agricultural work device
WO2017187909A1 (ja) * | 2016-04-28 | 2017-11-02 | ヤンマー株式会社 | Wireless communication system
CN107390699A (zh) * | 2017-09-04 | 2017-11-24 | 广西民族大学 | Route planning system and route planning method for a sugarcane planter
WO2017221756A1 (ja) * | 2016-06-22 | 2017-12-28 | ソニー株式会社 | Sensing system, sensing method, and sensing device
JP2018014045A (ja) * | 2016-07-22 | 2018-01-25 | 株式会社クボタ | Work vehicle
CN107678041A (zh) * | 2016-08-02 | 2018-02-09 | 三星电子株式会社 | System and method for detecting objects
WO2018059648A1 (en) * | 2016-09-29 | 2018-04-05 | Agro Intelligence Aps | A system and a method for optimizing the trajectory to be followed when weeding crops
JP2018106596A (ja) * | 2016-12-28 | 2018-07-05 | ラピスセミコンダクタ株式会社 | Ground surface movement detection device, wireless tag, ground surface movement detection method, and disaster rescue support system
JP2018117560A (ja) * | 2017-01-24 | 2018-08-02 | 株式会社クボタ | Work vehicle
JP2018164439A (ja) * | 2017-03-28 | 2018-10-25 | ヤンマー株式会社 | Autonomous travel system for a work vehicle
JP2018170991A (ja) * | 2017-03-31 | 2018-11-08 | ヤンマー株式会社 | Autonomous travel system for an agricultural work vehicle
WO2018201961A1 (zh) * | 2017-05-04 | 2018-11-08 | 深圳乐动机器人有限公司 | Triangulation-ranging laser radar
JP2019049896A (ja) * | 2017-09-11 | 2019-03-28 | 井関農機株式会社 | Automated driving system
WO2019078336A1 (ja) * | 2017-10-19 | 2019-04-25 | ソニー株式会社 | Imaging device and signal processing device
WO2019091725A1 (de) * | 2017-11-10 | 2019-05-16 | Zf Friedrichshafen Ag | Method and display device for guiding a work machine
CN109857096A (zh) * | 2017-11-30 | 2019-06-07 | 井关农机株式会社 | Work vehicle
KR20190073444A (ko) * | 2017-03-03 | 2019-06-26 | 얀마 가부시키가이샤 | Travel route specification system
JP2019160289A (ja) * | 2018-03-07 | 2019-09-19 | カシオ計算機株式会社 | Autonomous mobile device, autonomous movement method, and program
WO2019187883A1 (ja) | 2018-03-29 | 2019-10-03 | ヤンマー株式会社 | Obstacle detection system for a work vehicle
JP2019170309A (ja) * | 2018-03-29 | 2019-10-10 | ヤンマー株式会社 | Work vehicle
JP2019175343A (ja) * | 2018-03-29 | 2019-10-10 | 西日本電信電話株式会社 | Information collection device, information collection method, and computer program
JPWO2018109796A1 (ja) * | 2016-12-12 | 2019-10-24 | 株式会社オプティム | Remote control system, remote control method, and program
JP2019187352A (ja) * | 2018-04-27 | 2019-10-31 | 井関農機株式会社 | Work vehicle
JPWO2018123268A1 (ja) * | 2016-12-27 | 2019-10-31 | パナソニックIpマネジメント株式会社 | Positioning system, base station, and positioning method
CN110525539A (zh) * | 2019-09-29 | 2019-12-03 | 江苏省肿瘤医院 | In-situ data collection vehicle for Fusarium head blight
WO2020111096A1 (ja) * | 2018-11-27 | 2020-06-04 | 株式会社ナイルワークス | Work plan device, control method for a work plan device, control program therefor, and drone
JP2020103128A (ja) * | 2018-12-27 | 2020-07-09 | 株式会社クボタ | Cultivator
JP2020110158A (ja) * | 2016-09-05 | 2020-07-27 | 株式会社クボタ | Automatic work vehicle travel system and travel route management device
JP2020113304A (ja) * | 2016-07-22 | 2020-07-27 | 株式会社クボタ | Work vehicle
JP2020119600A (ja) * | 2020-04-21 | 2020-08-06 | ヤンマーパワーテクノロジー株式会社 | Autonomous travel system
JP2020126307A (ja) * | 2019-02-01 | 2020-08-20 | ヤンマーパワーテクノロジー株式会社 | Target route generation system for a work vehicle
JP2020152473A (ja) * | 2019-03-18 | 2020-09-24 | 住友重機械工業株式会社 | Work machine
WO2022131176A1 (ja) * | 2020-12-15 | 2022-06-23 | Hapsモバイル株式会社 | Control device, program, system, and method
JP2023513975A (ja) * | 2020-12-30 | 2023-04-05 | 広東視場科技有限公司 | Crop multispectral acquisition and analysis system based on an autonomous vehicle platform
US11726485B2 (en) | 2016-09-05 | 2023-08-15 | Kubota Corporation | Autonomous work vehicle travel system, travel route managing device, travel route generating device, and travel route determining device
US20230380345A1 (en) * | 2020-08-17 | 2023-11-30 | Deere & Company | Close loop control of an illumination source based on sample heating
JP7490865B2 (ja) | 2019-12-26 | 2024-05-27 | ヤンマーパワーテクノロジー株式会社 | Work vehicle
JP7586698B2 (ja) | 2020-12-15 | 2024-11-19 | ソフトバンク株式会社 | Control device, program, system, and method
Families Citing this family (94)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11064152B2 (en) * | 2014-10-15 | 2021-07-13 | IL Holdings, LLC | Remote fishery management system |
CN105021225B (zh) * | 2015-07-08 | 2017-07-14 | 江苏大学 | Intelligent mobile detection platform for greenhouses
US10620300B2 (en) | 2015-08-20 | 2020-04-14 | Apple Inc. | SPAD array with gated histogram construction |
US10282821B1 (en) * | 2015-08-27 | 2019-05-07 | Descartes Labs, Inc. | Observational data processing and analysis |
JP6363579B2 (ja) * | 2015-09-30 | 2018-07-25 | 株式会社クボタ | Field management system
US20170099476A1 (en) * | 2015-10-01 | 2017-04-06 | Samsung Electronics Co., Ltd. | Photographing device and method of controlling the same |
DE102015118767A1 (de) * | 2015-11-03 | 2017-05-04 | Claas Selbstfahrende Erntemaschinen Gmbh | Environment detection device for an agricultural work machine
US9976284B2 (en) * | 2016-07-18 | 2018-05-22 | Caterpillar Inc. | Control system for headlights of a machine |
WO2018101351A1 (ja) | 2016-12-02 | 2018-06-07 | 株式会社クボタ | Travel route management system and travel route determination device
US10599959B2 (en) * | 2017-04-05 | 2020-03-24 | International Business Machines Corporation | Automatic pest monitoring by cognitive image recognition with two cameras on autonomous vehicles |
US10531603B2 (en) * | 2017-05-09 | 2020-01-14 | Cnh Industrial America Llc | Agricultural system |
AU2017414991B2 (en) * | 2017-05-17 | 2021-10-21 | Inaho, Inc. | Agricultural work apparatus, agricultural work management system, and program |
WO2018220609A1 (en) * | 2017-06-01 | 2018-12-06 | Osr Enterprises Ag | A system and method for fusing information of a captured environment |
US20180359023A1 (en) * | 2017-06-09 | 2018-12-13 | Keysight Technologies, Inc. | System integration of solar panels/cells and antennas (span system) |
DE102017113726A1 (de) | 2017-06-21 | 2018-12-27 | Claas E-Systems Kgaa Mbh & Co Kg | Agricultural work machine
US10830879B2 (en) | 2017-06-29 | 2020-11-10 | Apple Inc. | Time-of-flight depth mapping with parallax compensation |
CN107389130B (zh) * | 2017-07-17 | 2023-04-28 | 西南交通大学 | Intelligent irrigation inspection vehicle and irrigation method
US11427218B2 (en) * | 2017-08-04 | 2022-08-30 | Sony Corporation | Control apparatus, control method, program, and moving body |
RU2662907C1 (ru) * | 2017-08-24 | 2018-07-31 | Владимир Эльич Пашковский | Method for combating glare in astronomical instruments caused by street lighting
US10955552B2 (en) | 2017-09-27 | 2021-03-23 | Apple Inc. | Waveform design for a LiDAR system with closely-spaced pulses |
EP3704510B1 (en) | 2017-12-18 | 2022-10-05 | Apple Inc. | Time-of-flight sensing using an addressable array of emitters |
CN108170145A (zh) * | 2017-12-28 | 2018-06-15 | 浙江捷尚人工智能研究发展有限公司 | Lidar-based robot obstacle avoidance system and application method thereof
CN108362326B (zh) * | 2018-01-03 | 2020-12-18 | 江苏大学 | Suspended-rail automatic cruise monitoring device for comprehensive greenhouse information
CN108495078A (zh) * | 2018-01-24 | 2018-09-04 | 青岛理工大学 | Manual row-alignment monitoring system for wheat root cutting and fertilization
CA3089518A1 (en) * | 2018-01-25 | 2019-08-01 | Eleos Robotics Inc. | Autonomous unmanned ground vehicle for pest control |
US11061144B2 (en) * | 2018-01-30 | 2021-07-13 | Valmont Industries, Inc. | System and method for GPS alignment using real-time kinetics |
EP3729945B1 (en) * | 2018-02-28 | 2022-12-07 | Honda Motor Co., Ltd. | Control device, work machine and program |
WO2019185930A1 (en) * | 2018-03-30 | 2019-10-03 | Positec Power Tools (Suzhou) Co., Ltd | Self-moving device, working system, automatic scheduling method and method for calculating area |
US11150648B2 (en) * | 2018-04-03 | 2021-10-19 | Deere & Company | Overhead power cable detection and avoidance |
CN108303938A (zh) * | 2018-04-28 | 2018-07-20 | 湖南文理学院 | Intelligent crawler tractor for paddy field work
CN108713362A (zh) * | 2018-05-28 | 2018-10-30 | 苏州格目软件技术有限公司 | Automated agricultural machine based on vegetation spectral analysis
US11419261B2 (en) | 2018-06-25 | 2022-08-23 | Deere & Company | Prescription cover crop seeding with combine
CN111766877B (zh) * | 2018-06-27 | 2021-08-31 | 北京航空航天大学 | Robot
US20200015408A1 (en) * | 2018-07-16 | 2020-01-16 | Alan Dean Armstead | Autonomously Operated Agricultural Vehicle and Method |
ES2967296T3 (es) | 2018-08-06 | 2024-04-29 | Doosan Bobcat North America Inc | Controles aumentados de pala cargadora |
US11738643B2 (en) | 2019-02-27 | 2023-08-29 | Clark Equipment Company | Display integrated into door |
JP7034866B2 (ja) * | 2018-08-20 | 2022-03-14 | 株式会社クボタ | Harvester
JP7068969B2 (ja) * | 2018-08-29 | 2022-05-17 | ヤンマーパワーテクノロジー株式会社 | Automatic travel system
US10820472B2 (en) | 2018-09-18 | 2020-11-03 | Cnh Industrial America Llc | System and method for determining soil parameters of a field at a selected planting depth during agricultural operations |
US20200110403A1 (en) * | 2018-10-08 | 2020-04-09 | Cnh Industrial America Llc | Agricultural data center systems, networks, and methods |
US20200110423A1 (en) * | 2018-10-08 | 2020-04-09 | Cnh Industrial America Llc | Real-time communications between agricultural machines |
US20200122711A1 (en) * | 2018-10-19 | 2020-04-23 | GEOSAT Aerospace & Technology | Unmanned ground vehicle and method for operating unmanned ground vehicle |
US11343967B1 (en) * | 2018-11-14 | 2022-05-31 | Cv Robotics Booster Club | Robotic automation of mechanical field harvesting of broccoli plants |
EP3887601A2 (en) | 2018-11-28 | 2021-10-06 | The Toro Company | Autonomous ground surface treatment system and method of operation of such a system |
US11921218B2 (en) * | 2018-11-30 | 2024-03-05 | Garmin Switzerland Gmbh | Marine vessel LIDAR system |
DE102018221250A1 (de) * | 2018-12-07 | 2020-06-10 | Robert Bosch Gmbh | Method and system for controlling an agricultural machine
US20220046859A1 (en) * | 2018-12-11 | 2022-02-17 | Tevel Aerobotics Technologies Ltd. | System and method for selective harvesting at night or under poor visibility conditions, night dilution and agriculture data collection |
US10721458B1 (en) * | 2018-12-14 | 2020-07-21 | Ambarella International Lp | Stereoscopic distance measurements from a reflecting surface |
JP7174619B2 (ja) * | 2018-12-27 | 2022-11-17 | ヤンマーパワーテクノロジー株式会社 | Field management device
CN109814551A (zh) * | 2019-01-04 | 2019-05-28 | 丰疆智慧农业股份有限公司 | Autonomous grain-handling driving system, autonomous driving method, and automatic recognition method
CN109631903A (zh) * | 2019-01-04 | 2019-04-16 | 丰疆智慧农业股份有限公司 | Autonomous grain-handling driving system, autonomous driving method, and path planning method
CN109753003B (zh) * | 2019-01-30 | 2019-10-29 | 农业农村部南京农业机械化研究所 | Control device and method for a compound implement for straw pickup and crushing with post-sowing mulching
RU2710163C1 (ru) * | 2019-02-04 | 2019-12-24 | Открытое акционерное общество "Авангард" | Device for positioning mobile units in crop cultivation
US10955234B2 (en) | 2019-02-11 | 2021-03-23 | Apple Inc. | Calibration of depth sensing using a sparse array of pulsed beams |
DE102019201915A1 (de) * | 2019-02-14 | 2020-08-20 | Zf Friedrichshafen Ag | Control of agricultural machines based on a combination of distance sensing and camera
DE102019202040B4 (de) * | 2019-02-15 | 2022-01-27 | Zf Friedrichshafen Ag | Safe autonomous agricultural machine
CN109813852B (zh) * | 2019-03-08 | 2024-03-05 | 山东农业大学 | High-throughput phenotype information acquisition device for field wheat and control method therefor
WO2020192905A1 (en) * | 2019-03-27 | 2020-10-01 | Volvo Truck Corporation | A method for controlling a vehicle |
CN110286670A (zh) * | 2019-04-09 | 2019-09-27 | 丰疆智能科技股份有限公司 | Travel path planning system and method for multiple automatic harvesters
WO2020210607A1 (en) * | 2019-04-10 | 2020-10-15 | Kansas State University Research Foundation | Autonomous robot system for steep terrain farming operations |
US11126188B2 (en) * | 2019-04-15 | 2021-09-21 | Caterpillar Inc. | System and method for maintaining a work surface at a worksite |
WO2020218528A1 (ja) * | 2019-04-25 | 2020-10-29 | 株式会社クボタ | Agricultural work machine such as a harvester
US11500094B2 (en) | 2019-06-10 | 2022-11-15 | Apple Inc. | Selection of pulse repetition intervals for sensing time of flight |
US20210015023A1 (en) * | 2019-06-21 | 2021-01-21 | SmarTerra LLC | Turf Maintenance System |
US11778934B2 (en) * | 2019-07-02 | 2023-10-10 | Bear Flag Robotics, Inc. | Agricultural lane following |
US11352768B2 (en) * | 2019-07-16 | 2022-06-07 | Caterpillar Inc. | Locking out a machine to prohibit movement |
US11555900B1 (en) | 2019-07-17 | 2023-01-17 | Apple Inc. | LiDAR system with enhanced area coverage |
EP4256914A3 (en) * | 2019-08-06 | 2023-12-13 | The Toro Company | Vehicle with detection system for detecting ground surface and sub-surface objects, and method for controlling vehicle |
JP7342541B2 (ja) * | 2019-09-06 | 2023-09-12 | オムロン株式会社 | Greenhouse management system, greenhouse management device, greenhouse management method, and program
CN110502021B (zh) * | 2019-09-24 | 2022-07-15 | 一米信息服务(北京)有限公司 | Agricultural machinery operation path planning method and system
US11906424B2 (en) | 2019-10-01 | 2024-02-20 | The Regents Of The University Of California | Method for identifying chemical and structural variations through terahertz time-domain spectroscopy |
EP4052458A4 (en) | 2019-10-31 | 2023-11-29 | The Regents of the University of California | METHODS AND SYSTEMS FOR DETECTING WATER STATE IN PLANTS USING TERAHERTZ RADIATION |
CN114391056A (zh) | 2019-11-12 | 2022-04-22 | 克拉克设备公司 | Display integrated into door
JP7415480B2 (ja) * | 2019-11-25 | 2024-01-17 | コベルコ建機株式会社 | Work support server and work support system
US11733359B2 (en) | 2019-12-03 | 2023-08-22 | Apple Inc. | Configurable array of single-photon detectors |
US20210186005A1 (en) * | 2019-12-21 | 2021-06-24 | Verdant Robotics Inc. | Agricultural delivery system to apply one or more treatments with micro-precision to agricultural objects autonomously |
CN111552281A (zh) * | 2020-04-13 | 2020-08-18 | 程国军 | Intelligent farming system and device
US11852621B2 (en) | 2020-04-23 | 2023-12-26 | Cnh Industrial Canada, Ltd. | System and method for monitoring tilled floor conditions using a tilled floor sensing assembly |
US12004504B2 (en) | 2020-05-29 | 2024-06-11 | Cnh Industrial America Llc | Systems and methods for controlling a nozzle assembly of an agricultural applicator |
DE102020117477A1 (de) * | 2020-07-02 | 2022-01-05 | Claas E-Systems Gmbh | System for determining the position of a camera of a camera arrangement with respect to a ground plane
CN112541911A (zh) * | 2020-12-23 | 2021-03-23 | 北京百度网讯科技有限公司 | Image processing method and device
US11536608B2 (en) | 2021-01-04 | 2022-12-27 | Argo AI, LLC | Systems and methods for characterizing spectral reflectance of real world objects |
CN112868300A (zh) * | 2021-02-25 | 2021-06-01 | 欧美英 | Flower seed sowing device for landscape gardening
CN113075145A (zh) * | 2021-04-17 | 2021-07-06 | 上海市测绘院 | Cloud-computing-based multispectral laser intelligent recognition device
US11681028B2 (en) | 2021-07-18 | 2023-06-20 | Apple Inc. | Close-range measurement of time of flight using parallax shift |
IT202100019664A1 (it) * | 2021-07-23 | 2023-01-23 | Cnh Ind Italia Spa | Method for identifying a trajectory in a plantation of fruit trees such as orange groves
US20230046882A1 (en) * | 2021-08-11 | 2023-02-16 | Deere & Company | Obtaining and augmenting agricultural data and generating an augmented display |
KR102620280B1 (ko) * | 2021-08-20 | 2024-01-03 | (주)카탈로닉스 | Mobile robot system for irrigation and fertilization
CN113938476B (zh) * | 2021-10-13 | 2024-10-01 | 廊坊市大华夏神农信息技术有限公司 | Pixel-level agricultural-environment IoT monitoring station and working method
US20230133026A1 (en) * | 2021-10-28 | 2023-05-04 | X Development Llc | Sparse and/or dense depth estimation from stereoscopic imaging |
US11995859B2 (en) | 2021-10-28 | 2024-05-28 | Mineral Earth Sciences Llc | Sparse depth estimation from plant traits |
US20230205195A1 (en) * | 2021-12-28 | 2023-06-29 | Blue River Technology Inc. | Compensatory actions for automated farming machine failure |
WO2024038330A1 (en) * | 2022-08-16 | 2024-02-22 | Precision Planting Llc | Systems and methods for biomass identification |
CN115644054B (zh) * | 2022-11-14 | 2023-08-15 | 浙江农业商贸职业学院 | Fruit tree pollination device for smart agriculture
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH03208105A (ja) * | 1989-09-28 | 1991-09-11 | Tennant Co | Method of treating a floor area and guidance system for a machine
JPH06133624A (ja) * | 1992-10-27 | 1994-05-17 | Yanmar Agricult Equip Co Ltd | Harvesting device
JP2004280451A (ja) * | 2003-03-14 | 2004-10-07 | Matsushita Electric Works Ltd | Autonomous mobile device
JP2012084121A (ja) * | 2010-09-16 | 2012-04-26 | Ricoh Co Ltd | Object identification device, and mobile body control device and information providing device provided with the same
JP2013020543A (ja) * | 2011-07-13 | 2013-01-31 | Yamaha Motor Co Ltd | Obstacle detection device for a vehicle and vehicle using the same
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS5977517A (ja) * | 1982-10-27 | 1984-05-04 | Kubota Ltd | Traveling vehicle
JPH0397012A (ja) * | 1989-09-11 | 1991-04-23 | Honda Motor Co Ltd | Self-propelled work robot
JP2001112102A (ja) * | 1999-10-05 | 2001-04-20 | Denso Corp | Mobile robot
JP2002318620A (ja) * | 2001-04-19 | 2002-10-31 | Toshiba Tec Corp | Robot cleaner
JP2006039760A (ja) * | 2004-07-23 | 2006-02-09 | Victor Co Of Japan Ltd | Mobile robot
DE102006055858A1 (de) * | 2006-11-27 | 2008-05-29 | Carl Zeiss Ag | Method and arrangement for controlling a vehicle
US9188980B2 (en) * | 2008-09-11 | 2015-11-17 | Deere & Company | Vehicle with high integrity perception system
JP5291420B2 (ja) * | 2008-09-26 | 2013-09-18 | 日産自動車株式会社 | Obstacle avoidance device and self-propelled vehicle
EP2439716B1 (en) * | 2010-09-16 | 2013-11-13 | Ricoh Company, Ltd. | Object identification device, moving object controlling apparatus having object identification device and information presenting apparatus having object identification device
US8498786B2 (en) * | 2010-10-14 | 2013-07-30 | Deere & Company | Material identification system
JP5872399B2 (ja) * | 2012-07-06 | 2016-03-01 | 本田技研工業株式会社 | Arrangement determination method, arrangement determination device, and mobile body
CA2829914C (en) * | 2012-12-07 | 2016-07-05 | The Boeing Company | Forest sensor deployment and monitoring system
2015
- 2015-04-15 JP JP2016534298A patent/JP6344473B2/ja active Active
- 2015-04-15 WO PCT/JP2015/061542 patent/WO2016009688A1/ja active Application Filing
- 2015-04-15 EP EP15822571.4A patent/EP3171241A4/en not_active Withdrawn
- 2015-04-15 CN CN201580037887.4A patent/CN106687877A/zh active Pending

2017
- 2017-01-13 US US15/405,663 patent/US20170131718A1/en not_active Abandoned

2018
- 2018-05-24 JP JP2018099275A patent/JP2018160257A/ja active Pending
Non-Patent Citations (1)
Title |
---|
See also references of EP3171241A4 * |
Cited By (65)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2017158532A (ja) * | 2016-03-03 | 2017-09-14 | 株式会社リコー | Agricultural work device and control method for an agricultural work device
WO2017187909A1 (ja) * | 2016-04-28 | 2017-11-02 | ヤンマー株式会社 | Wireless communication system
JPWO2017221756A1 (ja) * | 2016-06-22 | 2019-04-18 | ソニー株式会社 | Sensing system, sensing method, and sensing device
WO2017221756A1 (ja) * | 2016-06-22 | 2017-12-28 | ソニー株式会社 | Sensing system, sensing method, and sensing device
CN107640108B (zh) * | 2016-07-22 | 2022-05-10 | 株式会社久保田 | Work vehicle
JP2018014045A (ja) * | 2016-07-22 | 2018-01-25 | 株式会社クボタ | Work vehicle
CN107640108A (zh) * | 2016-07-22 | 2018-01-30 | 株式会社久保田 | Work vehicle
JP2020113304A (ja) * | 2016-07-22 | 2020-07-27 | 株式会社クボタ | Work vehicle
CN107678041A (zh) * | 2016-08-02 | 2018-02-09 | 三星电子株式会社 | System and method for detecting objects
JP2020110158A (ja) * | 2016-09-05 | 2020-07-27 | 株式会社クボタ | Automatic work vehicle travel system and travel route management device
US11726485B2 (en) | 2016-09-05 | 2023-08-15 | Kubota Corporation | Autonomous work vehicle travel system, travel route managing device, travel route generating device, and travel route determining device
WO2018059648A1 (en) * | 2016-09-29 | 2018-04-05 | Agro Intelligence Aps | A system and a method for optimizing the trajectory to be followed when weeding crops
US11129323B2 (en) | 2016-09-29 | 2021-09-28 | Agro Intelligence Aps | System and a method for optimizing the trajectory to be followed when weeding crops
JPWO2018109796A1 (ja) * | 2016-12-12 | 2019-10-24 | 株式会社オプティム | Remote control system, remote control method, and program
JP7113178B2 (ja) | 2016-12-27 | 2022-08-05 | パナソニックIpマネジメント株式会社 | Positioning system, base station, and positioning method
JPWO2018123268A1 (ja) * | 2016-12-27 | 2019-10-31 | パナソニックIpマネジメント株式会社 | Positioning system, base station, and positioning method
JP2018106596A (ja) * | 2016-12-28 | 2018-07-05 | ラピスセミコンダクタ株式会社 | Ground surface movement detection device, wireless tag, ground surface movement detection method, and disaster rescue support system
JP2018117560A (ja) * | 2017-01-24 | 2018-08-02 | 株式会社クボタ | Work vehicle
KR102365444B1 (ko) * | 2017-03-03 | 2022-02-18 | 얀마 파워 테크놀로지 가부시키가이샤 | Travel route specification system
EP3591488A4 (en) * | 2017-03-03 | 2020-11-04 | Yanmar Power Technology Co., Ltd. | TRAVEL ITINERARY SPECIFICATION SYSTEM
CN110325936A (zh) * | 2017-03-03 | 2019-10-11 | 洋马株式会社 | Travel route specification system
KR20190073444A (ko) * | 2017-03-03 | 2019-06-26 | 얀마 가부시키가이샤 | Travel route specification system
CN106843062B (zh) * | 2017-03-08 | 2019-04-30 | 江苏大学 | Intelligent variable-rate fertilization control system and control method
CN106843062A (zh) * | 2017-03-08 | 2017-06-13 | 江苏大学 | Intelligent variable-rate fertilization control system and control method
JP2018164439A (ja) * | 2017-03-28 | 2018-10-25 | ヤンマー株式会社 | Autonomous travel system for a work vehicle
JP2018170991A (ja) * | 2017-03-31 | 2018-11-08 | ヤンマー株式会社 | Autonomous travel system for an agricultural work vehicle
WO2018201961A1 (zh) * | 2017-05-04 | 2018-11-08 | 深圳乐动机器人有限公司 | Triangulation-ranging laser radar
CN107390699A (zh) * | 2017-09-04 | 2017-11-24 | 广西民族大学 | Route planning system and route planning method for a sugarcane planter
CN107390699B (zh) * | 2017-09-04 | 2023-07-28 | 广西民族大学 | Route planning system and route planning method for a sugarcane planter
JP2019049896A (ja) * | 2017-09-11 | 2019-03-28 | 井関農機株式会社 | Automated driving system
WO2019078336A1 (ja) * | 2017-10-19 | 2019-04-25 | ソニー株式会社 | Imaging device and signal processing device
US11431911B2 (en) | 2017-10-19 | 2022-08-30 | Sony Corporation | Imaging device and signal processing device
JPWO2019078336A1 (ja) * | 2017-10-19 | 2020-12-17 | ソニー株式会社 | Imaging device and signal processing device
JP7259756B2 (ja) | 2017-10-19 | 2023-04-18 | ソニーグループ株式会社 | Imaging device and signal processing device
WO2019091725A1 (de) * | 2017-11-10 | 2019-05-16 | Zf Friedrichshafen Ag | Method and display device for guiding a work machine
JP2019097454A (ja) * | 2017-11-30 | 2019-06-24 | 井関農機株式会社 | Work vehicle
CN109857096A (zh) * | 2017-11-30 | 2019-06-07 | 井关农机株式会社 | Work vehicle
JP7225763B2 (ja) | 2018-03-07 | 2023-02-21 | カシオ計算機株式会社 | Autonomous mobile device, autonomous movement method, and program
JP2019160289A (ja) * | 2018-03-07 | 2019-09-19 | カシオ計算機株式会社 | Autonomous mobile device, autonomous movement method, and program
KR20200139126A (ko) | 2018-03-29 | 2020-12-11 | 얀마 파워 테크놀로지 가부시키가이샤 | Obstacle detection system for a work vehicle
JP2019175343A (ja) * | 2018-03-29 | 2019-10-10 | 西日本電信電話株式会社 | Information collection device, information collection method, and computer program
WO2019187883A1 (ja) | 2018-03-29 | 2019-10-03 | ヤンマー株式会社 | Obstacle detection system for a work vehicle
JP2019170309A (ja) * | 2018-03-29 | 2019-10-10 | ヤンマー株式会社 | Work vehicle
JP2019187352A (ja) * | 2018-04-27 | 2019-10-31 | 井関農機株式会社 | Work vehicle
JP7217894B2 (ja) | 2018-11-27 | 2023-02-06 | 株式会社ナイルワークス | Work plan device, control method for a work plan device, control program therefor, and drone
WO2020111096A1 (ja) * | 2018-11-27 | 2020-06-04 | 株式会社ナイルワークス | Work plan device, control method for a work plan device, control program therefor, and drone
JPWO2020111096A1 (ja) * | 2018-11-27 | 2021-10-14 | 株式会社ナイルワークス | Work plan device, control method for a work plan device, control program therefor, and drone
JP2020103128A (ja) * | 2018-12-27 | 2020-07-09 | 株式会社クボタ | Cultivator
JP7085980B2 (ja) | 2018-12-27 | 2022-06-17 | 株式会社クボタ | Cultivator
JP7534842B2 (ja) | 2019-02-01 | 2024-08-15 | ヤンマーパワーテクノロジー株式会社 | Target route generation system for a work vehicle
JP2020126307A (ja) * | 2019-02-01 | 2020-08-20 | ヤンマーパワーテクノロジー株式会社 | Target route generation system for a work vehicle
JP2020152473A (ja) * | 2019-03-18 | 2020-09-24 | 住友重機械工業株式会社 | Work machine
JP7258613B2 (ja) | 2019-03-18 | 2023-04-17 | 住友重機械工業株式会社 | Work machine
CN110525539B (zh) * | 2019-09-29 | 2020-06-09 | 江苏省肿瘤医院 | In-situ data collection vehicle for Fusarium head blight
CN110525539A (zh) * | 2019-09-29 | 2019-12-03 | 江苏省肿瘤医院 | In-situ data collection vehicle for Fusarium head blight
JP7490865B2 (ja) | 2019-12-26 | 2024-05-27 | ヤンマーパワーテクノロジー株式会社 | Work vehicle
JP2020119600A (ja) * | 2020-04-21 | 2020-08-06 | ヤンマーパワーテクノロジー株式会社 | Autonomous travel system
JP7248619B2 (ja) | 2020-04-21 | 2023-03-29 | ヤンマーパワーテクノロジー株式会社 | Autonomous travel system
JP2022063315A (ja) * | 2020-04-21 | 2022-04-21 | ヤンマーパワーテクノロジー株式会社 | Autonomous travel system
US20230380345A1 (en) * | 2020-08-17 | 2023-11-30 | Deere & Company | Close loop control of an illumination source based on sample heating
US12029162B2 (en) * | 2020-08-17 | 2024-07-09 | Deere & Company | Close loop control of an illumination source based on sample heating
WO2022131176A1 (ja) * | 2020-12-15 | 2022-06-23 | Hapsモバイル株式会社 | Control device, program, system, and method
JP7586698B2 (ja) | 2020-12-15 | 2024-11-19 | ソフトバンク株式会社 | Control device, program, system, and method
JP2023513975A (ja) * | 2020-12-30 | 2023-04-05 | 広東視場科技有限公司 | Crop multispectral acquisition and analysis system based on an autonomous vehicle platform
JP7374511B2 (ja) | 2020-12-30 | 2023-11-07 | 広東視場科技有限公司 | Crop multispectral acquisition and analysis system based on an autonomous vehicle platform
Also Published As
Publication number | Publication date |
---|---|
EP3171241A4 (en) | 2017-12-13 |
EP3171241A1 (en) | 2017-05-24 |
JP6344473B2 (ja) | 2018-06-27 |
JP2018160257A (ja) | 2018-10-11 |
CN106687877A (zh) | 2017-05-17 |
US20170131718A1 (en) | 2017-05-11 |
JPWO2016009688A1 (ja) | 2017-04-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6344473B2 (ja) | System, machine, control method, and program | |
JPWO2016009688A6 (ja) | System, machine, control method, and program | |
JP6365668B2 (ja) | Information processing device, apparatus, information processing system, method for producing control signals, and program | |
US11367207B2 (en) | Identifying and treating plants using depth information in a single image | |
JP6230975B2 (ja) | System for removing unwanted plants | |
US20160063420A1 (en) | Farmland management system and farmland management method | |
JP2019095937A (ja) | Crop growing support system, information collection device, growing support server, and crop sales support system | |
Latif | An agricultural perspective on flying sensors: State of the art, challenges, and future directions | |
US11816874B2 (en) | Plant identification using heterogenous multi-spectral stereo imaging | |
US12080019B2 (en) | Extracting feature values from point clouds to generate plant treatments | |
Marín et al. | Urban lawn monitoring in smart city environments | |
WO2022107588A1 (ja) | Mobile body, control unit, data generation unit, method for controlling operation of a mobile body, and method for generating data | |
Moreno et al. | Proximal sensing for geometric characterization of vines: A review of the latest advances | |
WO2022107587A1 (ja) | Mobile body, data generation unit, and method for generating data | |
CN117296644A (zh) | Laser pest-and-weed removal method and laser pest-and-weed removal machine | |
Upadhyay et al. | Advances in ground robotic technologies for site-specific weed management in precision agriculture: A review | |
US20220207852A1 (en) | Generating a ground plane for obstruction detection | |
Hutsol et al. | Robotic technologies in horticulture: analysis and implementation prospects | |
US20220100996A1 (en) | Ground Plane Compensation in Identifying and Treating Plants | |
WO2022107586A1 (ja) | Mobile body, control unit, and method for controlling operation of a mobile body | |
WO2023276226A1 (ja) | Crop row detection system, agricultural machine provided with a crop row detection system, and crop row detection method | |
Abinaya et al. | Efficiency of IDRONE Technology in Finding Pest-Free Solutions for Tea Farming-Looper Caterpillar and Tea Leaf Hopper | |
Nagel | Computational Contributions to the Automation of Agriculture | |
Martinsanz et al. | Image Processing in Agriculture and Forestry |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15822571 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2016534298 Country of ref document: JP Kind code of ref document: A |
REEP | Request for entry into the european phase |
Ref document number: 2015822571 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 2015822571 Country of ref document: EP |
NENP | Non-entry into the national phase |
Ref country code: DE |