
WO2014105928A1 - Method and applications of local coordinate system based on optical flow with video cameras - Google Patents


Info

Publication number: WO2014105928A1
Authority: WO (WIPO/PCT)
Prior art keywords: optical flow, agricultural machine, camera, image, local coordinate
Prior art date: 2012-12-28
Application number: PCT/US2013/077755
Other languages: French (fr)
Inventor: Paul Matthews
Original assignee: Agco Corporation
Priority date: 2012-12-28 (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date: 2013-12-26
Publication date: 2014-07-03
Application filed by Agco Corporation
Publication of WO2014105928A1

Classifications

    • A - HUMAN NECESSITIES
    • A01 - AGRICULTURE; FORESTRY; ANIMAL HUSBANDRY; HUNTING; TRAPPING; FISHING
    • A01B - SOIL WORKING IN AGRICULTURE OR FORESTRY; PARTS, DETAILS, OR ACCESSORIES OF AGRICULTURAL MACHINES OR IMPLEMENTS, IN GENERAL
    • A01B69/00 - Steering of agricultural machines or implements; Guiding agricultural machines or implements on a desired track
    • A01B69/001 - Steering by means of optical assistance, e.g. television cameras

Definitions

  • GNSS: global navigation satellite systems
  • GPS: the Global Positioning System, a GNSS constellation
  • GLONASS: the Russian GNSS constellation
  • Galileo: the European GNSS constellation

Landscapes

  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Soil Sciences (AREA)
  • Environmental Sciences (AREA)
  • Guiding Agricultural Machines (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A method includes receiving a first image of the ground from a downward-facing camera mounted underneath an agricultural machine and receiving a second image of the ground from the camera. An optical flow is determined based on the first and second images, and automated steering of the agricultural machine is enabled based on the optical flow.

Description

METHOD AND APPLICATIONS OF LOCAL COORDINATE SYSTEM BASED ON
OPTICAL FLOW WITH VIDEO CAMERAS
CROSS REFERENCE TO RELATED APPLICATION
[0001] This application claims priority to copending U.S. provisional application entitled, "METHOD AND APPLICATIONS OF LOCAL COORDINATE SYSTEM BASED ON OPTICAL FLOW WITH VIDEO CAMERAS," having serial number 61/746,684, filed December 28, 2012, which is entirely incorporated herein by reference.
TECHNICAL FIELD
[0002] The present disclosure is generally related to agriculture technology, and, more particularly, computer-assisted farming.
BACKGROUND
[0003] Recent efforts have been made to automate or semi-automate farming operations. Such efforts serve not only to reduce operating costs but also to improve working conditions for operators and reduce operator error, enabling gains in operational efficiency and yield. For instance, agricultural machines may employ a guidance system to reduce operator fatigue and costs.
BRIEF DESCRIPTION OF THE DRAWINGS
[0004] Many aspects of the disclosure can be better understood with reference to the following drawings. The components in the drawings are not necessarily to scale, emphasis instead being placed upon clearly illustrating the principles of the present disclosure. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
[0005] FIG. 1 is a schematic diagram of an example agricultural machine comprising an embodiment of an optical flow measurement system.
[0006] FIG. 2A is a schematic diagram showing a sequence of images of the ground acquired by an embodiment of an optical flow measurement system and vectors associated with a direction of an agricultural machine.
[0007] FIG. 2B is a schematic diagram showing a sequence of images of the ground acquired by an embodiment of an optical flow measurement system and vectors associated with a direction of an agricultural machine.
[0008] FIG. 3 is a schematic diagram showing an example path an agricultural machine autonomously or semi-autonomously traverses in a field with a speed and direction of the agricultural machine based all or in part on local coordinate systems provided by an embodiment of an optical flow measurement system.
[0009] FIG. 4A is a block diagram showing an embodiment of a control system that includes an embodiment of an optical flow measurement system.
[0010] FIG. 4B is a block diagram showing an embodiment of a controller for the control system of FIG. 4A.
[0011] FIG. 5 is a flow diagram that illustrates an example embodiment of an optical flow measurement method.
DESCRIPTION OF EXAMPLE EMBODIMENTS
Overview
[0012] In one embodiment, a method comprises receiving a first image of the ground from a downward-facing camera mounted underneath an agricultural machine; receiving a second image of the ground from the camera; determining optical flow based on the first and second images; and enabling automated steering of the agricultural machine based on the optical flow.
Detailed Description
[0013] Certain embodiments of optical flow measurement systems and methods are disclosed that enable an agricultural machine to function according to automated or semi-automated guidance and steering control, enabling the machine to traverse a field with minimal or no operator intervention. In one embodiment, an optical flow measurement system comprises a downward-facing camera mounted to the underside of the agricultural machine and a controller. The controller receives the captured images (e.g., a sequence of frames) from the camera and measures the optical flow (e.g., speed and/or direction of the agricultural machine) of each frame to determine the speed and relative position of the agricultural machine. Based on the optical flow measurements, certain embodiments of the optical flow measurement system adjust a previously determined or inputted local coordinate system for input to one or more subsystems of the agricultural machine, such as a guidance system or steering system, among other systems.
[0014] Digressing briefly, more and more farming applications are becoming dependent on positioning, or guidance, systems (e.g., global navigation satellite systems (GNSS), such as global positioning systems (GPS), GLONASS, and Galileo, among other constellations). An example application is machine auto-guidance, which can utilize a corrected GPS to auto-steer the machine. With precision farming applications, there is a push toward more accurate positioning. High-accuracy positioning, while becoming less expensive, still comprises a significant portion of the total cost of a system that utilizes positioning. In addition, while GNSS-based positioning systems are becoming more robust, such as with the inclusion of additional constellations such as GLONASS, they are still subject to interference from atmospheric effects, further hampered by solar cycle peaks. In contrast, one or more embodiments of optical flow measurement systems provide local coordinate systems (which may work alone or in conjunction with a global coordinate system provided by the machine guidance system), are immune to (or substantially immune to) ionospheric effects, and/or are not subject to roll/pitch corrections, unlike roof-top GNSS-based systems.
[0015] Having summarized certain features of optical flow measurement systems of the present disclosure, reference will now be made in detail to the description of the disclosure as illustrated in the drawings. While the disclosure will be described in connection with these drawings, there is no intent to limit it to the embodiment or embodiments disclosed herein. For instance, in the description that follows, one focus is on an agricultural machine embodied as a tractor, though it should be appreciated that other machines, towed or self-propelled, may have an optical flow measurement system mounted thereon and hence are contemplated to be within the scope of the disclosure. Further, although the description identifies or describes specifics of one or more embodiments, such specifics are not necessarily part of every embodiment, nor are all various stated advantages necessarily associated with a single embodiment or all embodiments. On the contrary, the intent is to cover all alternatives, modifications and equivalents included within the spirit and scope of the disclosure as defined by the appended claims. Further, it should be appreciated in the context of the present disclosure that the claims are not necessarily limited to the particular embodiments set out in the description.
[0016] Note that references hereinafter made to certain directions, such as, for example, "front", "rear", "left", and "right", are made as viewed from the rear of the tractor looking forwardly.
[0017] Referring now to FIG. 1, shown is an example agricultural machine embodied as a tractor 10 in which an embodiment of an optical flow measurement system may be implemented. It should be understood by one having ordinary skill in the art, in the context of the present disclosure, that the example tractor 10 shown in FIG. 1 is merely illustrative, and that other configurations (e.g., track-based) and/or other types of agricultural machines may serve as a host for an optical flow measurement system. For instance, certain embodiments of optical flow measurement systems may be mounted to a towed vehicle instead of the towing vehicle, or on both vehicles in some embodiments. The example tractor 10 comprises an operator's cab 12 that is mounted to a chassis 14. The chassis 14 also comprises, or has mounted to it, other well-known sub-systems, such as a steering mechanism, axles, and drivetrain, among other components omitted here for brevity. Mounted to the chassis 14, on the underside of the tractor 10, are one or more cameras, such as camera 16. The camera 16 is mounted a distance, D, from the ground, enabling a calibration factor to be used for determining the distance the tractor 10 travels. Note that the fore and aft position of the camera 16 depicted in FIG. 1 is merely illustrative, and in some embodiments, the camera 16 may be positioned more forwardly or rearwardly. The camera 16 may be configured for operation in the visible spectrum. In some embodiments, the camera 16 may operate in other areas of the electromagnetic spectrum, such as in the infrared spectrum. The camera 16 may be embodied as a still-photo camera (e.g., taking snapshots) or a video camera (acquiring moving pictures or frames). Reference throughout the disclosure to images or image capture is intended to include both types of camera operations (snapshot and moving pictures or frames) and cameras (e.g., still and video).
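As a rough, non-authoritative illustration of the calibration factor mentioned above, the ground distance spanned by a single pixel can be estimated from the mounting height D and the camera's field of view with a pinhole-camera approximation. The function name, field-of-view value, and image width below are illustrative assumptions, not values from the disclosure.

    import math

    def ground_meters_per_pixel(mount_height_m, horizontal_fov_deg, image_width_px):
        """Approximate ground distance (m) covered by one pixel for a
        downward-facing camera mounted a height D above the ground."""
        # Width of ground visible across the image, pinhole-camera approximation.
        ground_width_m = 2.0 * mount_height_m * math.tan(math.radians(horizontal_fov_deg) / 2.0)
        return ground_width_m / image_width_px

    # Example: a camera 0.8 m above the ground with a 60-degree lens and
    # 640-pixel-wide frames sees roughly 1.4 mm of ground per pixel.
    scale_m_per_px = ground_meters_per_pixel(0.8, 60.0, 640)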
[0018] In one embodiment, the camera 16 is oriented directly downward (e.g., downward when considered in the context of a flat, horizontal surface upon which the tractor 10 rests or travels). In some embodiments, the camera 16 may be oriented substantially directly downward (e.g., from straight down to up to approximately thirty (30) degrees offset from vertical). The downward orientation of the camera 16 enables directional vectors to be oriented in the same direction, facilitating the rejection of outliers.
[0019] Having described an example environment in which certain embodiments of optical flow measurement systems may be implemented, attention is directed to FIGS. 2A-2B, which illustrate example processing of a sequence of images of the ground captured by the camera 16 (FIG. 1). As indicated above, the images may be embodied as frames (or pictures) of a video acquisition, or as described below, as plural, sequential snapshots of the ground. Referring to FIG. 2A, plural images 18 (e.g., 18A, 18B, and 18C) are depicted, having been captured (and undergoing processing) by the camera 16. Beneath each of the respective images 18A, 18B, and 18C in FIGS. 2A and 2B are labels, image (t), image (t+1), and image (t+2), signifying that each snapshot occurs in a time-progressive sequence referenced in this example from the first depicted frame at time equal to t as the tractor 10 (FIG. 1) traverses a field. An arrow, denoted with reference numeral 20, refers to the direction of movement of the tractor 10, which may, in one implementation, be forwardly as depicted in FIG. 2A. The images 18 are images of the ground being traversed by the tractor 10 (e.g., in real-time). As is known (e.g., see Shi and Tomasi's publications on algorithms for tracking movement, and/or Lucas and Kanade's publications on algorithms for optical flow methods for tracking movement, among other well-known publications), certain features of the ground shown in images captured by the camera 16 may be tracked to determine one or more vectors associated with optical flow. These trackable features or identifiable areas are symbolically denoted in FIGS. 2A and 2B with various geometric shapes (e.g., triangles, squares, octagons, etc.) associated with the lines corresponding to features in each image. As shown in FIG. 2A, and referring to image 18A, plural vectors, such as vector 22A (the quantity shown merely an example for illustrative purposes), are associated with the distinguishing features of the ground, such as feature 24A (symbolically denoted with a rectangle), and in particular, are associated with a measurement of the speed and direction of the agricultural machine. For instance, as the agricultural machine moves in the direction indicated by arrow 20, a sequence of images 18 is captured. The capture of image 18A is followed in time (by increment t+1) by the capture of image 18B, which is followed in time (e.g., t+2 from capture of image 18A) by the capture of image 18C. An embodiment of optical flow measurement systems compares the vector 22B of tracked feature 24B (among comparisons of other possible tracked features and associated vectors) with the vector 22A of the same feature 24A in the image 18A, and determines the speed and direction of the agricultural machine. Such determinations of the speed and direction may involve consideration of additional parameters, such as the camera resolution, frame rate (or snapshot rate), and/or mounting height (e.g., distance "D" of the camera 16, FIG. 1).
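The feature selection and tracking referred to above (Shi-Tomasi-style features tracked with a Lucas-Kanade-style method) can be sketched, for example, with OpenCV's implementations of those algorithms. The sketch below is only one plausible illustration under assumed parameter values; it is not the implementation disclosed in this application.

    import cv2
    import numpy as np

    def flow_vectors(prev_gray, curr_gray):
        """Return one (dx, dy) pixel displacement vector per ground feature
        tracked between two consecutive grayscale images."""
        # Select trackable features (e.g., rocks, clods of dirt) in the prior image.
        pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                      qualityLevel=0.01, minDistance=10)
        if pts is None:
            return np.empty((0, 2), dtype=np.float32)
        # Track those features into the current image (pyramidal Lucas-Kanade).
        new_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
        ok = status.flatten() == 1
        good_old = pts[ok].reshape(-1, 2)
        good_new = new_pts[ok].reshape(-1, 2)
        return good_new - good_old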
[0020] In some embodiments, the change in speed may be tracked, such as to ensure a consistent speed of travel of the agricultural machine. Referring to FIG. 2B, the captured images are used by an embodiment of an optical flow measurement system to determine the speed and direction of travel (e.g., relative direction), particularly when the tractor 10 (FIG. 1) is turning. Similar to the sequence of captured images in FIG. 2A, the image 18D is captured by the camera 16, followed in time (t+1) by the capture of image 18E, which is followed in time (t+2) by the capture of image 18F. Once again, one or more features of the captured images are tracked to enable a determination of a vector for the respective feature and a comparison by the optical flow measurement system of one or more vectors among the plural sequential images. For instance, in the image 18D, the feature 24C (e.g., symbolically denoted with a rectangle), among other features, has an associated vector 22C. Similarly, the same feature, designated 24D in the image 18E, has an associated vector 22D. Comparing the two images 18D and 18E, the vectors 22C and 22D reveal a change in direction of the agricultural machine. An embodiment of the optical flow measurement system uses the differences in vectors to determine the direction of travel of the agricultural machine. As with the speed and direction determinations involved in FIG. 2A, directional changes may be determined based on additional information, such as camera resolution, frame (or snapshot) rate, and/or mounting height of the camera 16. It should be appreciated in the context of the present disclosure that reference to speed and/or direction determinations may include relative determinations (e.g., compared to the prior image) or absolute determinations (e.g., when considering other parameters, such as machine parameters, guidance receiver input, etc.).
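Continuing that sketch, the per-feature vectors can be reduced to a single speed and flow-direction estimate using the calibration factor (mounting height and resolution) and the frame rate, in the spirit of the paragraph above; a change in the flow direction between successive image pairs then indicates a turn. This is an assumed reading for illustration only.

    import numpy as np

    def speed_and_flow_angle(vectors_px, meters_per_pixel, frame_rate_hz):
        """Reduce per-feature pixel displacements to ground speed (m/s) and a
        flow-direction angle (rad) for one frame-to-frame step."""
        v = np.median(vectors_px, axis=0)             # robust against a few stray vectors
        displacement_m = np.hypot(v[0], v[1]) * meters_per_pixel
        speed_mps = displacement_m * frame_rate_hz    # metres per frame times frames per second
        angle_rad = np.arctan2(v[1], v[0])            # apparent direction of ground motion in the image
        return speed_mps, angle_rad

    # A constant flow angle across frames suggests straight-line travel; a drifting
    # angle suggests the machine is turning.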
[0021] In general, since the camera 16 is directed downward at the ground, every feature in the scene moves at the same direction (e.g., same or similar vector direction) and rate, reducing vector outliers. In contrast, a horizon-facing camera may have features tracking in different directions and/or rates (e.g., outliers). It should be appreciated within the context of the present disclosure that certain embodiments may determine speed, direction, or a combination of both.
[0022] Using any one of a number of well-known techniques, the information determined from comparison of the plural images may be used to determine a local coordinate system (e.g., directional coordinates), with each determination enabling an adjustment of a prior local coordinate system. In one embodiment, the optical flow measurement system may use the adjusted local coordinate systems as an alternative (or in some embodiments, as a supplement) to a positioning system for conventional auto-steering systems. In some embodiments, the optical flow measurement system may use the adjusted local coordinate systems (e.g., with speed parameters) as an alternative (or supplement in some embodiments) to a radar device (e.g., speed determination). In some embodiments, the optical flow measurement system may remove or reduce the need for inertial components (e.g., no roll or pitch compensation), sometimes used to supplement a positioning system, while providing the ability to perform low speed forward and/or reverse detection.
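One way to picture the local-coordinate adjustment described above is as a dead-reckoning update in which each frame-to-frame speed and heading-change measurement advances an (x, y, heading) estimate. The structure and names below are illustrative assumptions, not the claimed technique.

    import math
    from dataclasses import dataclass

    @dataclass
    class LocalCoordinate:
        x_m: float = 0.0          # metres to the right of the starting reference point
        y_m: float = 0.0          # metres forward of the starting reference point
        heading_rad: float = 0.0  # heading relative to the starting direction

    def adjust_local_coordinate(coord, speed_mps, heading_change_rad, frame_period_s):
        """Advance the local coordinate by one camera frame's worth of motion."""
        coord.heading_rad += heading_change_rad
        step_m = speed_mps * frame_period_s
        coord.x_m += step_m * math.sin(coord.heading_rad)
        coord.y_m += step_m * math.cos(coord.heading_rad)
        return coord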
[0023] Referring to FIG. 3, shown is a schematic diagram of the tractor 10 equipped with the optical flow measurement system, which includes the camera 16 (FIG. 1). In one embodiment, the tractor 10 may omit a positioning or guidance system, and use the optical flow measurement system to enable a semi-autonomous traversal (e.g., for harvesting or otherwise processing of crop material) of a field. For instance, an operator may drive the tractor 10 to a field, and initiate a starting position. The operator may activate the optical flow measurement system (e.g., via a button or switch on a control panel, or on a display device, among other methods), and engage the tractor 10. The tractor 10 may drive in a forward direction along path 28A, the optical flow measurement system enabling continual corrections or adjustments to a local coordinate system, which is provided to a steering sub-system and/or drive train sub-system to maintain the tractor 10 in a straight-line path or direction according to a set speed. In other words, the optical flow measurement system cumulatively tracks the left and right movements sensed by the camera 16 and heading changes, enabling the guidance of the tractor 10 along the field. The operator may reach a headland, requiring the tractor 10 to make a turn to start down an opposite direction along a path 28B. For instance, the operator may change direction using the steering wheel, which may suspend the optical flow measurement system until the tractor is traversing the field in an opposite direction, such as along the path 28B.
[0024] One or more variations of the above description of the traversal of the path 28 may be implemented. For instance, in one embodiment, the operator may merely hit a switch or tap the steering wheel, the contact enabling an autonomous, preset turning ratio based, for instance, on the width of an attached header or trailer that permits little or no overlap among traversed rows. Each turn may be implemented with the assistance of the direction and/or speed determinations of an embodiment of the optical flow measurement system. In other words, the optical flow measurement system may enable the tractor 10 to perform autonomous straight-path travel as well as curved-path travel.
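As a rough illustration of such a preset turn, the lateral offset between adjacent passes equals the working width of the attached header or trailer, so a headland U-turn with no overlap can be approximated as a half-circle of radius equal to half that width (assuming the machine can turn that tightly). The figures and function below are assumptions for illustration, not values from the disclosure.

    import math

    def headland_u_turn(working_width_m, speed_mps):
        """Approximate a no-overlap headland U-turn as a half-circle whose
        diameter equals the working width of the implement."""
        radius_m = working_width_m / 2.0     # adjacent pass is one working width away
        arc_length_m = math.pi * radius_m    # half of the circle's circumference
        duration_s = arc_length_m / speed_mps
        heading_change_rad = math.pi         # finish facing the opposite direction
        return radius_m, duration_s, heading_change_rad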
[0025] In embodiments where the optical flow measurement system cooperates with a global coordinate system (e.g., via an on-board positioning or guidance system), an operator may load a field map into memory of the tractor 10, which enables the tractor 10 to traverse the field 26 according to a pre-recorded plan or wayline(s) (e.g., based on a prior traversal and recording of points), enabling the tracking of recorded points with the guidance system while supplementing the traversal controlled by the guidance system with the optical flow measurement system to ensure a smoother and more accurate traversal of the field 26 between the tracked points.
[0026] Although a standalone (e.g., with no machine guidance system) optical flow measurement system embodiment may realize significant cost savings (e.g., saving on the cost of machine guidance systems and inertial systems), even when used in conjunction with a machine guidance system, savings may be realized since the optical flow measurement system provides robustness and at least some of the functionality normally performed by inertial systems, at a lower cost.
[0027] Attention is now directed to FIG. 4A, which illustrates a control system 28 that includes an embodiment of an optical flow measurement system 30. It should be appreciated within the context of the present disclosure that some embodiments (e.g., of the control system 28 and/or the optical flow measurement system 30, which includes the camera 16 and the controller 32) may include additional components (e.g., a guidance receiver) or fewer or different components, and that the example depicted in FIG. 4A is merely illustrative of one embodiment among others. The control system 28 comprises a controller 32 coupled in a network 34 (e.g., high-speed network, though other and/or additional networks may be used, and hence the control system 28 is not limited to a single network) to the camera 16, an optional guidance receiver 36 (e.g., which includes the ability to access one or more known constellations jointly or separately), machine controls 38, and a user interface 40. The camera 16 has been described already, and may include visible and non-visible spectrum devices, such as still photo type cameras, video cameras, infrared cameras, etc. The machine controls 38 collectively comprise the various actuators, sensors, and/or subsystems residing on the tractor 10 (FIG. 1 ), including those used to control machine navigation (e.g., speed, direction (such as a steering system), etc.), implement (e.g., header or trailer) position, and/or control, internal processes, among others. The user interface 40 may be a keyboard, mouse, microphone, touch-type display device, joystick, steering wheel, or other devices (e.g., switches) that enable input by an operator (e.g., such as while in the operator's cab 12 (FIG. 1 )). The guidance receiver 36 may enable autonomous or semi- autonomous operation of the tractor 10 in cooperation with machine controls 38 and the controller 32 (e.g., via guidance software residing in the controller 32).
[0028] The controller 32 receives and processes the information from the camera 16 and delivers control signals to the machine controls 38 (e.g., directly, or indirectly through an intermediary device in some embodiments). In some embodiments, the controller 32 may receive input from the machine controls 38 (e.g., such as to enable feedback as to the position or status of certain devices, such as a header height and/or width) and/or receive input from the guidance receiver 36 as explained above. The controller 32 may also receive input from the user interface 40, such as during the process of adjustment to enable intervention of machine operation by the operator, or to provide feedback of a change in speed or direction and/or an impending change, need, or recommendation for change.
[0029] It should be appreciated within the context of the present disclosure that variations of above are contemplated to be within the scope of the disclosure. For instance, in some embodiments, the optical flow computations may be implemented entirely in a camera (e.g., a higher-end camera), with transmission from the camera limited primarily to direction and speed data.
[0030] FIG. 4B further illustrates an example embodiment of the controller 32. One having ordinary skill in the art should appreciate in the context of the present disclosure that the example controller 32 is merely illustrative, and that some embodiments of controllers may comprise fewer or additional components, and/or some of the functionality associated with the various components depicted in FIG. 4B may be combined, or further distributed among additional modules, in some embodiments. Referring to FIG. 4B, with continued reference to FIG. 4A, the controller 32 is depicted in this example as a computer system, but may be embodied as a programmable logic controller (PLC), FPGA, among other devices. It should be appreciated that certain well- known components of computer systems are omitted here to avoid obfuscating relevant features of the controller 32. In one embodiment, the controller 32 comprises one or more processing units, such as processing unit 42, input/output (I/O) interface(s) 44, and memory 46, all coupled to one or more data busses, such as data bus 48. The memory 46 may include any one or a combination of volatile memory elements (e.g., random- access memory RAM, such as DRAM, and SRAM, etc.) and nonvolatile memory elements (e.g., ROM, hard drive, tape, CDROM, etc.). The memory 46 may store a native operating system, one or more native applications, emulation systems, or emulated applications for any of a variety of operating systems and/or emulated hardware platforms, emulated operating systems, etc. In some embodiments, the memory 46 may store one or more field maps that were recorded from a prior traversal of a given field, enabling autonomous or semi-autonomous traversal of a given field when activated. In the embodiment depicted in FIG. 4B, the memory 46 comprises an operating system 50, optical flow measurement software 52, and guidance software 54. It should be appreciated that in some embodiments, additional or fewer software modules (e.g., combined functionality) may be employed in the memory 46 or additional memory. In some embodiments, a separate storage device may be coupled to the data bus 48, such as a persistent memory (e.g., optical, magnetic, and/or semiconductor memory and associated drives).
[0031] The optical flow measurement software 52 enables the selection and tracking of features in captured images, the determination of vectors associated with the tracked features, comparison of the vectors, speed determinations, directional determinations, and determination (and/or adjustment) of local coordinate systems. One embodiment of pseudo code for performing optical flow measurements and adjusting a local coordinate system comprises the following:
[0032] Create starting reference point (local coordinate)
[0033] Repeat {
[0034]     Read camera frame/image
[0035]     Identify prominent features (e.g., rocks, clods of dirt, etc.)
[0036]     Determine optical flow relative to prior frame/image
[0037]     Filter the optical flow information for all prominent features {
[0038]         (based on camera resolution, frame rate, and/or mounting height)
[0039]         Determine direction of travel
[0040]         Determine speed of travel
[0041]     }
[0042]     Filter outliers
[0043]     Adjust current local coordinate by amount defined by speed of travel and direction of travel based on the frame/image rate of the camera
[0044] }
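By way of illustration only, the loop above might be realized along the following lines; the camera interface and the helper functions (which mirror the sketches given earlier in this description) are assumptions, not code from the disclosure.

    import numpy as np

    def run_optical_flow_loop(camera, meters_per_pixel, frame_rate_hz):
        """Skeleton of the pseudo code: read frames, measure optical flow,
        filter outliers, and adjust the local coordinate every frame."""
        coord = LocalCoordinate()                       # starting reference point
        prev_frame = camera.read_gray()                 # hypothetical camera interface
        prev_angle = 0.0
        while True:
            frame = camera.read_gray()
            vectors = flow_vectors(prev_frame, frame)   # per-feature (dx, dy) in pixels
            if len(vectors) == 0:
                prev_frame = frame
                continue
            # Filter outliers: drop vectors far from the median displacement.
            median = np.median(vectors, axis=0)
            dist = np.linalg.norm(vectors - median, axis=1)
            inliers = vectors[dist < 3.0 * (np.median(dist) + 1e-6)]
            speed, angle = speed_and_flow_angle(inliers, meters_per_pixel, frame_rate_hz)
            coord = adjust_local_coordinate(coord, speed, angle - prev_angle,
                                            1.0 / frame_rate_hz)
            prev_angle, prev_frame = angle, frame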
[0045] It should be appreciated that the pseudo code described is merely one example among other possible methods of achieving the same or similar functionality. The local coordinate system provided by the optical flow measurement software 52 may be used in the same or formatted form as a coordinate system that a radar may use or as inputs into a guidance system.
[0046] The guidance software 54 may coordinate inputs from the guidance receiver 36 and output control signals to one or more machine controls 38 to enable guided traversal and/or performance of various farming operations on a field. In some embodiments, the guidance software 54 may receive directional coordinates (e.g., in a coordinate system) and/or speed parameters from the optical flow measurement software 52, and provide the same to steering mechanisms of the machine controls 38 to cause automated left or right maneuvers (and/or speed changes) of the tractor 10.
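For instance, guidance software of the kind described above might turn the optical-flow local coordinate into a steering correction by comparing the machine's lateral offset and heading against the desired wayline. The proportional controller below, with made-up gains, is only a generic sketch of that idea and reuses the hypothetical LocalCoordinate fields from the earlier sketch.

    def steering_command(coord, target_heading_rad, target_x_m,
                         k_heading=1.0, k_offset=0.5, max_angle_rad=0.6):
        """Proportional steering correction from heading error and cross-track
        offset, both read from the optical-flow local coordinate system."""
        heading_error = target_heading_rad - coord.heading_rad
        cross_track_error = target_x_m - coord.x_m
        angle = k_heading * heading_error + k_offset * cross_track_error
        return max(-max_angle_rad, min(max_angle_rad, angle))  # clamp to steering limits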
[0047] Execution of the software modules 52 and 54 may be implemented by the processing unit 42 under the management and/or control of the operating system 50. In some embodiments, the operating system 50 may be omitted and a more rudimentary manner of control implemented. The processing unit 42 may be embodied as a custom-made or commercially available processor, a central processing unit (CPU) or an auxiliary processor among several processors, a semiconductor-based microprocessor (in the form of a microchip), a macroprocessor, one or more application specific integrated circuits (ASICs), a plurality of suitably configured digital logic gates, and/or other well-known electrical configurations comprising discrete elements both individually and in various combinations to coordinate the overall operation of the controller 32.
[0048] The I/O interfaces 44 provide one or more interfaces to the network 34 and other networks. In other words, the I/O interfaces 44 may comprise any number of interfaces for the input and output of signals (e.g., analog or digital data) for conveyance over the network 34. The input may comprise input by an operator (local or remote) through the user interface 40 (e.g., a keyboard, joystick, steering wheel, or mouse or other input device (or audible input in some embodiments)), and input from signals carrying information from one or more of the components of the control system 28, such as the camera 16, guidance receiver 36, and/or machine controls 38, among other devices.
[0049] When certain embodiments of the controller 32 are implemented at least in part as software (including firmware), as depicted in FIG. 4B, it should be noted that the software can be stored on a variety of non-transitory computer-readable medium for use by, or in connection with, a variety of computer-related systems or methods. In the context of this document, a computer-readable medium may comprise an electronic, magnetic, optical, or other physical device or apparatus that may contain or store a computer program (e.g., executable code or instructions) for use by or in connection with a computer-related system or method. The software may be embedded in a variety of computer-readable mediums for use by, or in connection with, an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions.
[0050] When certain embodiments of the controller 32 are implemented at least in part as hardware, such functionality may be implemented with any or a combination of the following technologies, which are all well known in the art: a discrete logic circuit(s) having logic gates for implementing logic functions upon data signals, an application specific integrated circuit (ASIC) having appropriate combinational logic gates, a programmable gate array(s) (PGA), a field programmable gate array (FPGA), etc.
[0051] Having described certain embodiments of an optical flow measurement system 30, it should be appreciated within the context of the present disclosure that one embodiment of an optical flow measurement method, denoted as method 56 as illustrated in FIG. 5, comprises receiving a first image of the ground from a downward- facing camera mounted underneath an agricultural machine (58); receiving a second image of the ground from the camera (60); determining optical flow based on the first and second images (62); and enabling automated steering of the agricultural machine based on the optical flow (64). For instance, the optical flow measurement system 30 extracts speed and/or heading change data and uses the same as input to a guidance system to provide automated steering. Stated otherwise, the optical flow measurement system 30 provides a local coordinate system (e.g., passes coordinates) to a guidance system (e.g., in one embodiment, the guidance system comprising a guidance receiver 36 and guidance software 54), which implements automated steering through cooperation with machine controls 38.
[0052] Any process descriptions or blocks in flow diagrams should be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps in the process, and alternate implementations are included within the scope of the embodiments in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present disclosure.
[0053] It should be emphasized that the above-described embodiments of the present disclosure, particularly, any "preferred" embodiments, are merely possible examples of implementations, merely set forth for a clear understanding of the principles of the disclosure. Many variations and modifications may be made to the above-described embodiment(s) of the disclosure without departing substantially from the spirit and principles of the disclosure. All such modifications and variations are intended to be included herein within the scope of this disclosure and protected by the following claims.

Claims

CLAIMS:
1. A method, comprising:
receiving a first image of the ground from a downward-facing camera mounted underneath an agricultural machine;
receiving a second image of the ground from the camera;
determining optical flow based on the first and second images; and
enabling automated steering of the agricultural machine based on the optical flow.
2. The method of claim 1 , wherein determining the optical flow comprises determining a direction of the agricultural machine, and enabling comprises passing directional coordinates to a guidance system that controls the steering.
3. The method of claim 1 , wherein determining the optical flow comprises determining a speed of the agricultural machine, and enabling further comprises passing speed parameters to a guidance system that controls the speed of the agricultural machine.
4. The method of claim 1 , wherein determining the optical flow comprises determining a direction and a speed of the agricultural machine, and enabling comprises passing directional coordinates and speed parameters to a guidance system that controls the steering and speed of the agricultural machine.
5. The method of claim 1 , wherein determining the optical flow comprises comparing one or more features of the second image with the one or more features of the first image.
6. The method of claim 1 , wherein determining the optical flow is based on one or any combination of camera resolution, frame rate, or mounting height relative to the ground.
7. The method of claim 6, further comprising adjusting an initial local coordinate system based on the determining.
8. The method of claim 7, further comprising adjusting a guidance system of the agricultural machine based on the adjusted initial local coordinate system.
9. The method of claim 7, further comprising adjusting a steering system of the agricultural machine based on the adjusted initial local coordinate system.
10. The method of claim 1 , wherein the camera is directly downward-facing.
11. An agricultural machine, comprising:
a chassis;
a camera mounted to the underside of the chassis and facing downward; and
a controller configured to:
receive a first image of the ground from the camera;
receive a second image of the ground from the camera;
determine optical flow based on the first and second images; and
enable automated navigation of the agricultural machine based on the optical flow.
12. The agricultural machine of claim 11, further comprising a guidance system, wherein the optical flow comprises a direction of the agricultural machine, a speed of the agricultural machine, or a combination of both, and wherein the controller is configured to enable automated navigation by passing directional coordinates, speed parameters, or a combination of both to a guidance system that controls navigation.
13. The agricultural machine of claim 11, wherein the controller is configured to determine the optical flow by comparing one or more features of the second image with the one or more features of the first image.
14. The agricultural machine of claim 11, wherein the controller is configured to determine the optical flow based on one or any combination of camera resolution, frame rate, or mounting height relative to the ground.
15. The agricultural machine of claim 14, wherein the controller is further configured to adjust an initial local coordinate system based on the determining.
16. The agricultural machine of claim 15, further comprising a guidance system, wherein the controller is further configured to provide the adjusted initial local coordinate system to the guidance system to affect the navigation.
17. The agricultural machine of claim 15, further comprising a steering system, wherein the controller is further configured to provide the adjusted initial local coordinate system to the steering system to affect the navigation.
18. The agricultural machine of claim 11, wherein the camera is a video camera.
19. The agricultural machine of claim 11, wherein the camera is an infrared camera.
20. A system, comprising:
a chassis;
a guidance system;
a camera mounted to the underside of the chassis and pointing downward; and
a controller configured to:
determine optical flow based on images of the ground captured by the camera;
adjust a prior local coordinate system based on the determined optical flow; and
provide the adjusted local coordinate system to the guidance system.
PCT/US2013/077755 2012-12-28 2013-12-26 Method and applications of local coordinate system based on optical flow with video cameras WO2014105928A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261746684P 2012-12-28 2012-12-28
US61/746,684 2012-12-28

Publications (1)

Publication Number Publication Date
WO2014105928A1 true WO2014105928A1 (en) 2014-07-03

Family

ID=51022052

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/077755 WO2014105928A1 (en) 2012-12-28 2013-12-26 Method and applications of local coordinate system based on optical flow with video cameras

Country Status (1)

Country Link
WO (1) WO2014105928A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104145550A (en) * 2014-08-07 2014-11-19 昆明理工大学 Farmland track device
CN105277735A (en) * 2014-07-24 2016-01-27 南车株洲电力机车研究所有限公司 Detection method and device for speed and displacement of rail train
US10398084B2 (en) 2016-01-06 2019-09-03 Cnh Industrial America Llc System and method for speed-based coordinated control of agricultural vehicles
US11357153B2 (en) 2019-12-11 2022-06-14 Cnh Industrial Canada, Ltd. System and method for determining soil clod size using captured images of a field

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5442552A (en) * 1993-03-16 1995-08-15 The Regents Of The University Of California Robotic cultivator
US5911669A (en) * 1996-04-19 1999-06-15 Carnegie Mellon University Vision-based crop line tracking for harvesters
US6101795A (en) * 1997-05-13 2000-08-15 Claas Kgaa Automatic steering mechanism and method for harvesting machine
US6141614A (en) * 1998-07-16 2000-10-31 Caterpillar Inc. Computer-aided farming system and method
US6336051B1 (en) * 1997-04-16 2002-01-01 Carnegie Mellon University Agricultural harvester with robotic control
US20090319170A1 (en) * 2008-06-20 2009-12-24 Tommy Ertbolle Madsen Method of navigating an agricultural vehicle, and an agricultural vehicle implementing the same
US20120095652A1 (en) * 2010-10-14 2012-04-19 Noel Wayne Anderson Material identification system
US20120253612A1 (en) * 2011-03-28 2012-10-04 Byrne Terrence K Mobile pothole patching machine


Similar Documents

Publication Publication Date Title
EP3787909B1 (en) Coupler and tow-bar detection for automated trailer hitching via cloud points
US11252869B2 (en) Imaging system for facilitating the unloading of agricultural material from a vehicle
CN112298353B (en) System and method for calibrating steering wheel neutral position
US11287827B2 (en) Visual object tracker
JP2019061695A (en) Guide control system
US10139234B2 (en) Path planning based on obstruction mapping
RU2753004C2 (en) System and method for positioning and controlling the aircraft as it moves along the taxiway
US20200039517A1 (en) Automated Reversing By Choice of Target Location
US7844378B2 (en) Farm apparatus having implement sidehill drift compensation
CN111373338A (en) Method and apparatus for operating a mobile system
US20110015817A1 (en) Optical tracking vehicle control system and method
US10219422B2 (en) Machine-to-machine sharing of wayline deviation information
US20090326763A1 (en) System and method for providing towed implement compensation
AU2016256796A1 (en) Single-mode implement steering
WO2014105928A1 (en) Method and applications of local coordinate system based on optical flow with video cameras
US20210214008A1 (en) Transverse steering method and transverse steering device for moving a vehicle into a target position, and vehicle for this purpose
US20230016335A1 (en) Dynamically modifiable map
WO2013083311A1 (en) Method and control device for guiding an agricultural machine
Niu et al. Camera-based lane-aided multi-information integration for land vehicle navigation
CN115280960B (en) Combined harvester steering control method based on field vision SLAM
US20220361392A1 (en) Method and system for driving view-based agricultural machinery and device for agricultural machinery applying method
de Saxe et al. Estimation of trailer off-tracking using visual odometry
EP3290297B1 (en) Methods and apparatuses for disturbance and stability detection by vehicle guidance systems
CN105137468A (en) Photoelectric type automobile continuous navigation data acquiring device and method in GPS blind area environment
US20230406410A1 (en) Method for displaying an environment of a vehicle having a coupled trailer, computer program, computing device and vehicle

Legal Events

121 - EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 13868138; Country of ref document: EP; Kind code of ref document: A1)
NENP - Non-entry into the national phase (Ref country code: DE)
122 - EP: PCT application non-entry in European phase (Ref document number: 13868138; Country of ref document: EP; Kind code of ref document: A1)