
CN109947119A - Mobile-robot autonomous following system and method based on multi-sensor fusion - Google Patents

Mobile-robot autonomous following system and method based on multi-sensor fusion Download PDF

Info

Publication number
CN109947119A
CN109947119A (application CN201910326362.2A)
Authority
CN
China
Prior art keywords
robot
aoa
information
personnel
laser
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910326362.2A
Other languages
Chinese (zh)
Other versions
CN109947119B (en)
Inventor
方正
周思帆
曾杰鑫
张伟义
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ruige Intelligent Technology Shenyang Co ltd
Original Assignee
Northeastern University China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Northeastern University China filed Critical Northeastern University China
Priority to CN201910326362.2A priority Critical patent/CN109947119B/en
Publication of CN109947119A publication Critical patent/CN109947119A/en
Application granted granted Critical
Publication of CN109947119B publication Critical patent/CN109947119B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The present invention provides a mobile-robot autonomous following system and method based on multi-sensor fusion, and relates to the field of industrial automation. The system comprises an upper-layer navigation unit, a bottom-layer motion control unit and a power supply unit. The upper-layer navigation unit obtains the position of the target person through its sensors, plans the robot's trajectory toward the target person, selects the optimal local planned trajectory, and sends control commands to the bottom-layer motion control unit; the bottom-layer motion control unit drives the robot toward the target person according to those commands; the power supply unit powers the whole system. A corresponding autonomous following method based on the system is also provided. The system and method enable a mobile robot to follow a person stably and autonomously in environments with dynamic obstacles, including cases where the target person is occluded by an obstacle.

Description

Mobile-robot autonomous following system and method based on multi-sensor fusion
Technical field
The present invention relates to the field of industrial automation, and more particularly to a mobile-robot autonomous following system and method based on multi-sensor fusion.
Background technique
As robots gradually move from industrial environments into household and personal applications, direct interaction between people and robots has become a novel research field, in which detecting and following a person is an essential capability. In recent years, research on person-following robots has deepened continuously, with applications in hospitals, shopping malls, battlefields and other environments, helping people complete simple tasks in daily life and work. To follow a pedestrian quickly and stably, a robot must be able to obtain accurate position information of the target person.
Common positioning technologies include indoor and outdoor techniques: simultaneous localization and mapping (SLAM), GPS-based indoor and outdoor positioning, positioning based on indoor beacon systems, and so on.
For small-scale positioning scenarios such as person detection and recognition, GPS is too expensive and insufficiently accurate, so SLAM or beacon-base-station positioning is usually adopted. Vision-based SLAM locates a person by detecting the pedestrian's position in the image; its cost is relatively low and the related algorithms are mature. However, a camera's field of view is limited, so the pedestrian easily leaves the field of view and tracking fails; visual sensors are also sensitive to illumination and ill-suited to outdoor environments. Laser sensors used in SLAM have the advantage of a wide field of view and can scan the full 360-degree range, so they are also widely applied in person detection. However, laser-based detection uses only a person's contour and cannot distinguish different people, so misrecognition easily occurs in crowded or cluttered scenes, and the case where the followed person is occluded by an obstacle cannot be handled effectively. The advantage of a beacon system is that the person can hold a beacon and a vehicle-mounted beacon base station can obtain the relative position; this still works when vision and laser sensors fail because the person is occluded, but the beacon signal sometimes drifts significantly and its data are not sufficiently stable.
Summary of the invention
The technical problem to be solved by the present invention is, in view of the above shortcomings of the prior art, to provide a mobile-robot autonomous following system and method based on multi-sensor fusion that enable a mobile robot to follow a target person autonomously.
In order to solve the above technical problems, the technical solution adopted by the present invention is as follows. In one aspect, the present invention provides a mobile-robot autonomous following system based on multi-sensor fusion, comprising an upper-layer navigation unit, a bottom-layer motion control unit and a power supply unit. The upper-layer navigation unit comprises a two-dimensional laser radar, a router, an AOA beacon system, a camera, an industrial PC and a TTL-to-USB module; the bottom-layer motion control unit comprises a robot body, an embedded development board and photoelectric encoders.
The two-dimensional laser radar detects planar position information within a fixed range and is connected to the industrial PC through one LAN port of the router, ensuring stable, safe and real-time data transmission between the laser radar and the industrial PC. The embedded development board is connected to the second LAN port of the router and implements the motion control of the robot. The industrial PC implements the upper-layer navigation planning and is connected to the third LAN port of the router; through the router's wired connections, the industrial PC sends control commands to the bottom-layer motion control unit and thereby controls the robot's velocity and direction of motion. The camera is mounted at the upper front of the robot to obtain images of the current field of view and is connected to the industrial PC, ensuring real-time and effective image transmission. The AOA beacon system comprises an AOA beacon base station and an AOA hand-held beacon: the base station is mounted on the robot, the hand-held beacon is carried by the target person to be followed, and the base station obtains the relative pose between itself and the hand-held beacon. The base station is connected to the industrial PC through the TTL-to-USB module so that the industrial PC receives the hand-held beacon's information in real time, enabling fusion of the laser radar and AOA beacon data. The robot's drive wheels use DC gear motors; the embedded development board is connected to a motor drive module, which drives the DC gear motors, and photoelectric encoders mounted on the wheel axles are connected to the embedded development board to measure wheel speed. The software on the industrial PC implements target-person detection and following-path planning. The power supply unit is connected to the upper-layer navigation unit and the bottom-layer control unit respectively and powers the whole system.
Preferably, the power supply unit comprises an on-board battery and a power management module. The power management module is connected to the on-board battery, converts the battery voltage to the voltages required by each component of the system, and is connected to each component to power the whole system.
Preferably, the robot body is a two-wheel differential-drive chassis.
Preferably, the program built into the industrial PC comprises a person detection unit and a following navigation unit, and implements the following functions:
(1) processing the data of the person to be detected collected by the two-dimensional laser radar and the camera;
(2) fusing the information from the two-dimensional laser, the camera and the AOA beacon system to obtain a more accurate position of the target person;
(3) assigning a corresponding ID to the target person, storing the processed laser and camera data by category, creating newly matched objects as tracking objects, and saving them;
(4) planning the robot's trajectory toward the target person and selecting the optimal local planned trajectory;
(5) sending control commands to the bottom-layer motion control unit, thereby controlling the robot's velocity and direction of motion.
In another aspect, the present invention also provides a mobile-robot autonomous following method based on multi-sensor fusion, comprising the following steps:
Step 1: collect data of the person to be detected with the two-dimensional laser radar and the camera, process the collected data, and identify the target person.
The data collected by the two-dimensional laser radar are processed as follows. First, the points returned by the laser are clustered: points separated by less than a certain threshold are assigned to one cluster, and geometric features are generated for each cluster. The geometric features include the number of laser points, the width and length of the cluster, and its distance and angle relative to the laser. A random forest classifier is trained on these geometric features to obtain a feature database of the human-leg model. Features are then extracted from the laser clusters in incoming laser data and compared with the feature database trained by the random forest, so as to detect human legs in the surrounding environment. When the distance between two detected legs is less than 0.4 m, the average of the two leg positions is taken as the position of the person.
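The clustering and leg-pairing preprocessing described above can be sketched as follows. This is a minimal illustration, not the patent's implementation: the 0.15 m cluster gap and all function names are assumptions, and the random-forest classification step is omitted.

```python
import math

def cluster_scan(points, gap=0.15):
    """Group consecutive (x, y) scan points into clusters whenever
    neighbouring points are closer than `gap` metres (assumed threshold)."""
    clusters, current = [], [points[0]]
    for p, q in zip(points, points[1:]):
        if math.dist(p, q) < gap:
            current.append(q)
        else:
            clusters.append(current)
            current = [q]
    clusters.append(current)
    return clusters

def centroid(cluster):
    """Cluster centre, used as a candidate leg position."""
    xs, ys = zip(*cluster)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def pair_legs(leg_positions, max_gap=0.4):
    """Merge two detected legs closer than 0.4 m (the rule in the text)
    into one person position by averaging the two leg positions."""
    people, used = [], set()
    for i, a in enumerate(leg_positions):
        for j, b in enumerate(leg_positions):
            if i < j and i not in used and j not in used \
                    and math.dist(a, b) < max_gap:
                people.append(((a[0] + b[0]) / 2, (a[1] + b[1]) / 2))
                used.update((i, j))
    return people
```

In practice the clusters would first be filtered by the trained leg classifier; here every cluster centroid is treated as a leg candidate.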
The data collected by the camera are processed as follows. For each image frame, histogram-of-oriented-gradients (HOG) features are extracted by computing the gradient values of different pixel blocks; the extracted features are fed into a support vector machine (SVM) classifier for training, yielding a feature database of the human body. The features extracted from the vision data are then compared with those produced by the trained classifier to identify the target person in the field of view, thereby realizing person recognition.
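The gradient-histogram computation at the core of HOG extraction can be illustrated with a minimal sketch for a single image cell. This is a simplified stand-in (no block normalisation, no SVM training); the cell size and bin count are conventional HOG defaults, not values from the patent.

```python
import math

def cell_hog(cell, n_bins=9):
    """Orientation histogram of gradients for one image cell (a 2-D list
    of grey values): central-difference gradients are binned by unsigned
    orientation (0-180 degrees), weighted by gradient magnitude."""
    h, w = len(cell), len(cell[0])
    hist = [0.0] * n_bins
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = cell[y][x + 1] - cell[y][x - 1]   # horizontal gradient
            gy = cell[y + 1][x] - cell[y - 1][x]   # vertical gradient
            mag = math.hypot(gx, gy)
            ang = math.degrees(math.atan2(gy, gx)) % 180.0
            hist[int(ang // (180.0 / n_bins)) % n_bins] += mag
    return hist
```

A full detector concatenates such histograms over all cells, normalises them per block, and feeds the resulting vector to the SVM.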
Step 2: match the pedestrian leg information obtained by the laser and the pedestrian image information obtained by the camera to the corresponding person using the Hungarian algorithm; the detections from the two sensors are associated according to matching rules, yielding a set of fused positions in which each visually identified person corresponds to a laser-identified person. Then an interacting multiple model (IMM) filter based on the Kalman filter (KF) fuses the information from the two-dimensional laser, the camera and the AOA beacon system to obtain a more accurate position of the target person.
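The laser/vision association step can be sketched as below. For small numbers of detections the optimal one-to-one assignment can be found by brute force over permutations, which returns the same matching the Hungarian algorithm computes in polynomial time; the 1.0 m gate and the midpoint fusion rule are assumptions for illustration.

```python
import math
from itertools import permutations

def match_detections(laser_pts, vision_pts, max_dist=1.0):
    """Associate laser and camera detections one-to-one by minimum total
    distance (brute force; the Hungarian algorithm gives the same optimum
    efficiently), then fuse each gated pair into a midpoint position."""
    if not laser_pts or not vision_pts:
        return []
    n = min(len(laser_pts), len(vision_pts))
    best, best_cost = None, float("inf")
    for perm in permutations(range(len(vision_pts)), n):
        cost = sum(math.dist(laser_pts[i], vision_pts[j])
                   for i, j in enumerate(perm))
        if cost < best_cost:
            best, best_cost = perm, cost
    # fused position: midpoint of each laser/vision pair within the gate
    return [((laser_pts[i][0] + vision_pts[j][0]) / 2,
             (laser_pts[i][1] + vision_pts[j][1]) / 2)
            for i, j in enumerate(best)
            if math.dist(laser_pts[i], vision_pts[j]) < max_dist]
```

The fused positions would then serve as the observations fed to the KF-based IMM filter.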
Step 3: assign a corresponding ID to the target person, store the processed laser and camera data by category, create newly matched objects as tracking objects and save them, and remove targets whose tracking has failed, so as to distinguish different target people.
Step 4: generate a target potential field using the fast marching method (FMM), then add a directional-gradient-field metric to an improved dynamic window approach (DWA) to constrain the robot's planned trajectories and select the optimal local planned trajectory. The specific method is as follows:
First, perceive the environment with the laser sensor and build a rolling grid map centered on the robot. To measure the time T for each point in the map to reach the target, establish a target potential field on the rolling grid map using the FMM algorithm; the time for a coordinate point (x, y) to reach the target position is denoted T(x, y). Taking the gradient of this potential field yields a directional gradient field, which provides a reference azimuth θ(x, y) for the robot at each coordinate on the map.
To select the optimal trajectory for the robot to move toward the target, the following evaluation method is used:
First, so that the robot moves effectively toward the target point, a goal cost function evaluating the validity of the robot's motion is introduced, as shown in the following formula:
where goal_cost is the trajectory-validity cost, evaluating whether the trajectory moves toward positions with low arrival-time values in the flow field; β is the impact factor of the robot's azimuth; (x_e, y_e) is the coordinate of the trajectory end position; θ_e is the robot's azimuth at the trajectory end; θ_r(x_e, y_e) is the reference azimuth given by the directional gradient field at the trajectory end; and T(x_e, y_e) is the time for the robot to reach the trajectory end.
When the difference between the direction at the trajectory end and the reference azimuth given by the directional gradient field increases, goal_cost is amplified by a certain multiple, so that trajectory evaluation tends to select trajectories consistent with the reference direction of the vector field.
To evaluate the cost of the trajectory endpoint relative to the target, an angle cost function evaluating the validity of the robot's direction of motion is introduced, as shown in the following formula:
where T(x_s, y_s) is the arrival time at the starting point of the robot's trajectory, d(x_s, y_s) is the distance from the trajectory starting point to the nearest obstacle, and α is the obstacle impact factor, evaluating the influence of obstacles on the planned path.
The sum of the goal cost function and the angle cost function is used as the overall cost function to evaluate the quality of the candidate trajectories toward the target point, and the optimal local planned trajectory is selected by minimizing the overall cost function.
The overall cost function is:
total_cost = goal_cost + angel_cost
where goal_cost is the goal cost function and angel_cost is the angle cost function.
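Since the goal_cost and angel_cost formulas themselves appear in this text only as images, the following sketch combines the quantities the text names (arrival time and heading error at the trajectory end, obstacle distance at the start) in one assumed functional form, to show how candidate trajectories could be scored and the minimum-cost one selected. The specific form is an assumption, not the patent's exact formula.

```python
import math

def total_cost(traj_end, theta_e, T, theta_ref,
               beta=1.0, alpha=1.0, T_start=0.0, d_obs=1.0):
    """Assumed-form DWA score: goal_cost grows with arrival time at the
    trajectory end and is amplified by the heading error against the
    gradient field; angel_cost penalises starting near obstacles."""
    xe, ye = traj_end
    # wrap heading error into [0, pi]
    err = abs((theta_e - theta_ref(xe, ye) + math.pi) % (2 * math.pi) - math.pi)
    goal_cost = T(xe, ye) * (1.0 + beta * err)
    angel_cost = alpha * T_start / max(d_obs, 1e-6)
    return goal_cost + angel_cost

def best_trajectory(candidates, T, theta_ref):
    """Pick the (end point, end azimuth) candidate with minimal total cost."""
    return min(candidates, key=lambda c: total_cost(c[0], c[1], T, theta_ref))
```

With a synthetic arrival-time field, the candidate ending nearer the goal (lower T) wins when headings are equal.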
The beneficial effects of the above technical solution are as follows. The mobile-robot autonomous following system and method based on multi-sensor fusion provided by the present invention use a Kalman filtering algorithm to correct the AOA information with the detection information of the laser and visual sensors, eliminating transient oscillations and obtaining a smooth person motion trajectory. At the same time, an improved DWA algorithm generates a directional gradient field on top of the target potential field; this field provides a reference azimuth for the robot at each map coordinate, against which the validity of the robot's heading is measured. This prevents the robot from blindly approaching the target while ignoring heading adjustment. The system and method enable a mobile robot to follow a person stably and autonomously in environments with dynamic obstacles, including cases where the target person is occluded by an obstacle. The method can be adapted to a variety of robot motion models and different working scenarios, so its scope of application is wide and its practicality strong.
Detailed description of the invention
Fig. 1 is a structural block diagram of a mobile-robot autonomous following system based on multi-sensor fusion provided by an embodiment of the present invention;
Fig. 2 is a flowchart of a mobile-robot autonomous following method based on multi-sensor fusion provided by an embodiment of the present invention;
Fig. 3 is a flowchart of the following navigation in the mobile-robot autonomous following method based on multi-sensor fusion of the present invention;
Fig. 4 is a schematic diagram of the directional gradient field established by the FMM algorithm provided by an embodiment of the present invention;
Fig. 5 is a schematic diagram of the robot kinematics modeling provided by an embodiment of the present invention;
Fig. 6 is a schematic diagram of the improved DWA algorithm provided by an embodiment of the present invention, wherein (a) shows the direction of motion given by the improved DWA algorithm during normal robot motion, (b) shows the direction of motion given when the robot encounters an obstacle, and (c) shows the case where the robot's direction of motion coincides with the reference direction of the improved DWA algorithm;
Fig. 7 is a comparison, provided by an embodiment of the present invention, of the person position trajectory obtained using only the AOA tag with the trajectory fused by Kalman filtering;
Fig. 8 is a comparison, provided by an embodiment of the present invention, of the person position detection results obtained by the laser and camera with those fused by Kalman filtering;
Fig. 9 is a schematic diagram, provided by an embodiment of the present invention, of the motion trajectory of following a target person in an obstacle environment.
In the figures: 1, person trajectory using only AOA information; 2, person trajectory obtained by fusing AOA, laser and camera information with Kalman filtering; 3, person trajectory obtained by joint laser and camera detection; 4, motion trajectory of the robot following the target; 5, actual motion trajectory of the target person.
Specific embodiment
Specific embodiments of the present invention are described in further detail below with reference to the accompanying drawings and examples. The following examples are intended to illustrate the present invention, not to limit its scope.
A mobile-robot autonomous following system based on multi-sensor fusion, as shown in Fig. 1, comprises an upper-layer navigation unit, a bottom-layer motion control unit and a power supply unit. The upper-layer navigation unit comprises a two-dimensional laser radar, a router, an AOA beacon system, a camera, an industrial PC and a TTL-to-USB module; the bottom-layer motion control unit comprises a robot body, an embedded development board and photoelectric encoders. The robot body is a two-wheel differential-drive chassis.
The two-dimensional laser radar detects planar position information within a fixed range and is connected to the industrial PC through one LAN port of the router, ensuring stable, safe and real-time data transmission between the laser radar and the industrial PC. The embedded development board is connected to the second LAN port of the router and implements the motion control of the robot. The industrial PC implements the upper-layer navigation planning and is connected to the third LAN port of the router; through the router's wired connections, the industrial PC sends control commands to the bottom-layer motion control unit and thereby controls the robot's velocity and direction of motion. The camera is mounted at the upper front of the robot to obtain images of the current field of view and is connected to the industrial PC, ensuring real-time and effective image transmission. The AOA beacon system comprises an AOA beacon base station and an AOA hand-held beacon: the base station is mounted on the robot, the hand-held beacon is carried by the target person to be followed, and the base station obtains the relative pose between itself and the hand-held beacon. The base station is connected to the industrial PC through the TTL-to-USB module so that the industrial PC receives the hand-held beacon's information in real time, enabling fusion of the laser radar and AOA beacon data. The robot's drive wheels use DC gear motors; the embedded development board is connected to a motor drive module, which drives the DC gear motors, and photoelectric encoders mounted on the wheel axles are connected to the embedded development board to measure wheel speed. The power supply unit is connected to the upper-layer navigation unit and the bottom-layer control unit respectively and powers the whole system. The power supply unit comprises an on-board battery and a power management module; the power management module is connected to the on-board battery, converts the battery voltage to the voltages required by each component of the system, and is connected to each component to power the whole system.
The software on the industrial PC implements target-person detection and following-path planning; it comprises a person detection unit and a following navigation unit, and implements the following functions:
(1) processing the data of the person to be detected collected by the two-dimensional laser radar and the camera;
(2) fusing the information from the two-dimensional laser, the camera and the AOA beacon system to obtain a more accurate position of the target person;
(3) assigning a corresponding ID to the target person, storing the processed laser and camera data by category, creating newly matched objects as tracking objects, and saving them;
(4) planning the robot's trajectory toward the target person and selecting the optimal local planned trajectory;
(5) sending control commands to the bottom-layer motion control unit, thereby controlling the robot's velocity and direction of motion.
In this embodiment, the model of the embedded control board is STM32F407VET6; the industrial PC model is GK400; the TTL-to-USB module is a CH340C; the 2D laser radar is a PEPPERL+FUCHS; the camera is a color monocular camera, model ASUS Xtion Pro; the industrial PC's underlying operating system is Ubuntu 16.04 LTS and the secondary operating system is ROS; the on-board battery is a Kaimeiwei 12V100A lithium battery; the power management module is an SD-50B-12; the motor drive module is a ZLAC706; the router is a NETGEAR R6020; the beacon base station and hand-held beacon are AOA devices.
A mobile-robot autonomous following method based on multi-sensor fusion, as shown in Fig. 2, comprises the following steps:
Step 1: collect data of the person to be detected with the two-dimensional laser radar and the camera, process the collected data, and identify the target person.
The data collected by the two-dimensional laser radar are processed as follows. First, the points returned by the laser are clustered: points separated by less than a certain threshold are assigned to one cluster, and geometric features are generated for each cluster. The geometric features include the number of laser points, the width and length of the cluster, and its distance and angle relative to the laser. A random forest classifier is trained on these geometric features to obtain a feature database of the human-leg model. Features are then extracted from the laser clusters in incoming laser data and compared with the feature database trained by the random forest, so as to detect human legs in the surrounding environment. When the distance between two detected legs is less than 0.4 m, the average of the two leg positions is taken as the position of the person.
The data collected by the camera are processed as follows. For each image frame, histogram-of-oriented-gradients (HOG) features are extracted by computing the gradient values of different pixel blocks; the extracted features are fed into a support vector machine (SVM) classifier for training, yielding a feature database of the human body. The features extracted from the vision data are then compared with those produced by the trained classifier to identify the target person in the field of view, thereby realizing person recognition.
Step 2: match the pedestrian leg information obtained by the laser and the pedestrian image information obtained by the camera to the corresponding person using the Hungarian algorithm; the detections from the two sensors are associated according to matching rules, yielding a set of fused positions in which each visually identified person corresponds to a laser-identified person. Then an interacting multiple model (IMM) filter based on the Kalman filter (KF) fuses the information from the two-dimensional laser, the camera and the AOA beacon system to obtain a more accurate position of the target person.
The Hungarian algorithm is a combinatorial optimization algorithm that solves the assignment problem in polynomial time. It matches the detections from the two sensors according to matching rules, yielding a set of fused positions in which each visually identified person corresponds to a laser-identified person. The fused data are shown in the following formula, where n is the number of people detected at time t and the set of person positions detected at time t is denoted z_t.
The IMM algorithm introduces multiple target motion models and is adaptive: it effectively adjusts the probability of each model and weights each model's state estimate by the corresponding probability, realizing tracking of the moving target. The IMM algorithm contains multiple filters, an interaction component, a model probability updater and an estimate mixer; the models jointly track the motion of a target through interaction, and transitions between models are determined by a Markov probability transition matrix. The present invention uses the aforementioned KF filter as the filter in the IMM algorithm, establishes multiple motion models for the target person, and fuses the information from the laser, vision and AOA tag with the KF-IMM algorithm.
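A minimal sketch of the IMM interaction step described above, assuming a given Markov transition matrix Pi: the transition matrix predicts the model probabilities, which then weight the per-model state estimates into a combined estimate. The per-model KF updates and the likelihood-based probability correction are omitted, and all names are illustrative.

```python
import numpy as np

def imm_interact(x_models, mu, Pi):
    """One IMM interaction step: Pi[i, j] is the probability of switching
    from model i to model j; mu holds the current model probabilities;
    x_models are the per-model state estimates."""
    mu_pred = Pi.T @ mu                    # predicted model probabilities
    mu_pred = mu_pred / mu_pred.sum()      # renormalise
    x_comb = sum(m * x for m, x in zip(mu_pred, x_models))
    return mu_pred, x_comb
```

With an identity transition matrix and equal probabilities, the combined estimate is simply the average of the model estimates.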
A key factor of the IMM algorithm is the determination of the target motion models, which should reflect the target's actual motion as faithfully as possible. The present invention studies the motion models of the common movements of a target person.
The general motion model, divided into a prediction process and an update process, is as follows. The prediction and update processes for the target person are expressed as:
X(k) = F(k-1)X(k-1) + W(k-1)
Z(k) = hX(k) + V(k)
where k denotes the sampling instant; X(k) ∈ R^n is the state vector of the prediction process; F is the n-dimensional system transition matrix; Z(k) ∈ R^m is the measurement vector of the update process; W(k) ~ N(0, Q) and V(k) ~ N(0, R) are the Gaussian process noise and measurement noise respectively; the state vector and measurement vector differ according to the selected model.
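The prediction and update processes above are those of a standard Kalman filter. As a generic sketch (not the patent's exact implementation), they can be written as:

```python
import numpy as np

def kf_predict(x, P, F, Q):
    """Prediction: X(k) = F X(k-1) + W(k-1), W ~ N(0, Q)."""
    return F @ x, F @ P @ F.T + Q

def kf_update(x, P, z, H, R):
    """Update with measurement Z(k) = H X(k) + V(k), V ~ N(0, R)."""
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x_new = x + K @ (z - H @ x)            # corrected state
    P_new = (np.eye(len(x)) - K @ H) @ P   # corrected covariance
    return x_new, P_new
```

For the person-tracking application, x would hold the person's global coordinates (and derivatives), F would be one of the CV/CA/CT transition matrices, and z a fused sensor position.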
This embodiment models the motion process of the followed target person with three motion models: the constant velocity (CV) model, the constant acceleration (CA) model and the constant turn (CT) model.
In the dynamic models of the CA, CT and CV motion models, x and y denote the position of the target person, ẋ and ẏ denote the velocity, ẍ and ÿ denote the acceleration, ω denotes the turn rate, w(t) denotes Gaussian white noise, and T denotes the sampling period.
(1) The CV model selects the state variable X = [x, ẋ, y, ẏ]′, with ẍ and ÿ treated as random noise, i.e. ẍ = w_x(t), ÿ = w_y(t). The prediction-process state equation of the CV model is then:
X(k) = F_CV(k-1)X(k-1) + W(k-1)
where F_CV = diag{A, A}, A is the 2×2 Newton matrix, and W(k-1) = [W_x, W_y]′ is zero-mean Gaussian white noise.
(2) The CT model selects the state variable X = [x, ẋ, y, ẏ, ω]′, with ẍ and ÿ treated as random noise, i.e. ẍ = w_x(t), ÿ = w_y(t), where w(t) is a white-noise process. The prediction-process state equation of the CT model is then:
X(k) = F_CT(k-1)X(k-1) + W(k-1)
where W(k-1) = [W_x, W_y, W_ω]′ is zero-mean Gaussian white noise.
(3) The CA model selects the state variable X = [x, ẋ, ẍ, y, ẏ, ÿ]′, in which the rates of change of ẍ and ÿ are treated as random noise, where w(t) is a white-noise process. The prediction-process state equation of the CA model is then:
X(k) = F_CA(k-1)X(k-1) + W(k-1)
where F_CA = diag{A, A}, A is the 3×3 Newton matrix, and W(k-1) = [W_x, W_y]′ is zero-mean Gaussian white noise.
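Under the stated structure (F_CV = diag{A, A} with a 2×2 Newton matrix; F_CA = diag{A, A} with a 3×3 Newton matrix), the transition matrices can be constructed as below. The coordinated-turn form used for F_CT is a textbook assumption, since the patent's own F_CT formula appears here only as an image.

```python
import math
import numpy as np

def newton_matrix(n, T):
    """n x n Newton (Taylor-expansion) matrix, e.g. [[1, T], [0, 1]] for
    n = 2; used as the diagonal blocks of the transition matrices."""
    A = np.eye(n)
    for i in range(n):
        for j in range(i + 1, n):
            A[i, j] = T ** (j - i) / math.factorial(j - i)
    return A

def F_CV(T):
    """CV model, state [x, vx, y, vy]: F_CV = diag{A, A}, A 2x2."""
    return np.kron(np.eye(2), newton_matrix(2, T))

def F_CA(T):
    """CA model, state [x, vx, ax, y, vy, ay]: F_CA = diag{A, A}, A 3x3."""
    return np.kron(np.eye(2), newton_matrix(3, T))

def F_CT(T, w):
    """CT model, state [x, vx, y, vy, w]: standard coordinated-turn form
    at turn rate w (assumed; not the patent's shown matrix)."""
    s, c = math.sin(w * T), math.cos(w * T)
    F = np.eye(5)
    F[0, 1], F[0, 3] = s / w, -(1 - c) / w
    F[1, 1], F[1, 3] = c, -s
    F[2, 1], F[2, 3] = (1 - c) / w, s / w
    F[3, 1], F[3, 3] = s, c
    return F
```

Any of these matrices can be passed as F to the Kalman-filter prediction step, matching the F ∈ {F_CA, F_CV, F_CT} selection described in the text.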
In the prediction stage, each motion model is assigned a probability value as its model probability, denoted W_i, where a superscript 0 indicates the initial probability and t-1 indicates the probability at time t-1; i = 0, 1, 2 corresponds in turn to the three motion models (CA, CT, CV), and ΣW_i = 1.
The process of locating the target person in this embodiment is as follows. The position obtained from the AOA tag is used as the initial value X of the person state, and the person state is represented by the coordinates (x, y) in the global coordinate system. The prediction process of the Kalman filter is estimated for all three person motion models; the corresponding state transition matrix F and motion noise W are obtained from the motion models above, where F ∈ {F_CA, F_CV, F_CT}. The information from the two-dimensional laser, the camera and the AOA beacon system is fused to obtain a more accurate target person position in two steps. The first step updates with the AOA tag information: since the position obtained from the AOA tag fluctuates considerably, this data is smoothed by a sliding filter before the first update, and the observation noise matrix for this update is generally set small. The second step first fuses the laser and visual sensor data with the Hungarian algorithm and then uses the fused data as the observation of the target person. Although the information obtained by the AOA tag fluctuates considerably, its value does not deviate far from the true value; therefore the value in z_t nearest to the AOA estimate is chosen, and if its distance to the AOA estimate is less than 0.8 m, a second update is performed. The size of the second measurement noise matrix is correlated with the distance between the two nearest position data in z_t. Taking uniform linear motion of the robot as an example, the pseudocode of the Kalman filtering fusion of AOA information with laser and visual information is shown in Table 1.
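The nearest-candidate gating used for the second update can be sketched as follows; the 0.8 m gate comes from the text, while the function name is illustrative.

```python
import math

def second_update_gate(z_candidates, x_aoa, gate=0.8):
    """Among the fused laser/vision detections z_t, pick the one nearest
    the AOA-smoothed estimate; accept it as a second measurement update
    only if it lies within the 0.8 m gate, otherwise return None."""
    if not z_candidates:
        return None
    z = min(z_candidates, key=lambda p: math.dist(p, x_aoa))
    return z if math.dist(z, x_aoa) < gate else None
```

When no candidate passes the gate, the filter would simply keep the AOA-only update for that cycle.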
Table 1 Pseudocode of the Kalman filtering algorithm
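Since the pseudocode of Table 1 is not reproduced in this text, the two-stage update described above can be sketched in Python as follows; the constant-velocity model, the noise values and the helper names are illustrative assumptions, with only the sliding-average AOA pre-filter, the 0.8 m gate and the distance-dependent second noise matrix taken from the description:

```python
import numpy as np

dt = 0.1
F = np.array([[1, 0, dt, 0],   # constant-velocity transition, state (x, y, vx, vy)
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1.0]])
H = np.array([[1, 0, 0, 0],    # both sensor streams observe position only
              [0, 1, 0, 0.0]])
Q = np.eye(4) * 0.01           # process noise (illustrative)
R_aoa = np.eye(2) * 0.05       # first update: smoothed AOA, small noise
GATE = 0.8                     # metres, gate for the second update

def kf_update(x, P, z, R):
    """Standard Kalman measurement update."""
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ (z - H @ x), (np.eye(4) - K @ H) @ P

def step(x, P, aoa_window, z_candidates):
    # Predict with the CV model.
    x, P = F @ x, F @ P @ F.T + Q
    # First update: sliding average of recent AOA positions.
    z_bar = np.mean(aoa_window, axis=0)
    x, P = kf_update(x, P, z_bar, R_aoa)
    # Second update: fused laser/vision candidate nearest to z_bar, gated at 0.8 m.
    d = [np.linalg.norm(z - z_bar) for z in z_candidates]
    i = int(np.argmin(d))
    if d[i] < GATE:
        R2 = np.eye(2) * max(d[i], 0.01)  # noise grows with the match distance
        x, P = kf_update(x, P, z_candidates[i], R2)
    return x, P

x, P = np.array([0, 0, 1, 0.0]), np.eye(4)
aoa = [np.array([0.12, 0.01]), np.array([0.08, -0.02])]
x, P = step(x, P, aoa, [np.array([0.1, 0.0]), np.array([3.0, 3.0])])
print(np.round(x[:2], 3))
```

The far-away candidate at (3.0, 3.0) is rejected by the 0.8 m gate, so only the nearby fused detection corrects the state.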
Step 3: a corresponding ID is assigned to each target person, and the processed two-dimensional laser and camera data are classified and stored; newly matched objects are created as tracked objects and saved, and targets whose tracking has failed are removed, so as to distinguish different target persons;
Step 4: a target potential field is generated using the fast marching method (FMM); a directional-gradient-field metric is then added to an improved dynamic window approach (DWA) to constrain the planned trajectories of the robot, and the locally optimal planned trajectory is selected, as shown in Figure 3. The specific method is as follows:
First, the environment is perceived with the laser sensor and a rolling grid map centered on the robot is built. To measure the time T for each point in the map to reach the target point, a target potential field is built on the rolling grid map with the FMM algorithm; T(x, y) denotes the time for coordinate point (x, y) to reach the target position. Taking the gradient of this potential field yields a directional gradient field, as shown in Figure 4, which provides a reference azimuth θ(x, y) for the robot at every coordinate on the map;
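A coarse sketch of the target potential field and directional gradient field described above: a Dijkstra sweep on an 8-connected grid stands in for a true fast-marching eikonal solver, and the reference azimuth is taken as the downhill direction of T; the grid size, obstacle layout and function names are illustrative:

```python
import heapq
import numpy as np

def arrival_time_field(grid, goal, speed=1.0):
    """Approximate the target potential field T(x, y): travel time from every
    free cell to the goal, computed by a Dijkstra sweep on an 8-connected grid
    (a coarse stand-in for a true fast-marching eikonal solver).
    grid: 2D array, 1 = obstacle, 0 = free."""
    T = np.full(grid.shape, np.inf)
    T[goal] = 0.0
    pq = [(0.0, goal)]
    while pq:
        t, (i, j) = heapq.heappop(pq)
        if t > T[i, j]:
            continue
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                ni, nj = i + di, j + dj
                if (di or dj) and 0 <= ni < grid.shape[0] \
                        and 0 <= nj < grid.shape[1] and grid[ni, nj] == 0:
                    nt = t + np.hypot(di, dj) / speed
                    if nt < T[ni, nj]:
                        T[ni, nj] = nt
                        heapq.heappush(pq, (nt, (ni, nj)))
    return T

def reference_azimuth(T):
    """Directional gradient field: the reference azimuth theta(x, y) points
    down the gradient of T, i.e. toward decreasing arrival time."""
    finite = np.isfinite(T)
    Tf = np.where(finite, T, T[finite].max())  # cap obstacle cells for gradient
    gy, gx = np.gradient(Tf)
    return np.arctan2(-gy, -gx)

grid = np.zeros((10, 10))
grid[4, 2:8] = 1                     # a wall the field must flow around
T = arrival_time_field(grid, goal=(8, 5))
theta = reference_azimuth(T)
```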
To select the optimal trajectory for the robot to move toward the target, the following evaluation method is adopted:
First, to make the robot move effectively toward the target point, a goal cost function evaluating the validity of the robot's motion is introduced, as shown in the following formula:
where goal_cost is the trajectory validity cost, used to evaluate whether this trajectory moves toward positions with low arrival-time field values; β is the impact factor of the robot azimuth; (xe, ye) is the coordinate of the trajectory end position; θe is the azimuth of the robot at the trajectory end; θr(xe, ye) is the reference azimuth given by the directional gradient field at the trajectory end position; and T(xe, ye) is the time for the robot to reach the trajectory end;
When the difference between the trajectory-end direction and the reference azimuth given by the directional gradient field grows, goal_cost is amplified by a certain factor, so that trajectory evaluation tends to select trajectories consistent with the reference direction of the vector field;
The goal cost function introduced to evaluate the validity of the robot's motion is, in essence, a model of the robot kinematics, as shown in Figure 5.
To evaluate the cost of the trajectory endpoint moving toward the target, an angle cost function evaluating the validity of the robot's motion direction is introduced, as shown in the following formula:
where T(xs, ys) is the arrival time of the starting point of the robot's motion trajectory; when T(xs, ys) is very small, angel_cost grows quickly, making the robot more inclined to select trajectories consistent with the reference direction of the directional gradient field; d(xs, ys) is the distance from the trajectory starting point to the nearest obstacle; when d(xs, ys) is very small, the robot quickly adjusts its motion direction to the reference direction of the gradient field to avoid being trapped by obstacles; α is the obstacle impact factor, used to evaluate the influence of obstacles on the planned path;
The sum of the goal cost function and the angle cost function is used as the total cost function to evaluate the superiority of the trajectories along which the robot advances toward the target point, and the locally optimal planned trajectory is selected by minimizing the total cost function. As shown in Figure 6(a), the direction of the middle arrow differs greatly from the reference direction given by the gradient field, so the cost of the corresponding trajectory is amplified by a certain factor; although the end of the lower arrow is farther from the target point, its final direction differs little from the gradient-field reference direction, so the cost of that trajectory is lower than that of the middle-arrow trajectory. When the robot falls into a local optimum, all simulated trajectories obtained by trajectory sampling collide with obstacles; the method proposed by the present invention can also escape such situations. In Figure 6(b), when the robot is too close to an obstacle, the three forward-simulated trajectories all run into the obstacle ahead; as for the backward-simulated trajectories, a traditional DWA algorithm considers the current position closer to the target point with lower potential, and therefore never backs up. In the method proposed by the present invention, when evaluating the robot's cost, the direction at the current position differs greatly from the gradient-field reference direction, which increases the cost value; the backward-simulated lower-left arrow direction is consistent with the gradient-field reference direction, and although the distance to the target point increases, its cost is still smaller than that of the current position, so the robot chooses to back up and escape the predicament. Once the robot's azimuth is adjusted to be consistent with the gradient-field reference direction, as shown in Figure 6(c), the robot can advance toward the target point along the middle arrow direction.
The total cost function is shown in the following formula:
Total_cost=goal_cost+angel_cost
where goal_cost is the goal cost function and angel_cost is the angle cost function.
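The formula images for goal_cost and angel_cost are not reproduced in this text, so the sketch below uses plausible stand-in expressions chosen only to match the behaviour described above (goal_cost grows with the arrival time and is amplified by the azimuth mismatch through β; angel_cost grows when the start-point arrival time or the obstacle clearance shrinks, weighted by α); the exact formulas in the patent may differ:

```python
import math

def goal_cost(T_end, theta_end, theta_ref_end, beta=1.0):
    """Stand-in goal cost: a low arrival time at the trajectory end is good;
    a mismatch between the end azimuth theta_e and the reference azimuth
    theta_r(x_e, y_e) amplifies the cost by a factor controlled by beta."""
    mismatch = abs(math.atan2(math.sin(theta_end - theta_ref_end),
                              math.cos(theta_end - theta_ref_end)))
    return T_end * (1.0 + beta * mismatch)

def angel_cost(T_start, d_start, alpha=1.0):
    """Stand-in angle cost: grows quickly when the start-point arrival time
    T(x_s, y_s) or the clearance d(x_s, y_s) to the nearest obstacle is small."""
    return 1.0 / max(T_start, 1e-6) + alpha / max(d_start, 1e-6)

def total_cost(T_end, theta_end, theta_ref_end, T_start, d_start,
               beta=1.0, alpha=1.0):
    # Total_cost = goal_cost + angel_cost, minimized over sampled trajectories.
    return (goal_cost(T_end, theta_end, theta_ref_end, beta)
            + angel_cost(T_start, d_start, alpha))

# A trajectory aligned with the gradient field beats a misaligned one.
aligned = total_cost(2.0, 0.0, 0.0, T_start=3.0, d_start=1.0)
misaligned = total_cost(2.0, math.pi / 2, 0.0, T_start=3.0, d_start=1.0)
```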
The FMM algorithm numerically computes the viscosity solution of the eikonal equation, thereby solving the interface propagation problem. The eikonal equation is shown in the following formula (in standard form, |∇T(x)| W(x) = 1):
where x denotes a point in the search space, expressed in two-dimensional space as x = (x, y); T(x) is the time to travel from the starting point to point x; and W(x) is the local propagation speed of the interface at point x. By discretizing the gradient of T(x), the eikonal equation can be solved for every point x in the space, where x corresponds to the grid cell in row i and column j of the grid-represented planning space.
The DWA algorithm is a classic online local path-planning method that performs well in dynamic, uncertain environments. It samples multiple groups of velocity trajectories in the velocity space (v, w) and simulates the robot's trajectory under each velocity for a certain time. After obtaining these trajectory groups, it evaluates them and selects the velocity corresponding to the optimal trajectory to drive the robot. The highlight of the algorithm is the word "dynamic": according to the robot's acceleration and deceleration limits, the sampled velocities are restricted to a feasible dynamic range of the velocity space. Simulating the robot's trajectory requires its motion model. The two-wheel differential mobile robot used in this embodiment can only move forward or backward and rotate. Considering two adjacent time instants with a very short travel distance, the trajectory between two consecutive points can be regarded as a straight line, i.e. a displacement of vt·Δt along the robot coordinate frame. Projecting this onto the world coordinate frame yields the change of coordinates in the world frame. Assuming the robot pose at time t is (xt, yt, θt), the robot pose at time t+1 is computed by the following formulas:
xt+1=xt+vt*Δt*cosθt
yt+1=yt+vt*Δt*sinθt
θt+1t+wt*Δt
By sampling multiple groups of velocities in the robot's velocity space and computing the robot's expected pose under each velocity, the simulated trajectories of the robot are generated.
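The motion model and velocity sampling described above can be sketched as follows; the acceleration limits, step counts and function names are illustrative:

```python
import math

def simulate(x, y, theta, v, w, dt=0.1, steps=20):
    """Forward-simulate the two-wheel differential model of the text:
    x_{t+1} = x_t + v*dt*cos(theta_t), y_{t+1} = y_t + v*dt*sin(theta_t),
    theta_{t+1} = theta_t + w*dt."""
    traj = [(x, y, theta)]
    for _ in range(steps):
        x += v * dt * math.cos(theta)
        y += v * dt * math.sin(theta)
        theta += w * dt
        traj.append((x, y, theta))
    return traj

def sample_window(v0, w0, a_max=0.5, aw_max=1.0, dt=0.1, n=5):
    """Sample (v, w) pairs inside the dynamic window that the robot's
    acceleration limits allow around the current velocities (v0, w0)."""
    vs = [v0 + a_max * dt * (2 * k / (n - 1) - 1) for k in range(n)]
    ws = [w0 + aw_max * dt * (2 * k / (n - 1) - 1) for k in range(n)]
    return [(v, w) for v in vs for w in ws]

# One group of simulated trajectories around the current velocity (0.5, 0.0).
trajs = [simulate(0, 0, 0, v, w) for v, w in sample_window(0.5, 0.0)]
straight = simulate(0, 0, 0, 0.5, 0.0)   # w = 0: pure straight-line motion
```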
In this embodiment, when the target person is not occluded by obstacles, the trajectory obtained by fusing the AOA, laser and camera information with the Kalman filter is first compared experimentally with the person trajectory obtained from the AOA information alone. Figure 7 shows, while the robot follows the target person, the person position trajectory obtained from the AOA tag alone and the one obtained by Kalman-filter fusion. As can be seen from the figure, with AOA information alone the estimate of the person's pose is inaccurate and sometimes jumps violently. With the Kalman filtering algorithm, the detection information from the laser and camera is used to correct the AOA information, eliminating transient oscillations and yielding a smooth person trajectory.
This embodiment also compares the person-position detection results obtained from the laser and camera alone with those obtained by Kalman-filter fusion, as shown in Figure 8. As can be seen from the figure, the trajectory detected by the laser and camera is roughly the same as that obtained by the Kalman filter, but the Kalman-filtered trajectory is smoother.
When the target person is occluded by an obstacle, tracking based purely on laser and visual information frequently fails. In the present invention, fusing the laser and visual information through the Kalman filter effectively relieves the problem of violently fluctuating AOA signal values. Moreover, the AOA tag has a unique ID and can provide an initial value for laser identification, so pedestrian misidentification does not occur and the confidence is high. The trajectory obtained by Kalman-filter fusion of the laser, vision and AOA information uses the AOA information to re-match the person legs detected by the laser with the tracked person, and can successfully cope with the case where the laser and visual information cannot detect the person because the target person is occluded by an obstacle. Even in a multi-obstacle environment, the present system and method can effectively perform person detection and achieve a smooth following trajectory, as shown in Figure 9, where the black squares are obstacles.
Finally, it should be noted that the above embodiments are merely intended to illustrate the technical solutions of the present invention, not to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art should understand that the technical solutions described in the foregoing embodiments can still be modified, or some or all of their technical features can be equivalently replaced; such modifications or replacements do not make the essence of the corresponding technical solutions depart from the scope defined by the claims of the present invention.

Claims (7)

1. A multi-sensor-fusion-based autonomous following system for a mobile robot, characterized by comprising an upper-layer navigation unit, a bottom-layer motion control unit and a power supply unit; the upper-layer navigation unit comprises a two-dimensional laser radar, a router, an AOA beacon system, a camera, an industrial personal computer and a TTL-to-USB module; the bottom-layer motion control unit comprises a robot body, an embedded development board and photoelectric encoders;
the two-dimensional laser radar is used to detect planar position information within a fixed range and is connected to the industrial personal computer through a first LAN port of the router, ensuring the stability, safety and real-time performance of data transmission between the laser radar and the industrial personal computer; the embedded development board is connected to a second LAN port of the router to realize motion control of the robot; the industrial personal computer realizes the upper-layer navigation planning and is connected to a third LAN port of the router; through the wired connection of the router, the industrial personal computer sends control instructions to the bottom-layer motion control unit and thereby controls the movement speed and direction of the robot; the camera is mounted on the upper front part of the robot and connected to the industrial personal computer for obtaining image information in the current field of view; the AOA beacon system comprises an AOA beacon base station and an AOA handheld beacon, wherein the AOA beacon base station is mounted on the robot, the AOA handheld beacon is held in the hand of the target person to be followed, and the AOA beacon base station obtains the relative pose of the AOA handheld beacon; the AOA beacon base station is connected to the industrial personal computer through the TTL-to-USB module so that the industrial personal computer receives the AOA handheld beacon information in real time, thereby realizing the data fusion of the laser radar and AOA beacon system information; the moving wheels of the robot use DC gear motors; the embedded development board is connected to a motor drive module, the motor drive module is connected to the DC gear motors, and the photoelectric encoders are mounted on the robot wheel axles and connected to the embedded development board for obtaining the robot wheel speeds; the program built into the industrial personal computer realizes the detection of the target person and the planning of the following path; the power supply unit is separately connected to the upper-layer navigation unit and the bottom-layer motion control unit, and supplies power to the whole system.
2. The multi-sensor-fusion-based autonomous following system for a mobile robot according to claim 1, characterized in that the power supply unit comprises an on-vehicle battery and a power management module, wherein the power management module is connected to the on-vehicle battery, converts the on-vehicle battery voltage into the voltages required by each component in the system, and is connected to each component to supply power to the whole system.
3. The multi-sensor-fusion-based autonomous following system for a mobile robot according to claim 1, characterized in that the robot body is a two-wheel differential trolley.
4. The multi-sensor-fusion-based autonomous following system for a mobile robot according to claim 1, characterized in that the program built into the industrial personal computer comprises a person detection unit and a following navigation unit, for realizing the following functions:
(1) processing the information data of the person to be detected acquired by the two-dimensional laser radar and the camera;
(2) fusing the information from the two-dimensional laser, the camera and the AOA beacon system to obtain more accurate target-person position information;
(3) assigning a corresponding ID to the target person, classifying and storing the processed two-dimensional laser and camera data, and creating and saving newly matched objects as tracked objects;
(4) planning and computing the trajectory along which the robot moves toward the target person, and selecting the locally optimal planned trajectory;
(5) sending control instructions to the bottom-layer motion control unit, thereby controlling the movement speed and direction of the robot.
5. A multi-sensor-fusion-based autonomous following method for a mobile robot, realized using the multi-sensor-fusion-based autonomous following system for a mobile robot according to claim 1, characterized by comprising the following steps:
Step 1: acquiring the information data of the person to be detected through the two-dimensional laser radar and the camera, processing the acquired data, and identifying the target person;
Step 2: performing person-correspondence matching between the pedestrian leg information obtained by the laser and the pedestrian image information obtained by the camera using the Hungarian algorithm, matching the information identified by the two according to the corresponding rule to obtain a group of fused positions of corresponding visually identified persons and laser-identified persons; then fusing the information from the two-dimensional laser, the camera and the AOA beacon system using an interacting multiple model filter based on Kalman filters, to obtain more accurate target-person position information;
Step 3: assigning a corresponding ID to each target person, classifying and storing the processed two-dimensional laser and camera data, creating and saving newly matched objects as tracked objects, and removing targets whose tracking has failed, so as to distinguish different target persons;
Step 4: generating a target potential field using the fast marching method, then adding a directional-gradient-field metric to an improved dynamic window approach to constrain the planned trajectories of the robot, and selecting the locally optimal planned trajectory.
6. The multi-sensor-fusion-based autonomous following method for a mobile robot according to claim 5, characterized in that the method for processing the data acquired by the two-dimensional laser radar in step 1 is as follows: the points returned by the laser are first clustered, points closer than a certain threshold being divided into one cluster, and geometric features are generated for these clusters; the geometric features include the number of laser points, the width and length of the laser cluster, and the distance and angle relative to the laser; based on these geometric features, classification is performed with a random forest classifier, and a matching template of the human-leg model is trained; then feature extraction is performed on the laser clusters in the laser data, and the extracted features are compared through the random forest with the trained matching template to detect human legs in the surrounding environment; when the distance between two detected legs is less than 0.4 m, the average of the two leg positions is taken as the position of a new leg;
the method for processing the data acquired by the camera is as follows: for each frame of image, the HOG feature extraction method is used to compute the gradient values of different pixel blocks, and the features extracted according to the computed gradient values are fed into a support vector machine classifier for training, yielding a matching template of the human body; then the features extracted from the vision data are compared with the features trained by the classifier to identify the target person in the field of view, thereby realizing person identification.
7. The multi-sensor-fusion-based autonomous following method for a mobile robot according to claim 5, characterized in that the specific method of step 4 is:
first, the environment is perceived with the laser sensor and a rolling grid map centered on the robot is built; to measure the time T for each point in the map to reach the target point, a target potential field is built on the rolling grid map using the fast marching method, T(x, y) denoting the time for coordinate point (x, y) to reach the target position; taking the gradient of this potential field yields a directional gradient field, which provides a reference azimuth θ(x, y) for the robot at each coordinate on the map;
to select the optimal trajectory for the robot to move toward the target, the following evaluation method is adopted:
first, to make the robot move effectively toward the target point, a goal cost function evaluating the validity of the robot's motion is introduced, as shown in the following formula:
where goal_cost is the trajectory validity cost, used to evaluate whether this trajectory moves toward positions with low arrival-time field values; β is the impact factor of the robot azimuth; (xe, ye) is the coordinate of the trajectory end position; θe is the azimuth of the robot at the trajectory end; θr(xe, ye) is the reference azimuth given by the directional gradient field at the trajectory end position; and T(xe, ye) is the time for the robot to reach the trajectory end;
when the difference between the trajectory-end direction and the reference azimuth given by the directional gradient field grows, goal_cost is amplified by a certain factor, so that trajectory evaluation tends to select trajectories consistent with the reference direction of the vector field;
to evaluate the cost of the trajectory endpoint moving toward the target, an angle cost function evaluating the validity of the robot's motion direction is introduced, as shown in the following formula:
where T(xs, ys) is the arrival time of the starting point of the robot's motion trajectory, d(xs, ys) is the distance from the trajectory starting point to the nearest obstacle, and α is the obstacle impact factor, used to evaluate the influence of obstacles on the planned path;
the sum of the goal cost function and the angle cost function is used as the total cost function to evaluate the superiority of the trajectories along which the robot advances toward the target point, and the locally optimal planned trajectory is selected by minimizing the total cost function;
the total cost function is shown in the following formula:
Total_cost = goal_cost + angel_cost
where goal_cost is the goal cost function and angel_cost is the angle cost function.
CN201910326362.2A 2019-04-23 2019-04-23 Mobile robot autonomous following method based on multi-sensor fusion Active CN109947119B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910326362.2A CN109947119B (en) 2019-04-23 2019-04-23 Mobile robot autonomous following method based on multi-sensor fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910326362.2A CN109947119B (en) 2019-04-23 2019-04-23 Mobile robot autonomous following method based on multi-sensor fusion

Publications (2)

Publication Number Publication Date
CN109947119A true CN109947119A (en) 2019-06-28
CN109947119B CN109947119B (en) 2021-06-29

Family

ID=67015959

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910326362.2A Active CN109947119B (en) 2019-04-23 2019-04-23 Mobile robot autonomous following method based on multi-sensor fusion

Country Status (1)

Country Link
CN (1) CN109947119B (en)

Cited By (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110427034A (en) * 2019-08-13 2019-11-08 浙江吉利汽车研究院有限公司 A kind of target tracking system and method based on bus or train route collaboration
CN110609561A (en) * 2019-11-18 2019-12-24 深圳市优必选科技股份有限公司 Pedestrian tracking method and device, computer readable storage medium and robot
CN110865534A (en) * 2019-11-15 2020-03-06 合肥工业大学 Intelligent following system with improved Kalman filtering for navigation positioning
CN110879592A (en) * 2019-11-08 2020-03-13 南京航空航天大学 Artificial potential field path planning method based on escape force fuzzy control
CN111089590A (en) * 2019-12-09 2020-05-01 泉州装备制造研究所 Method for tracking human leg by mobile robot through fusion of vision and laser
CN111103887A (en) * 2020-01-14 2020-05-05 大连理工大学 Multi-sensor-based multi-mobile-robot scheduling system design method
CN111708042A (en) * 2020-05-09 2020-09-25 汕头大学 Robot method and system for pedestrian trajectory prediction and following
CN112015186A (en) * 2020-09-09 2020-12-01 上海有个机器人有限公司 Robot path planning method and device with social attributes and robot
CN112083732A (en) * 2020-10-28 2020-12-15 中航华东光电(上海)有限公司 Robot navigation method and system for detecting visible line laser
CN112148011A (en) * 2020-09-24 2020-12-29 东南大学 Electroencephalogram mobile robot sharing control method under unknown environment
CN112237400A (en) * 2020-09-04 2021-01-19 安克创新科技股份有限公司 Method for area division, self-moving robot and computer storage medium
CN112488068A (en) * 2020-12-21 2021-03-12 重庆紫光华山智安科技有限公司 Method, device and equipment for searching monitoring target and computer storage medium
CN112509264A (en) * 2020-11-19 2021-03-16 深圳市欧瑞博科技股份有限公司 Abnormal intrusion intelligent shooting method and device, electronic equipment and storage medium
CN112834764A (en) * 2020-12-28 2021-05-25 深圳市人工智能与机器人研究院 Sampling control method and device of mechanical arm and sampling system
CN112904855A (en) * 2021-01-19 2021-06-04 四川阿泰因机器人智能装备有限公司 Follow-up robot local path planning method based on improved dynamic window
CN112890680A (en) * 2019-11-19 2021-06-04 科沃斯机器人股份有限公司 Follow-up cleaning operation method, control method, device, robot and storage medium
CN112965081A (en) * 2021-02-05 2021-06-15 浙江大学 Simulated learning social navigation method based on feature map fused with pedestrian information
CN113126600A (en) * 2019-12-26 2021-07-16 沈阳新松机器人自动化股份有限公司 Follow system and article transfer cart based on UWB
CN113156933A (en) * 2020-12-30 2021-07-23 徐宁 Robot traveling control system and method
CN113222122A (en) * 2021-06-01 2021-08-06 重庆大学 High-quality neural network system suitable for singlechip
CN113221755A (en) * 2021-05-14 2021-08-06 深圳中智永浩机器人有限公司 Robot chassis foot-pressing prevention method and device, computer equipment and storage medium
TWI742644B (en) * 2020-05-06 2021-10-11 東元電機股份有限公司 Following mobile platform and method thereof
CN113504777A (en) * 2021-06-16 2021-10-15 广州市东崇科技有限公司 Artificial intelligent automatic following method and system for AGV
CN113535861A (en) * 2021-07-16 2021-10-22 子亥科技(成都)有限公司 Track prediction method for multi-scale feature fusion and adaptive clustering
CN113671940A (en) * 2020-05-14 2021-11-19 东元电机股份有限公司 Following mobile platform and method thereof
CN113741550A (en) * 2020-05-15 2021-12-03 北京机械设备研究所 Mobile robot following method and system
CN113885487A (en) * 2020-07-02 2022-01-04 苏州艾吉威机器人有限公司 Path tracking method, system, device and computer readable storage medium
WO2022016941A1 (en) * 2020-07-20 2022-01-27 华为技术有限公司 Method and device for planning obstacle avoidance path for traveling device
CN114061590A (en) * 2021-11-18 2022-02-18 北京仙宇科技有限公司 Method for dynamically creating robot cruise coordinate and robot navigation method
CN114216463A (en) * 2021-11-04 2022-03-22 国家电网有限公司 Path optimization target positioning method and device, storage medium and unmanned equipment
CN114237256A (en) * 2021-12-20 2022-03-25 东北大学 Three-dimensional path planning and navigation method suitable for under-actuated robot
CN114326732A (en) * 2021-12-28 2022-04-12 无锡笠泽智能科技有限公司 Robot autonomous following system and autonomous following control method
WO2022166067A1 (en) * 2021-02-04 2022-08-11 武汉工程大学 System and method for coordinated traction of multi-machine heavy-duty handling robot
US11458627B2 (en) 2020-08-13 2022-10-04 National Chiao Tung University Method and system of robot for human following
WO2022228019A1 (en) * 2021-04-25 2022-11-03 深圳市优必选科技股份有限公司 Moving target following method, robot, and computer-readable storage medium
CN115437299A (en) * 2022-10-10 2022-12-06 北京凌天智能装备集团股份有限公司 Accompanying transportation robot advancing control method and system
US20230091806A1 (en) * 2021-09-23 2023-03-23 Honda Motor Co., Ltd. Inverse optimal control for human approach
CN115857408A (en) * 2022-12-09 2023-03-28 中国船舶集团有限公司第七一六研究所 Unmanned vehicle autonomous following system based on target recognition

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9889566B2 (en) * 2015-05-01 2018-02-13 General Electric Company Systems and methods for control of robotic manipulation
CN107765220A (en) * 2017-09-20 2018-03-06 武汉木神机器人有限责任公司 Pedestrian's system for tracking and method based on UWB and laser radar mixed positioning
US20180149753A1 (en) * 2016-11-30 2018-05-31 Yujin Robot Co., Ltd. Ridar apparatus based on time of flight and moving object
CN109129507A (en) * 2018-09-10 2019-01-04 北京联合大学 A kind of medium intelligent introduction robot and explanation method and system


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
AMR MOHAMED; JING REN; HAOXIANG LANG; MOUSTAFA EL-GINDY: "Optimal collision free path planning for an autonomous articulated vehicle with two trailers", 《2017 IEEE INTERNATIONAL CONFERENCE ON INDUSTRIAL TECHNOLOGY (ICIT)》 *
RODRIGO VENTURA,AAMIR AHMAD: "Towards Optimal Robot Navigation in Domestic Spaces", 《LECTURE NOTES IN COMPUTER SCIENCE》 *
LI PENGFEI: "Pedestrian Detection and Tracking for Service Robots", 《China Master's Theses Full-text Database, Information Science and Technology》 *

Cited By (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110427034A (en) * 2019-08-13 2019-11-08 浙江吉利汽车研究院有限公司 A kind of target tracking system and method based on bus or train route collaboration
CN110879592A (en) * 2019-11-08 2020-03-13 南京航空航天大学 Artificial potential field path planning method based on escape force fuzzy control
CN110879592B (en) * 2019-11-08 2020-11-20 南京航空航天大学 Artificial potential field path planning method based on escape force fuzzy control
CN110865534A (en) * 2019-11-15 2020-03-06 合肥工业大学 Intelligent following system with improved Kalman filtering for navigation positioning
CN110609561A (en) * 2019-11-18 2019-12-24 深圳市优必选科技股份有限公司 Pedestrian tracking method and device, computer readable storage medium and robot
CN112890680B (en) * 2019-11-19 2023-12-12 科沃斯机器人股份有限公司 Follow-up cleaning operation method, control device, robot and storage medium
CN112890680A (en) * 2019-11-19 2021-06-04 科沃斯机器人股份有限公司 Follow-up cleaning operation method, control method, device, robot and storage medium
CN111089590A (en) * 2019-12-09 2020-05-01 泉州装备制造研究所 Method for tracking human leg by mobile robot through fusion of vision and laser
CN111089590B (en) * 2019-12-09 2021-10-15 泉州装备制造研究所 Method for tracking human leg by mobile robot through fusion of vision and laser
CN113126600A (en) * 2019-12-26 2021-07-16 沈阳新松机器人自动化股份有限公司 Follow system and article transfer cart based on UWB
CN111103887B (en) * 2020-01-14 2021-11-12 大连理工大学 Multi-sensor-based multi-mobile-robot scheduling system design method
CN111103887A (en) * 2020-01-14 2020-05-05 大连理工大学 Multi-sensor-based multi-mobile-robot scheduling system design method
TWI742644B (en) * 2020-05-06 2021-10-11 東元電機股份有限公司 Following mobile platform and method thereof
CN111708042A (en) * 2020-05-09 2020-09-25 汕头大学 Robot method and system for pedestrian trajectory prediction and following
CN113671940A (en) * 2020-05-14 2021-11-19 东元电机股份有限公司 Following mobile platform and method thereof
CN113741550B (en) * 2020-05-15 2024-02-02 北京机械设备研究所 Mobile robot following method and system
CN113741550A (en) * 2020-05-15 2021-12-03 北京机械设备研究所 Mobile robot following method and system
CN113885487A (en) * 2020-07-02 2022-01-04 苏州艾吉威机器人有限公司 Path tracking method, system, device and computer readable storage medium
WO2022016941A1 (en) * 2020-07-20 2022-01-27 华为技术有限公司 Method and device for planning obstacle avoidance path for traveling device
TWI780468B (en) * 2020-08-13 2022-10-11 國立陽明交通大學 Method and system of robot for human following
US11458627B2 (en) 2020-08-13 2022-10-04 National Chiao Tung University Method and system of robot for human following
CN112237400A (en) * 2020-09-04 2021-01-19 安克创新科技股份有限公司 Method for area division, self-moving robot and computer storage medium
CN112237400B (en) * 2020-09-04 2022-07-01 安克创新科技股份有限公司 Method for area division, self-moving robot and computer storage medium
CN112015186A (en) * 2020-09-09 2020-12-01 上海有个机器人有限公司 Robot path planning method and device with social attributes and robot
CN112148011B (en) * 2020-09-24 2022-04-15 东南大学 Electroencephalogram mobile robot sharing control method under unknown environment
CN112148011A (en) * 2020-09-24 2020-12-29 东南大学 Electroencephalogram mobile robot sharing control method under unknown environment
CN112083732A (en) * 2020-10-28 2020-12-15 中航华东光电(上海)有限公司 Robot navigation method and system for detecting visible line laser
CN112509264B (en) * 2020-11-19 2022-11-18 深圳市欧瑞博科技股份有限公司 Abnormal intrusion intelligent shooting method and device, electronic equipment and storage medium
CN112509264A (en) * 2020-11-19 2021-03-16 深圳市欧瑞博科技股份有限公司 Abnormal intrusion intelligent shooting method and device, electronic equipment and storage medium
CN112488068B (en) * 2020-12-21 2022-01-11 重庆紫光华山智安科技有限公司 Method, device and equipment for searching monitoring target and computer storage medium
CN112488068A (en) * 2020-12-21 2021-03-12 重庆紫光华山智安科技有限公司 Method, device and equipment for searching monitoring target and computer storage medium
CN112834764B (en) * 2020-12-28 2024-05-31 深圳市人工智能与机器人研究院 Sampling control method and device for mechanical arm and sampling system
CN112834764A (en) * 2020-12-28 2021-05-25 深圳市人工智能与机器人研究院 Sampling control method and device of mechanical arm and sampling system
CN113156933A (en) * 2020-12-30 2021-07-23 徐宁 Robot traveling control system and method
CN112904855B (en) * 2021-01-19 2022-08-16 四川阿泰因机器人智能装备有限公司 Follow-up robot local path planning method based on improved dynamic window
CN112904855A (en) * 2021-01-19 2021-06-04 四川阿泰因机器人智能装备有限公司 Follow-up robot local path planning method based on improved dynamic window
WO2022166067A1 (en) * 2021-02-04 2022-08-11 武汉工程大学 System and method for coordinated traction of multi-machine heavy-duty handling robot
CN112965081B (en) * 2021-02-05 2023-08-01 浙江大学 Simulated learning social navigation method based on feature map fused with pedestrian information
CN112965081A (en) * 2021-02-05 2021-06-15 浙江大学 Simulated learning social navigation method based on feature map fused with pedestrian information
WO2022228019A1 (en) * 2021-04-25 2022-11-03 深圳市优必选科技股份有限公司 Moving target following method, robot, and computer-readable storage medium
US12093054B2 (en) 2021-04-25 2024-09-17 Ubkang (Qingdao) Technology Co., Ltd. Moving target following method, robot and computer-readable storage medium
CN115552348A (en) * 2021-04-25 2022-12-30 深圳市优必选科技股份有限公司 Moving object following method, robot, and computer-readable storage medium
CN113221755A (en) * 2021-05-14 2021-08-06 深圳中智永浩机器人有限公司 Robot chassis foot-pressing prevention method and device, computer equipment and storage medium
CN113222122A (en) * 2021-06-01 2021-08-06 重庆大学 High-quality neural network system suitable for singlechip
CN113504777A (en) * 2021-06-16 2021-10-15 广州市东崇科技有限公司 Artificial intelligent automatic following method and system for AGV
CN113504777B (en) * 2021-06-16 2024-04-16 新疆美特智能安全工程股份有限公司 Automatic following method and system for artificial intelligence AGV trolley
CN113535861A (en) * 2021-07-16 2021-10-22 子亥科技(成都)有限公司 Track prediction method for multi-scale feature fusion and adaptive clustering
CN113535861B (en) * 2021-07-16 2023-08-11 子亥科技(成都)有限公司 Track prediction method for multi-scale feature fusion and self-adaptive clustering
US20230091806A1 (en) * 2021-09-23 2023-03-23 Honda Motor Co., Ltd. Inverse optimal control for human approach
CN114216463B (en) * 2021-11-04 2024-05-28 国家电网有限公司 Path optimization target positioning method and device, storage medium and unmanned equipment
CN114216463A (en) * 2021-11-04 2022-03-22 国家电网有限公司 Path optimization target positioning method and device, storage medium and unmanned equipment
CN114061590A (en) * 2021-11-18 2022-02-18 北京仙宇科技有限公司 Method for dynamically creating robot cruise coordinate and robot navigation method
CN114237256B (en) * 2021-12-20 2023-07-04 东北大学 Three-dimensional path planning and navigation method suitable for under-actuated robot
CN114237256A (en) * 2021-12-20 2022-03-25 东北大学 Three-dimensional path planning and navigation method suitable for under-actuated robot
CN114326732A (en) * 2021-12-28 2022-04-12 无锡笠泽智能科技有限公司 Robot autonomous following system and autonomous following control method
CN115437299A (en) * 2022-10-10 2022-12-06 北京凌天智能装备集团股份有限公司 Accompanying transportation robot advancing control method and system
CN115857408A (en) * 2022-12-09 2023-03-28 中国船舶集团有限公司第七一六研究所 Unmanned vehicle autonomous following system based on target recognition
CN115857408B (en) * 2022-12-09 2024-09-06 中国船舶集团有限公司第七一六研究所 Unmanned vehicle autonomous following system based on target identification

Also Published As

Publication number Publication date
CN109947119B (en) 2021-06-29

Similar Documents

Publication Publication Date Title
CN109947119A (en) Mobile robot autonomous tracking system and method based on multi-sensor fusion
CN110202583B (en) Humanoid manipulator control system based on deep learning and control method thereof
CN105701479B (en) Intelligent vehicle multilasered optical radar fusion identification method based on target signature
Dong et al. Real-time avoidance strategy of dynamic obstacles via half model-free detection and tracking with 2d lidar for mobile robots
Bucher et al. Image processing and behavior planning for intelligent vehicles
CN112101128B (en) Unmanned formula racing car perception planning method based on multi-sensor information fusion
Breitenmoser et al. A monocular vision-based system for 6D relative robot localization
CN107741234A (en) The offline map structuring and localization method of a kind of view-based access control model
Jean et al. Robust visual servo control of a mobile robot for object tracking using shape parameters
CN114998276B (en) Robot dynamic obstacle real-time detection method based on three-dimensional point cloud
CN108759839A (en) Unmanned vehicle path planning method based on situation space
CN109271857A (en) Pseudo lane line elimination method and device
Jin et al. A robust autonomous following method for mobile robots in dynamic environments
Adachi et al. Turning at intersections using virtual lidar signals obtained from a segmentation result
Brookshire Person following using histograms of oriented gradients
US11467592B2 (en) Route determination method
CN114636422A (en) Positioning and navigation method for information machine room scene
Kress et al. Pose based trajectory forecast of vulnerable road users
CN108152829A (en) Two-dimensional laser radar mapping device with an added linear guide rail and mapping method thereof
Nomatsu et al. Development of an autonomous mobile robot with self-localization and searching target in a real environment
Gebregziabher Multi object tracking for predictive collision avoidance
Zhang et al. Design of Blind Guiding Robot Based on Speed Adaptation and Visual Recognition
Huang et al. Multi-object detection, tracking and prediction in rugged dynamic environments
Wang et al. Agv navigation based on apriltags2 auxiliary positioning
Zhao et al. People following system based on LRF

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20231218
Address after: Room 4X-139, No. 96 Sanhao Street, Heping District, Shenyang City, Liaoning Province, 110057
Patentee after: Shenyang Ruige Holdings Co.,Ltd.
Address before: No.11, Wenhua Road, Sanxiang, Heping District, Shenyang City, Liaoning Province
Patentee before: Fang Zheng
Patentee before: Shenyang Ruige Holdings Co.,Ltd.

Effective date of registration: 20231218
Address after: No.11, Wenhua Road, Sanxiang, Heping District, Shenyang City, Liaoning Province
Patentee after: Fang Zheng
Patentee after: Shenyang Ruige Holdings Co.,Ltd.
Address before: 110819 No. 3 lane, Heping Road, Heping District, Shenyang, Liaoning 11
Patentee before: Northeastern University

Effective date of registration: 20240116
Address after: No. 94-2 Sanhao Street, Heping District, Shenyang City, Liaoning Province, 110057 (3008)
Patentee after: Ruige Intelligent Technology (Shenyang) Co.,Ltd.
Address before: Room 4X-139, No. 96 Sanhao Street, Heping District, Shenyang City, Liaoning Province, 110057
Patentee before: Shenyang Ruige Holdings Co.,Ltd.