
CN106525053A - Indoor positioning method for mobile robot based on multi-sensor fusion - Google Patents

Indoor positioning method for mobile robot based on multi-sensor fusion

Info

Publication number
CN106525053A
Authority
CN
China
Prior art keywords
icp
robot
delta
pose
track
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201611230784.2A
Other languages
Chinese (zh)
Inventor
刘召
宋立滨
于涛
陈恳
刘莉
陈洪安
张智祥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qing Hua Hua Yi (Tianjin) Education Technology Co., Ltd.
Original Assignee
Qing Yu Advantech Intelligent Robot (Tianjin) Co., Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qing Yu Advantech Intelligent Robot (Tianjin) Co., Ltd.
Priority to CN201611230784.2A
Publication of CN106525053A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 - Instruments for performing navigational calculations
    • G01C21/206 - Instruments for performing navigational calculations specially adapted for indoor navigation
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/14 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by recording the course traversed by the object

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides an indoor positioning method for a mobile robot based on multi-sensor fusion. Odometer position data obtained by dead reckoning is used to compensate laser scan-matching positioning in cases where the laser cannot distinguish similar-looking environments. The method fuses positioning data from the inertial unit and the wheel odometer, with lidar scan matching also taken as a reference, so that the accumulated error of the odometer is reduced and the inability of a short-range lidar to match and localize in a feature-poor environment is overcome, thereby supplying more accurate positioning data for the robot's mapping and navigation.

Description

Indoor positioning method for a mobile robot based on multi-sensor fusion
Technical field
The invention belongs to the field of indoor positioning for wheeled mobile robots, and more particularly relates to an indoor positioning method for a mobile robot based on multi-sensor fusion.
Background technology
When a mobile robot moves in an indoor environment, it must first know where it is: an accurate pose is the essential prerequisite for drawing an environmental map and for autonomous navigation. Indoor mobile robot localization has therefore long been a research hotspot, and a difficulty, in the robotics field.
With advances in sensing technology, the sensors used for indoor mobile robot localization are constantly being updated. Broadly speaking, current robot localization methods fall into two classes: relative localization and absolute localization. Relative localization means that the robot uses its own sensors, such as the odometer and gyroscope, to obtain the relative displacement and heading change over a short time and accumulates them onto the pose at the previous sampling instant to obtain the pose at the current time; the main techniques are dead reckoning and inertial navigation.
Dead reckoning is mainly suited to short-range positioning; over long distances the accumulated error of the encoders becomes obvious. Accelerometers and gyroscopes suffer from systematic errors and drift, which also produce accumulated error.
Absolute localization means the robot determines its pose in the world coordinate system directly from external sensors; common approaches are landmarks, GPS, and map matching. Landmarks are hard to maintain and require modifying the environment; GPS is intended for outdoor use and is not suitable indoors; map matching usually acquires environmental information with a laser radar (lidar) and matches successive scans to obtain the robot's global pose, but a long-range lidar is very expensive, while a short-range lidar cannot handle scenes such as corridors, where the features are similar and change little.
Previous work has proposed calibrating the odometer with laser scan matching at fixed intervals. However, this generally requires an expensive long-range lidar, and in indoor environments with few distinctive features the laser cannot infer the displacement from the matching result, so calibrating the odometer with its positioning data introduces errors. The present invention therefore proposes an indoor positioning method for a mobile robot based on multi-sensor fusion, in which odometer position data obtained by dead reckoning compensates for the cases where laser scan matching cannot recognize similar environments; it improves indoor positioning accuracy while remaining adaptable to a wider range of environments.
The content of the invention
In view of this, the invention aims to propose an indoor positioning method for a mobile robot based on multi-sensor fusion that provides more accurate positioning data for the robot's mapping and navigation.
To achieve the above purpose, the technical scheme of the invention is realized as follows:
An indoor positioning method for a mobile robot based on multi-sensor fusion: a positioning estimate is obtained with an ICP algorithm that matches laser data, and the pose change obtained by dead reckoning is used to compensate the positioning result.
Further, obtaining the positioning estimate with the ICP algorithm that matches laser data includes:
let the sampling period be Δt, and let the accurate pose of the robot before the i-th sampling period be
p = (x, y, θ)^T;
in the i-th sampling period, the pose change of the robot obtained by laser scan matching is
Δp_icp_i = [Δx_icp_i, Δy_icp_i, Δθ_icp_i]^T;
let the velocity of the robot at this time be v_icp_i = [v_icp_xi, v_icp_yi, w_icp_i]^T; the following relations then hold:
v_icp_xi = Δx_icp_i / Δt or v_icp_xi = Δy_icp_i / Δt, and ΔS_icp_i = Δx_icp_i or ΔS_icp_i = Δy_icp_i,
where ΔS_icp_i is the displacement of the robot within this sampling period as obtained by the matching algorithm.
Further, obtaining the positioning estimate with the ICP algorithm that matches laser data specifically includes:
(a1) record the laser scan at the current moment as the current scan D, and the scan at the previous moment as the reference scan M;
(b1) obtain the optimal transformation (R, T) that matches D to M with a point-to-line matching procedure, where R is the rotation matrix and T is the translation vector;
(c1) compute the pose change of the robot Δp_k = (Δx_k, Δy_k, Δθ_k)^T from (R, T); assuming the robot pose at time k is p_k = (x_k, y_k, θ_k)^T, the pose at time k+1 is
(x_{k+1}, y_{k+1}, θ_{k+1})^T = (x_k, y_k, θ_k)^T + [cosθ_k, sinθ_k, 0; -sinθ_k, cosθ_k, 0; 0, 0, 1]·(Δx_k, Δy_k, Δθ_k)^T;
(d1) record the current scan D as the new reference scan M, continue sampling laser data, and iterate again from step (a1).
Further, compensating the positioning result with the pose change obtained by dead reckoning includes:
a positioning estimate is obtained by dead reckoning, and the pose change of the robot within the i-th sampling period is
Δp_track_i = [Δx_track_i, Δy_track_i, Δθ_track_i]^T;
let the velocity of the robot at this time be v_track_i = [v_track_xi, v_track_yi, w_track_i]^T; then
v_track_xi = ΔS_track_i / Δt, v_track_yi = 0.0, w_i = Δθ_track_i / Δt,
where ΔS_track_i is the displacement of the robot within this sampling period as obtained by dead reckoning.
Further, compensating the positioning result with the pose change obtained by dead reckoning specifically includes:
(a2) assume that within one sampling period the motor encoder feedback is received and converted into the distances moved by the left and right wheels, ΔS_l and ΔS_r; the distance moved and the angle rotated by the robot within this sampling period are then
ΔS = (ΔS_r + ΔS_l) / 2,  Δθ = (ΔS_r - ΔS_l) / (2R),
where 2R is the wheel spacing of the robot, ΔS is the distance moved by the robot, and Δθ is the angle turned by the robot;
(b2) the motion of the robot in the world coordinate system is
Δx = ΔS·cos(θ + Δθ/2),  Δy = ΔS·sin(θ + Δθ/2),
where θ is the accumulated angle the robot has rotated through before this moment;
(c2) it follows that if the robot pose at time k is p_k = (x_k, y_k, θ_k) and the pose change within one sampling period is Δp_k = (Δx_k, Δy_k, Δθ_k), then by dead reckoning the pose of the robot at time k+1 is
p_{k+1} = (x_k + Δx_k, y_k + Δy_k, θ_k + Δθ)^T.
Further, the method also includes taking the difference between the pose change obtained by the positioning estimate of the laser-matching ICP algorithm and the pose change obtained by dead reckoning, yielding the pose difference Δp.
Further, if
Δp ≤ ξ, where ξ > 0,
is satisfied, the pose change obtained by the current laser scan matching is considered correct.
Further, if
Δp ≤ ξ, where ξ > 0,
is not satisfied, the pose change obtained by dead reckoning is used to compensate the positioning result.
Compared with the prior art, the indoor positioning method for a mobile robot based on multi-sensor fusion of the present invention has the following advantages:
(1) the present invention combines the environment-matching localization algorithm based on the lidar with dead reckoning based on the odometer to accomplish indoor positioning of the mobile robot;
(2) by compensating the lidar positioning data with odometer data obtained by dead reckoning, the present invention solves the problem that a short-range lidar cannot distinguish environments whose features are few and unchanging over long periods.
Description of the drawings
The accompanying drawing, which forms a part of the present invention, is provided for a further understanding of the invention; the schematic embodiments of the invention and their description serve to explain the invention and do not constitute an improper limitation of it. In the drawings:
Fig. 1 is a schematic diagram of the indoor positioning method for a mobile robot based on multi-sensor fusion described in the embodiment of the present invention.
Specific embodiment
It should be noted that, provided there is no conflict, the embodiments of the present invention and the features in the embodiments may be combined with each other.
The present invention is described in detail below with reference to the drawings and in conjunction with the embodiments.
The present invention compensates the scan-matching localization of the lidar with odometer positioning data obtained by dead reckoning; the method rests on the following theory:
(1) Dead-reckoning model based on the encoders
Assume that within one sampling period the motor encoder feedback is received and converted into the distances moved by the left and right wheels, ΔS_l and ΔS_r. The distance moved and the angle rotated by the robot within this sampling period can then be computed as:
ΔS = (ΔS_r + ΔS_l) / 2,  Δθ = (ΔS_r - ΔS_l) / (2R)   (1)
where 2R is the wheel spacing of the robot, ΔS is the distance moved by the robot, and Δθ is the angle turned by the robot.
Since the sampling interval is very short, the robot's displacement can be approximated as a straight line, so the motion of the robot in the world coordinate system is:
Δx = ΔS·cos(θ + Δθ/2),  Δy = ΔS·sin(θ + Δθ/2)   (2)
where θ is the accumulated angle the robot has rotated through before this moment.
It follows that if the robot pose at time k is p_k = (x_k, y_k, θ_k) and the pose change within one sampling period is Δp_k = (Δx_k, Δy_k, Δθ_k), then by dead reckoning the pose of the robot at time k+1 is:
p_{k+1} = (x_k + Δx_k, y_k + Δy_k, θ_k + Δθ)^T   (3)
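As a concrete illustration of this dead-reckoning model, the short sketch below propagates the pose over one sampling period using formulas (1)-(3). It is a minimal sketch assuming a differential-drive robot whose encoder feedback has already been converted to left- and right-wheel travel in metres; the function and variable names are illustrative and do not come from the patent.

```python
import math

def dead_reckoning_step(pose, dS_l, dS_r, wheel_spacing_2R):
    """Propagate (x, y, theta) over one sampling period from wheel travel.

    pose             -- (x, y, theta) before this sampling period
    dS_l, dS_r       -- left/right wheel travel during the period [m]
    wheel_spacing_2R -- distance 2R between the two wheels [m]
    """
    x, y, theta = pose
    dS = (dS_r + dS_l) / 2.0                     # formula (1): distance moved
    dtheta = (dS_r - dS_l) / wheel_spacing_2R    # formula (1): angle turned
    # formula (2): approximate the short arc by a chord at the mid-heading
    dx = dS * math.cos(theta + dtheta / 2.0)
    dy = dS * math.sin(theta + dtheta / 2.0)
    # formula (3): accumulate the increment onto the previous pose
    return (x + dx, y + dy, theta + dtheta)

# usage: right wheel travels 0.10 m, left wheel 0.08 m, wheel spacing 0.40 m
pose = dead_reckoning_step((0.0, 0.0, 0.0), 0.08, 0.10, 0.40)
print(pose)
```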
(2) ICP algorithm based on laser data matching
The idea of the ICP algorithm is to match two neighbouring laser data frames acquired in succession and obtain the relative pose transformation between them, so as to incrementally update the current pose of the robot. The algorithm steps can be summarized as follows:
(a) record the laser scan at the current moment as the current scan D, and the scan at the previous moment as the reference scan M;
(b) obtain the optimal transformation (R, T) that matches D to M with a point-to-line matching procedure, where R is the rotation matrix and T is the translation vector;
(c) compute the pose change of the robot Δp_k = (Δx_k, Δy_k, Δθ_k)^T from (R, T); assuming the robot pose at time k is p_k = (x_k, y_k, θ_k)^T, the pose at time k+1 is:
(x_{k+1}, y_{k+1}, θ_{k+1})^T = (x_k, y_k, θ_k)^T + [cosθ_k, sinθ_k, 0; -sinθ_k, cosθ_k, 0; 0, 0, 1]·(Δx_k, Δy_k, Δθ_k)^T   (4)
(d) record the current scan D as the new reference scan M, continue sampling laser data, and iterate again from step (a).
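To make steps (a)-(d) concrete, the sketch below implements a minimal 2-D scan-matching loop together with the pose update of formula (4). As a stated assumption, it uses brute-force point-to-point nearest-neighbour ICP instead of the point-to-line matching named in the patent, and all names are illustrative; it is a sketch under those assumptions, not the patented implementation.

```python
import numpy as np

def icp_2d(current, reference, iterations=20):
    """Estimate the rigid transform (R, T) aligning `current` (N x 2) onto `reference` (M x 2)."""
    R, T = np.eye(2), np.zeros(2)
    src = np.asarray(current, dtype=float).copy()
    ref = np.asarray(reference, dtype=float)
    for _ in range(iterations):
        # nearest-neighbour correspondences (brute force, for brevity)
        dists = np.linalg.norm(src[:, None, :] - ref[None, :, :], axis=2)
        matched = ref[np.argmin(dists, axis=1)]
        # closed-form rigid alignment of the matched pairs (Kabsch / SVD)
        mu_s, mu_m = src.mean(axis=0), matched.mean(axis=0)
        H = (src - mu_s).T @ (matched - mu_m)
        U, _, Vt = np.linalg.svd(H)
        R_step = Vt.T @ U.T
        if np.linalg.det(R_step) < 0:        # keep a proper rotation
            Vt[-1] *= -1
            R_step = Vt.T @ U.T
        T_step = mu_m - R_step @ mu_s
        src = src @ R_step.T + T_step        # re-apply to the source points
        R, T = R_step @ R, R_step @ T + T_step
    return R, T

def update_pose(pose_k, dp_k):
    """Formula (4): accumulate the scan-matching increment onto the pose."""
    x, y, th = pose_k
    c, s = np.cos(th), np.sin(th)
    M = np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])
    return np.array([x, y, th]) + M @ np.asarray(dp_k, dtype=float)
```

The increment Δp_k of step (c) can then be taken, for example, as (T[0], T[1], atan2(R[1, 0], R[0, 0])) and fed to update_pose; the patent does not spell out this extraction, so treating T and the rotation angle of R directly as the pose increment is an assumption of this sketch.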
As shown in Fig. 1, the multi-sensor-fusion positioning algorithm of the present invention proceeds as follows:
In the unknown environment, while the robot is driven manually to build the environmental map, it only moves straight ahead or rotates in place; the purpose of this convention is to reduce the map skew caused by errors during mapping.
Assume the sampling period is Δt and that before the i-th sampling period the accurate pose of the robot is p = (x, y, θ)^T.
Step 1: obtain a positioning estimate with the ICP algorithm based on laser data matching described in key theory (2). In the i-th sampling period, the pose change of the robot obtained by laser scan matching is: Δp_icp_i = [Δx_icp_i, Δy_icp_i, Δθ_icp_i]^T.
Let the velocity of the robot at this time be v_icp_i = [v_icp_xi, v_icp_yi, w_icp_i]^T. For a differential-drive mobile robot, v_y = 0 always holds, and by the convention above the robot only moves straight ahead or turns in place, so the following relations always hold:
v_icp_xi = Δx_icp_i / Δt or v_icp_xi = Δy_icp_i / Δt, and ΔS_icp_i = Δx_icp_i or ΔS_icp_i = Δy_icp_i,
where ΔS_icp_i is the displacement of the robot within this sampling period as obtained by the matching algorithm.
Step 2: obtain a positioning estimate by dead reckoning. According to formulas (1)-(3), the pose change of the robot within the i-th sampling period is:
Δp_track_i = [Δx_track_i, Δy_track_i, Δθ_track_i]^T
Let the velocity of the robot at this time be v_track_i = [v_track_xi, v_track_yi, w_track_i]^T; then:
v_track_xi = ΔS_track_i / Δt, v_track_yi = 0.0, w_i = Δθ_track_i / Δt,
where ΔS_track_i is the displacement of the robot within this sampling period as obtained by dead reckoning.
Step 3: verify whether the pose change obtained by laser scan matching is correct.
Steps 1 and 2 both estimate the pose change of the robot within the i-th sampling period; if both estimates are correct, we should have:
|Δp_track_i - Δp_icp_i| ≤ ξ   (5)
where ξ is a number greater than zero whose meaning is the tolerance between the poses obtained by the two pose-estimation algorithms.
If the result satisfies formula (5), the pose change obtained by the current laser scan matching is considered correct, and after this sampling period the robot pose p_i can be expressed as:
p_i = p_{i-1} + Δp_icp_i   (6)
If the result does not satisfy formula (5), a further check is made: if v_icp_i ≈ 0.0 but v_track_i ≠ 0.0, the pose change obtained by the current laser scan matching is judged to be wrong, the robot is presumed to have entered a section with few distinctive architectural features, and the pose change obtained by dead reckoning is used to compensate the positioning result, i.e.:
p_i = p_{i-1} + Δp_track_i   (7)
This process repeats until the robot has created a closed environmental map.
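To make the arbitration of step 3 concrete, the sketch below selects the pose increment for one sampling period following formulas (5)-(7): it accepts the ICP increment when the two estimates agree within ξ, and falls back to the dead-reckoning increment when ICP reports (almost) no motion while the odometer does. The component-wise reading of formula (5), the values of ξ and the near-zero speed test, and the behaviour in the remaining case are assumptions of this sketch, not specified by the patent.

```python
import numpy as np

def fuse_increment(dp_icp, dp_track, dt, xi=0.05, v_eps=1e-3):
    """Choose the pose increment for one sampling period (cf. formulas (5)-(7)).

    dp_icp, dp_track -- (dx, dy, dtheta) from scan matching and dead reckoning
    dt               -- sampling period [s]
    xi               -- tolerance between the two estimates, formula (5)
    v_eps            -- speed below which ICP is treated as reporting no motion
    """
    dp_icp = np.asarray(dp_icp, dtype=float)
    dp_track = np.asarray(dp_track, dtype=float)
    # formula (5): the two estimates agree, so trust the scan matching (formula (6))
    if np.all(np.abs(dp_track - dp_icp) <= xi):
        return dp_icp
    # ICP sees (almost) no motion while the odometer does: a feature-poor
    # section such as a corridor, so fall back to dead reckoning (formula (7))
    v_icp = np.linalg.norm(dp_icp[:2]) / dt
    v_track = np.linalg.norm(dp_track[:2]) / dt
    if v_icp < v_eps and v_track >= v_eps:
        return dp_track
    # remaining case: the patent leaves it open; defaulting to the scan-matching
    # estimate here is an assumption of this sketch
    return dp_icp

# usage: in a corridor, ICP reports no motion while the wheels moved 0.1 m forward
print(fuse_increment((0.0, 0.0, 0.0), (0.1, 0.0, 0.0), dt=0.1))
```

The selected increment is then accumulated onto the previous pose, p_i = p_{i-1} + Δp, as in formulas (6) and (7).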
The above is only a preferred embodiment of the present invention and is not intended to limit it; any modification, equivalent substitution, improvement, and the like made within the spirit and principle of the present invention shall be included within the scope of protection of the present invention.

Claims (8)

1. An indoor positioning method for a mobile robot based on multi-sensor fusion, characterized in that: a positioning estimate is obtained with an ICP algorithm that matches laser data, and the pose change obtained by dead reckoning is used to compensate the positioning result.
2. The indoor positioning method for a mobile robot based on multi-sensor fusion according to claim 1, characterized in that obtaining the positioning estimate with the ICP algorithm that matches laser data includes:
let the sampling period be Δt, and let the accurate pose of the robot before the i-th sampling period be
p = (x, y, θ)^T;
in the i-th sampling period, the pose change of the robot obtained by laser scan matching is
Δp_icp_i = [Δx_icp_i, Δy_icp_i, Δθ_icp_i]^T;
let the velocity of the robot at this time be v_icp_i = [v_icp_xi, v_icp_yi, w_icp_i]^T; the following relations then hold:
v_icp_xi = Δx_icp_i / Δt or v_icp_xi = Δy_icp_i / Δt, and ΔS_icp_i = Δx_icp_i or ΔS_icp_i = Δy_icp_i,
where ΔS_icp_i is the displacement of the robot within this sampling period as obtained by the matching algorithm.
3. The indoor positioning method for a mobile robot based on multi-sensor fusion according to claim 2, characterized in that obtaining the positioning estimate with the ICP algorithm that matches laser data specifically includes:
(a1) record the laser scan at the current moment as the current scan D, and the scan at the previous moment as the reference scan M;
(b1) obtain the optimal transformation (R, T) that matches D to M with a point-to-line matching procedure, where R is the rotation matrix and T is the translation vector;
(c1) compute the pose change of the robot Δp_k = (Δx_k, Δy_k, Δθ_k)^T from (R, T); assuming the robot pose at time k is p_k = (x_k, y_k, θ_k)^T, the pose at time k+1 is
(x_{k+1}, y_{k+1}, θ_{k+1})^T = (x_k, y_k, θ_k)^T + [cosθ_k, sinθ_k, 0; -sinθ_k, cosθ_k, 0; 0, 0, 1]·(Δx_k, Δy_k, Δθ_k)^T;
(d1) record the current scan D as the new reference scan M, continue sampling laser data, and iterate again from step (a1).
4. The indoor positioning method for a mobile robot based on multi-sensor fusion according to claim 1, characterized in that compensating the positioning result with the pose change obtained by dead reckoning includes:
a positioning estimate is obtained by dead reckoning, and the pose change of the robot within the i-th sampling period is
Δp_track_i = [Δx_track_i, Δy_track_i, Δθ_track_i]^T;
let the velocity of the robot at this time be v_track_i = [v_track_xi, v_track_yi, w_track_i]^T; then
v_track_xi = ΔS_track_i / Δt, v_track_yi = 0.0, w_i = Δθ_track_i / Δt,
where ΔS_track_i is the displacement of the robot within this sampling period as obtained by dead reckoning.
5. The indoor positioning method for a mobile robot based on multi-sensor fusion according to claim 4, characterized in that compensating the positioning result with the pose change obtained by dead reckoning specifically includes:
(a2) assume that within one sampling period the motor encoder feedback is received and converted into the distances moved by the left and right wheels, ΔS_l and ΔS_r; the distance moved and the angle rotated by the robot within this sampling period are then
ΔS = (ΔS_r + ΔS_l) / 2,  Δθ = (ΔS_r - ΔS_l) / (2R),
where 2R is the wheel spacing of the robot, ΔS is the distance moved by the robot, and Δθ is the angle turned by the robot;
(b2) the motion of the robot in the world coordinate system is
Δx = ΔS·cos(θ + Δθ/2),  Δy = ΔS·sin(θ + Δθ/2),
where θ is the accumulated angle the robot has rotated through before this moment;
(c2) it follows that if the robot pose at time k is p_k = (x_k, y_k, θ_k) and the pose change within one sampling period is Δp_k = (Δx_k, Δy_k, Δθ_k), then by dead reckoning the pose of the robot at time k+1 is
p_{k+1} = (x_k + Δx_k, y_k + Δy_k, θ_k + Δθ)^T.
6. The indoor positioning method for a mobile robot based on multi-sensor fusion according to claim 1, characterized in that the method also includes taking the difference between the pose change obtained by the positioning estimate of the laser-matching ICP algorithm and the pose change obtained by dead reckoning, yielding the pose difference Δp.
7. The indoor positioning method for a mobile robot based on multi-sensor fusion according to claim 6, characterized in that: if
Δp ≤ ξ, where ξ > 0,
is satisfied, the pose change obtained by the current laser scan matching is considered correct.
8. The indoor positioning method for a mobile robot based on multi-sensor fusion according to claim 6, characterized in that:
if
Δp ≤ ξ, where ξ > 0,
is not satisfied, the pose change obtained by dead reckoning is used to compensate the positioning result.
CN201611230784.2A 2016-12-28 2016-12-28 Indoor positioning method for mobile robot based on multi-sensor fusion Pending CN106525053A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201611230784.2A CN106525053A (en) 2016-12-28 2016-12-28 Indoor positioning method for mobile robot based on multi-sensor fusion

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201611230784.2A CN106525053A (en) 2016-12-28 2016-12-28 Indoor positioning method for mobile robot based on multi-sensor fusion

Publications (1)

Publication Number Publication Date
CN106525053A true CN106525053A (en) 2017-03-22

Family

ID=58337819

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201611230784.2A Pending CN106525053A (en) 2016-12-28 2016-12-28 Indoor positioning method for mobile robot based on multi-sensor fusion

Country Status (1)

Country Link
CN (1) CN106525053A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103777220A (en) * 2014-01-17 2014-05-07 西安交通大学 Real-time and accurate pose estimation method based on fiber-optic gyroscope, speed sensor and GPS
CN105180933A (en) * 2015-09-14 2015-12-23 中国科学院合肥物质科学研究院 Mobile robot track plotting correcting system based on straight-running intersection and mobile robot track plotting correcting method
CN105547288A (en) * 2015-12-08 2016-05-04 华中科技大学 Self-localization method and system for mobile device in underground coal mine
CN106123890A (en) * 2016-06-14 2016-11-16 中国科学院合肥物质科学研究院 A kind of robot localization method of Fusion

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106918830A (en) * 2017-03-23 2017-07-04 安科机器人有限公司 A kind of localization method and mobile robot based on many navigation modules
CN107300696A (en) * 2017-06-16 2017-10-27 北京军立方机器人科技有限公司 A kind of position of mobile robot bearing calibration and device based on RFID
CN107990893A (en) * 2017-11-24 2018-05-04 南京航空航天大学 The detection method that environment is undergone mutation is detected in two-dimensional laser radar SLAM
CN108036786A (en) * 2017-12-01 2018-05-15 安徽优思天成智能科技有限公司 Position and posture detection method, device and computer-readable recording medium based on auxiliary line
CN108548536A (en) * 2018-01-05 2018-09-18 广东雷洋智能科技股份有限公司 The dead reckoning method of unmanned intelligent robot
CN108332758A (en) * 2018-01-26 2018-07-27 上海思岚科技有限公司 A kind of corridor recognition method and device of mobile robot
CN108332758B (en) * 2018-01-26 2021-07-09 上海思岚科技有限公司 Corridor identification method and device for mobile robot
CN108664030A (en) * 2018-05-23 2018-10-16 上海圭目机器人有限公司 A kind of intelligent disinfecting robot system
CN109129468B (en) * 2018-07-27 2021-03-12 广东工业大学 Mobile robot based on MYRIO platform
CN109129468A (en) * 2018-07-27 2019-01-04 广东工业大学 A kind of mobile robot based on MYRIO platform
CN109144056B (en) * 2018-08-02 2021-07-06 上海思岚科技有限公司 Global self-positioning method and device for mobile robot
CN109144056A (en) * 2018-08-02 2019-01-04 上海思岚科技有限公司 The global method for self-locating and equipment of mobile robot
CN110045733A (en) * 2019-04-04 2019-07-23 肖卫国 A kind of real-time location method and its system, computer-readable medium
CN110045733B (en) * 2019-04-04 2022-11-01 肖卫国 Real-time positioning method and system and computer readable medium
CN110553652A (en) * 2019-10-12 2019-12-10 上海高仙自动化科技发展有限公司 robot multi-sensor fusion positioning method and application thereof
CN110553652B (en) * 2019-10-12 2022-06-24 上海高仙自动化科技发展有限公司 Robot multi-sensor fusion positioning method and application thereof
CN112923931A (en) * 2019-12-06 2021-06-08 北理慧动(常熟)科技有限公司 Feature map matching and GPS positioning information fusion method based on fixed route
CN110954100A (en) * 2019-12-30 2020-04-03 广东省智能制造研究所 Method for estimating body state of foot type robot based on fusion of laser and inertial navigation
CN111638530A (en) * 2020-05-27 2020-09-08 广州蓝胖子机器人有限公司 Forklift positioning method, forklift and computer readable storage medium
CN111638530B (en) * 2020-05-27 2023-09-19 广州蓝胖子移动科技有限公司 Fork truck positioning method, fork truck and computer readable storage medium
CN115366102A (en) * 2022-08-23 2022-11-22 珠海城市职业技术学院 Navigation method and system of mobile robot in indoor unknown dynamic environment

Similar Documents

Publication Publication Date Title
CN106525053A (en) Indoor positioning method for mobile robot based on multi-sensor fusion
CN113781582B (en) Synchronous positioning and map creation method based on laser radar and inertial navigation combined calibration
US11802769B2 (en) Lane line positioning method and apparatus, and storage medium thereof
Cao et al. Accurate position tracking with a single UWB anchor
KR101214143B1 (en) Method and apparatus for detecting position and orientation
CN106681320A (en) Mobile robot navigation control method based on laser data
WO2019034115A1 (en) Label incorporating simultaneous localization and mapping navigation method, device and system
CN103207634A (en) Data fusion system and method of differential GPS (Global Position System) and inertial navigation in intelligent vehicle
CN110243358A (en) The unmanned vehicle indoor and outdoor localization method and system of multi-source fusion
CN109946732A (en) A kind of unmanned vehicle localization method based on Fusion
CN104501838B (en) SINS Initial Alignment Method
Wang et al. Vehicle localization at an intersection using a traffic light map
CN106338991A (en) Robot based on inertial navigation and two-dimensional code and positioning and navigation method thereof
CN103697889A (en) Unmanned aerial vehicle self-navigation and positioning method based on multi-model distributed filtration
CN106969784B (en) A kind of combined error emerging system for concurrently building figure positioning and inertial navigation
CN103033189A (en) Inertia/vision integrated navigation method for deep-space detection patrolling device
CN106767827A (en) A kind of mobile robot point cloud map creating method based on laser data
WO2022147924A1 (en) Method and apparatus for vehicle positioning, storage medium, and electronic device
CN111025366B (en) Grid SLAM navigation system and method based on INS and GNSS
CN105698822A (en) Autonomous inertial navigation action initial alignment method based on reverse attitude tracking
CN109813305A (en) Unmanned fork lift based on laser SLAM
CN108362288A (en) Polarized light S L AM method based on unscented Kalman filtering
CN111426320A (en) Vehicle autonomous navigation method based on image matching/inertial navigation/milemeter
CN105333869A (en) Unmanned reconnaissance aerial vehicle synchronous positioning and picture compositing method based on self-adaption EKF
CN111221020A (en) Indoor and outdoor positioning method, device and system

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20180920

Address after: 300300 3, 202 Building 5, Chuang Hui Valley Park, Hong Shun Road, Huaming street, Dongli, Tianjin.

Applicant after: Qing Hua Hua Yi (Tianjin) Education Technology Co., Ltd.

Address before: 300300, 2 floor, building 4, Chuang Hui Valley, Huaming high tech Industrial Zone, Dongli, Tianjin.

Applicant before: Qing Yu Advantech Intelligent Robot (Tianjin) Co., Ltd.

TA01 Transfer of patent application right
RJ01 Rejection of invention patent application after publication

Application publication date: 20170322

RJ01 Rejection of invention patent application after publication