
CN103886107B - Robot localization and map structuring system based on ceiling image information - Google Patents

Robot localization and map structuring system based on ceiling image information

Info

Publication number
CN103886107B
CN103886107B
Authority
CN
China
Prior art keywords
corner feature
image
module
feature
tracking
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201410149081.1A
Other languages
Chinese (zh)
Other versions
CN103886107A (en)
Inventor
张文强 (Zhang Wenqiang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Radiant Photovoltaic Technology Co Ltd
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual
Priority to CN201410149081.1A
Publication of CN103886107A
Application granted
Publication of CN103886107B
Expired - Fee Related
Anticipated expiration

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F 16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F 16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Data Mining & Analysis (AREA)
  • General Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

The present invention relates to the field of robotics, and in particular to a robot localization and map structuring system based on ceiling image information. The system provided by the present invention comprises an image acquisition module, mounted on the robot with its optical axis perpendicular to the ceiling, which photographs the ceiling while the robot moves and obtains current ceiling image information; an image processing module, which processes the ceiling image information obtained by the image acquisition module; a feature extraction module, which extracts features from the image information processed by the image processing module to obtain an image feature set; a feature tracking module, which performs matched tracking of each image feature in the set extracted by the feature extraction module to obtain a tracked image feature set; and a storage management module, which stores the image feature set obtained by the feature tracking module and builds a map from it. The present invention facilitates map building.

Description

Robot localization and map structuring system based on ceiling image information
Technical field
The present invention relates to the field of robotics, and in particular to a robot localization and map structuring system based on ceiling image information.
Background technology
With the development of automation and intelligent systems, robots are used ever more widely in industry and their role grows ever more important. Robotics is currently a popular research field, in which Simultaneous Localization and Mapping (hereinafter SLAM) is a classic problem. The SLAM problem can be described as follows: a robot starts moving from an unknown position in an unknown environment and localizes itself during motion from position estimates and the map, while simultaneously building an incremental map on the basis of that self-localization, thereby achieving autonomous localization and navigation.
Because vision sensors acquire rich environmental information, they have been widely adopted in SLAM in recent years, giving rise to visual SLAM. Visual SLAM can be divided into two kinds: SLAM based on monocular vision, and binocular or multi-camera SLAM. For mobile robots, however, in order to save cost and reduce system complexity, in particular the data-association problem, many applications are equipped with only a single forward-facing camera. This orientation also benefits other robot functions such as remote display, teleoperation and obstacle avoidance.
The defect of this approach is that the robot's motion is generally an approximate translation along the camera's optical axis, so consecutive images change very little and map building is relatively difficult.
The content of the invention
To solve the above technical problem, the present invention provides a robot localization and map structuring system based on ceiling image information, in which consecutive acquired images change more noticeably, facilitating map building.
The robot localization and map structuring system based on ceiling image information of the present invention comprises: an image acquisition module, an image processing module, a feature extraction module, a feature tracking module and a storage management module;
The image acquisition module is mounted on the robot with its optical axis perpendicular to the ceiling, and photographs the ceiling while the robot moves to obtain current ceiling image information;
The image processing module processes the ceiling image information obtained by the image acquisition module;
The feature extraction module extracts features from the image information processed by the image processing module to obtain an image feature set;
The feature tracking module performs matched tracking of each image feature in the set extracted by the feature extraction module to obtain a tracked image feature set;
The storage management module stores the image feature set obtained by the feature tracking module and builds a map from it.
Wherein, the image features are corner features, and the image feature set is a corner feature set.
Wherein, the feature tracking module comprises an optical-flow tracking module and a correction module;
The optical-flow tracking module tracks each corner feature in the corner feature set of the current image by the Lucas-Kanade optical flow method to obtain a tracked corner feature set;
The correction module corrects the tracked corner feature set obtained by the optical-flow tracking module.
Further, the correction module extracts corner features anew from the current frame to obtain a new corner feature set, and takes as the corrected corner feature set those corner features in the tracked set obtained by the optical-flow tracking module that coincide, within a preset window, with corner features in the new set.
Alternatively, the correction module comprises a first tracking unit, a second tracking unit and a judging unit:
Let the current frame be image imgA with corner feature set A, and let the next frame be imgB;
The first tracking unit tracks from corner feature set A into the next frame imgB to obtain corner feature set B;
The second tracking unit tracks from corner feature set B back into the current frame imgA to obtain corner feature set B2;
The judging unit judges whether each corner feature label in set A matches a corner feature label in set B2; if so, the label-matching corner features are added to a preliminary corner feature set.
Alternatively, the correction module comprises an extraction unit, a prediction unit and a comparison unit:
The extraction unit extracts the corner feature set B2 of the current image; some corner features in B2 are already present in the tracked corner feature set B obtained by tracking into the current image;
The prediction unit predicts the corner feature set B' of the current image from the corner features in the corner feature library;
The comparison unit compares B2 with B and B'; corner features of B2 that appear in neither B nor B' are judged to be new corner features, and new corner features that pass a threshold are added to the corner feature library.
Compared with the prior art, the beneficial effects of the present invention are as follows: in the robot localization and map structuring system based on ceiling image information provided by the embodiments of the present invention, the camera is turned to face the ceiling, i.e. the robot's direction of motion is perpendicular to the camera's optical axis, so consecutive images change more noticeably and map building is facilitated. Further, when tracking image features, the conventional tracking result is corrected to obtain better tracking information, so the tracked corner features are more stable and accurate, the precision of map building is higher, and the map built is correspondingly more accurate.
Brief description of the drawings
Fig. 1 is a schematic diagram of a robot localization and map structuring system based on ceiling image information according to an embodiment of the present invention;
Fig. 2 is a schematic diagram of another robot localization and map structuring system based on ceiling image information according to an embodiment of the present invention;
Fig. 3 is a schematic diagram of the correction module in Fig. 2;
Fig. 4 is another schematic diagram of the correction module in Fig. 2.
Embodiment
The embodiments of the present invention are described in further detail below with reference to the accompanying drawings and examples. The following embodiments illustrate the present invention but do not limit its scope.
Referring to Fig. 1, an embodiment of the present invention provides a robot localization and map structuring system based on ceiling image information, in which visual information is acquired facing the ceiling so that consecutive images change greatly, facilitating map building. The system specifically comprises: an image acquisition module 1, an image processing module 2, a feature extraction module 3, a feature tracking module 4 and a storage management module 5;
The image acquisition module 1 is mounted on the robot with its optical axis perpendicular to the ceiling; it photographs the ceiling while the robot moves to obtain current ceiling image information.
The image acquisition module 1 may be, for example, a camera mounted on the robot with its optical axis perpendicular to the ceiling.
The image processing module 2 processes the ceiling image information obtained by the image acquisition module 1.
Specifically, the image processing module 2 applies grayscale conversion, distortion correction and cropping to the ceiling image information obtained by the image acquisition module 1, making it easier for the feature extraction module 3 to extract image features.
The images in the indoor ceiling video are color images in the RGB color space. Such images not only occupy a large amount of storage but are also inconvenient to process, so they need to be converted to grayscale images, which are easier to handle.
A grayscale image is a monochrome image with 256 gray levels from black to white. Each pixel is represented by 8 bits of data, so its gray value is one of 256 levels between black and white. A grayscale image has only gray levels and no color variation.
In the YUV color space, the Y component represents the luminance of a pixel, and this luminance value can express the gray value of the image. From the RGB to YUV color space conversion formulas, the correspondence between an RGB image and the Y component is obtained, as shown in the formula:
Y = 0.299R + 0.587G + 0.114B    (1)
An RGB image can be converted to a grayscale image according to formula (1).
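This conversion can be sketched in a few lines of NumPy, assuming 8-bit RGB input and the standard Y weights of formula (1):

```python
import numpy as np

def rgb_to_gray(rgb):
    """Convert an H x W x 3 RGB image (uint8) to an 8-bit grayscale image
    using the Y component of formula (1): Y = 0.299R + 0.587G + 0.114B."""
    weights = np.array([0.299, 0.587, 0.114])
    return (rgb.astype(float) @ weights).round().astype(np.uint8)

# Demo: one pure-white pixel and one pure-red pixel.
img = np.array([[[255, 255, 255], [255, 0, 0]]], dtype=np.uint8)
gray = rgb_to_gray(img)
```

White maps to gray level 255, and pure red to round(0.299 * 255) = 76.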
Further, since the image acquisition module 1 forms images by optical principles, limitations of current optical technology mean that a perfect lens cannot be produced, and it is also mechanically difficult to keep the lens and imager exactly parallel; these conditions cause lens distortion. For example, lines that should be straight appear bent in the picture. Such distortion affects a visual SLAM system, because visual SLAM needs the image features in the image information to be as accurate as possible, and distortion affects the extraction and tracking of image features and map building. For tracked corner features, this distortion shifts the tracked positions relative to the real environment; although the displacement is small and can be self-corrected during motion, for a robot that relocalizes by matching corner features it affects the judgment of corner feature distances. Therefore, before the camera is used, the image acquisition module can be calibrated to obtain a distortion matrix, and the acquired image information is corrected using this distortion matrix.
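The text only speaks of a calibrated "distortion matrix"; as an illustration, the sketch below undoes the common polynomial radial model x_d = x_u * (1 + k1*r^2 + k2*r^4) by fixed-point iteration. The model and the parameter names (k1, k2, focal length f, principal point cx, cy) are assumptions drawn from standard camera calibration, not from the patent:

```python
import numpy as np

def undistort_points(pts, k1, k2, cx, cy, f):
    """Correct radially distorted pixel coordinates under the assumed
    polynomial model, inverting it by fixed-point iteration."""
    pts = np.asarray(pts, dtype=float)
    # Normalize pixel coordinates to camera coordinates.
    xd = (pts[:, 0] - cx) / f
    yd = (pts[:, 1] - cy) / f
    xu, yu = xd.copy(), yd.copy()
    for _ in range(10):  # iterate: divide out the distortion factor
        r2 = xu ** 2 + yu ** 2
        factor = 1 + k1 * r2 + k2 * r2 ** 2
        xu, yu = xd / factor, yd / factor
    return np.stack([xu * f + cx, yu * f + cy], axis=1)
```

Round-tripping a point through the forward model and this inverse recovers the original coordinates for small distortion coefficients.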
At the same time, since distortion in the acquired image grows with distance from the image center, in order to simplify the storage of image features and reduce the extraction of wall features, the image range actually used for feature extraction should be set smaller than the acquired image range; that is, the outer border of the image is excluded from feature extraction and tracking, while the full image may still be saved when image features are stored.
The feature extraction module 3 extracts image features from the image information processed by the image processing module 2 to obtain an image feature set.
Image features include corner features, line features, edge features, texture features, color-patch features and so on. Since corner features are comparatively simple and fast to extract and use, and are also convenient to store and apply to map building, the embodiments of the present invention are illustrated with corner features as the image features.
Specifically, when the image features are corner features, the feature extraction module 3 can extract them using the Harris corner method. Further, to obtain higher-precision corner feature information, a sub-pixel corner detection method can be used to refine the corner features obtained by Harris corner extraction, yielding their exact positions. Harris corner extraction and sub-pixel corner detection are common knowledge for those skilled in the art, so their processes are not detailed in this embodiment.
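A minimal NumPy sketch of the Harris response R = det(M) - k * trace(M)^2, where M is the structure tensor summed over a small window; the window size and k are conventional defaults, and a real detector would add Gaussian weighting, non-maximum suppression, and sub-pixel refinement:

```python
import numpy as np

def harris_response(img, k=0.04, win=3):
    """Per-pixel Harris corner response over a win x win window."""
    Ix = np.gradient(img, axis=1)  # gradient along x (columns)
    Iy = np.gradient(img, axis=0)  # gradient along y (rows)
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy
    half = win // 2
    R = np.zeros_like(img, dtype=float)
    for y in range(half, img.shape[0] - half):
        for x in range(half, img.shape[1] - half):
            sxx = Ixx[y - half:y + half + 1, x - half:x + half + 1].sum()
            syy = Iyy[y - half:y + half + 1, x - half:x + half + 1].sum()
            sxy = Ixy[y - half:y + half + 1, x - half:x + half + 1].sum()
            R[y, x] = (sxx * syy - sxy * sxy) - k * (sxx + syy) ** 2
    return R

# Demo: a bright square on a dark background. The response is positive at
# the square's corners, negative along its edges, and zero in flat areas.
img = np.zeros((20, 20))
img[5:15, 5:15] = 1.0
R = harris_response(img)
```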
The feature tracking module 4 performs matched tracking of each image feature in the set extracted by the feature extraction module 3 to obtain a tracked image feature set.
Referring to Fig. 2, an embodiment of the present invention further provides a schematic diagram of another robot localization and map structuring system based on ceiling image information, in which the feature tracking module 4 comprises an optical-flow tracking module 41 and a correction module 42. The optical-flow tracking module 41 tracks each corner feature in the corner feature set cornersA of the current image imgA by the Lucas-Kanade optical flow method to obtain the tracked corner feature set cornersB.
Specifically, the optical-flow tracking module 41 tracks by the optical flow method. Optical flow refers to the following: the motion of an object projects the movement of its optical features onto the image plane as a change of the image, forming optical flow on the image plane, which contains information about the moving object. The optical flow method comprises the following steps:
1) For each corner feature in the corner feature set cornersA of the current image imgA, find its corresponding position in the next frame imgB. Compute the intensity gradient Ix in the X direction and Iy in the Y direction; these can be computed by finite differences or with an operator such as the Sobel operator. The specific computation is common knowledge for those skilled in the art and is not detailed here.
2) Under the brightness constancy assumption, form the system of constraint equations from the pixels in a preset window around the corner feature;
3) Convert the system of equations obtained above into a matrix M;
4) Judge whether the matrix M obtained above is singular; if it is nonsingular, solve it to obtain the tracked corner feature set cornersB of cornersA in image imgB.
The Lucas-Kanade optical flow method is common knowledge for those skilled in the art, so the above process is not repeated in detail in this embodiment.
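The four steps above can be sketched for a single corner as follows. This is a minimal single-level implementation under the brightness constancy assumption; the window size, conditioning threshold, and the synthetic test point are illustrative choices:

```python
import numpy as np

def lucas_kanade_point(img_a, img_b, x, y, win=7):
    """Solve the Lucas-Kanade normal equations for the flow (u, v) of one
    point, using all pixels in a win x win window around (x, y)."""
    half = win // 2
    Ix = np.gradient(img_a, axis=1)  # step 1: spatial gradients
    Iy = np.gradient(img_a, axis=0)
    It = img_b - img_a               # temporal gradient
    sl = (slice(y - half, y + half + 1), slice(x - half, x + half + 1))
    ix, iy, it = Ix[sl].ravel(), Iy[sl].ravel(), It[sl].ravel()
    # Steps 2-3: stack the window's constraints into M @ [u, v] = b.
    M = np.array([[ix @ ix, ix @ iy],
                  [ix @ iy, iy @ iy]])
    b = -np.array([ix @ it, iy @ it])
    # Step 4: a (near-)singular M means the window is untextured or
    # edge-like, so the flow is not recoverable.
    if np.linalg.cond(M) > 1e6:
        return None
    return np.linalg.solve(M, b)

# Demo: a Gaussian blob shifted half a pixel to the right between frames.
yy, xx = np.mgrid[0:32, 0:32].astype(float)
blob = lambda cx, cy: np.exp(-((xx - cx) ** 2 + (yy - cy) ** 2) / (2 * 3.0 ** 2))
img_a, img_b = blob(16, 16), blob(16.5, 16)
flow = lucas_kanade_point(img_a, img_b, 14, 14)  # roughly (0.5, 0)
```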
In practical applications, images are often blurred because of camera shake during shooting, or the camera's built-in video effects modify real images for the sake of appearance; all of this affects the extraction and tracking of corner features. For example, when line segments in diagonal directions exhibit aliasing, or a region that should be flat suddenly shows strong graininess in some frame, such pixels may be extracted as corner features, and the optical-flow tracking module 41 may then continue to track these misjudged corner features. This misjudgment occurs most often with diagonal lines: once a corner feature has been extracted on a diagonal line, the neighboring pixels along the line differ very little, so when the camera moves and the image it captures moves with it, tracking these corner features may yield positions that do not change, or that move less than the true distance, because the line barely moves perpendicular to itself and the points along it are too similar, causing tracking errors. In other words, the optical-flow tracking module 41 actually tracks the pixels inside a window and cannot itself judge whether the pixels in that window constitute a feature point.
Therefore, using the optical-flow tracking module 41 alone has the above defect, and the present invention further employs a correction module 42 to correct the tracked corner feature set cornersB obtained by the optical-flow tracking module 41.
Specifically, the correction module 42 extracts corner features anew from the current frame to obtain a new corner feature set, and takes as the corrected corner feature set those corner features in the tracked set obtained by the optical-flow tracking module 41 that coincide, within a preset window, with corner features in the new set.
For example, let the current frame be imgB, with tracked corner feature set cornersB. The correction module 42 extracts corner features on imgB again to obtain the new corner feature set cornersB2. Let the small window range be cornerscale; in this experiment it is a 7x7 window centered on the corner feature. A corner feature of cornersB is kept as a finally determined corner feature if a corner feature of cornersB2 appears within the small window around it; that is, the corner features of the tracked set cornersB that coincide with corner features of the new set cornersB2 within the preset window range cornerscale form the corrected corner feature set and are added to the preliminary corner feature set cornersC. Note that these corner features take the position information of the new corner feature set cornersB2, while their corner feature label information is unchanged.
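A small sketch of this correction step, assuming tracked corners are kept in a dict mapping label to position (a bookkeeping choice, not specified in the text):

```python
def correct_by_reextraction(tracked, reextracted, win=7):
    """Keep a tracked corner only if a re-extracted corner lies within a
    win x win window centered on it; the kept corner takes the
    re-extracted position but retains its original label, as described
    above for cornersB and cornersB2.

    tracked:     dict label -> (x, y) from optical-flow tracking
    reextracted: list of (x, y) freshly detected on the same frame
    """
    half = win // 2
    corrected = {}
    for label, (tx, ty) in tracked.items():
        for (nx, ny) in reextracted:
            if abs(nx - tx) <= half and abs(ny - ty) <= half:
                corrected[label] = (nx, ny)  # new position, old label
                break
    return corrected

# Demo: corner 0 is confirmed by a nearby re-extraction; corner 1 is not.
out = correct_by_reextraction({0: (10, 10), 1: (50, 50)},
                              [(12, 11), (80, 80)])
```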
Alternatively, referring to Fig. 3, a schematic diagram of the correction module 42 in Fig. 2, the correction module 42 comprises a first tracking unit 421, a second tracking unit 422 and a judging unit 423:
Let the current frame be image imgA with corner feature set A, and let the next frame be imgB.
The first tracking unit 421 tracks from corner feature set A into the next frame imgB to obtain corner feature set B.
The second tracking unit 422 tracks from corner feature set B back into the current frame imgA to obtain corner feature set B2.
The judging unit 423 judges whether each corner feature label in set A matches a corner feature label in set B2; if so, the label-matching corner features are added to the preliminary corner feature set cornersC, in preparation for tracking the next frame.
Relatively stable tracked corner feature information can be obtained to some extent through the above correction module 42, because it removes corner features whose features are weak or that are too similar to nearby corner features, retaining the more strongly featured corner features to participate in tracking the next frame.
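This forward-backward scheme can be sketched as below. The label bookkeeping follows the judging unit 423; the positional tolerance `tol` is an added assumption (the text checks labels only), since a corner tracked forward and back should also return close to where it started:

```python
def forward_backward_filter(set_a, set_b2, tol=1.0):
    """Keep a corner of the current frame (set_a) only if its label
    survives tracking forward to the next frame and back again (set_b2),
    and it returns to within `tol` pixels of its start. Both sets are
    dicts mapping label -> (x, y)."""
    kept = {}
    for label, (ax, ay) in set_a.items():
        if label in set_b2:
            bx, by = set_b2[label]
            if (ax - bx) ** 2 + (ay - by) ** 2 <= tol ** 2:
                kept[label] = (ax, ay)
    return kept

# Demo: corner 1 round-trips consistently; corner 2 is lost in tracking.
kept = forward_backward_filter({1: (5.0, 5.0), 2: (20.0, 20.0)},
                               {1: (5.4, 5.2)})
```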
Alternatively, referring to Fig. 4, another schematic diagram of the correction module 42 in Fig. 2, the correction module 42 may also comprise an extraction unit 424, a prediction unit 425 and a comparison unit 426:
The extraction unit 424 extracts the corner feature set B2 of the current image; some corner features in B2 are already present in the tracked corner feature set B obtained by tracking into the current image;
The prediction unit 425 predicts the corner feature set B' of the current image from the corner features in the corner feature library;
The comparison unit 426 compares B2 with B and B'; corner features of B2 that appear in neither B nor B' are judged to be new corner features. The confidence of each such corner feature is judged over a certain number of subsequent frames according to the new-corner rule, and new corner features that pass a threshold are added to the corner feature library.
It should be noted that not all of the finally obtained new corner features are actually new relative to the corner feature library used to build the map: when the robot moves to a position where the image range currently acquired by the image acquisition module 1 spatially overlaps an image acquired earlier, the new corner features obtained by the comparison unit 426 may include corner features already present in the corner feature library.
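The comparison unit 426's set logic can be sketched as follows; the matching radius `tol` is an assumed parameter, since the text does not state how "appearing in" B or B' is decided:

```python
def find_new_corners(b2, b, b_prime, tol=2.0):
    """A corner of B2 is 'new' if it appears in neither the tracked set B
    nor the predicted set B'. All sets are lists of (x, y) positions."""
    def near(p, pts):
        return any((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2 <= tol ** 2
                   for q in pts)
    return [p for p in b2 if not near(p, b) and not near(p, b_prime)]

# Demo: only the corner matched by neither B nor B' is reported as new.
new = find_new_corners([(1, 1), (10, 10), (30, 30)],
                       [(1.5, 1.2)], [(10.5, 10.0)])
```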
Alternatively, the correction module 42 can also track using the pyramidal LK algorithm; or, for corner features with large displacements, the two successive frames can be processed with the corner tracking function after parameter adjustment, increasing the number of image pyramid levels so that corner features whose positions move further can also be judged. After this pair of frames has been processed, subsequent images in which the camera moves along a straight line are computed with the tracking function used for linear motion.
The storage management module 5 stores the corner feature information obtained by the feature tracking module 4 and builds a map from that corner feature information.
In the robot localization and map structuring system based on ceiling image information provided by the embodiments of the present invention, the camera is turned to face the ceiling, i.e. the robot's direction of motion is perpendicular to the camera's optical axis, so consecutive images change more noticeably and map building is facilitated. Further, when tracking image features, the conventional tracking result is corrected to obtain better tracking information, so the tracked corner features are more stable and accurate, the precision of map building is higher, and the map built is correspondingly more accurate.
The above are only preferred embodiments of the present invention. It should be noted that those of ordinary skill in the art can make further improvements and modifications without departing from the technical principles of the present invention, and such improvements and modifications should also be regarded as falling within the protection scope of the present invention.

Claims (1)

1. A robot localization and map structuring system based on ceiling image information, characterized in that the system comprises: an image acquisition module, an image processing module, a feature extraction module, a feature tracking module and a storage management module;
the image acquisition module is mounted on the robot with its optical axis perpendicular to the ceiling, and photographs the ceiling while the robot moves to obtain current ceiling image information;
the image processing module processes the ceiling image information obtained by the image acquisition module;
the feature extraction module extracts features from the image information processed by the image processing module to obtain an image feature set;
the feature tracking module performs matched tracking of each image feature in the set extracted by the feature extraction module to obtain a tracked image feature set;
the storage management module stores the image feature set obtained by the feature tracking module and builds a map from the image feature set;
the image features are corner features, and the image feature set is a corner feature set;
the feature tracking module comprises an optical-flow tracking module and a correction module;
the optical-flow tracking module tracks each corner feature in the corner feature set of the current image by the Lucas-Kanade optical flow method to obtain a tracked corner feature set;
the correction module corrects the tracked corner feature set obtained by the optical-flow tracking module;
the correction module extracts corner features anew from the current frame to obtain a new corner feature set, and takes as the corrected corner feature set those corner features in the tracked set obtained by the optical-flow tracking module that coincide, within a preset window, with corner features in the new set;
the correction module comprises a first tracking unit, a second tracking unit and a judging unit: let the current frame be image imgA with corner feature set A, and the next frame be imgB;
the first tracking unit tracks from corner feature set A into the next frame imgB to obtain corner feature set B; the second tracking unit tracks from corner feature set B back into the current frame imgA to obtain corner feature set B2;
the judging unit judges whether each corner feature label in set A matches a corner feature label in set B2, and if so, adds the label-matching corner features to a preliminary corner feature set;
the correction module comprises an extraction unit, a prediction unit and a comparison unit:
the extraction unit extracts the corner feature set B2 of the current image, some corner features in B2 being already present in the tracked corner feature set B obtained by tracking into the current image;
the prediction unit predicts the corner feature set B' of the current image from the corner features in the corner feature library;
the comparison unit compares B2 with B and B'; corner features of B2 that appear in neither B nor B' are judged to be new corner features, and new corner features that pass a threshold are added to the corner feature library.
CN201410149081.1A 2014-04-14 2014-04-14 Robot localization and map structuring system based on ceiling image information Expired - Fee Related CN103886107B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410149081.1A CN103886107B (en) 2014-04-14 2014-04-14 Robot localization and map structuring system based on ceiling image information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410149081.1A CN103886107B (en) 2014-04-14 2014-04-14 Robot localization and map structuring system based on ceiling image information

Publications (2)

Publication Number Publication Date
CN103886107A CN103886107A (en) 2014-06-25
CN103886107B true CN103886107B (en) 2017-10-03

Family

ID=50954999

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410149081.1A Expired - Fee Related CN103886107B (en) 2014-04-14 2014-04-14 Robot localization and map structuring system based on ceiling image information

Country Status (1)

Country Link
CN (1) CN103886107B (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107229274B (en) * 2016-03-24 2022-06-28 松下电器(美国)知识产权公司 Position indication method, terminal device, self-propelled device, and program
CN106197427A (en) * 2016-07-04 2016-12-07 上海思依暄机器人科技股份有限公司 Method, device and the robot of a kind of indoor positioning navigation
CN106959691B (en) * 2017-03-24 2020-07-24 联想(北京)有限公司 Mobile electronic equipment and instant positioning and map construction method
WO2019057179A1 (en) * 2017-09-22 2019-03-28 华为技术有限公司 Visual slam method and apparatus based on point and line characteristic
WO2019104740A1 (en) * 2017-12-01 2019-06-06 深圳市沃特沃德股份有限公司 Method and system for measuring odometer compensation coefficient of vision cleaning robot
WO2019104741A1 (en) * 2017-12-01 2019-06-06 深圳市沃特沃德股份有限公司 Method and system for measuring compensating coefficient for odometer of visual robot cleaner
CN108181610B (en) * 2017-12-22 2021-11-19 鲁东大学 Indoor robot positioning method and system
CN108180917B (en) * 2017-12-31 2021-05-14 芜湖哈特机器人产业技术研究院有限公司 Top map construction method based on pose graph optimization
CN108665508B (en) * 2018-04-26 2022-04-05 腾讯科技(深圳)有限公司 Instant positioning and map construction method, device and storage medium
CN108888188B (en) * 2018-06-14 2020-09-01 深圳市无限动力发展有限公司 Sweeping robot position calibration method and system
CN108983769B (en) * 2018-06-22 2022-06-21 理光软件研究所(北京)有限公司 Instant positioning and map construction optimization method and device
CN110244772B (en) * 2019-06-18 2021-12-03 中国科学院上海微系统与信息技术研究所 Navigation following system and navigation following control method of mobile robot
CN112683266A (en) * 2019-10-17 2021-04-20 科沃斯机器人股份有限公司 Robot and navigation method thereof
CN111898557B (en) * 2020-08-03 2024-04-09 追觅创新科技(苏州)有限公司 Map creation method, device, equipment and storage medium of self-mobile equipment
CN115962783B (en) * 2023-03-16 2023-06-02 太原理工大学 Positioning method of cutting head of heading machine and heading machine

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102596517A (en) * 2009-07-28 2012-07-18 悠进机器人股份公司 Control method for localization and navigation of mobile robot and mobile robot using same
CN103680291A (en) * 2012-09-09 2014-03-26 复旦大学 Method for realizing simultaneous locating and mapping based on ceiling vision

Also Published As

Publication number Publication date
CN103886107A (en) 2014-06-25

Similar Documents

Publication Publication Date Title
CN103886107B (en) Robot localization and map structuring system based on ceiling image information
CN111462135B (en) Semantic mapping method based on visual SLAM and two-dimensional semantic segmentation
CN111983639B (en) Multi-sensor SLAM method based on Multi-Camera/Lidar/IMU
CN110070615B (en) Multi-camera cooperation-based panoramic vision SLAM method
CN107635129B (en) Three-dimensional trinocular camera device and depth fusion method
Zuo et al. Devo: Depth-event camera visual odometry in challenging conditions
CN108492316A (en) A positioning method and device for a terminal
CN106878687A (en) A vehicle environment recognition system and omnidirectional vision module based on multiple sensors
CN113674416B (en) Three-dimensional map construction method and device, electronic equipment and storage medium
CN113888639B (en) Visual odometer positioning method and system based on event camera and depth camera
JP5833507B2 (en) Image processing device
CN206611521U (en) A vehicle environment recognition system and omnidirectional vision module based on multiple sensors
CN113223045A (en) Vision and IMU sensor fusion positioning system based on dynamic object semantic segmentation
CN112541973B (en) Virtual-real superposition method and system
CN111998862A (en) Dense binocular SLAM method based on BNN
CN117593650B (en) Moving point filtering vision SLAM method based on 4D millimeter wave radar and SAM image segmentation
CN111681275A (en) Double-feature-fused semi-global stereo matching method
CN112652020B (en) Visual SLAM method based on AdaLAM algorithm
Savinykh et al. Darkslam: Gan-assisted visual slam for reliable operation in low-light conditions
CN109883433A (en) Vehicle positioning method in structured environment based on 360 degree of panoramic views
Dai et al. A tightly-coupled event-inertial odometry using exponential decay and linear preintegrated measurements
CN114511592B (en) Personnel track tracking method and system based on RGBD camera and BIM system
Zuo et al. Cross-modal semi-dense 6-dof tracking of an event camera in challenging conditions
CN113538510A (en) Real-time workpiece tracking and positioning device on production line
CN113610001B (en) Indoor mobile terminal positioning method based on combination of depth camera and IMU

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20170828

Address after: 200082 No. 220 Handan Road, Yangpu District, Shanghai

Applicant after: Zhang Wenqiang

Address before: 215000, west side commercial premises, No. 28 Longgang Village, Suzhou City, Jiangsu Province

Applicant before: SUZHOU HUATIANXIONG INFORMATION TECHNOLOGY Co.,Ltd.

TA01 Transfer of patent application right
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20181212

Address after: 636 Zixu Road, Xukou Town, Wuzhong District, Suzhou City, Jiangsu Province

Patentee after: SUZHOU RADIANT PHOTOVOLTAIC TECHNOLOGY Co.,Ltd.

Address before: 200082 No. 220 Handan Road, Yangpu District, Shanghai

Patentee before: Zhang Wenqiang

TR01 Transfer of patent right
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20171003

CF01 Termination of patent right due to non-payment of annual fee