
CN113074727A - Indoor positioning navigation device and method based on Bluetooth and SLAM - Google Patents

Indoor positioning navigation device and method based on Bluetooth and SLAM

Info

Publication number
CN113074727A
CN113074727A (application CN202010009926.2A)
Authority
CN
China
Prior art keywords
module
indoor positioning
navigation device
bluetooth
time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010009926.2A
Other languages
Chinese (zh)
Inventor
杜珣弤
雷鸣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Fuxi Artificial Intelligence Technology Co ltd
Original Assignee
Bot3 Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bot3 Inc filed Critical Bot3 Inc
Priority to CN202010009926.2A priority Critical patent/CN113074727A/en
Priority to PCT/CN2020/141624 priority patent/WO2021139590A1/en
Publication of CN113074727A publication Critical patent/CN113074727A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • G01C21/206Instruments for performing navigational calculations specially adapted for indoor navigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Navigation (AREA)

Abstract

The invention discloses an indoor positioning navigation device, which comprises: a Bluetooth module for acquiring Mesh network information of the indoor positioning navigation device and acquiring the addresses, attributes, RSSI (received signal strength indication), IQ (in-phase/quadrature) data, angles and arrival times of a plurality of Bluetooth nodes in the Mesh network; a vision module for acquiring images around the robot; an odometer module for estimating the position of the robot in real time; a data storage and processing module for receiving the data acquired and calculated by the Bluetooth module, the vision module and the odometer module; a position fusion and estimation module for fusing the data in the data storage and processing module; a map building module for building a three-dimensional environment map from the spatial position information of the Bluetooth nodes and the feature points; and a path planning and motion control module for driving the robot and performing path planning and navigation according to the built three-dimensional environment map. The positioning navigation device can effectively improve positioning and navigation precision.

Description

Indoor positioning navigation device and method based on Bluetooth and SLAM
Technical Field
The invention relates to the field of robot control, in particular to an indoor positioning navigation device and method based on Bluetooth and SLAM.
Background
With the development of mobile internet technology, intelligent mobile terminal technology and internet of things technology, people's requirements on indoor positioning services have become increasingly diversified, so indoor positioning navigation technology has become a research hotspot in the field of intelligent location services. SLAM (simultaneous localization and mapping) is a classical problem in the field of robotics, and can be described as follows: the robot starts to move from an unknown position in an unknown environment, localizes itself during the motion according to position estimates and a map, and simultaneously builds an incremental map on the basis of its own localization, thereby realizing autonomous positioning and navigation. Existing indoor positioning navigation technology realizes positioning and navigation through various physical sensors, such as cameras and IMUs, and suffers from low positioning precision in complex indoor environments, caused by environmental factors or by the low precision of the sensors themselves. Each sensor is affected by different factors: the camera depends on lighting and cannot acquire feature points in the dark, while the IMU is affected by external factors, such as manual dragging and wheel slip, and accumulates errors over long periods of operation.
To solve the accuracy problem of indoor positioning services, data from multiple positioning sensors are fused, which gives strong anti-interference capability and realizes higher-accuracy positioning. The invention provides an indoor positioning navigation device and method based on Bluetooth and SLAM, which can not only effectively reduce the influence of environmental factors and thereby improve the positioning precision, but also provide a more reliable and more accurate three-dimensional environment map, enabling wider extended applications. For example, Bluetooth is not affected by illumination or wheel slip, but its positioning accuracy degrades when obstacles such as walls and desks block the signal. The Bluetooth module, camera and odometer each have their own defects, so the strengths of multiple positioning sensors need to be combined, each compensating for the weaknesses of the others.
Disclosure of Invention
The invention aims to solve the technical problem of providing an indoor positioning navigation device and method based on Bluetooth and SLAM, wherein a scene map is constructed by Bluetooth node information, characteristic points and characteristic information acquired by a visual module and real-time position information estimated by an odometer module, so that accurate positioning and navigation functions are realized.
The invention discloses an indoor positioning navigation device, wherein the indoor positioning navigation device is arranged in a movable intelligent platform or a robot, and comprises: a Bluetooth module for acquiring Mesh network information of the indoor positioning navigation device, and acquiring the addresses, attributes, RSSI (Received Signal Strength Indication), IQ (in-phase/quadrature) data, angles and arrival times of a plurality of Bluetooth nodes in the Mesh network; a vision module for acquiring images around the robot, performing graying processing and image rectification on the acquired images, and performing feature point extraction and feature line segment extraction on the images; an odometer module for calculating the change of the robot's position at each moment relative to its position at the previous moment through relative positioning, and estimating the position of the robot in real time; a data storage and processing module for receiving the data acquired and calculated by the Bluetooth module, the vision module and the odometer module; a position fusion and estimation module for fusing the data in the data storage and processing module to obtain the real-time position of the indoor positioning navigation device; a map building module for storing the real-time position of the indoor positioning navigation device estimated by the position fusion and estimation module and a three-dimensional environment map created from the spatial position information of the Bluetooth nodes and the feature points in the data storage and processing module; and a path planning and motion control module for driving the robot and performing path planning and navigation according to the three-dimensional environment map created by the map building module.
The invention discloses an indoor positioning and navigation method, which comprises the following steps: calibrating a camera in the visual module; acquiring data collected by a Bluetooth module, a vision module and an odometer module; performing Kalman filtering on the acquired data; and creating a three-dimensional environment map according to the real-time position of the indoor positioning navigation device, the Bluetooth node and the spatial position of the feature point obtained by calculation, and performing positioning and navigation according to the created three-dimensional environment map.
Advantageously, the indoor positioning navigation device and method of the invention can construct a scene map through the bluetooth node information, the feature points of the image, the feature line segment information and the real-time position information estimated by the odometer in the Mesh network, and can be used for accurate positioning and path planning of the robot.
Drawings
Fig. 1 is a block diagram of an indoor positioning navigation device based on bluetooth and SLAM according to an embodiment of the present invention.
Fig. 2 is a flowchart of the operation of the vision module in the bluetooth and SLAM based indoor positioning navigation device according to the embodiment of the present invention.
Fig. 3 is a flowchart of a position fusion and estimation module kalman filter in the bluetooth and SLAM based indoor positioning navigation device according to the embodiment of the present invention.
Fig. 4 is a positioning model diagram of a bluetooth module in a bluetooth and SLAM based indoor positioning navigation device according to an embodiment of the present invention.
Fig. 5 is a motion model diagram of an odometer module in a bluetooth and SLAM based indoor positioning navigation device according to an embodiment of the present invention.
Fig. 6 is a flowchart of the operation of the bluetooth and SLAM based indoor positioning navigation device according to the embodiment of the present invention.
Detailed Description
In order to make the technical problems, technical solutions and advantageous effects to be solved by the present invention more clearly apparent, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Fig. 1 is a block diagram of a bluetooth and SLAM based indoor positioning navigation device 100 according to an embodiment of the present invention. As shown in fig. 1, the bluetooth and SLAM based indoor positioning navigation device 100 includes: a bluetooth module 111, a vision module 112, an odometer module 113, a data storage and processing module 114, a location fusion and estimation module 115, a mapping module 116, and a path planning and motion control module 117. The indoor positioning navigation device 100 can be arranged in a sweeping robot or another indoor robot, and is used for realizing map construction, accurate positioning and navigation of the robot.
As shown in fig. 1, the bluetooth module 111 is a basic circuit set of chips integrating Bluetooth functionality; it can be used for wireless network communication and can implement data transmission and audio transmission functions. The bluetooth module 111 is located in the indoor positioning navigation device 100 and has intelligent networking and positioning functions, specifically including networking and positioning of Bluetooth devices in a Bluetooth network. The bluetooth module 111 obtains the Mesh network where the indoor positioning navigation device 100 is located, and information such as the address, attributes, RSSI (Received Signal Strength Indication), IQ (in-phase/quadrature) data, angle (including azimuth angle and elevation angle) and arrival time of each Bluetooth node in the Bluetooth network. The RSSI is expressed in dBm, is a negative value, and reflects the distance between two Bluetooth nodes in the Bluetooth network. The address of a Bluetooth node is its identifier and is represented in bytes; the attributes of a Bluetooth node cover five types, namely friend node, low-power node, relay node, standard node and proxy node.
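The relationship stated above — a more negative RSSI means a larger distance — is commonly captured with a log-distance path-loss formula. The patent does not spell one out, so the sketch below is illustrative; the 1 m reference power and the path-loss exponent are assumed calibration values, not values from the source:

```python
def rssi_to_distance(rssi_dbm, rssi_at_1m_dbm=-59.0, path_loss_exponent=2.0):
    """Estimate the distance in meters between two Bluetooth nodes from an
    RSSI reading in dBm, using the log-distance path-loss model:
        RSSI(d) = RSSI(1 m) - 10 * n * log10(d)
    Both constants (reference power at 1 m, exponent n) are assumptions."""
    return 10.0 ** ((rssi_at_1m_dbm - rssi_dbm) / (10.0 * path_loss_exponent))
```

With these assumed constants, a reading of −79 dBm maps to roughly 10 m; in practice the constants must be calibrated per environment.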
Further, the bluetooth module 111 obtains, through its positioning function, the TOA (Time of Arrival), RSSI, IQ data and AOA (Angle of Arrival) from the surrounding Bluetooth nodes to the robot (including the indoor positioning navigation device 100). From the RSSI ranging model and the phase difference ranging model, the bluetooth module 111 constructs a positioning model equation based on TOA, RSSI, phase and AOA as follows:
√((x − X_bi)² + (y − Y_bi)² + (z − Z_bi)²) = D_1i = D_2i = D_3i
tan θ_i = (y − Y_bi) / (x − X_bi)
tan φ_i = (z − Z_bi) / √((x − X_bi)² + (y − Y_bi)²) …………(1)

wherein (x, y, z) represents the spatial position of the indoor positioning navigation device 100, the position of Bluetooth node B_i is (X_bi, Y_bi, Z_bi), θ_i is the azimuth angle of Bluetooth node B_i with respect to the plane of the Bluetooth antenna array of the indoor positioning navigation device 100, φ_i is the elevation angle of Bluetooth node B_i with respect to that plane, D_1i is the distance between the indoor positioning navigation device 100 and Bluetooth node B_i obtained by the RSSI ranging model, D_2i is the distance obtained by the TOA ranging model, and D_3i is the distance obtained by the phase difference ranging model. The optimal solution of the positioning model is obtained by the least square method, from which the position of the indoor positioning navigation device 100 relative to the Bluetooth nodes B_i, i.e. the value of (x, y, z), can be calculated. For the parameters θ_i, φ_i, D_1i, D_2i and D_3i, reference may be made to fig. 4. Fig. 4 is a positioning model diagram of a bluetooth module in a bluetooth and SLAM based indoor positioning navigation device according to an embodiment of the present invention. (X_bi, Y_bi, Z_bi) represents the coordinates of node B_i in the Bluetooth network, and B_i′ is the projection of Bluetooth node B_i onto a two-dimensional plane.
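The least-squares solve described above can be sketched with a Gauss-Newton iteration on the range equations alone (the azimuth/elevation terms are omitted for brevity; the node positions and ranges in the test are synthetic, not from the patent):

```python
import numpy as np

def locate_device(node_positions, ranges, iters=25):
    """Estimate the device position (x, y, z) from measured distances to
    Bluetooth nodes at known positions, by Gauss-Newton least squares on
    ||x - B_i|| = D_i. Starts from the centroid of the nodes."""
    nodes = np.asarray(node_positions, dtype=float)   # (N, 3)
    d = np.asarray(ranges, dtype=float)               # (N,)
    x = nodes.mean(axis=0)                            # initial guess
    for _ in range(iters):
        diff = x - nodes                              # (N, 3)
        pred = np.linalg.norm(diff, axis=1)           # predicted ranges
        J = diff / pred[:, None]                      # d(range)/d(position)
        step, *_ = np.linalg.lstsq(J, d - pred, rcond=None)
        x = x + step
        if np.linalg.norm(step) < 1e-10:
            break
    return x
```

A real implementation would stack the D_1i, D_2i, D_3i estimates and the AOA constraints into one residual vector and weight them by their respective measurement noise.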
A vision module 112 is configured to acquire images around the robot, perform graying processing and image rectification on the acquired images, and perform feature point extraction and feature line segment extraction on them. That is, the images are preprocessed to remove the influence of lens distortion on the feature description, and the feature points and feature line segments in the images are extracted with the ORB (Oriented FAST and Rotated BRIEF) feature point detection method. Specifically, the wide-angle camera in the vision module 112 is used to collect image information of the environment where the robot is located. In the present invention, a 12 × 9 black-and-white checkerboard calibration board with 10 mm × 10 mm squares is used to calibrate the wide-angle camera in the vision module 112 of the indoor positioning navigation device 100. This calibration method avoids the high equipment requirements and complex operation of traditional methods, and offers higher precision than existing calibration methods. The vision module 112 obtains the camera intrinsic and distortion parameter information, for example: the camera intrinsic parameters are the lateral focal length fx, the longitudinal focal length fy, the principal point abscissa u0 and the principal point ordinate v0, and the distortion parameters include the radial distortion parameters k1, k2, k3 and the tangential distortion parameters p1, p2, but are not limited thereto.
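The intrinsic and distortion parameters listed above are the ones used by the standard radial/tangential (Brown-Conrady) lens model. A minimal sketch of how they map an ideal normalized image point to a distorted pixel — the specific numeric values in the test are made up for illustration:

```python
def distort_point(xn, yn, k1, k2, k3, p1, p2):
    """Apply radial (k1, k2, k3) and tangential (p1, p2) distortion to a
    point (xn, yn) in normalized camera coordinates."""
    r2 = xn * xn + yn * yn
    radial = 1.0 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    xd = xn * radial + 2.0 * p1 * xn * yn + p2 * (r2 + 2.0 * xn * xn)
    yd = yn * radial + p1 * (r2 + 2.0 * yn * yn) + 2.0 * p2 * xn * yn
    return xd, yd

def to_pixel(xd, yd, fx, fy, u0, v0):
    """Map distorted normalized coordinates to pixel coordinates using the
    intrinsics fx, fy (focal lengths) and (u0, v0) (principal point)."""
    return fx * xd + u0, fy * yd + v0
```

Image rectification is the inverse of this mapping, typically solved iteratively or via a precomputed undistortion map.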
An odometer module 113 is used for calculating the change of the robot's position at each moment relative to its position at the previous moment through relative positioning, thereby realizing real-time position estimation. That is, it calculates the robot's X-direction offset Δx, Y-direction offset Δy and angular offset Δφ about the Z direction at each moment. As shown in fig. 5, fig. 5 is a motion model diagram of an odometer module in a bluetooth and SLAM based indoor positioning navigation device according to an embodiment of the present invention. X(k−1) is the state quantity at time k−1, X(k) is the state quantity at the next time, i.e. time k, and u(k) is the offset of time k relative to time k−1, where u(k) consists of the X-direction offset Δx, the Y-direction offset Δy and the angular offset Δφ about the Z direction. The data storage and processing module 114 is coupled to the bluetooth module 111, the vision module 112 and the odometer module 113, and is configured to receive the data collected and calculated by them. Specifically, besides the attribute, address, RSSI, IQ data, angle (elevation angle and azimuth angle) and arrival time of each Bluetooth node acquired by the bluetooth module, the data also include the wide-angle camera intrinsic and distortion parameter information, feature point image coordinates and descriptors, and feature line segments and descriptors acquired by the vision module, as well as the position changes of the intelligent platform or robot acquired by the odometer module. The data calculated from this information comprise the spatial positions of the Bluetooth nodes, the spatial positions of the feature points, and the real-time updated position of the intelligent platform or robot obtained by the position fusion and estimation module.
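Dead-reckoning with the per-step offsets (Δx, Δy, Δφ) can be sketched as follows. Whether the offsets are reported in the robot frame or the world frame depends on the odometer hardware; the robot frame is assumed here, so the increment is rotated before accumulation:

```python
import math

def integrate_odometry(pose, delta):
    """Advance the world-frame pose X(k-1) = (x, y, phi) by the odometer
    increment u(k) = (dx, dy, dphi), with (dx, dy) given in the robot
    frame and rotated into the world frame before accumulation."""
    x, y, phi = pose
    dx, dy, dphi = delta
    c, s = math.cos(phi), math.sin(phi)
    return (x + c * dx - s * dy, y + s * dx + c * dy, phi + dphi)
```

For example, driving 1 m forward, turning 90°, then driving 1 m forward again ends one unit along each axis — and any slip in the wheels corrupts every later pose, which is exactly the accumulated error the fusion step corrects.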
The position fusion and estimation module 115 is coupled to the data storage and processing module 114 and is configured to fuse the data in the data storage and processing module 114. The fusion algorithm may be a least squares method, the LM algorithm, the BA algorithm, an intelligent optimization algorithm (such as a genetic algorithm, particle swarm algorithm or ant colony algorithm), a Kalman filtering algorithm, and the like. Real-time accurate positioning of the indoor positioning navigation device 100 can be realized through Kalman filtering. It should be understood by those skilled in the art that this embodiment is not to be construed as limiting the invention.
The map building module 116 is coupled to the position fusion and estimation module 115, and is used for storing the real-time position of the indoor positioning navigation device 100 estimated by the position fusion and estimation module 115, and a three-dimensional environment map created by the spatial positions of the bluetooth nodes and the feature points in the data storage and processing module 114.
The path planning and motion control module 117 is coupled to the mapping module 116 and performs path planning and navigation according to the three-dimensional environment map stored in the mapping module 116. The path planning and motion control module 117 is used to drive the robot accordingly and to obtain the current position information of the robot in real time. Each module is optionally implemented as logic, a non-transitory computer-readable medium storing instructions, firmware, and/or a combination thereof. For logic implemented with stored instructions and/or firmware, a processor may be provided to execute those instructions to cause the indoor positioning navigation device to perform the methods described herein.
Fig. 2 is a flowchart of the operation of the vision module in the bluetooth and SLAM based indoor positioning navigation device according to the embodiment of the present invention. The method comprises the following steps:
step 202: the wide-angle camera in the vision module 112 collects image information around the indoor positioning navigation device 100.
Step 204: the vision module 112 pre-processes the acquired images. Specifically, the method includes a graying process of the image.
Step 206: the vision module 112 corrects for image distortion.
Step 208: extract the feature point information in the image information with the ORB feature extraction method provided by OpenCV. The method has the characteristics of high operation speed and a certain resistance to noise and rotation. After the image is processed with the ORB feature extraction method, a series of feature point data is obtained, and the feature information is stored in a feature database. ORB feature points in the image include lamps, corners, etc.
Step 210: extract the feature line segments in the image information, where a feature line segment is an LSD (Line Segment Detector) line segment, typically the edge of a wall or the edge line of an object of a certain length, such as the edge line of a square lamp.
Fig. 3 is a flowchart of a position fusion and estimation module kalman filter in the bluetooth and SLAM based indoor positioning navigation device according to the embodiment of the present invention. The Kalman filtering mainly comprises the following steps:
step 302: sub-module initialization, specifically, the definition and initialization of variables and matrices involved in kalman filtering.
Step 304: and the position fusion and estimation module establishes a motion equation and an observation equation. The method specifically comprises the following steps:
attitude variation U at time k +1 by indoor positioning navigation device 100k+1(Δ x (k +1), Δ y (k +1), Δ φ (k +1)) the data is data acquired by the mileage module 114. The system state X (k) at time k is estimated as system state X (k +1| k) at time k + 1:
X(k+1|k)=X(k)+Uk+1+Qk+1…………(2)
wherein Q isk+1The noise of the motion equation at the moment of k +1 is determined by the positioning accuracy of the odometer module; u shapek+1Is the attitude change at the time of k + 1.
Further, the system observation L̂(k+1) at time k+1 is estimated from the system state X(k+1|k) of the indoor positioning navigation device 100 at time k+1, namely

L̂(k+1) = h(X(k+1|k)) + W_{k+1} …………(3)

wherein the predicted observation stacks, for each Bluetooth node B_i in the Bluetooth network with coordinates (X_bi, Y_bi, Z_bi), the azimuth angle θ_i and elevation angle φ_i of B_i relative to the system state quantity X(k+1|k) at time k+1, and, for each feature point F_i with spatial three-dimensional coordinates (X_fi, Y_fi, Z_fi), the predicted image coordinates (u_i, v_i) = (f·X_fi/Z_fi, f·Y_fi/Z_fi) in the camera frame. f is the focal length of the camera, W_{k+1} is the observation equation noise at time k+1, and H is the Jacobian matrix of the observation equation with respect to the state quantity.
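The entries of the predicted observation can be sketched directly from the geometry: angles to a Bluetooth node from the current state, and a pinhole projection for a feature point. The atan2-based angle conventions below are an assumption, since the patent does not fix them:

```python
import math

def predicted_bluetooth_angles(device_xyz, node_xyz):
    """Azimuth and elevation of Bluetooth node B_i = (X_bi, Y_bi, Z_bi)
    as seen from the device position (x, y, z)."""
    dx = node_xyz[0] - device_xyz[0]
    dy = node_xyz[1] - device_xyz[1]
    dz = node_xyz[2] - device_xyz[2]
    azimuth = math.atan2(dy, dx)
    elevation = math.atan2(dz, math.hypot(dx, dy))
    return azimuth, elevation

def predicted_image_coords(feature_cam_xyz, f):
    """Pinhole projection u = f*X/Z, v = f*Y/Z of a feature point F_i
    expressed in the camera frame."""
    X, Y, Z = feature_cam_xyz
    return f * X / Z, f * Y / Z
```

Stacking these quantities over all visible nodes and feature points yields the prediction h(X(k+1|k)) that is compared against the true observation in the update step.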
Step 306: and updating a covariance equation, and calculating the formula as follows:
P(k+1) = (P(k+1|k)^(−1) + H^T · R^(−1) · H)^(−1) …………(4)
wherein P (k +1) is a covariance matrix at the moment of k +1, and R is an observation noise covariance matrix;
step 308: updating the gain matrix, and calculating the formula as follows:
K = P(k+1) · H^T · R^(−1) …………(5)
wherein K is a gain matrix;
step 310: and updating the state vector, wherein the calculation formula is as follows:
X(k+1) = X(k+1|k) + K · ΔL …………(6)

wherein X(k+1) is the state vector at time k+1, and ΔL = L_{k+1} − L̂(k+1) is the innovation, L_{k+1} being the true observation at time k+1, which contains, among other measurements, the image coordinates f_i = (u_i, v_i) actually observed for each feature point F_i(X_fi, Y_fi, Z_fi) at time k+1. Because the wheels may slip, the robot may be dragged, and errors accumulate, the position obtained from the motion equation alone can be wrong; the indoor positioning navigation device 100 therefore corrects it with the observation equation formed from other indoor information, such as the feature points and Bluetooth nodes, which improves the calculation accuracy.
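Steps 304–310 can be condensed into one predict/correct cycle. The sketch below follows equations (2) and (4)–(6), with the motion-equation noise Q entering the predicted covariance as in a standard Kalman filter, and H treated as a constant Jacobian for brevity:

```python
import numpy as np

def kalman_step(x, P, u, Q, z, h, H, R):
    """One filter cycle: predict with the odometer increment u, then
    correct with the observation z (Bluetooth angles, feature image
    coordinates). h predicts the observation; H is its Jacobian."""
    x_pred = x + u                       # motion equation (2)
    P_pred = P + Q                       # Q inflates the prior covariance
    Ri = np.linalg.inv(R)
    # (4) updated covariance in information form
    P_new = np.linalg.inv(np.linalg.inv(P_pred) + H.T @ Ri @ H)
    # (5) gain matrix
    K = P_new @ H.T @ Ri
    # (6) state update with innovation dL = z - h(x_pred)
    x_new = x_pred + K @ (z - h(x_pred))
    return x_new, P_new
```

In a 1-D sanity check with h(x) = x, the corrected state lands between the prediction and the measurement, weighted by the noise covariances.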
Fig. 6 is a flowchart of the operation of the bluetooth and SLAM based indoor positioning navigation device according to the embodiment of the present invention. Fig. 6 will be described in conjunction with fig. 1. The method specifically comprises the following steps:
step 602: the cameras within the vision module 112 are calibrated.
Step 604: acquire the data collected by the bluetooth module 111, the vision module 112 and the odometer module 113.
Step 606: fuse the collected data. When Kalman-filter fusion is employed, as in one embodiment, the specific filtering steps follow the method flowchart of fig. 3.
Step 608: the map building module 116 creates a three-dimensional environment map according to the calculated real-time position of the indoor positioning navigation device, the bluetooth node, and the spatial position of the feature point.
Step 610: the path planning and motion control module 117 performs positioning and navigation according to the created three-dimensional environment map.
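The workflow of steps 602–610 can be expressed as one cycle over the module interfaces. Every function name below is hypothetical, since the patent specifies the data flow rather than an API:

```python
def navigation_cycle(read_bluetooth, read_vision, read_odometry,
                     fuse, update_map, plan):
    """One pass of the fig. 6 loop: collect sensor data (step 604), fuse
    it into a pose (step 606), extend the 3-D map (step 608), and plan a
    path (step 610). Camera calibration (step 602) is assumed done."""
    data = {
        "bluetooth": read_bluetooth(),   # addresses, RSSI, IQ, angles, TOA
        "vision": read_vision(),         # feature points and line segments
        "odometry": read_odometry(),     # (dx, dy, dphi) increment
    }
    pose = fuse(data)                    # e.g. Kalman-filter fusion
    world_map = update_map(pose, data)   # three-dimensional environment map
    return plan(world_map, pose)         # path planning / navigation
```

In a deployment this cycle would run continuously, with the map module persisting the fused poses and node/feature positions between iterations.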
Advantageously, the indoor positioning navigation device and method based on Bluetooth and SLAM acquire ORB feature points by detection and tracking, Bluetooth node information, and mileage information from the odometer, and construct a scene map through Kalman filtering, so that they can be used for accurate positioning, path planning and navigation of a robot, and the positioning accuracy can be greatly improved. Compared with existing indoor positioning navigation technology, the method disclosed by the invention not only effectively reduces the influence of environmental factors to improve the positioning precision, but also provides a more reliable and more accurate three-dimensional environment map.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (22)

1. An indoor positioning navigation device based on Bluetooth and SLAM, wherein the indoor positioning navigation device is built in a movable intelligent platform or a robot, comprising:
the Bluetooth module is used for acquiring Mesh network information of the indoor positioning and navigation device and acquiring addresses, attributes, RSSI (received signal strength indicator), IQ (in-phase quadrature) data, angles and arrival time of a plurality of Bluetooth nodes in the Mesh network;
the vision module is used for acquiring images around the robot, performing gray processing and image correction on the acquired images, and performing feature point extraction and feature line segment extraction on the images;
the odometer module is used for calculating the change of the position of the robot at each moment relative to the position at the last moment through relative positioning and estimating the position of the robot in real time;
the data storage and processing module is used for receiving data information acquired and calculated by the Bluetooth module, the vision module and the odometer module;
the position fusion and estimation module is used for fusing data in the data storage and processing module to acquire the real-time position of the indoor positioning navigation device;
a map construction module for storing the real-time position of the indoor positioning navigation device estimated by the position fusion and estimation module, and a three-dimensional environment map created according to the spatial position information of the Bluetooth nodes and the characteristic points in the data storage and processing module, an
And the path planning and motion control module is used for driving the robot and planning and navigating the path according to the three-dimensional environment map created by the map building module.
2. The indoor positioning and navigation device of claim 1, wherein the bluetooth module constructs positioning model equations based on TOA, RSSI, phase and AOA with RSSI ranging model and phase difference ranging model, and calculates real time position of the indoor positioning and navigation device by the following equations:
√((x − X_bi)² + (y − Y_bi)² + (z − Z_bi)²) = D_1i = D_2i = D_3i
tan θ_i = (y − Y_bi) / (x − X_bi)
tan φ_i = (z − Z_bi) / √((x − X_bi)² + (y − Y_bi)²)

wherein (x, y, z) represents the spatial position of the indoor positioning navigation device, the position of Bluetooth node B_i is (X_bi, Y_bi, Z_bi), θ_i is the azimuth angle of Bluetooth node B_i with respect to the plane of the Bluetooth antenna array of the indoor positioning navigation device, φ_i is the elevation angle of Bluetooth node B_i with respect to that plane, D_1i is the distance between the indoor positioning navigation device and Bluetooth node B_i obtained by the RSSI ranging model, D_2i is the distance obtained by the TOA ranging model, and D_3i is the distance obtained by the phase difference ranging model.
3. The indoor positioning and navigation device of claim 1, wherein the camera in the vision module is calibrated using a 12 x 9 black and white checkerboard calibration board of 10mm x 10 mm.
4. The indoor positioning navigation device of claim 3, wherein the camera in the vision module contains camera parameters and distortion parameter information.
5. The indoor positioning navigation device of claim 1, wherein the vision module further includes correcting for image distortion.
6. The indoor positioning and navigation device according to claim 1, wherein the position fusion and estimation module fuses data according to least square method, LM algorithm, BA algorithm, smart optimization algorithm, kalman filtering algorithm, wherein the kalman filtering further comprises initializing each module, establishing a motion equation, an observation equation, updating a covariance matrix, updating a gain matrix, and updating a state vector.
7. The indoor positioning and navigation device according to claim 6, wherein the position fusion and estimation module estimates the system state X(k+1|k) of the indoor positioning and navigation device at time k+1 by the following formula:
X(k+1|k) = X(k) + U_{k+1} + Q_{k+1}
wherein X(k) is the system state at time k, Q_{k+1} is the motion equation noise at time k+1, and U_{k+1} is the pose change at time k+1.
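Claim 7's prediction step is a standard Kalman time update with an identity motion model. A minimal sketch follows; note that in a conventional implementation the zero-mean noise Q_{k+1} enters the covariance prediction rather than being added to the state mean:

```python
import numpy as np

def predict(x_k, p_k, u_k1, q_k1):
    """Kalman time update for the identity motion model of claim 7:
    state mean  X(k+1|k) = X(k) + U_{k+1}
    covariance  P(k+1|k) = P(k) + Q_{k+1}  (Q_{k+1} = motion-noise covariance)."""
    return x_k + u_k1, p_k + q_k1
```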
8. The indoor positioning and navigation device of claim 7, wherein the position fusion and estimation module estimates the system observation at time k+1 according to the system state X(k+1|k) at time k+1 by an observation equation [equation images not reproduced], wherein the coordinate (X_bi, Y_bi, Z_bi) of Bluetooth node B_i in the Bluetooth network gives the azimuth angle θ_i and elevation angle φ_i relative to the system state quantity X(k+1|k) of the indoor positioning and navigation device at time k+1, F_i represents one of the feature points, whose spatial three-dimensional coordinate is (X_fi, Y_fi, Z_fi), f is the focal length of the camera, W_{k+1} is the observation equation noise at time k+1, and H is the Jacobian matrix of the observation equation with respect to the state quantity.
9. The indoor positioning and navigation device of claim 6, wherein the position fusion and estimation module updates the covariance matrix according to the following formula:
P(k+1) = (P(k+1|k)^(-1) + H^T * R^(-1) * H)^(-1)
wherein P(k+1) is the covariance matrix at time k+1 and R is the observation noise covariance matrix.
10. The indoor positioning and navigation device according to claim 6, wherein the position fusion and estimation module updates the gain matrix according to the following formula:
K = P(k+1) * H^T * R^(-1)
wherein K is the gain matrix.
11. The indoor positioning and navigation device according to claim 6, wherein the position fusion and estimation module updates the state vector according to the following formula:
X(k+1) = X(k+1|k) + K * ΔL
wherein X(k+1) is the state vector at time k+1, ΔL is the difference between the true observation and the predicted observation [equation images not reproduced], L_{k+1} is the true observation at time k+1, and f_i(u_i, v_i) are the image coordinates corresponding to feature point F_i(X_fi, Y_fi, Z_fi) at time k+1.
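Claims 9 through 11 together form a Kalman measurement update in information form. The sketch below assumes H, R, and the innovation ΔL have already been assembled from the Bluetooth and visual observations:

```python
import numpy as np

def kalman_update(x_pred, p_pred, H, R, innovation):
    """Measurement update per claims 9-11:
    P(k+1) = (P(k+1|k)^-1 + H^T R^-1 H)^-1   covariance (information form)
    K      = P(k+1) H^T R^-1                 gain matrix
    X(k+1) = X(k+1|k) + K * dL               state vector"""
    r_inv = np.linalg.inv(R)
    p_new = np.linalg.inv(np.linalg.inv(p_pred) + H.T @ r_inv @ H)
    K = p_new @ H.T @ r_inv
    return x_pred + K @ innovation, p_new, K
```

With scalar prior variance 1 and measurement variance 1, the update halves the variance and weights the innovation by 0.5, as expected of an even blend of prediction and measurement.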
12. An indoor positioning and navigation method based on Bluetooth and SLAM, comprising the following steps:
calibrating a camera in the vision module;
acquiring data collected by a Bluetooth module, a vision module, and an odometer module;
fusing the collected data;
creating a three-dimensional environment map based on the calculated real-time position of the indoor positioning navigation device, the Bluetooth nodes, and the spatial positions of the feature points; and
positioning and navigating according to the created three-dimensional environment map.
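The method steps above can be sketched as a control loop. All module interfaces below are hypothetical stand-ins for illustration, not names from the patent:

```python
def run_slam(steps, bluetooth, vision, odometer, fuser, mapper):
    """Illustrative control flow for the five method steps of claim 12."""
    vision.calibrate_camera()                          # step 1: calibrate camera
    env_map = mapper.new_map()
    poses = []
    for _ in range(steps):
        bt = bluetooth.read()                          # step 2: acquire data
        img = vision.read()
        odo = odometer.read()
        pose, landmarks = fuser.fuse(bt, img, odo)     # step 3: fuse the data
        mapper.update(env_map, pose, landmarks)        # step 4: build the 3-D map
        poses.append(pose)                             # step 5 navigates on env_map
    return env_map, poses
```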
13. The indoor positioning and navigation method according to claim 12, wherein calibrating the camera in the vision module comprises: performing calibration using a 12 x 9 black-and-white checkerboard calibration board with 10 mm x 10 mm squares.
14. The indoor positioning and navigation method according to claim 12, wherein the Bluetooth module obtains Mesh network information of the indoor positioning and navigation device, including the addresses, attributes, RSSI, IQ data, angles, and arrival times of a plurality of Bluetooth nodes in the Mesh network.
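The IQ data and angles of claim 14 correspond to Bluetooth 5.1 style direction finding. A sketch of recovering an arrival angle from the phase difference between two antenna elements follows; the antenna spacing and carrier frequency are assumed values:

```python
import math

def iq_phase(i_sample, q_sample):
    """Phase of a single IQ sample from the constant tone extension."""
    return math.atan2(q_sample, i_sample)

def aoa_degrees(delta_phi_rad, spacing_m=0.04, freq_hz=2.44e9, c=299_792_458.0):
    """Angle of arrival from the phase difference between two array elements:
    sin(theta) = delta_phi * lambda / (2 * pi * d), clamped to [-1, 1]."""
    s = delta_phi_rad * (c / freq_hz) / (2.0 * math.pi * spacing_m)
    return math.degrees(math.asin(max(-1.0, min(1.0, s))))
```

A zero phase difference corresponds to a signal arriving broadside to the array; the sign of the difference resolves which side it arrives from.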
15. The indoor positioning and navigation method according to claim 12, wherein the vision module collects images of the robot's surroundings, converts the collected images to grayscale, corrects the images, and performs feature point extraction and feature line segment extraction on the images.
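The graying and feature extraction of claim 15 can be illustrated with a toy NumPy stand-in; a real system would typically use cv2.cvtColor plus a FAST/ORB detector, and the gradient threshold here is an arbitrary assumption:

```python
import numpy as np

def to_gray(rgb):
    """Luma-weighted grayscale conversion of an H x W x 3 uint8 image."""
    return np.round(rgb @ np.array([0.299, 0.587, 0.114])).astype(np.uint8)

def feature_candidates(gray, thresh=30.0):
    """Toy feature-point candidates: pixels whose gradient magnitude exceeds
    a threshold (a stand-in for a proper corner detector)."""
    gy, gx = np.gradient(gray.astype(float))
    return np.argwhere(np.hypot(gx, gy) > thresh)
```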
16. The indoor positioning and navigation method according to claim 12, wherein the odometer module calculates, by relative positioning, the change of the robot's position at each moment relative to its position at the previous moment, thereby estimating the robot's position in real time.
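Claim 16's relative positioning is classic dead reckoning. Below is a differential-drive sketch; the wheel-base value is an assumed parameter, and integrating at the midpoint heading is one common discretization choice:

```python
import math

def integrate_odometry(pose, d_left_m, d_right_m, wheel_base_m=0.3):
    """Differential-drive dead reckoning: update (x, y, heading) from the
    distances travelled by the left and right wheels since the last moment."""
    x, y, th = pose
    ds = (d_left_m + d_right_m) / 2.0          # forward distance of the centre
    dth = (d_right_m - d_left_m) / wheel_base_m  # heading change
    return (x + ds * math.cos(th + dth / 2.0),
            y + ds * math.sin(th + dth / 2.0),
            th + dth)
```

Because each step only adds a relative increment, the error accumulates over time, which is why the claims fuse odometry with the Bluetooth and vision observations.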
17. The indoor positioning and navigation method according to claim 12, wherein the position fusion and estimation module in the indoor positioning and navigation device fuses the data, the fusion algorithm including a least squares method, the LM algorithm, the BA algorithm, an intelligent optimization algorithm, and a Kalman filtering algorithm, wherein the Kalman filtering further includes initializing each module, establishing a motion equation and an observation equation, updating a covariance matrix, updating a gain matrix, and updating a state vector.
18. The indoor positioning and navigation method according to claim 17, wherein, when performing Kalman filtering, the position fusion and estimation module estimates the system state X(k+1|k) of the indoor positioning and navigation device at time k+1 by the following formula:
X(k+1|k) = X(k) + U_{k+1} + Q_{k+1}
wherein X(k) is the system state at time k, Q_{k+1} is the motion equation noise at time k+1, and U_{k+1} is the pose change at time k+1.
19. The indoor positioning and navigation method according to claim 17, wherein, when the position fusion and estimation module performs fusion by Kalman filtering, the state vector is updated according to the following formula:
X(k+1) = X(k+1|k) + K * ΔL
wherein X(k+1) is the state vector at time k+1, ΔL is the difference between the true observation and the predicted observation [equation images not reproduced], L_{k+1} is the true observation at time k+1, and f_i(u_i, v_i) are the image coordinates corresponding to feature point F_i(X_fi, Y_fi, Z_fi) at time k+1.
20. The indoor positioning and navigation method according to claim 17, wherein, when the position fusion and estimation module performs fusion by Kalman filtering, the system observation at time k+1 is estimated according to the system state X(k+1|k) at time k+1 by an observation equation [equation images not reproduced], wherein the coordinate (X_bi, Y_bi, Z_bi) of Bluetooth node B_i in the Bluetooth network gives the azimuth angle θ_i and elevation angle φ_i relative to the system state quantity X(k+1|k) of the indoor positioning and navigation device at time k+1, F_i represents one of the feature points, whose spatial three-dimensional coordinate is (X_fi, Y_fi, Z_fi), f is the focal length of the camera, W_{k+1} is the observation equation noise at time k+1, and H is the Jacobian matrix of the observation equation with respect to the state quantity.
21. The indoor positioning and navigation method according to claim 17, wherein, when the position fusion and estimation module performs fusion by Kalman filtering, the covariance matrix is updated according to the following formula:
P(k+1) = (P(k+1|k)^(-1) + H^T * R^(-1) * H)^(-1)
wherein P(k+1) is the covariance matrix at time k+1 and R is the observation noise covariance matrix.
22. The indoor positioning and navigation method according to claim 17, wherein, when the position fusion and estimation module performs fusion by Kalman filtering, the state vector is updated according to the following formula:
X(k+1) = X(k+1|k) + K * ΔL
wherein X(k+1) is the state vector at time k+1, ΔL is the difference between the true observation and the predicted observation [equation images not reproduced], L_{k+1} is the true observation at time k+1, and f_i(u_i, v_i) are the image coordinates corresponding to feature point F_i(X_fi, Y_fi, Z_fi) at time k+1.
CN202010009926.2A 2020-01-06 2020-01-06 Indoor positioning navigation device and method based on Bluetooth and SLAM Pending CN113074727A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010009926.2A CN113074727A (en) 2020-01-06 2020-01-06 Indoor positioning navigation device and method based on Bluetooth and SLAM
PCT/CN2020/141624 WO2021139590A1 (en) 2020-01-06 2020-12-30 Indoor localization and navigation apparatus based on bluetooth and slam, and method therefor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010009926.2A CN113074727A (en) 2020-01-06 2020-01-06 Indoor positioning navigation device and method based on Bluetooth and SLAM

Publications (1)

Publication Number Publication Date
CN113074727A true CN113074727A (en) 2021-07-06

Family

ID=76609029

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010009926.2A Pending CN113074727A (en) 2020-01-06 2020-01-06 Indoor positioning navigation device and method based on Bluetooth and SLAM

Country Status (2)

Country Link
CN (1) CN113074727A (en)
WO (1) WO2021139590A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113949999A (en) * 2021-09-09 2022-01-18 之江实验室 Indoor positioning navigation equipment and method
CN114136306A (en) * 2021-12-01 2022-03-04 浙江大学湖州研究院 Expandable UWB and camera-based relative positioning device and method
CN116549218A (en) * 2023-05-12 2023-08-08 江西恒必达实业有限公司 Intelligent blind guiding glasses based on obstacle monitoring and reminding

Families Citing this family (17)

Publication number Priority date Publication date Assignee Title
CN113587917A (en) * 2021-07-28 2021-11-02 北京百度网讯科技有限公司 Indoor positioning method, device, equipment, storage medium and computer program product
CN113568413A (en) * 2021-08-19 2021-10-29 深圳中智永浩机器人有限公司 Robot safety guarantee method and device, computer equipment and storage medium
CN113873435B (en) * 2021-09-24 2024-08-20 京东方科技集团股份有限公司 Indoor positioning method and related equipment
CN113850910A (en) * 2021-09-28 2021-12-28 江苏京芯光电科技有限公司 Map construction method of SLAM sweeper
CN114001743B (en) * 2021-10-29 2024-08-27 京东方科技集团股份有限公司 Map drawing method, device and system, storage medium and electronic equipment
CN114025320A (en) * 2021-11-08 2022-02-08 易枭零部件科技(襄阳)有限公司 Indoor positioning method based on 5G signal
CN114205748B (en) * 2021-12-08 2023-03-10 珠海格力电器股份有限公司 Network configuration method and device, electronic equipment and storage medium
CN114510044A (en) * 2022-01-25 2022-05-17 北京圣威特科技有限公司 AGV navigation ship navigation method and device, electronic equipment and storage medium
CN114888851A (en) * 2022-05-30 2022-08-12 北京航空航天大学杭州创新研究院 Moving object robot grabbing device based on visual perception
CN115334448B (en) * 2022-08-15 2024-03-15 重庆大学 Accurate dynamic positioning method of unmanned self-following device based on Bluetooth and inertial sensor
CN115218907B (en) * 2022-09-19 2022-12-09 季华实验室 Unmanned aerial vehicle path planning method and device, electronic equipment and storage medium
CN115802282B (en) * 2022-12-16 2024-06-07 兰笺(苏州)科技有限公司 Co-location method and device for wireless signal field
CN115808170B (en) * 2023-02-09 2023-06-06 宝略科技(浙江)有限公司 Indoor real-time positioning method integrating Bluetooth and video analysis
CN117119585B (en) * 2023-08-26 2024-02-06 江苏蓝策电子科技有限公司 Bluetooth positioning navigation system and method
CN116954235B (en) * 2023-09-21 2023-11-24 深圳大工人科技有限公司 AGV trolley navigation control method and system
CN118310523B (en) * 2024-04-01 2024-09-10 广东经纬天地科技有限公司 Indoor positioning method, system, equipment and storage medium
CN118695205A (en) * 2024-08-22 2024-09-24 广州彩熠灯光股份有限公司 Lamp position information processing method and device, stage lamp and program product

Family Cites Families (6)

Publication number Priority date Publication date Assignee Title
US9488492B2 (en) * 2014-03-18 2016-11-08 Sri International Real-time system for multi-modal 3D geospatial mapping, object recognition, scene annotation and analytics
US10365363B2 (en) * 2015-05-08 2019-07-30 Humatics Corporation Mobile localization using sparse time-of-flight ranges and dead reckoning
EP3645972A4 (en) * 2017-06-30 2021-01-13 SZ DJI Technology Co., Ltd. Map generation systems and methods
CN108801265A (en) * 2018-06-08 2018-11-13 武汉大学 Multidimensional information synchronous acquisition, positioning and position service apparatus and system and method
CN109541535A (en) * 2019-01-11 2019-03-29 浙江智澜科技有限公司 A method of AGV indoor positioning and navigation based on UWB and vision SLAM
CN110308729B (en) * 2019-07-18 2022-05-10 石家庄辰宙智能装备有限公司 AGV (automatic guided vehicle) combined navigation positioning method based on vision and IMU (inertial measurement Unit) or odometer

Cited By (5)

Publication number Priority date Publication date Assignee Title
CN113949999A (en) * 2021-09-09 2022-01-18 之江实验室 Indoor positioning navigation equipment and method
CN113949999B (en) * 2021-09-09 2024-01-30 之江实验室 Indoor positioning navigation equipment and method
CN114136306A (en) * 2021-12-01 2022-03-04 浙江大学湖州研究院 Expandable UWB and camera-based relative positioning device and method
CN114136306B (en) * 2021-12-01 2024-05-07 浙江大学湖州研究院 Expandable device and method based on relative positioning of UWB and camera
CN116549218A (en) * 2023-05-12 2023-08-08 江西恒必达实业有限公司 Intelligent blind guiding glasses based on obstacle monitoring and reminding

Also Published As

Publication number Publication date
WO2021139590A1 (en) 2021-07-15

Similar Documents

Publication Publication Date Title
CN113074727A (en) Indoor positioning navigation device and method based on Bluetooth and SLAM
CN110148185B (en) Method and device for determining coordinate system conversion parameters of imaging equipment and electronic equipment
WO2021026850A1 (en) Qr code-based navigation attitude determining and positioning method and system
CN112667837A (en) Automatic image data labeling method and device
CN111121754A (en) Mobile robot positioning navigation method and device, mobile robot and storage medium
AU2018282302A1 (en) Integrated sensor calibration in natural scenes
CN110411457B (en) Positioning method, system, terminal and storage medium based on stroke perception and vision fusion
CN111856499B (en) Map construction method and device based on laser radar
CN108332752B (en) Indoor robot positioning method and device
WO2019136613A1 (en) Indoor locating method and device for robot
CN114413909A (en) Indoor mobile robot positioning method and system
CN114111774B (en) Vehicle positioning method, system, equipment and computer readable storage medium
CN113763548A (en) Poor texture tunnel modeling method and system based on vision-laser radar coupling
CN115376109B (en) Obstacle detection method, obstacle detection device, and storage medium
CN112136021A (en) System and method for constructing landmark-based high-definition map
WO2022111723A1 (en) Road edge detection method and robot
CN110515088B (en) Odometer estimation method and system for intelligent robot
CN111025366A (en) Grid SLAM navigation system and method based on INS and GNSS
CN113324544B (en) Indoor mobile robot co-location method based on UWB/IMU (ultra wide band/inertial measurement unit) of graph optimization
CN114488094A (en) Vehicle-mounted multi-line laser radar and IMU external parameter automatic calibration method and device
CN115200572B (en) Three-dimensional point cloud map construction method and device, electronic equipment and storage medium
CN111380515A (en) Positioning method and device, storage medium and electronic device
CN113093759A (en) Robot formation construction method and system based on multi-sensor information fusion
CN114111791B (en) Indoor autonomous navigation method, system and storage medium for intelligent robot
CN113971697B (en) Air-ground cooperative vehicle positioning and orientation method

Legal Events

Date Code Title Description
PB01 Publication
TA01 Transfer of patent application right

Effective date of registration: 20220520

Address after: No. 415, No. 19, Dongwu North Road, Wuzhong District, Suzhou City, Jiangsu Province

Applicant after: Suzhou Fuxi Artificial Intelligence Technology Co.,Ltd.

Address before: California, USA

Applicant before: BOT3, Inc.

WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210706
