
CN110308720B - Unmanned distribution device and navigation positioning method and device thereof - Google Patents


Info

Publication number: CN110308720B
Authority: CN (China)
Prior art keywords: environment, information, image, unmanned, passenger compartment
Prior art date: 2019-06-21
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN201910543477.7A
Other languages: Chinese (zh)
Other versions: CN110308720A (en)
Inventors: 程保山 (Cheng Baoshan), 郝立良 (Hao Liliang), 申浩 (Shen Hao), 聂琼 (Nie Qiong), 王景恒 (Wang Jingheng)
Current assignee: Beijing Sankuai Online Technology Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Beijing Sankuai Online Technology Co Ltd
Priority date: 2019-06-21 (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed)
Filing date: 2019-06-21
Publication date: 2021-02-23
2019-06-21: Application filed by Beijing Sankuai Online Technology Co Ltd; priority to CN201910543477.7A
2019-10-08: Publication of CN110308720A
2021-02-23: Application granted; publication of CN110308720B

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0221: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving a learning process
    • G05D1/0223: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • G05D1/0231: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0253: Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • G05D1/0276: Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Electromagnetism (AREA)
  • Navigation (AREA)
  • Traffic Control Systems (AREA)

Abstract

Embodiments of the application disclose an unmanned distribution device and a navigation positioning method and device thereof. The method comprises the following steps: respectively acquiring first environment information collected by a first camera and second environment information collected by a second camera of the unmanned distribution device; obtaining a passenger compartment number based on the first environment information, and obtaining a seat number based on the second environment information; and determining the position of the unmanned distribution device in the public transportation vehicle from the passenger compartment number and the seat number. The navigation positioning scheme of these embodiments offers high positioning precision at low cost and is suitable for large-scale popularization and application.

Description

Unmanned distribution device and navigation positioning method and device thereof
Technical Field
The application relates to the technical field of navigation and positioning, in particular to an unmanned delivery device and a navigation and positioning method and device thereof.
Background
At present, the positioning technology of unmanned distribution devices (unmanned distribution vehicles or unmanned distribution robots) is mainly based on multi-sensor fusion schemes combining a laser sensor, the Global Positioning System (GPS), an Inertial Measurement Unit (IMU), and the like, for example SLAM (Simultaneous Localization And Mapping) based on laser scan matching. Such a scheme uses the laser sensor to scan the surrounding environment structure and then computes the position and posture transformation of the robot between two acquisitions by matching the laser data acquired before and after, thereby realizing positioning. However, the laser sensor is expensive, can only be used for global positioning and resolving, and its positioning precision is not high.
Disclosure of Invention
In view of this, embodiments of the present application provide an unmanned distribution device and a navigation positioning method and device thereof, which solve the problem of the low positioning accuracy of unmanned distribution devices in the prior art, are low in cost, and are suitable for large-scale popularization and application.
According to one aspect of the application, a navigation positioning method for an unmanned distribution device in a public transportation vehicle is provided, the method comprising the following steps:
respectively acquiring first environment information acquired by a first camera and second environment information acquired by a second camera of the unmanned distribution device;
obtaining a passenger compartment number based on the first environment information, and obtaining a seat number based on the second environment information;
determining the position of the unmanned distribution device in the public transportation vehicle from the passenger compartment number and the seat number.
According to another aspect of the present application, there is provided a navigation positioning device for an unmanned distribution device in a public transportation vehicle, the device comprising:
the acquisition module is used for respectively acquiring first environment information acquired by a first camera and second environment information acquired by a second camera of the unmanned distribution device;
the navigation positioning module is used for obtaining a passenger compartment number based on the first environment information and obtaining a seat number based on the second environment information, and for determining the position of the unmanned distribution device in the public transportation vehicle from the passenger compartment number and the seat number.
According to yet another aspect of the present application, there is provided an unmanned distribution device traveling in a passenger compartment of a public transportation vehicle, comprising: a first camera, a second camera and a processor,
the first camera is used for acquiring first environment information and sending the first environment information to the processor;
the second camera is used for acquiring second environment information and sending the second environment information to the processor;
the processor is used for obtaining a passenger compartment number based on the first environment information and obtaining a seat number based on the second environment information, and for determining the position of the unmanned distribution device in the public transportation vehicle from the passenger compartment number and the seat number.
According to yet another aspect of the application, a non-transitory computer-readable storage medium is provided, having stored thereon a computer program which, when executed by a processor, implements the steps of the method according to the first aspect of the application.
Advantageous effects: according to the unmanned distribution device and its navigation positioning scheme of the embodiments of the application, the environment information collected by the device's two cameras is obtained separately, the passenger compartment number and the seat number are respectively derived from that information, and the position of the unmanned distribution device in the public transportation vehicle is determined from the passenger compartment number and the seat number. Collecting the environment information of the public transportation vehicle with cameras and processing it to obtain the passenger compartment number and seat number greatly improves positioning precision, down to a specific seat, which meets the distribution needs of the unmanned distribution device. No high-precision hardware such as an IMU sensor is required, so the cost is low; no steps such as pre-making map data are needed, so the calculation is fast and the positioning efficiency is high.
Drawings
FIG. 1 is a flowchart of a navigation positioning method for an unmanned distribution device in a public transportation vehicle according to one embodiment of the present application;
FIG. 2 is a flowchart of a navigation positioning method for an unmanned distribution device in a public transportation vehicle according to another embodiment of the present application;
FIG. 3 is a block diagram of a navigation positioning device for an unmanned distribution device in a public transportation vehicle according to one embodiment of the present application;
FIG. 4 is a block diagram of an unmanned distribution device according to one embodiment of the present application;
FIG. 5 is a schematic structural diagram of a non-transitory computer-readable storage medium according to an embodiment of the present application.
Detailed Description
In order to make the aforementioned objects, features and advantages of the embodiments of the present application more comprehensible, embodiments of the present application are described in detail below with reference to the accompanying drawings and the detailed description. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application without any creative effort belong to the protection scope of the embodiments in the present application.
The technical idea of the application is as follows: in view of the technical problems of robot navigation positioning in the prior art, such as low positioning precision, the need to construct an environment map in advance, and complex calculation based on fusing data from multiple sensors, a navigation positioning scheme is provided for an unmanned distribution device moving in a public transportation vehicle. The scheme can accurately determine the passenger compartment number and the seat number of the unmanned distribution device (such as an unmanned distribution vehicle or robot) moving in the passenger compartment, realizing high-precision positioning for automatic goods distribution and meeting practical application requirements.
Fig. 1 is a flowchart of a navigation positioning method for an unmanned distribution device in a public transportation vehicle according to an embodiment of the present application. Referring to fig. 1, the navigation positioning method of the present embodiment includes:
Step S101, respectively acquiring first environment information collected by a first camera and second environment information collected by a second camera of the unmanned distribution device;
Step S102, obtaining a passenger compartment number based on the first environment information, and obtaining a seat number based on the second environment information;
Step S103, determining the position of the unmanned distribution device in the public transportation vehicle according to the passenger compartment number and the seat number.
As shown in fig. 1, in the navigation positioning method of this embodiment, environment information of the public transportation vehicle is collected by the first camera and the second camera of the unmanned distribution device, the passenger compartment number and the seat number of the unmanned distribution device are obtained by processing that environment information, and the position of the unmanned distribution device in the public transportation vehicle is derived from the passenger compartment number and the seat number. The method thus meets the need for positioning inside a moving public transportation vehicle and, more importantly, achieves high positioning precision, down to the level of an individual seat. In addition, no environment map of the public transportation vehicle needs to be made in advance, the calculation process is simple, and the efficiency is high. Finally, compared with positioning schemes relying on sensor data fusion, the method does not require high-precision sensors, so its cost is low and it is suitable for large-scale popularization and application.
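For illustration only, the three steps can be wired together as in the following minimal Python sketch; the function and parameter names, and the two recognizer callables standing in for the template-matching (or OCR) procedures described below, are hypothetical and not part of the application.

```python
from typing import Callable, Tuple
import numpy as np

def navigate_position(
    first_env_image: np.ndarray,    # step S101: frame from the first camera
    second_env_image: np.ndarray,   # step S101: frame from the second camera
    compartment_number: Callable[[np.ndarray], int],
    seat_number: Callable[[np.ndarray], int],
) -> Tuple[int, int]:
    # Step S102: derive the two numbers from the two images.
    compartment = compartment_number(first_env_image)
    seat = seat_number(second_env_image)
    # Step S103: the (compartment, seat) pair fixes the position in the vehicle.
    return compartment, seat
```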
It should be noted that the public transportation vehicles of this embodiment include ordinary trains, high-speed trains, airplanes, and the like, which have multiple passenger compartments and multiple seats; an unmanned distribution device (such as an unmanned distribution vehicle) can travel along their aisles. Taking high-speed rail as an example, the unmanned distribution device travels along the aisles of the train cars to automatically distribute goods such as food and beverages.
In the following, the steps of the navigation positioning method are described in detail, taking a high-speed train as an example. Fig. 2 is a schematic flowchart of a navigation positioning method for an unmanned distribution device in a public transportation vehicle according to another embodiment of the present application. The unmanned distribution device of this embodiment is fitted with a first camera and a second camera, and the process begins by respectively acquiring the first environment information collected by the first camera and the second environment information collected by the second camera.
Referring to fig. 2, step S201 is executed to acquire the image captured by the first camera;
here acquiring the image captured by the first camera includes: acquiring first environment information collected by the first camera installed at the front of the unmanned distribution device body, wherein the first environment information is a first environment image. Distortion correction may be applied to the first environment image to mitigate lens distortion in the original frame.
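As one possible realization of this distortion correction, the sketch below uses OpenCV's undistort; the intrinsic matrix and distortion coefficients shown are illustrative assumptions and would in practice come from a one-off calibration of the wide-angle first camera (e.g., with cv2.calibrateCamera).

```python
import cv2
import numpy as np

# Assumed intrinsics for illustration only; obtain real values by calibration.
CAMERA_MATRIX = np.array([[800.0,   0.0, 640.0],
                          [  0.0, 800.0, 360.0],
                          [  0.0,   0.0,   1.0]])
DIST_COEFFS = np.array([-0.30, 0.10, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3

def correct_distortion(first_env_image: np.ndarray) -> np.ndarray:
    # Remaps the raw wide-angle frame so that straight structures in the
    # compartment stay straight, which also aids the later Hough line step.
    return cv2.undistort(first_env_image, CAMERA_MATRIX, DIST_COEFFS)
```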
Step S210, acquiring the image captured by the second camera;
the image acquired by the second camera comprises second environment information acquired by the second camera and installed on two sides of the bottom of the unmanned distribution device, and the second environment information is a second environment image.
In this embodiment, the field angle of the first camera is larger than that of the second camera. A wide-angle camera is chosen as the first camera because the interior of a high-speed-rail passenger compartment is structurally complex, and a wide-angle camera can capture as much of the compartment's structural feature information as possible from top to bottom.
Next, the passenger compartment number is obtained based on the first environment information, and the seat number is obtained based on the second environment information. Obtaining the passenger compartment number based on the first environment information includes: matching the first environment image against a pre-stored first template image, which records the characteristic information of the through passage connecting two adjacent passenger compartments; if the match succeeds, adding 1 to the counted current passenger compartment number and updating it. Obtaining the seat number based on the second environment information includes: matching the second environment image against a pre-stored second template image, which records the characteristic information of the seats in the passenger compartment; if the match succeeds, adding 1 to the counted current seat number and updating it. The two procedures are described separately below.
Referring to the left side of fig. 2, after the image acquired by the first camera (that is, the first environment image) is obtained, step S202 is performed: through-passage template image matching;
specifically, the first environment image is matched with a pre-stored first template image, which records the characteristic information of the through passage connecting two adjacent passenger compartments. That is, in this example the through-passage template image is acquired and stored in advance.
Step S203, whether the matching is successful or not;
in this step, it is determined whether the first environment image matches the first template image (i.e., the through-passage template image illustrated in fig. 2) twice, and if the first environment image matches the first template image twice, success is confirmed. Since the through passage template image includes the structural features of the through passage, and the through passage is a specific structure connecting two adjacent cars, if there are two matching times of the through passage template images, it can be determined that the unmanned distribution device has passed through one car. Namely, in the embodiment, the car doorway positioning is performed through the template image information specific to the sealing part of the through passage between the cars. If the matching is not successful, step S204 is performed.
Step S204, Hough transform;
the Hough transform extracts edge straight-line information from the image so that the head orientation of the unmanned distribution device can be corrected, improving positioning accuracy. Unmanned distribution devices such as unmanned vehicles traveling in high-speed train cars can hardly avoid encountering obstacles, and to avoid them the unmanned vehicle must dynamically adjust its head orientation. When the first environment image does not match the pre-stored first template image, the unmanned vehicle is currently on the aisle of a car rather than at a junction, so whether the head orientation needs to be changed can be determined from the first environment image, specifically as follows:
carrying out Hough transform on the first environment image to obtain a plurality of candidate straight lines; screening the candidate lines according to their geometric length and their included angle with the straight line along the side edge of the passenger compartment of the public transportation vehicle (such as a high-speed-rail car), and taking the candidate line parallel to the line of the compartment aisle as the reference straight line (i.e., the lane line); carrying out back-projection transformation on the reference straight line so that it is parallel to the vertical direction of the first environment image; if an included angle exists between the back-projected reference straight line and the central axis of the first environment image, adjusting the head orientation of the unmanned distribution device (i.e., the heading of the unmanned vehicle) according to the size of that angle; and if no included angle exists, keeping the head orientation unchanged.
It should be noted that how to perform the Hough transform on the original image (the first environment image) to obtain the inverse perspective view and the candidate straight lines is prior art; implementation details can be found in existing descriptions and are not repeated here. Screening the candidate lines by geometric length and by included angle with the side edge of the passenger compartment includes filtering out lines shorter than a preset length threshold, such as 10 centimeters, since lines that short cannot be lane lines.
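The following sketch shows one way to implement the line extraction and screening with OpenCV's probabilistic Hough transform. It simplifies the embodiment by measuring the angle directly in image coordinates rather than performing an explicit back-projection transformation, and both thresholds are assumptions.

```python
import cv2
import numpy as np

MIN_LINE_LEN_PX = 80        # assumed pixel analogue of the length threshold
MAX_DEVIATION_DEG = 15.0    # assumed tolerance around the aisle direction

def heading_correction_deg(first_env_image: np.ndarray) -> float:
    """Angle between the detected aisle (lane) line and the central
    vertical axis of the image; 0.0 means the heading can be kept."""
    gray = cv2.cvtColor(first_env_image, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=60,
                            minLineLength=MIN_LINE_LEN_PX, maxLineGap=10)
    if lines is None:
        return 0.0
    best_len, best_angle = 0.0, 0.0
    for x1, y1, x2, y2 in lines[:, 0]:
        dx, dy = x2 - x1, y2 - y1
        if dy < 0:                      # orient every segment downwards
            dx, dy = -dx, -dy
        angle = float(np.degrees(np.arctan2(dx, dy)))  # 0 deg == vertical
        length = float(np.hypot(dx, dy))
        # Keep only long, near-vertical candidates (the screening step).
        if abs(angle) <= MAX_DEVIATION_DEG and length > best_len:
            best_len, best_angle = length, angle
    return best_angle
```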
Step S205, adjusting the head orientation;
if it was determined in step S204 that an included angle exists between the back-projected reference straight line and the central axis of the first environment image, the head orientation of the unmanned distribution device is adjusted according to the size of that angle to prevent the unmanned vehicle from drifting off the aisle; how to adjust the head orientation is prior art and is not detailed here.
Step S206, acquiring mileage information acquired by the odometer;
in this embodiment, the passenger compartment number is determined by matching the first environment image acquired by the first camera against the template image. For example, suppose the destination of the unmanned distribution device is car 13, starting from car 9. The device travels from car 9 into car 10 through the through passage A between cars 9 and 10, then continues through car 10 and through passage B between cars 10 and 11; that is, it passes two through passages in succession, satisfying the condition that two consecutive through-passage matches succeed. If the car count were incremented by 1 on that condition alone, miscounting could easily occur, so in this embodiment mileage information is additionally checked after the two through-passage matches succeed.
Step S207, accumulating the driving mileage;
and accumulating the information of the traveled mileage acquired by the odometer in the step S206 to obtain the mileage traveled by the unmanned distribution device.
Step S208, whether the mileage is greater than a mileage threshold value or not; if yes, go to step S209; otherwise, returning to execute the step S207;
here it is judged whether the accumulated mileage is greater than a mileage threshold, such as 20 meters (the specific value should be set according to the length of the passenger compartment).
step S209, adding 1 to the number of the passenger compartments;
if the previous step judged the mileage to be greater than the mileage threshold, i.e., the unmanned distribution device has passed through two through passages in succession (satisfying the two-consecutive-match condition) and has also traveled farther than the mileage threshold, then 1 is added to the counted current passenger compartment number and the recorded number is updated. This avoids miscounting the passenger compartment number and safeguards positioning accuracy.
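The counting logic of steps S203 and S206 to S209 can be folded into a small state machine, as in the assumed formalization below; the 20-meter default is taken from the example above, and the real threshold should follow the compartment length.

```python
class CompartmentCounter:
    """Assumed formalization of steps S203 and S206-S209: advance the
    compartment count only when two consecutive through-passage matches
    are seen AND the odometer has accumulated more than roughly one
    compartment length since the last increment."""

    def __init__(self, start_compartment: int, mileage_threshold_m: float = 20.0):
        self.compartment = start_compartment      # e.g. 9 when starting in car 9
        self.mileage_threshold_m = mileage_threshold_m
        self.consecutive_matches = 0
        self.mileage_since_increment_m = 0.0

    def update(self, passage_matched: bool, odometer_delta_m: float) -> int:
        """Feed one detection result plus the odometer delta; returns the
        current passenger compartment number."""
        self.mileage_since_increment_m += odometer_delta_m
        self.consecutive_matches = self.consecutive_matches + 1 if passage_matched else 0
        if (self.consecutive_matches >= 2
                and self.mileage_since_increment_m > self.mileage_threshold_m):
            self.compartment += 1                 # step S209
            self.consecutive_matches = 0
            self.mileage_since_increment_m = 0.0
        return self.compartment
```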
Steps S211 to S213 concern positioning the unmanned distribution device at the seat-number level within the high-speed train. Specifically,
step S211, seat template image matching;
similar to the image matching in the passenger-compartment positioning above, the second environment image captured by the second camera is matched with a seat template image, which records structural features of the seat, such as the structure of the seat base; template image matching thus determines which row of seats the unmanned distribution device has reached.
Step S212, whether the matching is successful or not; if yes, go to step S213; otherwise, step S211 is executed.
It is judged whether the second environment image successfully matches the pre-stored second template image, which records the seat characteristic information of the passenger compartment. The specific image matching is prior art and is not detailed here. If the second environment image matches the pre-stored second template image, 1 is added to the counted current seat number.
Step S213, adding 1 to the seat number;
in this step, the recorded current seat row is incremented by 1; for example, if the current row is 14, it is updated to 15. Since seat numbers follow fixed rules, once the seat row reached by the unmanned distribution device is known, a specific seat number, such as seat 06F in row 15, can be deduced from the numbering rule of each row of seats in each car, acquired in advance.
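For illustration, the row-to-seat expansion can look like the sketch below; the A/B/C/D/F per-row lettering is a common Chinese high-speed-rail convention used here as an assumption, and the real rule must be taken from the vehicle data acquired in advance, as the embodiment requires.

```python
def seats_in_row(row: int, letters: str = "ABCDF") -> list[str]:
    """Expand a counted seat row into concrete seat numbers, e.g. row 6
    yields 06A, 06B, 06C, 06D, 06F under the assumed lettering rule."""
    return [f"{row:02d}{letter}" for letter in letters]

# Example: once the counter reaches row 15, the seats alongside the
# unmanned distribution device are:
print(seats_in_row(15))  # ['15A', '15B', '15C', '15D', '15F']
```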
In step S214, the passenger compartment number and the seat number are determined.
Based on the passenger compartment number determined in step S209 and the seat number determined in step S213, the position of the unmanned distribution device in the high-speed train is obtained by combining the two.
In addition, other embodiments of the present application provide another implementation of positioning based on the environment images, where the first environment information is a first environment image and the second environment information is a second environment image. Obtaining the passenger compartment number based on the first environment information includes: performing optical character recognition on the first environment image to obtain the numbers in it and comparing them with the numbers in a preset passenger compartment white list; if a number from the first environment image appears in the white list, the passenger compartment number is taken from that number, the white list being set according to the passenger compartment numbers of the public transportation vehicle. Obtaining the seat number based on the second environment information likewise includes: performing optical character recognition on the second environment image to obtain the numbers in it and comparing them with a preset seat white list; if a number from the second environment image appears in the white list, the seat number is taken from that number, the white list being set according to the seat numbers of the public transportation vehicle.
In this way, navigation positioning is achieved with a camera plus optical character recognition. Compared with the image-template-matching scheme of the foregoing embodiment, this approach demands more computing resources and its positioning accuracy is slightly lower.
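A minimal sketch of this OCR-plus-whitelist variant is given below, assuming the pytesseract binding to the Tesseract OCR engine; the whitelist contents (a 16-car train) and the helper names are illustrative assumptions.

```python
import re
from typing import Optional

import pytesseract           # assumes the Tesseract OCR engine is installed
from PIL import Image

# Assumed whitelist for a 16-car train; build it from the real vehicle data.
COMPARTMENT_WHITELIST = {f"{n:02d}" for n in range(1, 17)}

def compartment_from_ocr(first_env_image: Image.Image) -> Optional[str]:
    """Read digits from the first environment image and accept only those
    appearing in the preset passenger-compartment white list."""
    text = pytesseract.image_to_string(
        first_env_image,
        config="--psm 6 -c tessedit_char_whitelist=0123456789")
    for token in re.findall(r"\d{1,2}", text):
        if token.zfill(2) in COMPARTMENT_WHITELIST:
            return token.zfill(2)
    return None
```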
As can be seen from the above, the navigation positioning method of this embodiment achieves seat-level positioning within the car (relative to the car's two ends) by acquiring environment images and matching them against the pre-stored through-passage template image and seat-base template image of the high-speed train, without computing the absolute position of the unmanned distribution device in real time, which saves energy. The unmanned distribution device navigates by the structural information inside the car (its linear environment structure), so no environment map needs to be made in advance, the computational complexity is reduced, and the efficiency is high.
Based on the same technical concept as the navigation positioning method above, an embodiment of the present application further provides a navigation positioning device for an unmanned distribution device in a public transportation vehicle. Referring to fig. 3, the navigation positioning device 300 of this embodiment includes an obtaining module 301 and a navigation positioning module 302,
an obtaining module 301, configured to obtain first environment information collected by a first camera and second environment information collected by a second camera of the unmanned distribution device, respectively;
a navigation positioning module 302, configured to obtain a passenger compartment number based on the first environment information and a seat number based on the second environment information, and to determine the position of the unmanned distribution device in the public transportation vehicle from the passenger compartment number and the seat number.
In an embodiment of the present application, the first environment information is a first environment image, and the navigation positioning module 302 is specifically configured to match the first environment image with a first template image stored in advance; if the matching is consistent, adding 1 to the counted current number of the passenger compartments, and updating the current number of the passenger compartments; and recording characteristic information of a through passage connecting two adjacent passenger compartments in the first template image.
In an embodiment of the present application, the second environment information is a second environment image, and the navigation positioning module 302 is specifically configured to match the second environment image with a second template image stored in advance; if the matching is consistent, adding 1 to the counted number of the current seats, and updating the number of the current seats; and recording the characteristic information of the seat in the passenger compartment in the second template image.
In an embodiment of the present application, the first environment information is a first environment image, the second environment information is a second environment image, and the navigation positioning module 302 is specifically configured to perform optical character recognition on the first environment image to obtain a number in the first environment image, compare the number with a number in a preset passenger compartment white list, and if the number in the first environment image appears in the passenger compartment white list, obtain a passenger compartment number from the number in the first environment image, where the passenger compartment white list is set according to the passenger compartment number of the public transportation; and carrying out optical character recognition on the second environment image to obtain a number in the second environment image, comparing the number with a number in a preset seat white list, and if the number in the second environment image appears in the seat white list, obtaining a seat number from the number in the second environment image, wherein the seat white list is set according to the seat number of the public transport means.
In an embodiment of the present application, the navigation positioning module 302 further obtains the driving mileage information collected by the odometer of the unmanned distribution device before adding 1 to the counted current number of passenger compartments; and comparing the mileage with a mileage threshold value, and if the mileage is greater than the mileage threshold value, adding 1 to the current number of the passenger compartments.
In an embodiment of the present application, the obtaining module 301 is configured to obtain the first environment information collected by the first camera installed at the front of the body of the unmanned distribution device, and to obtain the second environment information collected by the second cameras installed on both sides of the bottom of the body of the unmanned distribution device, wherein the field angle of the first camera is larger than that of the second camera.
In an embodiment of the present application, the navigation positioning module 302 is further configured to determine, if the first environment image does not match the first template image stored in advance, whether the driving direction needs to be changed according to the first environment image, specifically including: carrying out Hough transform on the first environment image to obtain a plurality of candidate straight lines; screening the candidate straight lines according to their geometric length and the included angle between them and the straight line along the side edge of the passenger compartment of the public transportation vehicle, and obtaining the candidate straight line parallel to the straight line of the passenger compartment aisle as the reference straight line; carrying out back-projection transformation on the reference straight line so that it is parallel to the vertical direction of the first environment image; if an included angle exists between the back-projected reference straight line and the central axis of the first environment image, adjusting the driving direction of the unmanned distribution device according to the size of the included angle; and if no included angle exists, keeping the driving direction of the unmanned distribution device unchanged.
The exemplary explanation about the functions performed by the modules in the apparatus shown in fig. 3 is consistent with the exemplary explanation in the foregoing method embodiment, and is not repeated here.
The present embodiment further provides an unmanned delivery apparatus, and referring to fig. 4, the unmanned delivery apparatus 400 of the present embodiment includes: a first camera 401, a second camera 402 and a processor 403,
the first camera 401 is configured to acquire first environment information and send the first environment information to the processor;
the second camera 402 is configured to acquire second environment information and send the second environment information to the processor;
the processor 403 is configured to obtain a passenger compartment number based on the first environment information and a seat number based on the second environment information, and to determine the position of the unmanned distribution device in the public transportation vehicle from the passenger compartment number and the seat number.
In an embodiment of the present application, the first environment information is a first environment image, and the processor 403 is specifically configured to match the first environment image with a first template image stored in advance; if the matching is consistent, adding 1 to the counted current number of the passenger compartments, and updating the current number of the passenger compartments; and recording characteristic information of a through passage connecting two adjacent passenger compartments in the first template image.
In an embodiment of the present application, the second environment information is a second environment image, and the processor 403 is specifically configured to match the second environment image with a second template image stored in advance; if the matching is consistent, adding 1 to the counted number of the current seats, and updating the number of the current seats; and recording the characteristic information of the seat in the passenger compartment in the second template image.
In an embodiment of the application, the first environment information is a first environment image, the second environment information is a second environment image, and the processor 403 is specifically configured to perform optical character recognition on the first environment image, obtain a number in the first environment image, compare the number with a number in a preset passenger compartment white list, and if the number in the first environment image appears in the passenger compartment white list, obtain a passenger compartment number from the number in the first environment image, where the passenger compartment white list is set according to the passenger compartment number of the public transportation vehicle; and carrying out optical character recognition on the second environment image to obtain a number in the second environment image, comparing the number with a number in a preset seat white list, and if the number in the second environment image appears in the seat white list, obtaining a seat number from the number in the second environment image, wherein the seat white list is set according to the seat number of the public transport means.
In an embodiment of the application, the processor 403 is specifically configured to obtain the mileage information collected by the odometer of the unmanned distribution device before adding 1 to the counted current number of passenger compartments; and comparing the mileage with a mileage threshold value, and if the mileage is greater than the mileage threshold value, adding 1 to the current number of the passenger compartments.
In one embodiment of the present application, the first camera is installed at the front of the body of the unmanned distribution device, the second cameras are installed on both sides of the bottom of the body, and the field angle of the first camera is larger than that of the second camera.
In an embodiment of the application, the processor 403 is further configured to determine, if the first environment image does not match the first template image stored in advance, whether the head orientation needs to be changed according to the first environment image, specifically including: carrying out Hough transform on the first environment image to obtain a plurality of candidate straight lines; screening the candidate straight lines according to their geometric length and the included angle between them and the straight line along the side edge of the passenger compartment of the public transportation vehicle, and obtaining the candidate straight line parallel to the straight line of the passenger compartment aisle as the reference straight line; carrying out back-projection transformation on the reference straight line so that it is parallel to the vertical direction of the first environment image; if an included angle exists between the back-projected reference straight line and the central axis of the first environment image, adjusting the head orientation of the unmanned distribution device according to the size of the included angle; and if no included angle exists, keeping the head orientation of the unmanned distribution device unchanged.
In summary, the technical solution of the embodiments of the application achieves high positioning accuracy in application scenarios of public transportation vehicles such as high-speed trains, meeting practical application requirements. In addition, navigation positioning is realized without pre-making an environment map of the train cars, which reduces computational complexity and improves positioning efficiency. Finally, the unmanned distribution device does not need to be fitted with high-precision IMU sensors and the like, so the hardware cost is low and the solution is suitable for large-scale popularization and application.
It should be noted that:
the algorithms and displays presented herein are not inherently related to any particular computer, virtual machine, or other apparatus. Various general purpose devices may be used with the teachings herein. The required structure for constructing such a device will be apparent from the description above. In addition, embodiments of the present application are not directed to any particular programming language. It is appreciated that a variety of programming languages may be used to implement the teachings of the embodiments of the present application as described herein, and any descriptions of specific languages are provided above to disclose the best modes of the embodiments of the present application.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the application, various features of the embodiments are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that the claimed embodiments require more features than are expressly recited in each claim. Rather, as the following claims reflect, the inventive subject matter lies in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this application.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the embodiments of the application and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
The various component embodiments of the present application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that a microprocessor or Digital Signal Processor (DSP) may be used in practice to implement some or all of the functions of some or all of the components in a page performance testing apparatus according to embodiments of the present application. The present application may also be embodied as apparatus or device programs (e.g., computer programs and computer program products) for performing a portion or all of the methods described herein. Such programs implementing embodiments of the present application may be stored on a computer readable medium or may be in the form of one or more signals. Such a signal may be downloaded from an internet website or provided on a carrier signal or in any other form.
FIG. 5 is a schematic structural diagram of a non-transitory computer-readable storage medium according to an embodiment of the present application. The computer-readable storage medium 500 stores a computer program for executing the steps of the method according to the embodiments of the application; the program is readable by a processor of an unmanned distribution device, and when run by the device it causes the device to execute the steps of the method described above. In particular, the computer program stored on the computer-readable storage medium may execute the method shown in any of the embodiments described above. The computer program may be compressed in a suitable form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the embodiments of the application, and those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The embodiments of the application can be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In a unit claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names.

Claims (8)

1. A navigation positioning method for an unmanned distribution device in a public transportation vehicle, characterized by comprising the following steps:
respectively acquiring first environment information acquired by a first camera and second environment information acquired by a second camera of the unmanned distribution device;
obtaining a passenger compartment number based on the first environment information, and obtaining a seat number based on the second environment information;
determining the position of the unmanned distribution device in the public transportation vehicle from the passenger compartment number and the seat number;
wherein the first environment information is a first environment image, and obtaining the passenger compartment number based on the first environment information includes:
matching the first environment image with a first template image stored in advance, wherein the first template image records characteristic information of a through passage connecting two adjacent passenger compartments;
if the matching fails, extracting edge straight line information in the first environment image by using Hough transform, and determining whether the head direction of the unmanned distribution device needs to be changed according to the edge straight line information;
if the matching is successful, adding 1 to the counted current number of the passenger compartments, and updating the current number of the passenger compartments;
wherein before adding 1 to the counted current number of passenger compartments, the method further comprises:
acquiring the information of the driving mileage collected by the odometer of the unmanned distribution device;
and comparing the mileage with a mileage threshold value, and if the mileage is greater than the mileage threshold value, adding 1 to the current number of the passenger compartments.
2. The navigation positioning method according to claim 1, wherein the second environment information is a second environment image, and the deriving a seat number based on the second environment information comprises:
matching the second environment image with a second template image stored in advance;
if the matching is consistent, adding 1 to the counted number of the current seats, and updating the number of the current seats;
and recording the characteristic information of the seat in the passenger compartment in the second template image.
3. The navigational positioning method of claim 1, wherein the first environmental information is a first environmental image, the second environmental information is a second environmental image,
the obtaining of the passenger compartment number based on the first environment information includes:
carrying out optical character recognition on the first environment image to obtain numbers in the first environment image, comparing the numbers with numbers in a preset passenger compartment white list, and if the numbers in the first environment image appear in the passenger compartment white list, obtaining a passenger compartment number from the numbers in the first environment image, wherein the passenger compartment white list is set according to the passenger compartment number of the public transport means;
the deriving a seat number based on the second environmental information comprises:
and carrying out optical character recognition on the second environment image to obtain a number in the second environment image, comparing the number with a number in a preset seat white list, and if the number in the second environment image appears in the seat white list, obtaining a seat number from the number in the second environment image, wherein the seat white list is set according to the seat number of the public transport means.
4. The navigation positioning method according to any one of claims 1-3, wherein the obtaining of the first environmental information collected by the first camera and the second environmental information collected by the second camera of the unmanned distribution device respectively comprises:
acquiring first environmental information collected by the first camera installed in front of the body of the unmanned distribution device,
and,
acquiring second environmental information collected by the second cameras installed on two sides of the bottom of the unmanned distribution device,
wherein a field angle of the first camera is greater than a field angle of the second camera.
5. The navigation positioning method according to claim 4, wherein the extracting edge straight line information in the first environment image by using Hough transform, and the determining whether the head orientation of the unmanned distribution device needs to be changed according to the edge straight line information, comprises:
carrying out Hough transform on the first environment image to obtain a plurality of candidate straight lines;
screening the candidate straight lines according to the geometric length of the candidate straight lines and the included angle between the candidate straight lines and the straight line where the side edge of the passenger compartment of the public transport vehicle is located, and obtaining candidate straight lines which are parallel to the straight line where the passageway of the passenger compartment is located as reference straight lines;
carrying out back projection transformation on the reference straight line to enable the reference straight line to be parallel to the vertical direction of the first environment image;
if an included angle exists between the reference straight line after back projection transformation and the central axis of the first environment image, the head orientation of the unmanned distribution device is adjusted according to the size of the included angle;
and if the reference straight line after back-projection transformation has no included angle with the central axis of the first environment image, keeping the head orientation of the unmanned distribution device unchanged.
6. A navigation positioning device for an unmanned distribution device in a public transportation vehicle, characterized by comprising:
the acquisition module is used for respectively acquiring first environment information acquired by a first camera and second environment information acquired by a second camera of the unmanned distribution device;
the navigation positioning module is used for obtaining a passenger compartment number based on the first environment information and obtaining a seat number based on the second environment information, and for determining the position of the unmanned distribution device in the public transportation vehicle from the passenger compartment number and the seat number;
the navigation positioning module is specifically used for matching the first environment image with a pre-stored first template image, the first template image recording characteristic information of a through passage connecting two adjacent passenger compartments; if the matching fails, extracting edge straight line information in the first environment image by using Hough transform, and determining whether the head orientation of the unmanned distribution device needs to be changed according to the edge straight line information; if the matching is successful, adding 1 to the counted current number of the passenger compartments, and updating the current number of the passenger compartments; acquiring the driving mileage information collected by an odometer of the unmanned distribution device before adding 1 to the counted current number of the passenger compartments; and comparing the mileage with a mileage threshold value, and if the mileage is greater than the mileage threshold value, adding 1 to the current number of the passenger compartments.
7. An unmanned distribution device for traveling in a passenger compartment of a public transportation vehicle, comprising: a first camera, a second camera and a processor, wherein
the first camera is configured to collect first environment information and send the first environment information to the processor, the first environment information being a first environment image;
the second camera is configured to collect second environment information and send the second environment information to the processor; and
the processor is configured to obtain a passenger compartment number based on the first environment information, obtain a seat number based on the second environment information, and determine a location of the unmanned distribution device in the public transportation vehicle according to the passenger compartment number and the seat number; the processor is specifically configured to: match the first environment image with a pre-stored first template image, the first template image recording characteristic information of a through passage connecting two adjacent passenger compartments; if the matching fails, extract edge straight line information in the first environment image by using Hough transform, and determine whether the head orientation of the unmanned distribution device needs to be changed according to the edge straight line information; and if the matching succeeds, acquire driving mileage information collected by an odometer of the unmanned distribution device before adding 1 to the counted current passenger compartment number, compare the driving mileage with a mileage threshold, and, if the driving mileage is greater than the mileage threshold, add 1 to the current passenger compartment number and update the current passenger compartment number.
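To make the data flow of claim 7 concrete, a minimal processor loop might look as follows; read_first_camera, read_second_camera, read_odometer and recognize_seat_number are hypothetical stand-ins for hardware and recognition interfaces that the claims do not specify, and counter is a compartment counter such as the sketch after claim 6.

    def positioning_loop(counter, read_first_camera, read_second_camera,
                         read_odometer, recognize_seat_number):
        # counter: a CompartmentCounter-like object (see the sketch above).
        while True:
            first_image = read_first_camera()    # wide field angle, forward-facing
            second_image = read_second_camera()  # narrow field angle, bottom sides
            compartment = counter.update(first_image, read_odometer())
            seat = recognize_seat_number(second_image)  # e.g. reading seat-number plates
            if seat is not None:
                # The in-vehicle position is the (compartment number, seat number) pair.
                yield compartment, seat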
8. A non-transitory computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1 to 5.
CN201910543477.7A 2019-06-21 2019-06-21 Unmanned distribution device and navigation positioning method and device thereof Active CN110308720B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910543477.7A CN110308720B (en) 2019-06-21 2019-06-21 Unmanned distribution device and navigation positioning method and device thereof

Publications (2)

Publication Number Publication Date
CN110308720A CN110308720A (en) 2019-10-08
CN110308720B (en) 2021-02-23

Family

ID=68077681

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910543477.7A Active CN110308720B (en) 2019-06-21 2019-06-21 Unmanned distribution device and navigation positioning method and device thereof

Country Status (1)

Country Link
CN (1) CN110308720B (en)

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2783175B2 (en) * 1994-11-17 1998-08-06 日本電気株式会社 Traffic flow measurement device
CN1945351B (en) * 2006-10-21 2010-06-02 中国科学院合肥物质科学研究院 Robot navigation positioning system and navigation positioning method
CN101419705B (en) * 2007-10-24 2011-01-05 华为终端有限公司 Video camera demarcating method and device
CN202395858U (en) * 2011-12-14 2012-08-22 深圳市中控生物识别技术有限公司 Binocular photographic device
CN104142683B (en) * 2013-11-15 2016-06-08 上海快仓智能科技有限公司 Based on the automatic guide vehicle navigation method of Quick Response Code location
CN107918747A (en) * 2017-11-23 2018-04-17 大唐华银电力股份有限公司耒阳分公司 A kind of compartment numbering recognition methods
CN109195106B (en) * 2018-09-17 2020-01-03 北京三快在线科技有限公司 Train positioning method and device
CN109357676A (en) * 2018-10-19 2019-02-19 北京三快在线科技有限公司 The localization method and device and mobile device of a kind of mobile device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant