CN117217637A - Method, apparatus, electronic device and computer-readable medium for controlling unmanned vehicle pickup - Google Patents
- Publication number: CN117217637A (application number CN202210629224.3A)
- Authority: CN (China)
- Prior art keywords: information, unmanned vehicle, pickup, position information, controlling
- Prior art date
- Legal status: Pending (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abstract
Embodiments of the present disclosure disclose a method, apparatus, electronic device, and computer-readable medium for controlling unmanned vehicle pickup. One embodiment of the method comprises: acquiring pickup information, wherein the pickup information comprises article position information; determining a pickup mode according to the article position information; in response to the pickup mode characterizing an unmanned vehicle pickup mode, determining unmanned vehicle information according to the pickup information, wherein the unmanned vehicle information comprises unmanned vehicle position information; and controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform a pickup operation according to the article position information and the unmanned vehicle position information. This embodiment relates to information security, provides a contactless pickup mode, and improves pickup efficiency.
Description
Technical Field
Embodiments of the present disclosure relate to the field of information security technologies, and in particular, to a method, an apparatus, an electronic device, and a computer readable medium for controlling unmanned vehicle pickup.
Background
With the rapid development of the logistics industry, door-to-door pickup has gradually become a link in the logistics service system. A related pickup method is as follows: a worker goes to the user's door to pick up the item.
However, when the above pickup method is adopted, the following technical problems often arise:
When a worker picks up an item at the user's door, the user's address must be known, which easily leads to leakage of user information; moreover, the worker needs to travel to the user's address, which wastes time and results in low pickup efficiency.
Disclosure of Invention
This summary is provided to introduce concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
Some embodiments of the present disclosure propose methods, apparatuses, electronic devices, and computer-readable media for controlling unmanned vehicle pickup to address one or more of the technical problems mentioned in the background section above.
In a first aspect, some embodiments of the present disclosure provide a method of controlling unmanned vehicle pickup, the method comprising: acquiring pickup information, wherein the pickup information comprises article position information; determining a pickup mode according to the article position information; in response to the pickup mode characterizing an unmanned vehicle pickup mode, determining unmanned vehicle information according to the pickup information, wherein the unmanned vehicle information comprises unmanned vehicle position information; and controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform a pickup operation according to the article position information and the unmanned vehicle position information.
Optionally, the pickup information further comprises article size information; and determining the unmanned vehicle information according to the pickup information comprises: determining a target unmanned vehicle position information set according to the article position information; determining unmanned vehicle cabinet information according to the article size information; and determining the unmanned vehicle information according to the unmanned vehicle cabinet information and the target unmanned vehicle position information set.
Optionally, controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information comprises: acquiring user grade information corresponding to the pickup information; and in response to the user grade information meeting a preset grade condition, controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information.
Optionally, controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information further comprises: in response to the user grade information not meeting the preset grade condition, sending value transfer information to a user terminal corresponding to the pickup information; and in response to receiving value transfer completion information corresponding to the value transfer information, controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information.
Optionally, controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information comprises: generating pickup path information according to the article position information and the unmanned vehicle position information; and controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the pickup path information.
Optionally, controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the pickup path information comprises: controlling the unmanned vehicle corresponding to the unmanned vehicle information to move, according to the pickup path information, to the position corresponding to the article position information; and in response to receiving cabinet door opening information corresponding to the unmanned vehicle information, controlling the cabinet door corresponding to the cabinet door opening information to open.
Optionally, after controlling the cabinet door corresponding to the cabinet door opening information to open, the method further comprises: in response to receiving cabinet door closing information corresponding to the unmanned vehicle information, controlling the unmanned vehicle corresponding to the unmanned vehicle information to move to a pickup station.
Optionally, after controlling the unmanned vehicle corresponding to the unmanned vehicle information to move to the pickup station, the method further comprises: generating mailing information according to the unmanned vehicle information and station position information corresponding to the pickup station; and sending the mailing information to an associated terminal device.
In a second aspect, some embodiments of the present disclosure provide an apparatus for controlling unmanned vehicle pickup, the apparatus comprising: an acquisition unit configured to acquire pickup information, wherein the pickup information comprises article position information; a first determining unit configured to determine a pickup mode according to the article position information; a second determining unit configured to, in response to the pickup mode characterizing an unmanned vehicle pickup mode, determine unmanned vehicle information according to the pickup information, wherein the unmanned vehicle information comprises unmanned vehicle position information; and a control unit configured to control the unmanned vehicle corresponding to the unmanned vehicle information to perform a pickup operation according to the article position information and the unmanned vehicle position information.
Optionally, the second determining unit is further configured to: determine a target unmanned vehicle position information set according to the article position information; determine unmanned vehicle cabinet information according to the article size information; and determine the unmanned vehicle information according to the unmanned vehicle cabinet information and the target unmanned vehicle position information set.
Optionally, the control unit is further configured to: acquire user grade information corresponding to the pickup information; and in response to the user grade information meeting a preset grade condition, control the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information.
Optionally, the control unit is further configured to: in response to the user grade information not meeting the preset grade condition, send value transfer information to the user terminal corresponding to the pickup information; and in response to receiving value transfer completion information corresponding to the value transfer information, control the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information.
Optionally, the control unit is further configured to: generate pickup path information according to the article position information and the unmanned vehicle position information; and control the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the pickup path information.
Optionally, the control unit is further configured to: control the unmanned vehicle corresponding to the unmanned vehicle information to move, according to the pickup path information, to the position corresponding to the article position information; and in response to receiving cabinet door opening information corresponding to the unmanned vehicle information, control the cabinet door corresponding to the cabinet door opening information to open.
Optionally, the apparatus further includes a control moving unit. The control moving unit is configured to, in response to receiving cabinet door closing information corresponding to the unmanned vehicle information, control the unmanned vehicle corresponding to the unmanned vehicle information to move to the pickup station.
Optionally, the apparatus further includes a sending unit. The sending unit is configured to generate mailing information according to the unmanned vehicle information and the station position information corresponding to the pickup station, and to send the mailing information to an associated terminal device.
In a third aspect, some embodiments of the present disclosure provide an electronic device, comprising: one or more processors; and a storage device having one or more programs stored thereon which, when executed by the one or more processors, cause the one or more processors to implement the method described in any of the implementations of the first aspect above.
In a fourth aspect, some embodiments of the present disclosure provide a computer readable medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the method described in any of the implementations of the first aspect.
The above embodiments of the present disclosure have the following advantageous effects: the method for controlling unmanned vehicle pickup of some embodiments of the present disclosure provides a contactless pickup mode and improves pickup efficiency. Specifically, the reason the related pickup method leads to low pickup efficiency is that a worker must travel to the user's address to pick up the item, which wastes time. Based on this, in some embodiments of the present disclosure, pickup information is first acquired, wherein the pickup information includes article position information. Thus, the position of the article to be picked up can be determined from the acquired article position information. Secondly, a pickup mode is determined according to the article position information. Thus, whether an unmanned vehicle can be selected to pick up the article can be determined according to the position of the article to be picked up. Then, in response to the pickup mode characterizing the unmanned vehicle pickup mode, unmanned vehicle information is determined according to the pickup information, wherein the unmanned vehicle information includes unmanned vehicle position information. Thus, when the pickup mode indicates that an unmanned vehicle may be selected for pickup, the unmanned vehicle closest to the article to be picked up can be selected. Finally, the unmanned vehicle corresponding to the unmanned vehicle information is controlled to perform the pickup operation according to the article position information and the unmanned vehicle position information. Controlling an unmanned vehicle to pick up the item in this way avoids face-to-face contact between the user and a worker. Therefore, the method for controlling unmanned vehicle pickup of some embodiments of the present disclosure provides a contactless pickup mode and improves pickup efficiency.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale.
FIG. 1 is a schematic diagram of an application scenario of the method of controlling unmanned vehicle pickup according to some embodiments of the present disclosure;
FIG. 2 is a flow chart of some embodiments of a method of controlling unmanned vehicle pickup in accordance with the present disclosure;
FIG. 3 is a flow chart of other embodiments of a method of controlling unmanned vehicle pickup according to the present disclosure;
FIG. 4 is a schematic structural diagram of some embodiments of an apparatus for controlling unmanned vehicle pickup according to the present disclosure;
FIG. 5 is a schematic structural diagram of an electronic device suitable for implementing some embodiments of the present disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be noted that, for convenience of description, only the portions related to the present invention are shown in the drawings. Embodiments of the present disclosure and features of embodiments may be combined with each other without conflict.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "a" and "a plurality" in this disclosure are illustrative rather than limiting, and those of ordinary skill in the art will appreciate that they should be understood as "one or more" unless the context clearly indicates otherwise.
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 is a schematic diagram of an application scenario of the method of controlling unmanned vehicle pickup according to some embodiments of the present disclosure.
In the application scenario of Fig. 1, first, the computing device 101 may acquire the pickup information 102, wherein the pickup information 102 includes article position information. For example, the pickup information 102 may be "article position (116.420427, 39.981358)". Next, the computing device 101 may determine the pickup mode 103 according to the article position information. For example, the pickup mode 103 may be "unmanned vehicle pickup mode". Then, in response to the pickup mode 103 characterizing the unmanned vehicle pickup mode, the computing device 101 may determine the unmanned vehicle information 104 according to the pickup information 102, wherein the unmanned vehicle information 104 includes unmanned vehicle position information. For example, the unmanned vehicle information 104 may be "unmanned vehicle position (116.420429, 39.981360)". Finally, the computing device 101 may control the unmanned vehicle 105 corresponding to the unmanned vehicle information 104 to perform the pickup operation according to the article position information and the unmanned vehicle position information.
The computing device 101 may be hardware or software. When the computing device is hardware, it may be implemented as a distributed cluster formed by a plurality of servers or terminal devices, or as a single server or a single terminal device. When the computing device is embodied as software, it may be installed in the hardware devices listed above and may be implemented, for example, as a plurality of software components or software modules for providing distributed services, or as a single software component or software module. No specific limitation is made here.
It should be understood that the number of computing devices in fig. 1 is merely illustrative. There may be any number of computing devices, as desired for an implementation.
With continued reference to Fig. 2, a flow 200 of some embodiments of the method of controlling unmanned vehicle pickup according to the present disclosure is shown. The method of controlling unmanned vehicle pickup comprises the following steps:
Step 201, acquiring pickup information.
In some embodiments, an executing body of the method of controlling unmanned vehicle pickup (e.g., the computing device shown in Fig. 1) may receive the pickup information from a user terminal through a wired or wireless connection. The pickup information may be information representing an article to be picked up, and includes article position information. The article position information may be information indicating the position of the article to be picked up. For example, the pickup information may be "article position (116.420427, 39.981358)", where 116.420427 may represent longitude and 39.981358 may represent latitude. It should be noted that the wireless connection may include, but is not limited to, 3G/4G connections, WiFi connections, Bluetooth connections, WiMAX connections, Zigbee connections, UWB (ultra-wideband) connections, and other wireless connection means now known or developed in the future. Thus, the position of the article to be picked up can be determined from the acquired article position information.
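As a purely illustrative aid (not part of the claimed method), the following minimal Python sketch shows one way such pickup information could be represented in a program; the class and field names are assumptions introduced here for illustration only.

```python
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class PickupInfo:
    """Hypothetical container for pickup information; field names are assumptions."""
    article_lon: float   # longitude of the article to be picked up
    article_lat: float   # latitude of the article to be picked up
    article_size_cm: Optional[Tuple[float, float, float]] = None  # optional (length, width, height)

# Example mirroring the text: "article position (116.420427, 39.981358)"
pickup_info = PickupInfo(article_lon=116.420427, article_lat=39.981358)
print(pickup_info)
```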
Step 202, determining a pickup mode according to the article position information.
In some embodiments, the executing body may determine the pickup mode according to the article position information. The pickup mode may be information characterizing the manner of pickup. For example, the pickup mode may characterize an unmanned vehicle pickup mode or a manual pickup mode. The unmanned vehicle pickup mode is a mode in which an unmanned vehicle picks up the item; the manual pickup mode is a mode in which a worker picks up the item at the user's door. In practice, as an example, the pickup mode may be determined according to a pre-configured correspondence table between position information and pickup modes.
As yet another example, the executing body may determine, according to pre-stored information about unmanned vehicles, whether an unmanned vehicle exists within a preset range of the article position corresponding to the article position information. Specifically, the executing body may set the area within a preset distance of the article position corresponding to the article position information as the preset range, and then determine whether the position of any unmanned vehicle falls within the preset range. When an unmanned vehicle exists within the preset range, the pickup mode is the unmanned vehicle pickup mode; when no unmanned vehicle exists within the preset range, the pickup mode is the manual pickup mode. The information about unmanned vehicles may be information stored in advance by the executing body, or information acquired by the executing body from an unmanned vehicle system. The unmanned vehicle system may be a system for storing various information about unmanned vehicles. As an example, the unmanned vehicle system or the executing body may store unmanned vehicle position information, unmanned vehicle identifiers, cabinet state information, and the like. The unmanned vehicle identifier may be a serial number characterizing the unmanned vehicle. The cabinet state information may be information characterizing whether an article is placed in a cabinet of the unmanned vehicle. For example, when an article is placed in cabinet B of the unmanned vehicle, the cabinet state information may be "cabinet B, YES"; when no article is placed in cabinet B of the unmanned vehicle, the cabinet state information may be "cabinet B, NO". Thus, whether an unmanned vehicle can be selected to pick up the article can be determined according to the position of the article to be picked up.
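A minimal sketch of this range check, assuming geographic coordinates and a great-circle distance; the 1000 m threshold, function names, and vehicle record fields are illustrative assumptions and not taken from the disclosure.

```python
import math

def haversine_m(lon1, lat1, lon2, lat2):
    """Great-circle distance in meters between two (lon, lat) points."""
    r = 6_371_000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def determine_pickup_mode(article_pos, vehicles, preset_range_m=1000.0):
    """Return 'unmanned_vehicle' if any vehicle lies within the preset range, else 'manual'."""
    lon, lat = article_pos
    for v in vehicles:  # each vehicle: {"id": ..., "lon": ..., "lat": ...}
        if haversine_m(lon, lat, v["lon"], v["lat"]) <= preset_range_m:
            return "unmanned_vehicle"
    return "manual"

vehicles = [{"id": "vehicle01", "lon": 116.420429, "lat": 39.981360}]
print(determine_pickup_mode((116.420427, 39.981358), vehicles))  # unmanned_vehicle
```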
Step 203, in response to the pickup mode characterizing the unmanned vehicle pickup mode, determining the unmanned vehicle information according to the pickup information.
In some embodiments, in response to the pickup mode characterizing the unmanned vehicle pickup mode, the executing body may determine the unmanned vehicle information according to the pickup information, wherein the unmanned vehicle information includes unmanned vehicle position information. The unmanned vehicle information may be information indicating the position of an unmanned vehicle. For example, the unmanned vehicle information may be "unmanned vehicle position (116.420429, 39.981360)". In practice, the executing body may determine the unmanned vehicle information according to the article position information included in the pickup information. As an example, the executing body may select the unmanned vehicle closest to the article position within the preset range of the article position corresponding to the article position information, and use the position information of that unmanned vehicle as the unmanned vehicle information. Thus, when the unmanned vehicle pickup mode is selected, the unmanned vehicle closest to the article to be picked up can be chosen for pickup.
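A sketch of the nearest-vehicle selection described above, again with assumed field names; for brevity it uses a squared-degree distance rather than the geodesic distance of the previous sketch.

```python
def select_nearest_vehicle(article_pos, vehicles, preset_range_deg=0.01):
    """Pick the in-range vehicle nearest to the article; None if none qualifies.

    A squared-degree metric is used for brevity; a real system would likely use
    a geodesic distance such as the haversine formula shown earlier.
    """
    lon, lat = article_pos
    in_range = [
        v for v in vehicles
        if (v["lon"] - lon) ** 2 + (v["lat"] - lat) ** 2 <= preset_range_deg ** 2
    ]
    if not in_range:
        return None
    return min(in_range, key=lambda v: (v["lon"] - lon) ** 2 + (v["lat"] - lat) ** 2)

vehicles = [
    {"id": "vehicle01", "lon": 116.420429, "lat": 39.981360},
    {"id": "vehicle02", "lon": 116.425000, "lat": 39.985000},
]
print(select_nearest_vehicle((116.420427, 39.981358), vehicles)["id"])  # vehicle01
```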
Step 204, controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform a pickup operation according to the article position information and the unmanned vehicle position information.
In some embodiments, the executing body may control the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information. In practice, the executing body may first preset a plurality of target pickup positions and preset the paths along which the unmanned vehicle moves from the unmanned vehicle position to the respective target pickup positions. The executing body may then send the target pickup position closest to the article position corresponding to the article position information to the user terminal. Finally, the executing body may control the unmanned vehicle corresponding to the unmanned vehicle information to move to that closest target pickup position. The user can then move to the target pickup position and place the article to be picked up on the unmanned vehicle corresponding to the unmanned vehicle information. In this way, controlling the unmanned vehicle to pick up the item avoids face-to-face contact between the user and a worker.
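The following sketch strings these sub-steps together; the notification and dispatch callbacks are stand-ins, since the disclosure does not specify the communication channel to the user terminal or to the vehicle.

```python
def choose_target_pickup_position(article_pos, target_positions):
    """Pick the preset target pickup position closest to the article (squared-degree metric)."""
    lon, lat = article_pos
    return min(
        target_positions,
        key=lambda p: (p[0] - lon) ** 2 + (p[1] - lat) ** 2,
    )

def perform_pickup(article_pos, vehicle_id, target_positions, notify, dispatch):
    """Send the nearest target pickup position to the user, then move the vehicle there."""
    target = choose_target_pickup_position(article_pos, target_positions)
    notify(target)                 # e.g. push the meeting point to the user terminal
    dispatch(vehicle_id, target)   # e.g. command the vehicle along its preset path

# Toy wiring with stand-in callbacks.
perform_pickup(
    (116.420427, 39.981358),
    "vehicle01",
    [(116.4205, 39.9814), (116.4300, 39.9900)],
    notify=lambda t: print("notify user:", t),
    dispatch=lambda v, t: print("dispatch", v, "to", t),
)
```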
In some optional implementations of some embodiments, the executing body may control the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information by executing the following steps:
Step one, acquiring user grade information corresponding to the pickup information. The user grade information may be information indicating whether the user can use the unmanned vehicle for free. For example, when the user can use the unmanned vehicle for pickup for free, the user grade information may be "VIP user"; when the user cannot use the unmanned vehicle for pickup for free, the user grade information may be "ordinary user".
Step two, in response to the user grade information meeting a preset grade condition, controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information. The preset grade condition may be that the user grade information indicates that the user can use the unmanned vehicle for free.
In some optional implementations of some embodiments, the executing body may control the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information by executing the following steps:
Step one, in response to the user grade information not meeting the preset grade condition, sending value transfer information to the user terminal corresponding to the pickup information. The user terminal may be the device used by the user to send the pickup information; for example, the user terminal may be a mobile phone or a computer. The value transfer information may be information characterizing a value transfer operation to be performed by the user; for example, the value transfer information may be a to-be-paid page. In practice, when the user grade information indicates that the user cannot use the unmanned vehicle for pickup for free, the executing body may send the value transfer information to the user terminal corresponding to the pickup information.
Step two, in response to receiving value transfer completion information corresponding to the value transfer information, controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information. The value transfer completion information may be information indicating that the user has completed the value transfer; for example, the value transfer completion information may be a paid page. In practice, when the user completes the value transfer, the executing body may receive the value transfer completion information, corresponding to the value transfer information, sent by a value transfer management system. The executing body then controls the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information. The value transfer management system may be a system for recording various value transfers; for example, the value transfer management system may be a ledger management system.
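A condensed sketch of this grade-gated flow; the grade labels follow the examples above, while the callback names and the blocking wait are assumptions made only for illustration.

```python
def handle_pickup_request(user_grade, request_payment, wait_for_payment, start_pickup):
    """Gate the pickup on user grade: users meeting the preset grade condition go
    straight to pickup; others must complete a value transfer first. All callbacks
    are stand-ins for the surrounding system."""
    if user_grade == "VIP user":      # preset grade condition: free use allowed
        start_pickup()
        return
    request_payment()                 # send value transfer info (e.g. a to-be-paid page)
    if wait_for_payment():            # returns once value transfer completion info arrives
        start_pickup()

handle_pickup_request(
    "ordinary user",
    request_payment=lambda: print("value transfer info sent to user terminal"),
    wait_for_payment=lambda: True,    # pretend the ledger system reported completion
    start_pickup=lambda: print("pickup operation started"),
)
```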
In some optional implementations of some embodiments, the executing body may further control the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information by executing the following steps:
Step one, generating pickup path information according to the article position information and the unmanned vehicle position information. In practice, the executing body may perform path planning on the article position information and the unmanned vehicle position information to obtain a path as the pickup path information. As an example, the executing body may perform the path planning through an ant colony algorithm; as yet another example, through a neural network algorithm. The pickup path information may be "pickup path (0:400m, 1:500m)", where "0" may indicate moving east, "1" moving north, "2" moving west, and "3" moving south, and "400m" and "500m" characterize the distances traveled by the unmanned vehicle. The pickup path (0:400m, 1:500m) thus characterizes the unmanned vehicle moving 400 m east and then 500 m north (a small parsing sketch of this encoding follows step two below).
Step two, controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the pickup path information. In practice, the executing body may control the unmanned vehicle corresponding to the unmanned vehicle information to move, according to the pickup path information, to the position corresponding to the article position information.
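A minimal sketch of parsing the direction-coded path string from step one into planar displacements; the string format follows the example above, while the function name and the assumption of a flat local frame are illustrative.

```python
# Direction codes as described in the text: 0 = east, 1 = north, 2 = west, 3 = south.
UNIT_VECTORS = {"0": (1, 0), "1": (0, 1), "2": (-1, 0), "3": (0, -1)}

def parse_pickup_path(path_str):
    """Turn a path like "0:400m,1:500m" into a list of (dx_m, dy_m) displacements."""
    moves = []
    for leg in path_str.split(","):
        code, dist = leg.split(":")
        dx, dy = UNIT_VECTORS[code.strip()]
        meters = float(dist.strip().rstrip("m"))
        moves.append((dx * meters, dy * meters))
    return moves

print(parse_pickup_path("0:400m,1:500m"))  # [(400.0, 0.0), (0.0, 500.0)]
```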
In some optional implementations of some embodiments, the executing body may further control the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the pickup path information by executing the following steps:
Step one, controlling the unmanned vehicle corresponding to the unmanned vehicle information to move, according to the pickup path information, to the position corresponding to the article position information.
Step two, in response to receiving cabinet door opening information corresponding to the unmanned vehicle information, controlling the cabinet door corresponding to the cabinet door opening information to open. The cabinet door opening information may be information characterizing that a cabinet door needs to be opened; for example, the cabinet door opening information may be "cabinet B, open". As an example, when the user scans a two-dimensional code on the unmanned vehicle or enters a verification code on the unmanned vehicle, the executing body may receive the cabinet door opening information and thereby control the cabinet door corresponding to the cabinet door opening information to open.
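A toy sketch of applying such door messages to a door-state table, assuming the "cabinet X, open/close" string format of the examples; how the command actually reaches the door actuator is not specified by the disclosure.

```python
def handle_door_message(message, doors):
    """Apply a door message such as "cabinet B, open" or "cabinet B, close"
    to a dict tracking door states; the message format follows the text's examples."""
    cabinet, action = (part.strip() for part in message.split(","))
    doors[cabinet] = "open" if action.lower() == "open" else "closed"
    return doors

doors = {"cabinet A": "closed", "cabinet B": "closed"}
print(handle_door_message("cabinet B, open", doors))   # {'cabinet A': 'closed', 'cabinet B': 'open'}
print(handle_door_message("cabinet B, close", doors))  # {'cabinet A': 'closed', 'cabinet B': 'closed'}
```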
Optionally, after controlling the cabinet door corresponding to the cabinet door opening information to open, the executing body may further execute the following step: in response to receiving cabinet door closing information corresponding to the unmanned vehicle information, controlling the unmanned vehicle corresponding to the unmanned vehicle information to move to a pickup station. The cabinet door closing information may be information indicating that the cabinet door has been closed; for example, the cabinet door closing information may be "cabinet B, close". The pickup station may be a station where workers collect items in a unified manner. For example, the position of the pickup station may be the position corresponding to the unmanned vehicle position information, or a preset position. In practice, first, in response to a control unit on the unmanned vehicle sensing, through a sensor on the cabinet door, that the door has been closed, the control unit on the unmanned vehicle may send the cabinet door closing information corresponding to the unmanned vehicle information to the executing body. The executing body may then perform path planning on the article position information and the station position information corresponding to the pickup station to generate station path information. Finally, the executing body may control the unmanned vehicle corresponding to the unmanned vehicle information to move to the pickup station according to the station path information. The sensor may be a magnetic door switch. The station position information may be information indicating the position of the pickup station.
Optionally, after controlling the unmanned vehicle corresponding to the unmanned vehicle information to move to the pickup station, the executing body may further execute the following steps:
Step one, generating mailing information according to the unmanned vehicle information and the station position information corresponding to the pickup station. In practice, the executing body may splice the unmanned vehicle information and the station position information corresponding to the pickup station to generate the mailing information (a splicing sketch is given after step two below). As an example, the unmanned vehicle information may include the unmanned vehicle identifier "unmanned vehicle 01" and the cabinet state information "cabinet B, NO". The station position information may be "station position (116.420425, 39.981356)". The mailing information may then be "station position (116.420425, 39.981356), unmanned vehicle 01, cabinet B, NO".
Step two, sending the mailing information to an associated terminal device. The associated terminal device may be a worker's terminal; for example, the terminal device may be a smartphone or a computer.
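A one-function sketch of the splicing mentioned in step one, reproducing the example string above; the function name and argument order are assumptions.

```python
def build_mailing_info(station_position, vehicle_id, cabinet_state):
    """Splice station position, vehicle identifier and cabinet state into one message,
    mirroring the example string in the text."""
    return f"station position {station_position}, {vehicle_id}, {cabinet_state}"

print(build_mailing_info("(116.420425, 39.981356)", "unmanned vehicle 01", "cabinet B, NO"))
# station position (116.420425, 39.981356), unmanned vehicle 01, cabinet B, NO
```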
In this way, a worker can learn, from the received mailing information, how many items are waiting to be mailed. When the number of items to be mailed exceeds a preset value, the worker can go to the pickup station to collect them, so that items can be collected in a centralized manner and pickup efficiency is further improved.
The above embodiments of the present disclosure have the following advantageous effects: the method for controlling unmanned vehicle pickup of some embodiments of the present disclosure provides a contactless pickup mode and improves pickup efficiency. Specifically, the reason the related pickup method leads to low pickup efficiency is that a worker must travel to the user's address to pick up the item, which wastes time. Based on this, in some embodiments of the present disclosure, pickup information is first acquired, wherein the pickup information includes article position information. Thus, the position of the article to be picked up can be determined from the acquired article position information. Secondly, a pickup mode is determined according to the article position information. Thus, whether an unmanned vehicle can be selected to pick up the article can be determined according to the position of the article to be picked up. Then, in response to the pickup mode characterizing the unmanned vehicle pickup mode, unmanned vehicle information is determined according to the pickup information, wherein the unmanned vehicle information includes unmanned vehicle position information. Thus, when the pickup mode indicates that an unmanned vehicle may be selected for pickup, the unmanned vehicle closest to the article to be picked up can be selected. Finally, the unmanned vehicle corresponding to the unmanned vehicle information is controlled to perform the pickup operation according to the article position information and the unmanned vehicle position information. Controlling an unmanned vehicle to pick up the item in this way avoids face-to-face contact between the user and a worker. Therefore, the method for controlling unmanned vehicle pickup of some embodiments of the present disclosure provides a contactless pickup mode and improves pickup efficiency.
With further reference to Fig. 3, a flow 300 of further embodiments of the method of controlling unmanned vehicle pickup according to the present disclosure is shown. The method of controlling unmanned vehicle pickup comprises the following steps:
Step 301, acquiring pickup information.
Step 302, determining a pickup mode according to the article position information.
In some embodiments, for the specific implementation of steps 301 to 302 and their technical effects, reference may be made to steps 201 to 202 in the embodiment corresponding to Fig. 2, which are not repeated here. The pickup information may further include article size information. The article size information may be information characterizing the size of the article to be picked up. For example, the article size information may be "article size (10cm, 20cm, 20cm)", which may characterize the article to be picked up as being 10 cm long, 20 cm wide, and 20 cm high.
Step 303, determining a target unmanned vehicle position information set according to the article position information.
In some embodiments, the executing entity may determine the target unmanned vehicle position information set according to the item position information. The target unmanned vehicle position information set may be a set of position information of unmanned vehicles within a preset range of the article to be fetched. In practice, the execution body may determine the position information of the unmanned aerial vehicle within the preset range of the article position corresponding to the article position information as the target unmanned aerial vehicle position information, and obtain the target unmanned aerial vehicle position information set.
And 304, determining the information of the unmanned vehicle cabinet body according to the size information of the articles.
In some embodiments, the executing body may determine the unmanned vehicle cabinet information according to the article size information. The unmanned vehicle cabinet information may be information characterizing the size of an unmanned vehicle cabinet. As an example, the unmanned vehicle cabinet information may be a cabinet model of the unmanned vehicle. For example, the cabinet models of the unmanned vehicle may include "cabinet A", "cabinet B", and "cabinet C", where "cabinet A" may characterize a cabinet 10 cm long, 10 cm wide, and 10 cm high; "cabinet B" a cabinet 20 cm long, 20 cm wide, and 20 cm high; and "cabinet C" a cabinet 40 cm long, 40 cm wide, and 40 cm high. In practice, the executing body may select, according to the article size information, cabinet information whose cabinet size is greater than or equal to the size of the article to be picked up as the unmanned vehicle cabinet information. For example, when the article size information is "article size (10cm, 20cm, 20cm)", the unmanned vehicle cabinet information may be "cabinet B" or "cabinet C".
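A minimal sketch of this size-based cabinet selection, using the example cabinet models above; the per-axis comparison below ignores rotating the article to fit, which the disclosure does not specify.

```python
# Hypothetical cabinet catalogue matching the text's examples: (length, width, height) in cm.
CABINETS = {"cabinet A": (10, 10, 10), "cabinet B": (20, 20, 20), "cabinet C": (40, 40, 40)}

def suitable_cabinets(article_size, cabinets=CABINETS):
    """Return the cabinet models whose every dimension is at least the article's."""
    return [
        name
        for name, dims in cabinets.items()
        if all(c >= a for c, a in zip(dims, article_size))
    ]

print(suitable_cabinets((10, 20, 20)))  # ['cabinet B', 'cabinet C']
```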
Step 305, determining the unmanned vehicle information according to the unmanned vehicle cabinet information and the target unmanned vehicle position information set.
In some embodiments, the executing body may determine the unmanned vehicle information according to the unmanned vehicle cabinet information and the target unmanned vehicle position information set.
As an example, first, the executing body may determine, according to the pre-stored information about unmanned vehicles, the target unmanned vehicle information corresponding to each piece of target unmanned vehicle position information, thereby obtaining a target unmanned vehicle information set. The target unmanned vehicle information includes target cabinet state information.
Finally, the executing body may determine, among the target unmanned vehicle information satisfying a preset cabinet condition, the piece of target unmanned vehicle information whose corresponding unmanned vehicle position is closest to the article position as the unmanned vehicle information. The preset cabinet condition may be that the target cabinet state information included in the target unmanned vehicle information characterizes that no article is placed in the cabinet, and that the cabinet corresponding to the included target cabinet state information matches the cabinet corresponding to the unmanned vehicle cabinet information.
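A sketch combining the cabinet condition with the nearest-vehicle choice; the record fields and the squared-degree metric are illustrative assumptions.

```python
def pick_vehicle(article_pos, candidates, required_cabinet):
    """From candidate vehicles, keep those with an empty cabinet of the required model,
    then return the one closest to the article (squared-degree metric)."""
    lon, lat = article_pos
    eligible = [
        v for v in candidates
        if v["cabinet_model"] == required_cabinet and not v["cabinet_occupied"]
    ]
    if not eligible:
        return None
    return min(eligible, key=lambda v: (v["lon"] - lon) ** 2 + (v["lat"] - lat) ** 2)

candidates = [
    {"id": "vehicle01", "lon": 116.420429, "lat": 39.981360,
     "cabinet_model": "cabinet B", "cabinet_occupied": False},
    {"id": "vehicle02", "lon": 116.420500, "lat": 39.981400,
     "cabinet_model": "cabinet B", "cabinet_occupied": True},
]
print(pick_vehicle((116.420427, 39.981358), candidates, "cabinet B")["id"])  # vehicle01
```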
Step 306, controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform a pickup operation according to the article position information and the unmanned vehicle position information.
In some embodiments, for the specific implementation of step 306 and its technical effects, reference may be made to step 204 in the embodiment corresponding to Fig. 2, which is not repeated here.
As can be seen from Fig. 3, compared with the description of some embodiments corresponding to Fig. 2, the flow 300 of the method of controlling unmanned vehicle pickup in some embodiments corresponding to Fig. 3 adds a step of selecting a different cabinet according to the article size information. By providing cabinets of different sizes on the unmanned vehicle and selecting an empty cabinet of the corresponding size according to the size of the article, the space of the unmanned vehicle can be used to the greatest extent, further improving pickup efficiency.
With further reference to Fig. 4, as an implementation of the method shown in the above figures, the present disclosure provides some embodiments of an apparatus for controlling unmanned vehicle pickup, which correspond to the method embodiments shown in Fig. 2; the apparatus may be applied in various electronic devices.
As shown in Fig. 4, the apparatus 400 for controlling unmanned vehicle pickup of some embodiments includes an acquisition unit 401, a first determining unit 402, a second determining unit 403, and a control unit 404. The acquisition unit 401 is configured to acquire pickup information, wherein the pickup information includes article position information. The first determining unit 402 is configured to determine a pickup mode according to the article position information. The second determining unit 403 is configured to, in response to the pickup mode characterizing an unmanned vehicle pickup mode, determine unmanned vehicle information according to the pickup information, wherein the unmanned vehicle information includes unmanned vehicle position information. The control unit 404 is configured to control the unmanned vehicle corresponding to the unmanned vehicle information to perform a pickup operation according to the article position information and the unmanned vehicle position information.
In some optional implementations of some embodiments, the second determining unit 403 is further configured to: determine a target unmanned vehicle position information set according to the article position information; determine unmanned vehicle cabinet information according to the article size information; and determine the unmanned vehicle information according to the unmanned vehicle cabinet information and the target unmanned vehicle position information set.
In some optional implementations of some embodiments, the control unit 404 is further configured to: acquire user grade information corresponding to the pickup information; and in response to the user grade information meeting a preset grade condition, control the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information.
In some optional implementations of some embodiments, the control unit 404 is further configured to: in response to the user grade information not meeting the preset grade condition, send value transfer information to the user terminal corresponding to the pickup information; and in response to receiving value transfer completion information corresponding to the value transfer information, control the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information.
In some optional implementations of some embodiments, the control unit 404 is further configured to: generate pickup path information according to the article position information and the unmanned vehicle position information; and control the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the pickup path information.
In some optional implementations of some embodiments, the control unit 404 is further configured to: control the unmanned vehicle corresponding to the unmanned vehicle information to move, according to the pickup path information, to the position corresponding to the article position information; and in response to receiving cabinet door opening information corresponding to the unmanned vehicle information, control the cabinet door corresponding to the cabinet door opening information to open.
Optionally, the apparatus further includes a control moving unit (not shown). The control moving unit is configured to, in response to receiving cabinet door closing information corresponding to the unmanned vehicle information, control the unmanned vehicle corresponding to the unmanned vehicle information to move to the pickup station.
Optionally, the apparatus further includes a sending unit (not shown). The sending unit is configured to generate mailing information according to the unmanned vehicle information and the station position information corresponding to the pickup station, and to send the mailing information to an associated terminal device.
It will be appreciated that the elements described in the apparatus 400 correspond to the various steps in the method described with reference to fig. 2. Thus, the operations, features and resulting benefits described above with respect to the method are equally applicable to the apparatus 400 and the units contained therein, and are not described in detail herein.
Referring now to FIG. 5, a schematic diagram of an electronic device (e.g., computing device 101 shown in FIG. 1) 500 suitable for use in implementing some embodiments of the present disclosure is shown. The electronic device shown in fig. 5 is merely an example and should not impose any limitations on the functionality and scope of use of embodiments of the present disclosure.
As shown in fig. 5, the electronic device 500 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 501, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 502 or a program loaded from a storage means 508 into a Random Access Memory (RAM) 503. The RAM 503 also stores various programs and data required for the operation of the electronic device 500. The processing device 501, the ROM 502, and the RAM 503 are connected to each other via a bus 504. An input/output (I/O) interface 505 is also connected to the bus 504.
In general, the following devices may be connected to the I/O interface 505: input devices 506 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 507 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 508 including, for example, magnetic tape, hard disk, etc.; and communication means 509. The communication means 509 may allow the electronic device 500 to communicate with other devices wirelessly or by wire to exchange data. While fig. 5 shows an electronic device 500 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead. Each block shown in fig. 5 may represent one device or a plurality of devices as needed.
In particular, according to some embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, some embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such embodiments, the computer program may be downloaded and installed from a network via the communications device 509, or from the storage device 508, or from the ROM 502. The above-described functions defined in the methods of some embodiments of the present disclosure are performed when the computer program is executed by the processing device 501.
It should be noted that, the computer readable medium described in some embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In some embodiments of the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In some embodiments of the present disclosure, however, the computer-readable signal medium may comprise a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some implementations, the clients, servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol ), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), the internet (e.g., the internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed networks.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquiring pickup information, wherein the pickup information comprises article position information; determining a picking-up mode according to the article position information; responding to the pickup mode to represent an unmanned vehicle pickup mode, and determining unmanned vehicle information according to the pickup information, wherein the unmanned vehicle information comprises unmanned vehicle position information; and controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform a pickup operation according to the article position information and the unmanned vehicle position information.
Computer program code for carrying out operations for some embodiments of the present disclosure may be written in one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in some embodiments of the present disclosure may be implemented by software or by hardware. The described units may also be provided in a processor, for example described as: a processor comprising an acquisition unit, a first determination unit, a second determination unit, and a control unit. In some cases, the names of these units do not limit the units themselves; for example, the acquisition unit may also be described as "a unit that acquires pickup information".
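For illustration only, this unit decomposition could be mirrored by an object along the following lines; the class, the `fleet_service` and `vehicle_controller` collaborators, and their method names are hypothetical and are not defined by this disclosure.

```python
class PickupControlDevice:
    """Illustrative mapping of the four described units onto one object."""

    def __init__(self, fleet_service, vehicle_controller):
        self.fleet_service = fleet_service            # looks up orders and vehicles
        self.vehicle_controller = vehicle_controller  # issues motion commands

    def acquisition_unit(self, order_id):
        # Acquire pickup information (including article position) for an order.
        return self.fleet_service.fetch_pickup_info(order_id)

    def first_determination_unit(self, pickup_info):
        # Determine the pickup mode from the article position.
        return self.fleet_service.choose_mode(pickup_info.article_position)

    def second_determination_unit(self, pickup_info):
        # Determine the unmanned vehicle information, including its position.
        return self.fleet_service.select_vehicle(pickup_info)

    def control_unit(self, pickup_info, vehicle_info):
        # Control the selected unmanned vehicle to perform the pickup operation.
        self.vehicle_controller.drive(vehicle_info, pickup_info.article_position)
```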
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
The foregoing description covers only the preferred embodiments of the present disclosure and explains the technical principles employed. Those skilled in the art will appreciate that the scope of the invention involved in the embodiments of the present disclosure is not limited to technical solutions formed by the specific combination of the technical features above, and also covers other technical solutions formed by any combination of those technical features or their equivalents without departing from the inventive concept, for example, technical solutions formed by replacing the features above with (but not limited to) features having similar functions disclosed in the embodiments of the present disclosure.
Claims (11)
1. A method of controlling unmanned vehicle pickup, comprising:
acquiring pickup information, wherein the pickup information comprises article position information;
determining a pickup mode according to the article position information;
in response to the pickup mode representing an unmanned vehicle pickup mode, determining unmanned vehicle information according to the pickup information, wherein the unmanned vehicle information comprises unmanned vehicle position information;
and controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform a pickup operation according to the article position information and the unmanned vehicle position information.
2. The method of claim 1, wherein the pickup information further comprises article size information; and
the determining unmanned vehicle information according to the pickup information comprises:
determining a target unmanned vehicle position information set according to the article position information;
determining unmanned vehicle cabinet information according to the article size information;
and determining the unmanned vehicle information according to the unmanned vehicle cabinet information and the target unmanned vehicle position information set.
3. The method of claim 1, wherein the controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information comprises:
acquiring user grade information corresponding to the pickup information;
and in response to the user grade information meeting a preset grade condition, controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information.
4. The method of claim 3, wherein the controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information further comprises:
in response to the user grade information not meeting the preset grade condition, sending value transfer information to a user terminal corresponding to the pickup information;
and in response to receiving value transfer completion information corresponding to the value transfer information, controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information.
5. The method of claim 1, wherein the controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information comprises:
generating pickup path information according to the article position information and the unmanned vehicle position information;
and controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the pickup path information.
6. The method of claim 5, wherein the controlling the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the pickup path information comprises:
according to the pickup path information, controlling the unmanned vehicle corresponding to the unmanned vehicle information to move to a position corresponding to the article position information;
and in response to receiving cabinet door opening information corresponding to the unmanned vehicle information, controlling the cabinet door corresponding to the cabinet door opening information to open.
7. The method of claim 6, wherein after the controlling the cabinet door corresponding to the cabinet door opening information to open, the method further comprises:
in response to receiving cabinet door closing information corresponding to the unmanned vehicle information, controlling the unmanned vehicle corresponding to the unmanned vehicle information to move to a pickup station.
8. The method of claim 7, wherein after the controlling the unmanned vehicle corresponding to the unmanned vehicle information to move to a pickup station, the method further comprises:
generating mailing information according to the unmanned vehicle information and station position information corresponding to the pickup station;
and sending the mailing information to an associated terminal device.
9. A device for controlling unmanned vehicle pickup, comprising:
an acquisition unit configured to acquire pickup information, wherein the pickup information includes article position information;
a first determining unit configured to determine a pickup mode according to the article position information;
a second determining unit configured to, in response to the pickup mode representing an unmanned vehicle pickup mode, determine unmanned vehicle information according to the pickup information, wherein the unmanned vehicle information comprises unmanned vehicle position information;
and a control unit configured to control the unmanned vehicle corresponding to the unmanned vehicle information to perform the pickup operation according to the article position information and the unmanned vehicle position information.
10. An electronic device, comprising:
one or more processors;
a storage device having one or more programs stored thereon,
wherein the one or more programs, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1 to 8.
11. A computer readable medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the method of any one of claims 1 to 8.
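For readers who want the sequencing recited in claims 5 to 8 spelled out, the following sketch strings the claimed steps together in Python. The `vehicle` and `notifier` interfaces, every method on them, and the path representation are assumptions made for the example only and are not defined by the disclosure.

```python
from typing import Sequence, Tuple

Position = Tuple[float, float]

def generate_pickup_path(start: Position, destination: Position,
                         waypoints: Sequence[Position] = ()) -> list:
    # Assumption: a path is simply start, optional waypoints, destination;
    # a real planner would search a road network instead.
    return [start, *waypoints, destination]

def run_unmanned_pickup(vehicle, article_position: Position,
                        station_position: Position, notifier) -> dict:
    # Claims 5 and 6: generate pickup path information and move to the article position.
    path = generate_pickup_path(vehicle.position, article_position)
    vehicle.follow(path)

    # Claim 6: open the requested cabinet door once door-opening information arrives.
    door_id = vehicle.wait_for_door_open_request()
    vehicle.open_door(door_id)

    # Claim 7: after door-closing information arrives, drive to the pickup station
    # (vehicle.position is assumed to be updated by follow()).
    vehicle.wait_for_door_close_event()
    vehicle.follow(generate_pickup_path(vehicle.position, station_position))

    # Claim 8: build mailing information and send it to an associated terminal device.
    mailing_info = {"vehicle_id": vehicle.vehicle_id,
                    "station_position": station_position}
    notifier.send(mailing_info)
    return mailing_info
```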
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210629224.3A | 2022-06-01 | 2022-06-01 | Method, device, electronic equipment and computer readable medium for controlling unmanned vehicle to get parts |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210629224.3A | 2022-06-01 | 2022-06-01 | Method, device, electronic equipment and computer readable medium for controlling unmanned vehicle to get parts |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117217637A (en) | 2023-12-12 |
Family
ID=89033969
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210629224.3A | Method, device, electronic equipment and computer readable medium for controlling unmanned vehicle to get parts | 2022-06-01 | 2022-06-01 |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN117217637A (en) |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9426610B2 (en) | Applying mesh network to luggage | |
CN109767130A (en) | Method for controlling a vehicle and device | |
CN111311193B (en) | Method and device for configuring public service resources | |
CN109816816A (en) | Method and apparatus for obtaining data | |
CN115326099A (en) | Local path planning method and device, electronic equipment and computer readable medium | |
CN113988485B (en) | Site arrival amount prediction method and device, electronic equipment and computer readable medium | |
CN117217637A (en) | Method, device, electronic equipment and computer readable medium for controlling unmanned vehicle to get parts | |
CN113722369A (en) | Method, device, equipment and storage medium for predicting field monitoring data | |
CN109819026B (en) | Method and device for transmitting information | |
CN118172139A (en) | Service processing method based on alliance chain and related device | |
CN116700250A (en) | Path planning method, computing device, apparatus, device and medium | |
CN116088528A (en) | Sample rapid inspection method, device, electronic equipment and computer readable medium | |
CN115061386A (en) | Intelligent driving automatic simulation test system and related equipment | |
CN111835917A (en) | Method, device and equipment for showing activity range and computer readable medium | |
CN116424758A (en) | Intelligent storage method and device for hotel container, electronic equipment and medium | |
CN116934188A (en) | Robot distribution result recognition method, apparatus, electronic device and medium | |
CN117494295B (en) | A rail transit operation and maintenance method, system, electronic device and storage medium based on BIM | |
CN119515254A (en) | Method, device, electronic device and computer-readable medium for sending item information | |
CN116957245A (en) | Hotel luggage management method, device, electronic equipment and medium | |
CN115924386A (en) | Robot distribution method, device, electronic device and medium | |
CN116739453A (en) | Distribution task execution method and device, electronic equipment and readable medium | |
CN116562521A (en) | Article scheduling apparatus control method, apparatus, and computer readable medium | |
CN116119235A (en) | Robot article conveying method, apparatus, electronic device, and medium | |
CN116362502A (en) | Robot calling method, device, electronic equipment and medium | |
CN117745158A (en) | Article distribution method for unmanned aerial vehicle, vehicle-mounted controller and unmanned aerial vehicle |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||