
CN109074408B - Map loading method and device, electronic equipment and readable storage medium - Google Patents


Info

Publication number
CN109074408B
CN109074408B (application CN201880001292.7A)
Authority
CN
China
Prior art keywords
environment
illumination information
time period
image data
illumination
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201880001292.7A
Other languages
Chinese (zh)
Other versions
CN109074408A (en)
Inventor
易万鑫
廉士国
林义闽
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Cloudminds Shanghai Robotics Co Ltd
Original Assignee
Cloudminds Robotics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cloudminds Robotics Co Ltd
Publication of CN109074408A
Application granted
Publication of CN109074408B
Legal status: Active (current)
Anticipated expiration

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/38Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804Creation or updating of map data
    • G01C21/3833Creation or updating of map data characterised by the source of data
    • G01C21/3837Data obtained from a single source

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)
  • Navigation (AREA)

Abstract

The present application relates to the field of computer vision, and in particular, to a method and an apparatus for map loading, an electronic device, and a readable storage medium. The map loading method comprises the following steps: acquiring current image data of an environment; calculating first illumination information corresponding to the environment at present according to the current image data of the environment; determining the current time period of the environment according to the first illumination information; and determining a positioning map to be loaded and loading the positioning map according to the current time period of the environment and the corresponding relation between the time period and the positioning map. The map loading method can accurately select the positioning map under the condition that the current illumination of the environment changes, and improve the success rate and the accuracy rate of positioning the images shot in real time.

Description

Map loading method and device, electronic equipment and readable storage medium
Technical Field
The present application relates to the field of computer vision, and in particular, to a method and an apparatus for map loading, an electronic device, and a readable storage medium.
Background
When an intelligent robot or an unmanned vehicle is to complete simple or complex tasks in an unknown environment, it needs to know map information of the whole environment. By acquiring information about the unknown environment, a map of the environment is established to facilitate positioning of the intelligent robot or the unmanned vehicle. Only when mapping and positioning succeed can navigation and other functions of the robot be guaranteed.
The inventor finds, in the process of studying the prior art, that visual simultaneous localization and mapping (VSLAM) technology is currently commonly used for mapping. However, VSLAM relies on image processing and is therefore very sensitive to illumination intensity and easily affected by illumination. If the current illumination of the environment changes, the difference between the illumination of the currently loaded positioning map and the current illumination of the environment becomes large, so that the currently loaded positioning map becomes unavailable and the positioning effect is seriously affected.
Therefore, how to replace or update the positioning map corresponding to the current illumination intensity of the environment under the condition that the current positioning map is unavailable due to the change of the illumination of the environment is a problem to be solved.
Disclosure of Invention
The technical problem to be solved by some embodiments of the present application is how to accurately select a positioning map under the condition that the current illumination of the environment changes, so as to improve the success rate and accuracy rate of positioning the image shot in real time.
One embodiment of the present application provides a method for map loading, including: acquiring current image data of an environment; calculating first illumination information corresponding to the environment at present according to the current image data of the environment; determining the current time period of the environment according to the first illumination information; and determining a positioning map to be loaded and loading the positioning map according to the current time period of the environment and the corresponding relation between the time period and the positioning map.
An embodiment of the present application further provides a map loading apparatus, including: the map loading system comprises an acquisition module, a first determination module, a second determination module and a map loading module; the acquisition module is used for acquiring current image data of the environment; the first determining module is used for calculating first illumination information corresponding to the environment at present according to the current image data of the environment; the second determining module is used for determining the current time period of the environment according to the first illumination information; the map loading module is used for determining the positioning map to be loaded and loading the positioning map according to the current time period of the environment and the corresponding relation between the time period and the positioning map.
An embodiment of the present application further provides an electronic device, including: at least one processor; and a memory communicatively coupled to the at least one processor; the memory stores instructions executable by the at least one processor, and the instructions are executed by the at least one processor to enable the at least one processor to execute the map loading method.
The embodiment of the application also provides a computer readable storage medium, which stores a computer program, and the computer program realizes the map loading method when being executed by a processor.
Compared with the prior art, in the embodiments of the present application the current time period of the environment is determined according to the current first illumination information of the environment, so that the positioning map corresponding to the current time period of the environment can be determined. Because the illumination change in the environment corresponds to the time period of the environment, and the positioning map also corresponds to the time period, the positioning map matching the current first illumination information of the environment can be determined accurately even when the current illumination of the environment changes. As a result, the illumination difference between the currently loaded positioning map and the environment in the current time period is small or even absent, which improves the success rate and accuracy of positioning the images shot in real time.
Drawings
One or more embodiments are illustrated by way of example in the figures of the accompanying drawings, in which like reference numerals refer to similar elements; the figures are not to scale unless otherwise specified.
Fig. 1 is a detailed flowchart of a map loading method in a first embodiment of the present application;
fig. 2 is a schematic flowchart of a specific process of constructing a positioning map of an environment, determining a correspondence between the positioning map of the environment and a time period, and determining a correspondence between illumination information of the environment and the time period in the first embodiment of the present application;
FIG. 3 is a detailed flowchart of a method for loading a map according to a second embodiment of the present application;
fig. 4 is a schematic flow chart illustrating a specific process of constructing a positioning map of an environment and determining a correspondence between the positioning map of the environment and a time period in a second embodiment of the present application;
FIG. 5 is a schematic structural diagram of an apparatus for map loading according to a third embodiment of the present application;
fig. 6 is a schematic structural diagram of an electronic device in a fourth embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application clearer, some embodiments of the present application will be described in detail below with reference to the accompanying drawings. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit it. Those of ordinary skill in the art will appreciate that numerous technical details are set forth in the various embodiments in order to provide a better understanding of the present application; however, the technical solutions claimed in the present application can also be implemented without these technical details or with various changes and modifications based on the following embodiments.
A first embodiment of the present application relates to a map loading method, which may be applied to an electronic device, such as an unmanned vehicle, an intelligent robot, or the like, that constructs a map using VSLAM technology. The map loading method specifically comprises the following processes as shown in fig. 1:
step 101: current image data of an environment is acquired.
Specifically, the image data of the environment at the current time may be obtained by using a sensor, for example a camera, where the image data includes M images, M is an integer greater than 1, and the captured contents of the multiple images cover a wide area of the environment. For example, assuming that the environment is an outdoor stadium, the stadium may be photographed every 1 meter, and 5 images of the stadium are acquired in the current time period (assuming that 5 images are photographed in one time period); these 5 photographed images are taken as the image data of the stadium at the current time. It can be understood that a wide-angle lens may be adopted to increase the shooting range of the camera so that the shot images contain richer content; the type of the camera is not limited in this embodiment.
In a specific implementation, before acquiring current image data of an environment, a positioning map of the environment needs to be constructed, a corresponding relationship between the positioning map of the environment and a time period is determined, and a corresponding relationship between illumination information of the environment and the time period is determined, where a specific flow is shown in fig. 2.
Step 2011: image data of an environment is acquired at each of N cycles, N being an integer greater than 0.
Specifically, one day may be taken as a cycle and divided into m time periods, where m is an integer greater than 1. The N cycles may be N consecutive cycles or N non-consecutive cycles; this embodiment takes N consecutive cycles as an example for description.
In a specific implementation, image data of the environment is acquired in the same time periods of N consecutive cycles, where N is an integer greater than 0, and M images of the environment are acquired at intervals of a preset distance in each time period, where M is an integer greater than 1.
Specifically, whether the environment to be collected is an indoor environment or an outdoor environment, the illumination intensity of the environment changes over time. For example, if the environment to be collected is a road in a park, the illumination intensity of the road in the daytime is greater than that at night. A day may be divided into m time periods, m being an integer greater than 1; for example, with m equal to 2, a day is divided into a daytime period and a night time period. Image data of the environment is then acquired continuously in the same time periods of N cycles; for example, image data of the park road in the daytime period and in the night time period may be acquired for 10 consecutive days.
Collecting the image data of the environment in the same time periods of N consecutive cycles improves the accuracy of the corresponding relationship between the time periods and the positioning maps of the environment, and the accuracy of the corresponding relationship between the time periods and the illumination information.
For example, if image data of a stadium in the daytime period is collected, one image of the stadium can be collected every 1 meter, obtaining M images of the stadium for the current time period in total, where M can be determined according to the size of the stadium and the field of view of the camera; of course, the preset acquisition distance can be reduced to improve the accuracy of the constructed map.
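By way of a non-limiting illustration of this acquisition step, the following Python sketch captures one grayscale frame each time the platform has travelled a further preset distance, until M images have been gathered; OpenCV is assumed for capture, and the function name and the odometry callback are hypothetical placeholders rather than part of the described method:

```python
import cv2


def collect_period_images(camera_index, preset_distance_m, num_images, traveled_distance):
    """Capture one image each time the platform has moved another
    preset_distance_m metres, until num_images (M) frames have been
    gathered for the current time period.

    traveled_distance is a caller-supplied callable returning the cumulative
    distance travelled (e.g. from wheel odometry); it stands in for whatever
    odometry source the platform actually provides.
    """
    cap = cv2.VideoCapture(camera_index)
    images, next_capture_at = [], 0.0
    while len(images) < num_images:
        ok, frame = cap.read()
        if not ok:
            break  # camera stream ended or failed
        if traveled_distance() >= next_capture_at:
            images.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
            next_capture_at += preset_distance_m
    cap.release()
    return images
```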
Step 2012: and respectively constructing a positioning map of the environment according to the image data acquired at each time interval and determining the corresponding relation between the time intervals and the positioning map of the environment.
Specifically, with the VSLAM technology, a positioning map of the environment is constructed while the image data of the environment is acquired, and when the construction of the positioning map is completed, a corresponding relationship between the positioning map and the time period is established. For example, if image data of a stadium is acquired in time period t1, the positioning map constructed by the VSLAM technology uses "t1" as its identifier, thereby establishing the corresponding relationship between the time period and the positioning map. It can be understood that the corresponding relationship between the time period and the positioning map of the environment may also be established in other manners, which are not listed one by one in this embodiment.
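As a minimal, purely illustrative sketch of such a corresponding relationship, assuming Python (the helper name and the map file path are hypothetical), the time-period identifier can simply be used as the key under which the constructed positioning map is stored:

```python
from pathlib import Path


def register_positioning_map(period_id, map_file, correspondence):
    """Record that the positioning map built from the image data of one
    time period (e.g. "t1") belongs to that period, mirroring the use of
    the period label as the map identifier described above."""
    correspondence[period_id] = Path(map_file)


# e.g. correspondence = {}
#      register_positioning_map("t1", "maps/stadium_t1.bin", correspondence)
```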
Step 2013: and respectively calculating second illumination information corresponding to the environment in different time periods according to the image data acquired in each time period.
Specifically, the illumination information may be a brightness value or a gray value of the acquired image, and the brightness value is used as the illumination information in this embodiment.
In a specific implementation, the second illumination information corresponding to the environment in one time period is calculated as follows: third illumination information corresponding to the environment in the same time period of each of the N cycles is determined according to the image data acquired in that time period; the average value of the third illumination information corresponding to the environment in the same time period of the N cycles is calculated; and the average value of the third illumination information is taken as the second illumination information corresponding to the environment in that time period. The image data acquired in the same time period of each cycle is processed as follows: the average value of the fourth illumination information corresponding to the M images acquired in the time period is calculated, and this average value is taken as the third illumination information corresponding to the environment in that time period of the cycle. The fourth illumination information corresponding to one image is calculated according to the total number of pixel points contained in the image and the illumination information of each pixel point.
The following will specifically describe a process of calculating the second illumination information corresponding to each of the different periods of time in the environment.
Taking one day as a cycle, a day is divided into m time periods, and M images of the environment are acquired in each time period. The images included in the image data are identified by a sequence Pij, where i denotes the day number, j denotes the sequence number of the time period, and the total number of acquisition days is N. The fourth illumination information h0 corresponding to one image Pij is then calculated as Formula 2.1:

h0 = (1/B) · Σ_(x,y) log( f(x,y) + δ )    (Formula 2.1)

wherein f(x, y) represents the illumination information of each pixel point of the image, i.e., the pixel value of each pixel point, B represents the total number of pixel points contained in the image, and δ represents a non-zero positive number approaching zero, e.g., 0.001, which prevents the logarithm from tending to negative infinity. Through Formula 2.1, the fourth illumination information corresponding to the image Pij can be calculated.
The average value of the fourth illumination information corresponding to the M images in each time period is then calculated, namely:

h_ij = (1/M) · Σ_(k=1..M) h0_k    (Formula 2.2)

wherein h_ij represents the illumination information of the environment in time period j of the day numbered i, i.e., the third illumination information corresponding to the environment in one time period of one cycle, and h0_k denotes the fourth illumination information of the k-th of the M images acquired in that time period.
The average value of the third illumination information corresponding to the environment in the same time period of the N cycles is then calculated, that is:

H_j = (1/N) · Σ_(i=1..N) h_ij    (Formula 2.3)

wherein H_j represents the second illumination information corresponding to the environment in the same time period j.
Similarly, according to Formula 2.1, Formula 2.2 and Formula 2.3, the second illumination information corresponding to the environment in each of the other time periods can be calculated.
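By way of illustration only, the calculation of Formulas 2.1 to 2.3 can be sketched in Python as follows, assuming NumPy and grayscale images stored as 2-D arrays; the function names are introduced here purely for illustration and are not part of the described method:

```python
import numpy as np

DELTA = 0.001  # the small positive constant delta of Formula 2.1, preventing log(0)


def fourth_illumination(image_gray):
    """Formula 2.1: log-average of the pixel values f(x, y) of one image,
    B being the total number of pixel points."""
    pixels = image_gray.astype(np.float64)
    return float(np.sum(np.log(pixels + DELTA)) / pixels.size)


def third_illumination(period_images):
    """Formula 2.2: average of the fourth illumination information of the
    M images acquired in one time period of one cycle (h_ij)."""
    return float(np.mean([fourth_illumination(img) for img in period_images]))


def second_illumination(same_period_over_cycles):
    """Formula 2.3: average of h_ij over the N cycles for the same time
    period j, giving H_j."""
    return float(np.mean([third_illumination(imgs) for imgs in same_period_over_cycles]))


# Example layout: data[i][j] holds the M grayscale images of cycle i, time period j,
# so H_j = second_illumination([data[i][j] for i in range(N)]).
```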
Step 102: and calculating first illumination information currently corresponding to the environment according to the current image data of the environment.
In a specific implementation, the current image data of the environment comprises M images; fourth illumination information corresponding to each image in the current image data of the environment is calculated respectively, and the first illumination information currently corresponding to the environment is determined according to the fourth illumination information corresponding to each image.
Specifically, the process of calculating the fourth illumination information corresponding to each image in the current image data of the environment is substantially the same as that in step 2013, that is, the fourth illumination information corresponding to each image can be calculated according to Formula 2.1, and the specific calculation process is not repeated here. The calculated fourth illumination information of each image in the current image data is then substituted into Formula 2.2 to obtain the first illumination information currently corresponding to the environment.
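As a brief illustrative sketch of this step, assuming Python with NumPy and grayscale images (the function name is hypothetical, and the per-image computation simply repeats Formula 2.1 before the values are averaged as in Formula 2.2):

```python
import numpy as np


def first_illumination(current_images, delta=0.001):
    """First illumination information of the environment: the average over
    the current M grayscale images of the per-image log-average pixel value
    (Formula 2.1 per image, combined as in Formula 2.2)."""
    per_image = [float(np.sum(np.log(img.astype(np.float64) + delta)) / img.size)
                 for img in current_images]
    return float(np.mean(per_image))
```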
Step 103: and determining the current time period of the environment according to the first illumination information.
In a specific implementation, the first illumination information is respectively subtracted from second illumination information corresponding to the environment in different time periods, and the current time period of the environment is determined according to the difference result.
Specifically, the first illumination information is differenced from the second illumination information corresponding to the environment in each of the different time periods, and the time period corresponding to the minimum difference is taken as the current time period of the environment. Denoting the first illumination information by h, the difference between the first illumination information and the second illumination information corresponding to the environment in time period j is:

ΔH_j = | h − H_j |

wherein H_j represents the second illumination information corresponding to the environment in time period j. The value of j for which ΔH_j is minimum is selected, and the time period corresponding to that minimum difference is determined as the current time period of the environment.
Step 104: and determining a positioning map to be loaded and loading the positioning map according to the current time period of the environment and the corresponding relation between the time period and the positioning map.
Compared with the prior art, in the embodiments of the present application the current time period of the environment is determined according to the current first illumination information of the environment, so that the positioning map corresponding to the current time period of the environment can be determined. Because the illumination change in the environment corresponds to the time period of the environment, and the positioning map also corresponds to the time period, the positioning map matching the current first illumination information of the environment can be determined accurately even when the current illumination of the environment changes. As a result, the illumination difference between the currently loaded positioning map and the environment in the current time period is small or even absent, which improves the success rate and accuracy of positioning the images shot in real time.
The second embodiment of the present application relates to a map loading method, and is substantially the same as the first embodiment, and mainly differs in that before acquiring current image data of an environment, the present embodiment does not need to calculate second illumination information corresponding to the environment at different time periods according to image data acquired at different time periods. The embodiment is applied to an environment with an illumination device, and the specific flow is as shown in fig. 3:
step 301: current image data of an environment is acquired.
Specifically, this step is substantially the same as step 101 in the first embodiment, and is not described here again.
It should be noted that before acquiring the current image data of the environment, a positioning map of the environment needs to be constructed, and a corresponding relationship between the positioning map of the environment and a time period is determined, and a specific flow is shown in fig. 4:
step 4011: image data of an environment is acquired at each of N cycles, N being an integer greater than 0.
Step 4012: and respectively constructing a positioning map of the environment according to the image data acquired at each time interval and determining the corresponding relation between the time intervals and the positioning map of the environment.
Steps 4011 and 4012 are substantially the same as steps 2011 and 2012 in the first embodiment, and will not be described here.
Step 302: and identifying the illumination devices in the current M images of the environment according to the current M images of the environment.
Specifically, the current image data of the environment comprises M images, wherein M is an integer greater than 1; and (3) acquiring the characteristic points in the M images by adopting a deep learning mode, and identifying the illumination device in each image. The illumination device includes various types of lamps.
Step 303: and calculating fifth illumination information of the illumination device in any image of the current M images of the environment or within a preset range of the illumination device, and taking the calculated fifth illumination information of the illumination device or within the preset range of the illumination device as the first illumination information corresponding to the environment at present.
Specifically, the fifth illumination information of the illumination device in any one of the current M images of the environment is calculated; the fifth illumination information h1 is calculated with reference to Formula 2.1, for example:

h1 = (1/W) · Σ_(x,y) log( f(x,y) + δ )

wherein W is the total number of pixel points belonging to the illumination device in the image, and f(x, y) represents the pixel value of a pixel point of the illumination device in the image. It can be understood that other ways of calculating the illumination information may also be employed.
Alternatively, fifth illumination information within a preset range of an illumination device in any one of the current M images of the environment is calculated, where the preset range of the illumination device may be an area centered on the centroid of the illumination device in the image with a preset number of pixels as the radius, or another area around the illumination device. For example, a circular area centered on the centroid of a fluorescent lamp with a radius of 8 pixels may be used as the preset range of the fluorescent lamp. The way of calculating the fifth illumination information within the preset range of the illumination device is substantially the same as the way of calculating the fifth illumination information of the illumination device itself, and is not described again here.
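As an illustrative sketch of this calculation, assuming Python with NumPy, a grayscale image, and a lamp centroid supplied by the detection of step 302 (the function name and coordinate convention are hypothetical; the default 8-pixel radius follows the fluorescent-lamp example above):

```python
import numpy as np


def fifth_illumination_around_lamp(image_gray, centroid_xy, radius_px=8, delta=0.001):
    """Log-average pixel value (Formula 2.1 applied to the W pixels of the
    region) over a circular area of radius_px pixels around the centroid
    (x, y) of a detected illumination device."""
    height, width = image_gray.shape
    ys, xs = np.ogrid[:height, :width]
    cx, cy = centroid_xy  # x is the column coordinate, y the row coordinate
    mask = (xs - cx) ** 2 + (ys - cy) ** 2 <= radius_px ** 2
    region = image_gray[mask].astype(np.float64)
    return float(np.sum(np.log(region + delta)) / region.size)
```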
Step 304: and judging whether the fifth illumination information of the illumination device or within the preset range of the illumination device exceeds a preset illumination threshold value, if so, executing the step 305, otherwise, executing the step 306.
Specifically, the different time periods include a day time period and a night time period, and the preset illumination threshold value may be determined according to illumination information of the illumination device in the day time period and illumination information in the night time period. If the fifth illumination information in the illumination device or the preset range of the illumination device exceeds a preset illumination threshold, the illumination device is turned on, if the illumination in the illumination device or the preset range of the illumination device is relatively bright and has a high brightness value, the current night time period of the environment can be determined, and if the fifth illumination information in the preset range of the illumination device or the illumination device does not exceed the preset illumination threshold, the illumination device is turned off, and if the illumination in the preset range of the illumination device or the illumination device is relatively dark and has a low brightness value, the current day time period of the environment can be determined.
Step 305: it is determined that the environment is currently in the evening hours.
Step 306: it is determined that the environment is currently in the daytime hours.
Step 307: and determining a positioning map to be loaded and loading the positioning map according to the current time period of the environment and the corresponding relation between the time period and the positioning map.
Compared with the prior art, the map loading method provided by the embodiment identifies the illumination device in the current image data of the environment, and judges whether the illumination device is turned on according to the fifth illumination information of the identified illumination device or the fifth illumination information within the preset range of the illumination device, so that the current time period of the environment is determined, and the current corresponding positioning map of the environment is further determined.
A third embodiment of the present application relates to a map loading apparatus 50, comprising: an acquisition module 501, a first determination module 502, a second determination module 503 and a map loading module 504; the specific structure of the map loading device is shown in fig. 5:
the obtaining module 501 is configured to obtain current image data of an environment; the first determining module 502 is configured to calculate first illumination information corresponding to the current environment according to the current image data of the environment; the second determining module 503 is configured to determine a current time period of the environment according to the first illumination information; the map loading module 504 is configured to determine a positioning map to be loaded and load the positioning map according to the current time period of the environment and the corresponding relationship between the time period and the positioning map.
The present embodiment is an embodiment of a virtual device corresponding to the map loading method, and technical details in the foregoing embodiment of the method are still applicable in the present embodiment, and are not described herein again.
It should be noted that the above apparatus embodiment is merely illustrative and does not limit the scope of the present application. In practical applications, a person skilled in the art may select some or all of the modules to achieve the purpose of the embodiment according to actual needs, which is not limited herein.
A fourth embodiment of the present application relates to an electronic device, the structure of which is shown in fig. 6. The electronic device includes: at least one processor 601; and a memory 602 communicatively coupled to the at least one processor 601. The memory 602 stores instructions executable by the at least one processor 601, and the instructions are executed by the at least one processor 601 to enable the at least one processor 601 to perform the map loading method described above.
In this embodiment, the processor is exemplified by a central processing unit (CPU), and the memory is exemplified by a random access memory (RAM). The processor and the memory may be connected by a bus or in another manner; fig. 6 takes a bus connection as an example. The memory, as a non-volatile computer-readable storage medium, may be used to store non-volatile software programs, non-volatile computer-executable programs and modules, such as the positioning maps stored in the memory in the embodiments of the present application. The processor executes various functional applications and data processing of the device by running the non-volatile software programs, instructions and modules stored in the memory, thereby implementing the map loading method described above.
The memory may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store a list of options, etc. Further, the memory may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some embodiments, the memory optionally includes memory located remotely from the processor, and such remote memory may be connected to the external device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
One or more modules are stored in the memory and, when executed by the one or more processors, perform the method of map loading in any of the method embodiments described above.
The above product can execute the map loading method provided by the embodiments of the present application, and has the corresponding functional modules and beneficial effects of executing the method. For technical details not described in detail in this embodiment, reference may be made to the map loading method provided by the embodiments of the present application.
A fifth embodiment of the present application relates to a computer-readable storage medium storing computer instructions that enable a computer to perform the map loading method described in the first or second method embodiment of the present application.
That is, as can be understood by those skilled in the art, all or part of the steps of the methods in the embodiments described above may be implemented by a program instructing related hardware. The program is stored in a storage medium and includes several instructions to enable a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
It will be understood by those of ordinary skill in the art that the foregoing embodiments are specific examples for carrying out the present application, and that various changes in form and details may be made therein without departing from the spirit and scope of the present application in practice.

Claims (11)

1. A method of map loading, comprising:
acquiring current image data of an environment;
calculating first illumination information corresponding to the environment at present according to the current image data of the environment;
determining the current time period of the environment according to the first illumination information;
determining a positioning map to be loaded and loading the positioning map according to the current time period of the environment and the corresponding relation between the time period and the positioning map;
the current image data of the environment comprises M images;
calculating first illumination information corresponding to the environment at present according to the current image data of the environment, specifically comprising:
respectively calculating fourth illumination information corresponding to each image in the current image data of the environment;
determining first illumination information corresponding to the environment at present according to fourth illumination information corresponding to each image;
or,
the current image data of the environment comprises M images, wherein M is an integer larger than 1;
calculating first illumination information corresponding to the environment at present according to the current image data of the environment, specifically comprising:
identifying the illumination devices in the current M images of the environment according to the current M images of the environment;
calculating fifth illumination information of the illumination device or within a preset range of the illumination device in any image of the current M images of the environment, and taking the calculated fifth illumination information of the illumination device or within the preset range of the illumination device as the first illumination information corresponding to the environment at present.
2. The method of map loading according to claim 1, wherein prior to acquiring current image data of an environment, the method of map loading further comprises:
acquiring image data of the environment at each time period of N cycles, wherein N is an integer greater than 0;
and respectively constructing a positioning map of the environment according to the image data acquired at each time interval and determining the corresponding relation between the time intervals and the positioning map of the environment.
3. The map loading method of claim 2, wherein acquiring the image data of the environment at each time period of the N cycles specifically comprises:
acquiring image data of the environment at the same time period of N continuous cycles;
and acquiring M images of the environment at intervals of a preset distance in each time interval, wherein M is an integer greater than 1.
4. The map loading method of claim 3, wherein after acquiring the image data of the environment at each time period of the N cycles and before acquiring the current image data of the environment, the map loading method further comprises:
and respectively calculating second illumination information corresponding to the environment in different time periods according to the image data acquired in each time period.
5. The map loading method according to claim 4, wherein the calculating, according to the image data acquired at each time interval, the second illumination information corresponding to the environment at different time intervals respectively comprises:
the following processing is carried out on the image data acquired by the environment in the same period of N cycles:
determining third illumination information corresponding to the environment in the same time period of the N periods according to image data acquired in the same time period of the N periods;
calculating the average value of the third illumination information corresponding to the environment in the same time interval of N periods;
and taking the average value of the third illumination information as the second illumination information corresponding to the environment in the same time interval.
6. The map loading method according to claim 5, wherein determining, according to the image data acquired at the same time period of the N cycles, third illumination information corresponding to the environment at the same time period of the N cycles, specifically comprises:
respectively carrying out the following processing on the image data acquired in the same time period in each period:
calculating the average value of the fourth illumination information corresponding to the M images in the time period;
taking the average value of the fourth illumination information as the third illumination information corresponding to the environment in the same time period of one cycle;
and the fourth illumination information corresponding to one image is determined by calculation according to the total number of pixel points contained in the image and the illumination information of each pixel point.
7. The map loading method according to any one of claims 4 to 6, wherein determining, according to the first illumination information, a current time period of the environment specifically includes:
and respectively subtracting the first illumination information from the second illumination information corresponding to the environment in different time periods, and determining the current time period of the environment according to the difference result.
8. The method of map loading according to claim 1 or 3, wherein the time period comprises a day time period and a night time period;
determining the current time period of the environment according to the first illumination information, specifically comprising:
and judging whether the illumination device or fifth illumination information in a preset range of the illumination device exceeds a preset illumination threshold value, if so, determining that the environment is currently in the night time period, and otherwise, determining that the environment is currently in the day time period.
9. An apparatus for map loading, comprising: the map loading system comprises an acquisition module, a first determination module, a second determination module and a map loading module;
the acquisition module is used for acquiring current image data of the environment;
the first determining module is used for calculating first illumination information corresponding to the environment at present according to the current image data of the environment;
the second determining module is used for determining the current time period of the environment according to the first illumination information;
the map loading module is used for determining a positioning map to be loaded and loading the positioning map according to the current time period of the environment and the corresponding relation between the time period and the positioning map;
the current image data of the environment comprises M images; the first determining module is specifically configured to calculate fourth illumination information corresponding to each image in the current image data of the environment respectively; determining first illumination information corresponding to the environment at present according to fourth illumination information corresponding to each image;
or the current image data of the environment comprises M images, wherein M is an integer greater than 1; the first determining module is specifically configured to identify, according to the current M images of the environment, an illumination device in the current M images of the environment; calculating fifth illumination information of the illumination device or within a preset range of the illumination device in any image of the current M images of the environment, and taking the calculated fifth illumination information of the illumination device or within the preset range of the illumination device as the first illumination information corresponding to the environment at present.
10. An electronic device, comprising:
at least one processor; and the number of the first and second groups,
a memory communicatively coupled to the at least one processor; wherein,
the memory stores instructions executable by the at least one processor to enable the at least one processor to perform a method of map loading as claimed in any one of claims 1 to 8.
11. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method of map loading according to any one of claims 1 to 8.
CN201880001292.7A 2018-07-16 2018-07-16 Map loading method and device, electronic equipment and readable storage medium Active CN109074408B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/095824 WO2020014832A1 (en) 2018-07-16 2018-07-16 Map loading method and device, electronic apparatus, and readable storage medium

Publications (2)

Publication Number Publication Date
CN109074408A CN109074408A (en) 2018-12-21
CN109074408B (en) 2022-04-08

Family

ID=64789328

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880001292.7A Active CN109074408B (en) 2018-07-16 2018-07-16 Map loading method and device, electronic equipment and readable storage medium

Country Status (2)

Country Link
CN (1) CN109074408B (en)
WO (1) WO2020014832A1 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110211027A (en) * 2019-04-30 2019-09-06 北京云迹科技有限公司 Map data processing method and device for robot
CN111652934B (en) * 2020-05-12 2023-04-18 Oppo广东移动通信有限公司 Positioning method, map construction method, device, equipment and storage medium
CN112488007B (en) * 2020-12-04 2023-10-13 深圳市优必选科技股份有限公司 Visual positioning method, device, robot and storage medium
WO2022116156A1 (en) * 2020-12-04 2022-06-09 深圳市优必选科技股份有限公司 Visual positioning method, robot, and storage medium
CN112697156A (en) * 2020-12-04 2021-04-23 深圳市优必选科技股份有限公司 Map library establishing method, robot, computer device and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017022401A1 (en) * 2015-08-04 2017-02-09 ヤマハ発動機株式会社 Information provision system
CN106796434A (en) * 2015-08-28 2017-05-31 松下电器(美国)知识产权公司 Ground drawing generating method, self-position presumption method, robot system and robot
CN107945224A (en) * 2017-11-07 2018-04-20 北京中科慧眼科技有限公司 Method and apparatus based on image detection illumination condition

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6288721B1 (en) * 1999-07-07 2001-09-11 Litton Systems, Inc. Rendering process and method for digital map illumination intensity shading
CN102596517B (en) * 2009-07-28 2015-06-17 悠进机器人股份公司 Control method for localization and navigation of mobile robot and mobile robot using same
TWI391874B (en) * 2009-11-24 2013-04-01 Ind Tech Res Inst Method and device of mapping and localization method using the same
US9001190B2 (en) * 2011-07-05 2015-04-07 Microsoft Technology Licensing, Llc Computer vision system and method using a depth sensor
US20150187127A1 (en) * 2012-07-19 2015-07-02 Google Inc. Varying map content and styles based on time
US20150193971A1 (en) * 2014-01-03 2015-07-09 Motorola Mobility Llc Methods and Systems for Generating a Map including Sparse and Dense Mapping Information
US9971320B2 (en) * 2014-07-03 2018-05-15 Google Llc Methods and systems for adaptive triggering of data collection
EP3224649B1 (en) * 2014-11-26 2023-04-05 iRobot Corporation Systems and methods for performing simultaneous localization and mapping using machine vision systems
JP6411917B2 (en) * 2015-02-27 2018-10-24 株式会社日立製作所 Self-position estimation apparatus and moving body
US20170287196A1 (en) * 2016-04-01 2017-10-05 Microsoft Technology Licensing, Llc Generating photorealistic sky in computer generated animation
CN106125730B (en) * 2016-07-10 2019-04-30 北京工业大学 A kind of robot navigation's map constructing method based on mouse cerebral hippocampal spatial cell
CN106767750B (en) * 2016-11-18 2020-12-18 北京光年无限科技有限公司 Navigation method and system for intelligent robot
CN107223244B (en) * 2016-12-02 2019-05-03 深圳前海达闼云端智能科技有限公司 Localization method and device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017022401A1 (en) * 2015-08-04 2017-02-09 ヤマハ発動機株式会社 Information provision system
CN106796434A (en) * 2015-08-28 2017-05-31 松下电器(美国)知识产权公司 Ground drawing generating method, self-position presumption method, robot system and robot
CN107945224A (en) * 2017-11-07 2018-04-20 北京中科慧眼科技有限公司 Method and apparatus based on image detection illumination condition

Also Published As

Publication number Publication date
CN109074408A (en) 2018-12-21
WO2020014832A1 (en) 2020-01-23

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20210203

Address after: 200245 2nd floor, building 2, no.1508, Kunyang Road, Minhang District, Shanghai

Applicant after: Dalu Robot Co.,Ltd.

Address before: 518000 Room 201, building A, No. 1, Qian Wan Road, Qianhai Shenzhen Hong Kong cooperation zone, Shenzhen, Guangdong (Shenzhen Qianhai business secretary Co., Ltd.)

Applicant before: Shenzhen Qianhaida Yunyun Intelligent Technology Co.,Ltd.

TA01 Transfer of patent application right
GR01 Patent grant
GR01 Patent grant
CP03 Change of name, title or address

Address after: 200245 Building 8, No. 207, Zhongqing Road, Minhang District, Shanghai

Patentee after: Dayu robot Co.,Ltd.

Address before: 200245 2nd floor, building 2, no.1508, Kunyang Road, Minhang District, Shanghai

Patentee before: Dalu Robot Co.,Ltd.

CP03 Change of name, title or address