Detailed Description
In order to make the objectives, technical solutions and advantages of the present application clearer, some embodiments of the present application are described in detail below with reference to the accompanying drawings. It should be understood that the specific embodiments described herein are merely illustrative of the present application and are not intended to limit it. Those of ordinary skill in the art will appreciate that numerous technical details are set forth in the various embodiments of the present application in order to provide a better understanding of the present application; however, the technical solutions claimed in the present application can be implemented without these technical details, and various changes and modifications may be made based on the following embodiments.
A first embodiment of the present application relates to a map loading method, which may be applied to an electronic device, such as an unmanned vehicle, an intelligent robot, or the like, that constructs a map using VSLAM technology. The map loading method specifically comprises the following processes as shown in fig. 1:
Step 101: current image data of an environment is acquired.
Specifically, the image data of the environment at the current time may be obtained by using a sensor, for example, a camera. The image data includes M images, where M is an integer greater than 1, and together the captured contents of the multiple images cover a wide area of the environment. For example, assuming that the environment is an outdoor gymnasium, the gymnasium may be photographed every 1 meter, so that 5 images of the gymnasium are acquired in the current time period (assuming that 5 images are captured in one time period), and these 5 images are taken as the image data of the gymnasium at the current time. It can be understood that a wide-angle lens may be adopted to increase the shooting range of the camera, so that each captured image contains richer content; the type of the camera is not limited in this embodiment.
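As an illustration only, the following minimal sketch acquires a batch of images from a camera using OpenCV; the function name, camera index and image count are assumptions of this example, not part of the claimed method.

```python
import cv2  # assumes an OpenCV-accessible camera; any camera SDK could be substituted

def acquire_current_images(camera_index=0, num_images=5):
    """Capture M images of the environment from the camera (hypothetical helper)."""
    cap = cv2.VideoCapture(camera_index)
    images = []
    for _ in range(num_images):
        ok, frame = cap.read()
        if ok:
            # keep a single-channel image so later brightness formulas see one value per pixel
            images.append(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    cap.release()
    return images  # the "current image data" of the environment
```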
In a specific implementation, before the current image data of the environment is acquired, a positioning map of the environment needs to be constructed, the correspondence between the positioning maps of the environment and the time periods needs to be determined, and the correspondence between the illumination information of the environment and the time periods needs to be determined. The specific flow is shown in fig. 2.
Step 2011: image data of the environment is acquired in each of N cycles, where N is an integer greater than 0.
Specifically, one day may be taken as one cycle, and each cycle is divided into m time periods, where m is an integer greater than 1. The N cycles may be N consecutive cycles or N non-consecutive cycles; this embodiment is described by taking N consecutive cycles as an example.
In a specific implementation, image data of the environment is acquired in the same time period of N consecutive cycles, where N is an integer greater than 0; in each time period, M images of the environment are acquired at intervals of a preset distance, where M is an integer greater than 1.
Specifically, whether the environment to be captured is indoor or outdoor, its illumination intensity changes over time. For example, if the environment to be captured is a road in a park, the illumination intensity of the road in the daytime is greater than that at night. A day may be divided into m time periods, where m is an integer greater than 1; for example, when m is 2, a day is divided into two time periods, a daytime period and a nighttime period. Image data of the environment is then acquired in the same time periods of N consecutive cycles, for example, image data of the park road in the daytime period and image data of the park road in the nighttime period may be acquired for 10 consecutive days.
The image data of the environment are collected at the same time period of N continuous periods, so that the accuracy of the corresponding relation between the time period and the positioning map of the environment and the accuracy of the corresponding relation between the time period and the illumination information are improved.
For example, when image data of a stadium in the daytime period is collected, one image of the stadium can be collected every 1 meter, so that M images of the stadium in the current time period are obtained in total, where M can be determined according to the size of the stadium and the field of view of the camera; of course, the preset acquisition distance can be reduced to improve the accuracy of the constructed map.
Step 2012: a positioning map of the environment is constructed according to the image data acquired in each time period, and the correspondence between the time periods and the positioning maps of the environment is determined.
Specifically, by using the VSLAM technology, a positioning map of the environment is constructed while the image data of the environment is acquired, and when the construction of the positioning map of the environment is completed, the correspondence between the positioning map and the time period is established. For example, if image data of a stadium is acquired in time period t1, the positioning map constructed by the VSLAM technology takes "t1" as its identifier, thereby establishing the correspondence between the time period and the positioning map. It can be understood that the correspondence between the positioning map of the environment and the time period may also be established in other manners, which are not listed one by one in this embodiment.
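Purely as a bookkeeping sketch (the period labels and file paths are invented for illustration), the correspondence between time periods and positioning maps can be kept in a simple mapping:

```python
# Hypothetical record of the time-period -> positioning-map correspondence.
period_to_map = {
    "t1": "maps/stadium_t1.bin",  # map built from the image data acquired in period t1
    "t2": "maps/stadium_t2.bin",  # map built from the image data acquired in period t2
}

def register_map(period_id: str, map_path: str) -> None:
    """Record which positioning map was built from the data of a given time period."""
    period_to_map[period_id] = map_path
```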
Step 2013: second illumination information corresponding to the environment in each of the different time periods is calculated according to the image data acquired in each time period.
Specifically, the illumination information may be a brightness value or a gray value of the acquired image, and the brightness value is used as the illumination information in this embodiment.
In a specific implementation, the second illumination information corresponding to the environment in one time period is calculated as follows: third illumination information corresponding to the environment in that time period of each of the N cycles is determined according to the image data acquired in that time period of the corresponding cycle; the average value of the N pieces of third illumination information is calculated; and this average value is taken as the second illumination information corresponding to the environment in that time period. The image data acquired in one time period of one cycle is processed as follows: the fourth illumination information corresponding to each of the M images acquired in that time period is calculated, the average value of these M pieces of fourth illumination information is calculated, and this average value is taken as the third illumination information corresponding to the environment in that time period of that cycle. The fourth illumination information corresponding to one image is calculated according to the total number of pixel points contained in the image and the illumination information of each pixel point.
The process of calculating the second illumination information corresponding to the environment in each of the different time periods is described in detail below.
Taking one day as a cycle, one day is divided into m time periods, and M images of the environment are acquired in each time period. The images contained in the image data are identified by the sequence Pij, where i represents the serial number of the day, j represents the serial number of the time period, and the total number of acquisition days is N. The fourth illumination information h0 corresponding to one image in Pij is calculated according to formula 2.1:

$h_0 = \frac{1}{B}\sum_{(x,y)}\log\left(f(x,y)+\delta\right)$ (formula 2.1)

where f(x, y) represents the illumination information of each pixel point of the image, i.e., the pixel value of each pixel point, B represents the total number of pixel points contained in the image, and δ is a non-zero positive number approaching zero, e.g., 0.001, used to prevent the result of the logarithm calculation from tending to negative infinity. With formula 2.1, the fourth illumination information corresponding to each image in Pij can be calculated.
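As an illustrative sketch only, formula 2.1 (as reconstructed above) can be evaluated for a grayscale image with NumPy; the function name and the 2-D array representation are assumptions of this example.

```python
import numpy as np

def fourth_illumination(image: np.ndarray, delta: float = 0.001) -> float:
    """Fourth illumination information of one image (formula 2.1 as reconstructed):
    the average of log(f(x, y) + delta) over the B pixel points of the image."""
    pixels = image.astype(np.float64)
    return float(np.mean(np.log(pixels + delta)))  # np.mean divides by B = image.size
```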
The average value of the fourth illumination information corresponding to the M images acquired in each time period is then calculated, namely:

$h_{ij} = \frac{1}{M}\sum_{k=1}^{M}h_{0}^{(k)}$ (formula 2.2)

where $h_{0}^{(k)}$ is the fourth illumination information of the k-th image acquired in the time period, and hij represents the illumination information of the environment in the j-th time period of the day with serial number i, i.e., the third illumination information corresponding to the environment in one time period of one cycle.
The average value of the third illumination information corresponding to the environment in the same time period of the N cycles is then calculated, namely:

$H_j = \frac{1}{N}\sum_{i=1}^{N}h_{ij}$ (formula 2.3)

where Hj represents the second illumination information corresponding to the environment in the same time period j.
Similarly, according to formula 2.1, formula 2.2 and formula 2.3, the second illumination information corresponding to the environment in each of the other time periods can be calculated.
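As a sketch under the assumptions above (images stored in a nested list indexed by cycle i, time period j and image index, reusing the fourth_illumination helper sketched earlier), formulas 2.2 and 2.3 reduce to two averages:

```python
import numpy as np  # fourth_illumination() is the helper sketched for formula 2.1

def second_illumination(images_by_cycle_and_period, delta=0.001):
    """Compute H_j for every time period j (formulas 2.2 and 2.3 as reconstructed).

    images_by_cycle_and_period[i][j] is the list of M images acquired in time
    period j of cycle i; this indexing scheme is an assumption of the sketch.
    """
    n_cycles = len(images_by_cycle_and_period)
    n_periods = len(images_by_cycle_and_period[0])
    H = []
    for j in range(n_periods):
        # formula 2.2: third illumination h_ij of period j in each cycle i
        h_ij = [np.mean([fourth_illumination(img, delta)
                         for img in images_by_cycle_and_period[i][j]])
                for i in range(n_cycles)]
        # formula 2.3: second illumination H_j is the average over the N cycles
        H.append(float(np.mean(h_ij)))
    return H  # H[j] is the second illumination information of time period j
```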
Step 102: first illumination information currently corresponding to the environment is calculated according to the current image data of the environment.
In a specific implementation, the current image data of the environment comprises M images; fourth illumination information corresponding to each image in the current image data of the environment is calculated respectively, and the first illumination information currently corresponding to the environment is determined according to the fourth illumination information corresponding to each image.
Specifically, the process of calculating the fourth illumination information corresponding to each image in the current image data of the environment is substantially the same as that in step 2013, that is, the fourth illumination information corresponding to each image can be calculated according to formula 2.1, and the specific calculation process is not repeated here. The fourth illumination information of each image in the current image data, obtained by this calculation, is then substituted into formula 2.2 to obtain the first illumination information currently corresponding to the environment.
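As a minimal sketch reusing the fourth_illumination helper above (an assumed name), the first illumination information is the average of the fourth illumination information of the current M images:

```python
import numpy as np  # fourth_illumination() as sketched for formula 2.1

def first_illumination(current_images, delta=0.001):
    """First illumination information of the environment: formula 2.1 per image,
    then the average over the current M images (formula 2.2)."""
    return float(np.mean([fourth_illumination(img, delta) for img in current_images]))
```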
Step 103: the current time period of the environment is determined according to the first illumination information.
In a specific implementation, the difference between the first illumination information and the second illumination information corresponding to the environment in each of the different time periods is calculated, and the current time period of the environment is determined according to the difference results.
Specifically, the difference between the first illumination information and the second illumination information corresponding to the environment in each time period is calculated, and the time period corresponding to the smallest difference is taken as the current time period of the environment. The difference between the first illumination information and the second illumination information corresponding to the environment in time period j is:

$\Delta_j = \left|h - H_j\right|$

where h represents the first illumination information currently corresponding to the environment and Hj represents the second illumination information corresponding to the environment in time period j. The value of j corresponding to the minimum difference determines the current time period of the environment.
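As a sketch under the notation above, the current time period is simply the index j that minimizes |h − Hj|:

```python
def current_period(h, H):
    """Return the index j of the time period whose second illumination H[j]
    is closest to the current first illumination h (step 103)."""
    return min(range(len(H)), key=lambda j: abs(h - H[j]))
```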
Step 104: the positioning map to be loaded is determined according to the current time period of the environment and the correspondence between the time periods and the positioning maps, and the positioning map is loaded.
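As an illustrative end-to-end sketch (the helper names, period labels and serialized map files are assumptions of these examples), steps 102 to 104 chain together as follows:

```python
def load_map_for_current_period(current_images, H, period_labels, period_to_map):
    """Steps 102-104: compute the first illumination information, determine the
    current time period, then load the positioning map recorded for that period.

    H[j] and period_to_map are assumed to come from the offline stage (fig. 2);
    reading a serialized map file stands in for the actual loading logic.
    """
    h = first_illumination(current_images)        # step 102
    j = current_period(h, H)                      # step 103
    map_path = period_to_map[period_labels[j]]    # step 104: determine the map to load
    with open(map_path, "rb") as f:
        return f.read()                           # loaded positioning map (raw bytes)
```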
Compared with the prior art, in the embodiments of the present application the current time period of the environment is determined according to the first illumination information currently corresponding to the environment, so that the positioning map corresponding to the current time period can be determined. Because the illumination change of the environment corresponds to the time period of the environment, and the positioning map also corresponds to the time period, the positioning map matching the current illumination of the environment can be determined accurately from the first illumination information, even when the current illumination of the environment has changed. The illumination under which the loaded positioning map was built therefore differs little, or not at all, from the illumination of the environment in the current time period, which improves the success rate and accuracy of positioning based on the images captured in real time.
The second embodiment of the present application relates to a map loading method that is substantially the same as the first embodiment. The main difference is that this embodiment does not need to calculate, before acquiring the current image data of the environment, the second illumination information corresponding to the environment in the different time periods according to the image data acquired in those time periods. This embodiment is applied to an environment with an illumination device, and the specific flow is shown in fig. 3:
step 301: current image data of an environment is acquired.
Specifically, this step is substantially the same as step 101 in the first embodiment, and is not described here again.
It should be noted that before acquiring the current image data of the environment, a positioning map of the environment needs to be constructed, and a corresponding relationship between the positioning map of the environment and a time period is determined, and a specific flow is shown in fig. 4:
step 4011: image data of an environment is acquired at each of N cycles, N being an integer greater than 0.
Step 4012: a positioning map of the environment is constructed according to the image data acquired in each time period, and the correspondence between the time periods and the positioning maps of the environment is determined.
Steps 4011 and 4012 are substantially the same as steps 2011 and 2012 in the first embodiment, and will not be described here.
Step 302: the illumination device is identified in each of the current M images of the environment.
Specifically, the current image data of the environment comprises M images, where M is an integer greater than 1. Feature points in the M images are extracted by means of deep learning, and the illumination device in each image is identified. The illumination device includes various types of lamps.
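Purely for illustration, the output of such a detector can be represented as a bounding box per image; the deep-learning detector itself is assumed to exist and is only stubbed out here.

```python
from typing import Optional, Tuple
import numpy as np

Box = Tuple[int, int, int, int]  # (x, y, width, height) of the detected lamp

def detect_illumination_device(image: np.ndarray) -> Optional[Box]:
    """Placeholder for a trained deep-learning detector that locates the lamp in
    one image; a real implementation would run the model here and return None
    when no illumination device is visible."""
    raise NotImplementedError("plug in a trained lamp detector")
```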
Step 303: fifth illumination information of the illumination device, or within a preset range of the illumination device, in any one of the current M images of the environment is calculated, and the calculated fifth illumination information is taken as the first illumination information currently corresponding to the environment.
Specifically, the fifth illumination information of the illumination device in any one of the current M images of the environment is calculated. With reference to formula 2.1, the fifth illumination information h1 is calculated as:

$h_1 = \frac{1}{W}\sum_{(x,y)\in\text{device}}\log\left(f(x,y)+\delta\right)$

where W is the total number of pixel points occupied by the illumination device in the image, and f(x, y) represents the pixel value of a pixel point of the illumination device in the image. It can be understood that other ways of calculating the illumination information may also be employed.
Alternatively, fifth illumination information within a preset range of the illumination device in any one of the current M images of the environment is calculated, where the preset range of the illumination device may be an area centered at the centroid of the illumination device in the image with a preset number of pixels as the radius, or another area around the illumination device. For example, a circular area centered at the centroid of a fluorescent lamp with a radius of 8 pixels may be taken as the preset range of the fluorescent lamp. The way of calculating the fifth illumination information within the preset range of the illumination device is substantially the same as the way of calculating the fifth illumination information of the illumination device, and is not described here again.
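As a sketch under the assumptions above (a detected bounding box whose centroid stands in for the lamp's centroid, and a preset radius of 8 pixels), the fifth illumination information within the preset range could be computed as:

```python
import numpy as np

def fifth_illumination(image: np.ndarray, box, radius: int = 8, delta: float = 0.001) -> float:
    """Fifth illumination information within a circular preset range around the
    centroid of the detected lamp (box comes from the detector sketched above)."""
    x, y, w, h = box
    cx, cy = x + w / 2.0, y + h / 2.0              # centroid of the illumination device
    ys, xs = np.ogrid[:image.shape[0], :image.shape[1]]
    mask = (xs - cx) ** 2 + (ys - cy) ** 2 <= radius ** 2
    region = image[mask].astype(np.float64)        # the W pixel points of the preset range
    return float(np.mean(np.log(region + delta)))
```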
Step 304: it is judged whether the fifth illumination information of the illumination device, or within the preset range of the illumination device, exceeds a preset illumination threshold; if so, step 305 is executed, otherwise step 306 is executed.
Specifically, the different time periods include a daytime period and a nighttime period, and the preset illumination threshold may be determined according to the illumination information of the illumination device in the daytime period and in the nighttime period. If the fifth illumination information of the illumination device, or within the preset range of the illumination device, exceeds the preset illumination threshold, the illumination device is turned on, that is, the area in or around the illumination device is relatively bright and has a high brightness value, so it can be determined that the environment is currently in the nighttime period. If the fifth illumination information does not exceed the preset illumination threshold, the illumination device is turned off, that is, the area in or around the illumination device is relatively dark and has a low brightness value, so it can be determined that the environment is currently in the daytime period.
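As a minimal sketch (the threshold value and the string labels are assumptions of this example), the decision of steps 304 to 306 is a single comparison:

```python
def period_from_lamp(h1, threshold):
    """Steps 304-306: a lamp brighter than the preset threshold implies it is
    switched on, i.e. the environment is in the nighttime period."""
    return "night" if h1 > threshold else "day"
```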
Step 305: it is determined that the environment is currently in the nighttime period.
Step 306: it is determined that the environment is currently in the daytime period.
Step 307: the positioning map to be loaded is determined according to the current time period of the environment and the correspondence between the time periods and the positioning maps, and the positioning map is loaded.
Compared with the prior art, the map loading method provided by this embodiment identifies the illumination device in the current image data of the environment and judges, according to the fifth illumination information of the identified illumination device or within its preset range, whether the illumination device is turned on, thereby determining the current time period of the environment and, further, the positioning map currently corresponding to the environment.
A third embodiment of the present application relates to a map loading apparatus 50, comprising: an acquisition module 501, a first determining module 502, a second determining module 503 and a map loading module 504; the specific structure of the map loading apparatus is shown in fig. 5:
The acquisition module 501 is configured to acquire current image data of an environment; the first determining module 502 is configured to calculate first illumination information currently corresponding to the environment according to the current image data of the environment; the second determining module 503 is configured to determine the current time period of the environment according to the first illumination information; and the map loading module 504 is configured to determine the positioning map to be loaded according to the current time period of the environment and the correspondence between the time periods and the positioning maps, and to load the positioning map.
The present embodiment is an embodiment of a virtual device corresponding to the map loading method, and technical details in the foregoing embodiment of the method are still applicable in the present embodiment, and are not described herein again.
It should be noted that the above-mentioned embodiment of the apparatus is merely illustrative and does not limit the scope of the present application. In practical applications, a person skilled in the art may select some or all of the modules to achieve the purpose of this embodiment according to actual needs, which is not limited herein.
A fourth embodiment of the present application relates to an electronic device, the structure of which is shown in fig. 6. The electronic device comprises: at least one processor 601; and a memory 602 communicatively connected to the at least one processor 601. The memory 602 stores instructions executable by the at least one processor 601, and the instructions are executed by the at least one processor 601 to enable the at least one processor 601 to perform the map loading method described above.
In this embodiment, the processor is exemplified by a central processing unit (CPU), and the memory is exemplified by a random access memory (RAM). The processor and the memory may be connected by a bus or in another manner; in fig. 6, connection by a bus is taken as an example. As a non-volatile computer-readable storage medium, the memory may be used to store non-volatile software programs, non-volatile computer-executable programs and modules, such as the positioning maps stored in the memory in the embodiments of the present application. The processor executes various functional applications and data processing of the device by running the non-volatile software programs, instructions and modules stored in the memory, that is, implements the map loading method described above.
The memory may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store a list of options, etc. Further, the memory may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some embodiments, the memory optionally includes memory located remotely from the processor, and such remote memory may be connected to the external device via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
One or more modules are stored in the memory and, when executed by the one or more processors, perform the method of map loading in any of the method embodiments described above.
The above product can execute the map loading method provided by the embodiments of the present application, and has the corresponding functional modules and beneficial effects for executing the method. For technical details not described in detail in this embodiment, reference may be made to the map loading method provided by the embodiments of the present application.
A fifth embodiment of the present application relates to a computer-readable storage medium having stored therein computer instructions that enable a computer to perform the map loading method described in the first or second method embodiment of the present application.
That is, those skilled in the art can understand that all or part of the steps of the methods in the above embodiments may be implemented by a program instructing related hardware. The program is stored in a storage medium and includes several instructions to enable a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk.
It will be understood by those of ordinary skill in the art that the foregoing embodiments are specific examples for carrying out the present application, and that various changes in form and details may be made therein without departing from the spirit and scope of the present application in practice.