CN108462832A - Method and device for obtaining image - Google Patents
- Publication number
- CN108462832A (application CN201810224931.8A)
- Authority
- CN
- China
- Prior art keywords
- depth information
- face
- information
- image
- depth
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/61—Control of cameras or camera modules based on recognised objects
- H04N23/611—Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/166—Detection; Localisation; Normalisation using acquisition arrangements
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/71—Circuitry for evaluating the brightness variation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/70—Circuitry for compensating brightness variation in the scene
- H04N23/73—Circuitry for compensating brightness variation in the scene by influencing the exposure time
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Health & Medical Sciences (AREA)
- General Health & Medical Sciences (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
Embodiments of the present application disclose a method and device for obtaining an image. One specific implementation of the method includes: obtaining a depth map and a near-infrared image of the current position that contain a face; identifying face depth information from the depth map; detecting luminance information on the near-infrared image that corresponds to the face depth information; and controlling the exposure time according to the luminance information to obtain a near-infrared face image of the current position. This embodiment improves the clarity of the obtained near-infrared face image.
Description
Technical field
Embodiments of the present application relate to the technical field of image processing, and in particular to a method and device for obtaining an image.
Background technology
With the development of science and technology, electronic devices have become increasingly intelligent. For electronic devices operated by the user's eyes, the device usually identifies the gaze direction of the eyes under visible light and performs the corresponding operation. For example, the gaze direction of the eyes is used to determine whether the eyes are looking at the camera of the electronic device, so as to perform operations such as unlocking the device.
Summary of the invention
The purpose of the embodiments of the present application is to propose a method and device for obtaining an image.
In a first aspect, an embodiment of the present application provides a method for obtaining an image. The method includes: obtaining a depth map and a near-infrared image of the current position that contain a face; identifying face depth information from the depth map; detecting luminance information on the near-infrared image that corresponds to the face depth information; and controlling the exposure time according to the luminance information to obtain a near-infrared face image of the current position.
In some embodiments, identifying face depth information from the depth map includes: detecting the depth information corresponding to each pixel in the depth map and drawing a depth information map from the depth information; and detecting, from the depth information map, the face depth information corresponding to the face structure.
In some embodiments, detecting the luminance information on the near-infrared image that corresponds to the face depth information includes: selecting, from the face depth information, the depth information corresponding to a set position on the face as target depth information; and detecting, from the near-infrared image, the brightness value corresponding to the target depth information.
In some embodiments, controlling the exposure time according to the luminance information includes: extending the exposure time in response to the brightness value being lower than a first set threshold.
In some embodiments, controlling the exposure time according to the luminance information includes: shortening the exposure time in response to the brightness value being higher than a second set threshold.
In some embodiments, controlling the exposure time according to the luminance information includes: calculating the difference between the brightness value and a set brightness value, and adjusting the exposure time according to the difference.
In a second aspect, an embodiment of the present application provides a device for obtaining an image. The device includes: an image acquisition unit for obtaining a depth map and a near-infrared image of the current position that contain a face; a face depth information recognition unit for identifying face depth information from the depth map; a luminance information detection unit for detecting luminance information on the near-infrared image that corresponds to the face depth information; and an image obtaining unit for controlling the exposure time according to the luminance information to obtain a near-infrared face image of the current position.
In some embodiments, the face depth information recognition unit includes: a depth information map drawing subunit for detecting the depth information corresponding to each pixel in the depth map and drawing a depth information map from the depth information; and a face depth information detection subunit for detecting, from the depth information map, the face depth information corresponding to the face structure.
In some embodiments, the luminance information detection unit includes: a target depth information setting subunit for selecting, from the face depth information, the depth information corresponding to a set position on the face as target depth information; and a luminance information detection subunit for detecting, from the near-infrared image, the brightness value corresponding to the target depth information.
In some embodiments, the image obtaining unit includes: a first exposure control subunit for extending the exposure time in response to the brightness value being lower than a first set threshold.
In some embodiments, the image obtaining unit includes: a second exposure control subunit for shortening the exposure time in response to the brightness value being higher than a second set threshold.
In some embodiments, the image obtaining unit includes: a third exposure control subunit for calculating the difference between the brightness value and a set brightness value and adjusting the exposure time according to the difference.
In a third aspect, an embodiment of the present application provides an electronic device, including: one or more processors; a memory for storing one or more programs; and a near-infrared camera for acquiring near-infrared images. When the one or more programs are executed by the one or more processors, the one or more processors are caused to perform the method for obtaining an image of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable medium on which a computer program is stored. When executed by a processor, the program implements the method for obtaining an image of the first aspect.
According to the method and device for obtaining an image provided by the embodiments of the present application, a depth map and a near-infrared image of the current position are first obtained; face depth information is then identified from the depth map; the luminance information on the near-infrared image that corresponds to the face depth information is then detected; and finally the exposure time is controlled according to the luminance information to obtain a near-infrared face image of the current position. This improves the clarity of the obtained near-infrared face image.
Description of the drawings
Other features, objects, and advantages of the present application will become more apparent from the following detailed description of non-limiting embodiments, read with reference to the accompanying drawings:
Fig. 1 is an exemplary system architecture diagram to which the present application can be applied;
Fig. 2 is a flowchart of one embodiment of the method for obtaining an image according to the present application;
Fig. 3 is a flowchart of another embodiment of the method for obtaining an image according to the present application;
Fig. 4 is a schematic diagram of an application scenario of the method for obtaining an image according to the present application;
Fig. 5 is a structural schematic diagram of one embodiment of the device for obtaining an image according to the present application;
Fig. 6 is a structural schematic diagram of a computer system suitable for implementing an electronic device of the embodiments of the present application.
Detailed description of embodiments
The present application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here only serve to explain the related invention and do not limit the invention. It should also be noted that, for convenience of description, only the parts related to the invention are shown in the drawings.
It should be noted that, provided there is no conflict, the embodiments of the present application and the features in the embodiments may be combined with each other. The present application is described in detail below with reference to the accompanying drawings and in conjunction with the embodiments.
Fig. 1 shows an exemplary system architecture 100 to which the method for obtaining an image or the device for obtaining an image of the embodiments of the present application can be applied.
As shown in Fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is the medium that provides communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired or wireless communication links or fiber optic cables.
A user may use the terminal devices 101, 102, 103 to interact with the server 105 through the network 104 to receive or send messages. The terminal devices 101, 102, 103 may be equipped with a near-infrared camera for obtaining near-infrared images, a visible-light camera for obtaining visible-light images, and various image processing applications, such as visible-light image processing applications, near-infrared image processing applications, image selection applications, and image editing applications.
The terminal devices 101, 102, 103 may be hardware or software. When the terminal devices 101, 102, 103 are hardware, they may be various electronic devices that have a display screen and support the acquisition of visible-light images and near-infrared images, including but not limited to smartphones, tablet computers, laptop computers, desktop computers, and the like. When the terminal devices 101, 102, 103 are software, they may be installed in the electronic devices listed above. They may be implemented as multiple pieces of software or multiple software modules (for example, to provide distributed services), or as a single piece of software or a single software module. No specific limitation is imposed here.
The server 105 may be a server that provides various services, for example, a server that performs image processing on the depth maps and near-infrared images acquired by the terminal devices 101, 102, 103. The server may analyze and otherwise process data such as the received depth maps and near-infrared images to determine the exposure time for acquiring a near-infrared face image.
It should be noted that the method for obtaining an image provided by the embodiments of the present application may be performed by the terminal devices 101, 102, 103 alone, or jointly by the terminal devices 101, 102, 103 and the server 105. Correspondingly, the device for obtaining an image may be provided in the terminal devices 101, 102, 103, or in the server 105.
It should be noted that the server may be hardware or software. When the server is hardware, it may be implemented as a distributed server cluster composed of multiple servers, or as a single server. When the server is software, it may be implemented as multiple pieces of software or multiple software modules (for example, to provide distributed services), or as a single piece of software or a single software module. No specific limitation is imposed here.
It should be understood that the numbers of terminal devices, networks, and servers in Fig. 1 are merely illustrative. There may be any number of terminal devices, networks, and servers according to implementation needs.
With continued reference to Fig. 2, a flow 200 of one embodiment of the method for obtaining an image according to the present application is shown. The method for obtaining an image includes the following steps:
Step 201: obtain a depth map and a near-infrared image of the current position that contain a face.
In this embodiment, the execution body of the method for obtaining an image (for example, the terminal devices 101, 102, 103 shown in Fig. 1) may obtain the depth map and the near-infrared image of the current position through a wired or wireless connection. Here, the current position may refer to a situation in which the position of the photographed user and the position of the terminal devices 101, 102, 103 both remain unchanged. The depth map may be obtained by the terminal devices 101, 102, 103 through passive ranging sensing or active depth sensing, and is an image or image channel containing information about the distance to the surfaces of objects in the scene. The depth map is similar to a grayscale image, and each pixel value of the depth map may be the actual distance from the sensor to the photographed object (in this embodiment, the photographed object is a person). The near-infrared image may be an image acquired by the near-infrared camera on the terminal devices 101, 102, 103, or an image obtained by converting a visible-light image into a near-infrared image. It should be pointed out that the above wireless connection may include, but is not limited to, 3G/4G connections, WiFi connections, Bluetooth connections, WiMAX connections, Zigbee connections, UWB (ultra wideband) connections, and other wireless connections now known or developed in the future.
In practice, when a user operates a terminal device with the eyes, factors such as the shooting angle or low ambient brightness may make it difficult for the terminal device to accurately recognize the eye gaze information from the acquired visible-light face image, so the device cannot be operated accurately through the eye gaze information. For example, under low-light conditions, when the terminal device is to be unlocked through the user's eye gaze information (this may also be a payment operation, etc.), the visible-light face image acquired by the terminal device does not easily yield the correct eye gaze information, and the unlock operation cannot be performed. For this reason, the eye gaze information can be determined by obtaining a near-infrared image, which avoids, to a certain extent, the interference of visible light with the eye gaze information. However, existing terminal devices do not set the exposure time for near-infrared images, so the near-infrared images they obtain are not very clear.
In this embodiment, with the position of the terminal devices 101, 102, 103 and the position of the photographed user unchanged, the depth map and the near-infrared image of the current position that contain a face are obtained.
Step 202: identify face depth information from the depth map.
As can be seen from the above description, the pixels contained in the depth map can characterize the distance between the lens of the terminal devices 101, 102, 103 and the photographed object (in this embodiment, a person). In practice, the distance from the photographed user to the lens of the terminal devices 101, 102, 103 usually differs from the distances from other objects to that lens. Moreover, the face has a relatively fixed structure. When the depth map contains a face, the terminal devices 101, 102, 103 can examine the pixels contained in the depth map and thereby identify, from the depth map, the face depth information corresponding to the face.
Step 203: detect the luminance information on the near-infrared image that corresponds to the face depth information.
In order to obtain a near-infrared face image with good clarity, after the face depth information is obtained, the luminance information on the near-infrared image that corresponds to the face depth information needs to be detected (for example, the value range of the luminance information may be 0 to 255). The terminal devices 101, 102, 103 can query the luminance information of each pixel on the near-infrared image that corresponds to the face depth information, and thereby obtain the luminance information corresponding to the face on the near-infrared image at this moment.
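A minimal sketch of that per-pixel luminance query is shown below; it assumes the depth map and the near-infrared image are pixel-aligned (registered) so that the same face mask indexes both, and the reduction to a single mean brightness is an illustrative choice rather than something prescribed here.

```python
import numpy as np

def face_luminance(nir_image, face_mask):
    """Read the NIR luminance of the face-region pixels and summarise it.

    nir_image holds 8-bit luminance values (0..255); face_mask comes from the
    depth-based identification step and is assumed to be aligned with nir_image.
    """
    face_pixels = nir_image[face_mask]       # luminance of every face pixel
    mean_luma = float(np.mean(face_pixels))  # one number summarising the exposure
    return face_pixels, mean_luma
```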
Step 204: control the exposure time according to the luminance information to obtain a near-infrared face image of the current position.
After the luminance information is obtained, the terminal devices 101, 102, 103 can control the exposure time according to the luminance information so that the exposure time meets the requirement for obtaining a clear near-infrared face image. Once a clear near-infrared face image has been obtained, other data processing can be performed based on it (for example, an unlock operation on the terminal devices 101, 102, 103 or a payment operation of a certain application).
With further reference to Fig. 3, a flow 300 of another embodiment of the method for obtaining an image according to the present application is shown. In this embodiment, the flow 300 of the method for obtaining an image includes the following steps:
Step 301: obtain a depth map and a near-infrared image of the current position that contain a face.
In this embodiment, the execution body of the method for obtaining an image (for example, the terminal devices 101, 102, 103 shown in Fig. 1) may obtain the depth map and the near-infrared image of the current position through a wired or wireless connection. Here, the current position may refer to a situation in which the position of the photographed user and the position of the terminal devices 101, 102, 103 both remain unchanged. The depth map may be obtained by the terminal devices 101, 102, 103 through passive ranging sensing or active depth sensing, and is an image or image channel containing information about the distance to the surfaces of objects in the scene. The depth map is similar to a grayscale image, and each pixel value of the depth map may be the actual distance from the sensor to the photographed object (in this embodiment, the photographed object is a person). The near-infrared image may be an image acquired by the near-infrared camera on the terminal devices 101, 102, 103, or an image obtained by converting a visible-light image into a near-infrared image. It should be pointed out that the above wireless connection may include, but is not limited to, 3G/4G connections, WiFi connections, Bluetooth connections, WiMAX connections, Zigbee connections, UWB (ultra wideband) connections, and other wireless connections now known or developed in the future. In this embodiment, with the position of the terminal devices 101, 102, 103 and the position of the photographed user unchanged, the depth map and the near-infrared image of the current position that contain a face are obtained.
Step 302: detect the depth information corresponding to each pixel in the depth map, and draw a depth information map from the depth information.
As can be seen from the above description, the pixels contained in the depth map can characterize the distance between the lens of the terminal devices 101, 102, 103 and the photographed object (in this embodiment, a person). In practice, the distance from the photographed user to the lens of the terminal devices 101, 102, 103 usually differs from the distances from other objects to that lens. Moreover, the face has a specific structure. When the depth map contains a face, the terminal devices 101, 102, 103 can examine the pixels contained in the depth map to obtain the depth information of each pixel, and then draw the depth information map corresponding to the depth map from that depth information.
Step 303: detect, from the depth information map, the face depth information corresponding to the face structure.
The depth information map can characterize the distance between the objects contained in the depth map and the lens of the terminal devices 101, 102, 103; that is, the depth information map is similar to a stereoscopic (3D) representation of the scene. The terminal devices 101, 102, 103 can then detect, from the depth information map, the face depth information corresponding to the face structure.
Step 304: select, from the face depth information, the depth information corresponding to a set position on the face as target depth information.
The face structure is three-dimensional. In order to obtain a clear and usable near-infrared face image, a certain position on the face needs to be selected as a reference position, and the near-infrared face image is obtained on the basis that the near-infrared image of this reference position is clear. In this embodiment, the depth information corresponding to a set position on the face can be selected from the face depth information as the target depth information.
It should be noted that the target depth information of the set position on the face may include the following cases. First, the depth information corresponding to a certain point on the face is selected from the face depth information as the target depth information; for example, the target depth information may be the depth information corresponding to a certain point on the eyes. Second, the depth information corresponding to a certain region on the face is selected from the face depth information as the target depth information; for example, the target depth information may be the depth information corresponding to the eye region. Third, the mean of the depth information of multiple points on the face is selected from the face depth information as the target depth information; for example, the target depth information may be the mean of the depth information at multiple points on the eyes.
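The three cases just listed could be expressed, purely as a sketch, as follows. The eye point, region mask, or point list are assumed to come from some separate face-landmark step that this application does not describe, and summarising the region case by its mean depth is likewise an illustrative choice.

```python
import numpy as np

def target_depth(depth_map, mode="region", point=None, region_mask=None, points=None):
    """Select target depth information in one of the three ways described above."""
    if mode == "point":                                # case 1: one point, e.g. on an eye
        r, c = point
        return float(depth_map[r, c])
    if mode == "region":                               # case 2: a region, e.g. the eye region,
        return float(np.mean(depth_map[region_mask]))  # summarised here by its mean depth
    if mode == "mean":                                 # case 3: mean depth over several points
        return float(np.mean([depth_map[r, c] for r, c in points]))
    raise ValueError("unknown mode: " + mode)
```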
Step 305: detect, from the near-infrared image, the brightness value corresponding to the target depth information.
As can be seen from the above description, the target depth information corresponds to a certain position or region on the face. After the target depth information is obtained, the brightness value corresponding to the target depth information can be found on the near-infrared image. This brightness value reflects the exposure effect of the terminal devices 101, 102, 103 when the near-infrared image was obtained. In order to obtain a clear near-infrared face image, the brightness value usually needs to lie within a certain range. Specifically: in response to the brightness value being lower than a first set threshold, step 306 is executed to obtain the near-infrared face image of the current position; in response to the brightness value being higher than a second set threshold, step 307 is executed to obtain the near-infrared face image of the current position; and when the brightness value differs from a set brightness value, step 308 is executed to obtain the near-infrared face image of the current position. Here, the first set threshold is usually a relatively small brightness value that defines the lower brightness limit for obtaining a clear near-infrared face image; the second set threshold is usually a relatively large brightness value that defines the upper brightness limit for obtaining a clear near-infrared face image; and the set brightness value may be the brightness value at which a near-infrared face image meeting a given quality requirement is obtained.
Step 306: extend the exposure time.
When the brightness value corresponding to the target depth information is lower than the first set threshold, the current exposure time is too short and the obtained near-infrared face image is not bright enough. The terminal devices 101, 102, 103 can extend the exposure time from its current value.
Step 307: shorten the exposure time.
When the brightness value corresponding to the target depth information is higher than the second set threshold, the current exposure time is too long and the obtained near-infrared face image is too bright. The terminal devices 101, 102, 103 can shorten the exposure time from its current value.
Step 308: calculate the difference between the brightness value and the set brightness value, and adjust the exposure time according to the difference.
When the brightness value corresponding to the target depth information differs from the set brightness value, the terminal devices 101, 102, 103 can extend or shorten the exposure time according to the difference between the brightness value and the set brightness value.
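Steps 306 to 308 amount to a simple exposure-control rule. The sketch below combines the three branches in one hypothetical helper; every numeric value (the thresholds, the set brightness value, the step size, and the gain) is an assumed placeholder, since this application only states that such thresholds and a set brightness value are used, not their magnitudes or the exact adjustment law.

```python
def adjust_exposure(exposure_ms, brightness,
                    low_threshold=60, high_threshold=200,
                    set_brightness=128, step_ms=1.0, gain=0.05):
    """Return a new exposure time (ms) from the measured face brightness (0..255)."""
    if brightness < low_threshold:           # step 306: too dark, extend the exposure time
        return exposure_ms + step_ms
    if brightness > high_threshold:          # step 307: too bright, shorten the exposure time
        return max(exposure_ms - step_ms, 0.1)
    # step 308: otherwise adjust in proportion to the difference from the set brightness value
    diff = set_brightness - brightness
    return max(exposure_ms + gain * diff * step_ms, 0.1)
```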
Through the above steps, this embodiment achieves control over the exposure time and thereby obtains a clear near-infrared face image.
With continued reference to Fig. 4, Fig. 4 is a schematic diagram of an application scenario of the method for obtaining an image according to this embodiment. In the application scenario of Fig. 4, the unlock operation of the terminal device 102 requires the user to look at the near-infrared camera. Under low-light conditions, the user's near-infrared face image is obtained through the terminal device 102. The terminal device 102 can first obtain a depth map and a near-infrared image of the current position that contain a face; then identify face depth information from the depth map; then detect the luminance information on the near-infrared image that corresponds to the face depth information; and finally control the exposure time according to the luminance information to obtain a near-infrared face image of the current position. The near-infrared face image obtained in this way is a clear near-infrared face image, and the terminal device 102 can perform the unlock operation (or a payment operation, etc.) based on it.
The method provided by the above embodiments of the present application first obtains a depth map and a near-infrared image of the current position; then identifies face depth information from the depth map; then detects the luminance information on the near-infrared image that corresponds to the face depth information; and finally controls the exposure time according to the luminance information to obtain a near-infrared face image of the current position. This improves the clarity of the obtained near-infrared face image.
With further reference to Fig. 5, as an implementation of the methods shown in the above figures, the present application provides an embodiment of a device for obtaining an image. This device embodiment corresponds to the method embodiment shown in Fig. 2, and the device can be applied to various electronic devices.
As shown in Fig. 5, the device 500 for obtaining an image of this embodiment may include: an image acquisition unit 501, a face depth information recognition unit 502, a luminance information detection unit 503, and an image obtaining unit 504. The image acquisition unit 501 is used to obtain a depth map and a near-infrared image of the current position that contain a face; the face depth information recognition unit 502 is used to identify face depth information from the depth map; the luminance information detection unit 503 is used to detect the luminance information on the near-infrared image that corresponds to the face depth information; and the image obtaining unit 504 is used to control the exposure time according to the luminance information to obtain a near-infrared face image of the current position.
In some optional implementations of this embodiment, the face depth information recognition unit 502 may include a depth information map drawing subunit (not shown) and a face depth information detection subunit (not shown). The depth information map drawing subunit is used to detect the depth information corresponding to each pixel in the depth map and to draw a depth information map from the depth information; the face depth information detection subunit is used to detect, from the depth information map, the face depth information corresponding to the face structure.
In some optional implementations of this embodiment, the luminance information detection unit 503 may include a target depth information setting subunit (not shown) and a luminance information detection subunit (not shown). The target depth information setting subunit is used to select, from the face depth information, the depth information corresponding to a set position on the face as target depth information; the luminance information detection subunit is used to detect, from the near-infrared image, the brightness value corresponding to the target depth information.
In some optional implementations of this embodiment, the image obtaining unit 504 may include: a first exposure control subunit (not shown), used to extend the exposure time in response to the brightness value being lower than the first set threshold.
In some optional implementations of this embodiment, the image obtaining unit 504 may include: a second exposure control subunit (not shown), used to shorten the exposure time in response to the brightness value being higher than the second set threshold.
In some optional implementations of this embodiment, the image obtaining unit 504 may include: a third exposure control subunit (not shown), used to calculate the difference between the brightness value and the set brightness value and to adjust the exposure time according to the difference.
This embodiment also provides an electronic device, including: one or more processors; a memory for storing one or more programs; and a near-infrared camera for acquiring near-infrared images. When the one or more programs are executed by the one or more processors, the one or more processors are caused to perform the above method for obtaining an image.
This embodiment also provides a computer-readable medium on which a computer program is stored. When executed by a processor, the program implements the above method for obtaining an image.
Referring now to Fig. 6, it shows a structural schematic diagram of a computer system 600 suitable for implementing an electronic device of the embodiments of the present application. The electronic device shown in Fig. 6 is only an example and should not impose any limitation on the functions or scope of use of the embodiments of the present application.
As shown in Fig. 6, the computer system 600 includes a central processing unit (CPU) 601, which can perform various appropriate actions and processing according to a program stored in a read-only memory (ROM) 602 or a program loaded from a storage portion 608 into a random access memory (RAM) 603. The RAM 603 also stores various programs and data required for the operation of the system 600. The CPU 601, the ROM 602, and the RAM 603 are connected to one another through a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
The following components are connected to the I/O interface 605: an input portion 606 including a keyboard, a mouse, and the like; an output portion 607 including a liquid crystal display (LCD), a speaker, and the like; a storage portion 608 including a hard disk and the like; and a communication portion 609 including a network interface card such as a LAN card or a modem. The communication portion 609 performs communication processing via a network such as the Internet. A driver 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disc, a magneto-optical disk, or a semiconductor memory, is mounted on the driver 610 as needed, so that a computer program read from it can be installed into the storage portion 608 as needed. A near-infrared camera 612 is connected to the I/O interface 605 through various data interfaces.
In particular, according to embodiments of the present disclosure, the process described above with reference to the flowchart may be implemented as a computer software program. For example, an embodiment of the present disclosure includes a computer program product comprising a computer program carried on a computer-readable medium, the computer program containing program code for performing the method shown in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication portion 609 and/or installed from the removable medium 611. When the computer program is executed by the central processing unit (CPU) 601, the above functions defined in the method of the present application are performed.
It should be noted that the computer-readable medium of the present application may be a computer-readable signal medium, a computer-readable storage medium, or any combination of the two. The computer-readable storage medium may be, for example, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection with one or more wires, a portable computer disk, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present application, a computer-readable storage medium may be any tangible medium that contains or stores a program, which may be used by or in combination with an instruction execution system, apparatus, or device. In the present application, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, and may send, propagate, or transmit a program for use by or in combination with an instruction execution system, apparatus, or device. The program code contained on the computer-readable medium may be transmitted by any suitable medium, including but not limited to wireless means, electric wire, optical cable, RF, or any suitable combination of the above.
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architectures, functions, and operations of systems, methods, and computer program products according to various embodiments of the present application. In this regard, each block in a flowchart or block diagram may represent a module, a program segment, or a part of code, and the module, program segment, or part of code contains one or more executable instructions for implementing the specified logical function. It should also be noted that, in some alternative implementations, the functions marked in the blocks may occur in an order different from that marked in the drawings. For example, two blocks shown in succession may in fact be executed substantially in parallel, or sometimes in the reverse order, depending on the functions involved. It should also be noted that each block in the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, may be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The units described in the embodiments of the present application may be implemented in software or in hardware. The described units may also be provided in a processor; for example, a processor may be described as including an image acquisition unit, a face depth information recognition unit, a luminance information detection unit, and an image obtaining unit. The names of these units do not, in certain cases, constitute a limitation on the units themselves; for example, the image obtaining unit may also be described as a "unit for obtaining a near-infrared face image".
As another aspect, the present application also provides a computer-readable medium, which may be included in the device described in the above embodiments, or may exist separately without being assembled into the device. The computer-readable medium carries one or more programs. When the one or more programs are executed by the device, the device is caused to: obtain a depth map and a near-infrared image of the current position that contain a face; identify face depth information from the depth map; detect the luminance information on the near-infrared image that corresponds to the face depth information; and control the exposure time according to the luminance information to obtain a near-infrared face image of the current position.
The above description is only a preferred embodiment of the present application and an explanation of the technical principles applied. Those skilled in the art should understand that the scope of the invention involved in the present application is not limited to technical solutions formed by the specific combination of the above technical features, and should also cover other technical solutions formed by any combination of the above technical features or their equivalent features without departing from the above inventive concept, for example, technical solutions formed by replacing the above features with (but not limited to) technical features with similar functions disclosed in the present application.
Claims (14)
1. A method for obtaining an image, characterized in that the method includes:
obtaining a depth map and a near-infrared image of the current position that contain a face;
identifying face depth information from the depth map;
detecting luminance information on the near-infrared image that corresponds to the face depth information;
controlling the exposure time according to the luminance information to obtain a near-infrared face image of the current position.
2. The method according to claim 1, characterized in that identifying face depth information from the depth map includes:
detecting the depth information corresponding to each pixel in the depth map, and drawing a depth information map from the depth information;
detecting, from the depth information map, the face depth information corresponding to the face structure.
3. The method according to claim 1, characterized in that detecting the luminance information on the near-infrared image that corresponds to the face depth information includes:
selecting, from the face depth information, the depth information corresponding to a set position on the face as target depth information;
detecting, from the near-infrared image, the brightness value corresponding to the target depth information.
4. The method according to claim 3, characterized in that controlling the exposure time according to the luminance information includes:
extending the exposure time in response to the brightness value being lower than a first set threshold.
5. The method according to claim 3, characterized in that controlling the exposure time according to the luminance information includes:
shortening the exposure time in response to the brightness value being higher than a second set threshold.
6. The method according to claim 3, characterized in that controlling the exposure time according to the luminance information includes:
calculating the difference between the brightness value and a set brightness value, and adjusting the exposure time according to the difference.
7. A device for obtaining an image, characterized in that the device includes:
an image acquisition unit for obtaining a depth map and a near-infrared image of the current position that contain a face;
a face depth information recognition unit for identifying face depth information from the depth map;
a luminance information detection unit for detecting the luminance information on the near-infrared image that corresponds to the face depth information;
an image obtaining unit for controlling the exposure time according to the luminance information to obtain a near-infrared face image of the current position.
8. The device according to claim 7, characterized in that the face depth information recognition unit includes:
a depth information map drawing subunit for detecting the depth information corresponding to each pixel in the depth map and drawing a depth information map from the depth information;
a face depth information detection subunit for detecting, from the depth information map, the face depth information corresponding to the face structure.
9. The device according to claim 7, characterized in that the luminance information detection unit includes:
a target depth information setting subunit for selecting, from the face depth information, the depth information corresponding to a set position on the face as target depth information;
a luminance information detection subunit for detecting, from the near-infrared image, the brightness value corresponding to the target depth information.
10. The device according to claim 9, characterized in that the image obtaining unit includes:
a first exposure control subunit for extending the exposure time in response to the brightness value being lower than the first set threshold.
11. The device according to claim 9, characterized in that the image obtaining unit includes:
a second exposure control subunit for shortening the exposure time in response to the brightness value being higher than the second set threshold.
12. The device according to claim 9, characterized in that the image obtaining unit includes:
a third exposure control subunit for calculating the difference between the brightness value and the set brightness value and adjusting the exposure time according to the difference.
13. An electronic device, including:
one or more processors;
a storage device for storing one or more programs;
a near-infrared camera for acquiring near-infrared images;
wherein, when the one or more programs are executed by the one or more processors, the one or more processors implement the method according to any one of claims 1 to 6.
14. A computer-readable medium on which a computer program is stored, characterized in that, when the program is executed by a processor, the method according to any one of claims 1 to 6 is implemented.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810224931.8A CN108462832A (en) | 2018-03-19 | 2018-03-19 | Method and device for obtaining image |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201810224931.8A CN108462832A (en) | 2018-03-19 | 2018-03-19 | Method and device for obtaining image |
Publications (1)
Publication Number | Publication Date |
---|---|
CN108462832A true CN108462832A (en) | 2018-08-28 |
Family
ID=63236998
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201810224931.8A Pending CN108462832A (en) | 2018-03-19 | 2018-03-19 | Method and device for obtaining image |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN108462832A (en) |
- 2018-03-19 CN CN201810224931.8A patent/CN108462832A/en active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103475821A (en) * | 2013-10-11 | 2013-12-25 | 中科院微电子研究所昆山分所 | Adjustment method based on automatic integration time of near infrared camera |
CN103702015A (en) * | 2013-12-20 | 2014-04-02 | 华南理工大学 | Exposure control method for human face image acquisition system under near-infrared condition |
CN104333710A (en) * | 2014-11-28 | 2015-02-04 | 广东欧珀移动通信有限公司 | Camera exposure method, camera exposure device and camera exposure equipment |
WO2017034701A1 (en) * | 2015-08-26 | 2017-03-02 | Intel Corporation | Infrared lamp control for use with iris recognition authentication |
CN107295269A (en) * | 2017-07-31 | 2017-10-24 | 努比亚技术有限公司 | A kind of light measuring method and terminal, computer-readable storage medium |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109348138A (en) * | 2018-10-12 | 2019-02-15 | 百度在线网络技术(北京)有限公司 | Light irradiation regulating method, device, equipment and storage medium |
CN110569822A (en) * | 2019-09-16 | 2019-12-13 | 深圳市商汤科技有限公司 | image processing method and device, electronic equipment and storage medium |
WO2021051949A1 (en) * | 2019-09-16 | 2021-03-25 | 深圳市商汤科技有限公司 | Image processing method and apparatus, electronic device, and storage medium |
CN110784660A (en) * | 2019-11-13 | 2020-02-11 | 广州洪荒智能科技有限公司 | Method, system, equipment and medium for controlling camera brightness |
CN110784660B (en) * | 2019-11-13 | 2020-09-04 | 四川云从天府人工智能科技有限公司 | Method, system, equipment and medium for controlling camera brightness |
CN111885311A (en) * | 2020-03-27 | 2020-11-03 | 浙江水晶光电科技股份有限公司 | Method and device for adjusting exposure of infrared camera, electronic equipment and storage medium |
CN111885311B (en) * | 2020-03-27 | 2022-01-21 | 东莞埃科思科技有限公司 | Method and device for adjusting exposure of infrared camera, electronic equipment and storage medium |
CN112232324A (en) * | 2020-12-15 | 2021-01-15 | 杭州宇泛智能科技有限公司 | Face fake-verifying method and device, computer equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108462832A (en) | Method and device for obtaining image | |
CN109191514A (en) | Method and apparatus for generating depth detection model | |
CN108171204B (en) | Detection method and device | |
CN108133201B (en) | Face character recognition methods and device | |
CN108898185A (en) | Method and apparatus for generating image recognition model | |
CN108280413B (en) | Face recognition method and device | |
CN108446651A (en) | Face identification method and device | |
CN108182412A (en) | For the method and device of detection image type | |
CN108492364A (en) | The method and apparatus for generating model for generating image | |
CN108388878A (en) | The method and apparatus of face for identification | |
CN108986169A (en) | Method and apparatus for handling image | |
CN108491809A (en) | The method and apparatus for generating model for generating near-infrared image | |
CN108363995A (en) | Method and apparatus for generating data | |
CN109242801A (en) | Image processing method and device | |
CN110113538A (en) | Intelligent capture apparatus, intelligent control method and device | |
CN108491823A (en) | Method and apparatus for generating eye recognition model | |
CN108337505A (en) | Information acquisition method and device | |
CN108154547A (en) | Image generating method and device | |
CN108510454A (en) | Method and apparatus for generating depth image | |
CN109389072A (en) | Data processing method and device | |
CN108876858A (en) | Method and apparatus for handling image | |
CN108171206A (en) | information generating method and device | |
CN109308687A (en) | Method and apparatus for adjusting brightness of image | |
CN108427941A (en) | Method, method for detecting human face and device for generating Face datection model | |
CN109241934A (en) | Method and apparatus for generating information |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20180828 |