CN107580176A - Terminal shooting control method, camera terminal and computer-readable storage medium - Google Patents
Terminal shooting control method, camera terminal and computer-readable storage medium
- Publication number: CN107580176A (Application CN201710653345.0A)
- Authority: CN (China)
- Prior art keywords: terminal, shooting, aperture, distance, depth
- Legal status: Pending (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abstract
The invention discloses a terminal shooting control method, a camera terminal and a computer-readable storage medium. The distance between the camera terminal and the photographed subject is measured by ultrasonic ranging and used as the object distance, and the shooting parameters are controlled according to the object distance. This realizes automatic control of the shooting parameters based on the object distance, solves the problem that existing terminal shooting control technology relies on the end user to manually control shooting parameters, and enhances the user experience.
Description
Technical Field
The present invention relates to the field of user interaction, and in particular, to a terminal shooting control method, a camera terminal, and a computer-readable storage medium.
Background
As terminal systems such as mobile phones grow more sophisticated and widespread, users routinely use terminals for daily activities such as photographing. In existing photographing technology, however, the photographing terminal cannot determine parameters such as the object distance between the photographed object and the terminal, so the end user must manually control shooting parameters such as focus and aperture and determine the best result through repeated attempts.
Disclosure of Invention
The invention mainly aims to provide a terminal shooting control method, a camera terminal and a computer readable storage medium, and aims to solve the problem that the existing terminal shooting control technology depends on a terminal user to manually control shooting parameters.
In order to achieve the above object, the present invention provides a terminal shooting control method, including:
sending a first ultrasonic signal outwards, and receiving a second ultrasonic signal formed by reflecting the first ultrasonic signal;
determining distance data between the shot object and the shooting terminal according to the first ultrasonic signal and the second ultrasonic signal;
and controlling the shooting parameters for shooting the object according to the distance data of the object.
In some embodiments, determining distance data between the subject and the photographing terminal from the first ultrasonic signal and the second ultrasonic signal comprises:
transmitting a first ultrasonic signal in a non-directional manner;
calculating distance data between all objects in the sensing range of the shooting terminal and the shooting terminal according to the first ultrasonic signal and the second ultrasonic signal;
determining a shot object according to the shooting direction of a camera of a shooting terminal;
and screening the distance data between all the objects and the shooting terminal to determine the distance data between the shot object and the shooting terminal.
In some embodiments, determining distance data between the subject and the photographing terminal from the first ultrasonic signal and the second ultrasonic signal comprises:
determining a shot object according to the shooting direction of a camera of a shooting terminal;
directionally transmitting a first ultrasonic signal to a shot object;
and calculating distance data between the shot object and the shooting terminal according to the first ultrasonic signal and the second ultrasonic signal.
In some embodiments, controlling the photographing parameters when photographing the subject according to the distance data of the subject includes:
determining a first mapping relation between an actual distance between a focus object and a shooting terminal and a focal length of the shooting terminal according to equipment parameters of the shooting terminal;
determining a focus object in the shot object according to the selection operation of the terminal user;
screening the distance data of the shot object, and determining the distance data of the focus object;
calculating the focal length of the shooting terminal according to the first mapping relation and the distance data of the focal object;
and controlling focusing parameters in the shooting parameters according to the focal length of the shooting terminal.
In some embodiments, controlling the photographing parameters when photographing the subject according to the distance data of the subject includes:
determining a second mapping relation between the shooting depth of field of the shooting terminal and the aperture parameter of the shooting terminal according to the equipment parameter of the shooting terminal;
determining a focus object in the shot object according to the selection operation of the terminal user;
screening the distance data of the shot object, and determining the distance data of the focus object and the distance data of the non-focus object;
determining the distance between the focus object and the non-focus object according to the distance data of the focus object and the distance data of the non-focus object;
determining the shooting depth of field according to the distance between the focus object and the non-focus object, and calculating the aperture value of the shooting terminal according to the second mapping relation and the shooting depth of field;
and controlling the aperture parameters in the shooting parameters according to the aperture values.
In some embodiments, when the second mapping relationship is a mapping relationship between a depth of field relative size and an aperture relative size, determining a shooting depth of field according to a distance between a focus object and a non-focus object, and calculating the aperture value according to the second mapping relationship and the shooting depth of field includes:
determining the relative size of the shooting depth of field according to the distance between the focus object and the non-focus object;
when the shooting depth of field is large, the aperture value corresponds to a small aperture;
when the shooting depth of field is small, the aperture value corresponds to a large aperture.
In some embodiments, when the second mapping relationship is a mapping relationship between an absolute value of a depth of field and an absolute value of an aperture diameter, and the photographing terminal supports fine adjustment of the aperture, determining a photographing depth of field according to a distance between a focus object and a non-focus object, and calculating the aperture value according to the second mapping relationship and the photographing depth of field includes:
determining a front-depth-of-field object and a back-depth-of-field object among the non-focus objects;
respectively calculating a first maximum distance between the front-depth-of-field object and the focus object and a second maximum distance between the back-depth-of-field object and the focus object;
calculating the minimum value of the shooting depth of field at which the front depth of field covers the first maximum distance and the back depth of field covers the second maximum distance;
calculating the maximum value of the aperture diameter according to the minimum value of the shooting depth of field and the second mapping relation;
and taking the aperture value corresponding to the maximum aperture diameter as the aperture value.
In some embodiments, when the second mapping relationship is a mapping relationship between an absolute value of a depth of field and an absolute value of an aperture diameter, and the photographing terminal does not support aperture fine adjustment, determining a photographing depth of field according to a distance between a focus object and a non-focus object, and calculating the aperture value according to the second mapping relationship and the photographing depth of field includes:
determining a front-depth-of-field object and a back-depth-of-field object among the non-focus objects;
respectively calculating a first maximum distance between the front-depth-of-field object and the focus object and a second maximum distance between the back-depth-of-field object and the focus object;
calculating the minimum value of the shooting depth of field at which the front depth of field covers the first maximum distance and the back depth of field covers the second maximum distance;
calculating the maximum value of the aperture diameter according to the minimum value of the shooting depth of field and the second mapping relation;
determining the aperture level in which the maximum aperture diameter falls according to the aperture size range corresponding to each aperture level of the shooting terminal;
and taking the aperture value corresponding to that aperture level as the aperture value.
Meanwhile, the present invention provides a camera terminal, characterized in that the camera terminal comprises a memory, a processor, and a terminal shooting control program stored in the memory and executable on the processor; when executed by the processor, the terminal shooting control program implements the steps of the terminal shooting control method provided by the invention.
Meanwhile, the present invention provides a computer-readable storage medium, characterized in that a terminal shooting control program is stored thereon, which, when executed, implements the steps of the terminal shooting control method provided by the invention.
According to the terminal shooting control method, camera terminal, and computer-readable storage medium provided by the embodiments of the invention, the distance between the camera terminal and the photographed subject is measured by ultrasonic ranging and used as the object distance, and the shooting parameters are controlled according to the object distance. This realizes automatic control of the shooting parameters based on the object distance, solves the problem that existing terminal shooting control technology relies on the end user to manually control shooting parameters, and enhances the user experience.
Drawings
Fig. 1 is a schematic diagram of a hardware structure of an alternative mobile terminal for implementing various embodiments of the present invention;
FIG. 2 is a diagram illustrating the relationship between the depth of field and the focal length and aperture according to the present invention;
fig. 3 is a flowchart of a terminal shooting control method according to a first embodiment of the present invention;
fig. 4 is a flowchart of a terminal shooting control method according to a second embodiment of the present invention;
fig. 5 is a schematic structural diagram of a terminal according to a first embodiment of the present invention;
fig. 6 is a schematic structural diagram of a terminal according to a second embodiment of the present invention;
fig. 7 is a first schematic diagram of a subject according to an embodiment of the present invention;
fig. 8 is a second schematic diagram of a subject according to an embodiment of the present invention;
fig. 9 is a third schematic diagram of a subject according to an embodiment of the present invention;
fig. 10 is a fourth schematic diagram of a subject according to an embodiment of the present invention.
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
In the following description, suffixes such as "module", "component", or "unit" used to denote elements are used only to facilitate the explanation of the present invention and have no specific meaning in themselves. Thus, "module", "component", and "unit" may be used interchangeably.
The terminal may be implemented in various forms. For example, the terminal described in the present invention may include a mobile terminal such as a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a Personal Digital Assistant (PDA), a Portable Media Player (PMP), a navigation device, a wearable device, a smart band, a pedometer, and the like, and a fixed terminal such as a Digital TV, a desktop computer, and the like.
The following description takes a mobile terminal as an example; those skilled in the art will understand that the configuration according to the embodiments of the present invention can also be applied to fixed terminals, apart from elements used specifically for mobile purposes.
Referring to fig. 1, which is a schematic diagram of a hardware structure of a mobile terminal for implementing various embodiments of the present invention, the mobile terminal 100 may include: RF (Radio Frequency) unit 101, WiFi module 102, audio output unit 103, a/V (audio/video) input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 1 is not intended to be limiting of mobile terminals, which may include more or fewer components than those shown, or some components may be combined, or a different arrangement of components.
The following describes each component of the mobile terminal in detail with reference to fig. 1:
The radio frequency unit 101 may be configured to receive and transmit signals during information transmission and reception or during a call; specifically, it receives downlink information from a base station and delivers it to the processor 110 for processing, and transmits uplink data to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can communicate with a network and other devices through wireless communication. The wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System for Mobile communications), GPRS (General Packet Radio Service), CDMA2000 (Code Division Multiple Access 2000), WCDMA (Wideband Code Division Multiple Access), TD-SCDMA (Time Division-Synchronous Code Division Multiple Access), FDD-LTE (Frequency Division Duplex Long Term Evolution), and TDD-LTE (Time Division Duplex Long Term Evolution).
WiFi is a short-range wireless transmission technology. Through the WiFi module 102, the mobile terminal can help the user send and receive e-mail, browse web pages, access streaming media, and so on; it provides the user with wireless broadband Internet access. Although fig. 1 shows the WiFi module 102, it is not an essential part of the mobile terminal and may be omitted as needed without changing the essence of the invention.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the WiFi module 102 or stored in the memory 109 into an audio signal and output as sound when the mobile terminal 100 is in a call signal reception mode, a call mode, a recording mode, a voice recognition mode, a broadcast reception mode, or the like. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the mobile terminal 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 may include a speaker, a buzzer, and the like.
The a/V input unit 104 is used to receive audio or video signals. The a/V input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042. The graphics processing unit 1041 processes image data of still pictures or video obtained by an image capturing apparatus (such as the camera according to the present invention) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106, stored in the memory 109 (or other storage medium), or transmitted via the radio frequency unit 101 or the WiFi module 102. The microphone 1042 may receive sound (audio data) in a phone call mode, a recording mode, a voice recognition mode, or the like, and process it into audio data. In the phone call mode, the processed audio (voice) data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101. The microphone 1042 may implement various noise cancellation (or suppression) algorithms to cancel (or suppress) noise or interference generated while receiving and transmitting audio signals.
The mobile terminal 100 also includes at least one sensor 105, such as a light sensor, a motion sensor, an ultrasonic sensor 1051, and other sensors. Specifically, the light sensor includes an ambient light sensor, which can adjust the brightness of the display panel 1061 according to the ambient light, and a proximity sensor, which can turn off the display panel 1061 and/or the backlight when the mobile terminal 100 is moved to the ear. As one kind of motion sensor, the accelerometer can detect the magnitude of acceleration in each direction (generally three axes) and the magnitude and direction of gravity when stationary, and can be used for applications that recognize the posture of the mobile phone (such as horizontal/vertical screen switching, related games, magnetometer posture calibration) and vibration-recognition functions (such as pedometer, tapping). The ultrasonic sensor 1051 calculates the distance of a measured object using the characteristics of ultrasonic waves. Ultrasonic ranging generally adopts the transit-time method, that is, the distance is calculated as s = v·t/2, where s is the distance between the transceiver head and the measured object, v is the propagation speed of the ultrasonic wave in the medium (approximately v = 331.4·√(1 + T/273) m/s, with T the ambient temperature in °C), and t is the round-trip time of the ultrasonic wave. As for other sensors that can be configured on the mobile phone, such as a fingerprint sensor, pressure sensor, iris sensor, molecular sensor, gyroscope, barometer, hygrometer, thermometer, and infrared sensor, further description is omitted here.
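As a concrete illustration of the transit-time method just described, the following minimal Python sketch computes the object distance from a measured round-trip time. This is a sketch under stated assumptions: the function names and example numbers are illustrative, and the temperature-corrected speed-of-sound formula is the standard one the text appears to quote (reconstructed above).

```python
import math

def speed_of_sound(temp_celsius: float) -> float:
    """Propagation speed of ultrasound in air: v = 331.4 * sqrt(1 + T/273) m/s."""
    return 331.4 * math.sqrt(1.0 + temp_celsius / 273.0)

def object_distance(round_trip_s: float, temp_celsius: float = 20.0) -> float:
    """Transit-time ranging: s = v * t / 2, since the echo travels out and back."""
    return speed_of_sound(temp_celsius) * round_trip_s / 2.0

# Example: a 5.8 ms round trip at 20 degrees C corresponds to roughly 1 m.
print(object_distance(5.8e-3))  # ~1.0
```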
The display unit 106 is used to display information input by a user or information provided to the user. The Display unit 106 may include a Display panel 1061, and the Display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 107 may include a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations performed by the user on or near it (e.g., operations performed with a finger, a stylus, or any other suitable object or accessory) and drive the corresponding connection device according to a preset program. The touch panel 1071 may include a touch detection device and a touch controller. The touch detection device detects the position of the user's touch and the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information, converts it into touch-point coordinates, sends them to the processor 110, and receives and executes commands sent by the processor 110. The touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072, which may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys), a trackball, a mouse, a joystick, and the like.
Further, the touch panel 1071 may cover the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although the touch panel 1071 and the display panel 1061 are shown in fig. 1 as two separate components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the mobile terminal, and is not limited herein.
The interface unit 108 serves as an interface through which at least one external device is connected to the mobile terminal 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 100 or may be used to transmit data between the mobile terminal 100 and external devices.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area: the program storage area may store an operating system, application programs required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the data storage area may store data created according to the use of the mobile phone (such as audio data, a phonebook, etc.), and the like. Further, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the mobile terminal. Processor 110 may include one or more processing units; preferably, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The mobile terminal 100 may further include a power supply 111 (e.g., a battery) for supplying power to the various components. Preferably, the power supply 111 may be logically connected to the processor 110 via a power management system, so as to manage charging, discharging, and power consumption via the power management system.
Although not shown in fig. 1, the mobile terminal 100 may further include a bluetooth module or the like, which is not described in detail herein.
In practical applications, the camera terminal related to the present invention may be any device or terminal having a camera and a processor, including the mobile terminal shown in fig. 1, such as a mobile phone, a tablet computer, a notebook computer, a video camera, and so on.
In practical applications, the shooting parameters according to the present invention may include all camera working parameters related to the object distance, such as the focusing parameter (focal length) and the aperture parameter (aperture size) described in detail below. The focusing parameter is applied by controlling the distance between the lenses in the camera, and the aperture parameter by controlling the size of the aperture opening. Focusing is comparatively simple, since it concerns only the focus object itself, whereas the choice of a reasonable depth of field affects the sharpness of non-focus objects near the focus object; it is therefore explained in more detail with reference to fig. 2.
Circle of confusion: before and after the focal point, light converges and then diverges, so the image of a point becomes a blurred, enlarged circle known as the circle of confusion. In practice, when a captured image is viewed in some manner (for example, projected or enlarged into a photograph), the sharpness perceived by the human eye depends greatly on the magnification, projection distance, and viewing distance; if the diameter of the circle of confusion is smaller than the resolving power of the human eye, the blur in the actual image cannot be recognized within a certain range. Such an unrecognizable circle of confusion is called the permissible circle of confusion.
As shown in (1) of fig. 2, there is a permissible circle of confusion both in front of and behind the focal point, and the distance between the two is called the depth of field: the range in front of and behind the subject (the focus point) within which the image remains acceptably sharp, i.e., within which the blur rendered on the film plane stays inside the permissible circle of confusion. In other words, the depth of field is the acceptably sharp zone extending from in front of the focus point to behind it.
As shown in (2) in fig. 2, the distance from the focal point to the near allowable circle of confusion is called the front depth of field, and the distance from the focal point to the far allowable circle of confusion is called the back depth of field, specifically:
depth of field: ΔL = (2·f²·F·δ·L²) / (f⁴ − F²·δ²·L²);
back depth of field: ΔL2 = (F·δ·L²) / (f² − F·δ·L);
front depth of field: ΔL1 = (F·δ·L²) / (f² + F·δ·L);
where δ is the diameter of the permissible circle of confusion, f is the lens focal length, F is the aperture value, and L is the focus distance.
as shown in (3) in fig. 2, the depth of field varies with the focal length of the lens, the aperture value, and the shooting distance, and for a fixed focal length and shooting distance, the smaller the aperture is used, that is, the smaller the diameter of the lens, the larger the depth of field.
Based on the hardware structure of the mobile terminal and the relation between the depth of field and the focal length and the aperture, the embodiments of the invention are provided.
As shown in fig. 3, a first embodiment of the terminal shooting control method of the present invention is proposed, and in this embodiment, the terminal shooting control method includes the following steps:
S301: Sending a first ultrasonic signal outwards and receiving a second ultrasonic signal formed by reflection of the first ultrasonic signal. In practical application, this step is implemented by the ultrasonic sensor 1051 in the camera terminal under the control of the processor 110; its specific implementation can use conventional technology and is not described in detail here.
S302: determining distance data between the shot object and the shooting terminal according to the first ultrasonic signal and the second ultrasonic signal;
S303: Controlling the shooting parameters for shooting the object according to the distance data of the object.
In practical applications, the image capturing according to the present invention includes capturing a video, capturing a photograph, and the like.
In practical application, there are two specific ways of determining the distance data between the photographed subject and the shooting terminal according to the first and second ultrasonic signals:
Mode one: sending the first ultrasonic signal omnidirectionally; calculating, according to the first and second ultrasonic signals, the distance data between the shooting terminal and all objects within its sensing range; determining the photographed subject according to the shooting direction of the camera of the shooting terminal; and screening the distance data of all objects to determine the distance data of the subject. In this mode, the ultrasonic sensor can start ranging as soon as the user opens the camera application, because the range the camera terminal can sense from a fixed position is fixed, and the subject can then only be selected from the objects within that sensing range.
Mode two: determining the photographed subject according to the shooting direction of the camera of the shooting terminal; directionally sending the first ultrasonic signal toward the subject; and calculating the distance data between the subject and the shooting terminal according to the first and second ultrasonic signals. This mode measures only the subject selected by the user, so the measurement time is short; it is mainly used when the terminal battery is low or when the user does not change the subject frequently.
In practical application, the choice between the two modes can be made actively by the end user, by the camera terminal as a default, or by the camera terminal according to the battery level: for example, when the remaining battery is above 60%, mode one is selected to avoid starting the ultrasonic sensor repeatedly, and when it is below 60%, mode two is selected to reduce the working time of the ultrasonic sensor per measurement. The camera terminal may also choose according to the sharpness set by the user: when the required sharpness is high, mode one is used, which avoids repeated sensor start-ups caused by the user frequently switching subjects. A sketch of such a selection policy is given below.
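A hedged Python sketch of that selection policy follows; the 60% threshold comes from the text, while the function and mode names are illustrative assumptions.

```python
from typing import Optional

def choose_ranging_mode(battery_pct: float, user_choice: Optional[str] = None) -> str:
    """Return 'mode_one' (omnidirectional) or 'mode_two' (directional)."""
    if user_choice in ("mode_one", "mode_two"):
        return user_choice       # an active end-user selection takes priority
    if battery_pct > 60.0:
        return "mode_one"        # ample battery: avoid repeated sensor start-ups
    return "mode_two"            # low battery: shorten each sensor working period

print(choose_ranging_mode(75.0))              # mode_one
print(choose_ranging_mode(40.0))              # mode_two
print(choose_ranging_mode(40.0, "mode_one"))  # user override wins
```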
In practical applications, when the shooting parameters include the focusing parameter, controlling the shooting parameters according to the distance data of the subject includes: determining, according to the device parameters of the shooting terminal, a first mapping relationship between the actual distance from the focus object to the shooting terminal and the focal length of the shooting terminal; determining the focus object among the photographed subjects according to the selection operation of the end user; screening the distance data of the subjects to determine the distance data of the focus object; calculating the focal length of the shooting terminal according to the first mapping relationship and the distance data of the focus object; and controlling the focusing parameter among the shooting parameters according to the calculated focal length.
As shown in figs. 7 to 10, the focus object is object A, and the straight-line distance D1 between object A and the camera, measured by the ultrasonic sensor, is used as the object distance.
After the object distance is measured, it cannot be used directly as the adjustment parameter for setting the focal length in the camera module or for focusing; it therefore needs to be converted into a focusing parameter suitable for the camera module.
In practical application, the mapping between the object distance and the focal length used by the camera module can be preset or calculated from actual measurement data. Preferably, when it is preset, a tester measures the correspondence between object distance and focal length before the camera module leaves the factory and also sets a corresponding adjustment control instruction for each focal length; the corresponding control instruction is then looked up according to the calculated focal length and executed directly to perform the autofocus operation. Optionally, corresponding control instructions can be set for different motor rotation amplitudes according to the required distance between the subject and the camera, the preset relationship between object distance and focal length, and the correspondence between each focal length and the rotation amplitude of the control motor; based on the object distance measured by ultrasound, the control instruction corresponding to the matching preset object distance is found, and executing this instruction drives the motor to the focus position, realizing autofocus. In this way, the measured distance between the subject and the camera and the preset correspondence between object distance and motor rotation amplitude drive the motor to the focus position, improving both the focusing speed and the focusing accuracy of the shooting device. A lookup sketch follows.
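The following Python sketch illustrates such a factory-calibrated lookup from measured object distance to a focus-motor position; the table values and names are invented for illustration, since real calibration data is specific to each camera module.

```python
from bisect import bisect_left

# (object distance in mm, focus-motor step) pairs from factory calibration,
# sorted by distance; the numbers here are purely illustrative.
FOCUS_TABLE = [(300, 980), (500, 760), (1000, 520), (3000, 310), (10000, 120)]

def focus_motor_position(object_distance_mm: float) -> int:
    """Pick the calibration entry whose preset object distance is nearest."""
    distances = [d for d, _ in FOCUS_TABLE]
    i = bisect_left(distances, object_distance_mm)
    if i == 0:
        return FOCUS_TABLE[0][1]
    if i == len(FOCUS_TABLE):
        return FOCUS_TABLE[-1][1]
    lo, hi = FOCUS_TABLE[i - 1], FOCUS_TABLE[i]
    return lo[1] if object_distance_mm - lo[0] <= hi[0] - object_distance_mm else hi[1]

# Example: an ultrasonically measured object distance of 1.2 m.
print(focus_motor_position(1200))  # 520, the entry calibrated for 1 m
```

Driving the motor to the returned step position then corresponds to executing the preset control instruction described above.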
In practical applications, when the shooting parameters include the aperture parameter, controlling the shooting parameters according to the distance data of the subject includes: determining, according to the device parameters of the shooting terminal, a second mapping relationship between the shooting depth of field of the terminal and its aperture parameter; determining the focus object among the photographed subjects according to the selection operation of the end user; screening the distance data of the subjects to determine the distance data of the focus object and of the non-focus objects; determining the distances between the focus object and the non-focus objects from these distance data; determining the shooting depth of field according to those distances and calculating the aperture value of the shooting terminal according to the second mapping relationship and the shooting depth of field; and controlling the aperture parameter among the shooting parameters according to the aperture value.
Specifically, the adjustment and control of the aperture parameter is similar to the control of the focusing parameter (focal length): the aperture size is controlled through motor rotation and the like, and the specific control process is not repeated here.
A "large depth of field" here is a relative concept, mainly used in scenes with low requirements on the depth-of-field parameters. That is, in practical application, when the second mapping relationship is a mapping between the relative size of the depth of field and the relative size of the aperture, as shown in (3) of fig. 2, determining the shooting depth of field according to the distance between the focus object and the non-focus objects and calculating the aperture value according to the second mapping relationship and the shooting depth of field includes: determining the relative size of the shooting depth of field according to the distance between the focus object and the non-focus objects; when the shooting depth of field is large, the aperture value corresponds to a small aperture, and when it is small, to a large aperture.
Specifically, as shown in fig. 7, the photographed subjects include a focus object A, a non-focus object B, and a non-focus object C. The distances between the focus object and the non-focus objects are relatively small, so a small depth of field suffices to keep the subjects sharp, and the aperture value is therefore a large aperture. As shown in fig. 8, the subjects again include a focus object A, a non-focus object B, and a non-focus object C, but the distances between the focus object and the non-focus objects are relatively large, so a large depth of field is required to ensure sharpness, and the aperture value is therefore a small aperture.
For a camera terminal with a high-performance camera, for example one supporting multiple aperture levels (commonly F1.0, F1.4, F2.0, F2.8, F4.0, F5.6, F8.0, F11, F16, F22, F32, F45 and F64, where a smaller aperture F value means a larger clear aperture) or one supporting precise control of the aperture size (any value between F1.0 and F64 can be set, rather than only discrete levels), the present invention can also calculate the aperture value precisely in order to make full use of the device's performance.
In practical applications, when the second mapping relationship is a mapping between the absolute value of the depth of field and the absolute value of the aperture diameter, i.e. the calculation relationships shown in (2) of fig. 2, determining the shooting depth of field according to the distance between the focus object and the non-focus objects includes: determining the front-depth-of-field objects and the back-depth-of-field objects among the non-focus objects; respectively calculating the first maximum distance between the front-depth-of-field objects and the focus object and the second maximum distance between the back-depth-of-field objects and the focus object; calculating the minimum shooting depth of field at which the front depth of field covers the first maximum distance and the back depth of field covers the second maximum distance; calculating the maximum aperture diameter according to this minimum depth of field and the second mapping relationship; and then determining the subsequent adjustment mode according to the device capability. Device capability here refers to whether the shooting terminal supports precise aperture adjustment. When it does not, the aperture level in which the maximum aperture diameter falls is determined according to the aperture size range corresponding to each aperture level of the terminal, and that level's aperture value is used as the aperture value; when it does, the aperture value corresponding to the maximum aperture diameter is used directly.
Specifically, as shown in fig. 9, the subjects include a focus object A, a non-focus object B, and a non-focus object C, where B is a back-depth-of-field object and C is a front-depth-of-field object. To ensure the sharpness of these three objects, the calculation is as follows:
the distance from the focus object A to the camera terminal is D1, from the non-focus object B is D2, and from the non-focus object C is D3. Since the back-depth-of-field objects include only B and the front-depth-of-field objects include only C, the first maximum distance between the front-depth-of-field object and the focus object is L1 = D1 − D3, and the second maximum distance between the back-depth-of-field object and the focus object is L2 = D2 − D1;
when the front depth of field covers the first maximum distance and the back depth of field covers the second maximum distance, that is:
ΔL2 > L2, and ΔL1 > L1.
Since all parameters in the above formulas except the aperture value F are known, the admissible range of F can be calculated. To ensure the maximum light flux, the maximum aperture diameter is needed, which corresponds to the minimum shooting depth of field ΔL(min).
From the second mapping relationship and the minimum shooting depth of field ΔL(min), the maximum aperture diameter can be calculated, and the minimum aperture F value F(min) is then determined from the calculation relationship between aperture diameter and aperture F value.
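This derivation can be carried out in closed form: solving ΔL1 ≥ L1 and ΔL2 ≥ L2 with the front/back depth-of-field formulas gives a lower bound on F. The following Python sketch does this; the closed-form rearrangement and the example numbers are assumptions for illustration, not taken verbatim from the patent.

```python
def min_aperture_value(delta: float, f: float, L: float, L1: float, L2: float) -> float:
    """Smallest F whose front depth covers L1 and back depth covers L2.

    From dL1 = F*delta*L^2/(f^2 + F*delta*L) >= L1 and
         dL2 = F*delta*L^2/(f^2 - F*delta*L) >= L2
    (assumes L > L1 and f^2 > F*delta*L so the back depth is finite).
    """
    F_front = L1 * f**2 / (delta * L * (L - L1))
    F_back = L2 * f**2 / (delta * L * (L + L2))
    return max(F_front, F_back)

# Example in the spirit of fig. 9: D1 = 3 m, D3 = 2.6 m, D2 = 3.5 m, 50 mm lens,
# 0.03 mm permissible circle of confusion (all lengths in mm).
L1, L2 = 3000 - 2600, 3500 - 3000     # L1 = D1 - D3, L2 = D2 - D1
F_min = min_aperture_value(0.03, 50.0, 3000.0, L1, L2)
print(F_min)                           # ~4.27; the maximum aperture diameter is then f / F_min
```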
As shown in fig. 10, the subjects include a focus object A, non-focus objects B1 and B2 (back-depth-of-field objects), and non-focus objects C1 and C2 (front-depth-of-field objects). To ensure the sharpness of these five objects, the calculation is as follows:
the distance from the focus object A to the camera terminal is D1; from B1, D2-1; from B2, D2-2; from C1, D3-1; and from C2, D3-2. Since the back-depth-of-field objects include B1 and B2 and the front-depth-of-field objects include C1 and C2, and since D1 − D3-1 > D1 − D3-2, the first maximum distance between the front-depth-of-field objects and the focus object is L1 = D1 − D3-1; and since D2-1 − D1 > D2-2 − D1, the second maximum distance between the back-depth-of-field objects and the focus object is L2 = D2-1 − D1;
when the front depth of field covers the first maximum distance and the back depth of field covers the second maximum distance, that is:
ΔL2 > L2, and ΔL1 > L1.
Since all parameters in the above formulas except the aperture value F are known, the admissible range of F can be calculated. To ensure the maximum light flux, the maximum aperture diameter is needed, which corresponds to the minimum shooting depth of field ΔL(min).
From the second mapping relationship and the minimum shooting depth of field ΔL(min), the maximum aperture diameter can be calculated, and the minimum aperture F value F(min) is then determined from the calculation relationship between aperture diameter and aperture F value.
When the shooting terminal supports precise aperture adjustment, the aperture value corresponding to the maximum aperture diameter is used directly, i.e., the aperture F value is adjusted to F(min).
For the case that the shooting terminal does not support precise aperture adjustment, the aperture level in which the maximum aperture diameter falls is determined according to the aperture size range corresponding to each aperture level of the terminal, and that level's aperture value is used as the aperture value. For example, for a terminal supporting the aperture levels F1.0, F1.4, F2.0, F2.8, F4.0, F5.6, F8.0, F11, F16, F22, F32, F45 and F64 (the smaller the aperture F value, the larger the clear aperture), when F32 > F(min) > F22, F22 is taken as the aperture value; a sketch of this snapping step is given below.
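A minimal sketch of that snapping step, assuming the common level series quoted above; it returns the largest supported F value not exceeding F(min), matching the F32 > F(min) > F22 example, where F22 is used.

```python
APERTURE_LEVELS = [1.0, 1.4, 2.0, 2.8, 4.0, 5.6, 8.0, 11.0, 16.0, 22.0, 32.0, 45.0, 64.0]

def snap_to_level(f_min: float) -> float:
    """Largest supported F value that does not exceed F(min)."""
    candidates = [F for F in APERTURE_LEVELS if F <= f_min]
    return candidates[-1] if candidates else APERTURE_LEVELS[0]

print(snap_to_level(27.3))  # 22.0, as in the F32 > F(min) > F22 example above
```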
In summary, the terminal shooting control method provided in this embodiment measures the distance between the camera terminal and the subject by ultrasonic ranging, uses it as the object distance, and controls the shooting parameters according to the object distance. This realizes automatic control of the shooting parameters based on the object distance, solves the problem that existing terminal shooting control technology relies on the end user to manually control shooting parameters, and enhances the user experience.
As shown in fig. 4, a second embodiment of the terminal shooting control method of the present invention is proposed, in which the terminal shooting control method includes the steps of:
S401: Setting the working mode of the ultrasonic sensor;
in this embodiment, the ultrasonic sensor operates in the second manner (i.e., determining the subject according to the shooting direction of the camera of the shooting terminal, sending the first ultrasonic signal to the subject in a directional manner, and calculating the distance data between the subject and the shooting terminal according to the first ultrasonic signal and the second ultrasonic signal).
S402: the terminal user opens the photographing application and selects a focus object;
in the present embodiment, as shown in fig. 10, the photographic subjects include 5 photographic subjects, which are respectively recorded as a photographic subject a, a photographic subject B1, a photographic subject B2, a photographic subject C1, and a photographic subject C2, wherein the photographic subject a is selected as a focus object by the user.
S403: measuring distance data of a subject by an ultrasonic sensor;
in the present embodiment, surface distance data of all subjects are measured by the ultrasonic sensor at the same time;
S404: Performing shooting control according to the distance data of the shot object;
in the present embodiment, focus control is performed according to the distance of the subject a, and control of the aperture parameter is performed according to the distances of the subject a, the subject B1, the subject B2, the subject C1, and the subject C2, and the specific processing procedures have been described in detail above, and are not described in detail in this embodiment.
This embodiment provides a terminal shooting control method that measures the distance between the camera terminal and the photographed subject by ultrasonic ranging, uses it as the object distance, and controls the shooting parameters according to the object distance, thereby realizing automatic control of the shooting parameters based on the object distance, solving the problem that existing terminal shooting control technology relies on the end user to manually control shooting parameters, and enhancing the user experience.
As shown in fig. 5, an embodiment of the camera terminal according to the present invention is proposed based on the above-mentioned mobile terminal hardware structure, and specifically, the camera terminal according to the present invention includes:
The acquisition module 51 is configured to send a first ultrasonic signal outwards through the ultrasonic sensor and receive a second ultrasonic signal formed by reflection of the first ultrasonic signal. In practical application, this is implemented by the ultrasonic sensor 1051 in the camera terminal under the control of the processor 110; the acquisition module 51 can be implemented using conventional technology and is not described in detail here.
A calculating module 52, configured to determine distance data between the subject and the shooting terminal according to the first ultrasonic signal and the second ultrasonic signal;
and a control module 53 for controlling shooting parameters when shooting the subject according to the distance data of the subject.
In practical applications, the acquisition module 51 may operate in either of the following ways:
the method comprises the steps of sending a first ultrasonic signal in a non-directional mode, calculating distance data between all objects and a shooting terminal in a sensing range of the shooting terminal according to the first ultrasonic signal and a second ultrasonic signal, determining a shot object according to the shooting direction of a camera of the shooting terminal, screening the distance data between all the objects and the shooting terminal, and determining the distance data between the shot object and the shooting terminal;
and secondly, determining a shot object according to the shooting direction of a camera of the shooting terminal, directionally sending a first ultrasonic signal to the shot object, and calculating distance data between the shot object and the shooting terminal according to the first ultrasonic signal and a second ultrasonic signal.
In practical application, the choice between the two modes can be made actively by the end user, by the camera terminal as a default, or by the camera terminal according to the battery level: for example, when the remaining battery is above 60%, mode one is selected to avoid starting the ultrasonic sensor repeatedly, and when it is below 60%, mode two is selected to reduce the working time of the ultrasonic sensor per measurement. The camera terminal may also choose according to the sharpness set by the user: when the required sharpness is high, mode one is used, which avoids repeated sensor start-ups caused by the user frequently switching subjects.
In practical applications, the control module 53 may operate as follows: determining, according to the device parameters of the shooting terminal, a first mapping relationship between the actual distance from the focus object to the shooting terminal and the focal length of the shooting terminal; determining the focus object among the photographed subjects according to the selection operation of the end user; screening the distance data of the subjects to determine the distance data of the focus object; calculating the focal length of the shooting terminal according to the first mapping relationship and the distance data of the focus object; and controlling the focusing parameter among the shooting parameters according to the calculated focal length.
As shown in figs. 7 to 10, the focus object is object A, and the straight-line distance D1 between object A and the camera, measured by the ultrasonic sensor, is used as the object distance.
After the object distance is measured, it cannot be used directly as the adjustment parameter for setting the focal length in the camera module or for focusing; it therefore needs to be converted into a focusing parameter suitable for the camera module.
In practical application, the mapping between the object distance and the focal length used by the camera module can be preset or calculated from actual measurement data. Preferably, when it is preset, a tester measures the correspondence between object distance and focal length before the camera module leaves the factory and also sets a corresponding adjustment control instruction for each focal length; the corresponding control instruction is then looked up according to the calculated focal length and executed directly to perform the autofocus operation. Optionally, corresponding control instructions can be set for different motor rotation amplitudes according to the required distance between the subject and the camera, the preset relationship between object distance and focal length, and the correspondence between each focal length and the rotation amplitude of the control motor; based on the object distance measured by ultrasound, the control instruction corresponding to the matching preset object distance is found, and executing this instruction drives the motor to the focus position, realizing autofocus. In this way, the measured distance between the subject and the camera and the preset correspondence between object distance and motor rotation amplitude drive the motor to the focus position, improving both the focusing speed and the focusing accuracy of the shooting device.
In practical applications, the control module 53 may operate as follows: determining a second mapping relation between the shooting depth of field of the shooting terminal and the aperture parameter of the shooting terminal according to the device parameters of the shooting terminal; determining a focus object among the shot objects according to a selection operation of the terminal user; screening the distance data of the shot objects to obtain the distance data of the focus object and of the non-focus objects; determining the distances between the focus object and the non-focus objects from those data; determining the shooting depth of field according to those distances; calculating the aperture value of the shooting terminal according to the second mapping relation and the shooting depth of field; and controlling the aperture parameter among the shooting parameters according to the aperture value.
Specifically, the adjustment of the aperture parameter is similar to the control of the focusing parameter (focal length): the aperture size is controlled through the rotation of a motor or a similar mechanism, and the specific control process is not repeated here.
A large depth of field is a relative concept, used mainly in scenes with low requirements on the depth-of-field parameters. That is, in practical application, when the second mapping relation is a mapping between the relative size of the depth of field and the relative size of the aperture, as shown in (3) in fig. 2, determining the shooting depth of field according to the distances between the focus object and the non-focus objects and calculating the aperture value according to the second mapping relation and the shooting depth of field comprises: determining the relative size of the shooting depth of field from those distances, where the aperture value is a small aperture when the shooting depth of field is large, and a large aperture when the shooting depth of field is small.
Specifically, as shown in fig. 7, the shot objects include a focus object A and non-focus objects B and C whose distances from the focus object are relatively small; a small depth of field is enough to keep the shot objects sharp, so the aperture value is a large aperture. As shown in fig. 8, the shot objects again include focus object A and non-focus objects B and C, but the distances between the focus object and the non-focus objects are relatively large; a large depth of field is required to ensure sharpness, so the aperture value is a small aperture.
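A sketch of this relative mapping follows, assuming a hypothetical spread threshold; the patent fixes only the qualitative rule that a larger spread calls for a larger depth of field and hence a smaller aperture:

def aperture_for_spread(focus_d, non_focus_ds, spread_threshold_m=1.0):
    # Largest separation between the focus object and any non-focus object.
    spread = max(abs(d - focus_d) for d in non_focus_ds)
    if spread <= spread_threshold_m:
        return "large aperture"   # small depth of field suffices (fig. 7)
    return "small aperture"       # large depth of field needed (fig. 8)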
For a camera terminal with a high-performance camera, for example one supporting many aperture levels (commonly F1.0, F1.4, F2.0, F2.8, F4.0, F5.6, F8.0, F11, F16, F22, F32, F45 and F64, where a smaller F-number corresponds to a larger clear aperture) or one supporting precise control of the aperture size (any value between F1.0 and F64 may be set, rather than only discrete levels), the present invention can also calculate the aperture value precisely, so that the device performance is fully used to take the best picture.
In practical applications, when the second mapping relation is a mapping between the absolute value of the depth of field and the absolute value of the aperture diameter, i.e. the calculation relation shown in (2) in fig. 2, the control module 53 may operate as follows: determining the front depth-of-field object and the rear depth-of-field object among the non-focus objects; calculating a first maximum distance between the front depth-of-field object and the focus object and a second maximum distance between the rear depth-of-field object and the focus object; calculating the minimum value of the shooting depth of field at which the front depth of field covers the first maximum distance and the rear depth of field covers the second maximum distance; calculating the maximum value of the aperture diameter according to that minimum depth of field and the second mapping relation; and then determining the subsequent adjustment according to the device capability. The device capability refers to whether the shooting terminal supports fine aperture adjustment. When the shooting terminal does not support fine aperture adjustment, the aperture level containing the maximum aperture diameter is determined according to the aperture size range of each aperture level of the shooting terminal, and the aperture value of that level is used as the aperture value; when the shooting terminal supports fine aperture adjustment, the aperture value corresponding to the maximum aperture diameter is used directly as the aperture value.
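A sketch of this absolute-value branch follows. The second mapping relation is modelled here as an assumed inverse proportionality between the minimum depth of field and the maximum aperture diameter, and the aperture levels and their diameters are likewise illustrative, not real device parameters:

K_MM = 50.0  # assumed constant of the second mapping relation

APERTURE_LEVELS = [  # (f-number label, max clear diameter in mm) - assumed
    ("F1.0", 35.0), ("F1.4", 25.0), ("F2.0", 17.5), ("F2.8", 12.5),
    ("F4.0", 8.75), ("F5.6", 6.25), ("F8.0", 4.4),
]

def aperture_value(front_max_m, rear_max_m, supports_fine_adjustment):
    # Minimum shooting depth of field that still covers both extremes.
    min_dof = front_max_m + rear_max_m
    max_diameter = K_MM / min_dof  # second mapping relation (assumed form)
    if supports_fine_adjustment:
        return max_diameter  # use the exact diameter as the aperture value
    # Otherwise pick the largest discrete level not exceeding the maximum.
    for label, diameter in APERTURE_LEVELS:
        if diameter <= max_diameter:
            return label
    return APERTURE_LEVELS[-1][0]  # fall back to the smallest aperture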
In summary, according to the camera terminal provided in this embodiment, the terminal measures the distance parameter between the camera terminal and the shot object through ultrasonic ranging, uses it as the object distance, and controls the shooting parameters according to the object distance. This realizes the function of automatically controlling the shooting parameters according to the object distance, solves the problem that existing terminal shooting control technology depends on the terminal user manually controlling the shooting parameters, and enhances the user experience.
In an embodiment of the present invention, the processor 110 in fig. 1 may include the functions of all the functional modules in the embodiment shown in fig. 5. In this case, the above embodiment may operate as follows:
firstly, the processor 110 sends a first ultrasonic signal to the outside through the ultrasonic sensor 1051, and receives a second ultrasonic signal formed by reflecting the first ultrasonic signal;
then, the processor 110 determines distance data between the subject and the shooting terminal according to the first ultrasonic signal and the second ultrasonic signal;
finally, the processor 110 controls shooting parameters when shooting the subject according to the distance data of the subject.
In practical applications, the processor 110 is configured to: send the first ultrasonic signal non-directionally, calculate distance data between the shooting terminal and all objects within its sensing range according to the first and second ultrasonic signals, determine the shot object according to the shooting direction of the shooting terminal's camera, and screen the distance data of all the objects to obtain the distance data between the shot object and the shooting terminal; or, determine the shot object according to the shooting direction of the shooting terminal's camera, send the first ultrasonic signal directionally toward the shot object, and calculate the distance data between the shot object and the shooting terminal according to the first and second ultrasonic signals.
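For illustration, a Python sketch of the non-directional branch: the distance follows from the echo round-trip time (d = v * t / 2, with v approximately 343 m/s in air), and the screening step keeps only the echoes whose bearing falls within the camera's field of view; the field names and FOV value are assumptions:

SPEED_OF_SOUND_M_S = 343.0

def echo_distance_m(round_trip_s):
    # Distance from the round-trip time of the reflected second signal.
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

def screen_by_direction(echoes, camera_bearing_deg, fov_deg=70.0):
    # Keep the distance data of objects lying in the shooting direction
    # (bearing wraparound is ignored for brevity).
    half = fov_deg / 2.0
    return [echo_distance_m(e["round_trip_s"]) for e in echoes
            if abs(e["bearing_deg"] - camera_bearing_deg) <= half]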
In practical applications, the processor 110 is configured to: determine a first mapping relation between the actual distance from a focus object to the shooting terminal and the focal length of the shooting terminal according to the device parameters of the shooting terminal; determine the focus object among the shot objects according to a selection operation of the terminal user; screen the distance data of the shot objects to obtain the distance data of the focus object; calculate the focal length of the shooting terminal according to the first mapping relation and the distance data of the focus object; and control the focusing parameter among the shooting parameters according to that focal length.
In practical applications, the processor 110 is configured to: determine a second mapping relation between the shooting depth of field of the shooting terminal and the aperture parameter of the shooting terminal according to the device parameters of the shooting terminal; determine a focus object among the shot objects according to a selection operation of the terminal user; screen the distance data of the shot objects to obtain the distance data of the focus object and of the non-focus objects; determine the distances between the focus object and the non-focus objects from those data; determine the shooting depth of field according to those distances; calculate the aperture value of the shooting terminal according to the second mapping relation and the shooting depth of field; and control the aperture parameter among the shooting parameters according to the aperture value.
In summary, according to the camera terminal provided in this embodiment, the terminal measures the distance parameter between the camera terminal and the shot object through ultrasonic ranging, uses it as the object distance, and controls the shooting parameters according to the object distance. This realizes the function of automatically controlling the shooting parameters according to the object distance, solves the problem that existing terminal shooting control technology depends on the terminal user manually controlling the shooting parameters, and enhances the user experience.
As shown in fig. 6, a second embodiment of the terminal of the present invention is proposed. In this embodiment, the terminal at least includes: an input/output (IO) bus 61, a processor 62, a memory 63, a RAM 64, a camera 65, an ultrasonic sensor 66, and a terminal shooting control program stored in the memory 63 and executable on the processor 62, the program implementing the following steps when executed by the processor. Wherein,
the input/output (IO) bus 61 is connected to other components (the processor 62, the RAM 63, the memory 64, the camera 65, and the ultrasonic sensor 66) of the terminal to which it belongs, and provides a transmission line for the other components.
The processor 62 typically controls the overall operation of the terminal to which it belongs, performing computation, verification and similar tasks. The processor 62 may be a central processing unit (CPU).
The memory 63 stores processor-readable, processor-executable software code containing instructions for controlling the processor 62 to perform the functions described herein. In the present embodiment, the memory 63 needs to store at least the programs necessary for the processor 62 to perform those functions.
In the terminal control device provided by the present invention, the software code implementing all the functions of the modules in fig. 5 may be stored in the memory 63 and executed, directly or after compilation, by the processor 62.
The RAM 64 is typically a semiconductor memory unit; working memory comprises random access memory (RAM), read-only memory (ROM) and cache (CACHE), of which RAM is the most important. Memory is one of the key components of a computer and is the bridge through which the CPU communicates: all running programs execute in memory, which temporarily stores the CPU's working data and the data exchanged with external storage such as a hard disk. While the computer is operating, the CPU moves the data to be processed into memory, performs the operation, and transmits the result when the operation is completed; the operation of the memory therefore also determines how stably the computer runs.
The camera 65 is configured to capture images and transmit them to the processor 62.
The ultrasonic sensor 66 is configured to measure distance.
On the basis of the terminal components shown in fig. 6, the terminal shooting control program provided by this embodiment implements the following steps when executed by a processor:
sending a first ultrasonic signal outwards through the ultrasonic sensor 66, and receiving a second ultrasonic signal formed by reflection of the first ultrasonic signal;
determining distance data between the shot object and the shooting terminal according to the first ultrasonic signal and the second ultrasonic signal;
according to the distance data of the object, the shooting parameters when shooting the object are controlled.
In practical application, the terminal shooting control program realizes the following steps when executed by the processor:
sending the first ultrasonic signal non-directionally, calculating distance data between the shooting terminal and all objects within its sensing range according to the first and second ultrasonic signals, determining the shot object according to the shooting direction of the shooting terminal's camera, and screening the distance data of all the objects to obtain the distance data between the shot object and the shooting terminal;
or, determining the shot object according to the shooting direction of the shooting terminal's camera, sending the first ultrasonic signal directionally toward the shot object, and calculating the distance data between the shot object and the shooting terminal according to the first and second ultrasonic signals.
In practical application, the terminal shooting control program realizes the following steps when executed by the processor:
determining a first mapping relation between the actual distance from a focus object to the shooting terminal and the focal length of the shooting terminal according to the device parameters of the shooting terminal; determining the focus object among the shot objects according to a selection operation of the terminal user; screening the distance data of the shot objects to obtain the distance data of the focus object; calculating the focal length of the shooting terminal according to the first mapping relation and the distance data of the focus object; and controlling the focusing parameter among the shooting parameters according to that focal length.
In practical application, the terminal shooting control program realizes the following steps when executed by the processor:
determining a second mapping relation between the shooting depth of field of the shooting terminal and the aperture parameter of the shooting terminal according to the device parameters of the shooting terminal; determining a focus object among the shot objects according to a selection operation of the terminal user; screening the distance data of the shot objects to obtain the distance data of the focus object and of the non-focus objects; determining the distances between the focus object and the non-focus objects from those data; determining the shooting depth of field according to those distances; calculating the aperture value of the shooting terminal according to the second mapping relation and the shooting depth of field; and controlling the aperture parameter among the shooting parameters according to the aperture value.
In practical application, the terminal shooting control program realizes the following steps when executed by the processor:
determining the relative size of the shooting depth of field according to the distances between the focus object and the non-focus objects, where the aperture value is a small aperture when the shooting depth of field is large and a large aperture when the shooting depth of field is small.
In practical application, the terminal shooting control program realizes the following steps when executed by the processor:
determining the front depth-of-field object and the rear depth-of-field object among the non-focus objects; calculating a first maximum distance between the front depth-of-field object and the focus object and a second maximum distance between the rear depth-of-field object and the focus object; calculating the minimum value of the shooting depth of field at which the front depth of field covers the first maximum distance and the rear depth of field covers the second maximum distance; calculating the maximum value of the aperture diameter according to that minimum depth of field and the second mapping relation; and then determining the subsequent adjustment according to the device capability: when the shooting terminal does not support fine aperture adjustment, determining the aperture level containing the maximum aperture diameter according to the aperture size range of each aperture level of the shooting terminal, and taking the aperture value of that level as the aperture value; when the shooting terminal supports fine aperture adjustment, taking the aperture value corresponding to the maximum aperture diameter directly as the aperture value.
In summary, according to the camera terminal provided in this embodiment, the terminal measures the distance parameter between the camera terminal and the shot object through ultrasonic ranging, uses it as the object distance, and controls the shooting parameters according to the object distance. This realizes the function of automatically controlling the shooting parameters according to the object distance, solves the problem that existing terminal shooting control technology depends on the terminal user manually controlling the shooting parameters, and enhances the user experience.
Meanwhile, the present invention provides a computer-readable storage medium having a terminal photographing control program stored thereon, the terminal photographing control program implementing the following steps when executed by a processor:
sending a first ultrasonic signal outwards through an ultrasonic sensor, and receiving a second ultrasonic signal formed by reflection of the first ultrasonic signal;
determining distance data between the shot object and the shooting terminal according to the first ultrasonic signal and the second ultrasonic signal;
according to the distance data of the object, the shooting parameters when shooting the object are controlled.
In practical application, the terminal shooting control program realizes the following steps when executed:
sending the first ultrasonic signal non-directionally, calculating distance data between the shooting terminal and all objects within its sensing range according to the first and second ultrasonic signals, determining the shot object according to the shooting direction of the shooting terminal's camera, and screening the distance data of all the objects to obtain the distance data between the shot object and the shooting terminal;
or, determining the shot object according to the shooting direction of the shooting terminal's camera, sending the first ultrasonic signal directionally toward the shot object, and calculating the distance data between the shot object and the shooting terminal according to the first and second ultrasonic signals.
In practical application, the terminal shooting control program realizes the following steps when executed:
determining a first mapping relation between the actual distance from a focus object to the shooting terminal and the focal length of the shooting terminal according to the device parameters of the shooting terminal; determining the focus object among the shot objects according to a selection operation of the terminal user; screening the distance data of the shot objects to obtain the distance data of the focus object; calculating the focal length of the shooting terminal according to the first mapping relation and the distance data of the focus object; and controlling the focusing parameter among the shooting parameters according to that focal length.
In practical application, the terminal shooting control program realizes the following steps when executed:
determining a second mapping relation between the shooting depth of field of the shooting terminal and the aperture parameter of the shooting terminal according to the device parameters of the shooting terminal; determining a focus object among the shot objects according to a selection operation of the terminal user; screening the distance data of the shot objects to obtain the distance data of the focus object and of the non-focus objects; determining the distances between the focus object and the non-focus objects from those data; determining the shooting depth of field according to those distances; calculating the aperture value of the shooting terminal according to the second mapping relation and the shooting depth of field; and controlling the aperture parameter among the shooting parameters according to the aperture value.
In practical application, the terminal shooting control program realizes the following steps when executed:
determining the relative size of the shooting depth of field according to the distances between the focus object and the non-focus objects, where the aperture value is a small aperture when the shooting depth of field is large and a large aperture when the shooting depth of field is small.
In practical application, the terminal shooting control program realizes the following steps when executed:
determining the front depth-of-field object and the rear depth-of-field object among the non-focus objects; calculating a first maximum distance between the front depth-of-field object and the focus object and a second maximum distance between the rear depth-of-field object and the focus object; calculating the minimum value of the shooting depth of field at which the front depth of field covers the first maximum distance and the rear depth of field covers the second maximum distance; calculating the maximum value of the aperture diameter according to that minimum depth of field and the second mapping relation; and then determining the subsequent adjustment according to the device capability: when the shooting terminal does not support fine aperture adjustment, determining the aperture level containing the maximum aperture diameter according to the aperture size range of each aperture level of the shooting terminal, and taking the aperture value of that level as the aperture value; when the shooting terminal supports fine aperture adjustment, taking the aperture value corresponding to the maximum aperture diameter directly as the aperture value.
In summary, with the storage medium provided in the embodiment of the present invention, when the stored program runs, the terminal measures the distance parameter between the camera terminal and the shot object through ultrasonic ranging, uses it as the object distance, and controls the shooting parameters according to the object distance, thereby realizing the function of automatically controlling the shooting parameters according to the object distance, solving the problem that existing terminal shooting control technology depends on the terminal user manually controlling the shooting parameters, and enhancing the user experience.
According to the terminal shooting control method, the camera terminal and the computer-readable storage medium provided by the embodiments of the present invention, the distance parameter between the camera terminal and the shot object is measured through ultrasonic ranging and used as the object distance, and the shooting parameters are controlled according to the object distance. This realizes the function of automatically controlling the shooting parameters according to the object distance, solves the problem that existing terminal shooting control technology depends on the terminal user manually controlling the shooting parameters, and enhances the user experience.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article or apparatus. Without further limitation, an element introduced by the phrase "comprising a" does not exclude the presence of other identical elements in the process, method, article or apparatus that comprises that element.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.
Claims (10)
1. A terminal shooting control method is characterized by comprising the following steps:
sending a first ultrasonic signal outwards, and receiving a second ultrasonic signal formed by reflecting the first ultrasonic signal;
determining distance data between the shot object and the shooting terminal according to the first ultrasonic signal and the second ultrasonic signal;
and controlling shooting parameters when the shot object is shot according to the distance data of the shot object.
2. The terminal photographing control method of claim 1, wherein the determining distance data between the subject and the photographing terminal according to the first ultrasonic signal and the second ultrasonic signal comprises:
transmitting the first ultrasonic signal non-directionally;
calculating distance data between all objects in the sensing range of the shooting terminal and the shooting terminal according to the first ultrasonic signal and the second ultrasonic signal;
determining the shot object according to the shooting direction of a camera of a shooting terminal;
and screening the distance data between all the objects and the shooting terminal, and determining the distance data between the shot objects and the shooting terminal.
3. The terminal photographing control method of claim 1, wherein the determining distance data between the subject and the photographing terminal according to the first ultrasonic signal and the second ultrasonic signal comprises:
determining the shot object according to the shooting direction of a camera of a shooting terminal;
directionally transmitting the first ultrasonic signal to the subject;
and calculating distance data between the shot object and a shooting terminal according to the first ultrasonic signal and the second ultrasonic signal.
4. The terminal photographing control method according to claim 1, wherein the controlling of the photographing parameters when photographing the subject according to the distance data of the subject comprises:
determining a first mapping relation between an actual distance between a focus object and the shooting terminal and a focal length of the shooting terminal according to the equipment parameters of the shooting terminal;
determining a focus object in the shot object according to the selection operation of the terminal user;
screening the distance data of the shot object, and determining the distance data of the focus object;
calculating the focal length of the shooting terminal according to the first mapping relation and the distance data of the focus object;
and controlling focusing parameters in the shooting parameters according to the focal length of the shooting terminal.
5. The terminal photographing control method according to any one of claims 1 to 4, wherein the controlling of the photographing parameters when photographing the subject according to the distance data of the subject includes:
determining a second mapping relation between the shooting depth of field of the shooting terminal and the aperture parameter of the shooting terminal according to the equipment parameter of the shooting terminal;
determining a focus object in the shot object according to the selection operation of the terminal user;
screening the distance data of the shot object, and determining the distance data of the focus object and the distance data of the non-focus object;
determining the distance between the focus object and the non-focus object according to the distance data of the focus object and the distance data of the non-focus object;
determining the shooting depth of field according to the distance between the focus object and the non-focus object, and calculating the aperture value of the shooting terminal according to the second mapping relation and the shooting depth of field;
and controlling the aperture parameters in the shooting parameters according to the aperture value.
6. The terminal photographing control method according to claim 5, wherein when the second mapping relationship is a mapping relationship between a depth-of-field relative size and an aperture relative size, the determining a photographing depth-of-field according to a distance between the focus object and a non-focus object, and the calculating the aperture value according to the second mapping relationship and the photographing depth-of-field comprises:
determining the relative size of the shooting depth of field according to the distance between the focus object and the non-focus object;
when the shooting depth of field is large, the aperture value is a small aperture;
and when the shooting depth of field is small, the aperture value is a large aperture.
7. The terminal photographing control method according to claim 5, wherein when the second mapping relationship is a mapping relationship between an absolute value of the depth of field and an absolute value of the aperture diameter and the photographing terminal supports fine adjustment of the aperture, the determining a photographing depth of field according to the distance between the focus object and the non-focus object, and the calculating the aperture value according to the second mapping relationship and the photographing depth of field comprises:
determining a front depth-of-field object and a rear depth-of-field object among the non-focus objects;
calculating a first maximum distance between the front depth-of-field object and the focus object and a second maximum distance between the rear depth-of-field object and the focus object, respectively;
calculating the minimum value of the photographing depth of field at which the front depth of field covers the first maximum distance and the rear depth of field covers the second maximum distance;
calculating the maximum value of the aperture diameter according to the minimum value of the shooting depth of field and the second mapping relation;
and taking the aperture value corresponding to the maximum value of the aperture diameter as the aperture value.
8. The terminal photographing control method according to claim 5, wherein when the second mapping relationship is a mapping relationship between an absolute value of the depth of field and an absolute value of the aperture diameter and the photographing terminal does not support fine adjustment of the aperture, the determining a photographing depth of field according to the distance between the focus object and the non-focus object, and the calculating the aperture value according to the second mapping relationship and the photographing depth of field comprises:
determining a front depth-of-field object and a rear depth-of-field object among the non-focus objects;
calculating a first maximum distance between the front depth-of-field object and the focus object and a second maximum distance between the rear depth-of-field object and the focus object, respectively;
calculating the minimum value of the photographing depth of field at which the front depth of field covers the first maximum distance and the rear depth of field covers the second maximum distance;
calculating the maximum value of the aperture diameter according to the minimum value of the shooting depth of field and the second mapping relation;
determining the aperture level where the maximum value of the aperture diameter is located according to the aperture size range corresponding to the aperture level of the shooting terminal;
and taking the aperture value corresponding to the aperture level where the maximum value of the aperture diameter is located as the aperture value.
9. A camera terminal, characterized in that the camera terminal comprises: an ultrasonic sensor, a camera, a memory, a processor, and a terminal shooting control program stored on the memory and executable on the processor, the terminal shooting control program, when executed by the processor, implementing the steps of the terminal shooting control method according to any one of claims 1 to 8.
10. A computer-readable storage medium, characterized in that the computer-readable storage medium has stored thereon a terminal photographing control program that, when executed, implements the steps of the terminal photographing control method according to any one of claims 1 to 8.