

CN111988522B - Shooting control method and device, electronic equipment and storage medium

Info

Publication number
CN111988522B
Authority
CN
China
Prior art keywords
control, current, shooting, image processing, display
Prior art date
Legal status
Active
Application number
CN202010738488.3A
Other languages
Chinese (zh)
Other versions
CN111988522A (en)
Inventor
张艺弘
Current Assignee
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd filed Critical Beijing Dajia Internet Information Technology Co Ltd
Priority to CN202010738488.3A
Publication of CN111988522A
Application granted
Publication of CN111988522B
Legal status: Active

Classifications

    • H  ELECTRICITY
    • H04  ELECTRIC COMMUNICATION TECHNIQUE
    • H04N  PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00  Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60  Control of cameras or camera modules
    • H04N 23/63  Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/631  Graphical user interfaces [GUI] specially adapted for controlling image capture or setting capture parameters
    • H04N 23/61  Control of cameras or camera modules based on recognised objects
    • H04N 23/62  Control of parameters via user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present disclosure relates to a shooting control method, a shooting control apparatus, an electronic device, and a storage medium, and belongs to the technical field of photography. The method includes: displaying a control and a currently captured image on a shooting preview interface; identifying a specified object in the currently captured image; determining the current display position of the specified object on the shooting preview interface; and if the control is determined to be triggered according to the current display position of the specified object and the position of the control, executing the operation corresponding to the control. By determining the current display position of the specified object on the shooting preview interface, determining from that position and the position of the control that the control is triggered, and then executing the corresponding operation, the method allows a user to trigger a function control in the shooting preview interface remotely, in a contact-free manner. This avoids the poor shooting experience caused by the user having to walk repeatedly to the front of the device to trigger controls, and thereby improves the user's shooting experience.

Description

Shooting control method and device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of photography technologies, and in particular, to a shooting control method and apparatus, an electronic device, and a storage medium.
Background
At present, the shooting function has become an indispensable module of terminal devices (e.g., mobile phones and tablet computers). With a built-in shooting function, a user can record and share his or her daily life and surrounding scenery to a social network anytime and anywhere through the terminal device.
In the related art, to obtain a better shot, a user needs to stand at some distance from the terminal device to see the overall shooting effect, and if the effect is unsatisfactory, the user has to walk back to the terminal device and touch the related controls displayed on it to adjust the shooting effect. Because the shooting effect can only be adjusted by walking up to the terminal device and touching it directly, and the user may need to do so repeatedly, the shooting experience is poor. How to realize shooting control of images therefore becomes an urgent problem to be solved.
Disclosure of Invention
The disclosure provides a shooting control method, a shooting control apparatus, an electronic device, and a storage medium, which are used to at least solve the problem in the related art that the shooting effect can only be adjusted by directly touching the terminal device, resulting in a poor shooting experience. The technical solution of the disclosure is as follows:
according to a first aspect of the embodiments of the present disclosure, there is provided a shooting control method including: displaying a control and a currently captured image on a shooting preview interface; identifying a specified object in the currently captured image; determining the current display position of the specified object on the shooting preview interface; and if the control is determined to be triggered according to the current display position of the specified object and the position of the control, executing the operation corresponding to the control.
According to one embodiment of the present disclosure, the displaying a currently photographed image on a photographing preview interface includes: acquiring the size of a current shot image; adjusting the size of the current shot image according to the size of the shooting preview interface; and displaying the current shooting image with the same size as the shooting preview interface on the shooting preview interface.
According to an embodiment of the present disclosure, the determining a current display position of the designated object on the shooting preview interface includes: determining an image position of the specified object in the current captured image; and determining the current display position of the specified object on the shooting preview interface according to the image position based on the position mapping relation between the current shooting image and the shooting preview interface.
According to an embodiment of the present disclosure, the determining that the control is triggered according to the current display position of the specified object and the position of the control includes: determining that the control is triggered if the overlapping area between the current display position of the specified object and the position of the control exceeds a preset threshold, and the duration for which the overlapping area exceeds the preset threshold exceeds a preset duration.
According to one embodiment of the present disclosure, the control is configured to instruct to adjust a function parameter of a current image processing function, and the executing operation corresponding to the control includes: carrying out value adjustment on the functional parameters of the current image processing function to obtain the adjusted functional parameters; and processing and displaying the current shot image according to the adjusted functional parameters.
According to an embodiment of the present disclosure, the performing value adjustment on the functional parameter of the current image processing function to obtain the adjusted functional parameter includes: determining the duration of triggering the control according to the current display position of the specified object and the position of the control; acquiring the current value of the functional parameter; and adjusting the value of the functional parameter according to the current value and the duration to obtain the adjusted functional parameter.
According to an embodiment of the present disclosure, the control is displayed in a first display manner on the shooting preview interface, wherein before the value of the function parameter is adjusted to obtain the adjusted function parameter, the method further includes: and controlling to display the control in a second display mode, wherein the display styles of the control in the first display mode and the control in the second display mode are different.
According to one embodiment of the present disclosure, the control is configured to instruct switching from a current image processing function to an adjacent image processing function, and the executing operation corresponding to the control includes: acquiring a corresponding first image processing function according to the current image processing function, wherein the first image processing function is an image processing function adjacent to the current image processing function; switching the current image processing function to a first image processing function; and processing and displaying the current shot image according to the first image processing function.
According to one embodiment of the present disclosure, the control includes N, where N is an integer greater than 1, and the method further includes: if the current display position and the positions of the M controls have position superposition parts, determining the inclination angle of the specified object relative to the reference direction according to the current display position, wherein M is an integer which is greater than 1 and less than or equal to N; and if the inclination angle is smaller than a preset angle threshold, outputting first prompt information, wherein the first prompt information is used for prompting that the operation is invalid and prompting to adjust the inclination angle of the specified object.
According to an embodiment of the present disclosure, the method further comprises: and if the inclination angle of the specified object is larger than or equal to a preset angle threshold, outputting second prompt information, wherein the second prompt information is used for prompting that the operation is invalid and prompting to adjust the distance between the specified object and the shooting preview interface.
According to an embodiment of the present disclosure, the displaying a control on a shooting preview interface includes: receiving a trigger instruction aiming at the current image processing function; and responding to the trigger instruction, displaying a first operation area which is transparent and covers the current shooting image on the shooting preview interface, wherein the first operation area comprises the control.
According to an embodiment of the present disclosure, the shooting preview interface further includes a second operation area, the second operation area includes a touch control corresponding to the current image processing function, and the receiving a trigger instruction for the current image processing function includes: receiving a touch operation for the touch control; and triggering the trigger instruction for the current image processing function according to the touch operation.
According to an embodiment of the present disclosure, after displaying a first operation region that is transparent and is overlaid on the currently photographed image on the photographing preview interface, the method further includes: when the fact that the dragging operation is performed on the first operation area is detected, moving the first operation area according to the dragging operation; and when the fact that the first operation area stops executing the dragging operation is detected, controlling the first operation area to stay at the current position of the shooting preview interface.
According to an embodiment of the present disclosure, the method further comprises: receiving a hiding instruction, wherein the hiding instruction is used for indicating to hide the second operation area; and responding to a hiding instruction, and hiding the second operation area.
According to an embodiment of the present disclosure, the receiving a hiding instruction includes: and if the current display position is detected to be in the first operation area, determining that the hiding instruction is received.
According to an embodiment of the present disclosure, after the hiding the second operation region, the method further includes: receiving a display instruction, wherein the display instruction is used for indicating to display the second operation area; and responding to the display instruction, and controlling to display the second operation area on the shooting preview interface again.
According to an embodiment of the present disclosure, the receiving a display instruction includes: if the click operation aiming at the shooting preview interface is received, determining to receive the display instruction; or if the received voice information comprises preset keywords, determining to receive the display instruction; or if the received gesture is consistent with the preset gesture corresponding to the first operation area, determining to receive the display instruction.
According to an embodiment of the present disclosure, after displaying a first operation region that is transparent and is overlaid on the currently photographed image on the photographing preview interface, the method further includes: and if the current display position is detected to be in the first operation area, acquiring a contour corresponding to the specified object, and displaying the contour in a preset mode.
According to an embodiment of the present disclosure, the acquiring a contour corresponding to the designated object includes: determining an image operation area of a hand corresponding to the fingertip according to the current shot image; and determining the corresponding contour of the fingertip according to the image operation area.
According to a second aspect of the embodiments of the present disclosure, there is provided a shooting control apparatus including: a display module configured to display a control and a currently captured image on a shooting preview interface; an identification module configured to identify a specified object in the currently captured image; a determination module configured to determine the current display position of the specified object on the shooting preview interface; and an execution module configured to execute the operation corresponding to the control if the control is determined to be triggered according to the current display position of the specified object and the position of the control.
According to an embodiment of the present disclosure, the display module includes: an acquisition size unit configured to acquire a size of a currently captured image; a size adjustment unit configured to adjust a size of the currently photographed image according to a size of the photographing preview interface; and a display image unit configured to display the current photographed image having the same size as the photographing preview interface on the photographing preview interface.
According to one embodiment of the disclosure, the determining module includes: an acquisition position unit configured to determine an image position of the specified object in the currently captured image; and a position determining unit configured to determine a current display position of the designated object on the shooting preview interface according to the image position based on a position mapping relationship between the current shooting image and the shooting preview interface.
According to an embodiment of the present disclosure, the execution module is specifically configured to: determine that the control is triggered if the overlapping area between the current display position of the specified object and the position of the control exceeds a preset threshold, and the duration for which the overlapping area exceeds the preset threshold exceeds a preset duration.
According to one embodiment of the present disclosure, the control is configured to instruct to adjust a function parameter of a current image processing function, and the execution module includes: a first adjusting unit, configured to perform value adjustment on a functional parameter of the current image processing function to obtain the adjusted functional parameter; and the first processing unit is configured to process and display the current shot image according to the adjusted functional parameters.
According to an embodiment of the present disclosure, the first adjusting unit includes: a first determining subunit, configured to determine, according to the current display position of the specified object and the position of the control, a duration for which the control is triggered; a first obtaining subunit configured to obtain a current value of the functional parameter; and the first adjusting subunit is configured to perform value adjustment on the functional parameter according to the current value and the duration to obtain an adjusted functional parameter.
According to one embodiment of the disclosure, the control is displayed in a first display mode on the shooting preview interface, wherein the device further comprises: and the control display module is configured to control the control to be displayed in a second display mode, wherein the display styles of the control in the first display mode and the control in the second display mode are different.
According to one embodiment of the present disclosure, the control is configured to instruct switching from a current image processing function to an adjacent image processing function, and the execution module includes: a second obtaining unit configured to obtain a corresponding first image processing function according to the current image processing function, wherein the first image processing function is an image processing function adjacent to the current image processing function; a switching unit configured to switch the current image processing function to a first image processing function; a first image processing unit configured to process and display the currently captured image according to the first image processing function.
According to an embodiment of the present disclosure, the control includes N, where N is an integer greater than 1, and the apparatus further includes: the inclination angle determining module is configured to determine an inclination angle of the specified object relative to a reference direction according to the current display position if the current display position and positions of the M controls all have position overlapping parts, wherein M is an integer which is greater than 1 and less than or equal to N; the first prompt module is configured to output first prompt information if the inclination angle is smaller than a preset angle threshold, wherein the first prompt information is used for prompting that the operation is invalid and prompting to adjust the inclination angle of the specified object.
According to one embodiment of the present disclosure, the apparatus further comprises: and the second prompting module is configured to output second prompting information if the inclination angle of the specified object is greater than or equal to a preset angle threshold, wherein the second prompting information is used for prompting that the operation is invalid and prompting to adjust the distance between the specified object and the shooting preview interface.
According to an embodiment of the present disclosure, the display module includes: a reception instruction unit configured to receive a trigger instruction for the current image processing function; a first operation area display unit configured to display a first operation area which is transparent and is overlaid on the current shot image on the shooting preview interface in response to the trigger instruction, wherein the first operation area includes the control.
According to an embodiment of the present disclosure, the shooting preview interface further includes a second operation area, the second operation area includes a touch control corresponding to the current image processing function, and the instruction receiving unit is specifically configured to: receive a touch operation for the touch control; and trigger the trigger instruction for the current image processing function according to the touch operation.
According to one embodiment of the present disclosure, the apparatus further comprises: a movement region unit configured to, when it is detected that a drag operation is performed on the first operation region, move the first operation region in accordance with the drag operation; a control area unit configured to control the first operation area to stay at a current position of the photographing preview interface when it is detected that the drag operation is stopped being performed on the first operation area.
According to one embodiment of the present disclosure, the apparatus further comprises: a hiding instruction receiving module configured to receive a hiding instruction, wherein the hiding instruction is used for indicating to hide the second operation area; a response hiding instruction module configured to perform hiding processing on the second operation area in response to a hiding instruction.
According to an embodiment of the present disclosure, the hidden instruction receiving module is specifically configured to: and if the current display position is detected to be in the first operation area, determining that the hiding instruction is received.
According to one embodiment of the present disclosure, the apparatus further comprises: a receiving and displaying instruction module configured to receive a displaying instruction, wherein the displaying instruction is used for indicating to display the second operation area; a response display instruction module configured to control to display the second operation region again on the photographing preview interface in response to the display instruction.
According to an embodiment of the present disclosure, the receiving and displaying instruction module is specifically configured to: if the click operation aiming at the shooting preview interface is received, determining to receive the display instruction; or if the received voice information comprises preset keywords, determining to receive the display instruction; or if the received gesture is consistent with the preset gesture corresponding to the first operation area, determining to receive the display instruction.
According to one embodiment of the present disclosure, the apparatus further comprises: and the contour acquiring module is configured to acquire a contour corresponding to the specified object and display the contour in a preset mode if the current display position is detected to be in the first operation area.
According to an embodiment of the present disclosure, the designated object is a fingertip, and the contour obtaining module includes: a region determining unit configured to determine an image operation region of a hand corresponding to the fingertip from the currently captured image; and a contour determining unit configured to determine the contour of the hand according to the image operation region.
According to a third aspect of the embodiments of the present disclosure, there is provided an electronic apparatus including: a processor; a memory for storing the processor-executable instructions; wherein the processor is configured to execute the instructions to implement the shooting control method provided by the embodiment of the first aspect of the disclosure.
According to a fourth aspect of embodiments of the present disclosure, there is provided a storage medium, wherein instructions that, when executed by a processor of an electronic device, enable the electronic device to perform a photographing control method as provided in embodiments of the first aspect of the present disclosure.
In order to achieve the above object, an embodiment of a fifth aspect of the present disclosure proposes a computer program product, wherein when instructions in the computer program product are executed by a processor, the shooting control method as described in the above embodiments is executed.
The technical solution provided by the embodiments of the disclosure brings at least the following beneficial effects: a control and a currently captured image are displayed on a shooting preview interface, a specified object in the currently captured image is identified, the current display position of the specified object on the shooting preview interface is determined, and if the control is determined to be triggered according to the current display position of the specified object and the position of the control, the operation corresponding to the control is executed. In this way, the user can trigger a function control in the shooting preview interface by remote operation, that is, in a contact-free manner, which avoids the poor shooting experience caused by the user having to walk repeatedly to the device to trigger controls and thereby improves the user's shooting experience.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure and are not to be construed as limiting the disclosure.
Fig. 1 is a flowchart illustrating a shooting control method according to an exemplary embodiment.
Fig. 2 is a first exemplary diagram of a shooting preview interface according to an exemplary embodiment.
Fig. 3 is a flowchart illustrating a shooting control method according to a specific embodiment, according to an exemplary embodiment.
Fig. 4 is a second exemplary diagram of a shooting preview interface according to an exemplary embodiment.
Fig. 5 is a third exemplary diagram of a shooting preview interface according to an exemplary embodiment.
Fig. 6 is a fourth exemplary diagram of a shooting preview interface according to an exemplary embodiment.
Fig. 7 is a flowchart illustrating adjusting a functional parameter according to an exemplary embodiment.
Fig. 8 is a fifth exemplary diagram of a shooting preview interface according to an exemplary embodiment.
Fig. 9 is a sixth exemplary diagram of a shooting preview interface according to an exemplary embodiment.
Fig. 10 is a flowchart illustrating a shooting control method according to another specific embodiment, according to an exemplary embodiment.
Fig. 11 is a seventh exemplary diagram of a shooting preview interface according to an exemplary embodiment.
Fig. 12 is a block diagram illustrating a shooting control apparatus according to an exemplary embodiment.
Fig. 13 is a schematic diagram of an electronic device according to an exemplary embodiment.
Detailed Description
In order to make the technical solutions of the present disclosure better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the disclosure described herein are capable of operation in sequences other than those illustrated or otherwise described herein. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
It should be noted that the shooting control method according to the embodiment of the present disclosure may be executed by a shooting control apparatus according to the embodiment of the present disclosure, where the shooting control apparatus may be implemented in software and/or hardware, and the shooting control apparatus may be configured in an electronic device, where the electronic device may display a shooting preview interface, and the electronic device may be a terminal device having a shooting function. The terminal device may include, but is not limited to, a smart phone, a tablet computer, and the like.
Fig. 1 is a flowchart illustrating a photographing control method according to an exemplary embodiment, and as shown in fig. 1, the photographing control method proposed by the present embodiment includes the following steps.
In step 101, a control and a current shooting image are displayed on a shooting preview interface.
For example, the shooting preview interface may be a preview interface when a camera of the electronic device shoots a picture or a video, or may be a preview interface when the electronic device is connected with other video or image acquisition devices.
In the embodiment of the disclosure, when a user needs to shoot an image through an electronic device, the user can start a camera of the electronic device in multiple ways, and correspondingly, the electronic device obtains the current shot image of the camera and displays a shooting preview interface on the electronic device.
For example, in the process of using a smartphone, if the user finds it inconvenient to operate the smartphone by touch, the user may start the shooting function of the smartphone through a voice instruction. Specifically, the user may say "please open the shooting function of a certain short video application"; accordingly, the smartphone parses the voice information to obtain the corresponding control instruction, turns on the camera according to that instruction, and displays the image currently captured by the camera on the shooting preview interface of the short video application.
In order to improve the effect of shooting images or recording videos of a user, controls with various functions are further displayed on the shooting preview interface, so that the user can conveniently adjust shooting parameters through the controls, or perform image processing on the shot images, for example, add various special effects to the shot images.
A control here refers to an encapsulation of data and methods. A control may have its own properties and methods, where the properties are simple accessors of the control's data and the methods are simple, visible functions of the control.
For example, the currently shot image is displayed on the shooting preview interface, and a control for adjusting the function parameters of the image processing function for the currently shot image, or a control for switching the image processing function, etc. may also be displayed.
For example, after a user triggers a shooting button in a camera application on the electronic device, the electronic device starts the camera, collects an image through the camera, and displays the currently captured image on the shooting preview interface of the camera application. An exemplary view of the shooting preview interface is shown in fig. 2. As can be seen from fig. 2, the shooting preview interface can display the currently captured image, a control A for adjusting a functional parameter of an image processing function, and a control B for switching to an adjacent image processing function.
In step 102, a specified object in the currently captured image is identified.
In the embodiment of the disclosure, after the currently shot image is displayed on the shooting preview interface, the specified object in the currently shot image can be identified through the identification module in the electronic device based on the characteristics of the specified object.
Wherein the designated object includes, but is not limited to, a fingertip. For example, the fingertips include a single fingertip or multiple fingertips.
For example, after the currently captured image is displayed on the shooting preview interface, the electronic device may, based on the contour features of fingertips, detect quasi-straight-line regions in the contour image and identify the fingertips in the currently captured image by combining these quasi-straight-line regions.
For another example, after the current captured image is displayed on the capture preview interface, the identification module in the electronic device may identify whether a fingerprint exists in the current captured image, and if so, may determine that a fingertip exists in the current captured image, and further identify the fingertip in the current captured image.
For another example, after displaying the current captured image on the capture preview interface, the electronic device may identify a fingertip in the current captured image based on a barycentric distance method.
For example, after the currently captured image is displayed on the shooting preview interface, the electronic device may detect the hand region based on skin-color detection, calculate the center of gravity of the hand region, and then, among the edge points of the hand region, find the point farthest from the center of gravity as a fingertip candidate. If the distance from the candidate point to the center of gravity is greater than 1.6 times the average distance from the edge to the center of gravity, the point can be identified as a fingertip.
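For illustration only (not part of the claimed solution), the barycentric-distance heuristic described above can be sketched as follows, assuming a binary hand mask has already been obtained from skin-color detection; OpenCV and NumPy are used purely as example libraries, and the function name is hypothetical:

```python
import cv2
import numpy as np

def detect_fingertip(hand_mask: np.ndarray):
    """Return (x, y) of a fingertip candidate, or None, using the
    barycentric-distance heuristic described above."""
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)           # largest skin region = hand
    m = cv2.moments(hand)
    if m["m00"] == 0:
        return None
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]   # center of gravity of the hand region
    edge = hand.reshape(-1, 2).astype(np.float64)
    dists = np.hypot(edge[:, 0] - cx, edge[:, 1] - cy)
    candidate = edge[np.argmax(dists)]                   # edge point farthest from the center
    # Accept the candidate only if it is clearly farther than a typical edge point
    if dists.max() > 1.6 * dists.mean():
        return int(candidate[0]), int(candidate[1])
    return None
```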
In step 103, the current display position of the designated object on the shooting preview interface is determined.
In an embodiment of the present disclosure, a designated object in a currently captured image is recognized, an image position of the designated object in the currently captured image may be determined, and a current display position of the designated object on a capture preview interface is determined according to the image position based on a position mapping relationship between the currently captured image and the capture preview interface.
For example, a fingertip in the current captured image is recognized, coordinates (x, y) of the fingertip in the current captured image can be determined, and based on the position mapping relationship between the current captured image and the capture preview interface, the current display position of the fingertip on the capture preview interface can be determined to be (x, y).
As one possible implementation example, the specified object in the current shot image is identified, the coordinate of the specified object in the current shot image can be determined, the coordinate of the specified object in the shooting preview interface can be determined based on the same reference coordinate system of the current shot image and the shooting preview interface which is established in advance, and then the current display position of the specified object on the shooting preview interface can be determined.
As another possible implementation example, a specified object in the currently captured image is identified, the image position of the specified object in the currently captured image is determined, and that image position is used directly as the current display position of the specified object on the shooting preview interface. For example, if the specified object is located at the upper right of the currently captured image, its current display position on the shooting preview interface is also the upper right.
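As an illustrative sketch only, the position mapping relationship between the captured image and the preview interface can be reduced to a proportional scaling between the two coordinate systems; when the two share the same size or the same reference coordinate system, the mapping degenerates to the identity, as in the examples above. The helper below is a hypothetical assumption, not the claimed implementation:

```python
def map_to_preview(image_pos, image_size, preview_size):
    """Map a position in the captured image to the shooting preview interface.

    image_pos:    (x, y) of the specified object in the captured image
    image_size:   (width, height) of the captured image
    preview_size: (width, height) of the shooting preview interface
    """
    x, y = image_pos
    img_w, img_h = image_size
    prev_w, prev_h = preview_size
    return x * prev_w / img_w, y * prev_h / img_h

# e.g. a fingertip at (300, 400) in a 1080x1920 frame shown on a 540x960 preview
print(map_to_preview((300, 400), (1080, 1920), (540, 960)))  # -> (150.0, 200.0)
```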
In step 104, if the control is determined to be triggered according to the current display position of the specified object and the position of the control, executing the operation corresponding to the control.
In the embodiment of the disclosure, after the current display position of the designated object on the shooting preview interface is determined, whether the current display position of the designated object is within a preset area range of a control position in the shooting preview interface can be detected, if so, it can be judged that the control in the shooting preview interface is triggered, and then the electronic device executes an operation corresponding to the control. Specific implementation modes can refer to the following embodiments.
For example, the coordinates (x, y) of the fingertip in the current captured image may be determined, and when the coordinates (x, y) of the fingertip are detected to be within the range of the top-right button area (top, right) in the capture preview interface, it may be determined that the control in the capture preview interface is triggered, and then the electronic device performs an operation of adjusting the function parameter of the corresponding image processing function in the control or an operation of switching to an adjacent image processing function in the control.
As a possible implementation example, after determining the current display position of the designated object on the shooting preview interface, it may be detected whether the current display position of the designated object on the shooting preview interface coincides with the position of the control in the shooting preview interface, and if the current display position of the designated object on the shooting preview interface coincides with the position of the control in the shooting preview interface, it may be determined that the control is triggered, and then the operation corresponding to the control is executed.
As another possible implementation example, after the current display position of the specified object on the shooting preview interface is determined, it may be determined whether the overlapping area between the current display position of the specified object and the position of the control in the shooting preview interface exceeds a preset threshold, and whether the duration for which the overlapping area exceeds the preset threshold exceeds a preset duration; if both conditions are met, it can be determined that the control is triggered, and the operation corresponding to the control is then executed.
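A minimal sketch of this trigger condition (overlap ratio above a preset threshold sustained for a preset duration) is given below for illustration; the rectangle representation, the default threshold values, and the class and method names are assumptions, not part of the disclosure:

```python
import time

def overlap_area(a, b):
    """Overlap area of two rectangles given as (left, top, right, bottom)."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(w, 0) * max(h, 0)

class ControlTrigger:
    """Fires once the object/control overlap exceeds a threshold for long enough."""

    def __init__(self, control_rect, overlap_threshold=0.5, hold_seconds=1.0):
        self.control_rect = control_rect
        self.overlap_threshold = overlap_threshold
        self.hold_seconds = hold_seconds
        self._since = None  # time at which the overlap first exceeded the threshold

    def update(self, object_rect, now=None):
        now = time.monotonic() if now is None else now
        control_area = ((self.control_rect[2] - self.control_rect[0]) *
                        (self.control_rect[3] - self.control_rect[1]))
        ratio = overlap_area(object_rect, self.control_rect) / control_area
        if ratio < self.overlap_threshold:
            self._since = None            # overlap lost: restart the timer
            return False
        if self._since is None:
            self._since = now
        return now - self._since >= self.hold_seconds
```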
In one embodiment of the present disclosure, to prompt the user as to whether the control is triggered, the control is displayed in a first display manner before it is triggered and in a second display manner after it is triggered, where the display styles of the first display manner and the second display manner are different. For example, the color of the control may be white before the control is triggered and yellow after the control is triggered.
According to the shooting control method provided by the embodiments of the present disclosure, a control and a currently captured image are displayed on a shooting preview interface; a specified object in the currently captured image is identified; the current display position of the specified object on the shooting preview interface is determined; and if the control is determined to be triggered according to the current display position of the specified object and the position of the control, the operation corresponding to the control is executed. In this way, the user can trigger a function control in the shooting preview interface by remote operation, that is, in a contact-free manner, which avoids the poor shooting experience caused by the user having to walk repeatedly to the device to trigger controls and thereby improves the user's shooting experience.
Fig. 3 is a flowchart illustrating a photographing control method according to a specific embodiment according to an exemplary embodiment. As shown in fig. 3, the shooting control method of the present disclosure includes the steps of:
in step 301, a control and a current shot image are displayed on a shot preview interface.
And the control is used for indicating to adjust the function parameters of the current image processing function.
In the embodiment of the disclosure, a user can start a camera to shoot an image by triggering a camera application button in an electronic device, correspondingly, the electronic device can obtain the size of the currently shot image, then adjust the size of the currently shot image according to the size of a shooting preview interface, and further display the currently shot image with the same size as the shooting preview interface on the shooting preview interface, so that the shooting preview interface and the currently shot image have the same size.
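As a minimal illustration of this resizing step only (the disclosure does not mandate a particular resizing method or library), OpenCV's resize can be used; the function name is hypothetical:

```python
import cv2

def fit_frame_to_preview(frame, preview_width, preview_height):
    """Resize the captured frame so it matches the shooting preview interface size."""
    return cv2.resize(frame, (preview_width, preview_height), interpolation=cv2.INTER_LINEAR)
```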
The image processing function refers to a function that can process a currently shot image so as to present an effect of the currently shot image. The image processing functions include, but are not limited to, beauty, makeup, body beauty, filters, and the like. Wherein, the beautifying includes but not limited to skin grinding, skin whitening and the like; makeup includes, but is not limited to, lip makeup, eyebrow makeup, eye makeup, facial makeup, etc.; body beauty including but not limited to long legs, thin waist, thin face, small head, swan neck, etc.; filters include, but are not limited to, motion pictures, film time, texture, nature, and the like.
For example, the image processing function is a beauty processing function, the user can adjust the beauty degree of the beauty processing function according to the requirement, and correspondingly, the electronic device will beautify the face part in the current shot image according to the beauty parameters of the beauty processing function used by the user.
For example, the image processing function may be a long-leg function. Generally, different values of the functional parameter corresponding to the long-leg function produce different lengthening effects: the greater the value of the functional parameter of the long-leg function, the longer the user's legs appear in the currently captured image.
The current image processing function is an image processing function currently used for a currently captured image. The image processing function used for the currently captured image is triggered by the user.
The control in this embodiment may be set in a first operation area on the shooting preview interface, where the first operation area may be transparent and covers the currently shot image.
In an embodiment of the present disclosure, the first operation area may be displayed on the shooting preview interface after the user triggers the current image processing function. That is, after a trigger instruction for the current image processing function is received, the first operation area, which is transparent and overlaid on the currently captured image, may be displayed on the shooting preview interface in response to the trigger instruction.
In one embodiment of the present disclosure, there are various ways in which the current image processing function is triggered, for example as follows:
as an example, the trigger instruction for the current image processing function may be obtained by means of a voice instruction.
Specifically, the user can input the function name of the current image processing function in a voice mode, correspondingly, the electronic device recognizes the voice information input by the user, determines a control instruction corresponding to the voice recognition result, and when the control instruction is determined to be a trigger instruction for the current image processing function, can display a first operation area which is transparent and covers the current shot image on the shooting preview interface.
For example, the user needs to use the long leg function for the current captured image, the user can input voice data of "use the long leg function for the image", correspondingly, the electronic device takes the long leg function as the currently triggered image processing function, and can display a first operation area which is transparent and covers the current captured image on the capture preview interface.
As another example, the shooting preview interface may further include a second operation area, where the second operation area includes a touch control corresponding to the current image processing function. A touch operation for the touch control is received, a trigger instruction for the current image processing function is triggered according to the touch operation, and the first operation area, which is transparent and overlaid on the currently captured image, is displayed on the shooting preview interface.
That is, the shooting preview interface further includes a second operation area, and the second operation area includes a touch control corresponding to the current image processing function. For example, the second operation area may be understood as a beauty panel.
For example, as shown in fig. 2, the beauty panel includes a touch control corresponding to the body-beautifying function operated in the first operation area.
For example, according to the requirement of the user, the function control corresponding to the long leg function in the beauty options on the shooting preview interface can be touched by the user, then the electronic device can receive touch operation aiming at the long leg function, then a trigger instruction aiming at the long leg function is triggered according to the touch operation, and in response to the trigger instruction, a first operation area which is transparent and covers the current shooting image is displayed on the shooting preview interface.
In one embodiment of the present disclosure, in order to facilitate the user to know the currently triggered image processing function, in addition to displaying the control in the first operation area, the name of the image processing function and the current value of the function parameter of the image processing function may be displayed in the first operation area.
Here, it is to be understood that the first operation region may be displayed at a preset position of the photographing preview screen by default. In this embodiment, in order to facilitate the user to remotely trigger the control in the first operation region, preferably, the preset position may be the top of the shooting preview interface.
In an embodiment of the present disclosure, in order to facilitate that a user may adjust the display position of the first operation region as desired, the position of the first operation region on the shooting preview interface of the embodiment may be adjustable.
As an exemplary embodiment, after the transparent first operation area is displayed on the shooting preview interface, when it is detected that a drag operation is performed on the first operation area, the first operation area is moved according to the drag operation; when it is detected that the drag operation on the first operation area has stopped, the first operation area is controlled to stay at its current position on the shooting preview interface. In this way, the user can move the first operation area on the shooting preview interface to a position convenient for operation, which makes it easier to remotely trigger the control in the first operation area.
Specifically, after the transparent first operation area is displayed on the shooting preview interface, a user can perform mirror image dragging on the first operation area according to the needs of the user, when the electronic device detects that the user performs mirror image dragging on the first operation area, the first operation area can be moved according to the dragging operation, and when the electronic device detects that the dragging operation is stopped being performed on the first operation area, the first operation area can be controlled to stop at the current position of the shooting preview interface.
For example, the user may drag the first operation region to a top position in the shooting preview interface through mirror dragging; for another example, the user may drag the first operation area to a left position in the shooting preview interface through mirror image dragging; for another example, the user may drag the first operation region to a right position in the photographing preview interface through a mirror drag.
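A hypothetical event-handler sketch of this drag behavior is shown below for illustration; the event names, the region representation, and the class names are assumptions rather than the claimed implementation:

```python
from dataclasses import dataclass

@dataclass
class OperationRegion:
    x: float
    y: float
    width: float
    height: float

    def contains(self, px, py):
        return self.x <= px <= self.x + self.width and self.y <= py <= self.y + self.height

class DragController:
    """Moves the first operation region while a drag is in progress and leaves it
    at its current position once the drag stops."""

    def __init__(self, region: OperationRegion):
        self.region = region
        self._dragging = False
        self._last = None

    def on_drag_start(self, px, py):
        if self.region.contains(px, py):
            self._dragging = True
            self._last = (px, py)

    def on_drag_move(self, px, py):
        if self._dragging:
            dx, dy = px - self._last[0], py - self._last[1]
            self.region.x += dx
            self.region.y += dy
            self._last = (px, py)

    def on_drag_end(self):
        # The region simply stays where it was last moved to.
        self._dragging = False
        self._last = None
```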
In step 302, a specified object in the currently captured image is identified.
Wherein the designated object includes, but is not limited to, a fingertip.
For example, after the currently captured image is displayed on the shooting preview interface, the electronic device may identify a fingertip in the currently captured image based on a barycentric-distance method: the electronic device detects the hand region based on skin-color detection, calculates the center of gravity of the hand region, finds, among the edge points of the hand region, the point farthest from the center of gravity as a fingertip candidate, and identifies that point as a fingertip if its distance to the center of gravity is greater than 1.6 times the average distance from the edge to the center of gravity.
For another example, after an image of a girl is displayed on the shooting preview interface, the identification module in the electronic device may identify whether a fingerprint exists in the image; if so, it can be determined that a fingertip exists in the image, and the fingertip is then identified in the image.
In step 303, the current display position of the designated object on the shooting preview interface is determined.
For example, a fingertip in the current captured image is recognized, coordinates (x, y) of the fingertip in the current captured image can be determined, and based on the position mapping relationship between the current captured image and the capture preview interface, the current display position of the fingertip on the capture preview interface can be determined to be (x, y).
For another example, when a fingertip in the current captured image is recognized, the midpoint position of the fingertip in the current captured image can be determined, and the midpoint position of the fingertip in the current captured image can be used as the midpoint position of the fingertip on the capture preview interface.
In step 304, if it is determined that the control is triggered according to the current display position of the designated object and the position of the control, a value of the functional parameter of the current image processing function is adjusted to obtain an adjusted functional parameter.
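A hedged sketch of how the functional parameter might be adjusted according to its current value and the trigger duration is given below for illustration; the step size, bounds, and function name are assumptions, since the disclosure only states that the adjustment depends on the current value and the duration:

```python
def adjust_parameter(current_value, trigger_duration,
                     step_per_second=10.0, min_value=0.0, max_value=100.0):
    """Adjust a functional parameter (e.g. a beauty or long-leg level) based on
    the current value and how long the control has been triggered."""
    adjusted = current_value + step_per_second * trigger_duration
    return max(min_value, min(max_value, adjusted))

# e.g. holding a fingertip on the control for 1.5 s raises a level of 40 to 55
print(adjust_parameter(40.0, 1.5))  # -> 55.0
```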
In an embodiment of the disclosure, in order to prompt a user that a designated object has entered a first operation area, after a transparent first operation area is displayed on a shooting preview interface, if it is detected that a current display position is within the first operation area, a contour corresponding to the designated object is acquired, and the contour is displayed in a preset manner.
The preset mode is a preset mode for displaying the outline of the designated object, and the preset mode may be a mode for displaying the outline corresponding to the designated object in a preset color, for example, the outline of the designated object may be displayed in a red, yellow or blue mode.
In the embodiment of the present disclosure, when the designated object is a fingertip, the image operation region of the hand corresponding to the fingertip can be determined according to the currently-captured image, and then the contour corresponding to the finger can be determined according to the image operation region.
For example, as shown in fig. 4, after the transparent first operation region is displayed on the shooting preview interface, when it is detected that the display position of the fingertip is within the first operation region, the image operation region of the hand corresponding to the fingertip can be determined from the currently captured image, the outline of the hand is then determined from that image operation region, and the outline is displayed with red edge highlighting.
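For illustration only, outlining the hand can be sketched with OpenCV as below, assuming a binary mask of the hand region has been obtained from the image operation area; the function name and default color are assumptions:

```python
import cv2

def highlight_hand_outline(frame, hand_mask, color=(0, 0, 255), thickness=2):
    """Draw the hand contour on the preview frame (red by default, BGR order)."""
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        hand = max(contours, key=cv2.contourArea)   # keep the largest contour as the hand
        cv2.drawContours(frame, [hand], -1, color, thickness)
    return frame
```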
It should be noted that after the first operation area is displayed on the preview shooting interface, if the second operation area is also displayed on the preview shooting interface, the second operation area may generally affect the display of the currently shot image in the shooting preview interface, that is, the second operation area may block part of the content of the currently shot image. In order to reduce the influence of the second operation area on the display of the current shot image in the shooting preview interface, in one embodiment of the present disclosure, a hiding instruction may be received, where the hiding instruction is used to instruct hiding the second operation area, and the hiding processing may be performed on the second operation area in response to the hiding instruction.
It will be appreciated that there are a variety of ways in which the user input hides the second operating region, examples of which are illustrated below:
as an exemplary embodiment, a current gesture input by a user may be acquired, and if the gesture input by the user is consistent with a gesture corresponding to hiding the second operation region, it is determined that a hiding instruction for hiding the second operation region is received, and according to the hiding instruction, the second operation region in the preview shooting interface is hidden.
As another exemplary embodiment, the current voice information input by the user may be acquired, if the current voice information contains a keyword for hiding the second operation area, it is determined that the user inputs a hiding instruction for hiding the second operation area, and the second operation area in the preview shooting interface is hidden according to the hiding instruction.
As another exemplary embodiment, it may be determined whether the current display position of the designated object is within the first operation region, and if it is determined that the current display position of the designated object is within the first operation region, it is determined that a hiding instruction for hiding the second operation region is received, and in response to the hiding instruction, the hiding process is performed on the second operation region in the preview shooting interface.
For example, when the specified object is a fingertip and the current display position of the fingertip is not within the first operation region, as shown in a in fig. 5, it can be understood that the top region in a is the first operation region, and the region including the image processing functions such as one-key slimming, long leg, thin waist, and small head is the second operation region. Upon detecting that the current display position of the fingertip is within the first operation region, a hiding instruction for hiding the second operation region input by the user may be determined, and the bottom second operation region may be hidden, wherein an exemplary view of the photographing preview interface hiding the second operation region is shown as b in fig. 5.
In the embodiment of the disclosure, after the second operation area is subjected to hiding processing, in order to meet the requirement of displaying the second operation area again by a user, a display instruction may be received, where the display instruction is used to instruct to display the second operation area, and in response to the display instruction, the second operation area is controlled to be displayed again on the shooting preview interface. There are various ways to trigger the display instruction for displaying the second operation area, and the examples are as follows:
in some embodiments, if a click operation for the photographing preview interface is received, it is determined to receive a display instruction.
For example, after the second operation area is hidden, the user may click the shooting preview interface through the mirrored image; when the electronic device receives the user's click operation on the shooting preview interface, it may determine that a display instruction for the second operation area is received, and control the second operation area to be displayed again on the shooting preview interface in response to the display instruction.
In some embodiments, if a preset keyword is included in the received voice information, it is determined to receive a display instruction.
For another example, after the second operation area is hidden, the user may say "please help to open the second operation area", and then the electronic device may collect the "please help to open the second operation area" input by the user, and when the electronic device detects "open the second operation area", the electronic device may determine to receive the instruction to display the second operation area, and in response to the display instruction, control to display the second operation area again on the shooting preview interface.
In some embodiments, if the received gesture is consistent with a preset gesture corresponding to displaying the first operation area, it is determined to receive a display instruction.
For another example, after the second operation region is hidden, the user may make an "a" gesture, and the electronic device may receive the "a" gesture input by the user, and when it is detected that the received "a" gesture matches the preset gesture "a" corresponding to displaying the first operation region, it may be determined to receive a command to display the second operation region, and in response to the display command, the second operation region is controlled to be displayed again on the shooting preview interface.
In actual applications, the user may select, as required, the manner of triggering the display of the second operation area, which is not specifically limited in this embodiment.
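The following sketch illustrates one possible way to combine the hide and re-display triggers described above; the helper names, the keyword, and the preset gesture are hypothetical and only serve to make the conditions concrete:

```python
# Illustrative sketch of the hide / re-display decisions for the second
# operation area. All names and thresholds are assumptions for illustration.

def should_hide_second_area(fingertip_pos, first_area_rect):
    """Hide the second operation area once the fingertip enters the first area."""
    x, y = fingertip_pos
    left, top, right, bottom = first_area_rect
    return left <= x <= right and top <= y <= bottom

def should_redisplay_second_area(tapped, speech_text, gesture,
                                 keyword="open the second operation area",
                                 preset_gesture="a"):
    """Re-display on a tap, a recognized keyword, or a matching preset gesture."""
    return tapped or keyword in speech_text or gesture == preset_gesture

print(should_hide_second_area((120, 80), (0, 0, 300, 200)))                               # True
print(should_redisplay_second_area(False, "please open the second operation area", ""))  # True
```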
In the embodiment of the disclosure, in order to prompt the user that the control is triggered, before the value of the functional parameter is adjusted to obtain the adjusted functional parameter, the control can be controlled to be displayed in a second display mode, where the display styles of the control in the first display mode and in the second display mode are different. Before the control is triggered, the control is displayed in the first display mode.
For example, when the designated object touches the function control, the control may be shown in a pressed state, that is, displayed in the second display mode; when the designated object leaves the control, the control returns to the normal state, that is, it is displayed in the first display mode. For example, as shown in fig. 6, a yellow control may indicate the pressed state, and a white control may indicate the normal state.
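A minimal sketch of the two display modes follows, using the colors from the fig. 6 example; the class and attribute names are assumptions and not part of the disclosure:

```python
# Illustrative sketch: white = normal state (first display mode),
# yellow = pressed state (second display mode).

class FunctionControl:
    def __init__(self, name):
        self.name = name
        self.pressed = False  # False = first display mode, True = second display mode

    def set_pressed(self, pressed):
        self.pressed = pressed

    @property
    def color(self):
        return "yellow" if self.pressed else "white"

ctrl = FunctionControl("long leg")
ctrl.set_pressed(True)   # fingertip touches the control
print(ctrl.color)        # 'yellow'
ctrl.set_pressed(False)  # fingertip leaves the control
print(ctrl.color)        # 'white'
```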
In step 305, the current captured image is processed and displayed according to the adjusted functional parameters.
That is, after the adjusted functional parameters are obtained, the electronic device processes the currently captured image according to the adjusted functional parameters and then displays the processed image.
For example, after the value of the functional parameter of the leg-slimming function is adjusted to obtain the adjusted functional parameter, leg-slimming processing can be performed on the currently captured image according to the adjusted functional parameter, and the processed image is displayed.
According to the shooting control method provided by the embodiment of the disclosure, a control and a currently captured image are displayed on a shooting preview interface, a designated object in the currently captured image is identified, the current display position of the designated object on the shooting preview interface is determined, the current image processing function is obtained, and if the control is determined to be triggered according to the current display position of the designated object and the position of the control, the value of the functional parameter of the current image processing function is adjusted to obtain the adjusted functional parameter, and the currently captured image is processed and displayed according to the adjusted functional parameter. In this way, the user can trigger, in a remote and contact-free manner, the functional control of the image processing function in the shooting preview interface, which avoids the poor shooting experience caused by the user repeatedly walking to the device to trigger the control, improves the user's shooting experience, and improves human-computer interaction.
As shown in fig. 7, the specific implementation manner of adjusting the value of the functional parameter of the current image processing function to obtain the adjusted functional parameter is as follows:
in step 701, the duration of the control being triggered is determined according to the current display position of the specified object and the position of the control.
In the embodiment of the disclosure, if it is determined, according to the current display position and the position of the control, that the overlapping area between the designated object and the control exceeds a preset threshold and that the duration for which this area exceeds the preset threshold exceeds a preset duration, the control is determined to be triggered; the duration for which the control is triggered is then determined.
That is, when the current display position of the designated object coincides with the position of the control and the overlapping area between them exceeds the preset threshold, it can be determined that the control is triggered, and the duration for which the control is triggered can be determined from the length of time the overlapping area has exceeded the preset threshold.
For example, as shown in FIG. 8, when the current position of the fingertip coincides with the position of the control (shown as an inline icon in FIG. 8) in the first operation region, it can be judged whether the overlapping area between the fingertip and the control exceeds a preset threshold; if so, the control in the first operation region is determined to be triggered. It is then judged whether the duration for which the overlapping area exceeds the preset threshold exceeds the preset duration, and if so, the duration for which the control in the first operation region is triggered can be determined.
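The overlap-plus-duration check can be sketched as follows; the rectangle model, thresholds, and timing source are illustrative assumptions rather than details taken from the disclosure:

```python
# Illustrative sketch: a control counts as triggered once the fingertip/control
# overlap area stays above a threshold for longer than a preset duration.
import time

def overlap_area(a, b):
    """a, b: rectangles as (left, top, right, bottom); returns the overlap area."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return max(w, 0) * max(h, 0)

class TriggerDetector:
    def __init__(self, area_threshold=400, min_duration=0.5):
        self.area_threshold = area_threshold
        self.min_duration = min_duration
        self._since = None  # time the overlap first exceeded the threshold

    def update(self, fingertip_rect, control_rect, now=None):
        """Returns (triggered, trigger_duration) for the current frame."""
        now = time.monotonic() if now is None else now
        if overlap_area(fingertip_rect, control_rect) > self.area_threshold:
            if self._since is None:
                self._since = now
            duration = now - self._since
            return duration >= self.min_duration, duration
        self._since = None
        return False, 0.0
```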
In order to avoid triggering an invalid control operation, in the embodiment of the disclosure, if the current display position coincides with the positions of M controls, where M is an integer greater than 1 and less than or equal to N, the inclination angle of the designated object relative to a reference direction is determined according to the current display position. If the inclination angle is smaller than a preset angle threshold, first prompt information is output, where the first prompt information is used to prompt that the operation is invalid and to prompt adjusting the inclination angle of the designated object; if the inclination angle of the designated object is greater than or equal to the preset angle threshold, second prompt information is output, where the second prompt information is used to prompt that the operation is invalid and to prompt adjusting the distance between the designated object and the shooting preview interface.
The prompt information may be delivered in various ways, including but not limited to a voice prompt or a text prompt.
For example, when it is detected that the user's fingertip overlaps a plurality of controls, the inclination angle of the fingertip relative to the reference direction can be determined according to the current fingertip position. When the inclination angle is smaller than the preset angle threshold, the operation can be determined to be invalid and the user is prompted by voice that the fingertip direction is wrong; when the inclination angle is greater than or equal to the preset angle threshold, the operation can be determined to be invalid and the user is prompted by voice to adjust the distance between the fingertip and the shooting preview interface.
For another example, when it is detected that the user's fingertip overlaps a plurality of controls, the inclination angle of the fingertip relative to the reference direction can be determined according to the current fingertip position. When the inclination angle is smaller than the preset angle threshold, the operation can be determined to be invalid and the text "the fingertip direction is wrong" is displayed in the shooting preview interface to prompt the user; when the inclination angle is greater than or equal to the preset angle threshold, the operation can be determined to be invalid and the text "adjust the distance between the fingertip and the shooting preview interface" is displayed to prompt the user.
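A sketch of this tilt-angle disambiguation follows; estimating the fingertip direction from two keypoints and the specific angle threshold are assumptions made only for illustration:

```python
# Illustrative sketch: decide which prompt to issue when the fingertip overlaps
# several controls, based on its tilt relative to a reference direction.
import math

def tilt_angle(tip_xy, knuckle_xy, reference_deg=90.0):
    """Angle between the fingertip direction and the reference direction, in degrees."""
    dx = tip_xy[0] - knuckle_xy[0]
    dy = tip_xy[1] - knuckle_xy[1]
    finger_deg = math.degrees(math.atan2(dy, dx))
    return abs(finger_deg - reference_deg)

def prompt_for_ambiguous_touch(angle_deg, angle_threshold=30.0):
    if angle_deg < angle_threshold:
        return "Operation invalid: please adjust the fingertip direction."
    return "Operation invalid: please adjust the distance to the shooting preview interface."

print(prompt_for_ambiguous_touch(tilt_angle((100, 50), (100, 150))))
```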
In step 702, the current value of the functional parameter is obtained.
In some embodiments, after the current image processing function is obtained, a current value of a function parameter of the current image processing function may be obtained, where the current value of the function parameter may be a numerical value automatically generated by the electronic device according to the current captured image, for example, when a person in the current captured image is fat, the current value of the function parameter is large; when the figure in the current shot image is thin, the current value of the functional parameter is small.
In some embodiments, after the current image processing function is obtained, a current value of a function parameter of the current image processing function may be obtained, where the current value of the function parameter may be a function value used by a user last time.
In some embodiments, after the current image processing function is obtained, the current value of its functional parameter may be obtained, where the current value may be the value that appears most frequently in the user's usage history.
In step 703, a value of the functional parameter is adjusted according to the current value and the duration to obtain an adjusted functional parameter.
For example, as shown in fig. 9, the current image processing function is the long-leg function. When the user, through the mirrored image, clicks the control (shown as an inline icon in FIG. 9) in the first operation area and the trigger duration is 1 second, the electronic device increases the processing strength of the long-leg function, for example adjusting the value of the functional parameter from the current 64 to 65.
For another example, as shown in fig. 9, when the current image processing function is the long-leg function and the user, through the mirrored image, clicks the control (shown as an inline icon in FIG. 9) in the first operation area with a trigger duration of 2 seconds, the electronic device decreases the processing strength of the long-leg function, for example adjusting the value of the functional parameter from the current 64 to 62.
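Consistent with the fig. 9 examples above, the adjustment of the parameter value by trigger duration might look like the following sketch; the step sizes, the duration threshold, and the value range are assumptions:

```python
# Illustrative sketch: a short trigger (about 1 second) raises the functional
# parameter by one step, while a longer trigger (about 2 seconds) lowers it.

def adjust_parameter(current_value, trigger_duration,
                     increase_below=1.5, lo=0, hi=100):
    if trigger_duration < increase_below:
        new_value = current_value + 1   # e.g. 64 -> 65
    else:
        new_value = current_value - 2   # e.g. 64 -> 62
    return max(lo, min(hi, new_value))

print(adjust_parameter(64, 1.0))  # 65
print(adjust_parameter(64, 2.0))  # 62
```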
Fig. 10 is a flowchart illustrating a photographing control method according to another specific embodiment according to an exemplary embodiment. As shown in fig. 10, the shooting control method of the present disclosure includes the steps of:
in step 1001, a control and a currently photographed image are displayed on a photographing preview interface.
Wherein the control is used for indicating switching from the current image processing function to the adjacent image processing function.
In step 1002, a specified object among the currently captured images is identified.
In step 1003, the current display position of the designated object on the shooting preview interface is determined.
It should be noted that, in the embodiment of the present disclosure, the implementation manners of the steps 1001 to 1003 may refer to the implementation manners of the steps 301 to 303, which is not described herein again.
In step 1004, a corresponding first image processing function is obtained according to the current image processing function.
Wherein the first image processing function is an image processing function adjacent to the current image processing function.
For example, the first image processing function includes, but is not limited to, face thinning, waist thinning, heightening, eye enlarging, and the like.
In step 1005, the current image processing function is switched to the first image processing function.
In step 1006, the currently captured image is processed and displayed according to the first image processing function.
For example, when the current image processing function is the long-leg function, as shown in a in fig. 11, after the user's fingertip touches, through the mirrored image, the control for switching to the next image processing function (shown as an inline icon in FIG. 11), if the next image processing function adjacent to the current function is the waist-thinning function, the current function may be switched to the waist-thinning function; an exemplary diagram of the shooting preview interface after the image processing function is switched is shown as b in fig. 11.
In addition, fig. 11 also shows a control for switching to the previous image processing function (shown as an inline icon in FIG. 11). After it is detected that the designated object touches this control through the mirrored image, the previous image processing function adjacent to the current image processing function can be obtained according to the current image processing function and the switching instruction for switching to the previous image processing function, the current image processing function is switched to the previous image processing function, and the currently captured image is processed according to the switched image processing function. For example, if the current image processing function is the small-head function and the previous image processing function adjacent to it is the waist-slimming function, then when it is detected that the current display position of the fingertip is located in the display area of the control for switching to the previous image processing function, the small-head function is switched to the waist-slimming function, and waist-slimming processing is performed on the waist of the person in the currently captured image according to the waist-slimming function.
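A sketch of switching to an adjacent image processing function from an ordered list is given below; the function names and their ordering are assumptions based on the examples above, not a definitive implementation:

```python
# Illustrative sketch: switch from the current image processing function to the
# adjacent (next or previous) one in a fixed, ordered list.

FUNCTIONS = ["one-key slimming", "long leg", "thin waist", "small head"]

def adjacent_function(current, direction):
    """direction: +1 switches to the next function, -1 to the previous one."""
    idx = FUNCTIONS.index(current)
    return FUNCTIONS[(idx + direction) % len(FUNCTIONS)]

print(adjacent_function("long leg", +1))    # 'thin waist'
print(adjacent_function("small head", -1))  # 'thin waist'
```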
According to the shooting control method provided by the embodiment of the disclosure, a control and a currently captured image are displayed on a shooting preview interface, a designated object in the currently captured image is identified, the current display position of the designated object on the shooting preview interface is determined, the corresponding first image processing function is obtained according to the current image processing function, the current image processing function is switched to the first image processing function, and the currently captured image is processed and displayed according to the first image processing function. In this way, by determining the current display position of the designated object on the shooting preview interface, determining that the control is triggered according to that position and the position of the control, and then switching to the adjacent image processing function, the user can trigger the control for switching the image processing function in the shooting preview interface in a remote, contact-free manner, which avoids the poor shooting experience caused by the user repeatedly walking to the device to trigger the control, improves the user's shooting experience, and improves human-computer interaction.
Fig. 12 is a block diagram illustrating a photographing control apparatus according to an exemplary embodiment.
Referring to fig. 12, the apparatus 1200 includes a display module 121, a recognition module 122, a determination module 123, and an execution module 124.
The display module 121 is configured to display a control and a current shot image on a shot preview interface;
the recognition module 122 is configured to recognize a specified object among the currently photographed images;
the determination module 123 is configured to determine a current display position of the specified object on the photographing preview interface; and
the executing module 124 is configured to execute the operation corresponding to the control if the control is triggered according to the current display position of the specified object and the position of the control.
In an embodiment of the present disclosure, the display module 121 includes an acquisition size unit, a size adjustment unit, and a display image unit, wherein the acquisition size unit is configured to acquire the size of the currently captured image; the size adjustment unit is configured to adjust the size of the currently captured image according to the size of the shooting preview interface; and the display image unit is configured to display, on the shooting preview interface, the currently captured image having the same size as the shooting preview interface.
In an embodiment of the present disclosure, the determining module 123 includes an acquisition position unit and a position determining unit, wherein the acquisition position unit is configured to determine the image position of the specified object in the currently captured image; and the position determining unit is configured to determine the current display position of the specified object on the shooting preview interface according to the image position, based on the position mapping relationship between the currently captured image and the shooting preview interface.
In an embodiment of the present disclosure, the executing module 124 is configured to determine that the control is triggered if it is determined that a coincidence area between the current display position of the specified object and the position of the control exceeds a preset threshold, and a duration of the coincidence area exceeding the preset threshold exceeds a preset duration.
In an embodiment of the disclosure, the control is used to instruct adjusting a functional parameter of the corresponding image processing function, and the executing module 124 includes: a first adjusting unit 1241 configured to perform value adjustment on the functional parameter of the current image processing function to obtain the adjusted functional parameter; and a first processing unit 1242 configured to process and display the currently captured image according to the adjusted functional parameter.
In an embodiment of the present disclosure, the first adjusting unit 1241 includes: the first determining subunit is configured to determine the duration of the control triggered according to the current display position of the specified object and the position of the control; a first obtaining subunit configured to obtain a current value of the functional parameter; and the first adjusting subunit is configured to perform value adjustment on the functional parameter according to the current value and the duration to obtain an adjusted functional parameter.
In an embodiment of the present disclosure, a control is displayed in a first display manner on a shooting preview interface, where the apparatus further includes: and the control display module is configured to control the control to be displayed in a second display mode, wherein the display styles of the control in the first display mode and the control in the second display mode are different.
In an embodiment of the present disclosure, the display module 121 includes: a reception instruction unit configured to execute reception of a trigger instruction for a first image processing function; and the first operation area display unit is configured to display a first operation area which is transparent and covers the current shooting image on the shooting preview interface in response to the trigger instruction, wherein the first operation area comprises a control.
In an embodiment of the present disclosure, the shooting preview interface further includes a second operation area, where the second operation area includes a touch control corresponding to the first image processing function, and the receiving instruction unit is configured to: receive a touch operation for the touch control; and trigger the trigger instruction for the first image processing function according to the touch operation.
In an embodiment of the disclosure, the apparatus further comprises: a movement region unit configured to, when it is detected that a drag operation is performed on the first operation region, move the first operation region in accordance with the drag operation; and a control area unit configured to control the first operation area to stay at a current position of the photographing preview interface when it is detected that the drag operation is stopped being performed on the first operation area.
In an embodiment of the disclosure, the apparatus further comprises: and the contour acquisition module is configured to acquire a contour corresponding to the designated object and display the contour in a preset mode if the current display position is detected to be in the first operation area.
In an embodiment of the present disclosure, the specified object is a fingertip, and the contour acquisition module includes: a determination region unit configured to determine the image operation region of the hand corresponding to the fingertip from the currently captured image; and a contour determining unit configured to determine the contour of the hand according to the image operation region.
In an embodiment of the disclosure, the apparatus further comprises: a hiding instruction receiving module configured to receive a hiding instruction, wherein the hiding instruction is used for indicating to hide the second operation area; and the response hiding instruction module is configured to respond to the hiding instruction and hide the second operation area.
In an embodiment of the present disclosure, the receive hiding instruction module is configured to determine that a hiding instruction is received if it is detected that the current display position is within the first operation region.
In an embodiment of the disclosure, the apparatus further comprises: the receiving and displaying instruction module is configured to receive a displaying instruction, wherein the displaying instruction is used for indicating to display the second operation area; and the response display instruction module is configured to respond to the display instruction and control the second operation area to be displayed again on the shooting preview interface.
In an embodiment of the present disclosure, the receiving display instruction module is configured to determine to receive a display instruction if a click operation for the shooting preview interface is received; or, if the received voice message comprises a preset keyword, determining to receive a display instruction; or if the received gesture is consistent with the preset gesture corresponding to the first operation area, determining to receive a display instruction.
In an embodiment of the present disclosure, the first adjusting unit 1241 includes: a first determining subunit 12411, configured to determine, according to the current display position of the specified object and the position of the control, a duration of time for which the control is triggered; a first obtaining subunit 12412 configured to perform obtaining a current value of the functional parameter; and a first adjusting subunit 12413, configured to perform value adjustment on the functional parameter according to the current value and the duration, so as to obtain an adjusted functional parameter.
In an embodiment of the present disclosure, there are N controls, where N is an integer greater than 1, and the apparatus further includes: an inclination angle determining module configured to determine the inclination angle of the specified object relative to the reference direction according to the current display position if the current display position coincides with the positions of M controls, where M is an integer greater than 1 and less than or equal to N; and a first prompt module configured to output first prompt information if the inclination angle is smaller than the preset angle threshold, where the first prompt information is used to prompt that the operation is invalid and to prompt adjusting the inclination angle of the specified object.
In an embodiment of the disclosure, the apparatus further comprises: and the second prompting module is configured to output second prompting information if the inclination angle of the specified object is greater than or equal to the preset angle threshold, wherein the second prompting information is used for prompting that the operation is invalid and prompting that the distance between the specified object and the shooting preview interface is adjusted.
In an embodiment of the present disclosure, the control is configured to instruct to switch to an adjacent image processing function, and the execution module 124 includes: a second obtaining unit 1243 configured to perform obtaining a corresponding first image processing function according to the current image processing function, where the first image processing function is an image processing function adjacent to the current image processing function; a switching unit 1244 configured to switch the current image processing function to the first image processing function; and a first image processing unit 1245 configured to process and display the currently photographed image according to the first image processing function.
The shooting control device provided by the embodiment of the disclosure displays a control and a current shooting image on a shooting preview interface; identifying a specified object among the currently photographed images; determining the current display position of a specified object on a shooting preview interface; and if the control is determined to be triggered according to the current display position of the specified object and the position of the control, executing the operation corresponding to the control. Therefore, according to the method and the device, the current display position of the designated object on the shooting preview interface is determined, the control is determined to be triggered according to the current display position of the designated object and the position of the control, and then the operation corresponding to the control is executed, so that a user can trigger the function control in the shooting preview interface in a remote operation mode, the user can trigger the control in a non-contact mode conveniently, the problem of poor shooting experience of the user caused by the fact that the user frequently walks to the device to trigger the control is avoided, and the shooting experience of the user is improved.
To implement the above embodiments, the present disclosure also provides an electronic device, and fig. 13 is a block diagram of an electronic device 1300 shown according to an exemplary embodiment. For example, the electronic device 1300 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and so forth.
Referring to fig. 13, electronic device 1300 may include one or more of the following components: a processing component 1302, a memory 1304, a power component 1306, a multimedia component 1308, an audio component 1310, an input/output (I/O) interface 1312, a sensor component 1314, and a communication component 1316.
The processing component 1302 generally controls overall operation of the electronic device 1300, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 1302 may include one or more processors 1320 to execute instructions to perform all or part of the steps of the capture control method described above. Further, the processing component 1302 can include one or more modules that facilitate interaction between the processing component 1302 and other components. For example, the processing component 1302 may include a multimedia module to facilitate interaction between the multimedia component 1308 and the processing component 1302.
The memory 1304 is configured to store various types of data to support operation at the electronic device 1300. Examples of such data include instructions for any application or method operating on the electronic device 1300, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 1304 may be implemented by any type or combination of volatile or non-volatile memory devices such as static random access memory (SRAM), electrically erasable programmable read-only memory (EEPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 1306 provides power to the various components of the electronic device 1300. Power components 1306 may include a power management system, one or more power sources, and other components associated with generating, managing, and distributing power for electronic device 1300.
The multimedia component 1308 includes a touch-sensitive display screen that provides an output interface between the electronic device 1300 and a user. In some embodiments, the touch display screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 1308 includes a front facing camera and/or a rear facing camera. The front-facing camera and/or the rear-facing camera may receive external multimedia data when the electronic device 1300 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 1310 is configured to output and/or input audio signals. For example, the audio component 1310 includes a Microphone (MIC) configured to receive external audio signals when the electronic device 1300 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 1304 or transmitted via the communication component 1316. In some embodiments, the audio component 1310 also includes a speaker for outputting audio signals.
The I/O interface 1312 provides an interface between the processing component 1302 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 1314 includes one or more sensors for providing various aspects of state assessment for the electronic device 1300. For example, the sensor assembly 1314 may detect the open/closed state of the electronic device 1300 and the relative positioning of components, such as the display and keypad of the electronic device 1300; it may also detect a change in the position of the electronic device 1300 or of a component of the electronic device 1300, the presence or absence of user contact with the electronic device 1300, the orientation or acceleration/deceleration of the electronic device 1300, and a change in the temperature of the electronic device 1300. The sensor assembly 1314 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 1314 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 1314 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 1316 is configured to facilitate communications between the electronic device 1300 and other devices in a wired or wireless manner. The electronic device 1300 may access a wireless network based on a communication standard, such as WiFi, 2G, or 3G, or a combination thereof. In an exemplary embodiment, the communication component 1316 receives broadcast signals or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communications component 1316 also includes a near field communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on radio frequency identification (RFID) technology, infrared data association (IrDA) technology, ultra wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 1300 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors, or other electronic components for performing the above-described photographing control method.
In an exemplary embodiment, a non-transitory computer-readable storage medium including instructions, such as the memory 1304 including instructions, executable by the processor 1320 of the electronic device 1300 to perform the above-described photographing control method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
A non-transitory computer-readable storage medium in which instructions, when executed by a processor of an electronic device 1300, enable the electronic device 1300 to perform a photographing control method.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (40)

1. A shooting control method, characterized by comprising:
displaying a control and a current shot image on a shooting preview interface, wherein the control is used for indicating to adjust the function parameter of the current image processing function or indicating to switch from the current image processing function to an adjacent image processing function, and the image processing function is an image processing function used for the current shot image;
identifying a specified object in the current captured image, wherein the specified object comprises a fingertip;
determining the current display position of the specified object on the shooting preview interface; and
and if the control is determined to be triggered according to the current display position of the specified object and the position of the control, executing the operation corresponding to the control.
2. The photographing control method according to claim 1, wherein the displaying the currently photographed image on the photographing preview interface includes:
acquiring the size of a current shot image;
adjusting the size of the current shot image according to the size of the shooting preview interface; and
and displaying the current shooting image with the same size as the shooting preview interface on the shooting preview interface.
3. The photographing control method according to claim 1, wherein the determining of the current display position of the designated object on the photographing preview interface includes:
determining an image position of the specified object in the current captured image; and
and determining the current display position of the specified object on the shooting preview interface according to the image position based on the position mapping relation between the current shooting image and the shooting preview interface.
4. The shooting control method according to claim 1, wherein the determining that the control is triggered according to the current display position of the specified object and the position of the control includes:
and if it is determined that the overlapping area between the current display position of the specified object and the position of the control exceeds a preset threshold and that the duration for which the overlapping area exceeds the preset threshold exceeds a preset duration, determining that the control is triggered.
5. The shooting control method according to claim 1, wherein the control is used for instructing to adjust a function parameter of a current image processing function, and the executing operation corresponding to the control comprises:
carrying out value adjustment on the functional parameters of the current image processing function to obtain the adjusted functional parameters; and
and processing and displaying the current shot image according to the adjusted functional parameters.
6. The shooting control method according to claim 5, wherein the performing value adjustment on the functional parameter of the current image processing function to obtain the adjusted functional parameter includes:
determining the duration of triggering the control according to the current display position of the specified object and the position of the control;
acquiring the current value of the functional parameter; and
and adjusting the value of the functional parameter according to the current value and the duration to obtain the adjusted functional parameter.
7. The shooting control method according to claim 5, wherein the control is displayed in a first display manner on the shooting preview interface, and before the value adjustment of the function parameter is performed to obtain the adjusted function parameter, the method further includes:
and controlling to display the control in a second display mode, wherein the display styles of the control in the first display mode and the control in the second display mode are different.
8. The shooting control method according to claim 1, wherein the control is used for instructing switching from a current image processing function to an adjacent image processing function, and the executing operation corresponding to the control comprises:
acquiring a corresponding first image processing function according to the current image processing function, wherein the first image processing function is an image processing function adjacent to the current image processing function;
switching the current image processing function to a first image processing function;
and processing and displaying the current shot image according to the first image processing function.
9. The shooting control method according to claim 1, wherein the control includes N, N being an integer greater than 1, the method further comprising:
if the current display position and the positions of the M controls have position superposition parts, determining the inclination angle of the specified object relative to the reference direction according to the current display position, wherein M is an integer which is greater than 1 and less than or equal to N;
and if the inclination angle is smaller than a preset angle threshold, outputting first prompt information, wherein the first prompt information is used for prompting that the operation is invalid and prompting to adjust the inclination angle of the specified object.
10. The shooting control method according to claim 9, characterized by further comprising:
and if the inclination angle of the specified object is larger than or equal to a preset angle threshold, outputting second prompt information, wherein the second prompt information is used for prompting that the operation is invalid and prompting to adjust the distance between the specified object and the shooting preview interface.
11. The shooting control method according to claim 5, wherein the displaying a control on the shooting preview interface includes:
receiving a trigger instruction aiming at the current image processing function;
and responding to the trigger instruction, displaying a first operation area which is transparent and covers the current shooting image on the shooting preview interface, wherein the first operation area comprises the control.
12. The shooting control method according to claim 11, further comprising a second operation area on the shooting preview interface, wherein the second operation area includes a touch control corresponding to the current image processing function, and the receiving a trigger instruction for the current image processing function comprises:
receiving a touch operation aiming at the touch control;
and triggering the triggering instruction aiming at the current image processing function according to the touch operation.
13. The photographing control method according to claim 11, wherein after displaying a first operation area that is transparent and overlaid on the current photographed image on the photographing preview interface, the method further comprises:
when the fact that the dragging operation is performed on the first operation area is detected, moving the first operation area according to the dragging operation;
and when the fact that the first operation area stops executing the dragging operation is detected, controlling the first operation area to stay at the current position of the shooting preview interface.
14. The shooting control method according to claim 12, characterized by further comprising:
receiving a hiding instruction, wherein the hiding instruction is used for indicating to hide the second operation area;
and responding to a hiding instruction, and hiding the second operation area.
15. The shooting control method according to claim 14, wherein the receiving of the hidden instruction includes:
and if the current display position is detected to be in the first operation area, determining that the hiding instruction is received.
16. The shooting control method according to claim 14, wherein after the hiding the second operation region, the method further comprises:
receiving a display instruction, wherein the display instruction is used for indicating to display the second operation area;
and responding to the display instruction, and controlling to display the second operation area on the shooting preview interface again.
17. The shooting control method according to claim 16, wherein the receiving of the display instruction includes:
if the click operation aiming at the shooting preview interface is received, determining to receive the display instruction; or,
if the received voice information comprises preset keywords, determining to receive the display instruction; or,
and if the received gesture is consistent with the preset gesture corresponding to the first operation area, determining to receive the display instruction.
18. The photographing control method according to claim 11, wherein after displaying a first operation area that is transparent and overlaid on the current photographed image on the photographing preview interface, the method further comprises:
and if the current display position is detected to be in the first operation area, acquiring a contour corresponding to the specified object, and displaying the contour in a preset mode.
19. The shooting control method according to claim 18, wherein the specified object is a fingertip, and the acquiring of the contour corresponding to the specified object includes:
determining an image operation area of a hand corresponding to the fingertip according to the current shot image;
and determining the corresponding contour of the fingertip according to the image operation area.
20. A shooting control apparatus, characterized by comprising:
the display module is configured to display a control and a current shot image on a shooting preview interface, wherein the control is used for indicating to adjust the function parameter of the current image processing function or indicating to switch from the current image processing function to an adjacent image processing function, and the image processing function is an image processing function used for the current shot image;
a recognition module configured to recognize a specified object among the currently captured image, wherein the specified object includes a fingertip;
a determination module configured to determine a current display position of the designated object on the photographing preview interface; and
and the execution module is configured to execute the operation corresponding to the control if the control is determined to be triggered according to the current display position of the specified object and the position of the control.
21. The shooting control apparatus according to claim 20, wherein the display module includes:
an acquisition size unit configured to acquire a size of a currently captured image;
a size adjustment unit configured to adjust a size of the currently photographed image according to a size of the photographing preview interface; and
a display image unit configured to display the current photographed image having the same size as the photographing preview interface on the photographing preview interface.
22. The shooting control apparatus according to claim 20, wherein the determining means includes:
an acquisition position unit configured to determine an image position of the specified object in the currently captured image; and
a position determining unit configured to determine a current display position of the designated object on the shooting preview interface according to the image position based on a position mapping relationship between the current shooting image and the shooting preview interface.
23. The shooting control apparatus of claim 20, wherein the execution module is specifically configured to:
and if it is determined that the overlapping area between the current display position of the specified object and the position of the control exceeds a preset threshold and that the duration for which the overlapping area exceeds the preset threshold exceeds a preset duration, determine that the control is triggered.
24. The shooting control apparatus according to claim 20 or 23, wherein the control is configured to instruct to adjust a function parameter of a current image processing function, and the execution module includes:
a first adjusting unit, configured to perform value adjustment on a functional parameter of the current image processing function to obtain the adjusted functional parameter; and
and the first processing unit is configured to process and display the current shot image according to the adjusted functional parameters.
25. The shooting control apparatus according to claim 24, wherein the first adjusting unit includes:
a first determining subunit, configured to determine, according to the current display position of the specified object and the position of the control, a duration for which the control is triggered;
a first obtaining subunit configured to obtain a current value of the functional parameter; and
and the first adjusting subunit is configured to perform value adjustment on the functional parameter according to the current value and the duration to obtain an adjusted functional parameter.
26. The photographic control apparatus of claim 24, wherein the control is displayed in a first display on the photographic preview interface, wherein the apparatus further comprises:
and the control display module is configured to control the control to be displayed in a second display mode, wherein the display styles of the control in the first display mode and the control in the second display mode are different.
27. The shooting control apparatus according to claim 20, wherein the control is configured to instruct switching from a current image processing function to an adjacent image processing function, and the execution module includes:
a second obtaining unit configured to obtain a corresponding first image processing function according to the current image processing function, wherein the first image processing function is an image processing function adjacent to the current image processing function;
a switching unit configured to switch the current image processing function to a first image processing function;
a first image processing unit configured to process and display the currently captured image according to the first image processing function.
28. The shooting control apparatus according to claim 20, wherein the control includes N, N being an integer greater than 1, the apparatus further comprising:
the inclination angle determining module is configured to determine an inclination angle of the specified object relative to a reference direction according to the current display position if the current display position and positions of the M controls all have position overlapping parts, wherein M is an integer which is greater than 1 and less than or equal to N;
the first prompt module is configured to output first prompt information if the inclination angle is smaller than a preset angle threshold, wherein the first prompt information is used for prompting that the operation is invalid and prompting to adjust the inclination angle of the specified object.
29. The shooting control apparatus according to claim 28, characterized by further comprising:
and the second prompting module is configured to output second prompting information if the inclination angle of the specified object is greater than or equal to a preset angle threshold, wherein the second prompting information is used for prompting that the operation is invalid and prompting to adjust the distance between the specified object and the shooting preview interface.
30. The shooting control apparatus according to claim 24, wherein the display module includes:
a reception instruction unit configured to receive a trigger instruction for the current image processing function;
a first operation area display unit configured to display a first operation area which is transparent and is overlaid on the current shot image on the shooting preview interface in response to the trigger instruction, wherein the first operation area includes the control.
31. The shooting control apparatus according to claim 30, further comprising a second operation area on the shooting preview interface, wherein the second operation area includes a touch control corresponding to the current image processing function, and the receiving instruction unit is specifically configured to:
receiving a touch operation aiming at the touch control;
and triggering the triggering instruction aiming at the current image processing function according to the touch operation.
32. The shooting control apparatus according to claim 30, characterized in that the apparatus further comprises:
a movement area unit configured to move the first operation area in accordance with a drag operation when it is detected that the drag operation is performed on the first operation area;
and a control area unit configured to control the first operation area to stay at its current position on the shooting preview interface when it is detected that the drag operation on the first operation area has stopped.
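The drag behaviour of claim 32, where the first operation area follows the drag and stays wherever it is released, reduces to a small state machine. The event shape (coordinates plus a phase string) below is an assumption for illustration only.

```python
class DraggableArea:
    """Sketch of claim 32: follow the drag, stay put when it ends."""

    def __init__(self, x: float, y: float):
        self.x, self.y = x, y
        self.dragging = False

    def on_drag_event(self, x: float, y: float, phase: str):
        if phase == "move":          # movement area unit: follow the drag
            self.dragging = True
            self.x, self.y = x, y
        elif phase == "end":         # control area unit: stay at the current position
            self.dragging = False
```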
33. The shooting control apparatus according to claim 31, characterized in that the apparatus further comprises:
a hiding instruction receiving module configured to receive a hiding instruction, wherein the hiding instruction instructs the apparatus to hide the second operation area;
and a hiding instruction response module configured to hide the second operation area in response to the hiding instruction.
34. The shooting control apparatus according to claim 33, wherein the hiding instruction receiving module is specifically configured to:
determine that the hiding instruction is received if the current display position is detected to be within the first operation area.
35. The shooting control apparatus according to claim 33, characterized by further comprising:
a display instruction receiving module configured to receive a display instruction, wherein the display instruction instructs the apparatus to display the second operation area;
and a display instruction response module configured to display the second operation area again on the shooting preview interface in response to the display instruction.
36. The shooting control apparatus according to claim 35, wherein the display instruction receiving module is specifically configured to:
determine that the display instruction is received if a click operation on the shooting preview interface is received; or
determine that the display instruction is received if received voice information includes a preset keyword; or
determine that the display instruction is received if a received gesture matches a preset gesture corresponding to the first operation area.
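Claims 33 to 36 describe hiding the second operation area once the specified object enters the first operation area, and redisplaying it on a click on the preview, a preset voice keyword, or a gesture matching the preset gesture of the first operation area. A condensed sketch of those conditions follows; the rectangle representation, keyword set and helper names are assumptions.

```python
PRESET_KEYWORDS = {"show menu", "display controls"}  # assumed preset keywords

def should_hide(display_pos, first_area_rect) -> bool:
    """Claim 34: hide when the current display position lies inside the first operation area."""
    x, y = display_pos
    left, top, right, bottom = first_area_rect
    return left <= x <= right and top <= y <= bottom

def should_show(clicked_preview: bool, voice_text: str,
                gesture: str, preset_gesture: str) -> bool:
    """Claim 36: any of the three triggers counts as a display instruction."""
    if clicked_preview:
        return True
    if any(k in voice_text.lower() for k in PRESET_KEYWORDS):
        return True
    return gesture == preset_gesture
```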
37. The shooting control apparatus according to claim 30, characterized in that the apparatus further comprises:
a contour acquisition module configured to acquire a contour corresponding to the specified object and to display the contour in a preset manner if the current display position is detected to be within the first operation area.
38. The shooting control apparatus according to claim 37, wherein the specified object is a fingertip, and the contour acquisition module includes:
a region determining unit configured to determine an image operation region of the hand corresponding to the fingertip from the currently captured image;
and a contour determining unit configured to determine the contour of the hand according to the image operation region.
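Claim 38 derives the hand contour from the fingertip position by first bounding an image operation region around the fingertip and then extracting the hand outline from it. The sketch below assumes an OpenCV 4.x pipeline with Otsu thresholding; the ROI size, the segmentation method and the choice of the largest contour are illustrative choices rather than the patented method.

```python
import cv2
import numpy as np

def hand_contour(frame_bgr: np.ndarray, fingertip_xy, roi_half: int = 120):
    """Crop a region around the fingertip and return the largest contour inside it."""
    x, y = int(fingertip_xy[0]), int(fingertip_xy[1])
    h, w = frame_bgr.shape[:2]
    left, top = max(0, x - roi_half), max(0, y - roi_half)
    right, bottom = min(w, x + roi_half), min(h, y + roi_half)
    roi = frame_bgr[top:bottom, left:right]          # image operation region of the hand

    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    return max(contours, key=cv2.contourArea)        # taken as the contour of the hand
```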
39. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the shooting control method according to any one of claims 1 to 19.
40. A storage medium, wherein when instructions in the storage medium are executed by a processor of an electronic device, the electronic device is enabled to perform the shooting control method according to any one of claims 1 to 19.
CN202010738488.3A 2020-07-28 2020-07-28 Shooting control method and device, electronic equipment and storage medium Active CN111988522B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010738488.3A CN111988522B (en) 2020-07-28 2020-07-28 Shooting control method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN111988522A CN111988522A (en) 2020-11-24
CN111988522B true CN111988522B (en) 2022-04-22

Family

ID=73444618

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010738488.3A Active CN111988522B (en) 2020-07-28 2020-07-28 Shooting control method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN111988522B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112911059B (en) * 2021-01-22 2022-06-07 维沃移动通信(杭州)有限公司 Photographing method and device, electronic equipment and readable storage medium
CN114995713B (en) * 2022-05-30 2024-06-18 维沃移动通信有限公司 Display control method, display control device, electronic equipment and readable storage medium

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007228233A (en) * 2006-02-23 2007-09-06 Fujifilm Corp Photographic device
US9104239B2 (en) * 2011-03-09 2015-08-11 Lg Electronics Inc. Display device and method for controlling gesture functions using different depth ranges
CN110058777B (en) * 2019-03-13 2022-03-29 华为技术有限公司 Method for starting shortcut function and electronic equipment

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101810003A (en) * 2007-07-27 2010-08-18 格斯图尔泰克股份有限公司 enhanced camera-based input
CN105744172A (en) * 2016-04-27 2016-07-06 广东欧珀移动通信有限公司 Photographing method and device and mobile terminal
CN106993213A (en) * 2017-03-20 2017-07-28 苏州佳世达电通有限公司 The setting device and establishing method of a kind of sprite
CN107566716A (en) * 2017-08-02 2018-01-09 努比亚技术有限公司 Adjusting method, terminal and the computer-readable recording medium of camera acquisition parameter
CN108924417A (en) * 2018-07-02 2018-11-30 Oppo(重庆)智能科技有限公司 Filming control method and Related product
CN111198644A (en) * 2018-11-16 2020-05-26 西安易朴通讯技术有限公司 Method and system for identifying screen operation of intelligent terminal
CN110045819A (en) * 2019-03-01 2019-07-23 华为技术有限公司 A kind of gesture processing method and equipment
CN111142655A (en) * 2019-12-10 2020-05-12 上海博泰悦臻电子设备制造有限公司 Interaction method, terminal and computer readable storage medium

Also Published As

Publication number Publication date
CN111988522A (en) 2020-11-24

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant