CN118158518A - Method for controlling camera and electronic equipment - Google Patents
- Publication number: CN118158518A
- Application number: CN202410140219.5A
- Authority
- CN
- China
- Prior art keywords
- electronic device
- camera
- request
- picture
- physical camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/44—Program or device authentication
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/62—Control of parameters via user interfaces
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/57—Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices
Abstract
The application provides a method for controlling a camera, and an electronic device. When a second electronic device detects a first request from a first electronic device for permission to use the physical camera of the second electronic device, the second electronic device generates and displays a first user interface (UI). Through the first UI, the user can control the physical camera on the second electronic device: when the user performs an operation on the first UI, the second electronic device detects the resulting request and performs the corresponding operation on the physical camera. In this way, the user's need to control the physical camera of the second electronic device, even while that camera is being controlled by the first electronic device, can be met.
Description
Technical Field
The present application relates to the field of terminals, and more particularly, to a method of controlling a camera and an electronic device in the field of terminals.
Background
When a virtual camera corresponding to the physical camera of another electronic device is created on one electronic device, a user can control the physical camera of the other electronic device through the virtual camera. The former device can be called the master device and the latter the remote device. For example, a user may turn the physical camera of the remote device on or off through the virtual camera on the master device.
However, the user cannot control the physical camera of the remote device while it is being controlled by the master device. The above technical solution therefore cannot meet the user's need to control the physical camera of the remote device during that period, which degrades the user experience.
Disclosure of Invention
The embodiments of the application provide a method for controlling a camera, which can meet the user's need to control the physical camera of a remote device while that camera is being controlled by a master device.
In a first aspect, there is provided a method of controlling a camera, performed by a second electronic device, comprising: when the second electronic device detects a first request from a first electronic device, starting a physical camera and displaying a first UI, wherein the first request requests from the second electronic device permission to use the physical camera; and when the second electronic device detects a second request from the first UI, performing a first operation on the physical camera, wherein the first operation is the operation that the second request asks the second electronic device to perform.
In the above scheme, when the second electronic device detects a request from the first electronic device for permission to use its physical camera, the second electronic device generates and displays a first UI to the user. Through the first UI, the user can control the physical camera on the second electronic device: when the user performs an operation on the first UI, the second electronic device detects the resulting request and performs the corresponding operation on the physical camera. This meets the user's need to control the physical camera of the second electronic device while that camera is being controlled by the first electronic device.
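The two-step behavior above (first request grants camera use and shows the local UI; a request from that UI then drives a camera operation) can be sketched as follows. This is an illustrative model only; the class and method names (`SecondDevice`, `on_first_request`, `on_ui_request`) are assumptions, not anything defined by the patent.

```python
from dataclasses import dataclass, field

@dataclass
class SecondDevice:
    """Sketch of the remote (second) device: owns the physical camera
    and the locally displayed first UI."""
    camera_on: bool = False
    ui_visible: bool = False
    log: list = field(default_factory=list)

    def on_first_request(self):
        # First request, from the master device: grant use of the
        # physical camera, start it, and display the first UI locally.
        self.camera_on = True
        self.ui_visible = True

    def on_ui_request(self, operation):
        # Second request, raised by the user through the first UI:
        # apply the requested operation to the physical camera.
        if self.camera_on:
            self.log.append(operation)

dev = SecondDevice()
dev.on_first_request()        # master device asks for camera permission
dev.on_ui_request("zoom_in")  # local user operates the first UI
```

The point of the sketch is ordering: the UI (and hence local control) only exists once the first request has been detected.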
With reference to the first aspect, in certain implementation manners of the first aspect, the method further includes: the second electronic device displays, on the first UI, the current picture acquired by the physical camera after the first operation is performed.
In the above scheme, the second electronic device displays on the first UI the current picture acquired by the physical camera after the first operation is performed. The user can thus control the physical camera of the second electronic device through the first UI and also see, on the second electronic device, the current picture acquired by the physical camera, improving the user experience.
With reference to the first aspect and the foregoing implementation manner of the first aspect, in certain implementation manners of the first aspect, the method further includes: the second electronic device sends the data information and the first parameter information of the first picture to the first electronic device, wherein the first parameter information is the parameter information of the physical camera when the first picture is shot.
In the above scheme, after the user causes the second electronic device to perform the first operation on the physical camera through the first UI, the second electronic device may send to the first electronic device the data information of the current picture acquired after the first operation was performed, together with the parameter information of the physical camera when that picture was shot. The first electronic device can then display the current picture to the user according to the received data information and parameter information, improving the user experience.
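The payload described above, picture data plus the camera parameters in effect when the picture was shot, can be sketched as a simple message type. The field names (`frame_data`, `width`, `frame_rate`, `color_format`) are hypothetical; the patent only lists resolution, frame rate, and color format as examples of camera parameter information.

```python
from dataclasses import dataclass, asdict

@dataclass
class FrameMessage:
    """Hypothetical message the second device sends to the first device:
    raw picture data plus the parameters the physical camera used when
    shooting it, so the master can render the picture faithfully."""
    frame_data: bytes
    width: int
    height: int
    frame_rate: int
    color_format: str

msg = FrameMessage(b"\x00" * 16, 1920, 1080, 30, "NV21")
payload = asdict(msg)  # e.g. serialize before sending over the link
```

Bundling the parameters with every picture means the master device never has to guess how to interpret the bytes, even if the user just changed resolution or format via the first UI.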
With reference to the first aspect and the foregoing implementation manner of the first aspect, in certain implementation manners of the first aspect, the method further includes: when a second electronic device detects a third request from the first electronic device, performing a second operation on the physical camera, the second operation being an operation requested by the third request to be performed by the second electronic device; the second electronic device sends data information of a second picture and second parameter information to the first electronic device, wherein the second picture is a current picture acquired by the physical camera after the second operation is executed, and the second parameter information is parameter information of the physical camera when the second picture is shot.
In the above scheme, the second electronic device can process requests from the first UI as well as requests from the first electronic device. In other words, the user can control the physical camera of the second electronic device not only through the first UI but also through the first electronic device, realizing multi-terminal control of the physical camera of the second electronic device.
With reference to the first aspect and the foregoing implementation manners of the first aspect, in some implementation manners of the first aspect, the performing of the first operation includes: the second electronic device performs the first operation on the physical camera when a control strategy is met, wherein the control strategy comprises: executing the second request when its receiving time is earlier than the receiving time of the third request, or refusing to execute the third request when the second request conflicts with the third request.
In the above scheme, since the second electronic device can process requests from both the first UI and the first electronic device, the operation requested from the first UI may conflict with the operation requested by the first electronic device. To avoid such conflicts, the second electronic device may determine, before executing a request, whether the control strategy allows it; only requests allowed by the control strategy are processed, so conflicts are avoided.
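The control strategy described here, earlier-received request wins, and a later conflicting request is refused, amounts to a small arbitration function. The sketch below is one possible reading of that strategy; the request encoding and the `conflicts` predicate are assumptions made for illustration.

```python
def arbitrate(requests, conflicts):
    """Apply the control strategy to a batch of requests.
    Each request is a (receive_time, source, operation) tuple.
    Requests are considered in order of receive time; a later request
    whose operation conflicts with an already-accepted one is refused."""
    accepted = []
    for t, source, op in sorted(requests, key=lambda r: r[0]):
        if any(conflicts(op, a_op) for _, _, a_op in accepted):
            continue  # refuse the conflicting, later-received request
        accepted.append((t, source, op))
    return accepted

# The first-UI request arrives (time 1) before the master-device
# request (time 2), and both touch the same setting, so the later
# request from the first electronic device is refused.
reqs = [(2, "first_device", "set_resolution_720p"),
        (1, "first_ui", "set_resolution_1080p")]
same_setting = lambda a, b: a.split("_")[1] == b.split("_")[1]
winners = arbitrate(reqs, same_setting)
```

Non-conflicting requests (say, a resolution change and a lens switch) would both be accepted, which matches the multi-terminal control the scheme aims for.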
With reference to the first aspect and the foregoing implementation manners of the first aspect, in some implementation manners of the first aspect, the performing of the second operation includes: the second electronic device performs the second operation on the physical camera when a control strategy is met, wherein the control strategy comprises: the second electronic device executes the second request when its receiving time is earlier than the receiving time of the third request, or refuses to execute the third request when the second request conflicts with the third request.
In the above scheme, since the second electronic device can process requests from both the first UI and the first electronic device, the operation requested from the first UI may conflict with the operation requested by the first electronic device. To avoid such conflicts, the second electronic device may determine, before executing a request, whether the control strategy allows it; only requests allowed by the control strategy are processed, so conflicts are avoided.
With reference to the first aspect and the foregoing implementation manner of the first aspect, in certain implementation manners of the first aspect, the method further includes: a second electronic device receives user input data from the first UI; and the second electronic equipment acquires the control strategy according to the user input data.
In the above scheme, the second electronic device acquires the control strategy according to user input data from the first UI, so the user can set the control strategy through the first UI. In other words, the user decides which control strategy to use, improving the user experience.
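Deriving the control strategy from first-UI input can be sketched as a small mapping step. The strategy names (`first_wins`, `local_priority`) are invented for illustration; the patent only says the strategy is acquired from user input data, not what the options are.

```python
def policy_from_ui(user_input: str) -> str:
    """Turn raw user input from the first UI into a control strategy.
    Hypothetical options: 'first_wins' executes whichever request was
    received earlier; 'local_priority' always prefers first-UI requests.
    Unrecognized input falls back to the earlier-received-wins rule."""
    known = {"first_wins", "local_priority"}
    choice = user_input.strip().lower()
    return choice if choice in known else "first_wins"

policy = policy_from_ui("Local_Priority")
```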
In a second aspect, the present application provides an apparatus for inclusion in an electronic device, the apparatus having functionality to implement the above aspects and possible implementations of the above aspects. The functions may be realized by hardware, or may be realized by hardware executing corresponding software. The hardware or software includes one or more modules or units corresponding to the functions described above.
In a third aspect, the present application provides an electronic device, comprising: a touch display screen, wherein the touch display screen comprises a touch-sensitive surface and a display; a camera; one or more processors; a memory; a plurality of applications; and one or more computer programs. Wherein one or more computer programs are stored in the memory, the one or more computer programs comprising instructions. The instructions, when executed by the electronic device, cause the electronic device to perform the method of controlling a camera in any of the possible implementations of any of the above aspects.
In a fourth aspect, the present application provides an electronic device comprising one or more processors and one or more memories. The one or more memories are coupled to the one or more processors, the one or more memories being operable to store computer program code comprising computer instructions that, when executed by the one or more processors, cause the electronic device to perform the method of controlling a camera in any of the possible implementations of the above.
In a fifth aspect, the application provides a computer-readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method of controlling a camera in any one of the possible implementations of the above aspects.
In a sixth aspect, the application provides a computer program product which, when run on an electronic device, causes the electronic device to perform the method of controlling a camera in any one of the possible implementations of the above aspects.
Drawings
FIG. 1 is a system frame diagram provided by an embodiment of the present application;
FIG. 2 is a schematic block diagram of an electronic device provided by an embodiment of the present application;
FIG. 3 is a schematic diagram of a method for controlling a camera according to an embodiment of the present application;
FIG. 4 (a) is a diagram illustrating an example of a user interface for controlling a camera according to an embodiment of the present application;
FIG. 4 (b) is a diagram illustrating a user interface for controlling a camera according to another embodiment of the present application;
FIG. 5 (a) is a diagram illustrating another example of a user interface for controlling a camera according to an embodiment of the present application;
Fig. 5 (b) is a schematic diagram of a user interface for controlling a camera according to another embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the accompanying drawings in the embodiments of the present application.
For ease of understanding, several terms used in the embodiments of the present application are briefly introduced first.
Main control equipment
The master device may be an electronic device that initiates a request to another electronic device for permission to use that device's physical camera.
Remote device
The remote device may be an electronic device with a physical camera controlled by a master device.
Virtual camera
The virtual camera may be a camera created on the master device, corresponding to the physical camera of the remote device and built according to the parameter information of that physical camera; the master device can control the physical camera of the remote device through the virtual camera.
Fig. 1 shows a framework diagram of a system provided by an embodiment of the present application, in which a master device controls the physical camera of a remote device through the corresponding virtual camera.
Fig. 2 is a schematic structural diagram of an electronic device 100 according to an embodiment of the present application. The electronic device 100 may include a processor 110, an internal memory 120, a universal serial bus (universal serial bus, USB) interface 130, a camera 140, a display 150, and a touch sensor 160, among others.
It should be understood that the illustrated structure of the embodiment of the present application does not constitute a specific limitation on the electronic device 100. In other embodiments of the application, electronic device 100 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 110 may include one or more processing units, such as: the processor 110 may include an application processor (AP), a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to the instruction operation codes and the time sequence signals to finish the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that the processor 110 has just used or recycled. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Repeated accesses are avoided and the latency of the processor 110 is reduced, thereby improving the efficiency of the system.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transfer data between the electronic device 100 and a peripheral device. And can also be used for connecting with a headset, and playing audio through the headset. The interface may also be used to connect other electronic devices, such as AR devices, etc.
It should be understood that the interfacing relationship between the modules illustrated in the embodiments of the present application is only illustrative, and is not meant to limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also employ different interfacing manners in the above embodiments, or a combination of multiple interfacing manners.
The electronic device 100 implements display functions through a GPU, a display screen 150, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display 150 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 110 may include one or more GPUs that execute program instructions to generate or change display information.
The display screen 150 is used to display images, videos, and the like. The display 150 includes a display panel. The display panel may employ a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 150, N being a positive integer greater than 1.
The electronic device 100 may implement photographing functions through an ISP, a camera 140, a video codec, a GPU, a display screen 150, an application processor, and the like.
The ISP is used to process the data fed back by the camera 140. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 140.
The camera 140 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, the electronic device 100 may include 1 or N cameras 140, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to fourier transform the frequency bin energy, or the like.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs, so that it can play or record video in multiple encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
The NPU is a neural-network (NN) computing processor, and can rapidly process input information by referencing a biological neural network structure, for example, referencing a transmission mode between human brain neurons, and can also continuously perform self-learning. Applications such as intelligent awareness of the electronic device 100 may be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, etc.
The internal memory 120 may be used to store computer executable program code including instructions. The internal memory 120 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data created during use of the electronic device 100 (e.g., audio data, etc.), and so on. In addition, the internal memory 120 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (universal flash storage, UFS), and the like. The processor 110 performs various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 120 and/or instructions stored in a memory provided in the processor.
The touch sensor 160, also referred to as a "touch device". The touch sensor 160 may be disposed on the display screen 150, and the touch sensor 160 and the display screen 150 form a touch screen, which is also called a "touch screen". The touch sensor 160 is used to detect a touch operation acting thereon or thereabout. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 150. In other embodiments, the touch sensor 160 may also be disposed on the surface of the electronic device 100 at a different location than the display 150.
For example, the electronic device 100 may be a first electronic device or a second electronic device.
The method of controlling a camera provided by the present application is described in detail below in conjunction with the system shown in fig. 1, and fig. 3 shows a schematic interactive flow chart of a method 300 of controlling a camera.
Step 301, when the second electronic device detects a first request from the first electronic device, starting a physical camera of the second electronic device and displaying a first User Interface (UI), wherein the first request requests the second electronic device for the use authority of the physical camera.
In step 302, when the second electronic device detects a second request from the first UI, it performs a first operation on the physical camera, the first operation being the operation that the second request asks the second electronic device to perform.
The first electronic device may, for example, discover the second electronic device through near-field broadcast, establish a connection with it, and acquire the parameter information of its physical camera. The first electronic device may then create, on itself, a virtual camera corresponding to the physical camera of the second electronic device according to that parameter information, and thereafter control the physical camera of the second electronic device through the virtual camera. Here the first electronic device is the master device and the second electronic device is the remote device.
It should be noted that, in the embodiment of the present application, the parameter information of the physical camera is used to indicate the capability of the physical camera, for example, the parameter information of the physical camera may include the resolution, the frame rate, the color format and so on supported by the physical camera.
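The creation of a master-side virtual camera from the remote camera's parameter information can be sketched as below. The field names and the `VirtualCamera` class are assumptions; the patent only states that the parameter information indicates the physical camera's capability, giving resolution, frame rate, and color format as examples.

```python
# Illustrative parameter information advertised by the remote device's
# physical camera (field names are assumptions, not the patent's).
physical_params = {
    "resolutions": [(1920, 1080), (1280, 720)],
    "frame_rates": [30, 60],
    "color_formats": ["NV21", "YUV420"],
}

class VirtualCamera:
    """Master-side proxy mirroring the remote physical camera's
    capabilities; in the patent's scheme, control requests issued
    against it would be forwarded to the remote device."""
    def __init__(self, params):
        self.params = dict(params)

    def supports_resolution(self, w, h):
        # The virtual camera can only offer what the physical camera
        # declared in its parameter information.
        return (w, h) in self.params["resolutions"]

vcam = VirtualCamera(physical_params)
```

Because the virtual camera is built from the advertised parameters, an APP on the master device can validate a requested mode locally before any request crosses the link.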
After the first electronic device creates the virtual camera corresponding to the physical camera of the second electronic device, a user may call the virtual camera through an application (APP) installed on the first electronic device. When the virtual camera is called, the second electronic device detects a first request from the first electronic device, the first request requesting that the second electronic device start its physical camera, and the second electronic device starts the physical camera according to the first request.
For example, the APP on the first electronic device may display a UI on its display interface. This UI is used for the user to select a remote device from which to request permission to use the physical camera, and may further let the user choose between the front camera and the rear camera of that physical camera. To distinguish it from a UI that appears later, and since this UI is used for selecting the remote device, it may be referred to as the UI corresponding to the remote device.
Assuming the user selects the front-facing camera of the physical camera of the second electronic device, after the selection is completed the APP on the first electronic device calls the virtual camera corresponding to that physical camera. Once the virtual camera is called, the second electronic device detects a first request from the first electronic device, requesting that the second electronic device start the front-facing camera of its physical camera, and the second electronic device calls a camera interface to start the front-facing camera according to the first request. The physical camera of the second electronic device then captures the current picture in the actual environment, and the second electronic device can send the data information of the current picture, together with the parameter information of the physical camera when the picture was shot, to the first electronic device, so that the first electronic device can display the corresponding current picture to the user.
It is worth mentioning that, when the virtual camera corresponding to the physical camera of the second electronic device is invoked, the first electronic device may display the UI corresponding to the virtual camera based on the invocation, and the UI corresponding to the virtual camera may be used for the user to control the physical camera of the second electronic device.
Further, the second electronic device may generate a UI (e.g., a first UI) based on the first request from the first electronic device and display the first UI on the second electronic device. The user may enable control of the physical camera of the second electronic device on the second electronic device through interaction with the first UI. When the user completes a certain operation on the first UI, the second electronic device detects a second request from the first UI, and the second electronic device performs the first operation on the physical camera according to the second request.
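The flow above — a first request from the first electronic device that starts the physical camera, followed by a second request issued from the first UI that performs the first operation — can be sketched as follows. This is a minimal illustration only; the class and method names (`SecondDevice`, `handle_first_request`, and so on) are hypothetical and not part of the described method.

```python
# Minimal sketch of the request handling described above.
# All names are hypothetical illustrations, not APIs from the patent.

class SecondDevice:
    def __init__(self):
        self.camera_on = False
        self.active_camera = None
        self.camera_params = {"resolution": (1280, 720)}

    def handle_first_request(self, which):
        # First request: start the requested physical camera (front/rear).
        self.camera_on = True
        self.active_camera = which
        return f"{which} camera started"

    def handle_second_request(self, operation, value):
        # Second request (from the first UI): perform the first operation,
        # e.g. increase the resolution of the physical camera.
        self.camera_params[operation] = value
        return self.camera_params

dev = SecondDevice()
print(dev.handle_first_request("front"))                   # front camera started
print(dev.handle_second_request("resolution", (1920, 1080)))
```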
For example, the user requests to increase the resolution of the physical camera from the second electronic device through the first UI, at which time the second electronic device performs a first operation to increase the resolution of the physical camera in response to the user's request.
Illustratively, the first UI may display a screen captured by the physical camera of the second electronic device in real time to the user in addition to providing an interaction portal between the user and the second electronic device, in which case the method 300 may further include:
in step 303, the second electronic device displays a first screen on the first UI, where the first screen is a current screen acquired by the physical camera after the first operation is performed.
For example, the second electronic device may further transmit data information of the first screen and parameter information of a physical camera of the second electronic device when the first screen is photographed to the first electronic device, in which case the method 300 may further include:
in step 304, the second electronic device sends the data information and the first parameter information of the first frame to the first electronic device, where the first parameter information is the parameter information of the physical camera when the first frame is shot.
For example, when the second electronic device performs the first operation, the resolution of the physical camera is increased from 1280×720 to 1920×1080; in this case, the resolution used when the physical camera shoots the first picture is 1920×1080. After the first picture is acquired, the second electronic device transmits the data information of the first picture and the parameter information of the physical camera when shooting the first picture to the first electronic device, so that the first electronic device displays the first picture corresponding to the data information to the user according to that parameter information. The parameter information of the physical camera when shooting the first picture includes the resolution, which is 1920×1080; in other words, the resolution of the first picture displayed to the user by the first electronic device is 1920×1080, the resolution used when the physical camera shot the first picture.
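Step 304 pairs each picture with the camera parameters in effect when it was shot, so the master device can render it correctly. A minimal sketch of that pairing is shown below; the packet layout and function names are hypothetical illustrations, not a wire format from this application.

```python
# Sketch of step 304: the second electronic device sends the first picture's
# data together with the parameter information in effect when it was shot,
# so the first electronic device can display it at that resolution.
# The dict layout is a hypothetical illustration.

def pack_frame(frame_bytes, params):
    # Bundle picture data with shooting-time parameter information.
    return {"data": frame_bytes, "params": dict(params)}

def display(packet):
    # The receiving (first) device uses the bundled parameters for display.
    w, h = packet["params"]["resolution"]
    return f"displaying {len(packet['data'])}-byte frame at {w}x{h}"

packet = pack_frame(b"\x00" * 16, {"resolution": (1920, 1080)})
print(display(packet))  # displaying 16-byte frame at 1920x1080
```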
Illustratively, the second electronic device may process the request from the first electronic device in addition to the request from the first UI, in which case the method 300 may further include:
In step 305, when the second electronic device detects a third request from the first electronic device, a second operation is performed on the physical camera, where the second operation is an operation performed by the second electronic device requested by the third request.
Step 306, the second electronic device sends the data information and the second parameter information of the second frame to the first electronic device, where the second frame is the current frame obtained by the physical camera after the second operation is performed, and the second parameter information is the parameter information of the physical camera when the second frame is shot.
When a user performs an operation, on the UI corresponding to the virtual camera, on the virtual camera corresponding to the physical camera of the second electronic device, the second electronic device may detect a third request from the first electronic device and execute the second operation according to the third request. After executing the second operation, the second electronic device may send the data information of the second picture acquired by the physical camera, together with the parameter information of the physical camera when the second picture was shot, to the first electronic device, so that the first electronic device displays the second picture corresponding to the data information to the user according to that parameter information.
For example, for a detected request from the UI corresponding to the virtual camera, or a request from the first UI, the second electronic device may first determine whether the control policy is satisfied, and when the control policy is satisfied, the second electronic device executes the corresponding request.
The control policy may include a sequential priority control policy, a local priority control policy, and a peer priority control policy. The sequential priority control policy means that the second electronic device executes requests in the order in which they are received. The local priority control policy means that when an operation requested by a request from the first electronic device conflicts with an operation requested by a request issued on the second electronic device, the second electronic device refuses to execute the request from the first electronic device. The peer priority control policy means that when an operation requested by a request from the first electronic device conflicts with an operation requested by a request issued on the second electronic device, the second electronic device refuses to execute the request issued on the second electronic device itself.
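The three policies above amount to an arbitration rule over conflicting requests. The sketch below illustrates them; the request shape, the `conflict` test, and all names are hypothetical illustrations ("local" stands for a request from the first UI on the second device, "peer" for a request from the first electronic device).

```python
# Sketch of the three control policies described above; hypothetical names.

def arbitrate(requests, policy, conflict):
    if policy == "sequential":
        return list(requests)               # execute all, in arrival order
    accepted = []
    for req in requests:
        clash = any(conflict(req, a) for a in accepted)
        if not clash:
            accepted.append(req)
            continue
        # On a conflict, keep only the side the policy favors.
        keep = "local" if policy == "local_priority" else "peer"
        if req["source"] == keep:
            accepted = [a for a in accepted if not conflict(req, a)]
            accepted.append(req)
    return accepted

# Second request (from the first UI) vs. third request (from the peer device),
# both operating on the flash, hence conflicting.
second = {"source": "local", "op": "flash", "value": "auto"}
third = {"source": "peer", "op": "flash", "value": "always_on"}
same_op = lambda a, b: a["op"] == b["op"]

print(arbitrate([second, third], "local_priority", same_op))  # only the local request
print(arbitrate([second, third], "peer_priority", same_op))   # only the peer request
```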
For example, assume that the control policy is the sequential priority control policy. In this case the second electronic device executes every request it detects; for the second request and the third request above, it executes both.
For example, assume that the control policy is the local priority control policy. When the second electronic device detects both the second request and the third request, and the first operation requested by the second request conflicts with the second operation requested by the third request, the second electronic device refuses, based on the local priority control policy, to perform the second operation corresponding to the third request.
For example, the second electronic device detects both the second request, which corresponds to the first operation of setting the flash of the physical camera to the automatic mode, and the third request, which corresponds to the second operation of setting the flash of the physical camera to the always-on mode. In this case the second electronic device may refuse to execute the third request; in other words, it executes only the second request, i.e., it sets the flash of the physical camera to the automatic mode.
For example, assume that the control policy is the peer priority control policy. When the second electronic device detects both the second request and the third request, and the first operation requested by the second request conflicts with the second operation requested by the third request, the second electronic device refuses, based on the peer priority control policy, to perform the first operation corresponding to the second request.
For example, the second electronic device detects both the second request, which corresponds to the first operation of turning on the night-scene photographing mode, and the third request, which corresponds to the second operation of turning on the large-aperture photographing mode. In this case the second electronic device may refuse to execute the second request; in other words, it executes only the third request, i.e., it turns on the large-aperture photographing mode.
Illustratively, the control policy may be preconfigured on the second electronic device, or the control policy may be set by the user through the first UI, for example, the user interacts with the second electronic device through the first UI, and the control policy is set on the first UI, where the second electronic device may detect user input data from the first UI, and the second electronic device obtains the control policy according to the user input data.
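The policy may thus come either from a preconfigured default or from user input data collected by the first UI. A minimal sketch of that selection step follows; the policy names, the `store` dict, and `set_policy` are hypothetical illustrations.

```python
# Sketch of obtaining the control policy from user input data (or falling
# back to a preconfigured default). All names are hypothetical.

POLICIES = {"sequential", "local_priority", "peer_priority"}

def set_policy(store, user_input, default="sequential"):
    # user_input models the data the second device detects from the first UI.
    choice = user_input.get("policy", default)
    store["policy"] = choice if choice in POLICIES else default
    return store["policy"]

store = {}
print(set_policy(store, {"policy": "peer_priority"}))  # peer_priority
print(set_policy(store, {}))                           # sequential
```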
In the embodiment of the present application, the user operating the first electronic device and the user operating the second electronic device may be the same user, or may be different users, which is not limited in the embodiment of the present application.
The method 300 is described in detail below with reference to fig. 4-5, taking a live scenario as an example.
When the first electronic device finds the second electronic device by using the near field mode, the first electronic device can create a virtual camera corresponding to the physical camera of the second electronic device on the first electronic device, and then the first electronic device can control the physical camera of the second electronic device through the virtual camera, and the first electronic device is a master control device and the second electronic device is a remote device. For the manner in which the first electronic device discovers the second electronic device and the manner in which the first electronic device creates the virtual camera, please refer to the related description in the method 300, and for brevity, the description is omitted here.
An anchor uses a first APP installed on the first electronic device for live broadcasting, where the first APP has a live broadcast function and is capable of calling the virtual camera corresponding to the physical camera of the second electronic device.
When performing a live broadcast, the anchor uses both the physical camera of the first electronic device and the virtual camera corresponding to the physical camera of the second electronic device. For the physical camera of the first electronic device: when the anchor opens the live broadcast function, the first electronic device correspondingly opens its physical camera; assuming the camera currently opened is the front camera, users watching the live broadcast can then see the anchor on their electronic devices. For the virtual camera corresponding to the physical camera of the second electronic device: the anchor can make a selection on the UI corresponding to the remote device displayed by the first APP; assuming the anchor selects the rear camera of the virtual camera corresponding to the second electronic device, after the anchor completes the selection, the first APP calls the virtual camera corresponding to the physical camera of the second electronic device on the first electronic device.
When the virtual camera corresponding to the second electronic device is called, the virtual camera proxy module on the second electronic device detects a first request from the first APP. Because the anchor selected the rear camera of the virtual camera corresponding to the second electronic device, the first request is used to request the second electronic device to start the rear camera of its physical camera, and the virtual camera proxy module calls the camera interface to start the rear camera of the physical camera according to the first request.
After the virtual camera proxy module calls the camera interface to start the rear camera of the physical camera according to the first request, the physical camera captures a current picture in the actual environment. The virtual camera proxy module may call the camera interface to send the data information of the current picture and the parameter information of the physical camera when the current picture was shot to the first APP on the first electronic device, and the first APP may display the current picture corresponding to the data information to users watching the live broadcast according to that parameter information. At this time, the first electronic device may display both the picture shot by its own physical camera and the picture shot by the virtual camera corresponding to the physical camera of the second electronic device; likewise, users watching the live broadcast can see both pictures on their electronic devices.
For example, during a live broadcast, the user of the first electronic device is the anchor and the user of the second electronic device is the anchor's assistant. When the anchor starts the live broadcast, the front camera of the physical camera of the first electronic device shoots the anchor. When the anchor needs to show goods to users watching the live broadcast, the anchor can call, through the first APP on the first electronic device, the rear camera of the virtual camera corresponding to the second electronic device; once the rear camera on the second electronic device is turned on, the assistant can shoot the goods to be shown. At this time, users watching the live broadcast can see both the anchor and the corresponding goods on their electronic devices.
In addition, the assistant may also control the physical camera of the second electronic device on the second electronic device itself. For example, after detecting the first request from the first electronic device, the virtual camera proxy module may generate a UI (e.g., a first UI) and display the first UI 401 on the second electronic device. The assistant may control the physical camera of the second electronic device through the first UI 401: the assistant may set the flash mode of the physical camera through an option 4011, set the tone mode of the physical camera through an option 4012, adjust the resolution of the physical camera through an option 4013, and set other options of the physical camera through an option 4014.
It should be noted that the options shown in the first UI401 are only illustrated as examples, and in specific implementations, the first UI may further include more options than illustrated, for example, may further include a focusing option, which is not limited by the embodiment of the present application.
For example, the assistant adjusts the resolution of the physical camera via option 4013, at which point the virtual camera proxy module detects a request to adjust resolution (e.g., a second request) from the first UI, and the virtual camera proxy module invokes the camera interface to adjust the resolution of the physical camera based on the second request.
After executing the first operation corresponding to the second request, the virtual camera proxy module may call the camera interface to display, through the first UI, the current picture (for example, the first picture) acquired by the physical camera. In this case, as shown in fig. 4 (b), the first UI 402 may display the first picture acquired by the physical camera in a dashed-line box 4025; the picture in the dashed-line box 4025 shown in fig. 4 (b) may be displayed in a non-full-screen mode, and the assistant may set it to a full-screen display mode through an option 4026.
In addition, the virtual camera agent module can call the camera interface, data information of a first picture acquired by the physical camera after the first operation is executed and parameter information of the physical camera when the first picture is shot are sent to the first APP on the first electronic device, and the first APP displays the first picture corresponding to the data information of the first picture to a user watching live broadcast according to the parameter information of the physical camera when the first picture is shot.
For example, the assistant increases the resolution of the physical camera from 1280×720 to 1920×1080 through the option 4023; in this case, the resolution used when the physical camera shoots the first picture is 1920×1080. After the first picture is acquired, the second electronic device transmits the data information of the first picture and the parameter information of the physical camera when shooting the first picture to the first APP on the first electronic device, and the first APP displays the first picture corresponding to the data information to the user on the first electronic device according to that parameter information. The parameter information of the physical camera when shooting the first picture includes the resolution, 1920×1080; in other words, the resolution of the first picture displayed to the user by the first electronic device is 1920×1080.
When the anchor performs an operation, on the UI corresponding to the virtual camera, on the virtual camera corresponding to the physical camera of the second electronic device, the virtual camera proxy module detects a third request from the first electronic device and calls the camera interface to perform the second operation according to the third request. The virtual camera proxy module may then call the camera interface to send the data information of the current picture (for example, the second picture) acquired by the physical camera after the second operation is performed, together with the parameter information of the physical camera when the second picture was shot, to the first APP on the first electronic device, so that the first APP displays the second picture corresponding to the data information to users watching the live broadcast according to that parameter information.
For example, the anchor selects the soft option corresponding to the tone on the UI corresponding to the virtual camera. The virtual camera proxy module then detects a third request for adjusting the tone of the physical camera to a soft mode and calls the camera interface to adjust the physical camera to the soft mode according to the third request. Afterwards, the virtual camera proxy module may call the camera interface to send the data information of the second picture acquired by the physical camera after the mode adjustment, together with the parameter information of the physical camera when the second picture was shot, to the first APP on the first electronic device, so that the first APP displays the second picture corresponding to the data information on the first electronic device to users watching the live broadcast according to that parameter information.
As can be seen from the foregoing description, the virtual camera proxy module may detect requests both from the UI corresponding to the virtual camera and from the first UI. In a specific implementation, the virtual camera proxy module may first send a request to the control policy management module, which stores the control policy. The control policy management module determines whether the request can be executed while satisfying the control policy, notifies the virtual camera proxy module of the result, and the virtual camera proxy module determines whether to execute the corresponding request according to that result. For a detailed description of the control policy, please refer to the related description in the method 300, which is not repeated here for brevity.
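The proxy/policy-manager split described above can be sketched as two cooperating objects: the proxy forwards each detected request, and the manager answers allow or deny against the stored policy. All class and method names below are hypothetical illustrations of this division of labor.

```python
# Sketch of the virtual camera proxy module consulting the control policy
# management module. Names are hypothetical, not modules from the patent.

class ControlPolicyManager:
    def __init__(self, policy="sequential"):
        self.policy = policy
        self.executed = []          # requests already allowed and executed

    def allows(self, request):
        if self.policy == "sequential":
            return True             # execute everything in arrival order
        conflicting = [r for r in self.executed if r["op"] == request["op"]]
        if not conflicting:
            return True
        keep = "local" if self.policy == "local_priority" else "peer"
        return request["source"] == keep

class VirtualCameraProxy:
    def __init__(self, manager):
        self.manager = manager

    def on_request(self, request):
        if self.manager.allows(request):
            self.manager.executed.append(request)
            return "executed"       # would call the camera interface here
        return "rejected"

mgr = ControlPolicyManager("local_priority")
proxy = VirtualCameraProxy(mgr)
print(proxy.on_request({"source": "local", "op": "flash"}))  # executed
print(proxy.on_request({"source": "peer", "op": "flash"}))   # rejected
```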
For example, assuming that the control policy is a sequential priority control policy, the control policy management module obtains a result of allowing execution for all the detected requests, and notifies the virtual camera proxy module of the result, and the virtual camera proxy module calls the camera interface to execute the corresponding request, and in this case, the virtual camera proxy module calls the camera interface to execute the second request and the third request.
For example, assume that the control policy is the local priority control policy. In this case, when the virtual camera proxy module detects both the second request and the third request, and the first operation requested by the second request conflicts with the second operation requested by the third request, the control policy management module refuses to perform the second operation corresponding to the third request based on the local priority control policy and notifies the virtual camera proxy module of the refusal.
For example, the virtual camera proxy module detects both the second request, which corresponds to the first operation of setting the flash to the automatic mode, and the third request, which corresponds to the second operation of setting the flash to the always-on mode. In this case, the control policy management module refuses to execute the third request based on the local priority control policy; in other words, the virtual camera proxy module calls the camera interface to execute only the second request, setting the flash to the automatic mode.
For example, assume that the control policy is the peer priority control policy. In this case, when the virtual camera proxy module detects both the second request and the third request, and the first operation requested by the second request conflicts with the second operation requested by the third request, the control policy management module refuses to perform the first operation corresponding to the second request based on the peer priority control policy and notifies the virtual camera proxy module of the refusal.
For example, the virtual camera proxy module detects both the second request and the third request, where the first operation corresponding to the second request is turning on the night-scene photographing mode and the second operation corresponding to the third request is turning on the large-aperture photographing mode. In this case, the control policy management module refuses to execute the second request based on the peer priority control policy; in other words, the virtual camera proxy module calls the camera interface to execute only the third request, i.e., turning on the large-aperture photographing mode.
For example, the control policy may be configured in advance in the control policy management module, or may be set by the user through the first UI. For instance, the assistant may select an option on the first UI that represents a certain control policy; after the assistant makes the selection, the second electronic device may internally acquire the control policy according to the assistant's selection (e.g., user input data), and the acquired control policy may finally be saved in the control policy management module, so that the control policy management module can determine whether to allow a request to be executed according to the saved control policy.
For example, when the assistant sets the control policy through the first UI 402, the assistant may select a setting option in the first UI 402. In response to the assistant's operation, the second electronic device may display the first UI 501 shown in fig. 5 (a), which may include a control policy option 5011. The assistant may select the control policy option 5011 in the first UI 501; in response to this operation, the second electronic device may display the first UI 502 shown in fig. 5 (b), which may include a sequential priority control policy option 5021, a local priority control policy option 5022, and a peer priority control policy option 5023. The assistant may select one of these options as the control policy; for example, if the assistant selects the sequential priority control policy option 5021, the control policy is the sequential priority control policy.
It should be noted that, in the embodiment of the present application, the first UI may be suspended on any interface and may be dragged at will, for example, the first UI may be suspended on a screen locking interface of the second electronic device or other interfaces after the second electronic device is unlocked.
It should be noted that, the foregoing description only uses the assistant to set the control policy as an example, but this is not a limitation on the embodiment of the present application, and when the embodiment is implemented, the anchor or other users may complete the setting of the control policy.
It should be further noted that, in the embodiment of the present application, the time when the control policy is set by the user is not limited, and the user may set the control policy through the first UI at any time after the display of the first UI, and the control policy management module stores the control policy or updates the stored control policy in real time according to the setting of the user.
It should be noted that, in the embodiment of the present application, the virtual camera agent module and the control policy management module are used to execute different functions respectively as an example, but this is not limitative of the embodiment of the present application, and in specific implementation, the functions executed by the virtual camera agent module and the control policy management module may be implemented by one module or may be implemented by a plurality of modules, which is not particularly limited in the embodiment of the present application.
In this embodiment, the electronic device may be divided into functional modules according to the above method example, for example, each functional module may be divided into functional modules corresponding to each function, or two or more functions may be integrated into one processing module, for example, the above virtual camera agent module and the control policy management module may be integrated into one processing module. The integrated modules described above may be implemented in hardware. It should be noted that, in this embodiment, the division of the modules is schematic, only one logic function is divided, and another division manner may be implemented in actual implementation.
It should be noted that, all relevant contents of each step related to the above method embodiment may be cited to the functional description of the corresponding functional module, which is not described herein.
The electronic device provided in this embodiment is configured to execute the above method for controlling a camera, and can therefore achieve the same effects as the method embodiments above. In the case where an integrated unit is employed, the electronic device may include a processing module, a storage module, and a communication module. The processing module may be configured to control and manage the actions of the electronic device, for example, to support the electronic device in executing the steps executed by the processing unit. The storage module may be used to support the electronic device in storing program code, data, and the like. The communication module may be used to support communication between the electronic device and other devices.
The processing module may be a processor or a controller, which may implement or execute the various exemplary logic blocks, modules, and circuits described in connection with this disclosure. A processor may also be a combination that implements computing functions, for example, a combination including one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor. The storage module may be a memory. The communication module may be a radio-frequency circuit, a Bluetooth chip, a Wi-Fi chip, or other device that interacts with other electronic devices.
In one embodiment, when the processing module is a processor and the storage module is a memory, the electronic device according to this embodiment may be a device having the structure shown in fig. 2.
The present embodiment also provides a computer-readable storage medium having stored therein computer instructions which, when executed on an electronic device, cause the electronic device to perform the above-described related method steps to implement the method of controlling a camera in the above-described embodiments.
The present embodiment also provides a computer program product which, when run on a computer, causes the computer to perform the above-described related steps to implement the method of controlling a camera in the above-described embodiments.
In addition, embodiments of the present application also provide an apparatus, which may be embodied as a chip, component or module, which may include a processor and a memory coupled to each other; the memory is configured to store computer-executable instructions, and when the device is running, the processor may execute the computer-executable instructions stored in the memory, so that the chip performs the method for controlling the camera in the above method embodiments.
The electronic device, the computer storage medium, the computer program product, or the chip provided in this embodiment are used to execute the corresponding methods provided above, so that the beneficial effects thereof can be referred to the beneficial effects in the corresponding methods provided above, and will not be described herein.
It will be appreciated by those skilled in the art that, for convenience and brevity of description, only the above-described division of the functional modules is illustrated, and in practical application, the above-described functional allocation may be performed by different functional modules according to needs, i.e. the internal structure of the apparatus is divided into different functional modules to perform all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of modules or units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and a part shown as a unit may be one physical unit or several physical units; it may be located in one place or distributed across several different places. Some or all of the units may be selected as actually needed to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present application may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in hardware or as a software functional unit.
If the integrated unit is implemented as a software functional unit and sold or used as a stand-alone product, it may be stored in a readable storage medium. On this understanding, the technical solution of the embodiments of the present application, or the part of it contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions that cause a device (for example, a single-chip microcomputer or a chip) or a processor to perform all or part of the steps of the methods of the embodiments of the present application. The aforementioned storage medium includes any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The foregoing is merely illustrative of the present application and does not limit it; any variation or substitution that a person skilled in the art can readily conceive falls within the scope of the present application. The protection scope of the application is therefore subject to the protection scope of the claims.
Claims (11)
1. A method of controlling a camera, the method being performed by a second electronic device and comprising:
detecting a request from a first electronic device to use a physical camera of the second electronic device, starting the physical camera of the second electronic device, and displaying a first user interface (UI);
adjusting parameter information of the physical camera in response to one or more operations performed by a user on the first UI;
acquiring a first picture with the physical camera according to the adjusted parameter information; and
sending the first picture to the first electronic device.
2. The method according to claim 1, wherein the method further comprises:
displaying the first picture on the first UI, wherein the first picture is a current picture acquired by the physical camera according to the adjusted parameter information.
3. The method according to claim 1 or 2, wherein the method further comprises:
sending data information of the first picture and first parameter information to the first electronic device, wherein the first parameter information is the parameter information of the physical camera when the first picture was shot.
4. The method according to claim 1 or 2, wherein the method further comprises:
when a first request from the first electronic device is detected, performing a first operation on the physical camera, the first operation being the operation that the first request requests the second electronic device to perform; and
sending data information of a second picture and second parameter information to the first electronic device, wherein the second picture is a current picture acquired by the physical camera after the first operation is performed, and the second parameter information is the parameter information of the physical camera when the second picture was shot.
5. The method according to any one of claims 1 to 4, further comprising:
receiving user input data generated by one or more operations performed by the user on the first UI; and
acquiring a control strategy according to the user input data.
6. A method of controlling a camera, the method being performed by a second electronic device and comprising:
detecting a request from a first electronic device to use a physical camera of the second electronic device, starting the physical camera of the second electronic device, and displaying a first user interface (UI);
detecting a first request from the first electronic device, and adjusting the parameter information of the physical camera;
acquiring a first picture with the physical camera according to the adjusted parameter information; and
sending the first picture to the first electronic device.
7. The method according to claim 6, wherein the method further comprises:
displaying the first picture on the first UI, wherein the first picture is a current picture acquired by the physical camera according to the adjusted parameter information.
8. The method according to claim 6 or 7, wherein the method further comprises:
detecting a second request from the first electronic device, and adjusting the parameter information of the physical camera again.
9. An electronic device, comprising:
one or more processors, a memory, and one or more programs, wherein the one or more programs are stored in the memory and, when executed by the one or more processors, cause the electronic device to perform the method of controlling a camera according to any one of claims 1 to 5 or according to any one of claims 6 to 8.
10. An apparatus, characterized in that the apparatus has functions for implementing the behavior of the electronic device in the method of controlling a camera according to any one of claims 1 to 5 or according to any one of claims 6 to 8.
11. A program product, characterized in that, when the program product runs on an electronic device, it causes the device to perform the method of controlling a camera according to any one of claims 1 to 5 or according to any one of claims 6 to 8.
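The device-side flow recited in claims 1 and 6 (detect a request from the first device to use the camera, start the camera and display a UI, adjust parameters from local UI operations or from a further request, then capture a picture with the adjusted parameters and return it along with its parameter information) can be sketched as follows. This is a minimal illustrative sketch only; all class, method, and field names are assumptions of this illustration, not terms defined by the patent.

```python
from dataclasses import dataclass

# Hypothetical model of the second (camera-owning) electronic device.
# "zoom" and "flash" stand in for the claims' unspecified "parameter
# information"; transmission to the first device is stubbed out.

@dataclass
class CameraParams:
    zoom: float = 1.0
    flash: bool = False

class SecondDevice:
    def __init__(self) -> None:
        self.params = CameraParams()
        self.camera_started = False
        self.ui_shown = False

    def on_use_camera_request(self, first_device_id: str) -> None:
        # Claims 1/6, first step: on detecting a request from the first
        # device, start the physical camera and display the first UI.
        self.camera_started = True
        self.ui_shown = True

    def on_user_ui_operation(self, **changes) -> None:
        # Claim 1, second step: adjust camera parameter information in
        # response to user operations on the first UI. (Claim 6 instead
        # adjusts on a first request received from the first device.)
        for name, value in changes.items():
            setattr(self.params, name, value)

    def capture_and_send(self) -> dict:
        # Claims 1/6, final steps: acquire a picture with the adjusted
        # parameters and send it, together with the parameter
        # information used (per claim 3), to the first device.
        return {"data": b"<picture bytes>",
                "params": vars(self.params).copy()}

# Usage: the first device requests the camera, the user adjusts
# parameters on the second device's UI, and a picture is returned.
dev = SecondDevice()
dev.on_use_camera_request("first-device")
dev.on_user_ui_operation(zoom=2.0, flash=True)
frame = dev.capture_and_send()
```

The sketch keeps parameter adjustment behind a single method so that the claim-1 path (local UI operations) and the claim-6 path (a request message from the first device) can share the same capture logic.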
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202410140219.5A CN118158518A (en) | 2020-11-11 | 2020-11-11 | Method for controlling camera and electronic equipment |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011257123.5A CN114500822B (en) | 2020-11-11 | 2020-11-11 | Method for controlling camera and electronic equipment |
CN202410140219.5A CN118158518A (en) | 2020-11-11 | 2020-11-11 | Method for controlling camera and electronic equipment |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011257123.5A Division CN114500822B (en) | 2020-11-11 | 2020-11-11 | Method for controlling camera and electronic equipment |
Publications (1)
Publication Number | Publication Date |
---|---|
CN118158518A true CN118158518A (en) | 2024-06-07 |
Family
ID=81489859
Family Applications (3)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202410140219.5A Pending CN118158518A (en) | 2020-11-11 | 2020-11-11 | Method for controlling camera and electronic equipment |
CN202011257123.5A Active CN114500822B (en) | 2020-11-11 | 2020-11-11 | Method for controlling camera and electronic equipment |
CN202210954724.4A Active CN115514881B (en) | 2020-11-11 | 2020-11-11 | Method for controlling camera and electronic equipment |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011257123.5A Active CN114500822B (en) | 2020-11-11 | 2020-11-11 | Method for controlling camera and electronic equipment |
CN202210954724.4A Active CN115514881B (en) | 2020-11-11 | 2020-11-11 | Method for controlling camera and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (3) | CN118158518A (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117956269A (en) * | 2022-10-27 | 2024-04-30 | 荣耀终端有限公司 | Camera switching method and related electronic equipment |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104796610A (en) * | 2015-04-20 | 2015-07-22 | 广东欧珀移动通信有限公司 | Mobile terminal and camera sharing method, device and system thereof |
WO2020029306A1 (en) * | 2018-08-10 | 2020-02-13 | 华为技术有限公司 | Image capture method and electronic device |
CN110971823A (en) * | 2019-11-29 | 2020-04-07 | 维沃移动通信(杭州)有限公司 | Parameter adjusting method and terminal equipment |
Family Cites Families (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH08237635A (en) * | 1995-02-28 | 1996-09-13 | Canon Inc | Image pickup controller and control method thereof |
JP3934521B2 (en) * | 2002-10-04 | 2007-06-20 | 日本電信電話株式会社 | Video remote control device, video remote control method, video remote control program, and recording medium recording video remote control program |
CN104349032A (en) * | 2013-07-23 | 2015-02-11 | 中兴通讯股份有限公司 | Method for photographing and mobile terminal |
CN103634524A (en) * | 2013-11-15 | 2014-03-12 | 北京智谷睿拓技术服务有限公司 | Control method and control equipment of camera system and camera system |
CN105516507A (en) * | 2015-12-25 | 2016-04-20 | 联想(北京)有限公司 | Information processing method and electronic equipment |
KR20200054298A (en) * | 2017-09-19 | 2020-05-19 | 로비 가이드스, 인크. | Systems and methods for navigating Internet appliances using a media guidance application |
CN110944109B (en) * | 2018-09-21 | 2022-01-14 | 华为技术有限公司 | Photographing method, device and equipment |
CN110113483B (en) * | 2019-04-19 | 2022-02-25 | 华为技术有限公司 | Method for using enhanced function of electronic equipment and related device |
CN111083364B (en) * | 2019-12-18 | 2023-05-02 | 荣耀终端有限公司 | Control method, electronic equipment, computer readable storage medium and chip |
CN111010588B (en) * | 2019-12-25 | 2022-05-17 | 成都酷狗创业孵化器管理有限公司 | Live broadcast processing method and device, storage medium and equipment |
CN111083379A (en) * | 2019-12-31 | 2020-04-28 | 维沃移动通信(杭州)有限公司 | Shooting method and electronic equipment |
- 2020
- 2020-11-11 CN CN202410140219.5A patent/CN118158518A/en active Pending
- 2020-11-11 CN CN202011257123.5A patent/CN114500822B/en active Active
- 2020-11-11 CN CN202210954724.4A patent/CN115514881B/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104796610A (en) * | 2015-04-20 | 2015-07-22 | 广东欧珀移动通信有限公司 | Mobile terminal and camera sharing method, device and system thereof |
WO2020029306A1 (en) * | 2018-08-10 | 2020-02-13 | 华为技术有限公司 | Image capture method and electronic device |
CN110971823A (en) * | 2019-11-29 | 2020-04-07 | 维沃移动通信(杭州)有限公司 | Parameter adjusting method and terminal equipment |
Also Published As
Publication number | Publication date |
---|---|
CN115514881A (en) | 2022-12-23 |
CN115514881B (en) | 2023-07-11 |
CN114500822B (en) | 2024-03-05 |
CN114500822A (en) | 2022-05-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220342475A1 (en) | Terminal control method and terminal | |
CN109167931B (en) | Image processing method, device, storage medium and mobile terminal | |
CN113099146B (en) | Video generation method and device and related equipment | |
WO2018036164A1 (en) | Screen flash photographing method, device and mobile terminal | |
CN111860530B (en) | Electronic equipment, data processing method and related device | |
KR20230133970A (en) | Photography methods, devices and electronics | |
EP4443894A1 (en) | Image processing method and electronic device | |
WO2022007854A1 (en) | Screen recording method and screen recording system | |
CN114500822B (en) | Method for controlling camera and electronic equipment | |
CN111259441B (en) | Device control method, device, storage medium and electronic device | |
CN112437235B (en) | Night scene picture generation method and device and mobile terminal | |
WO2024027331A1 (en) | Photographing method and related apparatus | |
EP4262226A1 (en) | Photographing method and related device | |
CN115460343A (en) | Image processing method, apparatus and storage medium | |
US20240273029A1 (en) | Photographing method and related apparatus | |
CN117692753B (en) | Photographing method and electronic equipment | |
CN117119295B (en) | Camera control method and electronic device | |
CN115278047B (en) | Shooting method, shooting device, electronic equipment and storage medium | |
CN115733913B (en) | Continuous photographing method and device and storage medium | |
WO2024119922A1 (en) | Video processing method, display device and storage medium | |
CN117453104B (en) | Image acquisition method and electronic equipment | |
CN117156293A (en) | Photographing method and related device | |
CN117692763B (en) | Photographing method, electronic device, storage medium and program product | |
WO2022111701A1 (en) | Screen projection method and system | |
CN116755748A (en) | Card updating method, electronic device, and computer-readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||