WO2024159950A1 - Display method and apparatus, electronic device and storage medium - Google Patents
Display method and apparatus, electronic device and storage medium
- Publication number
- WO2024159950A1 (PCT/CN2023/139711)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- processing
- thread
- timestamp
- threads
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 157
- 238000012545 processing Methods 0.000 claims abstract description 377
- 230000008569 process Effects 0.000 claims abstract description 101
- 239000002131 composite material Substances 0.000 claims abstract description 34
- 230000001360 synchronised effect Effects 0.000 claims abstract description 22
- 230000004044 response Effects 0.000 claims abstract description 18
- 238000009877 rendering Methods 0.000 claims description 153
- 230000004617 sleep duration Effects 0.000 claims description 52
- 238000004590 computer program Methods 0.000 claims description 26
- 230000015572 biosynthetic process Effects 0.000 claims description 10
- 238000003786 synthesis reaction Methods 0.000 claims description 10
- 230000001960 triggered effect Effects 0.000 claims description 6
- 238000007654 immersion Methods 0.000 abstract description 11
- 238000004891 communication Methods 0.000 description 45
- 230000006854 communication Effects 0.000 description 45
- 238000005516 engineering process Methods 0.000 description 37
- 230000006870 function Effects 0.000 description 33
- 238000010586 diagram Methods 0.000 description 28
- 238000007726 management method Methods 0.000 description 21
- 230000005540 biological transmission Effects 0.000 description 19
- 230000007613 environmental effect Effects 0.000 description 19
- 238000004364 calculation method Methods 0.000 description 18
- 230000005236 sound signal Effects 0.000 description 13
- 238000010295 mobile communication Methods 0.000 description 11
- 210000000988 bone and bone Anatomy 0.000 description 10
- 239000011521 glass Substances 0.000 description 10
- 230000004927 fusion Effects 0.000 description 8
- 230000000694 effects Effects 0.000 description 5
- 230000003993 interaction Effects 0.000 description 4
- 229920001621 AMOLED Polymers 0.000 description 3
- 230000002159 abnormal effect Effects 0.000 description 3
- 230000001133 acceleration Effects 0.000 description 3
- 238000013528 artificial neural network Methods 0.000 description 3
- 230000008859 change Effects 0.000 description 3
- 230000008878 coupling Effects 0.000 description 3
- 238000010168 coupling process Methods 0.000 description 3
- 238000005859 coupling reaction Methods 0.000 description 3
- 238000013500 data storage Methods 0.000 description 3
- 239000010985 leather Substances 0.000 description 3
- 230000007246 mechanism Effects 0.000 description 3
- 230000003287 optical effect Effects 0.000 description 3
- 230000003068 static effect Effects 0.000 description 3
- 230000009286 beneficial effect Effects 0.000 description 2
- 230000036772 blood pressure Effects 0.000 description 2
- 238000004422 calculation algorithm Methods 0.000 description 2
- 230000001413 cellular effect Effects 0.000 description 2
- 238000001514 detection method Methods 0.000 description 2
- 208000002173 dizziness Diseases 0.000 description 2
- 238000001914 filtration Methods 0.000 description 2
- 238000003384 imaging method Methods 0.000 description 2
- 230000033001 locomotion Effects 0.000 description 2
- 230000002093 peripheral effect Effects 0.000 description 2
- 230000005855 radiation Effects 0.000 description 2
- 230000009467 reduction Effects 0.000 description 2
- 230000004622 sleep time Effects 0.000 description 2
- 230000000007 visual effect Effects 0.000 description 2
- 230000001755 vocal effect Effects 0.000 description 2
- 230000009471 action Effects 0.000 description 1
- 230000003321 amplification Effects 0.000 description 1
- 238000003491 array Methods 0.000 description 1
- 230000003416 augmentation Effects 0.000 description 1
- 230000003190 augmentative effect Effects 0.000 description 1
- 230000008901 benefit Effects 0.000 description 1
- 230000007175 bidirectional communication Effects 0.000 description 1
- 230000002457 bidirectional effect Effects 0.000 description 1
- 238000013529 biological neural network Methods 0.000 description 1
- 210000004556 brain Anatomy 0.000 description 1
- 238000006243 chemical reaction Methods 0.000 description 1
- 230000019771 cognition Effects 0.000 description 1
- 230000000052 comparative effect Effects 0.000 description 1
- 230000000295 complement effect Effects 0.000 description 1
- 239000004020 conductor Substances 0.000 description 1
- 238000012937 correction Methods 0.000 description 1
- 230000001934 delay Effects 0.000 description 1
- 230000003111 delayed effect Effects 0.000 description 1
- 238000013461 design Methods 0.000 description 1
- 238000011161 development Methods 0.000 description 1
- 230000002708 enhancing effect Effects 0.000 description 1
- 230000005484 gravity Effects 0.000 description 1
- 230000003862 health status Effects 0.000 description 1
- 230000001939 inductive effect Effects 0.000 description 1
- 230000000977 initiatory effect Effects 0.000 description 1
- 230000002452 interceptive effect Effects 0.000 description 1
- 239000004973 liquid crystal related substance Substances 0.000 description 1
- 230000007774 longterm Effects 0.000 description 1
- 229910044991 metal oxide Inorganic materials 0.000 description 1
- 150000004706 metal oxides Chemical class 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 210000002569 neuron Anatomy 0.000 description 1
- 238000003199 nucleic acid amplification method Methods 0.000 description 1
- 238000005457 optimization Methods 0.000 description 1
- 239000002096 quantum dot Substances 0.000 description 1
- 230000002441 reversible effect Effects 0.000 description 1
- 238000005070 sampling Methods 0.000 description 1
- 239000004065 semiconductor Substances 0.000 description 1
- 230000035945 sensitivity Effects 0.000 description 1
- 230000003238 somatosensory effect Effects 0.000 description 1
- 230000002194 synthesizing effect Effects 0.000 description 1
- 238000012546 transfer Methods 0.000 description 1
- 230000002618 waking effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/04—Indexing scheme for image data processing or generation, in general involving 3D image data
Definitions
- the present application belongs to the field of device control technology, and in particular, relates to a display method, device, electronic device and storage medium.
- VR virtual reality
- VST Video see through
- multiple threads in the VR display device need to work together to process image data and generate VR composite images. If one thread experiences a delay fluctuation while processing image data, the threads can no longer synchronize the processing trigger moments of the next frame of image data, frame loss becomes likely, and the final output picture is not smooth, which reduces the user's immersion and degrades the viewing experience.
- the embodiments of the present application provide a display method, apparatus, electronic device and computer-readable storage medium, which can solve the problem in existing display technology that, because VR display devices require multiple threads to collaborate when displaying VR composite images, a delay fluctuation in any thread's processing of image data is likely to cause frame loss and reduce the smoothness of the output picture.
- an embodiment of the present application provides a display method, which is applied to a virtual reality (VR) display device, wherein the VR display device includes multiple threads for generating VR composite images; the display method includes:
- recording, in response to a first operation, a timestamp of a synchronization signal generated in each display cycle; sending the timestamps of the respective display cycles to a plurality of threads respectively, so that the threads calibrate their processing trigger moments in the N+Pth display cycle according to the timestamp of the Nth display cycle; the processing trigger moment of each thread is synchronized with the timestamp of the synchronization signal of the N+Pth display cycle; N and P are positive integers greater than 0;
- triggering the threads in sequence, at the processing trigger moments corresponding to the threads, to process image data and generate the VR composite image.
- when receiving the first operation, the VR display device generates a synchronization signal in each display cycle and records the timestamp at which the synchronization signal is generated; the recorded timestamp of each display cycle is sent to the multiple threads used for synthesizing VR composite images, and each thread can determine its processing trigger moment of the next display cycle according to the timestamp.
- because the processing trigger moment at which each thread processes image data in the next display cycle is synchronized with the timestamp of the synchronization signal of the next display cycle, the processing trigger moments of the multiple threads can be kept synchronized with one another; the threads are then triggered, in the processing order corresponding to each thread, to process the image data in sequence at their processing trigger moments, thereby generating a VR composite image.
- in this way, the processing trigger moment of each thread is synchronized with the synchronization signal of the next display cycle, and the processing trigger moments of the multiple threads are synchronized with one another, so that the multiple threads are controlled in an orderly manner to cooperatively process image data; this reduces the probability of frame loss, ensures the smoothness of the output picture, and thus improves the user's immersion and viewing experience.
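- As an illustration of the calibration principle described above, the following minimal C++ sketch (not part of the application) shows how a worker thread might align its next processing trigger moment with the timestamp of the N+Pth synchronization signal; the 90 Hz display period and the names WorkerThread, kDisplayPeriod and ProcessImageData are illustrative assumptions.

```cpp
#include <chrono>
#include <thread>

using Clock = std::chrono::steady_clock;

// Assumed display period of ~11.1 ms (90 Hz); the real value comes from the device.
constexpr Clock::duration kDisplayPeriod = std::chrono::microseconds(11111);

class WorkerThread {
public:
    // Called when the timestamp of display cycle N is distributed to this thread:
    // the next trigger moment is calibrated to the VSYNC timestamp of cycle N+P.
    void Calibrate(Clock::time_point vsyncTimestampN, int P) {
        nextTrigger_ = vsyncTimestampN + P * kDisplayPeriod;
    }

    // Wait until the calibrated trigger moment, then process one frame of image data.
    void RunOnce() {
        std::this_thread::sleep_until(nextTrigger_);
        ProcessImageData();
    }

private:
    void ProcessImageData() { /* camera capture, rendering, composition, ... */ }
    Clock::time_point nextTrigger_{};
};
```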
- the multiple threads include a camera thread; a processing trigger moment of the camera thread is an exposure moment;
- the sending the timestamps of the respective display cycles to a plurality of threads respectively so that the threads calibrate the processing triggering moment of the threads in the N+Pth display cycle according to the timestamps of the Nth display cycle comprises:
- the camera thread sends a first start instruction to the camera framework in the hardware abstraction layer; the first start instruction includes the timestamp;
- in response to the first start instruction, the camera framework sends a second start instruction to the camera module;
- the camera module feeds back the exposure parameters of the preview image to the camera framework, wherein the preview image is obtained by the camera module based on the second start instruction, and the exposure parameters include the start exposure moment of the preview image;
- the camera framework calculates the time deviation between the timestamp and the start exposure moment;
- the exposure moment of the camera thread in the N+Pth display period is determined according to the timestamp of the Nth display period and the time deviation.
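- As a hedged illustration of the exposure calibration described above, the following sketch computes the time deviation between the distributed timestamp and the start exposure moment fed back by the camera module, and derives a candidate exposure moment for the N+Pth display cycle; the ExposureParams structure and the exact way the deviation and display period are combined are assumptions for illustration, not the application's implementation.

```cpp
#include <chrono>

using Clock = std::chrono::steady_clock;

struct ExposureParams {
    Clock::time_point exposureStart;  // start exposure moment of the preview image
};

// Predict the camera thread's exposure moment in display cycle N+P from the
// timestamp of cycle N and the measured time deviation.
Clock::time_point CalibrateExposureMoment(Clock::time_point vsyncTimestampN,
                                          const ExposureParams& params,
                                          int P,
                                          Clock::duration displayPeriod) {
    // Time deviation between the VSYNC timestamp and the actual exposure start.
    const Clock::duration deviation = params.exposureStart - vsyncTimestampN;
    // Exposure moment of the N+Pth display cycle (assumed to carry the same deviation,
    // which the camera framework can then compensate for).
    return vsyncTimestampN + P * displayPeriod + deviation;
}
```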
- the camera module includes a main camera module and at least one slave camera module; the main camera module and the slave camera module complete hardware synchronization through the second start instruction sent by the camera framework at startup.
- the multiple threads include: a graphics processing thread
- the sending the timestamps of the respective display cycles to a plurality of threads respectively so that the threads calibrate the processing triggering moment of the threads in the N+Pth display cycle according to the timestamps of the Nth display cycle comprises:
- the graphics processing thread calculates a timestamp of an N+Pth display cycle; the timestamp of the N+Pth display cycle is determined according to the timestamp of the Nth display cycle and a processing frame rate of the graphics processing thread;
- the graphics processing thread determines a processing completion time of the Nth display cycle; the processing completion time is a time corresponding to when the graphics processing thread completes processing of the image data of the Nth display cycle;
- the graphics processing thread calculates a sleep duration; the sleep duration is calculated based on the processing completion time and the timestamp of the N+Pth display cycle;
- the graphics processing thread is set to a sleep state, and the graphics processing thread is awakened when the duration of the sleep state reaches the sleep duration.
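- The sleep-based calibration described above can be sketched as follows; this is a simplified illustration, with the helper name and the frame-period parameter assumed rather than taken from the application.

```cpp
#include <chrono>
#include <thread>

using Clock = std::chrono::steady_clock;

// framePeriod is the reciprocal of the graphics processing thread's frame rate.
void CalibrateGraphicsThread(Clock::time_point vsyncTimestampN,
                             Clock::time_point processingCompletionN,
                             int P,
                             Clock::duration framePeriod) {
    // Timestamp of the N+Pth display cycle, derived from the Nth timestamp and the frame rate.
    const Clock::time_point timestampNP = vsyncTimestampN + P * framePeriod;

    // Sleep duration = expected trigger moment minus the moment processing of cycle N finished.
    const Clock::duration sleepDuration = timestampNP - processingCompletionN;

    if (sleepDuration > Clock::duration::zero()) {
        // The thread sleeps and is awakened at the calibrated processing trigger moment.
        std::this_thread::sleep_for(sleepDuration);
    }
}
```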
- the graphics processing thread includes a secondary rendering thread
- the graphics processing thread determines the processing completion time of the Nth display cycle, including:
- the secondary rendering thread determines a second rendering completion time of the Nth display cycle
- the graphics processing thread calculates the sleep duration, including:
- the secondary rendering thread calculates a time difference between the timestamp of the N+Pth display cycle and the second rendering completion time, and uses the time difference as a second sleep duration.
- the graphics processing thread includes a primary rendering thread; a time difference between an expected trigger moment of the primary rendering thread and an expected trigger moment of the secondary rendering thread is a preset interval duration;
- the graphics processing thread determines the processing completion time of the Nth display cycle, including: the primary rendering thread determines a first rendering completion time of the Nth display cycle;
- the graphics processing thread calculates the sleep duration, including: the primary rendering thread calculates a first sleep duration; the first sleep duration is determined according to the timestamp of the N+Pth display cycle, the preset interval duration and the first rendering completion time.
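- The staggering of the primary and secondary rendering threads can be illustrated with the following sketch; the direction of the preset-interval offset (the primary thread waking ahead of the secondary thread) is an assumption consistent with, but not confirmed by, the description above.

```cpp
#include <chrono>

using Clock = std::chrono::steady_clock;
using Ns = std::chrono::nanoseconds;

// Second sleep duration: the secondary rendering thread wakes exactly at the
// timestamp of the N+Pth display cycle.
Ns SecondarySleepDuration(Clock::time_point timestampNP,
                          Clock::time_point secondRenderingCompletion) {
    return timestampNP - secondRenderingCompletion;
}

// First sleep duration: the primary rendering thread wakes a preset interval
// before the secondary thread's trigger, so the two rendering passes do not
// contend for the GPU at the same time.
Ns PrimarySleepDuration(Clock::time_point timestampNP,
                        Ns presetInterval,
                        Clock::time_point firstRenderingCompletion) {
    return (timestampNP - presetInterval) - firstRenderingCompletion;
}
```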
- the processing chip generates the synchronization signal of each display period based on a preset display frame rate
- the timestamp of generating the synchronization signal is recorded based on the generation time of the synchronization signal and stored in a device node in the kernel layer; the service process in the application framework layer reads the timestamp of the synchronization signal from the device node and stores it in a time variable corresponding to the service process.
- the display processing unit (DPU) generates the synchronization signal at the display frame rate.
- the processing chip generates the synchronization signal of each display period based on a preset display frame rate, including:
- the display driver chip sends a feedback signal to the central processing unit CPU, so that the CPU generates the synchronization signal when receiving the feedback signal; the feedback signal is generated when the display driver chip refreshes a frame of the VR composite image.
- sending the timestamps of the display periods to multiple threads respectively includes:
- the runtime application running in the application layer reads the timestamp recorded in the time variable; the runtime application communicates with the service process in the application framework layer through a preset interface; the runtime application then sends the timestamp to the multiple threads.
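- The distribution path described above (kernel-layer device node, service-process time variable, runtime application, worker threads) could look roughly like the following sketch; the device node path /dev/vsync_timestamp and all class and function names are illustrative assumptions rather than the application's actual interfaces.

```cpp
#include <cstdint>
#include <fstream>
#include <functional>
#include <vector>

class VsyncTimestampService {
public:
    // Service process: read the latest VSYNC timestamp from the kernel device node
    // and cache it in the service's time variable.
    void Refresh() {
        std::ifstream node("/dev/vsync_timestamp");  // assumed device node path
        if (node) node >> timestampNs_;
    }
    // Preset interface through which the runtime application reads the timestamp.
    int64_t GetTimestampNs() const { return timestampNs_; }

private:
    int64_t timestampNs_ = 0;  // time variable corresponding to the service process
};

// Runtime application: read the cached timestamp and forward it to every thread's
// calibration callback.
void DistributeTimestamp(const VsyncTimestampService& service,
                         const std::vector<std::function<void(int64_t)>>& calibrators) {
    const int64_t ts = service.GetTimestampNs();
    for (const auto& calibrate : calibrators) calibrate(ts);
}
```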
- P is determined based on a ratio between a display period of the VR synthetic image and a processing period of the thread.
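- As a small illustration of how P could be derived from this ratio (an assumption consistent with the statement above), a 30 fps camera thread on a 90 Hz display would give P = 3.

```cpp
#include <cmath>

// Derive P from the ratio between the thread's processing period and the display
// period of the VR composite image; the rounding rule is an illustrative assumption.
int ComputeP(double displayPeriodMs, double threadProcessingPeriodMs) {
    const int p = static_cast<int>(std::lround(threadProcessingPeriodMs / displayPeriodMs));
    return p > 0 ? p : 1;  // P is a positive integer greater than 0
}
```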
- an embodiment of the present application provides a display device, including:
- a timestamp recording unit configured to record, in response to a first operation, a timestamp of a synchronization signal generated in each display period;
- a timestamp sending unit is used to send the timestamp of each display cycle to multiple threads respectively, so that the threads calibrate the processing triggering moment of the threads in the N+Pth display cycle according to the timestamp of the Nth display cycle; the processing triggering moment of the threads is synchronized with the timestamp of the synchronization signal of the N+Pth display cycle; N and P are positive integers greater than 0; the multiple threads are used to generate VR synthetic images;
- the multiple threads include a camera thread; a processing trigger moment of the camera thread is an exposure moment; and the timestamp sending unit includes:
- a first startup instruction transmission unit used for the camera thread to send a first startup instruction to the camera framework in the hardware abstraction layer; the first startup instruction includes the timestamp;
- a second startup instruction transmission unit configured to cause the camera framework to send a second startup instruction to the camera module in response to the first startup instruction
- an exposure parameter feedback unit configured for the camera module to feed back the exposure parameters of the preview image to the camera framework, wherein the preview image is obtained by the camera module based on the second start instruction, and the exposure parameters include the start exposure moment of the preview image;
- a time deviation calculation unit used for the camera framework to calculate the time deviation between the timestamp and the start exposure moment;
- the exposure time determination unit is used to determine the exposure time of the camera thread in the N+Pth display period according to the timestamp of the Nth display period and the time deviation.
- the camera module includes a main camera module and at least one slave camera module; the main camera module and the slave camera module complete hardware synchronization through the second start instruction sent by the camera framework at startup.
- the multiple threads include a graphics processing thread; and the timestamp sending unit includes:
- a timestamp calculation unit used for the graphics processing thread to calculate the timestamp of the N+Pth display cycle; the timestamp of the N+Pth display cycle is determined according to the timestamp of the Nth display cycle and the processing frame rate of the graphics processing thread;
- a rendering completion time determination unit used for the graphics processing thread to determine the processing completion time of the Nth display cycle; the processing completion time is the time corresponding to the completion of the processing of the image data by the graphics processing thread in the Nth display cycle;
- a sleep duration calculation unit used for the graphics processing thread to calculate the sleep duration; the sleep duration is calculated based on the processing completion time and the timestamp of the N+Pth display cycle;
- the sleep state triggering unit is used to set the graphics processing thread to a sleep state and wake up the graphics processing thread when the duration of the sleep state reaches the sleep duration.
- the graphics processing thread includes a secondary rendering thread
- the rendering completion time determination unit comprises:
- a second rendering time determination unit used for the secondary rendering thread to determine the second rendering completion time of the Nth display cycle
- the sleep duration calculation unit comprises:
- the second sleep calculation unit is used for the secondary rendering thread to calculate the time difference between the timestamp of the N+Pth display cycle and the rendering completion time, and use the time difference as the second sleep duration.
- the graphics processing thread includes a primary rendering thread; a time difference between an expected triggering moment of the primary rendering thread and an expected triggering moment of the secondary rendering thread is a preset interval duration;
- the rendering completion time determination unit comprises:
- a first rendering time determination unit used for the primary rendering thread to determine a first rendering completion time of an Nth display cycle
- the sleep duration calculation unit comprises:
- the first sleep calculation unit is used for the primary rendering thread to calculate a first sleep duration; the first sleep duration is determined according to the timestamp of the N+Pth display cycle, the preset interval duration and the first rendering completion time.
- the timestamp recording unit includes:
- a synchronization signal generating unit configured to generate the synchronization signal of each display period based on a preset display frame rate by a processing chip
- a device node storage unit configured to record the timestamp of generating the synchronization signal based on the generation time of the synchronization signal, and store the timestamp in a device node in the kernel layer;
- the time variable storage unit is used for the service process in the application framework layer to read the timestamp of the synchronization signal and store it in the time variable corresponding to the service process.
- the synchronization signal generating unit includes:
- the first synchronization signal generating unit is used for the display processing unit (DPU) to generate the synchronization signal at the display frame rate.
- the synchronization signal generating unit includes:
- the second synchronization signal generating unit is used for the display driver chip to send a feedback signal to the central processing unit CPU, so that the CPU generates the synchronization signal when receiving the feedback signal; the feedback signal is generated when the display driver chip refreshes a frame of the VR composite image.
- the timestamp sending unit includes:
- a runtime application reading unit used for a runtime application running in the application layer to read the timestamp recorded in the time variable; the runtime application communicates with the service process in the application framework layer through a preset interface;
- a runtime application sending unit is used for the runtime application to send the timestamp to the multiple threads.
- P is determined based on a ratio between a display period of the VR synthetic image and a processing period of the thread.
- an embodiment of the present application provides an electronic device, comprising: a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the display method described in any one of the first aspects when executing the computer program.
- an embodiment of the present application provides a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program, and wherein the computer program, when executed by a processor, implements the display method described in any one of the first aspects above.
- an embodiment of the present application provides a computer program product.
- when the computer program product is run on an electronic device, the electronic device is caused to execute the display method described in any one of the first aspects above.
- an embodiment of the present application provides a chip system, including a processor, the processor being coupled to a memory, the processor executing a computer program stored in the memory to implement a display method as described in any one of the first aspects.
- FIG1 is a schematic diagram of an implementation of displaying a VR composite screen by an electronic device provided by an embodiment of the present application
- FIG2 is a schematic diagram of the structure of an electronic device provided in an embodiment of the present application.
- FIG3 is a software structure block diagram of an electronic device according to an embodiment of the present application.
- FIG4 is a schematic diagram of a screen based on a VST display technology provided by an embodiment of the present application.
- FIG5 is a data flow diagram of a VR synthetic image based on VST technology provided by an embodiment of the present application.
- FIG6 is a data flow diagram of a VR synthetic image based on VST technology provided by another embodiment of the present application.
- FIG7 is a flowchart of an implementation of a display method provided in an embodiment of the present application.
- FIG8 is a flowchart of a specific implementation of S701 provided in an embodiment of the present application.
- FIG. 9 is a flowchart of a specific implementation of S701 based on a software framework according to an embodiment of the present application.
- FIG10 is a schematic diagram showing a comparison of the processing trigger moments before and after calibration according to an embodiment of the present application;
- FIG. 11 is a flowchart of the implementation of the camera thread calibration processing triggering moment provided by an embodiment of the present application.
- FIG13 is a schematic diagram of calibration of exposure timing provided by an embodiment of the present application.
- FIG15 is a control timing diagram of a graphics processing thread provided by an embodiment of the present application.
- FIG16 is a timing diagram of different graphics processing threads occupying a GPU according to an embodiment of the present application.
- FIG17 is a flowchart of an implementation of each graphics processing thread determining a triggering time according to an embodiment of the present application
- FIG18 is a schematic diagram of dividing a processing order provided by an embodiment of the present application.
- FIG19 is a timing diagram of various stages in a process of generating a multi-frame VR composite image by an electronic device provided by an embodiment of the present application;
- FIG20 is a structural block diagram of a display device provided in an embodiment of the present application.
- FIG. 21 is a structural block diagram of an electronic device provided in one embodiment of the present application.
- the term “if” can be interpreted as “when” or “once” or “in response to determining” or “in response to detecting”, depending on the context.
- the phrase “if it is determined” or “if [the described condition or event] is detected” can be interpreted as “once it is determined” or “in response to determining” or “once [the described condition or event] is detected” or “in response to detecting [the described condition or event]”, depending on the context.
- references to “one embodiment” or “some embodiments” etc. described in the specification of this application mean that one or more embodiments of the present application include specific features, structures or characteristics described in conjunction with that embodiment. Therefore, the statements “in one embodiment”, “in some embodiments”, “in some other embodiments”, etc. that appear in different places in this specification do not necessarily refer to the same embodiment, but mean “one or more but not all embodiments”, unless otherwise specifically emphasized.
- the terms “including”, “comprising”, “having” and their variations all mean “including but not limited to”, unless otherwise specifically emphasized in other ways.
- the display method provided in the embodiments of the present application can be applied to electronic devices such as mobile phones, tablet computers, augmented reality (AR)/virtual reality (VR) display devices, laptop computers, ultra-mobile personal computers (UMPCs), netbooks, and personal digital assistants (PDAs).
- the display method can be applied to electronic devices that can realize VR display, or electronic devices that are externally connected to VR display devices.
- the embodiments of the present application do not impose any restrictions on the specific types of electronic devices.
- FIG1 shows a schematic diagram of an implementation of displaying a VR composite screen by an electronic device provided in an embodiment of the present application.
- the electronic device can be a wearable VR display device.
- the VR display device has a built-in processor 11 and a camera module 12.
- the processor 11 includes: a central processing unit (CPU) and a graphics processing unit (GPU), etc.
- the image data is processed and synthesized by the processor.
- the camera module 12 can be used to obtain the environmental image of the scene where the wearer (i.e., user) is located.
- the processor 11 can synthesize the environmental image with the virtual picture to generate a VR synthetic image based on VST technology.
- the electronic device may be a smart phone, and the smart phone 13 may establish a communication connection with wearable glasses 14.
- the communication connection may be a wired or wireless connection; for example, the smart phone 13 may be connected to the wearable glasses 14 via a serial port; if the wearable glasses 14 are equipped with a wireless communication module, such as a Bluetooth module or a Wi-Fi module, the smart phone 13 may establish a communication connection with the wearable glasses 14 via the wireless communication module.
- the wearable glasses 14 may also be equipped with a camera module, which captures environmental images and feeds them back to the smart phone 13.
- the smart phone 13 may synthesize the environmental images with the virtual images through its built-in processor, generate a VR composite image based on VST technology, feed it back to the wearable glasses, and output the VR composite image through the wearable glasses.
- the electronic device can be a station (STATION, ST) in a WLAN, a cellular phone, a cordless phone, a Session Initiation Protocol (SIP) phone, a Wireless Local Loop (WLL) station, a personal digital assistant (PDA) device, a handheld device with wireless communication function, a computing device or other processing device connected to a wireless modem, a computer, a laptop computer, a handheld communication device, a handheld computing device, and/or other devices for communicating on a wireless system and a next-generation communication system, such as a mobile terminal in a 5G network or a mobile terminal in a future evolved public land mobile network (PLMN), etc.
- SIP Session Initiation Protocol
- WLL Wireless Local Loop
- PDA personal digital assistant
- FIG. 2 shows a schematic structural diagram of the electronic device 100 .
- the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a universal serial bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, and a subscriber identification module (SIM) card interface 195, etc.
- SIM subscriber identification module
- the sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, etc.
- the structure illustrated in the embodiment of the present invention does not constitute a specific limitation on the electronic device 100.
- the electronic device 100 may include more or fewer components than shown in the figure, or combine some components, or split some components, or arrange the components differently.
- the components shown in the figure may be implemented in hardware, software, or a combination of software and hardware.
- the processor 110 may include one or more processing units, for example, the processor 110 may include an application processor (AP), a modem processor, a graphics processor (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, a display processing unit (DPU), and/or a neural-network processing unit (NPU). Different processing units may be independent devices or integrated into one or more processors.
- AP application processor
- GPU graphics processor
- ISP image signal processor
- DSP digital signal processor
- DPU display processing unit
- NPU neural-network processing unit
- the controller can generate operation control signals according to the instruction operation code and timing signal to complete the control of instruction fetching and execution.
- the processor 110 may also be provided with a memory for storing instructions and data.
- the memory in the processor 110 is a cache memory.
- the memory may store instructions or data that the processor 110 has just used or cyclically used. If the processor 110 needs to use the instruction or data again, it may be directly called from the memory. This avoids repeated access, reduces the waiting time of the processor 110, and thus improves the efficiency of the system.
- the processor 110 may include one or more interfaces.
- the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
- I2C inter-integrated circuit
- I2S inter-integrated circuit sound
- PCM pulse code modulation
- UART universal asynchronous receiver/transmitter
- MIPI mobile industry processor interface
- GPIO general-purpose input/output
- SIM subscriber identity module
- USB universal serial bus
- the I2C interface is a bidirectional synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
- the processor 110 may include multiple groups of I2C buses.
- the processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces.
- the processor 110 may be coupled to the touch sensor 180K through the I2C interface, so that the processor 110 communicates with the touch sensor 180K through the I2C bus interface, thereby realizing the touch function of the electronic device 100.
- the I2S interface can be used for audio communication.
- the processor 110 can include multiple I2S buses.
- the processor 110 can be coupled to the audio module 170 via the I2S bus to achieve communication between the processor 110 and the audio module 170.
- the audio module 170 can transmit an audio signal to the wireless communication module 160 via the I2S interface to achieve the function of answering a call through a Bluetooth headset.
- the PCM interface can also be used for audio communication, sampling, quantizing and encoding analog signals.
- the audio module 170 and the wireless communication module 160 can be coupled via a PCM bus interface.
- the audio module 170 can also transmit audio signals to the wireless communication module 160 via the PCM interface to realize the function of answering calls via a Bluetooth headset. Both the I2S interface and the PCM interface can be used for audio communication.
- the UART interface is a universal serial data bus for asynchronous communication.
- the bus can be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication.
- the UART interface is generally used to connect the processor 110 and the wireless communication module 160.
- the processor 110 communicates with the Bluetooth module in the wireless communication module 160 through the UART interface to implement the Bluetooth function.
- the audio module 170 can transmit an audio signal to the wireless communication module 160 through the UART interface to implement the function of playing music through a Bluetooth headset.
- the MIPI interface can be used to connect the processor 110 with peripheral devices such as the display screen 194 and the camera 193.
- the MIPI interface includes a camera serial interface (CSI), a display serial interface (DSI), etc.
- the processor 110 and the camera 193 communicate via the CSI interface to implement the shooting function of the electronic device 100.
- the processor 110 and the display screen 194 communicate via the DSI interface to implement the display function of the electronic device 100.
- the GPIO interface can be configured by software.
- the GPIO interface can be configured as a control signal or as a data signal.
- the GPIO interface can be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, etc.
- the GPIO interface can also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, etc.
- the USB interface 130 is an interface that complies with the USB standard specification, and specifically can be a Mini USB interface, a Micro USB interface, a USB Type C interface, etc.
- the USB interface 130 can be used to connect a charger to charge the electronic device 100, and can also be used to transmit data between the electronic device 100 and a peripheral device. It can also be used to connect headphones to play audio through the headphones.
- the interface can also be used to connect other electronic devices, such as AR devices, etc.
- the interface connection relationship between the modules illustrated in the embodiment of the present invention is only a schematic illustration and does not constitute a structural limitation on the electronic device 100.
- the electronic device 100 may also adopt different interface connection methods in the above embodiments, or a combination of multiple interface connection methods.
- the charging management module 140 is used to receive charging input from a charger.
- the charger may be a wireless charger or a wired charger.
- the charging management module 140 may receive charging input from a wired charger through the USB interface 130.
- the charging management module 140 may receive wireless charging input through a wireless charging coil of the electronic device 100. While the charging management module 140 is charging the battery 142, it may also power the electronic device through the power management module 141.
- the power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110.
- the power management module 141 receives input from the battery 142 and/or the charging management module 140, and supplies power to the processor 110, the internal memory 121, the display screen 194, the camera 193, and the wireless communication module 160.
- the power management module 141 can also be used to monitor parameters such as battery capacity, battery cycle number, battery health status (leakage, impedance), etc.
- the power management module 141 can also be set in the processor 110.
- the power management module 141 and the charging management module 140 can also be set in the same device.
- the wireless communication function of the electronic device 100 can be implemented through the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, the modem processor and the baseband processor.
- Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals.
- Each antenna in electronic device 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve the utilization of antennas.
- antenna 1 can be reused as a diversity antenna for a wireless local area network.
- the antenna can be used in combination with a tuning switch.
- the mobile communication module 150 can provide solutions for wireless communications including 2G/3G/4G/5G, etc., applied to the electronic device 100.
- the mobile communication module 150 may include at least one filter, a switch, a power amplifier, a low noise amplifier (LNA), etc.
- the mobile communication module 150 may receive electromagnetic waves from the antenna 1, and perform filtering, amplification, and other processing on the received electromagnetic waves, and transmit them to the modulation and demodulation processor for demodulation.
- the mobile communication module 150 may also amplify the signal modulated by the modulation and demodulation processor, and convert it into electromagnetic waves for radiation through the antenna 1.
- at least some of the functional modules of the mobile communication module 150 may be arranged in the processor 110.
- at least some of the functional modules of the mobile communication module 150 may be arranged in the same device as at least some of the modules of the processor 110.
- the modem processor may include a modulator and a demodulator.
- the modulator is used to modulate the low-frequency baseband signal to be sent into a medium-high frequency signal.
- the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal.
- the demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
- the application processor outputs a sound signal through an audio device (not limited to a speaker 170A, a receiver 170B, etc.), or displays an image or video through a display screen 194.
- the modem processor may be an independent device.
- the modem processor may be independent of the processor 110 and be set in the same device as the mobile communication module 150 or other functional modules.
- the wireless communication module 160 can provide wireless communication solutions including wireless local area networks (WLAN) (such as wireless fidelity (Wi-Fi) network), bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) and the like applied to the electronic device 100.
- WLAN wireless local area networks
- Wi-Fi wireless fidelity
- BT Bluetooth
- GNSS global navigation satellite system
- FM frequency modulation
- NFC near field communication
- IR infrared
- the wireless communication module 160 can be one or more devices integrating at least one communication processing module.
- the wireless communication module 160 receives electromagnetic waves via the antenna 2, modulates the frequency of the electromagnetic wave signal and performs filtering processing, and sends the processed signal to the processor 110.
- the wireless communication module 160 can also receive the signal to be sent from the processor 110, modulate the frequency of the signal, amplify the signal, and convert it into electromagnetic waves for radiation through the antenna 2.
- the antenna 1 of the electronic device 100 is coupled to the mobile communication module 150, and the antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with the network and other devices through wireless communication technology.
- the wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technology.
- the GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a Beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS) and/or a satellite based augmentation system (SBAS).
- GPS global positioning system
- GLONASS global navigation satellite system
- BDS Beidou navigation satellite system
- QZSS quasi-zenith satellite system
- SBAS satellite based augmentation system
- the electronic device 100 implements the display function through a GPU, a display screen 194, and an application processor.
- the GPU is a microprocessor for image processing, connecting the display screen 194 and the application processor.
- the GPU is used to perform mathematical and geometric calculations for graphics rendering.
- the processor 110 may include one or more GPUs that execute program instructions to generate or change display information. It should be noted that the GPU can perform abnormal identification on the storage unit associated with the controlled display screen 194 through the display method provided in this embodiment.
- the GPU can transfer the image data to be displayed to the storage unit in the display screen 194 for storage for subsequent display. If the electronic device is a smart phone, the electronic device can be connected to external wearable glasses through a serial interface or a wireless communication interface, and the display function is implemented through the wearable glasses when it is in VR display mode.
- the display screen 194 is used to display images, videos, etc.
- the display screen 194 includes a display panel.
- the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
- the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
- the display screen 194 may include a touch panel and other input devices.
- the display screen 194 may be associated with one or more storage units, which are used to cache image data displayed on the display screen 194.
- the electronic device 100 can realize the shooting function through ISP, camera 193, video codec, GPU, display screen 194 and application processor.
- ISP is used to process the data fed back by camera 193. For example, when taking a photo, the shutter is opened, and the light is transmitted to the camera photosensitive element through the lens. The light signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to ISP for processing and converts it into an image visible to the naked eye. ISP can also perform algorithm optimization on the noise, brightness, and skin color of the image. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene. In some embodiments, ISP can be set in camera 193.
- the camera 193 is used to capture still images or videos.
- the object generates an optical image through the lens and projects it onto the photosensitive element.
- the photosensitive element can be a charge coupled device (CCD) or a complementary metal oxide semiconductor (CMOS) phototransistor.
- CMOS complementary metal oxide semiconductor
- the photosensitive element converts the optical signal into an electrical signal, and then passes the electrical signal to the ISP to be converted into a digital image signal.
- the ISP outputs the digital image signal to the DSP for processing.
- the DSP converts the digital image signal into an image signal in a standard RGB, YUV or other format.
- the electronic device 100 may include 1 or N cameras 193, where N is a positive integer greater than 1.
- the digital signal processor is used to process digital signals, and can process not only digital image signals but also other digital signals. For example, when the electronic device 100 is selecting a frequency point, the digital signal processor is used to perform Fourier transform on the frequency point energy.
- Video codecs are used to compress or decompress digital videos.
- the electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record videos in a variety of coding formats, such as Moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
- MPEG Moving Picture Experts Group
- NPU is a neural network (NN) computing processor.
- NN neural network
- applications such as intelligent cognition of electronic device 100 can be realized, such as image recognition, face recognition, voice recognition, text understanding, etc.
- the external memory interface 120 can be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 100.
- the external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music and videos are saved in the external memory card.
- the display method in the embodiment of the present application can manage the storage space in the external memory card.
- the internal memory 121 can be used to store computer executable program codes, which include instructions.
- the internal memory 121 may include a program storage area and a data storage area.
- the program storage area may store an operating system, an application required for at least one function (such as a sound playback function, an image playback function, etc.), etc.
- the data storage area may store data created during the use of the electronic device 100 (such as audio data, a phone book, etc.), etc.
- the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one disk storage device, a flash memory device, a universal flash storage (UFS), etc.
- the processor 110 executes various functional applications and data processing of the electronic device 100 by running instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
- the electronic device 100 can implement audio functions such as music playing and recording through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone jack 170D, and the application processor.
- the audio module 170 is used to convert digital audio information into analog audio signal output, and is also used to convert analog audio input into digital audio signals.
- the audio module 170 can also be used to encode and decode audio signals.
- the audio module 170 can be arranged in the processor 110, or some functional modules of the audio module 170 can be arranged in the processor 110.
- the speaker 170A, also called a “loudspeaker”, is used to convert an audio electrical signal into a sound signal.
- the electronic device 100 can listen to music or listen to a hands-free call through the speaker 170A.
- the speaker 170A can be used to output prompt information to the user.
- the receiver 170B, also called an “earpiece”, is used to convert audio electrical signals into sound signals.
- the voice can be received by placing the receiver 170B close to the human ear.
- the microphone 170C, also called a “mike” or “mic”, is used to convert sound signals into electrical signals. When making a call or sending a voice message, the user can speak with the mouth close to the microphone 170C to input the sound signal into the microphone 170C.
- the electronic device 100 can be provided with at least one microphone 170C. In other embodiments, the electronic device 100 can be provided with two microphones 170C, which can not only collect sound signals but also realize noise reduction function. In other embodiments, the electronic device 100 can also be provided with three, four or more microphones 170C to collect sound signals, reduce noise, identify the sound source, realize directional recording function, etc.
- the earphone interface 170D is used to connect a wired earphone.
- the earphone interface 170D may be the USB interface 130, or may be a 3.5 mm open mobile terminal platform (OMTP) standard interface or a cellular telecommunications industry association of the USA (CTIA) standard interface.
- OMTP open mobile terminal platform
- CTIA cellular telecommunications industry association of the USA
- the pressure sensor 180A is used to sense the pressure signal and can convert the pressure signal into an electrical signal.
- the pressure sensor 180A can be set on the display screen 194.
- the electronic device can obtain the user's weight through the pressure sensor 180A.
- the capacitive pressure sensor may include at least two parallel plates made of conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the intensity of the pressure based on the change in capacitance. When a touch operation acts on the display screen 194, the electronic device 100 detects the touch operation intensity according to the pressure sensor 180A.
- the electronic device 100 can also calculate the touch position according to the detection signal of the pressure sensor 180A.
- touch operations acting on the same touch position but with different touch operation intensities can correspond to different operation instructions. For example: when a touch operation with a touch operation intensity less than the first pressure threshold acts on the short message application icon, an instruction to view the short message is executed. When a touch operation with a touch operation intensity greater than or equal to a first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
- the gyro sensor 180B can be used to determine the motion posture of the electronic device 100.
- the angular velocity of the electronic device 100 around three axes (i.e., the x, y, and z axes) can be determined by the gyro sensor 180B.
- the gyro sensor 180B can be used for anti-shake shooting. For example, when the shutter is pressed, the gyro sensor 180B detects the angle of the electronic device 100 shaking, calculates the distance that the lens module needs to compensate based on the angle, and allows the lens to offset the shaking of the electronic device 100 through reverse movement to achieve anti-shake.
- the gyro sensor 180B can also be used for navigation and somatosensory game scenes.
- the air pressure sensor 180C is used to measure air pressure.
- the electronic device 100 calculates the altitude through the air pressure value measured by the air pressure sensor 180C to assist positioning and navigation.
- the magnetic sensor 180D includes a Hall sensor.
- the electronic device 100 can use the magnetic sensor 180D to detect the opening and closing of the flip leather case.
- when the electronic device 100 is a flip phone, the electronic device 100 can detect the opening and closing of the flip cover according to the magnetic sensor 180D. Then, according to the detected opening or closing state of the leather case or of the flip cover, features such as automatic unlocking upon opening can be set.
- the acceleration sensor 180E can detect the magnitude of the acceleration of the electronic device 100 in all directions (generally three axes). When the electronic device 100 is stationary, the magnitude and direction of gravity can be detected. It can also be used to identify the posture of the electronic device and is applied to applications such as horizontal and vertical screen switching and pedometers.
- the distance sensor 180F is used to measure the distance.
- the electronic device 100 can measure the distance by infrared or laser. In some embodiments, when shooting a scene, the electronic device 100 can use the distance sensor 180F to measure the distance to achieve fast focusing.
- the proximity light sensor 180G may include, for example, a light emitting diode (LED) and a light detector, such as a photodiode.
- the light emitting diode may be an infrared light emitting diode.
- the electronic device 100 emits infrared light outward through the light emitting diode.
- the electronic device 100 uses a photodiode to detect infrared reflected light from nearby objects. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 can determine that there is no object near the electronic device 100.
- the electronic device 100 can use the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear to talk, so as to automatically turn off the screen to save power.
- the proximity light sensor 180G can also be used in leather case mode and pocket mode to automatically unlock and lock the screen.
- the ambient light sensor 180L is used to sense the brightness of the ambient light.
- the electronic device 100 can adaptively adjust the brightness of the display screen 194 according to the perceived ambient light brightness.
- the ambient light sensor 180L can also be used to automatically adjust the white balance when taking pictures.
- the ambient light sensor 180L can also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
- the fingerprint sensor 180H is used to collect fingerprints.
- the electronic device 100 can use the collected fingerprint characteristics to implement fingerprint unlocking, access application locks, fingerprint photography, fingerprint call answering, etc.
- the temperature sensor 180J is used to detect temperature.
- the electronic device 100 uses the temperature detected by the temperature sensor 180J to execute a temperature processing strategy. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is lower than another threshold, the electronic device 100 heats the battery 142 to avoid abnormal shutdown of the electronic device 100 due to low temperature. In other embodiments, when the temperature is lower than another threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
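- the temperature processing strategy above can be pictured with a small hedged sketch; the threshold values and the ordering of the checks are placeholders and are not specified by the embodiment.

```cpp
// Illustrative thermal policy: throttle a nearby processor above a high threshold,
// heat the battery below a low threshold, and boost the battery output voltage
// below an even lower threshold. All threshold values are placeholders.
enum class ThermalAction { None, ThrottleNearbyProcessor, HeatBattery, BoostBatteryVoltage };

ThermalAction decideThermalAction(float reportedTempCelsius) {
    const float highThreshold    = 45.0f;
    const float lowThreshold     = 0.0f;
    const float veryLowThreshold = -10.0f;

    if (reportedTempCelsius > highThreshold)    return ThermalAction::ThrottleNearbyProcessor;
    if (reportedTempCelsius < veryLowThreshold) return ThermalAction::BoostBatteryVoltage;
    if (reportedTempCelsius < lowThreshold)     return ThermalAction::HeatBattery;
    return ThermalAction::None;
}
```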
- the touch sensor 180K is also called a "touch control device".
- the touch sensor 180K can be set on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, also called a "touch control screen".
- the touch sensor 180K is used to detect touch operations acting on or near it.
- the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
- Visual output related to the touch operation can be provided through the display screen 194.
- the touch sensor 180K can also be set on the surface of the electronic device 100, which is different from the position of the display screen 194.
- the bone conduction sensor 180M can obtain a vibration signal. In some embodiments, the bone conduction sensor 180M can obtain a vibration signal of a vibrating bone block of the vocal part of the human body. The bone conduction sensor 180M can also contact the human pulse to receive a blood pressure beat signal. In some embodiments, the bone conduction sensor 180M can also be set in an earphone and combined into a bone conduction earphone.
- the audio module 170 can parse out a voice signal based on the vibration signal of the vibrating bone block of the vocal part obtained by the bone conduction sensor 180M to realize a voice function.
- the application processor can parse the heart rate information based on the blood pressure beat signal obtained by the bone conduction sensor 180M to realize a heart rate detection function.
- the key 190 includes a power key, a volume key, etc.
- the key 190 may be a mechanical key or a touch key.
- the electronic device 100 may receive key input and generate key signal input related to user settings and function control of the electronic device 100.
- Motor 191 can generate vibration prompts.
- Motor 191 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback.
- touch operations acting on different applications can correspond to different vibration feedback effects.
- touch operations acting on different areas of the display screen 194 can also correspond to different vibration feedback effects.
- different application scenarios (for example: time reminders, receiving messages, alarm clocks, games, etc.) can also correspond to different vibration feedback effects.
- the touch vibration feedback effect can also support customization.
- Indicator 192 may be an indicator light, which may be used to indicate charging status, power changes, messages, missed calls, notifications, etc.
- the SIM card interface 195 is used to connect a SIM card.
- the SIM card can be connected to and separated from the electronic device 100 by inserting it into the SIM card interface 195 or pulling it out from the SIM card interface 195.
- the electronic device 100 can support 1 or N SIM card interfaces, where N is a positive integer greater than 1.
- the SIM card interface 195 can support Nano SIM cards, Micro SIM cards, SIM cards, and the like. Multiple cards can be inserted into the same SIM card interface 195 at the same time. The types of the multiple cards can be the same or different.
- the SIM card interface 195 can also be compatible with different types of SIM cards.
- the SIM card interface 195 can also be compatible with external memory cards.
- the electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communications.
- the electronic device 100 uses an eSIM, i.e., an embedded SIM card.
- the eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
- the software system of the electronic device 100 may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture.
- the Android system of the layered architecture is taken as an example to exemplify the software structure of the electronic device 100.
- FIG. 3 is a software structure block diagram of the electronic device according to an embodiment of the present application.
- the layered architecture divides the software into several layers, each with clear roles and division of labor.
- the layers communicate with each other through software interfaces.
- the Android system is divided into four layers, which are, from top to bottom: the application layer, the application framework layer, the Android runtime (Android runtime) and system layer, and the kernel layer.
- the application layer can include a series of application packages.
- the application package may include applications such as camera, calendar, map, WLAN, Bluetooth, music, video, short message, mailbox, WeChat, WPS, etc.
- the application framework layer provides application programming interface (API) and programming framework for the applications in the application layer.
- the application framework layer includes some predefined functions.
- the application framework layer may include a window manager, a content provider, a view system, a telephony manager, a resource manager, a notification manager, and the like.
- the window manager is used to manage window programs.
- the window manager can obtain the display screen size, determine whether there is a status bar, lock the screen, capture the screen, etc.
- Content providers are used to store and retrieve data and make it accessible to applications.
- the data may include videos, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
- the view system includes visual controls, such as controls for displaying text, controls for displaying images, etc.
- the view system can be used to build applications.
- a display interface can be composed of one or more views.
- a display interface including a text notification icon can include a view for displaying text and a view for displaying images.
- the phone manager is used to provide communication functions for electronic devices, such as the management of call status (including answering, hanging up, etc.).
- the resource manager provides various resources for applications, such as localized strings, icons, images, layout files, video files, and so on.
- the notification manager enables applications to display notification information in the status bar. It can be used to convey notification-type messages and can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify download completion, message reminders, etc.
- the notification manager can also be a notification that appears in the system top status bar in the form of a chart or scroll bar text, such as notifications of applications running in the background, or a notification that appears on the screen in the form of a dialog window. For example, a text message is displayed in the status bar, a prompt sound is emitted, an electronic device vibrates, an indicator light flashes, etc.
- Android Runtime includes core libraries and virtual machines. Android runtime is responsible for scheduling and management of the Android system.
- the core library consists of two parts: one part is the functions that the Java language needs to call, and the other part is the core library of Android.
- the application layer and the application framework layer run in a virtual machine.
- the virtual machine executes the Java files of the application layer and the application framework layer as binary files.
- the virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
- the system layer can include multiple functional modules, such as surface manager, media library, 3D graphics processing library (such as OpenGL ES), 2D graphics engine (such as SGL), etc.
- the surface manager is used to manage the display subsystem and provide the fusion of 2D and 3D layers for multiple applications.
- the media library supports playback and recording of a variety of commonly used audio and video formats, as well as static image files, etc.
- the media library can support a variety of audio and video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
- the 3D graphics processing library is used to implement 3D graphics drawing, image rendering, synthesis and layer processing, etc.
- a 2D graphics engine is a drawing engine for 2D drawings.
- the kernel layer is the layer between hardware and software.
- the kernel layer contains at least display driver, camera driver, audio driver, and sensor driver.
- when the touch sensor 180K receives a touch operation, the corresponding hardware interrupt is sent to the kernel layer.
- the kernel layer processes the touch operation into a raw input event (including touch coordinates, timestamp of the touch operation, and other information).
- the raw input event is stored in the kernel layer.
- the application framework layer obtains the raw input event from the kernel layer and identifies the control corresponding to the input event. For example, if the touch operation is a touch single-click operation and the control corresponding to the single-click operation is the control of the camera application icon, the camera application calls the interface of the application framework layer to start the camera application, and then starts the camera driver by calling the kernel layer to capture static images or videos through the camera 193.
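- the touch-to-application flow above can be sketched as follows; the structure of the raw input event and the control lookup are simplified assumptions for illustration only.

```cpp
#include <cstdint>
#include <string>

// Simplified shape of a raw input event: touch coordinates plus a timestamp,
// as packaged by the kernel layer in the description above.
struct RawInputEvent {
    int32_t x;
    int32_t y;
    int64_t timestampNs;
};

// Trivial stand-in for the framework-side hit test that identifies the control
// under the touch point; a real view system would walk the view hierarchy.
std::string findControlAt(int32_t /*x*/, int32_t /*y*/) {
    return "camera_app_icon";
}

void onRawInputEvent(const RawInputEvent& ev) {
    const std::string control = findControlAt(ev.x, ev.y);
    if (control == "camera_app_icon") {
        // Here the camera application would call framework interfaces to start,
        // which in turn start the camera driver in the kernel layer.
    }
}
```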
- VR display technology has become one of the mainstream display technologies today.
- Traditional VR display technology provides users with an immersive viewing experience.
- the above traditional VR display technology has been applied to many fields such as games and movies.
- however, traditional VR display technology builds one or more virtual graphics through the GPU in the processor and generates purely fictitious virtual images by combining the virtual graphics; the authenticity of such images is low, which reduces the user's sense of immersion.
- as a result, video see-through (VST) technology has emerged.
- VST display technology can use the camera module configured on the VR display device to collect a real environment image of the scene the user is in, and superimpose a synthetic virtual image on the basis of the real environment image, that is, obtain a VR synthetic image based on VST technology. Since the background image in the VR synthetic image is generated from the real environment image, the authenticity of the image is improved, which in turn improves the user's sense of immersion when watching the VR synthetic image.
- FIG4 shows a schematic diagram of a screen based on VST display technology provided by an embodiment of the present application.
- the user wears a VR display device, which may be the VR display device shown in FIG1 (a).
- the VR display device is equipped with a camera module, through which the environmental image within the user's field of vision can be obtained, such as objects such as a television and a clock can be photographed, and the VR display device can synthesize the desired synthesized virtual image into the above-mentioned environmental image.
- by comparing FIG4 (a) with FIG4 (b), it can be found that the TV is in a turned-off state in the real scene, that is, no screen content is output.
- the specified video image frame can be added to the area where the TV is located in the environmental image, realizing the combination of the virtual image and the real image, improving the immersion of the output image, and then improving the user's viewing experience.
- although VST display technology can improve the authenticity of VR synthetic images and thus enhance the user's immersive viewing experience, it also introduces new challenges for VR display devices, namely, the amount of data to be processed is large and the processing flow is long.
- a VR synthetic image needs to combine the real and the virtual, that is, the camera thread needs to control the camera module to shoot the real scene, which involves processes such as image exposure and image transmission.
- the combination of virtual images requires the rendering thread to perform operations such as virtual image analysis and synthesis, and the environmental image needs to be overlaid and rendered with the virtual image, which requires the collaboration of multiple threads.
- during the image data processing of a VR display device, the latency of different threads is different, and the latency also fluctuates with the real-time processing capability of the display device. For example, when the display device performs high-precision video decoding, that task occupies more processing resources and other threads are allocated fewer processing resources, which reduces the processing rate of those threads and thereby increases their processing latency.
- Embodiment 1:
- the present application provides a display method
- the execution subject of the display method is an electronic device
- the electronic device includes but is not limited to: a smart phone, a tablet computer, a computer, a laptop, a VR display device, etc., which can synthesize VR synthetic images based on VST technology.
- the electronic device can have a built-in VR display module, or it can be externally connected to a wearable VR display device, and the above-mentioned VR synthetic image is output through the external wearable VR display device.
- the VR display device needs to call multiple different threads to process the image data in the process of generating VR synthetic images.
- therefore, the processing trigger moments of the multiple threads can be synchronized to control the threads in an orderly manner, thereby reducing the probability of frame loss and improving the smoothness of the output picture; this also makes the predicted posture more consistent with the actual picture, improves the authenticity of the VR synthetic image, reduces the user's dizziness when watching, and enhances the user's viewing experience.
- FIG5 shows a data flow diagram of a VR synthetic image based on VST technology provided by an embodiment of the present application.
- the process of generating a VR synthetic image by an electronic device specifically includes the following multiple stages:
- Stage 1 Image exposure stage
- the image signal processing stage can be divided into the processing stage through the image front-end processor (Image Front End, IFE) (such as 0.5ms) and the processing stage through the image processor (Image processing engine, IPE) (such as 8.5ms).
- the above-mentioned image exposure stage is completed through the camera module side.
- the processed image data needs to be transmitted to the relevant threads at the system layer for processing.
- since the camera module is at the kernel layer, the image data transmitted to the relevant threads at the system layer needs to go through the hardware abstraction layer for data conversion before being handed over to the GPU for processing, which consumes a certain amount of transmission time, such as 6.5ms.
- Stage 3 Primary rendering and video parsing stage
- when an electronic device generates a VR composite image, it needs to use a rendering application at the application layer to complete the task of rendering virtual objects, such as rendering a virtual keyboard or a virtual cartoon image, which requires a rendering stage.
- this rendering stage can be performed in parallel with the above-mentioned image exposure stage, that is, while the image is exposed and transmitted, the electronic device can perform a rendering process, which also introduces a certain processing delay.
- the video decoder in the application framework layer needs to parse the video data frame by frame, and the above process also requires a certain amount of processing time.
- the virtual image data fusion also requires the introduction of processing time, for example, 0.6ms.
- the secondary rendering stage needs to complete the fusion of virtual and real, that is, the fusion of the environment image and the virtual picture, which involves the image frame selection stage (such as 10.2ms) and the rendering stage (such as 21.3ms) to generate a VR synthetic image.
- the delay of this stage is relatively long. In some implementation scenarios, the average delay of the secondary rendering stage can reach 21.3ms.
- Stage 7 Image display stage
- when the display screen obtains a VR composite image, the image needs to be scanned row by row or column by column to be output.
- FIG6 shows a data flow diagram of VR synthetic images based on VST technology provided by another embodiment of the present application.
- the relationship between each of the above stages and the corresponding modules can be:
- the original environment image is mainly collected through the camera module, and the original environment image is initially processed through IFE, where the processing is specifically used to distinguish the gaze area where the user's line of sight is focused in the original environment image, and the background area where the user is not focused.
- the background area can be blurred, and the gaze area is subjected to image enhancement and other related processing.
- the image data after the initial processing is then transmitted to IPE for secondary processing, such as performing image processing tasks such as hardware noise reduction, image cropping, color processing, detail enhancement, etc., to generate image data after secondary processing.
- the framework that completes the data transmission from the camera module to the upper layer in the hardware abstraction layer is the camera framework CAM X.
- CAM X provides an interface to the camera module of the kernel layer, through which the relevant image data sent by the camera module is received and passed to the system layer.
- the decoder located in the application framework layer can decode the required video data, and the decoded video image frame can be forwarded to the runtime application located in the application layer for subsequent secondary rendering.
- when the rendering application at the application layer needs to render a virtual object, it can render the virtual object through the rendering framework in the application framework layer (such as XR Plugin) and transmit the rendered virtual object to the runtime application in the application layer.
- the runtime application in the application layer can call related threads to perform image fusion on virtual objects and video image frames to obtain virtual image data.
- the runtime application located in the application layer can call related threads to fuse the environmental image data and the virtual image data.
- for this fusion (i.e., the secondary rendering), it is necessary to call the GPU of the kernel layer to complete it.
- the image data can be forwarded through the interface provided by the 3D graphics processing library in the system layer.
- the GPU will feed back the VR composite image after secondary rendering to the runtime application.
- the runtime application in the system layer sends the synthesized VR composite image for display.
- the corresponding VR composite images are output through the VR display screen and binocular screen respectively.
- the electronic device can generate a synchronization signal through the processor and send it to the relevant threads in the above stages, and synchronize and calibrate the processing trigger time of the threads in each stage to ensure that the processing trigger time of each stage is aligned with the synchronization signal, so as to achieve the purpose of orderly controlling each thread and reduce the probability of frame loss.
- FIG7 shows an implementation flow chart of the display method provided by an embodiment of the present application.
- when the user needs to play a VR picture, the user can perform a first operation on the electronic device, and the first operation is used to indicate that the user needs to view the VR picture.
- when the electronic device detects the user's first operation, the process of generating a VR composite image can be executed.
- a start button may be provided in the VR display device, and the first operation may be that the user touches (which may be a click, a double click, a long press, etc.) the start button.
- the smart phone can be connected to a wearable pair of glasses to output VR composite images.
- the user can start a related VR application in the smart phone, and a start control can be configured in the VR application.
- the first operation can be that the user touches (can be clicked, double-clicked, long pressed, etc.) the start control displayed on the screen of the smart phone.
- when the electronic device detects the first operation of the user, the electronic device can generate a synchronization signal based on a preset display cycle.
- the processor of the electronic device may include a controller, and the controller generates an operation control signal based on the instruction operation code and the timing signal to complete the control of fetching and executing instructions. Therefore, based on the display cycle, the synchronization signal can be generated by the controller in the processor.
- the electronic device can also record the timestamp of the synchronization signal generated in each display cycle. The timestamp of the synchronization signal can be determined based on the system time of the electronic device.
- Fig. 8 shows a specific implementation flow chart of S701 provided in an embodiment of the present application.
- S701 specifically includes S7011 to S7013.
- FIG9 shows a specific implementation flowchart of S701 based on the software framework provided in an embodiment of the present application.
- the processing chip generates a synchronization signal for each display period based on a preset display frame rate.
- the electronic device can generate a synchronization signal through one or more built-in processing chips.
- a controller can be configured in the processing chip. Based on a preset display frame rate, the processing chip can calculate the corresponding period interval length. For example, if the display frame rate is F, the interval length of the display cycle is 1/F, and the synchronization signal is generated according to the interval length.
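- as a minimal sketch of the relationship between the display frame rate and the display cycle, the following C++ snippet derives the period as 1/F and emits one synchronization timestamp per display cycle; the frame rate, the number of cycles, and the clock choice are illustrative assumptions rather than the chip-level implementation.

```cpp
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    const double displayFrameRate = 90.0;  // F, in Hz (placeholder value)
    const auto period = std::chrono::duration<double>(1.0 / displayFrameRate);

    auto next = std::chrono::steady_clock::now();
    for (int cycle = 0; cycle < 5; ++cycle) {
        // Timestamp recorded when the synchronization signal of this cycle is generated.
        const auto ts = std::chrono::steady_clock::now().time_since_epoch();
        std::printf("cycle %d: sync timestamp = %lld ns\n", cycle,
                    static_cast<long long>(
                        std::chrono::duration_cast<std::chrono::nanoseconds>(ts).count()));

        next += std::chrono::duration_cast<std::chrono::steady_clock::duration>(period);
        std::this_thread::sleep_until(next);  // wait for the next display cycle
    }
    return 0;
}
```

- in the embodiment itself this periodic generation is done by the DPU or the display driver chip in hardware; the loop above only illustrates the 1/F relationship.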
- the processing chip may be a DPU or a display driver chip.
- depending on the type of processing chip, the generation method may also be different, and may specifically include the following two methods:
- the electronic device can generate a synchronization signal at a preset display frame rate through the hardware module inside the DPU. That is, the electronic device can generate a synchronization signal at a preset display frame rate through the DPU in the CPU.
- the display driver chip can generate a feedback signal and send the feedback signal to the CPU.
- when the CPU receives the feedback signal, it can generate a synchronization signal.
- the generation timing of the synchronization signal is the refresh timing of a frame of VR synthetic image. The synchronization of the two timings can ensure that the generation cycle of the synchronization signal is consistent with the display frame rate, and the subsequent threads can also calibrate the processing trigger time according to the timestamp of the synchronization signal.
- a timestamp of the generation of the synchronization signal is recorded, and the timestamp is stored in a device node in the kernel layer.
- when the processing chip generates a synchronization signal, the system time corresponding to the generation of the synchronization signal is recorded at the same time, and a timestamp corresponding to the synchronization signal is created according to the system time.
- the driver of the processing chip in the kernel layer records the timestamp in the device node.
- the device nodes in the kernel layer can be used to record related information of different signals in the device system, including the timestamp of the synchronization signal.
- the timestamp of each display cycle is recorded in the same file in the device node of the kernel layer, that is, when the processing chip generates a corresponding synchronization signal in the next display cycle, the timestamp of the synchronization signal will overwrite the timestamp of the synchronization signal generated in the previous display cycle, so that each time the subsequent interface reads the file corresponding to the timestamp, the timestamp obtained is the latest recorded timestamp.
- the path corresponding to the file storing the timestamp can be: /sys/class/graphics/fb0/vsync_timestamp.
- the service process in the application framework layer reads the timestamp of the synchronization signal and stores it in a time variable corresponding to the service process.
- a service process is configured in the application framework layer, and the service process can read the files in the device node of the kernel layer at intervals through a polling mechanism. Since the above timestamp is stored in the device node, when the service process reads the file content in the device node based on the polling mechanism, it will read the file where the above timestamp is located and store the timestamp in the corresponding time variable.
- in this way, the processing chip can generate a synchronization signal based on a preset display frame rate and record it in the device node of the kernel layer, and the service process in the application framework layer is then called to read the timestamp and store it in the time variable, thereby realizing the recording and reading of the timestamp. Since the timestamp is recorded in the hardware abstraction layer, it is convenient for the system layer and subsequent software layers such as the application framework layer to read and use it, thereby reducing the delay.
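- a minimal user-space sketch of the polling described above is given below; it reads the device node path mentioned earlier and keeps the latest value in a time variable, with the polling interval and iteration count chosen only for illustration.

```cpp
#include <chrono>
#include <fstream>
#include <string>
#include <thread>

int main() {
    // Device node that the driver overwrites with the latest sync timestamp.
    const std::string nodePath = "/sys/class/graphics/fb0/vsync_timestamp";
    long long latestSyncTimestamp = 0;  // the "time variable" held by the service process

    for (int i = 0; i < 10; ++i) {
        std::ifstream node(nodePath);
        long long value = 0;
        if (node >> value) {
            // Each display cycle overwrites the previous record, so the file
            // always yields the most recently recorded timestamp.
            latestSyncTimestamp = value;
        }
        std::this_thread::sleep_for(std::chrono::milliseconds(1));  // polling interval
    }
    (void)latestSyncTimestamp;
    return 0;
}
```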
- the timestamps of the respective display cycles are sent to a plurality of threads respectively, so that the threads calibrate the processing triggering moment of the threads in the N+Pth display cycle according to the timestamp of the Nth display cycle; the processing triggering moment of the threads is synchronized with the timestamp of the synchronization signal of the N+Pth display cycle; N and P are positive integers greater than 0; and the plurality of threads are used to generate VR composite images.
- the timestamp can be sent to each thread so that subsequent threads can calibrate the processing trigger moment according to the timestamp to facilitate orderly processing of image data.
- the processing cycle of the thread is not necessarily synchronized with the display cycle of the VR composite image, that is, some threads can reuse the image data generated in the previous display cycle to generate the VR composite image.
- the processing frame rate of the rendering thread of the virtual object can be reduced, that is, the processing cycle corresponding to the rendering thread is extended, and the duration of a processing cycle is the duration corresponding to P display cycles. Therefore, after the thread processes the image data in the Nth display cycle, the next processing trigger moment is the N+Pth display cycle.
- the above-mentioned S702 specifically may include S7021 to S7022.
- the specific implementation process is as follows:
- the runtime application running in the application layer reads the timestamp recorded in the time variable; the runtime application communicates with the service process in the application framework layer through a preset interface.
- the runtime application sends the timestamp to the multiple threads.
- a runtime application (i.e., Runtime APP) may be run in the application layer of the electronic device, and the application layer may configure a process to run the runtime application.
- the service process in the application framework layer allocates a preset interface to the runtime application, and the runtime application may read the timestamp from the service process through the preset interface.
- the runtime application is configured in an event trigger mode.
- a trigger instruction for triggering the runtime application can also be generated.
- when the runtime application detects the trigger instruction, the operation of S7021 can be executed, that is, the value of the time variable is read from the service process in the application framework layer through a preset interface to determine the timestamp corresponding to the synchronization signal of the current display cycle.
- the runtime application can send it to multiple threads so that the threads can perform subsequent steps according to the timestamp. It should be noted that the runtime application can send the timestamp to each thread or to a specified thread, and forward the timestamp to other threads through the specified thread, which is not limited here.
- the runtime application can send the timestamp to the secondary rendering thread, and the primary rendering thread and the video parsing thread can obtain the timestamp through the secondary rendering thread.
- the runtime application can also send the timestamp to the secondary rendering thread, the primary rendering thread, and the video parsing thread respectively, which can be set according to actual conditions.
- timestamp acquisition and timestamp distribution are performed through a runtime application, and a process dedicated to managing multiple threads for VR composite images can be configured in the application layer to improve the accuracy of timing control and thereby improve the smoothness of subsequent VR image generation.
- the threads can correct their processing trigger moments for the next display cycle through the timestamp, thereby aligning the processing trigger moment with the generation moment of the synchronization signal of the next display cycle (i.e., the timestamp of the next display cycle). After the synchronization signal is generated in the kernel layer, a certain transmission time is required to transmit it to each thread in the application layer, so when the synchronization signal arrives at each thread there is actually a certain deviation from the generation moment of the synchronization signal, and the arrival is not aligned with that moment. In addition, the transmission time is uncontrollable and can be longer or shorter, which causes the processing trigger moments of different display cycles to fall at arbitrary points within the display cycle, so that the actual processing time available in each display cycle is inconsistent.
- FIG10 shows a comparative schematic diagram before and after the calibration processing trigger moment provided by an embodiment of the present application.
- the thread is specifically a camera thread.
- the camera thread is specifically used to control the camera module to capture images.
- the electronic device will cyclically output 4 frames of different images in one display cycle, namely, image 1 to image 4.
- the synchronization signal can be transmitted to the camera thread at any time within the display cycle. Therefore, when the exposure moment is not aligned with the synchronization signal, the captured image can be any image of image 1 to image 4.
- the processing trigger moment of the camera thread is synchronized with the generation moment of the synchronization signal, since the synchronization signal is generated at a fixed time in each display cycle, such as the initial moment of the display cycle, the processing trigger moment and the initial moment of each display cycle will be synchronized and aligned.
- the image captured by the camera module is the frame of image output at the initial moment of the display cycle, that is, Image 1.
- by comparing the intervals between processing trigger moments before and after calibration, it can be determined that the intervals between different processing trigger moments before calibration are random, while the intervals between the processing trigger moments after calibration are fixed, thereby ensuring that the processing time available to the thread in each display cycle after calibration is consistent and avoiding the situation where related image processing tasks cannot be completed because the processing time is too short.
- FIG11 shows a flowchart of the implementation of the camera thread calibration processing trigger moment provided by an embodiment of the present application.
- the process by which the camera thread calibrates the processing trigger moment (for the camera thread, the processing trigger moment is the exposure moment at which the camera module is controlled to capture an image) includes S1101 to S1105.
- FIG12 shows an interaction flow chart of the camera thread calibration processing moment provided by an embodiment of the present application.
- the timestamp of the camera thread is sent by the runtime application, and when the runtime application detects the synchronization signal, the timestamp is obtained from the time variable in the application framework layer.
- the camera thread sends a first start instruction to the camera framework in the hardware abstraction layer; the first start instruction includes the timestamp.
- the camera thread can specifically run in the application layer. Of course, it can run in other layers within the software framework according to actual conditions, which is not limited here.
- the runtime application can send the timestamp to the camera thread through the corresponding interface.
- after the camera thread receives the timestamp, a first startup instruction can be generated, and the first startup instruction can carry the received timestamp.
- the first startup instruction is specifically used to notify the camera framework to start.
- the camera framework, in response to the first start-up instruction, sends a second start-up instruction to the camera module.
- when the camera framework at the hardware abstraction layer receives the above-mentioned first startup instruction, it indicates that the camera module needs to be turned on, so a second startup instruction will be sent to the interface of the kernel layer. After receiving the second startup instruction, the camera driver in the kernel layer will control the camera module to turn on. When the camera module receives the second startup instruction, it will start to obtain a preview image, which is specifically used to capture the environmental image within the current user's line of sight.
- the camera module specifically includes a main camera module and at least one slave camera module.
- the main camera module is closer to the output interface of the second startup instruction, that is, when the second startup instruction is transmitted, it will first pass through the main camera module, and then pass through the slave camera module.
- therefore, the main camera module will start first, and then, when the second startup instruction is transmitted to the slave camera module, the slave camera module will start. That is, the main camera module and the slave camera module are both turned on when the relevant hardware receives the second startup instruction; in other words, the main camera module and the slave camera module are configured in a hardware synchronization relationship, and hardware synchronization can be completed according to the second startup instruction.
- hardware synchronization has the advantages of low latency and high stability, which can improve the synchronization of the acquisition time of environmental images acquired by different camera modules in the subsequent image synthesis process, improve the accuracy of subsequent image synthesis, and reduce phase deviation.
- the camera module feeds back the exposure parameters of the preview image to the camera framework, wherein the preview image is obtained by the camera module based on the second start-up instruction, and the exposure parameters include the start-up exposure moment of the preview image.
- the preview image is fed back to the camera framework located at the hardware abstraction layer. Since the image information of the preview image records the exposure parameters when the preview image is shot, the exposure parameters may include information such as exposure time, exposure duration, and ISO sensitivity. Since the above preview image is an image obtained when the camera module is started, the exposure time corresponding to the preview image is used as the start-up exposure time of the camera module.
- the camera framework can read the exposure parameters in the above preview image and extract the start-up exposure time carried therein.
- both the main camera module and the slave camera module can capture preview images, that is, the preview images fed back include the main preview image captured by the main camera module and the slave preview image captured by the slave camera module. Since the transmission path of the main camera module is shorter than the transmission path of the slave camera module, the camera framework can extract the start exposure moment of the main camera module, and perform subsequent time deviation calculation based on the start exposure moment of the main camera module.
- the camera framework calculates the time deviation between the timestamp and the exposure start time.
- the camera framework can calculate the difference between the start exposure time and the timestamp corresponding to the first start instruction received, and use the difference as the above time deviation.
- for example, if the start exposure time is t1 and the above timestamp is ts, the time deviation is Δt = t1 - ts.
- the exposure time of the camera thread in the N+Pth display period is determined according to the timestamp of the Nth display period and the time deviation.
- the camera thread can send a shooting instruction to the camera framework according to the exposure time, and then the camera framework can control the camera module to collect the environmental image at the corresponding exposure time.
- when the shooting frame rate of the environment image is consistent with the display frame rate of the VR synthetic image, the above P is 1.
- FIG13 shows a schematic diagram of the calibration of the exposure moments provided in an embodiment of the present application.
- the dotted lines mark the exposure moments before calibration, namely t1 to tN.
- t1 is the exposure moment corresponding to when the camera module is started, that is, the above-mentioned start-up exposure moment.
- ts is the timestamp corresponding to the display period when the first start-up instruction is sent to the camera module, and the generation moment ts corresponding to the timestamp is earlier than the moment when the first start-up instruction is generated, that is, earlier than t1.
- the time difference between the two is Δt.
- the time deviation between the start exposure moment and the timestamp is determined, and the exposure moments corresponding to each subsequent display cycle are adjusted, so that the exposure moment of the camera module is aligned with the timestamp of the synchronization signal, thereby ensuring the accuracy of the camera thread's control over the camera module.
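- one way to picture this exposure calibration is sketched below; it assumes that the exposure lags the issued shooting instruction by the measured deviation Δt = t1 - ts, so the instruction for the N+Pth cycle is scheduled Δt early. This is only an illustrative reading of the calculation, not a formula mandated by the embodiment.

```cpp
// Hedged sketch of the exposure-time calibration: schedule the shooting
// instruction so that the actual exposure lands on the synchronization
// timestamp of the (N+P)th display cycle. All parameter names are illustrative.
double calibratedCommandTime(double tN,        // timestamp of the Nth display cycle
                             double deltaT,    // measured deviation: t1 - ts
                             int    P,         // cycles until the next exposure
                             double frameRate) // display frame rate F
{
    const double targetExposure = tN + static_cast<double>(P) / frameRate;
    return targetExposure - deltaT;  // issue the instruction this much earlier
}
```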
- Fig. 14 shows a flowchart of the triggering moment of the calibration process of the graphics processing thread provided by an embodiment of the present application. Referring to Fig. 14, the calibration process specifically includes S1401 to S1404.
- the graphics processing thread calculates the timestamp of the N+Pth display cycle; the timestamp of the N+Pth display cycle is determined according to the timestamp of the Nth display cycle and the processing frame rate of the graphics processing thread.
- the graphics processing thread can process the image data to be processed according to the corresponding processing frame rate.
- the specific processing frame rate can be determined according to the thread type of the graphics processing thread and the processing power of the GPU, and is not limited here.
- the electronic device can send the timestamp to the graphics processing thread through the runtime application. Since the synchronization signal is generated once for each display cycle, the time interval between the synchronization signals is predictable. Therefore, when the graphics processing thread receives the timestamp corresponding to the Nth display cycle, it can predict the timestamp of the synchronization signal corresponding to the N+Pth display cycle. If the processing frame rate of the graphics processing thread is consistent with the display frame rate of the VR synthetic image, the above P is 1, that is, the display frame rate is the same as the processing frame rate, and the graphics processing thread performs an image data processing operation once for each display cycle.
- the timestamp corresponding to the N+Pth display cycle is:
- t(N+P) = t(N) + 1/f, where t(N+P) is the timestamp corresponding to the N+Pth display period, t(N) is the timestamp corresponding to the Nth display period, and f is the processing frame rate (equal to the display frame rate in this case, with P being 1).
- the graphics processing thread determines the processing completion time of the Nth display cycle; the processing completion time is the time corresponding to the time when the graphics processing thread completes the processing of the image data in the Nth display cycle.
- the processing trigger moment of the image processing thread is specifically the moment corresponding to switching from the sleep state to the running state. Since a certain transmission time is required for the synchronization signal to be transmitted from the kernel layer to the image processing thread in the application layer, in order to align the processing trigger moment of the graphics processing thread with the generation moment of the synchronization signal (i.e., the moment corresponding to the timestamp), the electronic device can determine the timing of switching from the sleep state to the running state (i.e., the processing trigger moment) by setting the sleep duration. After determining the timestamp of the display cycle corresponding to the next image processing, i.e., the end time of the sleep state, it is necessary to determine the start time of the sleep state.
- the condition for the graphics processing thread to enter the dormant state is that the relevant processing tasks of the thread have been completed. Therefore, when the graphics processing thread completes the image data processing corresponding to the Nth display cycle, the system time will be obtained, and the system time corresponding to the completion of the image data processing will be used as the above-mentioned processing completion time.
- the graphics processing thread calculates the sleep duration; the sleep duration is calculated based on the processing completion time and the timestamp of the N+Pth display cycle.
- when the graphics processing thread completes the image data processing task of the Nth display cycle, the graphics processing thread will enter the sleep state and set the corresponding sleep timer.
- the graphics processing thread will detect whether the count value of the sleep timer is greater than or equal to the sleep duration calculated above. If so, the image processing thread will be awakened; otherwise, if the count value of the sleep timer is less than the sleep duration, the graphics processing thread will be kept in the sleep state.
- FIG15 shows a control timing diagram of a graphics processing thread provided by an embodiment of the present application.
- the processing frame rate of the graphics processing thread is consistent with the display frame rate of the VR synthetic image, so one display cycle corresponds to one processing cycle of the graphics processing thread.
- the graphics processing thread can receive the corresponding timestamp t(N) of the current display cycle fed back by the runtime application at any time during the process of processing image data.
- the graphics processing thread can calculate the timestamp corresponding to the next display cycle according to the processing frame rate and t(N), that is, t(N+1).
- when the graphics processing thread completes processing of the image data, it will record the corresponding processing completion time t(fin), enter the dormant state, and set the dormant timer. The graphics processing thread will detect whether the count value of the dormant timer reaches t(N+1)-t(fin); if so, the graphics processing thread will be awakened. At this time, the synchronization signal corresponding to the next display cycle is generated, and the runtime application will feed back the corresponding timestamp to the graphics processing thread at a certain time, and the above steps are repeated, so that each time the graphics processing thread is awakened, the moment of awakening is aligned with the generation moment of the synchronization signal.
- the timing of waking up the graphics processing thread can be aligned with the generation time of the synchronization signal of the corresponding display cycle, thereby achieving the purpose of orderly controlling the graphics processing thread to process image data and reducing the probability of frame loss.
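- the sleep/wake calibration of a graphics processing thread can be sketched as follows; the processing body is a placeholder, and std::this_thread::sleep_for stands in for the sleep timer described above.

```cpp
#include <chrono>
#include <thread>

void processImageData() { /* placeholder: decode or render the current frame */ }

// After finishing the work of a cycle at t(fin), sleep for t(N+1) - t(fin) so
// that the wake-up (processing trigger moment) aligns with the next sync timestamp.
void graphicsThreadLoop(std::chrono::steady_clock::time_point tN,
                        std::chrono::nanoseconds displayPeriod,
                        int cycles)
{
    for (int i = 0; i < cycles; ++i) {
        processImageData();

        const auto tFin  = std::chrono::steady_clock::now();  // processing completion time
        const auto tNext = tN + (i + 1) * displayPeriod;       // predicted next timestamp
        const auto sleepDuration = tNext - tFin;               // t(N+1) - t(fin)

        if (sleepDuration > std::chrono::nanoseconds::zero()) {
            std::this_thread::sleep_for(sleepDuration);
        }
    }
}
```

- the guard against a negative sleep duration simply skips sleeping when processing overruns the cycle, which is one plausible way to handle that case.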
- the graphics processing thread includes: a rendering thread and a decoding thread, and the rendering thread may also include a primary rendering thread and a secondary rendering thread. Since video decoding, primary rendering and secondary rendering all need to occupy the GPU for processing, and the GPU can only perform one task at the same time, that is, the above-mentioned multiple graphics processing threads need to occupy the GPU to process the tasks of the corresponding threads in time-sharing. In order to effectively utilize GPU resources, corresponding processing priorities can be configured for different graphics processing threads.
- since the secondary rendering process merges the virtual synthetic image with the real environment image, it is the last image synthesis stage and its output is sent directly to the display screen for display, so its processing importance is higher and it can be configured with a higher processing priority.
- the primary rendering process mainly renders virtual objects, and virtual objects may not appear in some scenes, so its processing priority is lower than that of the secondary rendering thread; for the decoding thread, the time required for video data decoding is shorter, so its processing priority can be set higher than that of the primary rendering thread. The processing priority relationship of the above three graphics processing threads can then be: video decoding > secondary rendering > primary rendering.
- Figure 16 shows a timing diagram of different graphics processing threads occupying the GPU provided by an embodiment of the present application.
- for example, in a given display cycle, the video decoding thread decodes the video image of the third frame, the primary rendering thread renders the virtual object for the VR composite image of the third frame, and the secondary rendering thread can perform image fusion using the environmental image of the second frame generated in the previous display cycle, the video image of the second frame, and the virtual object of the second frame, that is, complete the secondary rendering operation.
- since the processing priority of the secondary rendering thread is higher than that of the primary rendering thread, if the secondary rendering thread occupies the GPU for the secondary rendering task while the primary rendering thread is processing, it will interrupt the primary rendering task being processed, and the task of the primary rendering thread may be delayed to the next display cycle.
- when the secondary rendering thread completes its processing, the primary rendering thread will re-occupy the GPU for processing, thereby delaying the task of decoding the 4th frame of video, which may cause the 4th video image frame to be lost and result in the final output VR composite image being not smooth.
- the electronic device can configure different processing time slots for different graphics processing threads in the same display cycle, and configure corresponding processing priorities for different graphics processing threads, so as to realize orderly control of each graphics processing thread to call the GPU.
- the electronic device can, at the moment when the synchronization signal corresponding to the display cycle is generated, trigger the video decoding thread and the secondary rendering thread to process their corresponding tasks, and set the priority of the video decoding thread to be greater than the priority of the secondary rendering thread; therefore, the GPU will first process the task corresponding to the video decoding thread.
- since the video decoding thread takes a shorter time, when the video decoding thread completes the decoding task, the GPU will be occupied by the secondary rendering thread to execute the corresponding secondary rendering task. After a preset interval, the primary rendering thread is triggered to process its corresponding task; since the priority of the primary rendering thread is lower than that of the secondary rendering thread, it will occupy the GPU for processing only after the secondary rendering thread completes its task. In the next display cycle, the GPU can again be occupied in turn according to the above method, thereby reducing the situations in which different graphics processing threads are interrupted during task processing, reducing the probability of frame loss, and improving the smoothness of VR synthetic images.
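- a simplified, single-threaded sketch of the time-slot arrangement within one display cycle is given below; in the actual design the three graphics processing threads time-share the GPU under the priority relationship above, so the ordering here only illustrates the intended sequence and the preset offset, with all task bodies as placeholders.

```cpp
#include <chrono>
#include <thread>

void decodeVideoFrame()   { /* short, highest-priority decode task */ }
void secondaryRendering() { /* fuse environment image, video frame and virtual object */ }
void primaryRendering()   { /* render virtual objects used in a later cycle */ }

// One display cycle: at the sync moment run decoding, then secondary rendering;
// the primary rendering task is only triggered after a preset offset, so it
// cannot interrupt the secondary rendering task.
void runOneDisplayCycle(std::chrono::steady_clock::time_point syncMoment,
                        std::chrono::milliseconds primaryRenderOffset)
{
    std::this_thread::sleep_until(syncMoment);
    decodeVideoFrame();
    secondaryRendering();

    std::this_thread::sleep_until(syncMoment + primaryRenderOffset);
    primaryRendering();
}
```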
- the runtime application in the application layer can provide an interface for all graphics processing threads to obtain timestamps.
- the interface can be directly assigned to the secondary rendering thread.
- when the primary rendering thread and the video decoding thread need to obtain the timestamp, they can obtain it from the secondary rendering thread.
- Figure 17 shows an implementation flow chart of each graphics processing thread determining the trigger moment provided by an embodiment of the present application. Referring to Figure 17, specifically, the way different threads determine the processing trigger moment is as follows:
- Step 2.1.1 The video decoding thread calculates the timestamp of the N+1th display cycle; the timestamp of the N+1th display cycle is determined based on the timestamp of the Nth display cycle and the processing frame rate of the video decoding thread; wherein the decoding frame rate of the video decoding thread is the same as the display frame rate of the VR synthetic image.
- Step 2.1.2 The video decoding thread determines the processing completion time of the Nth display cycle; the processing completion time is the time corresponding to the time when the video decoding thread completes the processing of the image data in the Nth display cycle.
- Step 2.1.3 The video decoding thread calculates the third sleep duration; the third sleep duration is calculated based on the processing completion time and the timestamp of the N+1th display cycle.
- Step 2.1.4 sets the video decoding thread to a sleep state, and wakes up the video decoding thread to decode the video data when the duration of the sleep state reaches a third sleep duration.
- Step 2.2.1 The secondary rendering thread calculates the timestamp of the N+1th display cycle; the timestamp of the N+1th display cycle is determined based on the timestamp of the Nth display cycle and the processing frame rate of the secondary rendering thread; wherein the rendering frame rate of the secondary rendering thread is the same as the display frame rate of the VR composite image.
- Step 2.2.2 The secondary rendering thread determines the processing completion time of the Nth display cycle; the processing completion time is the time corresponding to the completion of the processing of the image data by the secondary rendering thread in the Nth display cycle.
- Step 2.2.3 The secondary rendering thread calculates the second sleep duration; the second sleep duration is calculated based on the processing completion time and the timestamp of the N+1th display cycle.
- Step 2.2.4 sets the secondary rendering thread to a sleep state, and wakes up the secondary rendering thread to perform the secondary rendering when the duration of the sleep state reaches the second sleep duration.
- the preset interval duration used in the following steps can be determined based on a preset coefficient and the display frame rate F2, where the preset coefficient can be any value greater than 0 and less than 1, for example, 0.65.
- Step 2.3.1 The primary rendering thread calculates the timestamp of the N+Pth display cycle; the timestamp of the N+Pth display cycle is determined based on the timestamp of the Nth display cycle and the processing frame rate of the primary rendering thread; wherein the rendering frame rate of the primary rendering thread may be less than or equal to the display frame rate of the VR synthetic image.
- Step 2.3.2 The primary rendering thread determines the processing completion time of the Nth display cycle; the processing completion time is the time corresponding to the completion of the processing of the image data by the primary rendering thread for the Nth display cycle.
- step 2.3.3 the rendering thread calculates a first sleep duration; the first sleep duration is determined based on the timestamp of the N+Pth display cycle, the preset interval duration, and the first rendering completion time.
- the specific calculation process of the first sleep time is: t(N+P)+offset-t(fin), wherein offset is the above-mentioned preset interval time.
- Step 2.3.4 sets the primary rendering thread to a sleep state, and wakes up the primary rendering thread to decode the video data when the duration of the sleep state reaches a first sleep duration.
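- A minimal sketch of steps 2.3.1-2.3.4 under the same assumptions as the decoding sketch above; here offset is the preset interval duration and P reflects the ratio between the rendering and display frame rates, and all identifiers are illustrative:

```cpp
#include <chrono>
#include <thread>

using Clock = std::chrono::steady_clock;

// Sketch of steps 2.3.1-2.3.4: the primary rendering thread may run at a
// lower frame rate than the display, so it targets the (N+P)th cycle and is
// shifted relative to the secondary rendering thread by a preset offset.
void pace_primary_rendering(Clock::time_point vsync_timestamp_N,
                            double display_fps,
                            int P,                     // display rate / render rate
                            Clock::duration offset) {  // preset interval duration
    const auto display_period = std::chrono::duration_cast<Clock::duration>(
        std::chrono::duration<double>(1.0 / display_fps));

    // Step 2.3.1: timestamp of the (N+P)th display cycle.
    const Clock::time_point t_NP = vsync_timestamp_N + P * display_period;

    // Step 2.3.2: first rendering completion time of the Nth cycle.
    const Clock::time_point t_fin = Clock::now();

    // Step 2.3.3: first sleep duration = t(N+P) + offset - t(fin).
    const auto first_sleep_duration = (t_NP + offset) - t_fin;

    // Step 2.3.4: sleep, then wake up to render the next frame.
    if (first_sleep_duration > Clock::duration::zero()) {
        std::this_thread::sleep_for(first_sleep_duration);
    }
    // render_next_frame();  // hypothetical rendering call would run here
}
```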
- the threads are triggered to process image data at the processing triggering moments corresponding to the threads in turn to generate a VR composite image.
- the electronic device can divide the entire generation process into multiple processing stages according to the processing content and processing time of different threads when generating VR synthetic images, and each processing stage corresponds to a processing order.
- the electronic device can start each thread in sequence according to the above processing order, and trigger each thread to process image data according to the processing trigger time, so as to synthesize the corresponding VR synthetic image.
- the process of generating VR synthetic images can refer to the relevant descriptions of the above stages, which will not be repeated here.
- FIG18 shows a schematic diagram of the division of a processing order provided by an embodiment of the present application.
- the exposure stage is the first stage and can occupy two display cycles; the video decoding stage and the primary rendering stage correspond to the second stage, which can also occupy one display cycle; the secondary rendering stage requires the content output by multiple previous stages, so it is the third stage and occupies one display cycle.
- the display stage is the fourth stage, which is specifically used to display the VR composite image output by the secondary rendering thread.
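- A schematic sketch of the stage division of FIG. 18: the stages are listed with their processing order and triggered sequentially; in the real pipeline each stage would additionally wait for its own processing trigger moment, and the stage callbacks here are placeholders rather than the application's actual processing code:

```cpp
#include <cstdio>
#include <functional>
#include <string>
#include <vector>

// Schematic sketch: the generation of one VR composite image is split into
// ordered processing stages, each triggered at its own trigger moment.
struct Stage {
    std::string name;
    int order;                       // processing order within the pipeline
    std::function<void()> process;   // work done when the stage is triggered
};

int main() {
    std::vector<Stage> stages = {
        {"exposure",            1, [] { std::puts("camera exposure"); }},
        {"video decoding",      2, [] { std::puts("decode video frame"); }},
        {"primary rendering",   2, [] { std::puts("render scene"); }},
        {"secondary rendering", 3, [] { std::puts("compose VR image"); }},
        {"display",             4, [] { std::puts("send to display"); }},
    };
    // Trigger the stages in processing order; a real implementation would
    // also wait for each stage's processing trigger moment before running.
    for (const Stage& s : stages) {
        std::printf("stage %d: %s\n", s.order, s.name.c_str());
        s.process();
    }
    return 0;
}
```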
- FIG19 shows a timing diagram of each stage in the process of generating a multi-frame VR composite image by an electronic device provided by an embodiment of the present application.
- the process of displaying multi-frame VR composite images is a process of data stream processing.
- each stage can be executed in order, and the corresponding processing trigger time is synchronized with the timestamp of the synchronization signal.
- the primary rendering thread, the video decoding thread, and the secondary rendering thread can time-share the GPU in the same display cycle and have corresponding processing time slots, thereby reducing the loss of a frame of image due to task interruption.
- a display method provided by an embodiment of the present application can generate a synchronization signal in each display cycle when a first operation is received, and record the timestamp of generating the synchronization signal; send the timestamp of each display cycle to multiple threads used to synthesize VR synthetic images, and each thread can determine the processing trigger moment of the next display cycle according to the timestamp.
- since each thread can synchronize the processing trigger moment for processing image data in the next display cycle with the timestamp of the synchronization signal of the next display cycle, it can be ensured that the processing trigger moments of multiple threads are synchronized with each other; the threads are then triggered to process the image data in turn at their processing trigger moments according to the processing order corresponding to each thread, so as to generate a VR composite image.
- the processing trigger moment of each thread can be synchronized with the synchronization signal of the next display cycle, so as to achieve the synchronization of the processing trigger moments between multiple threads, thereby achieving orderly control of multiple threads to collaboratively process image data, reducing the probability of frame loss, ensuring the smoothness of the output picture, and then improving the user's immersion and viewing experience.
- Embodiment 2 describes the display apparatus corresponding to the method of Embodiment 1:
- FIG20 shows a structural block diagram of the display device provided in the embodiment of the present application. For the sake of convenience, only the part related to the embodiment of the present application is shown.
- the display device includes:
- a timestamp recording unit 201 configured to record a timestamp of a synchronization signal generated in each display period in response to a first operation
- the timestamp sending unit 202 is used to send the timestamp of each display cycle to multiple threads respectively, so that the threads calibrate the processing triggering moment of the threads in the N+Pth display cycle according to the timestamp of the Nth display cycle; the processing triggering moment of the threads is synchronized with the timestamp of the synchronization signal of the N+Pth display cycle; N and P are positive integers greater than 0; the multiple threads are used to generate VR synthetic images;
- the image synthesis unit 203 is used to trigger the threads to process image data at the processing triggering moments corresponding to the threads in sequence according to a preset processing order, so as to generate a VR synthesized image.
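- A minimal structural sketch mirroring FIG. 20, grouping the three units described above; the method bodies are placeholders and all identifiers are illustrative rather than the application's actual names:

```cpp
#include <cstdint>
#include <vector>

// Structural sketch of the display apparatus of FIG. 20.
class DisplayApparatus {
public:
    // Unit 201: record the timestamp of the synchronization signal generated
    // in each display period in response to the first operation.
    void record_timestamp(std::uint64_t vsync_timestamp_ns) {
        timestamps_.push_back(vsync_timestamp_ns);
    }

    // Unit 202: send the timestamp of each display period to the threads so
    // that they calibrate their processing trigger moments.
    void send_timestamps() { /* distribute timestamps_ to worker threads */ }

    // Unit 203: trigger the threads in the preset processing order to
    // generate the VR composite image.
    void synthesize_image() { /* trigger threads at their trigger moments */ }

private:
    std::vector<std::uint64_t> timestamps_;  // recorded vsync timestamps (ns)
};
```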
- the multiple threads include a camera thread; the processing trigger moment of the camera thread is an exposure moment; and the timestamp sending unit 202 includes:
- a first startup instruction transmission unit used for the camera thread to send a first startup instruction to the camera framework in the hardware abstraction layer; the first startup instruction includes the timestamp;
- a second startup instruction transmission unit configured to cause the camera framework to send a second startup instruction to the camera module in response to the first startup instruction
- an exposure parameter feedback unit configured for the camera module to feed back the exposure parameters of the preview image to the camera framework, wherein the preview image is obtained by the camera module based on the second start instruction, and the exposure parameters include the start exposure moment of the preview image;
- a time deviation calculation unit used for calculating the time deviation between the timestamp and the exposure start time in the camera framework
- the exposure time determination unit is used to determine the exposure time of the camera thread in the N+Pth display period according to the timestamp of the Nth display period and the time deviation.
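- A small sketch of the calibration described by these units; it assumes the exposure moment of the N+Pth period is obtained by adding P display periods to the Nth timestamp and then applying the measured deviation (the exact correction is not spelled out above), and all identifiers are illustrative:

```cpp
#include <chrono>

using Clock = std::chrono::steady_clock;

// Sketch of the exposure-time calibration: the camera framework measures the
// deviation between the synchronization-signal timestamp and the actual
// start-of-exposure moment reported by the camera module, then uses it to
// predict the exposure moment of a later display period.
Clock::time_point predict_exposure_moment(
        Clock::time_point vsync_timestamp_N,   // timestamp of the Nth cycle
        Clock::time_point exposure_start,      // reported by the camera module
        Clock::duration display_period,
        int P) {
    // Time deviation between the timestamp and the start-of-exposure moment.
    const Clock::duration deviation = exposure_start - vsync_timestamp_N;

    // Exposure moment of the (N+P)th display period: the timestamp of that
    // period (assumed here to be N plus P display periods), corrected by the
    // measured deviation.
    return vsync_timestamp_N + P * display_period + deviation;
}
```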
- the camera module includes a main camera module and at least one slave camera module; the main camera module and the slave camera module complete hardware synchronization through a second startup instruction sent by the camera framework when starting.
- the multiple threads include a graphics processing thread; and the timestamp sending unit 202 includes:
- a timestamp calculation unit used for the graphics processing thread to calculate the timestamp of the N+Pth display cycle; the timestamp of the N+Pth display cycle is determined according to the timestamp of the Nth display cycle and the processing frame rate of the graphics processing thread;
- a rendering completion time determination unit used for the graphics processing thread to determine the processing completion time of the Nth display cycle; the processing completion time is the time corresponding to the completion of the processing of the image data by the graphics processing thread in the Nth display cycle;
- a sleep duration calculation unit used for the graphics processing thread to calculate the sleep duration; the sleep duration is calculated based on the processing completion time and the timestamp of the N+Pth display cycle;
- the sleep state triggering unit is used to set the graphics processing thread to a sleep state and wake up the graphics processing thread when the duration of the sleep state reaches the sleep duration.
- the graphics processing thread includes a secondary rendering thread
- the rendering completion time determination unit comprises:
- a second rendering time determination unit used for the secondary rendering thread to determine the second rendering completion time of the Nth display cycle
- the sleep duration calculation unit comprises:
- the second sleep calculation unit is used for the secondary rendering thread to calculate the time difference between the timestamp of the N+Pth display cycle and the second rendering completion time, and use the time difference as the second sleep duration.
- the graphics processing thread includes a primary rendering thread; the time difference between the expected triggering moment of the primary rendering thread and the expected triggering moment of the secondary rendering thread is a preset interval duration;
- the rendering completion time determination unit comprises:
- a first rendering time determination unit used for the primary rendering thread to determine a first rendering completion time of an Nth display cycle
- the sleep duration calculation unit comprises:
- the first sleep calculation unit is used for calculating a first sleep duration for the primary rendering thread; the first sleep duration is determined according to the timestamp of the N+Pth display cycle, the preset interval duration and the first rendering completion time.
- the timestamp recording unit 201 includes:
- a synchronization signal generating unit configured to generate the synchronization signal of each display period based on a preset display frame rate by a processing chip
- a device node storage unit used for generating the timestamp of the synchronization signal based on the generation time of the synchronization signal, and storing the timestamp in a device node in the kernel layer;
- the time variable storage unit is used for the service process in the application framework layer to read the timestamp of the synchronization signal and store it in the time variable corresponding to the service process.
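- A heavily simplified sketch of the read path described above, in which a service process reads the recorded timestamp from a device node and keeps it in a time variable; the node path /dev/illustrative_vsync_ts and the class name are assumptions, not identifiers from the application:

```cpp
#include <cstdint>
#include <fstream>

// Sketch: a service process reads the vsync timestamp that the kernel layer
// stored in a device node and keeps it in its own time variable.
class VsyncTimestampService {
public:
    // Returns false if the (hypothetical) device node cannot be read.
    bool refresh() {
        std::ifstream node("/dev/illustrative_vsync_ts");  // assumed node
        std::uint64_t ts_ns = 0;
        if (!(node >> ts_ns)) {
            return false;
        }
        last_vsync_ns_ = ts_ns;  // the "time variable" of the service process
        return true;
    }

    std::uint64_t last_vsync_ns() const { return last_vsync_ns_; }

private:
    std::uint64_t last_vsync_ns_ = 0;
};
```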
- the synchronization signal generating unit includes:
- the first synchronization signal generating unit is used for the display processing unit (DPU) of the processing chip to generate the synchronization signal at the display frame rate.
- the synchronization signal generating unit includes:
- the second synchronization signal generating unit is used for the display driver chip to send a feedback signal to the central processing unit CPU, so that the CPU generates the synchronization signal when receiving the feedback signal; the feedback signal is generated when the display driver chip refreshes a frame of the VR composite image.
- the timestamp sending unit 202 includes:
- a runtime application reading unit used for a runtime application running in the application layer to read the timestamp recorded in the time variable; the runtime application communicates with the service process in the application framework layer through a preset interface;
- a runtime application sending unit is used for the runtime application to send the timestamp to the multiple threads.
- P is determined based on a ratio between a display period of the VR synthetic image and a processing period of the thread.
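- A small worked example of this ratio, with illustrative frame rates: if the VR composite image is displayed at 90 Hz and the primary rendering thread processes at 45 Hz, the processing period is twice the display period, so P = 2:

```cpp
#include <cstdio>

// Worked example: P is the ratio between the thread's processing period and
// the display period of the VR composite image. Rates below are illustrative.
int main() {
    const double display_fps = 90.0;          // VR composite image display rate
    const double primary_render_fps = 45.0;   // primary rendering thread rate

    const double display_period = 1.0 / display_fps;
    const double processing_period = 1.0 / primary_render_fps;
    const int P = static_cast<int>(processing_period / display_period + 0.5);

    std::printf("P = %d (the thread targets every %dth display cycle)\n", P, P);
    return 0;
}
```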
- the display device provided in the embodiment of the present application can also generate a synchronization signal in each display cycle when receiving the first operation, and record the timestamp of generating the synchronization signal; send the timestamp of each display cycle to multiple threads used to synthesize VR synthetic images, and each thread can determine the processing trigger moment of the next display cycle according to the timestamp. Since each thread can synchronize the processing trigger moment of the image data in the next display cycle with the timestamp of the synchronization signal of the next display cycle, it can ensure that the processing trigger moments of multiple threads are synchronized with each other, and then trigger the threads to process the image data in turn at the processing trigger moment according to the processing order corresponding to each thread, so as to generate a VR synthetic image.
- the processing trigger moment of each thread can be synchronized with the synchronization signal of the next display cycle, so as to achieve the synchronization of the processing trigger moments between multiple threads, thereby achieving orderly control of multiple threads to collaboratively process image data, reducing the probability of frame loss, ensuring the smoothness of the output picture, and then improving the user's immersion and viewing experience.
- FIG21 is a schematic diagram of the structure of an electronic device provided by an embodiment of the present application.
- the electronic device 21 of this embodiment includes: at least one processor 210 (only one processor is shown in FIG21 ), a memory 211, and a computer program 212 stored in the memory 211 and executable on the at least one processor 210, and when the processor 210 executes the computer program 212, the steps in any of the above-mentioned display method embodiments are implemented.
- the electronic device 21 may be a computing device such as a desktop computer, a notebook, a PDA, and a cloud server.
- the electronic device may include, but is not limited to, a processor 210 and a memory 211.
- FIG. 21 is merely an example of the electronic device 21 and does not constitute a limitation on the electronic device 21.
- the electronic device 21 may include more or fewer components than shown in the figure, or may combine certain components, or different components, and may also include, for example, input and output devices, network access devices, etc.
- the processor 210 may be a central processing unit (CPU), or other general-purpose processors, digital signal processors (DSP), application-specific integrated circuits (ASIC), field-programmable gate arrays (FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc.
- a general-purpose processor may be a microprocessor or any conventional processor, etc.
- the memory 211 may be an internal storage unit of the electronic device 21, such as a hard disk or memory of the electronic device 21. In other embodiments, the memory 211 may also be an external storage device of the electronic device 21, such as a plug-in hard disk, a smart memory card (Smart Media Card, SMC), a secure digital (Secure Digital, SD) card, a flash card (Flash Card), etc. equipped on the electronic device 21. Further, the memory 211 may also include both an internal storage unit of the electronic device 21 and an external storage device.
- the memory 211 is used to store an operating system, an application program, a boot loader (BootLoader), data, and other programs, such as the program code of the computer program. The memory 211 may also be used to temporarily store data that has been output or is to be output.
- those skilled in the art can clearly understand that, for the convenience and simplicity of description, the division into the above-mentioned functional units and modules is used only as an example for illustration.
- in practical applications, the above-mentioned functions can be allocated to different functional units and modules as needed; that is, the internal structure of the device can be divided into different functional units or modules to complete all or part of the functions described above.
- the functional units and modules in the embodiment can be integrated in a processing unit, or each unit can exist physically separately, or two or more units can be integrated in one unit.
- the above-mentioned integrated unit can be implemented in the form of hardware or in the form of software functional units.
- An embodiment of the present application also provides an electronic device, which includes: at least one processor, a memory, and a computer program stored in the memory and executable on the at least one processor, wherein the processor implements the steps of any of the above-mentioned method embodiments when executing the computer program.
- An embodiment of the present application further provides a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program, and when the computer program is executed by a processor, the steps in the above-mentioned method embodiments can be implemented.
- An embodiment of the present application further provides a computer program product.
- when the computer program product runs on a mobile terminal, the mobile terminal is enabled to implement the steps in the above-mentioned method embodiments.
- if the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
- based on this understanding, all or part of the processes in the methods of the above-mentioned embodiments of the present application can be implemented by instructing the relevant hardware through a computer program.
- the computer program can be stored in a computer-readable storage medium.
- the computer program is executed by the processor, the steps of the above-mentioned method embodiments can be implemented.
- the computer program includes computer program code, which can be in source code form, object code form, executable file or some intermediate form.
- the computer-readable medium may at least include: any entity or device capable of carrying the computer program code to the camera/electronic device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electric carrier signal, a telecommunication signal, a software distribution medium, a USB flash drive, a mobile hard disk, a magnetic disk, or an optical disk.
- in some jurisdictions, according to legislation and patent practice, computer-readable media cannot be electric carrier signals and telecommunication signals.
- the disclosed devices/network equipment and methods can be implemented in other ways.
- the device/network equipment embodiments described above are merely schematic.
- the division of the modules or units is only a logical function division. There may be other division methods in actual implementation, such as multiple units or components can be combined or integrated into another system, or some features can be ignored or not executed.
- in addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be implemented through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical, or other forms.
- the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place or distributed on multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Human Computer Interaction (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
The present application is applicable to the technical field of device control, and relates to a display method and apparatus, an electronic device, and a storage medium. The method comprises: in response to a first operation, recording a timestamp of a synchronization signal generated in each display period; sending the timestamp of each display period to a plurality of threads, so that the threads calibrate their processing trigger moments in the (N+P)th display period according to the timestamp of the Nth display period, the processing trigger moments of the threads being synchronized with the timestamp of the synchronization signal of the (N+P)th display period, and N and P being positive integers greater than 0; and, according to a preset processing order, sequentially triggering the threads to process image data at the processing trigger moments corresponding to the threads, so as to generate a VR composite image. According to the technical solution provided by the present application, the probability of frame loss can be reduced and the smoothness of the output images can be ensured, thereby increasing the user's viewing immersion and improving the user's viewing experience.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310115473.5A CN118426722A (zh) | 2023-01-31 | 2023-01-31 | 一种显示方法、装置、电子设备及存储介质 |
CN202310115473.5 | 2023-01-31 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2024159950A1 true WO2024159950A1 (fr) | 2024-08-08 |
Family
ID=92027386
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2023/139711 WO2024159950A1 (fr) | 2023-01-31 | 2023-12-19 | Procédé et appareil d'affichage, dispositif électronique et support de stockage |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN118426722A (fr) |
WO (1) | WO2024159950A1 (fr) |
- 2023
- 2023-01-31 CN CN202310115473.5A patent/CN118426722A/zh active Pending
- 2023-12-19 WO PCT/CN2023/139711 patent/WO2024159950A1/fr unknown
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180302569A1 (en) * | 2017-04-14 | 2018-10-18 | Facebook, Inc. | Three-dimensional, 360-degree virtual reality camera live preview |
CN115048012A (zh) * | 2021-09-30 | 2022-09-13 | 荣耀终端有限公司 | 数据处理方法和相关装置 |
CN114579075A (zh) * | 2022-01-30 | 2022-06-03 | 荣耀终端有限公司 | 数据处理方法和相关装置 |
Also Published As
Publication number | Publication date |
---|---|
CN118426722A (zh) | 2024-08-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020259452A1 (fr) | Procédé d'affichage plein écran pour terminal mobile et appareil | |
CN113254120B (zh) | 数据处理方法和相关装置 | |
CN114089933B (zh) | 显示参数的调整方法、电子设备、芯片及可读存储介质 | |
WO2021000881A1 (fr) | Procédé de division d'écran et dispositif électronique | |
WO2020093988A1 (fr) | Procédé de traitement d'image et dispositif électronique | |
CN114397983A (zh) | 一种应用显示方法及电子设备 | |
WO2021104485A1 (fr) | Procédé de photographie et dispositif électronique | |
CN113961157B (zh) | 显示交互系统、显示方法及设备 | |
CN115048012B (zh) | 数据处理方法和相关装置 | |
WO2023000772A1 (fr) | Procédé et appareil de commutation de mode, dispositif électronique et système de puce | |
WO2022042637A1 (fr) | Procédé de transmission de données à base de bluetooth et appareil associé | |
CN115967851A (zh) | 快速拍照方法、电子设备及计算机可读存储介质 | |
US20230335081A1 (en) | Display Synchronization Method, Electronic Device, and Readable Storage Medium | |
CN113542574A (zh) | 变焦下的拍摄预览方法、终端、存储介质及电子设备 | |
CN113438366A (zh) | 信息通知的交互方法、电子设备和存储介质 | |
WO2021204103A1 (fr) | Procédé de prévisualisation d'images, dispositif électronique et support de stockage | |
WO2024078275A1 (fr) | Appareil et procédé de traitement d'image, dispositif électronique et support de stockage | |
CN114827098A (zh) | 合拍的方法、装置、电子设备和可读存储介质 | |
CN116389884B (zh) | 缩略图显示方法及终端设备 | |
CN115904184B (zh) | 数据处理方法和相关装置 | |
CN113923372B (zh) | 曝光调整方法及相关设备 | |
WO2024159950A1 (fr) | Procédé et appareil d'affichage, dispositif électronique et support de stockage | |
CN115686403A (zh) | 显示参数的调整方法、电子设备、芯片及可读存储介质 | |
WO2024066834A9 (fr) | Procédé de commande de signal vsync, dispositif électronique, support d'enregistrement et puce | |
EP4239467A1 (fr) | Procédé et dispositif de commutation de fréquence d'images |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 23919512; Country of ref document: EP; Kind code of ref document: A1 |