
CN115904151A - Display method and device and electronic equipment - Google Patents


Info

Publication number
CN115904151A
CN115904151A
Authority
CN
China
Prior art keywords
interface content
display area
content
view
application
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211521939.3A
Other languages
Chinese (zh)
Inventor
吴志良
傅金智
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Opper Software Technology Co ltd
Original Assignee
Nanjing Opper Software Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Opper Software Technology Co ltd filed Critical Nanjing Opper Software Technology Co ltd
Priority to CN202211521939.3A priority Critical patent/CN115904151A/en
Publication of CN115904151A publication Critical patent/CN115904151A/en
Priority to PCT/CN2023/119109 priority patent/WO2024114051A1/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F9/00 Arrangements for program control, e.g. control units
    • G06F9/06 Arrangements for program control, e.g. control units using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F9/44 Arrangements for executing specific programs
    • G06F9/451 Execution arrangements for user interfaces

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses a display method and device and electronic equipment, and relates to the field of display. The method includes: in response to the electronic device entering a hovering state, displaying first interface content of a target application on a first display area of the electronic device and displaying second interface content of the target application on a second display area of the electronic device, thereby achieving a visual UI separation of the application's interface content. In this way, the interface content used to operate the target application can be displayed on the display area that is easier for the user to operate, avoiding situations such as a user's touch swinging a display area and changing the folding angle, and improving the user experience.

Description

Display method and device and electronic equipment
Technical Field
The application relates to the field of display, in particular to a display method and device and electronic equipment.
Background
At present, with the development of flexible screen technology, various folding screens have been applied to electronic devices so as to increase the area for display on the electronic devices as much as possible, and users can also perform operations of unfolding or folding at various angles on the folding screens so as to meet the screen display requirements in various scenes and obtain better visual experience.
Generally, a user folds the folding screen inwards or outwards along its foldable part as needed, so that the folding screen enters a hovering state from an unfolded state and the entire display area of the folding screen is divided into two display areas. The unfolded state can be understood as the state in which the folding screen is fully flattened; the hovering state can be understood as the state in which the folding screen is not completely folded (e.g., half folded) but is held at a certain folding angle.
However, how to perform the related display when an electronic device with a folding screen enters the hovering state from the unfolded state still needs further study.
Disclosure of Invention
The embodiment of the application provides a display method and device and electronic equipment, and aims to solve the problem of how to perform related display when the electronic equipment enters a hovering state.
In a first aspect, a display method according to the present application includes:
in response to an electronic device entering a hover state, displaying first interface content of a target application on a first display area of the electronic device and displaying second interface content of the target application on a second display area of the electronic device;
wherein the first interface content does not provide the user with a function for operating the target application, while the second interface content provides the user with a function for operating the target application; or,
the first interface content does not respond to user touch events on the first display area, while the second interface content responds to user touch events on the second display area so as to operate the target application.
As can be seen, when the electronic device enters the hovering state, the embodiment of the application divides the interface content of the target application into two types, first interface content and second interface content, and displays them on the two display areas of the electronic device (the folding screen) respectively: the first display area displays the first interface content and the second display area displays the second interface content, thereby achieving a visual UI separation of the target application's interface content.
Therefore, when the user wants to operate the target application, its interface content is classified and displayed on different display areas, so that the content used to operate the target application can be shown on the display area that is easier for the user to operate. This avoids situations such as a user's touch swinging a display area and changing the folding angle, and improves the user experience.
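The patent gives no code, but the first-aspect behaviour (on entering the hover state, split the application's UI across the two display areas, with only one of them responding to touch) can be sketched in plain Java. All class and method names below are hypothetical illustrations, not the patent's implementation or an Android API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class HoverDisplayDemo {
    /** One half of the folding screen in the hovering state. */
    static final class DisplayArea {
        final String content;           // interface content shown in this area
        final boolean respondsToTouch;  // whether touch events here operate the app
        DisplayArea(String content, boolean respondsToTouch) {
            this.content = content;
            this.respondsToTouch = respondsToTouch;
        }
    }

    /** On entering the hover state, split the target application's UI:
     *  the first area only displays content, the second area is operable. */
    static Map<String, DisplayArea> enterHoverState(String firstContent, String secondContent) {
        Map<String, DisplayArea> areas = new LinkedHashMap<>();
        areas.put("first", new DisplayArea(firstContent, false));
        areas.put("second", new DisplayArea(secondContent, true));
        return areas;
    }

    /** Dispatch a touch event: only touch-responsive areas forward it to the app. */
    static boolean dispatchTouch(Map<String, DisplayArea> areas, String areaId) {
        DisplayArea a = areas.get(areaId);
        return a != null && a.respondsToTouch;
    }

    public static void main(String[] args) {
        Map<String, DisplayArea> areas = enterHoverState("video playback view", "playback controls");
        System.out.println(dispatchTouch(areas, "first"));  // display-only area ignores the touch
        System.out.println(dispatchTouch(areas, "second")); // operable area handles the touch
    }
}
```

Under this sketch, touching the first (display-only) area cannot swing that half of the screen's state, which is the effect the paragraph above describes.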
In a second aspect, a display device of the present application includes:
the display unit is used for responding to the electronic equipment entering a hovering state, displaying first interface content of a target application on a first display area of the electronic equipment, and displaying second interface content of the target application on a second display area of the electronic equipment;
wherein the first interface content does not provide the user with a function for operating the target application, while the second interface content provides the user with a function for operating the target application; or,
the first interface content does not respond to user touch events on the first display area, while the second interface content responds to user touch events on the second display area so as to operate the target application.
In a third aspect, the present application provides an electronic device, which includes a processor, a memory, and a computer program or instructions stored in the memory; the processor executes the computer program or instructions to implement the steps of the method in the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium storing a computer program or instructions which, when executed by a processor, implement the steps of the method in the first aspect.
In a fifth aspect, the present application provides a computer program product comprising a computer program or instructions which, when executed by a processor, implement the steps of the method in the first aspect.
For the beneficial effects of the technical solutions of the second to fifth aspects, reference may be made to the technical effects of the first aspect, which are not repeated here.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings required to be used in the embodiments of the present application will be briefly described below.
Fig. 1 is a schematic structural diagram of a folded screen of an electronic device in an unfolded state and a hovering state respectively according to an embodiment of the present application;
fig. 2 is a schematic structural diagram of split screens of a folded screen of an electronic device in a hovering state according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of split screens of a folded screen of another electronic device in a hovering state according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of a display method according to an embodiment of the present application;
FIG. 5 is a block diagram of an application that may contain multiple Activities according to an embodiment of the present application;
fig. 6 is a schematic flowchart of split-screen display of first interface content and second interface content according to an embodiment of the present application;
FIG. 7 is a schematic flow chart illustrating creation of a new Surface for the first interface content according to an embodiment of the present application;
FIG. 8 is a schematic flow chart illustrating creation of a new Surface for the first interface content according to an embodiment of the present application;
FIG. 9 is a schematic flowchart of linking the first interface content's own Surface, and a newly created Surface for view content of the first interface content that has no Surface of its own, according to an embodiment of the present application;
FIG. 10 is a schematic flow chart illustrating a process of linking a Surface of a Surface View with a new Surface according to an embodiment of the present application;
FIG. 11 is a schematic flowchart illustrating a split-screen display of the first interface content and the second interface content according to another embodiment of the present application;
fig. 12 is a block diagram showing functional units of a display device according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of an electronic device according to an embodiment of the present application.
Detailed Description
In order to better understand the technical solutions of the present application, the following describes them clearly and completely in conjunction with the drawings in the embodiments of the present application. Obviously, the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art from the described embodiments without creative effort fall within the scope of protection of the present application.
It should be understood that the terms "first", "second", and the like, referred to in the embodiments of the present application, are used for distinguishing different objects, and are not used for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, software, product, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements recited, but may also include other steps or elements not expressly listed or inherent to such process, method, product, or apparatus.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
In the embodiment of the present application, "and/or" describes an association relationship of the association object, which means that there may be three relationships. For example, a and/or B, may represent the following three cases: a is present alone; both A and B are present; b is present alone. Wherein, A and B can be singular or plural.
In the embodiment of the present application, the symbol "/" may indicate that the former and latter associated objects are in an "or" relationship. In addition, the symbol "/" may also indicate a division number, i.e. perform a division operation. For example, A/B, may represent A divided by B.
In the embodiment of the present application, the symbol "*" or "●" may indicate a multiplication sign, i.e., that a multiplication operation is performed. For example, A*B or A●B may represent A multiplied by B.
In the embodiments of the present application, "at least one item" or the like means any combination of these items, including a single item or any combination of plural items; "at least one" means one or more, and "a plurality" means two or more. For example, at least one of a, b, or c may represent the following seven cases: a; b; c; a and b; a and c; b and c; a, b and c. Each of a, b, and c may be an element or a set including one or more elements.
In the embodiments of the present application, "equal to" may be used together with "greater than", in which case it applies to the technical scheme adopted when the value is greater than the reference; it may also be used together with "less than", in which case it applies to the technical scheme adopted when the value is less than the reference. When "equal to" is used with "greater than", the value is not less than the reference; when "equal to" is used with "less than", the value is not greater than the reference.
With the application of various folding screens, a user can fold the folding screen inwards or outwards along its foldable part as needed, so that the folding screen enters a hovering state from an unfolded state and the entire display area of the folding screen is divided into two display areas along the foldable part.
For example, as shown in fig. 1, the folding screen 110 is in an unfolded state and may be folded inward or outward along the folded portion 120. Fig. 1 (a) shows the unfolded state of the folding screen 110 in landscape full screen; fig. 1 (b) shows the unfolded state of the folding screen 110 in portrait full screen.
For another example, as shown in fig. 2, the folding screen 210 enters the hovering state from the unfolded state, so that its entire display area is divided into a display area 2201 and a display area 2202; an application window 2401 is displayed on the display area 2201 and an application window 2402 is displayed on the display area 2202. Fig. 2 (a) shows the folding screen 210 folded inward along the folding portion 230, not completely folded, at a folding angle between 0 and 90 degrees; fig. 2 (b) shows the folding screen 210 folded inward along the folding portion 230, not completely folded, at a folding angle between 90 and 180 degrees; fig. 2 (c) likewise shows the folding screen 210 folded inward along the folding portion 230, not completely folded, at a folding angle between 90 and 180 degrees.
Of course, the folding screen may also be folded outwards along the foldable portion in the embodiments of the present application, and this is not particularly limited.
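The states described above are distinguished by the folding angle. As a minimal illustrative sketch (the 5-degree and 175-degree thresholds are assumptions for illustration, not values from the patent), a fold-angle reading from a hinge sensor could be classified like this:

```java
public class FoldStateDemo {
    enum FoldState { FOLDED, HOVERING, UNFOLDED }

    // Hypothetical thresholds: a screen at (close to) 0 degrees is folded,
    // at (close to) 180 degrees is unfolded, and anything in between hovers.
    static FoldState classify(double angleDegrees) {
        if (angleDegrees <= 5.0)   return FoldState.FOLDED;
        if (angleDegrees >= 175.0) return FoldState.UNFOLDED;
        return FoldState.HOVERING; // not completely folded, held at some angle
    }

    public static void main(String[] args) {
        System.out.println(classify(90));  // HOVERING
        System.out.println(classify(180)); // UNFOLDED
    }
}
```

A classifier like this is what would trigger the "in response to the electronic device entering the hovering state" condition of the first aspect.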
However, when the folding screen enters the hovering state from the unfolded state, it needs to be specifically studied how the content displayed on the entire display area in the unfolded state should be displayed on the two display areas in the hovering state.
The following specifically describes technical solutions, advantageous effects, related concepts, and the like related to the embodiments of the present application.
1. Electronic device
It should be noted that the technical solutions of the embodiments of the present application can be applied to electronic devices. The electronic device is explained in detail below.
1. Type of electronic device
The electronic device according to the embodiment of the present application may be an entity/unit/module/device or the like having a communication function of transmitting and receiving.
In some possible implementations, the types of electronic devices may include a handheld device, an in-vehicle terminal, a wearable device, an Augmented Reality (AR) device, a Virtual Reality (VR) device, an Internet of Things (IOT) device, a projection device, a projector, or other devices connected to a wireless modem, and may also include a User Equipment (UE), a terminal device (terminal device), a terminal, a mobile phone (smart phone), a smart screen, a smart television, a smart watch, a notebook computer, a smart audio, a camera, a game pad, a microphone, a Station (STA), an Access Point (AP), a Mobile Station (MS), a Personal Digital Assistant (PDA), a Personal Computer (PC), or a relay device, and the like.
For example, taking the electronic device as a wearable device: a wearable device, also referred to as a smart wearable device, is a general term for everyday wearables intelligently designed and developed using wearable technology, such as smart glasses, smart gloves, smart watches, smart bracelets for monitoring various characteristics, and smart jewelry. A wearable device can be worn directly on the body or integrated into the user's clothing or accessories as a portable device. It can carry a dedicated hardware architecture as well as dedicated software for data interaction, cloud interaction, and the like. A wearable smart device may implement complete or partial functionality independently of other smart devices.
It should be noted that, in the embodiments of the present application, the specific structure of the execution subject of the display method is not particularly limited, as long as it can execute a computer program or instructions recording the method provided in the embodiments of the present application. For example, the execution subject of the method provided in the embodiment of the present application may be an electronic device, or may be a processor/apparatus/module/unit capable of invoking and executing a computer program or instructions in the electronic device, without particular limitation.
2. Hardware architecture of electronic device
In some possible implementations, the electronic device of embodiments of the present application may include at least one of a processor, a sensing component, a display component, a camera component, and an input driver, among others. The following are exemplary descriptions respectively.
1) Processor
In some possible implementations, a processor may be used to run or load an operating system, which may be any one or more computer operating systems that implement business processing via processes, for example the Linux operating system, Unix operating system, Android operating system, iOS operating system, Windows operating system, Zephyr operating system, a Real-Time Operating System (RTOS), DOS, the Mac operating system, the ThreadX operating system, an embedded operating system, the Nucleus PLUS operating system, and the like.
In some possible implementations, the processor may be considered a complete System On Chip (SOC).
In some possible implementations, a processor may include one or more processing units. For example, the processing unit may include at least one of a Central Processing Unit (CPU), an Application Processor (AP), a Micro Controller Unit (MCU), a Single Chip Microcomputer (SCM), a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a Digital Signal Processor (DSP), a Field-Programmable Gate Array (FPGA), an Application-Specific Integrated Circuit (ASIC), a baseband processor, a Neural-network Processing Unit (NPU), and the like. The different processing units may be separate devices or integrated together.
In some possible implementations, one processing unit may be a single core or multiple cores.
In some possible implementations, one processing unit may run or load one multi-core subsystem. The multi-core subsystem may be an operating system with multi-core processing capability.
In some possible implementations, the processor may also include a memory for storing computer programs or instructions.
For example, the processor may call a program stored in memory to run an operating system.
As another example, a memory in a processor may hold or cache instructions that the processor has just used or cycled through. If the processor needs to reuse the instruction or data, the instruction or data can be directly called from the memory, so that repeated access is avoided, and the waiting time of the processor is reduced to improve the system efficiency.
Also for example, a memory in a processor may store or cache data, and the data may be synchronized or transferred to other processors for execution. Wherein the memory in the processor may be a cache memory.
In some possible implementations, the processor may include one or more communication interfaces. The communication interface may include at least one of a Serial Peripheral Interface (SPI), an Inter-Integrated Circuit (I2C) interface, an Inter-IC Sound (I2S) interface, a Pulse Code Modulation (PCM) interface, a Universal Asynchronous Receiver/Transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a General-Purpose Input/Output (GPIO) interface, a Subscriber Identity Module (SIM) interface, a Universal Serial Bus (USB) interface, and the like.
2) Sensing assembly
In some possible implementations, the sensing component may be a sensor.
For example, the sensing components may include at least one of a gravity sensor, a gyroscope sensor, a magnetometer sensor, an acceleration sensor, an inertial sensor (e.g., an Inertial Measurement Unit (IMU)), a pressure sensor, a barometric pressure sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, an Ultra-Wideband (UWB) sensor, a Near-Field Communication (NFC) sensor, a laser sensor, and/or a visible light sensor, among others.
3) Display assembly
In some possible implementations, the display component may be used to display at least one of a user interface, user interface elements and features, user-selectable controls, various displayable objects, and the like.
In some possible implementations, the display component may be a display screen of the electronic device. The display screen may include a folding screen, and the folding screen may be touchable, clicked, may respond to a touch event or click event of a user, and the like.
In some possible implementations, the display component may include a display panel. The display panel may be a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), an Active-Matrix Organic Light-Emitting Diode (AMOLED), a Flexible Light-Emitting Diode (FLED), a Quantum-dot Light-Emitting Diode (QLED), or the like.
The electronic device may implement a display function through the GPU, the display component, the processor, and the like. The GPU may be configured to perform mathematical and geometric calculations and perform graphics rendering, among other things. Additionally, the GPU may be a microprocessor for image processing and may be coupled to a display assembly and a processor. The processor may include one or more GPUs that execute program instructions to generate or alter display information.
4) Camera shooting component
In some possible implementations, the camera assembly may be a camera or camera module for capturing (shooting/scanning/capturing, etc.) still/moving images or video.
In some possible implementations, the camera assembly may include a lens, a photosensitive element, and the like; the photosensitive element may be a Charge-Coupled Device (CCD) or a Complementary Metal-Oxide-Semiconductor (CMOS) phototransistor.
Thus, an object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element converts the optical signal in the optical image into an electrical signal, which is then transmitted to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP, and the DSP converts it into an image signal in a standard format such as RGB or YUV.
It should be noted that the device may implement functions of capturing (shooting/scanning, etc.) images, etc. through an ISP, a DSP, a camera component, a video codec, a GPU, a display component, a processor, etc.
In some possible implementations, the ISP may be used to process data fed back by the camera assembly. For example, when taking a picture, the shutter is opened first, then the light is transmitted to the photosensitive element of the camera component through the lens of the camera component, so that the optical signal is converted into an electrical signal, and finally the electrical signal is transmitted to the ISP through the photosensitive element to be processed into a digital image and the like.
In some possible implementations, the ISP may also perform algorithmic optimization on noise, brightness, skin color of the image.
In some possible implementations, the ISP may also optimize parameters such as exposure, color temperature, etc. of the shooting scene.
In some possible implementations, the ISP and/or DSP may be provided in the camera assembly.
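The imaging chain described above (lens, then photosensitive element, then ISP, then DSP) can be caricatured as a pipeline of stages. This toy Java sketch models only the order of the stages, with strings standing in for the real optical, electrical, and digital signals; every name here is hypothetical:

```java
public class CameraPipelineDemo {
    // Each stage is a simple string transformation standing in for real signal processing.
    static String lens(String object)         { return "optical(" + object + ")"; }
    static String photosensor(String optical) { return "electrical(" + optical + ")"; }
    static String isp(String electrical)      { return "digital(" + electrical + ")"; }
    static String dsp(String digital)         { return "rgb(" + digital + ")"; }

    // Capture runs the stages in the order the text describes:
    // lens -> photosensitive element -> ISP -> DSP.
    static String capture(String object) {
        return dsp(isp(photosensor(lens(object))));
    }

    public static void main(String[] args) {
        System.out.println(capture("scene")); // rgb(digital(electrical(optical(scene))))
    }
}
```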
5) Input driver
In some possible implementations, the input driver may be used to process various inputs from a user operating the device.
For example, when the display screen is a touch screen, the input driver may be used to detect and process various click events or touch events. A click event or touch event on the touch screen may simultaneously indicate a region of interest and start scanning an object; the object may be displayed on the touch screen as a preview of the image to be scanned, and a touch event at a particular location on the touch screen indicates the image that should be scanned.
3. Software architecture for electronic devices
In the embodiment of the present application, the software architecture of the electronic device may adopt a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. In the following, the embodiment of the present application takes an Android system with a hierarchical architecture as an example, and exemplarily illustrates a software architecture of an electronic device.
In some possible implementations, the electronic device may include at least one of a kernel layer, a system runtime layer, an application framework layer, and an application layer, among others.
The layers communicate with each other through a software interface, and the kernel layer, the system operation library layer and the application framework layer belong to an operating system space.
1) Application layer
In some possible implementations, the application layer belongs to the user space, and at least one application program (or simply "application") runs on the application layer, and the application program may be a native application program of the operating system itself or a third-party application program developed by a third-party developer.
For example, the application layer may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, messaging, video, and short messages.
2) Application framework layer
In some possible implementations, the application framework layer may provide various Application Programming Interfaces (APIs) and programming frameworks that may be used by applications that build the application layer, so that developers may also build their own applications by using these APIs.
For example, a window manager (window manager), a content provider (content providers), a view system (view system), a telephone manager (telephone manager), a resource manager, a notification manager (notification manager), a message manager, an activity manager (activity manager), a package manager (package manager), a location manager (location manager), and an NFC service, etc.
In particular, the window manager may be used to manage window programs. The window manager can obtain the size of the display screen and determine whether there is a status bar, lock the screen, capture the screen, and the like.
In particular, the content provider may be used to store and retrieve data and make the data accessible to applications. The data may include, among other things, video, images, audio, calls made and answered, browsing history and bookmarks, phone books, and the like. In addition, the content provider may enable an application to access data of another application, such as a contacts database, or to share their own data.
In particular, the view system includes visual controls, such as controls for displaying text and controls for displaying pictures. The view system may be used to build applications. A display interface may be composed of one or more views; for example, a display interface including a short-message notification icon may include a view for displaying text and a view for displaying pictures.
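As a rough illustration of the idea that a display interface is composed of one or more views, here is a minimal plain-Java stand-in for the view system. The names are hypothetical and the real Android view system is far richer; this only shows text and picture views composed into one interface:

```java
import java.util.List;

public class ViewSystemDemo {
    /** A minimal stand-in for a visual control in the view system. */
    static abstract class View { abstract String render(); }

    /** A control for displaying text. */
    static final class TextView extends View {
        final String text;
        TextView(String text) { this.text = text; }
        String render() { return "[text: " + text + "]"; }
    }

    /** A control for displaying pictures. */
    static final class ImageView extends View {
        final String source;
        ImageView(String source) { this.source = source; }
        String render() { return "[image: " + source + "]"; }
    }

    /** A display interface composed of one or more views. */
    static String renderInterface(List<View> views) {
        StringBuilder sb = new StringBuilder();
        for (View v : views) sb.append(v.render());
        return sb.toString();
    }

    public static void main(String[] args) {
        // e.g. a short-message notification: an icon view plus a text view
        List<View> notification = List.of(new ImageView("sms_icon"), new TextView("New message"));
        System.out.println(renderInterface(notification)); // [image: sms_icon][text: New message]
    }
}
```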
In particular, the phone manager is used to provide communication functions for the electronic device, such as management of call status (e.g., connected, hung up).
In particular, the resource manager may provide various resources for the application. Such as localized strings, icons, pictures, layout files, video files, etc.
Specifically, the notification manager enables applications to display notification messages in the status bar; it can be used to convey the message type of a notification, and a notification can disappear automatically after a short stay without user interaction. For example, the notification manager is used to notify of download completion, message reminders, message notifications, and the like. The notification manager may also present notifications in the form of charts or scroll-bar text in the system's top status bar, and notifications of applications running in the background may appear on the screen in the form of dialog windows. For example, a text message is prompted in the status bar, a prompt tone sounds, the electronic device vibrates, or an indicator light flashes.
Specifically, the message manager may be configured to store data of messages reported by each application program, and process the data reported by each application program.
In particular, the activity manager may be used to manage the application lifecycle and provide common navigation fallback functionality. In one possible example, the message manager may be part of the notification manager.
3) System operation library layer
In some possible implementations, the system runtime library layer may provide core feature support for the Android system through a number of C/C++ libraries. For example, the SQLite library provides database support, the OpenGL/ES library provides 3D drawing support, the Webkit library provides browser kernel support, and so on. The system runtime library layer 340 also provides the Android Runtime library (Android Runtime), which mainly provides core libraries that allow developers to write Android applications in the Java language.
4) Kernel layer
In some possible implementations, the kernel layer may provide underlying drivers for various hardware of the electronic device, such as a display driver, an audio driver, a camera driver, a bluetooth driver, a Wi-Fi driver, power management, an NFC driver, a UWB driver, and so forth.
2. Display method
1. Overview
Currently, when an electronic device with a folding screen enters the hovering state from the unfolded state, the interface content that was displayed in an application window over the entire display area of the folding screen is generally moved into the upper display area by resizing the application window to fit the upper display area, so that the interface content of the application is displayed entirely in the upper display area, while only a touch panel (Touch Panel) window is displayed in the lower display area. The touch panel window may be a window customized by the device manufacturer that maps the user's touch events to operations on the application, such as sliding left/right to fast-forward, or sliding up/down to adjust brightness/volume.
However, since the interface content of the application is displayed entirely in the upper display area, the upper display area provides the user with the complete set of functions for operating the application, while the lower display area provides only a partial and relatively limited set of functions; as a result, the user still has to touch the upper display area to operate the application fully. Because the folding screen is in a half-closed state at this time, touching the upper display area is awkward, and the user's touch may easily cause the device to exit the hovering mode, resulting in a poor user experience.
For example, in fig. 3 (a), the folding screen 310 is in an unfolded state, and interface contents of a music playing application are displayed in the application window 320 over the entire display area of the folding screen 310. The interface contents may include view contents such as comment icon 3301, share icon 3302, pause/play icon 3303, previous song icon 3304, progress bar 3305, and view contents such as text 3306 and dynamic image 3307, and the user may operate/touch the view contents such as comment icon 3301 to operate the music playing application, but the user may not operate/touch the view contents such as text 3306 to operate the music playing application. For example, the user may click on comment icon 3301 to comment on music played by the music playing application, and so on.
In fig. 3 (b), the folding screen 310 enters the hovering state from the unfolded state, so that the entire display area of the folding screen 310 is divided into the display area 3501 and the display area 3502 along the folding portion 340. The entire interface content of the music playing application is displayed in the application window 3601 on the display area 3501, while only a touch panel window is displayed in the application window 3602 on the display area 3502; the user can slide left/right on the touch panel window to fast-forward the music played by the music playing application, and slide up/down on it to adjust the volume of that music.
At this time, when the user needs to comment on and share the music played by the music playing application, the user still needs to touch the display area 3501. Since the folding screen 310 is in the half-closed state, the display area 3501 is tilted, so touching it may be inconvenient, and the user's touch may easily cause the display area 3501 and the display area 3502 to swing and change the folding angle, thereby affecting the user's experience with the folding screen 310.
Based on this, an embodiment of the present application provides a display method to solve the problem of inconvenient user operation, avoid situations in which a display area swings and changes the folding angle due to the user's touch, and improve the user's experience with the folding screen.
A display method according to an embodiment of the present application will be described below.
As shown in fig. 4, fig. 4 is a flowchart illustrating a display method according to an embodiment of the present application, where the method may be applied to an electronic device, and may include the following steps:
S410: in response to the electronic device entering the hovering state, displaying first interface content of a target application in a first display area, and displaying second interface content of the target application in a second display area, where the first interface content does not provide the user with a function for operating the target application and the second interface content provides the user with a function for operating the target application; or the first interface content does not respond to a touch event of the user on the first display area, and the second interface content responds to a touch event of the user on the second display area to operate the target application.
As can be seen, when the electronic device enters the hovering state, the embodiment of the present application may divide the interface content of the target application into two types, namely first interface content and second interface content, and display them in the two display areas respectively, that is, the first display area displays the first interface content and the second display area displays the second interface content, so that the interface content of the target application is visually separated at the user interface (UI) level.
Therefore, because the interface content of the target application is classified and displayed in different display areas, the interface content used to operate the target application can be shown in the display area that is easier for the user to operate when the user wants to operate the target application; this avoids situations in which a display area swings and changes the folding angle due to the user's touch, and improves the user experience.
For example, in (c) of fig. 3, the folding screen 310 enters the hovering state from the unfolded state, so that the entire display area of the folding screen 310 is divided into the display area 3501 and the display area 3502 along the folding portion 340.
Then, the interface contents of the music playing application are classified: view contents such as comment icon 3301, share icon 3302, pause/play icon 3303, and previous song icon 3304 form one type, while view contents such as text 3306 and dynamic image 3307 form another type. The user can operate the music playing application by operating/touching view contents such as comment icon 3301 and share icon 3302, but cannot operate the music playing application by operating/touching view contents such as text 3306.
Finally, view contents such as a comment icon 3301, a share icon 3302, and the like are displayed in the application window 3602 on the display area 3502, and view contents such as text 3306, a moving image 3307, and the like are displayed in the application window 3601 on the display area 3501.
Thus, when the user needs to comment on and share the music played by the music playing application, the user only needs to touch the display area 3502. Compared with (b) of fig. 3, where the user needs to touch the display area 3501, touching the display area 3502 is more convenient and easier for the user, the folding angle of the folding screen 310 can be prevented from changing, and the user experience is improved.
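The classification and placement described above can be sketched as a toy model. This is illustrative only: the class and function names below are assumptions for exposition and are not actual Android APIs.

```python
# Toy model of the split-display logic: non-operable view contents go to
# the first (tilted) display area, operable view contents go to the
# second (easier-to-reach) display area.
from dataclasses import dataclass

@dataclass
class ViewContent:
    name: str
    operable: bool  # True if touching it can operate the target application

def split_interface(contents):
    """Partition interface content into (first area, second area)."""
    first_area = [v for v in contents if not v.operable]
    second_area = [v for v in contents if v.operable]
    return first_area, second_area

contents = [
    ViewContent("comment_icon", True),
    ViewContent("share_icon", True),
    ViewContent("lyrics_text", False),
    ViewContent("album_animation", False),
]
first, second = split_interface(contents)
```

In the music-player example, `split_interface` would send the comment and share icons to the lower display area while the lyrics and animation stay in the upper display area.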
2. Detailed description of implementations
Some implementations related to fig. 4 described above are specifically described below.
(1) Entire display area, first display area and second display area of electronic device in unfolded state
1) Overview
It should be noted that when the folding screen of the electronic device is not folded, it is in an unfolded state in which it lies completely flat. At this time, the electronic device is in the unfolded state, and the folding screen can display content over its entire display area, as shown in fig. 1.
When the user folds the folding screen inwards or outwards along the folding portion as needed, the entire display area of the folding screen is divided into two display areas. For ease of distinction, the two display areas are referred to as the "first display area" and the "second display area", respectively.
In some possible implementations, the electronic device (the folded screen) enters the hover state from the unfolded state such that the entire display area of the folded screen is divided into the first display area and the second display area.
In some possible implementations, the electronic device (folded screen) enters a hovering state from an unfolded state such that an entire display area of the folded screen is divided into a first display area and a second display area along a folding location.
Specifically, when the first display area is a left display area (upper display area) of the folded screen, the second display area is a right display area (lower display area) of the folded screen, which is not particularly limited.
Specifically, when the first display area is a right display area (lower display area) of the folding screen, the second display area is a left display area (upper display area) of the folding screen, which is not particularly limited.
Specifically, the first display region and the second display region may have the same area. That is, the entire display area of the folded screen is divided into two halves of the same area. In this way, the first display area and the second display area may completely overlap.
Specifically, the first display region and the second display region may not have the same area. That is, the entire display region of the folded screen is divided into two display regions of different areas, which is not particularly limited.
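The division into equal or unequal areas can be sketched as a minimal model. The coordinate convention (x, y, width, height) and the pixel values are illustrative assumptions, not values from the embodiment.

```python
# Minimal sketch of dividing the full display area along the fold into
# a first area and a second area.

def split_along_fold(width, height, fold_y):
    """Return (first_area, second_area) rectangles split at fold_y."""
    first = (0, 0, width, fold_y)
    second = (0, fold_y, width, height - fold_y)
    return first, second

# Equal halves (fold in the middle): the two areas have the same size.
equal = split_along_fold(1080, 2400, 1200)
# Unequal division: the two areas have different heights.
unequal = split_along_fold(1080, 2400, 1000)
```

When the fold lies at the midpoint, the two rectangles have identical dimensions, which corresponds to the case where the first display area and the second display area can completely overlap when folded.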
(2) Interface content for a target application
It should be noted that the electronic device with the folding screen may run an application, so that the folding screen may display the interface content of the application. The following is a detailed description.
1) Target application
An application program may be referred to simply as an application (application). An application program may refer to a program that runs on an operating system, runs in the application layer, interacts with the user, and has a visual user interface or application window, in order to complete one or more specific tasks or provide certain functions.
For example, the application may run on the application layer in "3, software architecture of electronic device" described above.
It should be noted that the "target application" in the embodiment of the present application is simply the application that is run when the electronic device (folding screen) enters the hovering state; the term is used mainly for distinction, and the target application is essentially an ordinary application.
In some possible implementations, the target application may be of a different type, such as a video-like application, a music-like application, a live-like application, and so on.
In some possible implementations, the type of application displayed with the folded screen in the unfolded state may be different from the type of target application displayed with the folded screen in the hovering state.
For example, an application displayed in the unfolded state of the folded screen is a video-class application, and an application displayed in the state where the folded screen enters the hovering state from the unfolded state is a music-class application.
In some possible implementations, the type of application displayed in the unfolded state of the folded screen may be the same as the type of target application displayed in the hovering state of the folded screen. That is, the application displayed in the unfolded state of the folded screen and the target application displayed in the hovering state of the folded screen are the same application.
2) Application windows
It should be noted that, when an application is run on an electronic device with a folding screen, an application window corresponding to the application is usually displayed on a display area, and interface content of the application may be displayed in the application window.
In particular, the size of the application window may occupy the entire display area of the folding screen.
For example, the size of the application window may occupy the entire display area of the folding screen 110 shown in fig. 1.
For another example, in FIG. 2, the size of the application window 2401 may occupy the entire display area 2201; the size of the application window 2402 may occupy the entire display area 2202, which is not particularly limited.
Specifically, the application window is displayed on the display area of the folding screen in a floating manner.
3) Interface content
It should be noted that the interface content of the target application may be understood as all the elements or contents that form the views/interfaces of the target application after the target application is run; the user can view them visually, and can also operate the application by touching the corresponding interface content.
Specifically, the interface content may be composed of a plurality of view contents, and different view contents may provide different functions, or different view contents may present different visual effects.
4) Classification of interface content
(1) Classifying according to whether target application can be operated
It should be noted that, the interface content of the target application may be classified according to whether the target application can be operated or not. In this regard, the types of interface content may include non-operation type interface content (alternatively referred to as "non-operation type interface content") and operation type interface content (alternatively referred to as "operation type interface content").
Specifically, the non-operation interface content does not provide the user with a function for operating the target application. Therefore, the user cannot operate/touch such view content to operate the target application.
For example, in fig. 3, the non-operation type interface content may include text 3306, dynamic images 3307, etc., and the user cannot operate/touch the view content to operate the music playing application.
Specifically, the operation interface content provides the user with a function for operating the target application. Therefore, the user can operate/touch such view content to operate the target application.
For example, in fig. 3, the operation-type interface content may include a comment icon 3301, a share icon 3302, a pause/play icon 3303, a previous song icon 3304, a progress bar 3305, and the like, and the user may operate/touch the operation-type interface content to comment on, share, pause/play, switch, fast forward, and the like, music played by the music playing application.
(2) Classifying according to whether touch event to the folding screen can be responded
It should be noted that the folding screen in the embodiment of the present application may be touch-controlled (clicked/touched, etc.). Accordingly, the user can touch (click/touch, etc.) the folding screen to trigger a corresponding event. For example, the user may slide at a specific position in the display area of the folding screen to adjust the display brightness of the folding screen.
Based on this, the interface content of the target application can be classified according to whether the touch event of the folding screen can be responded or not. In this regard, the types of interface content may include non-responsive type interface content (alternatively referred to as "non-responsive type interface content") and responsive type interface content (alternatively referred to as "responsive type interface content").
Specifically, the non-responsive interface content does not respond to the user's touch events on the folding screen. Therefore, when the user touches the folding screen and a corresponding touch event is generated, such interface content does not respond to the touch event, and thus no corresponding operation can be performed on the target application.
For example, in fig. 3, the non-responsive interface content may include text 3306, dynamic images 3307, and the like, but the interface content cannot respond to the user's touch on the folding screen 310.
Specifically, the responsive interface content responds to the user's touch events on the folding screen. Therefore, when the user touches the folding screen and a corresponding touch event is generated, such interface content responds to the touch event, so as to perform a corresponding operation on the application.
For example, in fig. 3, the responsive interface content may include comment icon 3301, share icon 3302, pause/play icon 3303, previous song icon 3304, progress bar 3305, etc., and such interface content may be responsive to a user's touch on the folding screen 310 to perform operations such as comment, share, pause/play, switch, fast forward, etc. on music played by the music playing application.
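The response/non-response distinction can be modeled as a toy dispatch function. This is illustrative only and is not the Android input pipeline; the dictionary keys are assumptions.

```python
# Toy dispatch model: responsive content handles a touch event,
# non-responsive content ignores it.

def dispatch_touch(view, event):
    """Return a result string if the view responds, else None."""
    if view["responsive"]:
        return f"{view['name']} handled {event}"
    return None  # non-responsive content performs no operation

result = dispatch_touch({"name": "pause_icon", "responsive": True}, "tap")
ignored = dispatch_touch({"name": "lyrics_text", "responsive": False}, "tap")
```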
5) First interface content and second interface content
(1) Overview
In combination with the content in the above "4) classification of interface content", for convenience of distinction and description, the embodiments of the present application introduce the first interface content and the second interface content.
Specifically, the first interface content may be non-operation interface content, and the second interface content may be operation interface content. In this regard, the first interface content does not provide the user with a function for operating the target application, while the second interface content provides the user with a function for operating the target application.
Specifically, the first interface content may be non-responsive interface content, and the second interface content may be responsive interface content. In this regard, the first interface content does not respond to the user's touch events on the folding screen, while the second interface content responds to the user's touch events on the folding screen to operate the application.
In some possible implementations, the first interface content does not include a touchable control and the second interface content includes a touchable control. Wherein the touchable control can have functionality to provide the user with an operation target application.
(2) Composition of
It should be noted that both the first interface content and the second interface content may be composed of one or more types of view content.
The following describes the composition of the first interface content and the second interface content specifically by taking the Android system as an example.
Activity, canvas (Surface), canvas view (SurfaceView), texture view (TextureView)
In the Android system, an Activity is an application component defined by Android. It is a component for displaying a user interface and may be configured to provide a screen with which the user can operate and interact in order to perform a single focused task; an application may include multiple Activities.
Specifically, the views in an Activity may include TextureView, button (Button), image view (ImageView), view (View), and the like. Among them, TextureView is a view onto which a content stream can be output as an external texture; it requires a hardware-accelerated layer.
Specifically, Surface means "surface"/"canvas". It may be an object into which a graphics buffer (graphic buffer) is drawn, may be a graphics buffer in memory, and may be a handle to a raw buffer managed by the screen content compositor (screen compositor). Therefore, the content in the raw buffer can be obtained through the Surface (because the Surface is a handle), and the raw buffer can be used to store data related to the interface content drawn in the application window, and the like.
Because the interface content of the target application is presented through the Surface, and the interface content of the target application can be composed of various view contents, various views in the interface content need to be presented through the Surface.
Specifically, the type of the interface content of the target application may include View content of SurfaceView, view content of various views in Activity, and the like, such as View content of TextureView, view content of Button, view content of ImageView, view content of View.
The SurfaceView has its own independent Surface and layer (Layer), and the SurfaceView can control the format, size, drawing position, and the like of its Surface. Accordingly, the view content of a SurfaceView is presented through its own Surface. In addition, the rendering of the view content of a SurfaceView may be placed in a separate thread rather than the application's main thread.
All views in an Activity share the same Surface. Accordingly, the view content of each type of view in an Activity is presented through that same Surface. In addition, the rendering of the view content of the views in an Activity can be placed in the main thread of the application.
For example, as shown in fig. 5, an application may contain multiple Activities, such as Activity 5101, Activity 5102, and Activity 5103, and the views in each Activity share the same Surface, unlike a SurfaceView, which has its own Surface.
Composition of first interface content and composition of second interface content
It should be noted that the first interface content and the second interface content may be composed of one or more view contents different from each other.
For example, in conjunction with the above, the first interface content may include at least one of: view content of SurfaceView, view content of TextureView in Activity, etc.; and the second interface content may include at least one of: view content of Button in Activity, imageView in Activity, view content of View in Activity, and the like.
Of course, the first interface content and the second interface content may also include other view content, which is mainly divided according to whether the application can be operated or whether the touch event of the folding screen can be responded, and this is not particularly limited.
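The composition described above can be sketched by partitioning view contents according to view type. The type lists below are assumptions drawn from the examples in the text, not an exhaustive rule.

```python
# Sketch of composing the first and second interface content from view
# types, following the example division in the text.

FIRST_TYPES = {"SurfaceView", "TextureView"}    # display-oriented view content
SECOND_TYPES = {"Button", "ImageView", "View"}  # touchable controls

def compose(views):
    """views: list of (name, view_type) pairs."""
    first = [v for v in views if v[1] in FIRST_TYPES]
    second = [v for v in views if v[1] in SECOND_TYPES]
    return first, second

views = [("video", "SurfaceView"), ("play", "Button"), ("cover", "ImageView")]
first, second = compose(views)
```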
(3) How to display the first interface content and the second interface content in a split screen manner
It should be noted that when the user folds the folding screen inwards or outwards along the folding portion as needed, the folding screen enters the hovering state from the unfolded state and is divided into a first display area and a second display area. How to display the first interface content in the first display area and the second interface content in the second display area may be performed through the following steps, as shown in fig. 6:
Step 1: determining whether to create a new Surface for the first interface content
It should be noted that, when the first interface content needs to be displayed in the first display area, it is generally necessary to ensure that the first interface content has a Surface; that Surface is then used to present the first interface content in the first display area.
However, with reference to "5) First interface content and second interface content" above, since the first interface content may be composed of various view contents, it may contain view content with an independent Surface (for example, the view content of a SurfaceView) and/or view content without an independent Surface. Therefore, the embodiment of the present application first needs to determine whether to create a new Surface for the first interface content.
If the first interface content contains only view content without a self-contained Surface, a new Surface needs to be created for the first interface content, and step 2 is executed.
If the first interface content contains both view content with a self-contained Surface and view content without a self-contained Surface, a new Surface needs to be created for the view content without a self-contained Surface, and step 3 is executed.
If the first interface content contains only view content with a self-contained Surface, no new Surface needs to be created for the first interface content.
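The three-way decision in step 1 can be sketched as follows. The returned labels are illustrative names for the branches, not API values.

```python
# Sketch of the step-1 decision: whether to create a new Surface for the
# first interface content.

def surface_plan(has_own_surface, lacks_own_surface):
    """has_own_surface: the content contains view content with a
    self-contained Surface (e.g. SurfaceView); lacks_own_surface: it
    contains view content without one (e.g. TextureView in an Activity)."""
    if lacks_own_surface and has_own_surface:
        return "create new Surface and link"   # go to step 3
    if lacks_own_surface:
        return "create new Surface"            # go to step 2
    return "reuse self-contained Surface"      # no new Surface needed

plan_mixed = surface_plan(True, True)
plan_none = surface_plan(False, True)
plan_own = surface_plan(True, False)
```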
Step 2: creating a new Surface for the first interface content
It should be noted that, if the first interface content contains only view content without a self-contained Surface, the embodiment of the present application may create a new Surface for the first interface content. In this way, the Surface used to present the first interface content in the first display area is obtained, so that the first interface content can be presented on the newly created Surface.
For example, take the case where the interface content of the target application is the view content of the views in an Activity and the first interface content is the view content of a TextureView in the Activity, as shown in fig. 7. Because the views in an Activity share the same Surface, the view content of the TextureView does not have a self-contained Surface; therefore, a new Surface needs to be created first, and the view content of the TextureView is then presented on the new Surface.
In addition, in the process of creating a new Surface, the embodiment of the present application first creates a new canvas control view host (SurfaceControlViewHost), which holds an independent layer and has a child node that may be named container view (ContainerView); the view content of the TextureView is then drawn onto the ContainerView, so that the view content of the TextureView is presented on the new Surface.
For example, as shown in fig. 8, a new SurfaceControlViewHost is first created for the TextureView under the Activity's SurfaceControl. The SurfaceControlViewHost has two descendant nodes, a root canvas control (RootSurfaceControl) and the ContainerView; that is, the parent of the ContainerView is the RootSurfaceControl, and the parent of the RootSurfaceControl is the SurfaceControlViewHost. Then, the view content of the TextureView is drawn onto the ContainerView, so that the parent of the SurfaceControl of the TextureView becomes the RootSurfaceControl, thereby presenting the view content of the TextureView on the new Surface.
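The hierarchy described above can be modeled as a toy tree. The node names mirror the text; this is a sketch, not the Android SurfaceControlViewHost API.

```python
# Toy tree model of the SurfaceControlViewHost hierarchy:
# SurfaceControlViewHost -> RootSurfaceControl -> ContainerView,
# with the TextureView's SurfaceControl parented to the RootSurfaceControl.

class Node:
    def __init__(self, name, parent=None):
        self.name, self.children = name, []
        self.parent = parent
        if parent is not None:
            parent.children.append(self)

host = Node("SurfaceControlViewHost")
root = Node("RootSurfaceControl", parent=host)
container = Node("ContainerView", parent=root)
# Drawing the TextureView onto the ContainerView makes the TextureView's
# SurfaceControl a child of the RootSurfaceControl:
texture = Node("TextureView.SurfaceControl", parent=root)
```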
Step 3: linking the self-contained Surface of the first interface content with the Surface newly created for the view content of the first interface content that lacks a self-contained Surface
It should be noted that, if the first interface content contains both view content without a self-contained Surface and view content with a self-contained Surface, the embodiment of the present application needs to create a new Surface for the view content without a self-contained Surface, present that view content on the newly created Surface, and finally link the self-contained Surface of the first interface content with the newly created Surface. In this way, the Surface used to present the first interface content in the first display area is obtained.
For example, take the case where the interface content of the application includes the view content of the views in an Activity and the view content of a SurfaceView, and the first interface content includes the view content of a TextureView in the Activity and the view content of the SurfaceView, as shown in fig. 9. Because the views in an Activity share the same Surface, the view content of the TextureView does not have a self-contained Surface; therefore, a new Surface needs to be created first, the view content of the TextureView is then presented on the new Surface, and finally the self-contained Surface of the SurfaceView is linked with the new Surface.
In addition, in the process of creating a new Surface, the embodiment of the present application first creates a new SurfaceControlViewHost, which holds an independent layer and has a child node, i.e., a ContainerView; the view content of the TextureView is then drawn onto the ContainerView, so that the view content of the TextureView is presented on the new Surface.
In some possible implementations, linking the self-contained Surface of the SurfaceView with the new Surface may include re-parenting (reparent) the SurfaceControl of the SurfaceView's Surface to the SurfaceControl of the new Surface.
For example, as shown in fig. 10, the parent of the SurfaceControl of the SurfaceView's self-contained Surface is the Activity SurfaceControl, where the Activity SurfaceControl is the SurfaceControl of the interface content. Re-parenting the SurfaceControl of the SurfaceView to the SurfaceControl of the new Surface therefore links the Surface of the SurfaceView with the new Surface.
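The re-parenting step above can be sketched as a small layer-tree model. This is a simplified stand-in written in Python, not the actual Android framework API: the `SurfaceControl` class and its `reparent` method here only mimic the effect that a reparent operation has on the layer tree of fig. 10.

```python
class SurfaceControl:
    """Simplified stand-in for a SurfaceControl: a node in a layer tree."""

    def __init__(self, name, parent=None):
        self.name = name
        self.parent = None
        self.children = []
        if parent is not None:
            self.reparent(parent)

    def reparent(self, new_parent):
        # Detach from the old parent, then attach under the new one,
        # mirroring what a reparent operation does to the layer tree.
        if self.parent is not None:
            self.parent.children.remove(self)
        self.parent = new_parent
        new_parent.children.append(self)


# Initially the SurfaceView's SurfaceControl hangs under the Activity's (fig. 10).
activity_sc = SurfaceControl("ActivitySurfaceControl")
surface_view_sc = SurfaceControl("SurfaceViewSurfaceControl", parent=activity_sc)

# Step 3: link the SurfaceView's Surface with the new Surface by re-parenting.
new_surface_sc = SurfaceControl("NewSurfaceControl")
surface_view_sc.reparent(new_surface_sc)

print(surface_view_sc.parent.name)  # NewSurfaceControl
print(len(activity_sc.children))    # 0
```

After the re-parent, the SurfaceView's node is no longer a child of the Activity's SurfaceControl; it hangs under the new Surface's SurfaceControl, which is the linking described in this step.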
As can be seen from the foregoing steps 1 to 3, the Surface for presenting the first interface content on the first display area may be one of the following:
the self-contained Surface of the first interface content; a Surface newly created for the first interface content; or a Surface obtained by linking the self-contained Surface of the first interface content with the Surface newly created for the view content of the first interface content that has no self-contained Surface.
For example, if the first interface content includes the view content of the SurfaceView, the Surface used for presenting the view content of the SurfaceView on the first display area may be the self-contained Surface of the SurfaceView.
For another example, if the first interface content includes the view content of the TextureView in the Activity, the Surface for presenting the view content of the TextureView on the first display area may be the Surface newly created for the TextureView.
For another example, if the first interface content includes the view content of the SurfaceView and the view content of the TextureView in the Activity, the Surface for presenting the first interface content on the first display area may be obtained by linking the Surface of the SurfaceView with the Surface of the TextureView.
In addition, as can be seen from the above steps 1 to 3, the SurfaceControl of the Surface for presenting the first interface content on the first display area may be a child node under the SurfaceControl for presenting the interface content on the entire display area of the electronic device (the folding screen) in the unfolded state, or may be newly created.
For example, if the first interface content includes the view content of the SurfaceView, the SurfaceControl of the Surface for presenting the view content of the SurfaceView on the first display area may be a child node under the SurfaceControl of the interface content, as shown in fig. 10.
For another example, if the first interface content includes the view content of the TextureView in the Activity, the SurfaceControl of the Surface for presenting the view content of the TextureView on the first display area may be newly created, as shown in fig. 8.
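The three cases above can be summarized in a small decision function. This is a simplified illustrative model, not framework code; the return strings are labels for the three cases listed after steps 1 to 3:

```python
def surface_for_first_display(has_surface_view, has_texture_view):
    """Return which Surface presents the first interface content on the
    first display area, per the three cases above (simplified model)."""
    if has_surface_view and has_texture_view:
        # Self-contained Surface of the SurfaceView linked with the Surface
        # newly created for the TextureView's view content.
        return "linked"
    if has_surface_view:
        # The SurfaceView already owns an independent Surface.
        return "self-contained"
    if has_texture_view:
        # The TextureView shares the Activity's Surface, so a new one is created.
        return "newly-created"
    return None


print(surface_for_first_display(True, False))   # self-contained
print(surface_for_first_display(False, True))   # newly-created
print(surface_for_first_display(True, True))    # linked
```

The point of the case split is that only view content without a self-contained Surface forces the creation (and possible linking) of a new Surface.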
Step 4: create, on the first display area, a new application window for displaying the first interface content
It should be noted that, in combination with the content in "2) application window", the embodiment of the present application creates a new application window for the application on the first display area, and the first interface content is then displayed in the newly created application window. The newly created application window may thus be referred to as the application window for displaying the first interface content on the first display area.
That is, the application window for displaying the first interface content on the first display region may be newly created.
In addition, once the new application window has been created, the embodiment of the present application also creates the SurfaceControl of the newly created application window.
Step 5: link the Surface for presenting the first interface content on the first display area to the newly created application window
It should be noted that, in order to ensure that the first interface content is displayed in the newly created application window, the embodiment of the present application links the obtained Surface for presenting the first interface content on the first display area to the newly created application window.
Specifically, the linking may be implemented by the following steps:
Re-parent the SurfaceControl of the Surface for presenting the first interface content on the first display area to the SurfaceControl of the newly created application window, i.e., the SurfaceControl of the application window for displaying the first interface content on the first display area.
At this time, the parent of the SurfaceControl of the Surface for presenting the first interface content on the first display area is the SurfaceControl of the created application window.
In this way, the embodiment of the present application links the Surface for presenting the first interface content on the first display area to the newly created application window by re-parenting.
Step 6: the window size of the application window used for displaying the interface content on the whole display area of the folding screen is adjusted to obtain the application window used for displaying the second interface content on the second display area
It should be noted that, in combination with the content in "2) application window", an application window needs to be provided on the second display area so that the second interface content can be displayed in it. The embodiment of the present application therefore resizes the application window used for displaying the interface content on the entire display area of the folding screen, to obtain the application window used for displaying the second interface content on the second display area.
That is, the application window for displaying the second interface content on the second display area may be obtained by performing window size adjustment on the application window for displaying the interface content on the entire display area of the electronic device (folding screen) in the unfolded state.
Therefore, the application window used for displaying the second interface content on the second display area can be obtained quickly through a window resize alone, which improves display efficiency.
For example, following on from fig. 9, fig. 11 shows the application window 1120 on the entire display area of the folding screen 1110 displaying the interface content of an application while the folding screen 1110 is in the unfolded state. The interface content includes the view content of each view in the Activity and the view content of the SurfaceView, and the first interface content includes the view content of the TextureView in the Activity and the view content of the SurfaceView.
When the folded screen 1110 enters the hovering state from the unfolded state such that the entire display area of the folded screen 1110 is divided into the first display area 1130 and the second display area 1140 along the folding portion, a newly created application window 1150 is displayed on the first display area 1130, and the application window 1160 resulting from the window size adjustment of the application window 1120 is displayed on the second display area 1140.
Then, the Surface obtained by linking the Surface of the SurfaceView with the new Surface is linked to the application window 1150, while the Surface shared by the view content of each view in the Activity is kept in the application window 1160. Finally, the view content of the SurfaceView and the view content of the TextureView are displayed in the application window 1150, while the view content of the views in the Activity other than the TextureView is displayed in the application window 1160.
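The end-to-end behavior of steps 1 to 6 in the fig. 11 example can be sketched as a simple partition. This is an illustrative Python model under the assumption that the first display area receives the SurfaceView/TextureView content and the second display area receives the remaining Activity views; the window labels are hypothetical, not framework objects:

```python
FIRST_AREA_VIEW_TYPES = {"SurfaceView", "TextureView"}


def split_for_hover(view_types):
    """Partition the application's view content between a newly created
    window on the first display area and the resized original window on
    the second display area (simplified model of steps 1-6; view-type
    names follow the fig. 11 example)."""
    first = [v for v in view_types if v in FIRST_AREA_VIEW_TYPES]
    second = [v for v in view_types if v not in FIRST_AREA_VIEW_TYPES]
    return (
        {"window": "newly created", "views": first},
        {"window": "resized original", "views": second},
    )


first_area, second_area = split_for_hover(
    ["TextView", "Button", "SurfaceView", "TextureView"])
print(first_area["views"])   # ['SurfaceView', 'TextureView']
print(second_area["views"])  # ['TextView', 'Button']
```

As in fig. 11, the first display area's window is new while the second display area's window is the original window after a resize, so only one window creation is needed.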
In summary, the Surface for presenting the first interface content on the first display area is different from the Surface for presenting the second interface content on the second display area, as shown in fig. 11.
Specifically, the Surface for presenting the second interface content on the second display area may be the Surface for presenting the interface content on the entire display area of the folding screen, as shown in fig. 11.
3. Exemplary description of a display device
1. Description of the invention
The above description has introduced the solution of the embodiment of the present application mainly from the perspective of the method-side implementation process. It will be appreciated that the apparatus, in order to carry out the above-described functions, comprises corresponding hardware structures and/or software modules for performing the respective functions. Those of skill in the art would appreciate that the various illustrative methods, functions, modules, elements, or steps described in connection with the embodiments provided herein may be implemented as hardware or in combination with computer software. Whether a method, function, module, unit or step is performed by hardware or computer software driven hardware depends upon the particular application and design constraints imposed on the solution. A person skilled in the art may use different methods to implement the described methods, functions, modules, units or steps for each specific application, but such implementation should not be considered beyond the scope of the present application.
The embodiment of the application can divide the functional units/modules of the electronic equipment according to the method example. For example, each functional unit/module may be divided for each function, or two or more functions may be integrated into one functional unit/module. The integrated functional units/modules may be implemented in a hardware manner or a software program manner. It should be noted that, in the embodiment of the present application, the division of the functional units/modules is schematic, and only one logical function division is used, and there may be another division manner in actual implementation.
In the case of using an integrated unit, fig. 12 is a functional unit composition block diagram of a display device according to an embodiment of the present application. The display device 1200 specifically includes: a display unit 1210.
It should be noted that the display unit 1210 may be a module unit for processing or displaying signals, data, information, images, and the like, and is not limited in particular.
In some possible implementations, the display unit may be integrated in the display assembly.
In some possible implementations, the display unit 1210 may be integrated in one unit.
For example, the display unit 1210 may be integrated in the processing unit.
It should be noted that the processing unit may be a processor or a controller, and for example, may be a Central Processing Unit (CPU), a general purpose processor, a Digital Signal Processor (DSP), an application-specific integrated circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. Which may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the disclosure herein. A processing unit may also be a combination of computing functions, e.g., comprising one or more microprocessors in combination, a DSP and a microprocessor in combination, or the like.
In some possible implementations, the display device 1200 may also include a storage unit to store computer programs or instructions for execution by the display device 1200.
For example, the storage unit may be a memory.
In some possible implementations, the display device 1200 may be a chip/chip module/processor/hardware or the like.
In particular implementation, the display device 1200 is configured to perform the steps as described in the above method embodiments. The details will be described below.
A display unit 1210 for displaying a first interface content of a target application on a first display area of the electronic device and a second interface content of the target application on a second display area of the electronic device in response to the electronic device entering a hovering state;
the first interface content does not provide the user with a function of operating the target application, and the second interface content provides the user with a function of operating the target application; or,
the first interface content does not respond to the user's touch events on the first display area, and the second interface content responds to the user's touch events on the second display area to operate the target application.
In combination with the contents in "4) the classification of the interface contents" and "5) the first interface contents and the second interface contents", the first interface content may be non-operation-type interface content, and the second interface content may be operation-type interface content.
The non-operation-type interface content may not provide the user with a function of operating the target application, or may itself have no function of operating the target application. Therefore, the user cannot operate/touch such interface content to operate the target application.
For example, in fig. 3, the non-operation interface content may include text 3306, dynamic images 3307, and the like, and the user cannot operate/touch the interface content to operate the music playing application.
The operation-type interface content may provide the user with a function of operating the target application, or may itself have a function of operating the target application. Therefore, the user can operate/touch such interface content to operate the target application.
For example, in fig. 3, the operation-type interface content may include a comment icon 3301, a share icon 3302, a pause/play icon 3303, a previous song icon 3304, a progress bar 3305, and the like, and the user may operate/touch the operation-type interface content to comment on, share, pause/play, switch, fast forward, and the like, music played by the music playing application.
In addition, the first interface content may be non-response-type interface content, and the second interface content may be response-type interface content.
The non-response-type interface content may not respond to the user's touch events. Therefore, when the user touches the folding screen and a corresponding touch event is generated, such interface content does not respond to it, so no corresponding operation is performed on the target application.
For example, in fig. 3, the non-responsive interface content may include text 3306, dynamic images 3307, etc., while such view content is not responsive to the user's touch of the folding screen 310.
The response-type interface content may respond to the user's touch events. Therefore, when the user touches the folding screen and a corresponding touch event is generated, such interface content responds to it, so that a corresponding operation is performed on the target application.
For example, in fig. 3, the response-type interface content may include a comment icon 3301, a share icon 3302, a pause/play icon 3303, a previous song icon 3304, a progress bar 3305, and the like, and the interface content may respond to the user's touch on the folding screen 310 to comment on, share, pause/play, switch, fast forward, and the like, the music played by the music playing application.
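The response-type/non-response-type distinction can be sketched as a dispatch rule. This is a simplified illustrative model in Python; the content names follow the fig. 3 example and are not actual framework identifiers:

```python
RESPONSE_TYPE_CONTENT = {
    "comment_icon", "share_icon", "pause_play_icon",
    "previous_song_icon", "progress_bar",
}


def handle_touch(target):
    """Dispatch a touch event: only response-type interface content
    triggers an operation on the target application (simplified model)."""
    if target in RESPONSE_TYPE_CONTENT:
        return "operate application via " + target
    # Non-response-type content (e.g. text, dynamic image) ignores the event.
    return None


print(handle_touch("pause_play_icon"))  # operate application via pause_play_icon
print(handle_touch("text"))             # None
```

Because only the response-type set consumes touch events, all such content can be routed to the second display area, and touches on the first display area never operate the application.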
As can be seen, when the electronic device enters the hovering state, the interface content of the target application may be divided into two types, that is, the first interface content and the second interface content, and the first interface content and the second interface content are respectively displayed on two display areas of the electronic device, that is, the first display area displays the first interface content, and the second display area displays the second interface content, so that the interface content of the target application is visually separated by the UI.
Therefore, when the user wants to operate the target application, classifying its interface content for display on different display areas allows the interface content used to operate the target application to be shown on the display area that is easier for the user to operate. This avoids situations such as the display area swinging and changing the folding angle due to the user's touch, and improves the user experience.
For example, in fig. 3 (c), the folding screen 310 enters the hovering state from the unfolded state, so that the entire display area of the folding screen 310 is divided into the display area 3501 and the display area 3502 along the folding portion 340.
Then, the interface contents of the music playing application are classified, and the view contents such as comment icon 3301, share icon 3302, pause/play icon 3303, previous song icon 3304, and the like are one type of view contents, and the view contents such as text 3306, dynamic image 3307, and the like are another type of view contents. The user can operate the music playing application by operating/touching the view contents such as the comment icon 3301 and the share icon 3302, but the user cannot operate the music playing application by operating/touching the view contents such as the text 3306.
Finally, view contents such as a comment icon 3301, a share icon 3302, and the like are displayed in the application window 3602 on the display area 3502, and view contents such as text 3306, a moving image 3307, and the like are displayed in the application window 3601 on the display area 3501.
Thus, when the user needs to comment and share the music played by the music playing application, the user only needs to touch the display area 3502. Compared with the situation that the user needs to touch the display area 3501 in the step (b) of fig. 3, the user can touch the display area 3502 more conveniently and easily, the folding angle of the folding screen 310 can be prevented from being changed, and the use experience of the user is improved.
It should be noted that, for specific implementation of each operation performed by the display device 1200, reference may be made to the corresponding description of the method embodiment described above, and details are not described here again.
2. Other embodiments
Some of the related implementations are described below, and other irrelevant contents may be specifically described in the above description, which is not described again.
In some possible implementations, the first interface content includes at least one of: view content of canvas view SurfaceView, view content of texture view TextureView in Activity;
second interface content comprising at least one of: view content of Button in Activity, view content of image View ImageView in Activity, view content of View View in Activity.
It should be noted that, in combination with the content in "5) the composition of the first interface content and the second interface content", when the Android system is taken as an example to describe the composition of the first interface content and the second interface content, the first interface content may include at least one of the following: the view content of the SurfaceView, the view content of the TextureView in the Activity, and the like; and the second interface content may include at least one of the following: the view content of the Button in the Activity, the view content of the ImageView in the Activity, the view content of the View in the Activity, and the like.
Of course, the first interface content and the second interface content may also include other view content, which is divided mainly according to whether it can operate the application or whether it can respond to touch events on the folding screen; this is not particularly limited.
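The division above can be sketched as a small classifier. This is an illustrative Python model; the two sets mirror the example view types listed above and could be extended per application, since the actual division is by operability/touch response rather than by a fixed list:

```python
def classify_view(view_type):
    """Assign a view type to the first or second interface content, per the
    example composition above (simplified; sets are illustrative only)."""
    first_types = {"SurfaceView", "TextureView"}
    second_types = {"Button", "ImageView", "View"}
    if view_type in first_types:
        return "first interface content"
    if view_type in second_types:
        return "second interface content"
    # Other view types would be classified by whether they can operate the
    # application or respond to touch events, as described above.
    return "unclassified"


print(classify_view("SurfaceView"))  # first interface content
print(classify_view("Button"))       # second interface content
```

A real implementation would decide per view instance rather than per type name; the fixed sets here only encode the example composition given in this section.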
In some possible implementations, the canvas Surface on the first display area for presenting the first interface content is different than the Surface on the second display area for presenting the second interface content.
It should be noted that, in combination with the content in "how to perform split-screen display on the first interface content and the second interface content" in the above description, when the first interface content needs to be displayed in the first display area, it is usually required to ensure that the first interface content has a Surface, and then the first interface content is presented in the first display area by using the Surface. Similarly, when the second interface content needs to be displayed in the second display area, it is usually required to ensure that the second interface content has a Surface, and then the second interface content is displayed in the second display area through the Surface.
However, as can be seen from fig. 11, since the Surface for presenting the first interface content on the first display area in the embodiment of the present application may be newly created or obtained by linking, the Surface for presenting the first interface content on the first display area may be different from the Surface for presenting the second interface content on the second display area.
In some possible implementations, the Surface on the first display area for presenting the first interface content is one of:
the self-contained Surface of the first interface content; the Surface newly created for the first interface content; or the Surface obtained by linking the self-contained Surface of the first interface content with the Surface newly created for the view content without a self-contained Surface in the first interface content.
It should be noted that, in combination with the content in "3. how to perform split-screen display of the first interface content and the second interface content", since the first interface content may be composed of various view contents, and may contain view content with an independent Surface (such as the view content of the SurfaceView) as well as view content without one, the embodiment of the present application first needs to determine whether to create a new Surface for the first interface content.
If the first interface content contains only view content without a self-contained Surface, the embodiment of the present application may create a new Surface for the first interface content. In this way, the Surface for presenting the first interface content on the first display area is obtained, so that the first interface content can be presented on the newly created Surface.
If the first interface content contains both view content with a self-contained Surface and view content without one, the embodiment of the present application needs to create a new Surface for the view content without a self-contained Surface, then present that view content on the newly created Surface, and finally link the self-contained Surface of the first interface content with the newly created Surface. In this way, the Surface for presenting the first interface content on the first display area is obtained.
In some possible implementations, if the first interface content includes the view content of the SurfaceView, the Surface for presenting the view content of the SurfaceView on the first display area is the Surface of the SurfaceView; or,
if the first interface content comprises the view content of the TextureView in the Activity, the Surface used for presenting the view content of the TextureView on the first display area is the Surface newly created for the TextureView; or,
if the first interface content comprises the view content of the SurfaceView and the view content of the TextureView in the Activity, the Surface used for presenting the first interface content on the first display area is obtained by linking the Surface of the SurfaceView and the Surface of the TextureView.
It should be noted that, in combination with the content in "3. how to perform split-screen display of the first interface content and the second interface content", since the first interface content may include the view content of the SurfaceView, the view content of the TextureView in the Activity, or both, the embodiment of the present application describes the different cases separately so as to determine the Surface for presenting the first interface content on the first display area.
In some possible implementations, the Surface for presenting the second interface content on the second display area is the Surface for presenting the interface content on the entire display area of the folding screen.
It should be noted that, with reference to the content in "3. how to perform split-screen display of the first interface content and the second interface content" and fig. 11, since the second interface content may share the Surface of the interface content, the Surface used for presenting the second interface content on the second display area may be the Surface of the interface content. This saves creating a Surface for presenting the second interface content on the second display area, and improves display efficiency.
In some possible implementations, the application window on the first display area for displaying the first interface content is newly created;
the application window for displaying the second interface content on the second display area is obtained by performing window size adjustment on the application window for displaying the interface content on the entire display area of the electronic device (the folding screen) in the unfolded state.
It should be noted that, in combination with the content in "3. how to perform split-screen display of the first interface content and the second interface content", the embodiment of the present application creates a new application window for the application on the first display area, and the first interface content is then displayed in the newly created application window. The newly created application window may thus be referred to as the application window for displaying the first interface content on the first display area.
In addition, once the new application window has been created, the embodiment of the present application also creates the SurfaceControl of the newly created application window.
In some possible implementations, the Surface for presenting the first interface content on the first display region is linked to an application window for displaying the first interface content on the first display region.
It should be noted that, in combination with the content in "3. how to perform split-screen display of the first interface content and the second interface content", in order to ensure that the first interface content is displayed in the newly created application window, the embodiment of the present application links the obtained Surface for presenting the first interface content on the first display area to the newly created application window.
In some possible implementations, the linking is implemented using the following steps:
Re-parent the SurfaceControl of the Surface for presenting the first interface content on the first display area to the SurfaceControl of the application window for displaying the first interface content on the first display area.
It should be noted that, in combination with the content in "3. how to perform split-screen display of the first interface content and the second interface content", the embodiment of the present application may link the Surface for presenting the first interface content on the first display area to the newly created application window by re-parenting. At this time, the parent of the SurfaceControl of that Surface is the SurfaceControl of the newly created application window.
In some possible implementations, the SurfaceControl of the Surface for presenting the first interface content on the first display area may be a child node under the SurfaceControl for presenting the interface content on the entire display area of the electronic device (the folding screen) in the unfolded state, or may be newly created.
It should be noted that, in combination with the content in "3. how to perform split-screen display of the first interface content and the second interface content", if the first interface content includes the view content of the SurfaceView, the SurfaceControl of the Surface for presenting the view content of the SurfaceView on the first display area may be a child node under the SurfaceControl of the interface content, as shown in fig. 10.
If the first interface content includes the view content of the TextureView in the Activity, the SurfaceControl of the Surface for presenting the view content of the TextureView on the first display area may be newly created, as shown in fig. 8.
In some possible implementations, the first display region has the same area as the second display region.
It should be noted that, in combination with the content in "1, the entire display area of the folded screen, the first display area, and the second display area", the embodiment of the present application may divide the entire display area of the folded screen into two halves of the same area along the folding portion. In this way, the first display area and the second display area may completely overlap.
4. Exemplary description of an electronic device
1. Description of the invention
A schematic structural diagram of an electronic device according to an embodiment of the present application is described below, as shown in fig. 13. The electronic device 1300 includes a processor 1310, a memory 1320, and at least one communication bus for connecting the processor 1310 and the memory 1320.
In some possible implementations, the processor 1310 may be one or more central processing units CPU. In the case where the processor 1310 is a CPU, the CPU may be a single core CPU or a multi-core CPU. The memory 1320 includes, but is not limited to, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), or a portable read-only memory (CD-ROM), and the memory 1320 is used to store computer programs or instructions.
In some possible implementations, the electronic device 1300 also includes a communication interface to receive and transmit data.
In some possible implementations, the processor 1310 in the electronic device 1300 is used to execute the computer program or instructions 1321 stored in the memory 1320 to implement the steps of:
in response to the electronic device 1300 entering the hover state, displaying first interface content of the target application on a first display area of the electronic device 1300 and displaying second interface content of the target application on a second display area of the electronic device 1300;
the first interface content does not provide the user with a function of operating the target application, and the second interface content provides the user with a function of operating the target application; or,
the first interface content does not respond to the user's touch events on the first display area, and the second interface content responds to the user's touch events on the second display area to operate the target application.
In combination with the contents in "4) the classification of the interface contents" and "5) the first interface contents and the second interface contents", the first interface content may be non-operation-type interface content, and the second interface content may be operation-type interface content.
The non-operation interface content may have a function of not providing the user with the operation target application or a function of not providing the user with the operation target application. Therefore, the user cannot operate/touch such interface contents to operate the target application.
For example, in fig. 3, the non-operation interface content may include the text 3306, the dynamic image 3307, and the like; the user cannot operate or touch such interface content to control the music playing application.
Operation interface content provides the user with a function for operating the target application. Therefore, the user can operate or touch such interface content to control the application.
For example, in fig. 3, the operation interface content may include the comment icon 3301, the share icon 3302, the pause/play icon 3303, the previous song icon 3304, the progress bar 3305, and the like; the user may operate or touch such content to comment on, share, pause or play, switch, or fast-forward the music played by the music playing application.
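The classification described above can be sketched in simplified form. The following Python model is an illustration only, not the patent's actual implementation; the view names and the clickable-based rule are assumptions chosen to mirror the fig. 3 example. It splits a list of view contents into non-operation and operation types:

```python
# Illustrative sketch: split an application's view content into
# "non-operation" (cannot control the app) and "operation" (can control
# the app) types. The clickable flag is an assumed classification rule.

from dataclasses import dataclass

@dataclass
class ViewContent:
    name: str        # e.g. "comment_icon_3301"
    clickable: bool  # whether touching it operates the application

def classify(views):
    """Split views into (non_operation, operation) name lists."""
    non_operation = [v.name for v in views if not v.clickable]
    operation = [v.name for v in views if v.clickable]
    return non_operation, operation

views = [
    ViewContent("text_3306", clickable=False),
    ViewContent("dynamic_image_3307", clickable=False),
    ViewContent("comment_icon_3301", clickable=True),
    ViewContent("share_icon_3302", clickable=True),
    ViewContent("pause_play_icon_3303", clickable=True),
]

non_op, op = classify(views)
print(non_op)  # ['text_3306', 'dynamic_image_3307']
print(op)      # ['comment_icon_3301', 'share_icon_3302', 'pause_play_icon_3303']
```

In practice the classification criterion could also be derived from the view type itself (e.g. SurfaceView versus Button), as later sections describe.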
In addition, the first interface content may be non-response-type interface content, and the second interface content may be response-type interface content.
Non-response interface content does not respond to the user's touch events on the folding screen. Therefore, when the user touches the folding screen and a corresponding touch event is generated, such interface content does not respond to the event, so no corresponding operation is performed on the target application.
For example, in fig. 3, the non-response interface content may include the text 3306, the dynamic image 3307, and the like; such view content does not respond to the user's touches on the folding screen 310.
Response interface content responds to the user's touch events on the folding screen. Therefore, when the user touches the folding screen and a corresponding touch event is generated, such interface content responds to the event, so that a corresponding operation is performed on the target application.
For example, in fig. 3, the response interface content may include the comment icon 3301, the share icon 3302, the pause/play icon 3303, the previous song icon 3304, the progress bar 3305, and the like; such interface content responds to the user's touches on the folding screen 310 to comment on, share, pause or play, switch, or fast-forward the music played by the music playing application.
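The difference between response-type and non-response-type content can be modeled as a touch-dispatch rule. The sketch below is a hypothetical illustration, not Android's real event pipeline: a responsive view handles the event and returns an action, while a non-responsive view returns nothing, so the event is ignored.

```python
# Illustrative sketch: response-type content handles a touch event,
# non-response-type content ignores it. The class and method names are
# assumptions for the example, not Android APIs.

class ResponsiveView:
    def __init__(self, name, action):
        self.name, self.action = name, action
    def on_touch(self):
        return self.action          # e.g. trigger pause/play

class NonResponsiveView:
    def __init__(self, name):
        self.name = name
    def on_touch(self):
        return None                 # the touch event is not handled

def dispatch(view):
    """Deliver a touch event to a view; report 'ignored' if unhandled."""
    result = view.on_touch()
    return result if result is not None else "ignored"

print(dispatch(ResponsiveView("pause_play_icon_3303", "toggle_playback")))  # toggle_playback
print(dispatch(NonResponsiveView("text_3306")))                             # ignored
```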
Therefore, when the electronic device enters the hovering state, the interface content of the target application can be divided into two types, namely the first interface content and the second interface content, and the two types are displayed on the two display areas of the electronic device respectively: the first display area displays the first interface content, and the second display area displays the second interface content. A visual UI separation of the application interface content is thereby realized.
In this way, when the user wants to operate the target application, the classified interface content is displayed on different display areas, so that the interface content for operating the target application can be shown on the display area that is easier for the user to reach. This avoids situations in which the user's touch makes a display area wobble and changes the folding angle, and thus improves the user experience.
For example, in (c) of fig. 3, the folding screen 310 enters the hovering state from the unfolded state, so that the entire display area of the folding screen 310 is divided into the display area 3501 and the display area 3502 along the folding portion 340.
Then, the interface content of the music playing application is classified: view content such as the comment icon 3301, the share icon 3302, the pause/play icon 3303, and the previous song icon 3304 forms one type, while view content such as the text 3306 and the dynamic image 3307 forms another type. The user can operate the music playing application by operating or touching view content such as the comment icon 3301 and the share icon 3302, but cannot do so by operating or touching view content such as the text 3306.
Finally, view content such as the comment icon 3301 and the share icon 3302 is displayed in the application window 3602 on the display area 3502, and view content such as the text 3306 and the dynamic image 3307 is displayed in the application window 3601 on the display area 3501.
Thus, when the user wants to comment on or share the music played by the music playing application, the user only needs to touch the display area 3502. Compared with the case in (b) of fig. 3, in which the user has to touch the display area 3501, touching the display area 3502 is more convenient, avoids changing the folding angle of the folding screen 310, and improves the user experience.
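The steps above, dividing the screen along the fold and assigning each content type to a window on one half, can be sketched as follows. The coordinates, window numbers, and data structures are assumptions chosen to mirror the fig. 3 example, not the actual implementation.

```python
# Illustrative sketch: divide the full display area into two equal halves
# along the fold, then assign classified view content to the window on
# each half (non-operation content up, operation content down).

def split_along_fold(width, height):
    """Divide a (width x height) display into two equal areas at height/2."""
    half = height // 2
    area_3501 = (0, 0, width, half)          # upper half: display area 3501
    area_3502 = (0, half, width, height)     # lower half: display area 3502
    return area_3501, area_3502

def assign_windows(non_operation, operation):
    """Non-operation content goes to window 3601, operation content to 3602."""
    return {"window_3601": non_operation, "window_3602": operation}

a1, a2 = split_along_fold(1080, 2400)
layout = assign_windows(["text_3306", "dynamic_image_3307"],
                        ["comment_icon_3301", "share_icon_3302"])
print(a1, a2)                 # (0, 0, 1080, 1200) (0, 1200, 1080, 2400)
print(layout["window_3602"])  # ['comment_icon_3301', 'share_icon_3302']
```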
It should be noted that, for specific implementation of each operation performed by the electronic device 1300, reference may be made to the corresponding description of the method embodiment shown above, and details are not described here again.
2. Other possible implementations
Some related implementations are described below; other content not covered here has been described above and is not repeated.
In some possible implementations, the first interface content includes at least one of the following: view content of a canvas view (SurfaceView), or view content of a texture view (TextureView) in an Activity;
and the second interface content includes at least one of the following: view content of a Button in an Activity, view content of an image view (ImageView) in an Activity, or view content of a View in an Activity.
It should be noted that, with reference to the content in "5) the composition of the first interface content and the second interface content" above, the embodiments of the present application take the Android system as an example when describing the composition of the first interface content and the second interface content. The first interface content may include at least one of: the view content of a SurfaceView, the view content of a TextureView in an Activity, and the like; the second interface content may include at least one of: the view content of a Button in an Activity, the view content of an ImageView in an Activity, the view content of a View in an Activity, and the like.
Of course, the first interface content and the second interface content may also include other view content; the division is mainly made according to whether the content can be used to operate the application or whether it responds to touch events on the folding screen, and this is not particularly limited here.
In some possible implementations, the canvas Surface used for presenting the first interface content on the first display area is different from the Surface used for presenting the second interface content on the second display area.
It should be noted that, in combination with the content in "3. How to display the first interface content and the second interface content on split screens" above, when the first interface content needs to be displayed in the first display area, it is usually necessary to ensure that the first interface content has a Surface, and the first interface content is then presented in the first display area through that Surface. Similarly, when the second interface content needs to be displayed in the second display area, it is usually necessary to ensure that the second interface content has a Surface, and the second interface content is then presented in the second display area through that Surface.
However, as can be seen from fig. 11, in the embodiment of the present application, since the Surface for presenting the first interface content in the first display area may be newly created or obtained by linking, the Surface for presenting the first interface content in the first display area may be different from the Surface for presenting the second interface content in the second display area.
In some possible implementations, the Surface on the first display area for presenting the first interface content is one of:
a Surface that the first interface content already has; a Surface newly created for the first interface content; or a Surface obtained by linking the Surface that the first interface content already has with a Surface newly created for the view content in the first interface content that does not have its own Surface.
It should be noted that, in combination with the content in "3. How to display the first interface content and the second interface content on split screens" above, the first interface content may be composed of various view contents. It may contain view content with an independent Surface (for example, the view content of a SurfaceView) as well as view content without one, so it is first necessary to determine whether a new Surface needs to be created for the first interface content.
If none of the view content in the first interface content has its own Surface, the embodiment of the present application may create a new Surface for the first interface content. In this way, the Surface for presenting the first interface content on the first display area is obtained, and the first interface content can be presented on the newly created Surface.
If the first interface content contains both view content with its own Surface and view content without one, the embodiment of the present application needs to create a new Surface for the view content without its own Surface, present that view content on the newly created Surface, and link the Surface that the first interface content already has with the newly created Surface. In this way, the Surface for presenting the first interface content on the first display area is obtained.
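The three-way decision above can be summarized in a small sketch. This is a simplified model of the selection logic only; the tuple representation of Surfaces and of the "linked" result is an assumption for illustration, not the Android implementation.

```python
# Illustrative sketch: decide how to obtain the Surface for the first
# interface content, depending on whether its view content already
# carries its own Surface (as a SurfaceView does).

def obtain_surface(views):
    """views: list of (name, has_own_surface) tuples.

    - all views have their own Surface -> use the existing Surface(s)
    - no view has its own Surface      -> create one new Surface
    - mixed                            -> create a Surface for the views
                                          without one and link it with
                                          the existing Surface
    """
    own = [n for n, has in views if has]
    without = [n for n, has in views if not has]
    if not without:
        return ("existing", own)
    if not own:
        return ("new", without)
    return ("linked", own + without)

print(obtain_surface([("surface_view", True)]))   # ('existing', ['surface_view'])
print(obtain_surface([("texture_view", False)]))  # ('new', ['texture_view'])
print(obtain_surface([("surface_view", True), ("texture_view", False)]))
# ('linked', ['surface_view', 'texture_view'])
```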
In some possible implementations, if the first interface content includes the view content of the SurfaceView, the Surface for presenting the view content of the SurfaceView on the first display area is the Surface of the SurfaceView; or,
if the first interface content comprises the view content of a TextureView in an Activity, the Surface used for presenting the view content of the TextureView on the first display area is a Surface newly created for the TextureView; or,
if the first interface content comprises the view content of the SurfaceView and the view content of the TextureView in the Activity, the Surface used for presenting the first interface content on the first display area is obtained by linking the Surface of the SurfaceView and the Surface of the TextureView.
It should be noted that, in combination with the content in "3. How to display the first interface content and the second interface content on split screens" above, the first interface content may include the view content of a SurfaceView, the view content of a TextureView in an Activity, or both. Therefore, the embodiments of the present application describe these cases separately so as to determine the Surface for presenting the first interface content on the first display area.
In some possible implementations, the Surface for presenting the second interface content on the second display area lies within the Surface for presenting the interface content on the entire display area of the folding screen in the unfolded state.
It should be noted that, in combination with the content in "3. How to display the first interface content and the second interface content on split screens" above, in fig. 11, since the Surface of the first interface content and the Surface of the second interface content may share the Surface of the interface content, the Surface used for presenting the second interface content on the second display area may lie within the Surface of the interface content. This saves creating a separate Surface for presenting the second interface content on the second display area and improves display efficiency.
In some possible implementations, the application window on the first display area for displaying the first interface content is newly created;
the application window for displaying the second interface content on the second display area is obtained by performing window size adjustment on the application window for displaying the interface content on the entire display area of the electronic device (folding screen) in the unfolded state.
It should be noted that, in combination with the content in "3. How to display the first interface content and the second interface content on split screens" above, in the embodiment of the present application, a new application window needs to be created on the first display area, and the first interface content is then displayed in that newly created application window. The newly created application window may be referred to as the application window for displaying the first interface content on the first display area.
In addition, when the application window is newly created, the SurfaceControl of the newly created application window also needs to be created in the embodiment of the present application.
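The window handling described above, a newly created window (with its own SurfaceControl) for the first display area and a resized full-screen window for the second, can be sketched as follows. The data structures and names are illustrative assumptions, not the Android WindowManager API.

```python
# Illustrative sketch: create a new application window together with its
# SurfaceControl for the first display area, and obtain the second window
# by resizing the window that covered the whole unfolded screen.

from dataclasses import dataclass, field

@dataclass
class AppWindow:
    name: str
    bounds: tuple                 # (left, top, right, bottom)
    surface_control: str = field(default="")

def create_first_window(area):
    """A new window and its SurfaceControl are created together."""
    w = AppWindow("window_for_first_content", area)
    w.surface_control = "SurfaceControl:" + w.name
    return w

def resize_full_window(full_window, area):
    """The unfolded-state window is reused; only its size is adjusted."""
    full_window.bounds = area
    return full_window

full = AppWindow("unfolded_window", (0, 0, 1080, 2400),
                 "SurfaceControl:unfolded_window")
first = create_first_window((0, 0, 1080, 1200))
second = resize_full_window(full, (0, 1200, 1080, 2400))
print(first.surface_control)       # SurfaceControl:window_for_first_content
print(second.name, second.bounds)  # unfolded_window (0, 1200, 1080, 2400)
```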
In some possible implementations, the Surface for presenting the first interface content on the first display area is linked to an application window for displaying the first interface content on the first display area.
It should be noted that, in combination with the content in "3. How to display the first interface content and the second interface content on split screens" above, in order to ensure that the first interface content is displayed in the newly created application window, the embodiment of the present application needs to link the Surface obtained for presenting the first interface content on the first display area to the newly created application window.
In some possible implementations, the linking is implemented using the following steps:
and controlling the Surface control of the canvas of the Surface for presenting the first interface content on the first display area, and resetting the canvas to the Surface control of the application window for displaying the first interface content on the first display area.
It should be noted that, in combination with the content in "3. How to display the first interface content and the second interface content on split screens" above, the embodiment of the present application may link the Surface for presenting the first interface content on the first display area to the newly created application window through re-parenting. After re-parenting, the parent of the SurfaceControl of that Surface is the SurfaceControl of the newly created application window.
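The re-parenting step can be modeled as moving a node in a SurfaceControl-like hierarchy. The sketch below only illustrates the hierarchy change; in Android, actual re-parenting would be performed through SurfaceControl.Transaction's reparent operation, and the node class here is a stand-in.

```python
# Illustrative sketch: a minimal node tree where re-parenting detaches a
# node from its current parent and attaches it under a new one, mirroring
# how the first interface content's Surface is linked to the new window.

class SurfaceControlNode:
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = None
        self.children = []
        if parent is not None:
            self.reparent(parent)

    def reparent(self, new_parent):
        """Detach from the current parent (if any) and attach to new_parent."""
        if self.parent is not None:
            self.parent.children.remove(self)
        self.parent = new_parent
        new_parent.children.append(self)

root = SurfaceControlNode("interface_content_sc")
content_sc = SurfaceControlNode("first_content_surface_sc", parent=root)
new_window_sc = SurfaceControlNode("new_window_sc")

content_sc.reparent(new_window_sc)  # link the Surface to the new window
print(content_sc.parent.name)       # new_window_sc
print(len(root.children))           # 0
```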
In some possible implementations, the SurfaceControl of the Surface for presenting the first interface content on the first display area may be a child node under the SurfaceControl for presenting the interface content on the entire display area of the electronic device (folding screen) in the unfolded state, or may be newly created.
It should be noted that, in combination with the content in "3. How to display the first interface content and the second interface content on split screens" above, if the first interface content includes the view content of a SurfaceView, the SurfaceControl of the Surface for presenting the view content of the SurfaceView on the first display area may be a child node under the SurfaceControl of the interface content, as shown in fig. 10.
If the first interface content includes the view content of a TextureView in an Activity, the SurfaceControl of the Surface for presenting the view content of the TextureView on the first display area may be newly created, as shown in fig. 8.
In some possible implementations, the first display region has the same area as the second display region.
It should be noted that, in combination with the content in "1. The entire display area of the folding screen, the first display area, and the second display area" above, the embodiment of the present application may divide the entire display area of the folding screen into two halves of equal area along the folding portion. In this way, the first display area and the second display area can completely overlap when the folding screen is folded.
5. Other exemplary description
Embodiments of the present application also provide a computer-readable storage medium, where a computer program or an instruction is stored on the computer-readable storage medium, and the computer program or the instruction is executed by a processor to implement the steps described in the above embodiments.
Embodiments of the present application also provide a computer program product, which includes a computer program or instructions, where the computer program or instructions are executed by a processor to implement the steps described in the above embodiments. Illustratively, the computer program product may be a software installation package.
In addition, a computer program product should be understood as a software product that implements the technical solutions of the present application mainly by means of a computer program or instructions.
For simplicity of description, the above embodiments are described as a series of combined operations. Those skilled in the art should appreciate that the present application is not limited by the order of actions described, as some steps in the embodiments of the present application may be performed in other orders or concurrently. Those skilled in the art should also realize that the embodiments described in the specification are all preferred embodiments, and that the actions, steps, modules, units, and the like involved are not necessarily required by the embodiments of the present application.
In the foregoing embodiments, the description of each embodiment in the embodiments of the present application has an emphasis, and reference may be made to relevant descriptions of other embodiments for parts that are not described in detail in a certain embodiment.
It should be clear to a person skilled in the art that the methods, steps, or functions of the related modules/units described in the embodiments of the present application can be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product or in the form of computer program instructions executed by a processor. The computer program product comprises at least one computer program instruction, which may consist of corresponding software modules; these may be stored in a RAM, a flash memory, a ROM, an EPROM, an EEPROM, a register, a hard disk, a removable hard disk, a compact disc read-only memory (CD-ROM), or any other form of storage medium known in the art. The computer program instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another by wired or wireless means. The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a server or a data center, that integrates one or more available media. The available media may be magnetic media (e.g., floppy disks, hard disks, tapes), optical media, or semiconductor media (e.g., SSDs), among others.
Each module/unit included in each apparatus or product described in the foregoing embodiments may be a software module/unit, may be a hardware module/unit, or may be partly a software module/unit and partly a hardware module/unit. For example, for each apparatus or product applied to or integrated in a chip, the modules/units it contains may all be implemented by hardware such as circuits; alternatively, some of the modules/units may be implemented by a software program running on a processor integrated inside the chip, and the remaining modules/units (if any) may be implemented by hardware such as circuits. The same applies to each apparatus or product applied to or integrated in a chip module, and to each apparatus or product applied to or integrated in a terminal.
The above embodiments are intended to illustrate the objects, aspects and advantages of the embodiments of the present application in further detail, and it should be understood that the above embodiments are only illustrative of the embodiments of the present application and are not intended to limit the scope of the embodiments of the present application. Any modification, equivalent replacement, improvement and the like made on the basis of the technical solutions of the embodiments of the present application should be included in the protection scope of the embodiments of the present application.

Claims (14)

1. A display method, comprising:
in response to an electronic device entering a hover state, displaying first interface content of a target application on a first display area of the electronic device and displaying second interface content of the target application on a second display area of the electronic device;
wherein the first interface content does not provide the user with a function of operating the target application, and the second interface content provides the user with a function of operating the target application; or,
the first interface content has a function of not responding to a user touch event to the first display area, and the second interface content has a function of responding to a user touch event to the second display area to operate the target application.
2. The method of claim 1, wherein the first interface content comprises at least one of: view content of canvas view SurfaceView, view content of texture view TextureView in Activity;
the second interface content comprising at least one of: view content of Button in Activity, view content of image View ImageView in Activity, view content of View View in Activity.
3. The method of claim 1, wherein a canvas Surface on the first display area for presenting the first interface content is different from a canvas Surface on the second display area for presenting the second interface content.
4. The method of claim 1, wherein a Surface on the first display area for presenting the first interface content comprises one of:
the first interface content self-contained Surface, the Surface newly created for the first interface content, the first interface content self-contained Surface and the Surface newly created for the view content without the self-contained Surface in the first interface content are linked to obtain the first interface content self-contained Surface.
5. The method of claim 4, wherein if the first interface content comprises a SurfaceView view content, a Surface on the first display area for presenting the SurfaceView view content is a Surface owned by the SurfaceView; or,
if the first interface content comprises view content of a TextureView in Activity, a Surface used for presenting the view content of the TextureView on the first display area is a Surface newly created for the SurfaceView; or,
if the first interface content comprises the view content of the SurfaceView and the view content of the TextureView in Activity, the Surface used for presenting the first interface content on the first display area is obtained by linking the Surface of the SurfaceView and the Surface of the TextureView.
6. The method of claim 1, wherein the Surface on the second display area for presenting the second interface content is within the Surface for presenting the interface content of the target application on the entire display area of the electronic device in the unfolded state.
7. The method of claim 1, wherein the application window on the first display area for displaying the first interface content is newly created;
the application window for displaying the second interface content on the second display area is obtained by performing window size adjustment on the application window for displaying the target application on the whole display area of the electronic device in the unfolded state.
8. The method of claim 1, wherein the Surface on the first display area for presenting the first interface content is linked to an application window on the first display area for displaying the first interface content.
9. The method of claim 8, wherein the linking is performed by:
and re-parent the canvas control Surface control of the Surface used for presenting the first interface content in the first display area to the Surface control of the application window used for displaying the first interface content in the first display area.
10. The method according to claim 1, wherein the Surface control of the Surface for presenting the first interface content on the first display area is a child node under the Surface control for presenting the interface content of the target application on the whole display area of the electronic device in the unfolded state, or is newly created.
11. The method of claim 1, wherein the first display region and the second display region have the same area.
12. A display device, comprising:
the display unit is used for responding to the electronic equipment entering the hovering state, displaying first interface content of a target application on a first display area of the electronic equipment, and displaying second interface content of the target application on a second display area of the electronic equipment;
wherein the first interface content does not provide the user with a function of operating the target application, and the second interface content provides the user with a function of operating the target application; or,
the first interface content has a function of not responding to a user touch event to the first display area, and the second interface content has a function of responding to a user touch event to the second display area to operate the target application.
13. An electronic device comprising a processor, a memory, and a computer program or instructions stored on the memory, the processor executing the computer program or instructions to implement the steps of the method of any of claims 1-11.
14. A computer-readable storage medium, characterized in that it stores a computer program or instructions which, when executed, implement the steps of the method of any one of claims 1-11.
CN202211521939.3A 2022-11-30 2022-11-30 Display method and device and electronic equipment Pending CN115904151A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202211521939.3A CN115904151A (en) 2022-11-30 2022-11-30 Display method and device and electronic equipment
PCT/CN2023/119109 WO2024114051A1 (en) 2022-11-30 2023-09-15 Display method and apparatus, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211521939.3A CN115904151A (en) 2022-11-30 2022-11-30 Display method and device and electronic equipment

Publications (1)

Publication Number Publication Date
CN115904151A true CN115904151A (en) 2023-04-04

Family

ID=86491020

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211521939.3A Pending CN115904151A (en) 2022-11-30 2022-11-30 Display method and device and electronic equipment

Country Status (2)

Country Link
CN (1) CN115904151A (en)
WO (1) WO2024114051A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024114051A1 (en) * 2022-11-30 2024-06-06 南京欧珀软件科技有限公司 Display method and apparatus, and electronic device

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102127930B1 (en) * 2014-02-14 2020-06-29 엘지전자 주식회사 Mobile terminal and method for controlling the same
CN110308956B (en) * 2018-03-20 2021-01-12 青岛海信移动通信技术股份有限公司 Application interface display method and device and mobile terminal
CN113268196A (en) * 2019-06-05 2021-08-17 华为技术有限公司 Display method of flexible screen and electronic equipment
CN111522523A (en) * 2020-04-30 2020-08-11 北京小米移动软件有限公司 Display processing method and device and computer storage medium
CN115904151A (en) * 2022-11-30 2023-04-04 南京欧珀软件科技有限公司 Display method and device and electronic equipment


Also Published As

Publication number Publication date
WO2024114051A1 (en) 2024-06-06

Similar Documents

Publication Publication Date Title
WO2021023021A1 (en) Display method and electronic device
WO2021159922A1 (en) Card display method, electronic device, and computer-readable storage medium
WO2021057830A1 (en) Information processing method and electronic device
CN111966252A (en) Application window display method and electronic equipment
WO2020062294A1 (en) Display control method for system navigation bar, graphical user interface, and electronic device
WO2021008334A1 (en) Data binding method, apparatus, and device of mini program, and storage medium
WO2022062898A1 (en) Window display method and device
WO2021052311A1 (en) Method for displaying user interface according to color of rear case, and electronic device
CN111602381A (en) Icon switching method, method for displaying GUI (graphical user interface) and electronic equipment
WO2023005751A1 (en) Rendering method and electronic device
WO2024114051A1 (en) Display method and apparatus, and electronic device
WO2021052488A1 (en) Information processing method and electronic device
CN112416486A (en) Information guiding method, device, terminal and storage medium
WO2022001261A1 (en) Prompting method and terminal device
CN111949150B (en) Method and device for controlling peripheral switching, storage medium and electronic equipment
CN114741121A (en) Method and device for loading module and electronic equipment
CN114500731A (en) Advertisement display method and electronic equipment
CN117130688B (en) Quick application card loading method, electronic equipment and storage medium
WO2024230434A1 (en) Method for implementing media playback control, electronic device, system, and readable storage medium
WO2024066976A1 (en) Control display method and electronic device
US20240361897A1 (en) Window interaction method and electronic device
WO2024067122A1 (en) Window display method and electronic device
WO2024193666A1 (en) Display method for electronic device, and electronic device and storage medium
WO2024099206A1 (en) Graphical interface processing method and apparatus
WO2024149089A1 (en) Display method, display apparatus, and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination