
CN114201738A - Unlocking method and electronic equipment - Google Patents

Unlocking method and electronic equipment

Info

Publication number
CN114201738A
CN114201738A (application CN202010911832.4A; granted as CN114201738B)
Authority
CN
China
Prior art keywords
electronic device
user
mobile phone
display interface
equipment
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010911832.4A
Other languages
Chinese (zh)
Other versions
CN114201738B (en)
Inventor
杨诗姝
杨桐
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202010911832.4A (granted as CN114201738B)
Priority to PCT/CN2021/113610 (published as WO2022048453A1)
Publication of CN114201738A
Application granted
Publication of CN114201738B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F 21/30 Authentication, i.e. establishing the identity or authorisation of security principals
    • G06F 21/31 User authentication
    • G06F 21/32 User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484 Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04842 Selection of displayed objects or displayed text elements
    • G06F 3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 9/00 Arrangements for program control, e.g. control units
    • G06F 9/06 Arrangements for program control using stored programs, i.e. using an internal store of processing equipment to receive or retain programs
    • G06F 9/44 Arrangements for executing specific programs
    • G06F 9/451 Execution arrangements for user interfaces
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04M TELEPHONIC COMMUNICATION
    • H04M 2250/00 Details of telephonic subscriber devices
    • H04M 2250/12 Details of telephonic subscriber devices including a sensor for measuring a physical value, e.g. temperature or motion

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Telephone Function (AREA)

Abstract

This application relates to the technical field of terminal devices and provides an unlocking method and an electronic device. The unlocking method comprises: receiving a touch operation input by a user on an always-on display (AOD) interface, where the AOD interface includes one or more device identifiers, each identifying a peripheral device that is connected, or can be connected, to the electronic device; if the touch operation is determined to act on a target device identifier, starting user identity authentication, the target device identifier being any one of the one or more device identifiers; and if the user identity authentication passes, unlocking the electronic device and launching the shortcut path corresponding to the target device identifier. Embodiments of this application improve the efficiency of cross-device interaction.

Description

Unlocking method and electronic equipment
Technical Field
The present application relates to the technical field of terminal devices, and in particular, to an unlocking method and an electronic device.
Background
With the rapid development of terminal technology, terminal devices can realize more and more functions. For example, more and more applications are installed on terminal devices, and these applications offer increasingly rich functions.
In actual use, when the user needs a certain application or a certain function of the terminal device, the user must first unlock the device and then open the application, or open the application and navigate to the function's interface.
The user therefore needs to perform cumbersome operations to use a given application or function, which is inefficient.
Disclosure of Invention
The embodiments of this application provide an unlocking method and an electronic device that can solve at least one of the technical problems of the prior art described above.
In a first aspect, an embodiment of the present application provides an unlocking method, which is applied to an electronic device, and the unlocking method includes:
receiving a touch operation input by a user on an always-on display (AOD) interface, where the AOD interface includes one or more device identifiers, each identifying a peripheral device that is connected, or can be connected, to the electronic device;
if the touch operation is determined to act on a target device identifier, starting user identity authentication, where the target device identifier is any one of the one or more device identifiers; and
if the user identity authentication passes, unlocking the electronic device and launching the shortcut path corresponding to the target device identifier.
According to this embodiment of the first aspect, the device identifiers of peripheral devices are displayed on the AOD interface, so that the user can quickly unlock the electronic device and launch a shortcut path to the target device, improving the efficiency of cross-device interaction while the screen is off.
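The claimed flow can be sketched as follows. This is an illustrative sketch, not the patent's implementation: the names (`hit_test`, `handle_touch`, `launch_shortcut`) and the 40-pixel touch-target radius are assumptions made for the example.

```python
from dataclasses import dataclass

@dataclass
class DeviceIdentifier:
    """A peripheral-device icon shown on the always-on display."""
    device_id: str
    x: float            # on-screen position of the icon's centre
    y: float
    radius: float = 40.0  # assumed touch-target radius, in pixels

def hit_test(identifiers, touch_x, touch_y):
    """Return the identifier under the touch point, or None."""
    for ident in identifiers:
        if (touch_x - ident.x) ** 2 + (touch_y - ident.y) ** 2 <= ident.radius ** 2:
            return ident
    return None

def handle_touch(identifiers, touch_x, touch_y, authenticate, unlock, launch_shortcut):
    """Claimed flow: touch on a device identifier -> authenticate -> unlock + shortcut."""
    target = hit_test(identifiers, touch_x, touch_y)
    if target is None:
        return False            # touch did not land on any device identifier
    if not authenticate():      # e.g. fingerprint verification at the touch point
        return False            # authentication failed: stay locked
    unlock()
    launch_shortcut(target.device_id)
    return True
```

The key point of the claim is that a single touch both selects the target device and (via under-display fingerprint sensing) authenticates the user, so unlocking and shortcut launch collapse into one gesture.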
As a possible implementation manner of the first aspect, the distribution of the one or more device identifiers on the AOD interface is mapped according to the spatial relationship between each peripheral device and the electronic device.
In this implementation, because the layout of the device identifiers mirrors the spatial relationship between the peripheral devices and the electronic device, the user can quickly and accurately select the target device identifier and launch the shortcut path to the target device, improving both the efficiency and the accuracy of cross-device interaction while the screen is off.
As a possible implementation of the first aspect, the spatial relationship includes a positional and/or orientational relationship.
As a possible implementation manner of the first aspect, when the spatial relationship includes both position and orientation, it comprises the distance between each peripheral device and the electronic device, and the angle between the line connecting each peripheral device to the electronic device and the direction the electronic device is facing.
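One plausible way to realize this distance-and-angle mapping is to treat each peripheral as a point in polar coordinates around the phone and project it onto the AOD screen. This is a sketch under stated assumptions: the 10 m normalization range, the 0.9 edge margin, and the "0° maps to straight up" convention are all choices made for the example, not taken from the patent.

```python
import math

def map_device_to_screen(distance_m, angle_deg, screen_w, screen_h,
                         max_distance_m=10.0):
    """Map a peripheral's (distance, bearing) relative to the phone onto
    AOD screen coordinates.

    angle_deg is measured from the phone's facing direction
    (0 degrees = straight ahead), positive clockwise.
    """
    # Normalize distance so the farthest displayed device sits near the edge.
    half_extent = min(screen_w, screen_h) / 2 * 0.9
    r = min(distance_m / max_distance_m, 1.0) * half_extent
    theta = math.radians(angle_deg)
    cx, cy = screen_w / 2, screen_h / 2
    x = cx + r * math.sin(theta)  # a device straight ahead maps to "up" on screen
    y = cy - r * math.cos(theta)
    return x, y
```

With this mapping, a television 3 m directly in front of the phone appears above the screen centre, and a speaker off to the right appears to the right, which is what lets the user match icons to the physical room at a glance.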
As a possible implementation manner of the first aspect, the AOD interface further includes an orientation identifier of the electronic device.
In this implementation, visualizing the orientation of the electronic device on the AOD interface lets the user quickly map the distribution of device identifiers onto the actual scene, improving operation efficiency and accuracy.
As a possible implementation manner of the first aspect, launching the shortcut path corresponding to the target device identifier includes any one of the following:
controlling the target device corresponding to the target device identifier to respond to a preset instruction;
displaying a control panel interface of the target device corresponding to the target device identifier;
calling up a display interface of the target device corresponding to the target device identifier; or
casting the screen or audio to the target device corresponding to the target device identifier.
This implementation diversifies the available shortcut paths, so the method can adapt to different application scenarios and has strong environmental adaptability.
As a possible implementation manner of the first aspect, the touch operation includes a finger press operation, and the user identity authentication includes fingerprint-based user identity authentication.
In this implementation, on an electronic device with full-screen fingerprint recognition, the user operation that launches the shortcut path is combined with user identity recognition, reducing cumbersome operations and improving operation efficiency.
As a possible implementation manner of the first aspect, the one or more device identifiers satisfy a first condition, where the first condition includes an upper limit on the number of device identifiers, and/or a requirement that the deviation angle between the peripheral device corresponding to each device identifier and the electronic device be less than or equal to a maximum deviation angle.
This implementation limits the number of device identifiers shown on the AOD interface, which prevents overlapping identifiers from causing the user to mis-select the target device when there are too many peripheral devices, improving interaction accuracy.
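The first condition above amounts to a simple filter over the discovered peripherals. The following sketch assumes, purely for illustration, a cap of four identifiers, a 60-degree maximum deviation angle, and a nearest-first tie-break; the patent does not specify these values.

```python
def filter_identifiers(devices, max_count=4, max_deviation_deg=60.0):
    """Apply the 'first condition': keep only peripherals within the maximum
    deviation angle of the phone's facing direction, and cap the count.

    devices: list of (device_id, distance_m, deviation_deg) tuples.
    Returns the device ids to display, nearest first (an assumed tie-break).
    """
    in_view = [d for d in devices if abs(d[2]) <= max_deviation_deg]
    in_view.sort(key=lambda d: d[1])  # prefer nearer devices when over the cap
    return [d[0] for d in in_view[:max_count]]
```

Dropping far-off-axis devices and capping the count keeps the AOD icons from crowding or overlapping, which is exactly the mis-selection risk the embodiment describes.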
In a second aspect, corresponding to the unlocking method of the first aspect, an unlocking apparatus configured on an electronic device is provided, comprising:
a receiving module, configured to receive a touch operation input by a user on an always-on display (AOD) interface, where the AOD interface includes one or more device identifiers, each identifying a peripheral device that is connected, or can be connected, to the electronic device;
an authentication module, configured to start user identity authentication if the touch operation is determined to act on a target device identifier, the target device identifier being any one of the one or more device identifiers; and
an unlocking module, configured to unlock the electronic device and launch the shortcut path corresponding to the target device identifier if the user identity authentication passes.
As a possible implementation manner of the second aspect, the distribution of the one or more device identifiers on the AOD interface is mapped according to the spatial relationship between each peripheral device and the electronic device.
As a possible implementation of the second aspect, the spatial relationship includes a positional and/or orientational relationship.
As a possible implementation manner of the second aspect, when the spatial relationship includes both position and orientation, it comprises the distance between each peripheral device and the electronic device, and the angle between the line connecting each peripheral device to the electronic device and the direction the electronic device is facing.
As a possible implementation manner of the second aspect, the AOD interface further includes an orientation identifier of the electronic device.
As a possible implementation manner of the second aspect, the unlocking module is specifically configured to do any one of the following:
unlock the electronic device and control the target device corresponding to the target device identifier to respond to a preset instruction;
unlock the electronic device and display a control panel interface of the target device corresponding to the target device identifier;
unlock the electronic device and call up a display interface of the target device corresponding to the target device identifier; or
unlock the electronic device and cast the screen or audio to the target device corresponding to the target device identifier.
As a possible implementation manner of the second aspect, the touch operation includes a finger press operation, and the user identity authentication includes fingerprint-based user identity authentication.
As a possible implementation manner of the second aspect, the one or more device identifiers satisfy a first condition, where the first condition includes an upper limit on the number of device identifiers, and/or a requirement that the deviation angle between the peripheral device corresponding to each device identifier and the electronic device be less than or equal to a maximum deviation angle.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the computer program, so that the electronic device implements the method according to any one of the first aspect and possible implementation manners of the first aspect.
In a fourth aspect, the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the method according to any one of the first aspect and possible implementation manners of the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer program product, which, when run on an electronic device, causes the electronic device to execute the method described in any one of the foregoing first aspect and possible implementations of the first aspect.
It will be appreciated that, for the advantageous effects of the second to fifth aspects, reference may be made to the description of the first aspect above.
Drawings
Fig. 1 is a schematic view of an always-on display (AOD) interface of a mobile phone according to an embodiment of the present application;
fig. 2A is a schematic structural diagram of a positioning system according to an embodiment of the present application;
FIG. 2B is a schematic diagram of a positioning principle provided by an embodiment of the present application;
FIG. 2C is a schematic view of an orientation principle provided by an embodiment of the present application;
FIG. 2D is a schematic view of another orientation principle provided by an embodiment of the present application;
fig. 3 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
FIG. 4 is a diagram of a software architecture of an electronic device provided by an embodiment of the present application;
FIG. 5 is a first application scenario provided by an embodiment of the present application;
fig. 6A is a schematic view of a setting interface of a mobile phone according to an embodiment of the present application;
fig. 6B is a schematic view of another setting interface of the mobile phone according to an embodiment of the present application;
fig. 7 is a schematic view of a setting interface of a mobile phone according to an embodiment of the present application;
fig. 8 is a schematic view of another setting interface of the mobile phone according to an embodiment of the present application;
fig. 9A is a schematic view of an AOD interface of a mobile phone according to an embodiment of the present application;
fig. 9B is a schematic diagram of an AOD interface of a mobile phone according to an embodiment of the present application;
FIG. 10 is a third application scenario provided by an embodiment of the present application;
fig. 11A is a schematic view of an AOD interface of a mobile phone according to an embodiment of the present application;
fig. 11B is a schematic diagram of a control panel interface of a television displayed on a mobile phone according to an embodiment of the present application;
fig. 12A is a schematic view of an AOD interface of a mobile phone according to an embodiment of the present application;
fig. 12B is a schematic diagram illustrating a display interface of a tablet pc being invoked on a mobile phone according to an embodiment of the present application;
fig. 13A is a schematic diagram of positioning of a first mobile phone, a television and a tablet computer in a third application scenario;
fig. 13B is a schematic diagram of an angle between the first mobile phone and the television in the third application scenario;
fig. 13C is another schematic diagram of an angle between the mobile phone and the television in a third application scenario;
FIG. 13D is a schematic illustration of a handset orientation in a third application scenario;
fig. 14 is a schematic view of a scene change of a third application scene before and after the mobile phone rotates counterclockwise;
fig. 15 is a schematic diagram illustrating a change of a display interface of a mobile phone before and after counterclockwise rotation in a third application scenario;
FIG. 16 is a schematic diagram of another change of the AOD interface of the mobile phone before and after counterclockwise rotation in the third application scenario;
fig. 17 is a flowchart illustrating an implementation of an unlocking method according to an embodiment of the present application;
fig. 18 is a flowchart of an implementation of an unlocking method according to another embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The terminology used in the following embodiments is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of this application and the appended claims, the singular forms "a", "an" and "the" are intended to include plural forms such as "one or more", unless the context clearly indicates otherwise.
It should also be understood that in the embodiments of the present application, "a plurality" and "one or more" mean one, two or more; "and/or" describes an association relationship between associated objects, indicating that three relationships may exist. For example, "A and/or B" may represent: A alone, both A and B, or B alone, where A and B may each be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects before and after it.
The terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "upon", or "in response to determining".
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
At present, more and more applications are installed on electronic devices, and when a user wants to open a certain application or use a certain function of an application, a usually cumbersome sequence of user operations is required to enter the interface of that application or function. This application therefore provides an unlocking method and an electronic device that reduce the complexity of user operations and improve operation efficiency.
In order to better understand the technical solution of the present application, several important terms related to the present application will be introduced.
Always-on display (AOD)
Always-on display means that, without lighting up the whole screen of the electronic device, content such as the time, temperature, date, calendar, incoming-call information, or push messages is displayed directly in a partial area of the screen.
As a non-limiting example, FIG. 1 illustrates the AOD interface of a mobile phone. In the AOD interface shown in fig. 1, a partial area of the phone screen is lit up to display the time, date and battery level.
In practice, when the screen of the electronic device is not fully lit, the device is usually in the locked state.
Wireless positioning technology
Wireless positioning technology refers to measurement methods, i.e., positioning algorithms, for obtaining the location of a mobile device in various wireless networks. Wireless positioning technologies include, but are not limited to, ultra-wideband (UWB), wireless fidelity (Wi-Fi), and Bluetooth (BT).
The most common wireless positioning algorithms at present include: positioning based on angle of arrival (AOA), time of arrival (TOA), time difference of arrival (TDOA), or received signal strength (RSS), among others. In some practical applications, several positioning algorithms can be combined.
Due to the high time resolution of UWB signals, TOA and TDOA positioning algorithms achieve higher accuracy than the other algorithms. A currently effective solution for UWB positioning is a hybrid TOA/TDOA algorithm: the two algorithms complement each other, and combining their advantages yields high positioning accuracy.
For convenience of description, the embodiments of this application take UWB positioning based on the TOA algorithm as an example. It should be understood that UWB positioning is not a specific limitation of this application; any positioning method that can implement the technical solutions of this application may be used.
The TOA positioning algorithm works as follows: a base station sends a specific ranging command or signal to the electronic device and requests a response. The base station records the time elapsed from sending the ranging command to receiving the acknowledgement; this time consists mainly of the round-trip propagation delay of the radio-frequency signal (e.g., a UWB signal), the response and processing delay of the electronic device, and the processing delay of the base station. If the response and processing delays of the electronic device and the base station can be measured accurately, the round-trip propagation delay of the radio-frequency signal can be calculated. Since radio waves propagate through the air at the speed of light, the distance between the base station and the electronic device can then be estimated. When three base stations participate in the measurement, the position of the electronic device can be determined by triangulation.
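The two-way ranging step described above can be sketched as a few lines of arithmetic: subtract the known reply and processing delays from the measured round-trip time, then halve the remaining loop propagation delay and multiply by the speed of light. The parameter names are illustrative; real UWB ranging (e.g., double-sided two-way ranging) adds corrections for clock drift that are omitted here.

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def toa_distance(round_trip_s, reply_delay_s, base_processing_s=0.0):
    """Estimate the one-way distance from a two-way TOA ranging exchange.

    round_trip_s:      time from sending the ranging command to receiving
                       the acknowledgement, as measured by the base station.
    reply_delay_s:     the tag's known response + processing delay.
    base_processing_s: the base station's own known processing delay.
    """
    loop_propagation_s = round_trip_s - reply_delay_s - base_processing_s
    if loop_propagation_s < 0:
        raise ValueError("delays exceed the measured round-trip time")
    return SPEED_OF_LIGHT * loop_propagation_s / 2.0  # halve the loop delay
```

The subtraction step is why the text stresses that the device and base-station delays must be known accurately: at the speed of light, a 1 ns error in the delay estimate already shifts the distance by about 15 cm.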
As a non-limiting example, as shown in FIG. 2A, a UWB positioning system comprises three base stations carrying UWB modules (referred to as UWB base stations) and a mobile phone carrying a UWB module (i.e., the tag to be located). UWB positioning requires the UWB base stations to be installed in the environment in advance. In this example, three UWB base stations are installed: a first UWB base station 21, a second UWB base station 22, and a third UWB base station 23. These three base stations are used to locate the mobile phone 20 in the environment.
As shown in fig. 2B, the position coordinates (x0, y0) of the mobile phone 20 are obtained from the first distance r1 from the first UWB base station 21 to the mobile phone 20, the second distance r2 from the second UWB base station 22 to the mobile phone 20, and the third distance r3 from the third UWB base station 23 to the mobile phone 20, combined with the known position coordinates (x1, y1) of the first UWB base station 21, (x2, y2) of the second UWB base station 22, and (x3, y3) of the third UWB base station 23.
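Solving for (x0, y0) from the three distances is standard trilateration: each base station defines a circle, and subtracting the circle equations pairwise eliminates the quadratic terms, leaving a 2x2 linear system. The sketch below shows one common closed-form solution; it is a textbook method, not code from the patent, and it ignores measurement noise (real systems would use a least-squares fit).

```python
def trilaterate(p1, r1, p2, r2, p3, r3):
    """Solve a 2-D position from three base-station distances.

    Subtracting the circle equations (x - xi)^2 + (y - yi)^2 = ri^2
    pairwise linearises the problem into two equations in (x, y).
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 - x1**2 + x2**2 - y1**2 + y2**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        raise ValueError("base stations must not be collinear")
    x = (c1 * b2 - c2 * b1) / det   # Cramer's rule
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

The collinearity check matters in practice: if the three base stations lie on one line, the two linearised equations are dependent and the phone's position is not uniquely determined.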
UWB orientation can be achieved in two ways.
The first is the angle-of-arrival (AOA) measurement method, in which the UWB device to be oriented uses multiple antennas while the UWB base station (or the device carrying it) uses only one antenna. As shown in fig. 2C, the UWB base station transmits a special data packet via a low-power transmitter through its single antenna. A nearby low-power receiver, for example the receiver of a mobile phone, has multiple antennas arranged in an array; because the antennas are at slightly different distances from the transmitter, each antenna observes a different phase of the received signal. From these phase differences, the relative direction of the signal is computed, for example the angle θ of the mobile phone relative to the UWB base station.
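For a two-antenna array, the phase-difference-to-angle computation has a simple closed form under the usual far-field, narrowband assumption: θ = arcsin(Δφ · λ / (2π · d)), where d is the antenna spacing. This is the standard textbook relation, shown here as an illustration of the principle rather than the patent's actual signal processing.

```python
import math

def aoa_angle(phase_diff_rad, wavelength_m, antenna_spacing_m):
    """Angle of arrival (in degrees) from the phase difference between two
    receiver antennas spaced antenna_spacing_m apart.

    Assumes a far-field plane wave: the path-length difference between the
    antennas is antenna_spacing_m * sin(theta), which maps to a phase
    difference of 2*pi*path_diff/wavelength.
    """
    s = phase_diff_rad * wavelength_m / (2 * math.pi * antenna_spacing_m)
    if not -1.0 <= s <= 1.0:
        raise ValueError("phase difference inconsistent with this spacing")
    return math.degrees(math.asin(s))
```

Note that spacing the antennas at half a wavelength keeps the mapping unambiguous over the full ±90° range, which is why λ/2 arrays are the common choice.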
The second is the angle-of-departure (AOD) measurement method, in which the UWB device to be oriented uses only one antenna while the UWB base station (or the device carrying it) uses multiple antennas. As shown in fig. 2D, the mobile phone receives signals through a low-power receiver, and the UWB base station transmits a special data packet through a low-power transmitter while switching among its active antennas, which are arranged in an array. The phone's receiver takes IQ samples of the received signal and, knowing the antenna arrangement of the transmitter, computes the relative direction of the signal, for example the angles θ1 and θ2 of the mobile phone relative to two UWB base stations.
Through UWB orientation technology, the angle of the mobile phone relative to electronic devices carrying UWB modules, such as televisions, smart speakers, and tablet computers, can be calculated.
In order to explain the technical means of the present application, the following description will be given by way of specific examples.
The unlocking method provided by the embodiments of the present application can be applied to electronic devices, including but not limited to a mobile phone with a touch display screen, a wearable device, an in-vehicle device, an augmented reality (AR)/virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a tablet computer, a smart speaker, or a television. The embodiments of the present application place no limit on the specific type of the electronic device.
In some embodiments of the present application, the electronic device may comprise a portable, handheld, or mobile electronic device, such as a cell phone, tablet, wearable device, or portable game console, among others.
Fig. 3 shows a schematic structural diagram of the electronic device 100, taking a mobile phone as an example.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit the audio signal to the wireless communication module 160 through a UART interface, so as to realize the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. It can also be used to connect an earphone and play audio through the earphone. The interface may further be used to connect other electronic devices, such as AR devices.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only illustrative, and is not limited to the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., Wi-Fi networks), BT, Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), UWB, and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), UWB, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, with N being a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and conversion into an image visible to the naked eye. The ISP can also perform algorithm optimization on the noise, brightness, and skin color of the image. The ISP can also optimize parameters such as exposure and color temperature of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The program storage area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like. The data storage area may store data (such as audio data and a phone book) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), and the like. The processor 110 performs various functional applications and data processing of the electronic device 100 by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also called a "mic" or "mouthpiece", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a sound signal into the microphone 170C by speaking with the mouth close to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headphone interface 170D is used to connect a wired headphone. The headset interface 170D may be the USB interface 130, or may be a 3.5mm open mobile electronic device platform (OMTP) standard interface, a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. Pressure sensors 180A come in many varieties, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 100 detects the intensity of the touch operation through the pressure sensor 180A. The electronic device 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations applied to the same touch position but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation with an intensity smaller than a first pressure threshold acts on the short message application icon, an instruction for viewing the short message is executed. When a touch operation with an intensity greater than or equal to the first pressure threshold acts on the short message application icon, an instruction for creating a new short message is executed.
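The threshold behavior described above (a light press views a message, a firm press composes one) can be sketched as a simple dispatch function; the threshold value and instruction names are illustrative placeholders, not values from the application.

```python
def dispatch_touch(pressure, first_pressure_threshold=0.5):
    """Map the intensity of a touch on the short-message icon to an
    operation instruction, mirroring the threshold rule described above.
    The threshold 0.5 is an illustrative placeholder value."""
    if pressure < first_pressure_threshold:
        return "view_message"       # light press: view the short message
    return "new_message"            # firm press: create a new short message

print(dispatch_touch(0.2))  # view_message
print(dispatch_touch(0.8))  # new_message
```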
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for shooting anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance that the lens module needs to compensate according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through reverse movement, thereby achieving anti-shake. The gyro sensor 180B may also be used in navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
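The altitude calculation mentioned above is commonly performed with the international barometric formula; the following sketch is one plausible implementation under that assumption, not the application's own algorithm.

```python
def altitude_from_pressure(p_hpa, p0_hpa=1013.25):
    """Estimate altitude (meters) from barometric pressure (hPa) using
    the standard international barometric formula; p0_hpa is the
    assumed sea-level reference pressure."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

print(round(altitude_from_pressure(1013.25), 1))  # 0.0 at sea level
print(round(altitude_from_pressure(899.0)))       # roughly 1000 m
```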
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. Features such as automatic unlocking upon flip opening can then be set according to the detected opening and closing state of the holster or the flip cover.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically along three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to recognize the posture of the electronic device, and is applied in landscape/portrait screen switching, pedometers, and other applications.
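A minimal sketch of the landscape/portrait recognition mentioned above, inferred from the gravity components reported by the acceleration sensor. The axis convention (x to the right and y up in the screen plane) and the orientation names are assumptions for illustration.

```python
def screen_orientation(ax, ay):
    """Infer the device posture from the gravity components (m/s^2)
    along the screen-plane axes of a 3-axis accelerometer.
    Assumed convention: x points right, y points up on the screen."""
    if abs(ay) >= abs(ax):
        # Gravity dominates along the long axis: portrait posture
        return "portrait" if ay > 0 else "portrait_inverted"
    # Gravity dominates along the short axis: landscape posture
    return "landscape_left" if ax > 0 else "landscape_right"

print(screen_orientation(0.3, 9.6))   # portrait
print(screen_orientation(9.7, 0.2))   # landscape_left
```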
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, taking a picture of a scene, electronic device 100 may utilize range sensor 180F to range for fast focus.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light outward through the light-emitting diode and detects infrared light reflected from nearby objects using the photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100; when insufficient reflected light is detected, the electronic device 100 may determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear for a call, so as to automatically turn off the screen and save power. The proximity light sensor 180G may also be used in a holster mode and a pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to lower power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold, to prevent a low temperature from causing the electronic device 100 to shut down abnormally. In other embodiments, when the temperature is lower than a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid abnormal shutdown caused by low temperature.
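The three-threshold temperature strategy described above can be sketched as a simple policy function; all threshold values and action names below are illustrative placeholders, not values from the application.

```python
def thermal_policy(temp_c, high=45.0, low=0.0, very_low=-10.0):
    """Choose a protection action from the reported temperature (Celsius),
    following the three-threshold strategy described above.
    All thresholds are illustrative placeholder values."""
    if temp_c > high:
        return "throttle_processor"     # reduce performance near the sensor
    if temp_c < very_low:
        return "boost_battery_voltage"  # avoid abnormal low-temp shutdown
    if temp_c < low:
        return "heat_battery"           # warm the battery at low temperature
    return "normal"

print(thermal_policy(50))    # throttle_processor
print(thermal_policy(-5))    # heat_battery
print(thermal_policy(-20))   # boost_battery_voltage
```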
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key inputs and generate key signal inputs related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. Multiple cards can be inserted into the same SIM card interface 195 at the same time, and the types of the multiple cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, as well as with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present invention uses an Android system with a layered architecture as an example to exemplarily illustrate a software structure of the electronic device 100.
Fig. 4 is a block diagram of the software configuration of the electronic apparatus 100 according to the embodiment of the present invention.
The layered architecture divides the software into several layers, each with a clear role and division of labor. The layers communicate with each other through software interfaces. In some embodiments, the Android system is divided into four layers, which are, from top to bottom, the application layer, the application framework layer, the Android runtime and system libraries, and the kernel layer.
The application layer may include a series of application packages.
As shown in fig. 4, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 4, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide the communication functions of the electronic device 100, for example, management of call status (including connected, hung up, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables applications to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a brief stay without requiring user interaction, for example, to notify of a completed download or provide a message alert. The notification manager may also present notifications in the system's top status bar in the form of a chart or scroll-bar text, such as notifications from applications running in the background, or notifications that appear on the screen in the form of a dialog window. For example, it may display prompt text in the status bar, sound an alert tone, vibrate the electronic device, flash the indicator light, and so on.
The Android runtime comprises core libraries and a virtual machine, and is responsible for scheduling and managing the Android system.
The core libraries comprise two parts: one part is the functional interfaces that the Java language needs to call, and the other part is the core libraries of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files, and performs functions such as object life-cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), Media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports playback and recording of a variety of commonly used audio and video formats, as well as still image files and the like. The media library may support a variety of audio and video encoding formats, such as MPEG-4, H.264, MP3, AAC, AMR, JPG, and PNG.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer contains at least a display driver, a camera driver, an audio driver, and a sensor driver.
The following illustrates the workflow of the software and hardware of the electronic device 100 with an example.
When the touch sensor 180K of the electronic device 100 receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and a timestamp of the touch operation). The raw input event is stored at the kernel layer. The application framework layer obtains the raw input event from the kernel layer and identifies the control corresponding to the input event.
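The kernel-to-framework flow above can be sketched in plain Java. This is a minimal illustration under assumed names (it is not actual Android framework code): the kernel layer packages a touch into a raw input event, and the framework layer identifies the control whose bounds contain the touch coordinates.

```java
import java.util.List;

public class InputDispatchSketch {
    // Raw input event produced by the kernel layer: touch coordinates plus a timestamp.
    public record RawInputEvent(int x, int y, long timestampMs) {}

    // A registered control occupying a rectangular screen region (right/bottom exclusive).
    public record Control(String name, int left, int top, int right, int bottom) {
        boolean contains(int x, int y) {
            return x >= left && x < right && y >= top && y < bottom;
        }
    }

    // The framework layer resolves the event to the first control that contains it.
    public static String identifyControl(List<Control> controls, RawInputEvent ev) {
        for (Control c : controls) {
            if (c.contains(ev.x(), ev.y())) {
                return c.name();
            }
        }
        return "none";
    }
}
```

In the real system this resolution happens inside the window and view hierarchy; the sketch only shows the coordinate-to-control lookup that the paragraph describes.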
The following illustrates application scenarios and implementation flows of the embodiments of the present application by way of non-limiting examples. For ease of description, a mobile phone is used as an example of the electronic device in the following application scenarios. It should be noted that the exemplary description of an application scenario is not intended to limit that scenario; various modifications, combinations, substitutions, or alterations may be contemplated in connection with the illustrative descriptions of the various application scenarios.
First application scenario
The first application scenario is one in which the mobile phone is unlocked from the always-on display interface and directly enters a specific function or specific application of the mobile phone.
At present, under-display fingerprint technology is increasingly mature, major terminal manufacturers have begun to invest heavily in full-screen fingerprint technology, and terminal products supporting full-screen fingerprint unlocking are already available. In the first application scenario, the mobile phone performs user identity authentication by means of under-display fingerprint recognition, and the mobile phone supports full-screen fingerprint recognition.
As shown in diagram A in fig. 5, the always-on display interface of the mobile phone displays 6 blocks in addition to the time, date, and battery level. Among the 6 blocks, one is a first block 51 and another is a second block 52.
When the user is ready to pay with the mobile phone in a convenience store, the mobile phone receives a pressing operation by a finger of the user, such as the left or right thumb, on the first block 51 shown in diagram A in fig. 5. After the fingerprint recognition passes, that is, the user identity authentication succeeds, the mobile phone is unlocked and directly enters, from the always-on display interface, a payment code interface provided by an application such as AlipayTM, WeChatTM, or a financial client. The payment code interface may be as shown in diagram B in fig. 5, where the payment code is a two-dimensional code.
Some time after payment, the mobile phone automatically locks the screen and displays the always-on display interface. After walking out of the convenience store, the user wants to quickly view chat messages in an instant messaging application such as WeChatTM. When the mobile phone receives a pressing operation by a finger of the user, such as the left or right thumb, on the second block 52 shown in diagram A in fig. 5, and the fingerprint recognition passes, that is, the user identity authentication succeeds, the mobile phone can directly enter the display interface of the chat message list from the always-on display interface. The display interface of the chat message list may be as shown in diagram C in fig. 5.
In this process, the mobile phone determines that the user wishes to unlock the mobile phone according to a detected user operation, for example, the user picking up or shaking the mobile phone, pressing a mechanical key such as the power key or a volume key, or touching or tapping the screen, and the mobile phone displays the always-on display interface shown in diagram A in fig. 5. The user presses a finger in the first block 51 to unlock the mobile phone; the mobile phone collects the user's fingerprint according to the pressing operation on the first block 51 and authenticates the fingerprint. After the authentication passes, the screen lights up and the payment code interface is presented. After the payment code is scanned and payment is completed, the mobile phone may automatically enter the screen-locked state, or enter the screen-locked state according to a received screen-locking operation input by the user (such as pressing a key of the mobile phone). When the mobile phone enters the screen-locked state, the screen turns off (that is, goes black). Some time after the screen turns off, the user wants to check chat messages. The mobile phone determines that the user wishes to unlock the mobile phone according to a detected user operation or action, for example, the user's finger touching the screen, and presents the always-on display interface. The user presses a finger in the second block 52 of the always-on display interface to unlock the mobile phone; the screen lights up and the display interface of the chat message list is presented.
The always-on display interface shown in diagram A in fig. 5 displays 6 blocks. Apart from the first block 51 and the second block 52, each of the remaining 4 blocks may also correspond to the interface of a specific function or application. When the user presses one of these blocks and the fingerprint recognition passes, that is, the user identity authentication succeeds, the mobile phone enters the function or application interface corresponding to that block.
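The block-to-target mapping described above can be sketched as a small lookup table. This is a hedged illustration: the block ids and target names are hypothetical, and a real device would authenticate through the fingerprint service and launch the target via an Intent rather than returning a string.

```java
import java.util.HashMap;
import java.util.Map;

public class AodShortcutTable {
    private final Map<Integer, String> targets = new HashMap<>();

    // Bind a block on the always-on display to a function or application.
    public void bind(int blockId, String target) {
        targets.put(blockId, target);
    }

    // After a press on a block: stay locked if authentication fails; otherwise
    // open the bound target, or fall back to the home screen for unbound blocks.
    public String onPress(int blockId, boolean authPassed) {
        if (!authPassed) {
            return "locked";
        }
        return targets.getOrDefault(blockId, "home");
    }
}
```

For the scenario in fig. 5, block 51 would be bound to a payment-code target and block 52 to a chat-list target.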
It should be noted that, in the example shown in fig. 5, a two-dimensional code is adopted as the payment code in the payment code interface. In other examples, the payment code interface may also adopt a barcode, a combination of a barcode and a two-dimensional code, or the like, as the payment code. The specific presentation form of the payment code is not particularly limited in the embodiments of the present application.
In the embodiments of the present application, combined with the visual feedback of the always-on display, unlocking areas are divided on a terminal screen that supports full-screen fingerprint recognition, and each area serves as a shortcut path into a specific function or application. The user can quickly enter a specific function or application through fingerprint unlocking, which improves the user's operation efficiency.
It should be noted that the number of blocks included in the always-on display interface and/or the function or application corresponding to each block may be set by system default or customized by the user.
In some possible implementations, the electronic device may provide a setting interface for the shortcut paths, in which the user may modify or edit the number of blocks in the always-on display interface and/or the specific function or application corresponding to each block. This is not limited in this application.
As a non-limiting example, as shown in fig. 6A and 6B, a shortcut path setting interface 61 of the mobile phone provides 6 selectable shortcut paths, each corresponding to a bar control. For example, the upper left block corresponds to bar control 611 and the lower left block corresponds to bar control 612.
As shown in fig. 6A, the mobile phone receives a click operation applied by the user to the bar control 611, determines that the block the user wants to set this time is the upper left block, and presents the setting interface 62 for the upper left block. The setting interface 62 for the upper left block may include a block layout display area 621, a system recommendation area 622, and a user-defined area 623. The block layout display area 621 is used to display the position of the upper left block in the overall layout and visually identify the block currently being set, making it easy for the user to distinguish different blocks. In the block layout display area 621 shown in fig. 6A, the upper left block is displayed in a style different from that of the other 5 blocks for distinction. The system recommendation area 622 is used to display the function or application that the system recommends setting. The system recommendation area 622 shown in fig. 6A displays the riding code function of an application recommended by the system. The user can click the switch control 6221 in the system recommendation area 622, and the mobile phone, according to the click operation input by the user to turn on the switch control 6221, sets the upper left block to correspond to the function or application recommended by the system, that is, the riding code function. The user-defined area 623 is used for the user to set a custom function or application corresponding to the upper left block. The user clicks the drop-down menu control 6231 in the user-defined area 623 and can select a function or application from the drop-down menu list as the function or application corresponding to the upper left block.
As shown in fig. 6B, the mobile phone receives a click operation applied by the user to the bar control 612, determines that the block the user wants to set this time is the lower left block, and presents the setting interface 63 for the lower left block. The setting interface 63 for the lower left block may include a block layout display area 631, a system recommendation area 632, and a user-defined area 633. The block layout display area 631 is used to display the position of the lower left block in the overall layout and visually identify the block currently being set, making it easy for the user to distinguish different blocks. In the block layout display area 631 shown in fig. 6B, the lower left block is displayed in a style different from that of the other 5 blocks for distinction. The system recommendation area 632 is used to display the function or application that the system recommends setting. The system recommendation area 632 shown in fig. 6B displays the payment code function of an application recommended by the system. The user can click the switch control 6321 in the system recommendation area 632, and the mobile phone, according to the click operation input by the user to turn on the switch control 6321, sets the lower left block to correspond to the function or application recommended by the system, that is, the payment code function. The user-defined area 633 is used for the user to set a custom function or application corresponding to the lower left block. The user clicks the drop-down menu control 6331 in the user-defined area 633 and can select a function or application from the drop-down menu list as the function or application corresponding to the lower left block.
As shown in fig. 6A and 6B, the shortcut path setting interface 61 includes 6 selectable shortcut paths, of which two blocks have been given corresponding shortcut paths: the lower left block is set to correspond to the payment code function, and the middle right block is set to correspond to the chat message list of an application. Based on this setting, the user can enter the function or application corresponding to each of these blocks by triggering the lower left block and the middle right block in the always-on display interface; for example, see the example shown in fig. 5. It should be noted that, in other examples, when two blocks have corresponding shortcut paths, only those two blocks may be displayed in the always-on display interface, while other blocks without corresponding functions or applications are not displayed.
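The per-block settings described above (a system-recommended target behind a switch control, plus an optional user-defined choice from a drop-down menu) suggest a simple data model. The sketch below is an assumption about how such settings could be resolved, not the patent's actual implementation; all names are illustrative.

```java
import java.util.Optional;

public class BlockSetting {
    private final String systemRecommendation; // e.g. "riding_code"
    private boolean recommendationEnabled;     // the switch control in the UI
    private String userChoice;                 // picked from the drop-down menu

    public BlockSetting(String systemRecommendation) {
        this.systemRecommendation = systemRecommendation;
    }

    public void setRecommendationEnabled(boolean on) { recommendationEnabled = on; }
    public void setUserChoice(String choice) { userChoice = choice; }

    // The user-defined choice takes precedence; otherwise the recommendation
    // applies only when its switch is on; otherwise the block has no shortcut
    // and can be hidden from the always-on display.
    public Optional<String> effectiveTarget() {
        if (userChoice != null) return Optional.of(userChoice);
        if (recommendationEnabled) return Optional.ofNullable(systemRecommendation);
        return Optional.empty();
    }
}
```

An empty result models the "blocks without corresponding functions or applications are not displayed" behavior noted above.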
As another non-limiting example, as shown in fig. 7, the shortcut path setting interface 71 of the mobile phone provides multiple selectable block layout forms, and the user can switch between different layout forms by operation. The layout display area 711 displays a layout of 4 blocks. The user can click the right control 7111 in the layout display area 711 or input a rightward sliding touch operation, and the mobile phone switches to the layout display area 712 containing a layout of 5 blocks according to the received click operation on the right control 7111 or the rightward sliding touch operation input by the user. The user can click the left control 7112 in the layout display area 711 or input a leftward sliding touch operation, and the mobile phone switches to the layout display area 713 containing a layout of 3 blocks according to the received click operation on the left control 7112 or the leftward sliding touch operation input by the user.
Based on the example shown in fig. 7, in some implementations, the user may gradually increase the number of blocks displayed in the layout display area by repeatedly inputting a sliding touch operation in a first direction, for example, to the right, and may gradually decrease the number of blocks by repeatedly inputting a sliding touch operation in the direction opposite to the first direction, for example, to the left. Alternatively, in some other implementations, the user may gradually decrease the number of blocks displayed in the layout display area by repeatedly inputting a sliding touch operation in the first direction, for example, to the right, and gradually increase the number of blocks by repeatedly inputting a sliding touch operation in the opposite direction, for example, to the left.
On the basis of the example shown in fig. 7, as shown in fig. 8, the layout display area 711 in the shortcut path setting interface 71 displays a layout of 4 blocks, and the user can change the display position of any block by long-pressing and dragging that block, thereby changing the block layout. For example, as shown in fig. 8, the mobile phone receives a long-press-and-drag touch operation applied by the user to the block 7113, and moves the block 7113 to the target position, that is, the position where the long-press-and-drag touch operation is lifted (released).
On the basis of the examples shown in fig. 7 and 8, when the layout display area 711 in the shortcut path setting interface 71 displays a layout of 4 blocks, the user can set a function or application for any block by long-pressing that block. That is, the mobile phone can enter the setting interface of a block according to a long-press operation input by the user on that block, and the function or application corresponding to the block can then be set in that setting interface. The setting interface of a block may be similar to the setting interface 62 for the upper left block shown in fig. 6A or the setting interface 63 for the lower left block shown in fig. 6B.
Based on the examples shown in fig. 7 and 8, a user may customize the number and/or layout of the blocks displayed in the always-on display interface.
It should be understood that the user interfaces shown in fig. 6A, 6B, 7, and 8 are merely exemplary descriptions. In actual use, the user interface may include more or fewer interface elements than in fig. 6A, 6B, 7, and 8, and the interface layout may be different.
It should be noted that, in the example shown in fig. 5, a rectangle is used as the identifier of each shortcut path. In other examples, icons, patterns, and/or text corresponding to the functions or applications may also be used as the identifiers of the shortcut paths. For example, as shown in fig. 9A and 9B, using an icon or pattern corresponding to a function or application as the identifier of a shortcut path makes it easy for the user to distinguish the shortcut paths, lowers the threshold for user operation, and improves operation accuracy. In the example shown in fig. 9A and 9B, icon 91 in the always-on display interface corresponds to a calculator application, icon 92 corresponds to a music player application, icon 93 corresponds to a mail application, icon 94 corresponds to the WeChatTM application, and pattern 95 corresponds to the two-dimensional code payment function of an application. By pressing the screen area displaying icon 91, the user can unlock the mobile phone and enter the calculator application interface. By pressing the screen area displaying icon 92, the user can unlock the mobile phone and enter the music player application. By pressing the screen area displaying icon 93, the user can unlock the mobile phone and enter the mail application. By pressing the screen area displaying icon 94, the user can unlock the mobile phone and enter the WeChatTM application. By pressing the screen area displaying pattern 95, the user can unlock the mobile phone and enter the two-dimensional code payment interface of an application.
In the example shown in fig. 9A and 9B, the shortcut path identifiers in the always-on display interface are displayed in different areas of the screen. The display area of a shortcut path identifier may follow the system setting or a user-defined setting. In some embodiments, when the mobile phone screen is large, to facilitate user operation and improve operation efficiency, the shortcut path identifiers may be displayed in the lower area of the screen, as shown in fig. 9B, according to a system default setting or a user-defined setting. In other embodiments, when the mobile phone screen is large, to further facilitate user operation and further improve operation efficiency, the shortcut identifiers may be displayed in the lower left or lower right area of the screen according to the detected left-hand or right-hand holding state. In other embodiments, the shortcut path identifiers in the always-on display interface may change their display area, and similarly, other content in the always-on display interface may also change its display area; for example, the display position of the displayed content is changed once every preset time interval, or each display position of the content differs from the previous one. It should be understood that these are merely examples and are not to be construed as specific limitations on this application.
In other possible implementations, the mobile phone determines the function or application corresponding to each block according to one or a combination of the user's usage habits for installed applications or functions, the data traffic consumed, scene information, and the like. The user's usage habits may include usage patterns, usage duration, and/or usage frequency, etc. The scene information includes location information and/or motion state information, etc.
As a non-limiting example, the mobile phone may rank applications and/or functions according to their usage duration, number of uses, or amount of data traffic consumed over a certain past period. Compared with lower-ranked applications and/or functions, higher-ranked ones have a greater probability of being what the user wants to use after unlocking.
For example, the mobile phone sets 6 shortcut paths by default, and analyzes the user's historical usage data to obtain a usage ranking of applications and/or functions over the last week, where the top 6 applications and/or functions are, in order: WeChatTM, a microblog application, news, music, phone, and the payment code of AlipayTM. The mobile phone therefore sets these 6 applications and functions as shortcut paths, and displays the respective identifiers of the 6 shortcut paths on the always-on display interface for the user to select, so that the user can quickly enter these applications and/or functions.
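The ranking step above can be sketched as picking the top N entries from per-item usage statistics (for example, minutes of use over the past week). The data values and names below are made up for illustration.

```java
import java.util.Comparator;
import java.util.List;
import java.util.Map;

public class ShortcutRanker {
    // Rank applications and/or functions by a usage metric and keep the top n
    // as shortcut paths for the always-on display.
    public static List<String> topShortcuts(Map<String, Long> usage, int n) {
        return usage.entrySet().stream()
                .sorted(Map.Entry.<String, Long>comparingByValue(Comparator.reverseOrder()))
                .limit(n)
                .map(Map.Entry::getKey)
                .toList();
    }
}
```

The same helper works whether the metric is usage duration, number of uses, or data traffic consumed, since only the relative ordering matters.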
As a non-limiting example, the mobile phone may compile daily usage statistics of applications and/or functions based on records of their daily use over a certain past period. The time periods in which the mobile phone is used intensively during a day may include N time periods, for example, 5 time periods. For each time period, the usage duration and the like of each application and/or function is recorded, and the applications and/or functions are ranked by usage duration. For each time period, the top M (M is a positive integer) applications and/or functions are set as the shortcut paths for that period, because compared with lower-ranked ones, the top M applications and/or functions have a greater probability of being what the user wants to use after unlocking.
For example, the mobile phone analyzes the user's historical usage data and identifies 4 time periods of intensive daily use, such as 8:00 to 10:00, 12:00 to 13:30, 18:00 to 20:00, and 20:30 to 23:00. The usage of applications and/or functions is counted separately for each of the 4 time periods and ranked by usage rate. The mobile phone sets the top M (M is a positive integer) applications and/or functions as shortcut paths, and displays the respective identifiers of the shortcut paths on the always-on display interface for the user to select, so that the user can quickly enter these applications and/or functions. It should be noted that the numbers of shortcut paths corresponding to the 4 time periods may be the same or different, which is not limited in this application.
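The per-time-period selection above can be sketched as a lookup over half-open periods, with a default set outside every period. The periods and shortcut names below mirror the example in the text but are otherwise illustrative assumptions.

```java
import java.time.LocalTime;
import java.util.ArrayList;
import java.util.List;

public class TimedShortcuts {
    // A half-open time period [start, end) with its own shortcut set.
    record Period(LocalTime start, LocalTime end, List<String> shortcuts) {
        boolean covers(LocalTime t) { return !t.isBefore(start) && t.isBefore(end); }
    }

    private final List<Period> periods = new ArrayList<>();
    private final List<String> fallback;

    public TimedShortcuts(List<String> fallback) { this.fallback = fallback; }

    public void addPeriod(LocalTime start, LocalTime end, List<String> shortcuts) {
        periods.add(new Period(start, end, shortcuts));
    }

    // Shortcuts to show on the always-on display at the given time of day.
    public List<String> shortcutsAt(LocalTime now) {
        for (Period p : periods) {
            if (p.covers(now)) return p.shortcuts();
        }
        return fallback;
    }
}
```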
As another non-limiting example, the mobile phone may determine the user's current location based on positioning, and determine the environment in which the user is located based on the current location, such as a shopping mall, an airport, a home, a subway, or a movie theater. The mobile phone then displays the shortcut paths associated with that scene on the always-on display interface.
For example, the mobile phone locates the current position and determines from it that the user's environment is a shopping mall, so the mobile phone sets shopping applications, payment functions, and the like as shortcut paths, and displays the icons or patterns corresponding to the shopping applications, payment functions, and the like on the always-on display interface for the user to select, so that the user can quickly enter these applications and/or functions.
For another example, the mobile phone locates the current position and determines from it that the user is at home, so it sets applications or functions for controlling home devices, reading applications, music playing applications, and the like as shortcut paths, and displays the corresponding icons or patterns on the always-on display interface for the user to select, so that the user can quickly enter these applications and/or functions.
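The scene-based selection in the two examples above reduces to a mapping from the environment derived from positioning to a shortcut set. The scene names and shortcut lists below are assumptions for illustration only.

```java
import java.util.List;
import java.util.Map;

public class SceneShortcuts {
    // Scene (derived from the current position) -> shortcut paths to display.
    private static final Map<String, List<String>> BY_SCENE = Map.of(
            "mall", List.of("shopping_app", "payment_code"),
            "home", List.of("smart_home", "reader", "music"),
            "subway", List.of("riding_code"));

    // An unknown scene yields no scene-specific shortcuts.
    public static List<String> forScene(String scene) {
        return BY_SCENE.getOrDefault(scene, List.of());
    }
}
```

In practice the scene would come from a positioning service rather than a string literal; a default shortcut set could be substituted for the empty list.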
As another non-limiting example, the mobile phone may obtain its motion state information from positioning and/or an acceleration sensor and/or a gyroscope sensor, etc., and may determine the user's motion state, such as running, walking, or riding, from that information. The mobile phone may then display shortcut paths associated with the motion state on the display screen of the mobile phone.
For example, the mobile phone determines that the current movement speed is slow and that the user is in a jogging state, so it sets a motion management application, a music playing application, and the like as shortcut paths, and displays the corresponding icons or patterns, such as those of the motion management application, the music playing application, or a music control function, on the always-on display interface for the user to select, so that the user can quickly enter these applications and/or functions.
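The motion-state inference above can be sketched as thresholding a speed estimate derived from positioning and/or acceleration data into a coarse state. The thresholds (in km/h) are assumptions for illustration, not values from the text.

```java
public class MotionClassifier {
    // Map an estimated speed to a coarse motion state. Threshold values are
    // hypothetical; a real implementation would also smooth sensor noise.
    public static String classify(double speedKmh) {
        if (speedKmh < 6) return "walking";
        if (speedKmh < 12) return "jogging";
        if (speedKmh < 30) return "riding";
        return "vehicle";
    }
}
```

The returned state would then select a shortcut set, in the same way the scene does in the location-based example.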
Second application scenario
The second application scenario is one in which the mobile phone is unlocked from the always-on display interface and enters a specific function or specific application of the mobile phone.
At present, face recognition technology is increasingly mature and secure, and terminal products unlocked through face recognition are already available. In the second application scenario, the mobile phone performs user identity authentication by starting the camera for face recognition. The mobile phone supports face recognition.
Continuing with fig. 5, as shown in diagram A, the always-on display interface of the mobile phone displays 6 blocks in addition to the time, date, and battery level. Among the 6 blocks, one is a first block 51 and another is a second block 52.
When the user is ready to pay with the mobile phone in a convenience store and the mobile phone detects a touch operation by the user's finger or a stylus on the first block 51 shown in diagram A in fig. 5, the mobile phone starts the front camera 53 to capture an image of the user's face. When the face recognition passes, that is, the user identity authentication succeeds, the mobile phone is unlocked and directly enters, from the always-on display interface, a payment code interface provided by an application such as AlipayTM, WeChatTM, or a financial client. The payment code interface may be as shown in diagram B in fig. 5, where the payment code is a two-dimensional code.
Some time after payment, the mobile phone locks the screen and displays the always-on display interface. After walking out of the convenience store, the user wants to quickly view chat messages in an instant messaging application such as WeChatTM. When the mobile phone detects a touch operation by the user's finger or a stylus on the second block 52 shown in diagram A in fig. 5, the mobile phone starts the front camera 53 to capture an image of the user's face. When the face recognition passes, that is, the user identity authentication succeeds, the mobile phone is unlocked and can directly enter the display interface of the chat message list from the always-on display interface. The display interface of the chat message list may be as shown in diagram C in fig. 5.
It should be noted that the second application scenario adopts a user identity authentication method different from that of the first application scenario, and the rest of the process is the same as that of the first application scenario, which is not described herein again.
Third application scenario
The third application scenario is an application scenario of cross-device interaction. In the third application scenario, the user sits on a sofa at home holding a mobile phone, with a tablet computer on the left, a television directly in front, a smart speaker on the right, and a spare mobile phone on the left side of the sofa. The third application scenario thus includes a first mobile phone 1010 held by the user, a television 1020, a smart speaker 1030, a tablet computer 1040, and a second mobile phone 1050 (i.e., the spare mobile phone).
In the scenario shown in fig. 10, the first mobile phone 1010 is taken as a search device, and the other devices are taken as connectable devices. The first handset 1010 may establish a communication connection with one or more of the other devices. Other devices include a television 1020, a smart speaker 1030, a tablet 1040, and a second handset 1050. It should be appreciated that in other application scenarios, second handset 1050 may act as a search device and first handset 1010 may act as a connectable device. The roles of the devices may be interchanged according to the actual situation of the application scenario, which is not limited in this application and is only an exemplary illustration here.
In the scenario shown in fig. 10, the first handset 1010 is a searching device that searches for surrounding connectable devices, and the first handset 1010 may wirelessly communicate with any one or more of the searched surrounding connectable devices using a wireless communication technology supported by the devices. The Wireless communication technologies supported by the device include, but are not limited to, Wi-Fi, BT, IR, GPS, High Performance Wireless local area network (High Performance Radio LAN), Radio Frequency (RF), Wireless USB (WUSB), UWB, or the like. It should be appreciated that in actual practice, other wireless communication techniques may also be employed, or alternatively, wired communication techniques may also be employed. In the description of the third application scenario, it is exemplified that the first mobile phone 1010 can perform wireless communication with other peripheral devices, that is, the first mobile phone 1010 can establish a wireless communication connection with the television 1020, the smart sound box 1030, the tablet computer 1040, and the second mobile phone 1050.
After the first mobile phone 1010 is in wireless communication connection with the television 1020, the smart sound box 1030, the tablet pc 1040 and the second mobile phone 1050 respectively, data interaction between the first mobile phone 1010 and the television 1020, between the first mobile phone 1010 and the smart sound box 1030, between the first mobile phone 1010 and the tablet pc 1040, and between the first mobile phone 1010 and the second mobile phone 1050 can be realized, so that the first mobile phone 1010 controls the television 1020, the smart sound box 1030, the tablet pc 1040 and the second mobile phone 1050, and the like.
In a first implementation, the information screen display interface of the first mobile phone 1010 may display device identifiers of at least some of the connectable devices. Each device identifier serves as a shortcut path identifier. A connectable device is a peripheral device searched by the first mobile phone 1010. The user may select one connectable device, referred to as the target device, in the information screen display interface of the first mobile phone 1010. In response to the user pressing the device identifier of the target device, the first mobile phone 1010 starts fingerprint recognition or face recognition to unlock the first mobile phone 1010. After successful unlocking, the first mobile phone 1010 establishes a wireless communication connection with the target device and enables the shortcut path associated with the target device.
In a second implementation, the information screen display interface of the first mobile phone 1010 may display device identifiers of at least some of the connected devices. Each device identifier serves as a shortcut path identifier. A connected device is a peripheral device that has established a wireless connection with the first mobile phone 1010. The user may select one connected device, referred to as the target device, in the information screen display interface of the first mobile phone 1010. In response to the user pressing the device identifier of the target device, the first mobile phone 1010 starts fingerprint recognition or face recognition to unlock the first mobile phone 1010. After successful unlocking, the first mobile phone 1010 may enable the shortcut path associated with the target device.
In the first implementation and the second implementation, a device identifier includes, but is not limited to, a combination of one or more of patterns, text, and the like.
Patterns include, but are not limited to, regular geometric patterns, irregular patterns, pictures, engineering drawings, and the like. For example, the pattern may take the form of an outline or schematic diagram of the connectable device. As another example, the pattern may take the form of a dot or any geometric shape.
Text includes, but is not limited to, a combination of one or more of letters, numbers, words, symbols (e.g., emoticons), and the like. For example, the text may take the device name or device type of the connectable device, etc. The device type is television, display screen, tablet computer or notebook computer, etc. As another example, the text may take the friendly name of the connectable device that is customized by the holder of the device, i.e., a name that can be recognized by other devices.
In some implementations, the pattern and text corresponding to the same connectable device may be displayed separately. For example, the pattern takes the form of an outline of the connectable device, with the device name displayed below the outline. In other implementations, the pattern and text corresponding to the same connectable device may be displayed in a fused manner. For example, the pattern takes the form of a rectangular frame within which the device name is displayed.
The device identifier may be displayed statically or dynamically, for example, by flashing.
It should be further noted that the display area of the device identifiers in the information screen display interface may be set by system default or customized by the user, and may include, for example, the upper area, the lower area, or the lower-left or lower-right area of the screen of the electronic device.
On the basis of the first implementation or the second implementation, in some implementations, the electronic devices corresponding to the device identifiers displayed in the information screen display interface are logged in to the same account as the first mobile phone 1010.
On the basis of the first implementation manner or the second implementation manner, in some implementation manners, the first mobile phone 1010 may determine, according to a system default setting or a user-defined setting, an upper limit of the number of the device identifiers displayed in the information screen display interface, and the like.
On the basis of the first implementation manner or the second implementation manner, in some implementation manners, the distribution of the device identifiers in the information screen display interface may not be mapped according to the spatial relationship of each connectable device. The distribution of the device identifiers in the information screen display interface can be set by default by a system or can be set by user definition.
The device identifiers may be distributed randomly or at regular intervals. The distribution of device identifications may form a regular geometric image, such as a line, triangle, or matrix. The distribution of device identifications may also form an irregular curve. For example, when the number of the device identifiers is three, the three device identifiers are respectively distributed at three corners of an equilateral triangle; for another example, when the number of the device identifiers is four, the four device identifiers are distributed at four corners of a rectangle.
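As a sketch of the regular distributions described above (three identifiers at the corners of an equilateral triangle, four at the corners of a rectangle), the positions can be computed as points evenly spaced on a circle. The normalized coordinate space, center, radius, and starting angle below are illustrative conventions, not values taken from the patent.

```python
import math

def layout_identifiers(n, center=(0.5, 0.5), radius=0.3, start_deg=90.0):
    """Return n (x, y) positions evenly spaced on a circle around center.

    With n == 3 the points form an equilateral triangle; with n == 4,
    a square. Coordinates use a normalized 0..1 screen space (an
    assumed convention for illustration).
    """
    cx, cy = center
    positions = []
    for i in range(n):
        a = math.radians(start_deg + 360.0 * i / n)
        positions.append((cx + radius * math.cos(a), cy + radius * math.sin(a)))
    return positions
```

For example, `layout_identifiers(3)` yields three points whose pairwise distances are equal, matching the equilateral-triangle example.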
As a non-limiting example, on the basis of the application scenario shown in fig. 10, as shown in fig. 11A, in addition to the time, date, and power, the information screen display interface of the first mobile phone 1010 displays device identifiers corresponding to four connectable devices, that is, device identifiers corresponding to the television 1020, the smart speaker 1030, the tablet computer 1040, and the second mobile phone 1050. The distribution of the four device identifiers forms a regular rectangle. The device identifier 1102 corresponds to the television 1020, the device identifier 1103 corresponds to the smart speaker 1030, the device identifier 1104 corresponds to the tablet computer 1040, and the device identifier 1105 corresponds to the second mobile phone 1050. In the example shown in fig. 11A, to help the user better distinguish among the connectable devices and improve operation efficiency and accuracy, each device identifier adopts an outline diagram of the corresponding device.
According to the shortcut path set by system default or customized by the user, when the first mobile phone 1010 detects that the user presses the area displaying the device identifier 1102 in fig. 11A, the first mobile phone 1010 starts fingerprint recognition or face recognition to authenticate the user identity. In some embodiments, when the user identity authentication passes, the first mobile phone 1010 unlocks, establishes a wireless communication connection with the television 1020, and enters the television control panel interface. The television control panel interface 1110 may be as shown in fig. 11B, where the user may control the television 1020 to turn on by clicking the power-on control 1111. In other embodiments, when the user identity authentication passes, the first mobile phone 1010 unlocks, establishes a wireless communication connection with the television 1020, and controls the television 1020 to turn on over that connection. In still other embodiments, when the user identity authentication passes, the first mobile phone 1010 unlocks, establishes a wireless communication connection with the television 1020, controls the television 1020 to turn on over that connection, and enters the television control panel interface. It should be understood that which operation the first mobile phone 1010 performs after the user identity authentication passes is determined by the shortcut path set by system default or customized by the user.
Based on the examples shown in fig. 11A and 11B, the implementation of first mobile phone 1010 controlling television 1020 to turn on may include the following two non-limiting examples.
First, the first mobile phone 1010 receives a user operation; for example, the user presses the device identifier 1102 corresponding to the television 1020 and successfully unlocks the first mobile phone 1010, or the user clicks the power-on control in the television control panel interface. The first mobile phone 1010 then generates a corresponding power-on instruction and, based on the communication connection established between the first mobile phone 1010 and the television 1020, sends the instruction to the television 1020, thereby controlling the television 1020 to turn on through the first mobile phone 1010.
Second, the first mobile phone 1010 receives a user operation; for example, the user presses the device identifier 1102 corresponding to the television 1020 and successfully unlocks the first mobile phone 1010, or the user clicks the power-on control in the television control panel interface. The first mobile phone 1010 then generates a control instruction and sends it to the cloud, for example, a smart-home cloud, which obtains the current state of the television 1020, forwards the control instruction to the television 1020, obtains the execution result from the television 1020, and returns the execution result to the first mobile phone 1010.
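The two control paths above (direct connection versus cloud relay) can be sketched as follows. `FakeTelevision`, `control_direct`, and `FakeSmartHomeCloud` are hypothetical stand-ins invented purely for illustration; they are not components or interfaces named by the patent.

```python
class FakeTelevision:
    """Hypothetical stand-in for the television 1020 (illustration only)."""

    def __init__(self):
        self.powered_on = False

    def execute(self, instruction):
        # Apply a control instruction and return an execution result.
        if instruction == "POWER_ON":
            self.powered_on = True
            return "ok"
        return "unsupported"


def control_direct(tv, instruction):
    # First path: the phone sends the instruction to the television
    # over their directly established communication connection.
    return tv.execute(instruction)


class FakeSmartHomeCloud:
    """Hypothetical stand-in for the smart-home cloud relay."""

    def __init__(self, tv):
        self.tv = tv

    def relay(self, instruction):
        # Second path: the cloud reads the device's current state,
        # forwards the instruction, and returns the execution result
        # to the phone.
        state_before = self.tv.powered_on
        result = self.tv.execute(instruction)
        return {"state_before": state_before, "result": result}
```

In both paths the television ends up powered on; the difference is only whether the instruction travels directly or via the cloud, which also reports the state observed before execution.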
As another non-limiting example, on the basis of the application scenario shown in fig. 10, as shown in fig. 12A, in addition to the time, date, and power, the information screen display interface of the first mobile phone 1010 displays device identifiers corresponding to 2 connected devices, that is, device identifiers corresponding to the smart speaker 1030 and the tablet computer 1040. The distribution of the 2 device identifiers is random. The device identifier 1203 corresponds to the smart speaker 1030, and the device identifier 1204 corresponds to the tablet computer 1040. In the example shown in fig. 12A, to help the user better distinguish among the connected devices and improve operation efficiency and accuracy, each device identifier adopts an outline diagram of the corresponding device.
According to the shortcut path set by system default or customized by the user, when the first mobile phone 1010 detects that the user presses the area displaying the device identifier 1204 in fig. 12A, the first mobile phone 1010 starts fingerprint recognition or face recognition to authenticate the user identity. When the user identity authentication passes, the first mobile phone 1010 is unlocked, and the unlocking interface of the first mobile phone 1010 and the display interface of the tablet computer 1040 are displayed at the same time. The two interfaces may be displayed in a split-screen manner or in a superimposed manner. In the case of superimposed display, as an example, as shown in fig. 12B, the display interface 1214 of the tablet computer 1040 may be superimposed on the unlocking interface 1211 of the first mobile phone 1010 in the form of a floating window.
In some embodiments, after the first mobile phone 1010 is successfully unlocked, the first mobile phone 1010 sends a screen-casting instruction to the tablet computer 1040 over the established wireless communication connection, and the tablet computer 1040 casts its display interface to the first mobile phone 1010 according to the screen-casting instruction. In this way, the display interface of the tablet computer is directly invoked across devices on the first mobile phone 1010 and can be transferred to the mobile phone for display and operation.
It should be noted that the display interface of the tablet pc that can be invoked by the first mobile phone 1010 may be set by a user of the tablet pc or set by a system default of the tablet pc.
If the first mobile phone 1010 continues to play music in the screen-locked state, when the first mobile phone 1010 detects that the user presses the area displaying the device identifier 1203 in fig. 12A, the first mobile phone 1010 starts fingerprint recognition or face recognition to authenticate the user identity. When the user identity authentication passes, the first mobile phone 1010 is unlocked, and the music being played by the first mobile phone 1010 is cast to the smart speaker 1030.
On the basis of the first implementation manner or the second implementation manner, in some implementation manners, the distribution of the device identifiers in the information screen display interface may be mapped according to a spatial relationship of the electronic device. The spatial relationship includes a spatial relationship of positioning and/or orientation.
The first cell phone 1010 obtains the spatial relationship of the peripheral electronic devices (including one or more of the television 1020, the smart speaker 1030, the tablet 1040, and the second cell phone 1050) to the first cell phone 1010. The spatial relationship includes a distance and an included angle. As an implementation, the distance refers to a relative or absolute distance between each peripheral electronic device and the first mobile phone. The included angle is an included angle between a connection line of each peripheral electronic device and the first mobile phone and the orientation of the first mobile phone. As an example, according to the UWB positioning technology and the directivity of the UWB technology, the distance and the angle between the first mobile phone and the peripheral electronic device can be calculated. According to the directivity of the UWB technology, when the orientation of the first mobile phone changes, the angle between the first mobile phone and the peripheral electronic device changes accordingly.
In subsequent embodiments or examples of this application, for convenience of description, the included angle is calculated as the angle measured clockwise from the straight line along the orientation of the first mobile phone to the line connecting the first mobile phone and the peripheral electronic device. It should be understood that this is merely exemplary and is not intended to limit the application.
In conjunction with the application scenario shown in fig. 10, as shown in fig. 13A, point A represents the positioning point of the first mobile phone 1010, point B represents the positioning point of the television 1020, and the distance between point A and point B is a; point C represents the positioning point of the tablet computer 1040, and the distance between point A and point C is b. As shown in fig. 13A, assuming the orientation of the first mobile phone 1010 is the direction indicated by arrow X in fig. 13A, rotating the straight line of arrow X clockwise by angle θ1 reaches the line connecting point A and point B; θ1 is thus the included angle between the line connecting the television 1020 and the first mobile phone 1010 and the straight line along the orientation of the first mobile phone 1010, which may be referred to simply as the included angle between the first mobile phone 1010 and the television 1020. Rotating the straight line of arrow X clockwise by angle θ2 reaches the line connecting point A and point C; θ2 is thus the included angle between the line connecting the tablet computer 1040 and the first mobile phone 1010 and the straight line along the orientation of the first mobile phone 1010, which may be referred to as the included angle between the first mobile phone 1010 and the tablet computer 1040. The device identifiers in the information screen display interface of the first mobile phone 1010 are arranged according to the spatial relationship of the peripheral electronic devices. Specifically, the distribution of the device identifiers is mapped from the spatial relationship between the peripheral electronic devices and the first mobile phone.
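The distance and clockwise included angle described above can be computed from planar coordinates. The sketch below assumes 2-D positions (e.g., obtained via UWB ranging) and represents the phone's orientation as a counterclockwise angle from the +x axis; this is an illustrative convention, not the patent's implementation.

```python
import math

def spatial_relation(phone_xy, orient_deg, device_xy):
    """Return (distance, clockwise_angle_deg) from the phone to a device.

    orient_deg is the phone's orientation measured counterclockwise
    from the +x axis. The returned angle is measured clockwise from
    the orientation ray to the phone-device line, matching the
    convention used for theta1 and theta2 in the text.
    """
    dx = device_xy[0] - phone_xy[0]
    dy = device_xy[1] - phone_xy[1]
    distance = math.hypot(dx, dy)
    bearing_ccw = math.degrees(math.atan2(dy, dx))
    clockwise = (orient_deg - bearing_ccw) % 360.0
    return distance, clockwise
```

Note that if the phone body rotates counterclockwise by θ, `orient_deg` increases by θ, so every returned clockwise angle increases by θ (mod 360) while the distances stay fixed, which matches the behavior described later for fig. 14.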
In some implementations, the orientation of the first mobile phone 1010 may be the direction indicated by arrow X1 in fig. 13B, that is, the ray direction from the top of the mobile phone along the long side is taken as the orientation of the mobile phone. In this case, the included angle between the first mobile phone 1010 and the television 1020 is θ1. In other implementations, the orientation of the first mobile phone 1010 may be as shown by arrow X2 in fig. 13C, in which case the included angle between the first mobile phone 1010 and the television 1020 is θ2. It should be understood that the orientation of the first mobile phone may be set by system default or customized by the user. The orientation can be set according to needs and/or habits, and in ways other than the arrow directions shown in fig. 13B and 13C, for example, the direction indicated by arrow X3 or X4 in fig. 13D. The orientation of the first mobile phone is not specifically limited in this application. Preferably, to reduce the user's learning and memorization burden and improve operation efficiency, the orientation of the first mobile phone may be set to the direction shown by arrow X1 in fig. 13B. In subsequent application scenarios, embodiments, implementations, or examples of this application, for convenience of description, the orientation of the mobile phone is exemplified as the ray direction along the long side of the mobile phone. It should be understood that this is not intended to limit the embodiments or implementations of the application.
In fig. 13A, 13B, and 13C, the distance and/or included angle between the first mobile phone 1010 and the television 1020 is used as an exemplary description; it should be understood that the distances and included angles between the smart speaker 1030, the tablet computer 1040, and the second mobile phone 1050, respectively, and the first mobile phone 1010 can be derived by analogy with these examples.
When the orientation of the first mobile phone 1010 is set to a certain direction, that orientation is fixed relative to the body of the first mobile phone, i.e., it rotates with the body. Therefore, when the positions of the first mobile phone 1010 and the peripheral electronic devices are unchanged but the body of the first mobile phone 1010 is turned over or rotated, the included angle between the first mobile phone 1010 and each peripheral electronic device may change. Since the distribution of the device identifiers in the information screen display interface is mapped from the spatial relationship between the first mobile phone and the peripheral electronic devices, and the spatial relationship includes the distance and the included angle, the distribution of the device identifiers in the information screen display interface changes as the included angles between the peripheral electronic devices and the first mobile phone change.
As a non-limiting example, the orientation of the first mobile phone is taken as the ray direction along the long side of the mobile phone. In conjunction with the application scenario shown in fig. 14, the orientation of the first mobile phone 1010 is adjusted from pointing to the television 1020, as shown in diagram A of fig. 14, to pointing to the tablet computer 1040, as shown in diagram B of fig. 14. In this process, the user holds the body of the first mobile phone 1010 and rotates it counterclockwise by an angle θ, so that the orientation of the first mobile phone 1010 changes from pointing to the television 1020 to pointing to the tablet computer 1040. During the rotation of the body, the included angles between the first mobile phone 1010 and the four connectable electronic devices change, while the distances remain unchanged. The information screen display interface of the first mobile phone 1010 changes from that shown in diagram A of fig. 15 to that shown in diagram B of fig. 15. As shown in fig. 15, the first mobile phone 1010 is oriented in the ray direction along the long side of the mobile phone, i.e., in the direction indicated by the black arrow in fig. 15. The information screen display interface of the first mobile phone 1010 displays the device identifiers corresponding to the four connectable electronic devices, each adopting a device outline diagram. The device identifier 1502 corresponds to the television 1020, the device identifier 1503 corresponds to the smart speaker 1030, the device identifier 1504 corresponds to the tablet computer 1040, and the device identifier 1505 corresponds to the second mobile phone 1050. As can be seen from fig. 15, the distribution or layout of the device identifiers changes as the spatial relationship between the first mobile phone and the peripheral electronic devices changes.
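The mapping from a device's spatial relation to a position in the information screen display interface can be sketched as follows. The screen conventions assumed here (the phone sits at the center of a normalized 0..1 space, its orientation points toward the top of the screen, and clockwise angles rotate identifiers to the right) are assumptions for illustration, not details from the patent.

```python
import math

def identifier_position(distance, clockwise_deg, scale=0.1, center=(0.5, 0.5)):
    """Map a device's (distance, clockwise angle) to a screen position.

    The phone is drawn at `center` with its orientation pointing
    "up" (+y in normalized coordinates); a device at clockwise
    angle 0 therefore appears directly above the phone icon, and
    increasing clockwise angle moves the identifier to the right.
    """
    a = math.radians(90.0 - clockwise_deg)  # convert clockwise-from-up to CCW-from-+x
    r = distance * scale
    return (center[0] + r * math.cos(a), center[1] + r * math.sin(a))
```

Recomputing these positions after the phone rotates (each clockwise angle shifted by the rotation) reproduces the layout change between diagram A and diagram B of fig. 15.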
More generally, when the position of at least one of the first mobile phone 1010 and the connectable devices changes and/or the included angle between the first mobile phone 1010 and any one or more of the connectable devices changes, the distribution of the device identifiers in the information screen display interface of the first mobile phone 1010 changes accordingly.
In another non-limiting example, the information screen display interface may also include an orientation indicator of the first mobile phone 1010. The orientation indicator may be presented in a static form, such as a line or an arrow, or in a dynamic form, such as a flashing line or arrow. The orientation indicator may be displayed for the same duration as the information screen display interface, or for a shorter duration. For example, the orientation indicator is displayed for a period of time from when the information screen display interface begins to be displayed and then disappears. As another example, the orientation indicator is displayed while the user rotates, turns over, or moves the first mobile phone, and then disappears.
For example, continuing with the application scenario shown in fig. 14, as shown in fig. 16, the first mobile phone 1010 is oriented in the ray direction along the long side of the mobile phone, i.e., in the direction indicated by the black arrow. The orientation indicator of the first mobile phone 1010 in the information screen display interface is a white line 1611. The information screen display interface displays the four connectable devices searched by the first mobile phone 1010, with the device identifiers adopting rectangular patterns. The device identifier 1602 corresponds to the television 1020, the device identifier 1603 corresponds to the smart speaker 1030, the device identifier 1604 corresponds to the tablet computer 1040, and the device identifier 1605 corresponds to the second mobile phone 1050. The body of the first mobile phone 1010 is rotated counterclockwise by an angle, and the orientation of the first mobile phone 1010 is adjusted from pointing to the television 1020 to pointing to the tablet computer 1040. In the information screen display interface shown in diagram A of fig. 16, the device identifier 1602 corresponding to the television 1020 is located on the white line 1611. In the information screen display interface shown in diagram B of fig. 16, the device identifier 1604 corresponding to the tablet computer is located on the white line 1611. As can be seen from fig. 16, the orientation indicator in the information screen display interface, i.e., the position of the white line, does not change as the spatial relationship between the first mobile phone 1010 and the connectable devices changes, but the display position of the device identifier corresponding to each connectable device does change.
Because the orientation indicator is displayed in the information screen display interface, the user can more intuitively confirm which device the orientation of the mobile phone points to, and the distribution or layout of the device identifiers can be easily matched to the actual scene, so that the user can accurately and efficiently select the target device and enable the shortcut path for that target device.
In the examples shown in fig. 15 and fig. 16, the user may press (e.g., via a click or long-press operation) the screen area displaying any device identifier, and after the mobile phone is unlocked, the mobile phone enables the shortcut path corresponding to that device identifier.
In the examples shown in fig. 15 and 16, the device identifiers (i.e., shortcut path identifiers) in the information screen display interface are displayed in the upper area of the screen. The display area of the shortcut path identifiers may follow the system setting or a user-defined setting. In some embodiments, because the mobile phone screen is large, to facilitate user operation and improve operation efficiency, the shortcut path identifiers in the information screen display interface may be displayed in the lower area of the screen according to system default or user-defined settings. In other embodiments, to further facilitate user operation and further improve operation efficiency, the shortcut path identifiers may be displayed in the lower-left or lower-right area of the screen according to the detected left-hand or right-hand holding state. In still other embodiments, the shortcut path identifiers in the information screen display interface may change display areas, and likewise other contents in the information screen display interface may change display areas; for example, the display position of the displayed content changes once every preset time interval, or each time content is displayed its position differs from the previous display position. It should be understood that these are merely examples and are not to be construed as specific limitations on the application.
Based on the implementations in which the distribution of the device identifiers in the information screen display interface is mapped from the spatial relationship of each connectable device, the first mobile phone 1010 may obtain preset conditions according to system default or user-defined settings and display, in the information screen display interface, the device identifiers of the peripheral devices that meet the preset conditions.
As a non-limiting example, the user-defined preset condition is that the maximum deviation angle of a peripheral device is, for example, 15° (angle unit: degree). Thus, the deviation angle between each peripheral device whose identifier is displayed in the information screen display interface of the first mobile phone 1010 and the first mobile phone 1010 is within 15°. That is, when the included angle between a peripheral device and the first mobile phone 1010 is within 0° to 15°, or within 345° to 360°, the device identifier corresponding to that peripheral device is displayed in the information screen display interface. This setting reduces the number of peripheral devices displayed in the information screen display interface, avoids misoperation caused by the user failing to distinguish among peripheral devices, and improves operation accuracy.
As another non-limiting example, the user-defined preset conditions are that the maximum deviation angle of a peripheral device is, for example, 15°, and that the upper limit on the number of device identifiers is 2. Therefore, the information screen display interface of the first mobile phone 1010 displays the device identifiers corresponding to at most 2 peripheral devices, and the deviation angle between each peripheral device whose identifier is displayed and the first mobile phone 1010 is within 15°. In some examples, if exactly 2 peripheral devices fall within 0° to 15°, or within 345° to 360° (i.e., within -15° to +15°), of the first mobile phone 1010, the device identifiers corresponding to those 2 peripheral devices are displayed in the information screen display interface. In other examples, if more than 2 peripheral devices fall within that range, the device identifiers corresponding to the two peripheral devices with the smallest deviation angles are displayed in the information screen display interface.
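The filtering described in the two examples above (keep devices within the maximum deviation angle, then cap the count by smallest deviation) can be sketched as follows. The device names and threshold values are illustrative; the 15° bound and count of 2 simply mirror the examples.

```python
def select_identifiers(devices, max_dev_deg=15.0, max_count=2):
    """Filter devices for the information screen display interface.

    `devices` maps a device name to its clockwise included angle in
    [0, 360). A device is kept when its angle is within max_dev_deg
    of the orientation ray (i.e., in [0, 15] or [345, 360) for the
    15-degree example); at most max_count devices with the smallest
    deviation are returned.
    """
    def signed_dev(angle):
        # Fold [345, 360) onto [-15, 0) so deviation is symmetric.
        return angle - 360.0 if angle > 180.0 else angle

    kept = [(name, abs(signed_dev(a))) for name, a in devices.items()
            if abs(signed_dev(a)) <= max_dev_deg]
    kept.sort(key=lambda item: item[1])  # smallest deviation first
    return [name for name, _ in kept[:max_count]]
```

With more than two candidates inside the ±15° range, only the two closest to the orientation ray survive, as in the second example.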
Fourth application scenario
In some possible implementations, the mobile phone may decide, according to the user's usage habits and/or scenario information, to adopt the scheme of the first application scenario (hereinafter, the first shortcut path scheme), i.e., shortcut path identifiers for applications or functions in the mobile phone are displayed in the information screen display interface; or to adopt the scheme of the third application scenario (hereinafter, the second shortcut path scheme), i.e., shortcut path identifiers for peripheral devices are displayed in the information screen display interface. The scenario information includes location information and the like. The fourth application scenario thus realizes switching between different shortcut path schemes.
As a non-limiting example, the mobile phone may derive usage patterns for the first and second shortcut path schemes from the usage records of the two schemes over a past period. For example, the statistical result is that use of the second shortcut path scheme is concentrated in 2 time periods each day: 8:00 to 9:00 and 10:00 to 11:00. When the mobile phone detects that the current time falls within these 2 time periods, it adopts the second shortcut path scheme, that is, shortcut path identifiers for peripheral devices are displayed in the message screen display interface. When the current time falls outside these 2 time periods, it adopts the first shortcut path scheme, that is, shortcut path identifiers for applications or functions in the mobile phone are displayed in the message screen display interface.
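The time-window selection described above can be sketched as follows; the period values, function name, and "first"/"second" return codes are illustrative assumptions:

```python
from datetime import time

# Time periods in which use of the second shortcut path scheme is
# concentrated, as derived from past usage records (example values).
SECOND_SCHEME_PERIODS = [(time(8, 0), time(9, 0)), (time(10, 0), time(11, 0))]

def pick_scheme(now, periods=SECOND_SCHEME_PERIODS):
    """Return which shortcut path scheme to show at time `now`."""
    for start, end in periods:
        if start <= now <= end:
            # Inside a concentrated-usage window: display the
            # peripheral device identifiers.
            return "second"
    # Otherwise display identifiers for applications or functions
    # in the phone itself.
    return "first"

print(pick_scheme(time(8, 30)))   # second
print(pick_scheme(time(12, 0)))   # first
```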
As another non-limiting example, the mobile phone may determine its current location through positioning, and from the current location determine the scenario the user is in, such as a home or office scenario. When the mobile phone determines that the user is in a scenario such as home or office, it switches to the second shortcut path scheme, that is, shortcut path identifiers for peripheral devices are displayed in the message screen display interface. In scenarios other than home or office, the mobile phone switches to the first shortcut path scheme, that is, shortcut path identifiers for applications or functions in the mobile phone are displayed in the message screen display interface.
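A location-based switch of this kind could be sketched as below, under the assumption that home and office are stored as simple circular geofences; all coordinates, radii, and names here are invented for illustration:

```python
import math

# Illustrative saved scenes: (latitude, longitude, radius in meters).
SCENES = {
    "home": (39.9042, 116.4074, 150.0),
    "office": (39.9142, 116.4274, 200.0),
}

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in meters."""
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def scheme_for_location(lat, lon, scenes=SCENES):
    """Second scheme inside a known scene (home/office), else first."""
    for name, (slat, slon, radius) in scenes.items():
        if haversine_m(lat, lon, slat, slon) <= radius:
            return "second"
    return "first"
```

In practice the phone would also need a positioning source and user consent; the sketch only shows the decision rule.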
It should be noted that whether to switch the shortcut path scheme may be set by system default or set by the user. The mobile phone may switch the shortcut path scheme automatically according to the user's usage habits and/or scenario information, or switch it in response to a manual operation by the user.
With reference to the application scenario and the related drawings, an embodiment of the present application provides an unlocking method, which can be executed by an electronic device. For example, the unlocking method may be performed by a mobile phone in the aforementioned application scenario. For another example, in other practical application scenarios, the unlocking method may also be executed by an electronic device such as a tablet computer, a television, or a smart speaker having a touch display screen. As shown in fig. 17, the unlocking method is applied to an electronic device, and the unlocking method includes steps S1710 to S1730.
S1710, receiving a touch operation input by a user in a message screen display interface, wherein the message screen display interface comprises one or more shortcut path identifiers, and each shortcut path identifier corresponds to an application or function in the electronic device.
The shortcut path identifier includes a pattern and/or text, etc.
The electronic device determines the application or function corresponding to each shortcut path identifier according to the system default setting or a user-defined setting.
S1720, if it is determined that the touch operation acts on the target shortcut path identifier, starting user identity authentication; the target shortcut path identifier is any one of the one or more shortcut path identifiers.
User identity authentication includes identity authentication based on fingerprint recognition and/or face recognition, etc. The electronic device records the user's fingerprint information and/or face information in advance for subsequent identity authentication.
When identity authentication based on fingerprint recognition is adopted, the touch operation input by the user in the message screen display interface may be a finger-press operation. On one hand, the electronic device can determine whether the user's finger acts on a certain shortcut path identifier, that is, the target shortcut path identifier; on the other hand, it collects the user's fingerprint for identity authentication.
It should be noted that, in some implementations, if it is determined that the touch operation does not act on any shortcut path identifier, the electronic device is not unlocked and continues to display the message screen display interface. In some implementations, if it is determined that the touch operation does not act on any shortcut path identifier, the electronic device may start the camera to perform face recognition; after the face recognition passes, the electronic device is unlocked and enters the unlocked interface. In some implementations, a fingerprint pattern may be displayed in the message screen display interface to prompt the user to unlock the electronic device through fingerprint recognition; if it is determined that the touch operation acts on the fingerprint pattern, the electronic device starts fingerprint recognition, and if the fingerprint recognition passes, the electronic device is unlocked and enters the unlocked interface.
S1730, if the user identity authentication passes, unlocking and displaying the interface of the application or function corresponding to the target shortcut path identifier.
When the user identity authentication passes, the electronic device is unlocked and the interface of the application or function corresponding to the target shortcut path identifier is displayed, so that the user can use that application or function directly.
It should be noted that, in some implementations, if the user identity authentication fails, the electronic device is not unlocked and continues to display the message screen display interface.
In this embodiment, the message screen display interface is divided into a plurality of areas, and each area serves as a shortcut path into a specific application or function. Through fingerprint unlocking or face recognition, the user can quickly enter a specific application or function, thereby improving the user's operation efficiency.
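Steps S1710 to S1730 can be summarized in a short control-flow sketch. The rectangle-based hit testing and all function names here are assumptions made for illustration, not part of the method as claimed:

```python
def handle_touch(touch_point, shortcut_ids, verify_identity, unlock, open_target):
    """Minimal flow for steps S1710-S1730.

    touch_point: (x, y) of the touch on the message screen display
    interface. shortcut_ids maps an identifier name to a tuple of its
    on-screen rectangle (x0, y0, x1, y1) and its target
    application/function.
    """
    x, y = touch_point
    for name, (rect, target) in shortcut_ids.items():
        x0, y0, x1, y1 = rect
        if x0 <= x <= x1 and y0 <= y <= y1:
            # S1720: the touch acts on a target shortcut path
            # identifier, so start user identity authentication.
            if verify_identity():
                # S1730: authentication passed -> unlock and show the
                # corresponding application or function interface.
                unlock()
                open_target(target)
                return "unlocked:" + target
            # Authentication failed: stay on the message screen.
            return "locked"
    # Touch missed every identifier: stay on the message screen.
    return "locked"
```

For example, a touch at (5, 5) inside a "camera" identifier whose authentication callback succeeds returns `"unlocked:camera_app"`, while a touch outside every identifier returns `"locked"`.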
Another embodiment of the present application provides an unlocking method, which may be performed by an electronic device. For example, the unlocking method may be performed by a mobile phone in the aforementioned application scenarios. For another example, in other practical application scenarios, the unlocking method may also be executed by an electronic device with a touch display screen, such as a tablet computer, a television, or a smart speaker. As shown in fig. 18, the unlocking method is applied to the electronic device and includes steps S1810 to S1830.
S1810, receiving a touch operation input by a user in a message screen display interface, wherein the message screen display interface comprises one or more device identifiers, and the device identifiers are identifiers of peripheral devices connectable to the electronic device and/or connected peripheral devices.
The device identifier includes a pattern and/or text, etc.
The electronic device determines the shortcut path of the peripheral device corresponding to each device identifier according to the system default setting or a user-defined setting.
S1820, if the touch operation is determined to act on the target equipment identifier, starting user identity authentication; the target device identifier is any one of the one or more device identifiers.
User identity authentication includes identity authentication based on fingerprint recognition and/or face recognition, etc. The electronic device records the user's fingerprint information and/or face information in advance for subsequent identity authentication.
When identity authentication based on fingerprint recognition is adopted, the touch operation input by the user in the message screen display interface may be a finger-press operation. On one hand, the electronic device can determine whether the user's finger acts on a certain device identifier, that is, the target device identifier; on the other hand, it collects the user's fingerprint for identity authentication.
It should be noted that, in some implementations, if it is determined that the touch operation does not act on any device identifier, the electronic device is not unlocked and continues to display the message screen display interface. In some implementations, if it is determined that the touch operation does not act on any device identifier, the electronic device may start the camera to perform face recognition; after the face recognition passes, the electronic device is unlocked and enters the unlocked interface. In some implementations, a fingerprint pattern may be displayed in the message screen display interface to prompt the user to unlock the electronic device through fingerprint recognition; if it is determined that the touch operation acts on the fingerprint pattern, the electronic device starts fingerprint recognition, and if the fingerprint recognition passes, the electronic device is unlocked and enters the unlocked interface.
S1830, if the user identity authentication passes, unlocking and starting the shortcut path corresponding to the target device identifier.
When the user identity authentication passes, the electronic device is unlocked and the shortcut path corresponding to the target device identifier is started.
It should be understood that the shortcut path is associated with the target device corresponding to the target device identifier.
Shortcut paths include, but are not limited to: controlling the target device to turn on, displaying the control panel interface of the target device, calling up the display interface of the target device, projecting a screen or sound to the target device, and so on.
It should be noted that, in some implementations, if the user identity authentication fails, the electronic device is not unlocked and continues to display the message screen display interface.
In this embodiment, the device identifiers of one or more peripheral devices are displayed on the message screen display interface, and each device identifier serves as a shortcut path to a specific peripheral device. Through fingerprint unlocking or face recognition, the user can quickly start a shortcut path for any peripheral device, for example to control the peripheral device, enter its control panel interface, call up its display interface, or project a screen or sound to it. Cross-device interaction is thus achieved conveniently and quickly while the screen is off, improving the user's operation efficiency.
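The kinds of shortcut paths listed in this embodiment can be pictured as a small dispatch table; the action names and returned strings are purely illustrative assumptions, not an API defined by the application:

```python
# One entry per kind of shortcut path described in the embodiment.
SHORTCUT_ACTIONS = {
    "turn_on": lambda device: f"power-on command sent to {device}",
    "control_panel": lambda device: f"control panel interface of {device} displayed",
    "call_up_display": lambda device: f"display interface of {device} called up",
    "cast": lambda device: f"screen/audio projected to {device}",
}

def run_shortcut(action, device):
    """Start the shortcut path associated with a target device identifier."""
    try:
        return SHORTCUT_ACTIONS[action](device)
    except KeyError:
        raise ValueError(f"unknown shortcut path action: {action!r}")
```

A dispatch table keeps each device identifier's configured shortcut path (set by system default or by the user) separate from the code that executes it.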
It should be understood that the execution sequence of each process in the above embodiments should be determined by the function and the inherent logic thereof, and should not constitute any limitation to the implementation process of the embodiments of the present application.
Corresponding to the unlocking method described in the above embodiments, an embodiment of the present application further provides an unlocking apparatus. Each module included in the unlocking apparatus can correspondingly implement each step of the unlocking method.
It will be appreciated that the electronic device, in order to implement the above-described functions, comprises corresponding hardware and/or software modules for performing the respective functions. In conjunction with the embodiments disclosed herein, the present application can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
It should be noted that, because the contents of information interaction, execution process, and the like between the modules/units are based on the same concept as that of the method embodiment of the present application, specific functions and technical effects thereof may be referred to specifically in the method embodiment section, and are not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The embodiment of the present application further provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the electronic device is enabled to implement the steps in the above method embodiments.
As an example, the electronic device may include a wearable device, a cell phone, or a tablet computer, among others.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the steps in the above-mentioned method embodiments may be implemented.
Embodiments of the present application provide a computer program product, which when executed on an electronic device, enables the electronic device to implement the steps in the above method embodiments.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer-readable storage medium. Based on this understanding, all or part of the processes in the methods of the above embodiments can be implemented by a computer program, which can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the above method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, or some intermediate form, etc. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to a photographing apparatus/electronic device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, or a software distribution medium, for example, a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, computer-readable media may not include electrical carrier signals or telecommunication signals.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed electronic device and method may be implemented in other ways. For example, the above-described electronic device embodiments are merely illustrative. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (20)

1. An unlocking method is applied to an electronic device, and is characterized by comprising the following steps:
receiving a touch operation input by a user in a message screen display interface, wherein the message screen display interface comprises one or more device identifiers, and the device identifiers are identifiers of peripheral devices connectable to the electronic device and/or connected peripheral devices;
if it is determined that the touch operation acts on a target device identifier, starting user identity authentication, wherein the target device identifier is any one of the one or more device identifiers; and
if the user identity authentication passes, unlocking and starting the shortcut path corresponding to the target device identifier.
2. The unlocking method according to claim 1, wherein the distribution of the one or more device identifiers in the message screen display interface is mapped according to a spatial relationship between each of the peripheral devices and the electronic device.
3. Unlocking method according to claim 2, characterized in that said spatial relationship comprises a spatial relationship of positioning and/or orientation.
4. The unlocking method according to claim 3, wherein in a case where the spatial relationship includes a spatial relationship of positioning and orientation, the spatial relationship includes a distance between each of the peripheral devices and the electronic device, and an angle between a line connecting each of the peripheral devices and the electronic device and an orientation of the electronic device.
5. The unlocking method of claim 4, wherein the message screen display interface further comprises an orientation indicator of the electronic device.
6. The unlocking method according to any one of claims 1 to 5, wherein starting the shortcut path corresponding to the target device identifier comprises:
controlling the target device corresponding to the target device identifier to respond to a preset instruction; or
displaying a control panel interface of the target device corresponding to the target device identifier; or
calling up a display interface of the target device corresponding to the target device identifier; or
projecting a screen or sound to the target device corresponding to the target device identifier.
7. The unlocking method according to any one of claims 1 to 5, wherein the touch operation includes a finger-press operation, and the user authentication includes user authentication based on fingerprint recognition.
8. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, causes the electronic device to perform the steps of:
receiving a touch operation input by a user in a message screen display interface, wherein the message screen display interface comprises one or more device identifiers, and the device identifiers are identifiers of peripheral devices connectable to the electronic device and/or connected peripheral devices;
if it is determined that the touch operation acts on a target device identifier, starting user identity authentication, wherein the target device identifier is any one of the one or more device identifiers; and
if the user identity authentication passes, unlocking and starting the shortcut path corresponding to the target device identifier.
9. The electronic device of claim 8, wherein the distribution of the one or more device identifiers in the message screen display interface is mapped according to a spatial relationship between each of the peripheral devices and the electronic device.
10. The electronic device of claim 9, wherein the spatial relationship comprises a spatial relationship of positioning and/or orientation.
11. The electronic device of claim 10, wherein in the case that the spatial relationship comprises a spatial relationship of positioning and orientation, the spatial relationship comprises a distance between each of the peripheral devices and the electronic device, and an angle between a line connecting each of the peripheral devices and the electronic device and an orientation of the electronic device.
12. The electronic device of claim 11, wherein the message screen display interface further comprises an orientation indicator of the electronic device.
13. The electronic device of any of claims 8 to 12, wherein starting the shortcut path corresponding to the target device identifier comprises:
controlling the target device corresponding to the target device identifier to respond to a preset instruction; or
displaying a control panel interface of the target device corresponding to the target device identifier; or
calling up a display interface of the target device corresponding to the target device identifier; or
projecting a screen or sound to the target device corresponding to the target device identifier.
14. The electronic device according to any one of claims 8 to 12, wherein the touch operation includes a finger press operation, and the user authentication includes user authentication based on fingerprint recognition.
15. A computer-readable storage medium storing a computer program, the computer program when executed by a processor implementing the steps of:
receiving a touch operation input by a user in a message screen display interface, wherein the message screen display interface comprises one or more device identifiers, and the device identifiers are identifiers of peripheral devices connectable to the electronic device and/or connected peripheral devices;
if it is determined that the touch operation acts on a target device identifier, starting user identity authentication, wherein the target device identifier is any one of the one or more device identifiers; and
if the user identity authentication passes, unlocking and starting the shortcut path corresponding to the target device identifier.
16. The computer-readable storage medium of claim 15, wherein the distribution of the one or more device identifiers in the message screen display interface is mapped according to a spatial relationship between each of the peripheral devices and the electronic device.
17. The computer-readable storage medium of claim 16, wherein the spatial relationship comprises a spatial relationship of positioning and/or orientation.
18. The computer-readable storage medium of claim 17, wherein in a case that the spatial relationship comprises a spatial relationship of positioning and orientation, the spatial relationship comprises a distance between each of the peripheral devices and the electronic device, and an angle between a line connecting each of the peripheral devices and the electronic device and an orientation of the electronic device.
19. The computer-readable storage medium according to any one of claims 15 to 18, wherein starting the shortcut path corresponding to the target device identifier comprises:
controlling the target device corresponding to the target device identifier to respond to a preset instruction; or
displaying a control panel interface of the target device corresponding to the target device identifier; or
calling up a display interface of the target device corresponding to the target device identifier; or
projecting a screen or sound to the target device corresponding to the target device identifier.
20. The computer-readable storage medium according to any one of claims 15 to 18, wherein the touch operation includes a finger press operation, and the user authentication includes user authentication based on fingerprint recognition.
CN202010911832.4A 2020-09-02 2020-09-02 Unlocking method and electronic equipment Active CN114201738B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010911832.4A CN114201738B (en) 2020-09-02 2020-09-02 Unlocking method and electronic equipment
PCT/CN2021/113610 WO2022048453A1 (en) 2020-09-02 2021-08-19 Unlocking method and electronic device


Publications (2)

Publication Number Publication Date
CN114201738A true CN114201738A (en) 2022-03-18
CN114201738B CN114201738B (en) 2023-04-21

Family

ID=80491580

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010911832.4A Active CN114201738B (en) 2020-09-02 2020-09-02 Unlocking method and electronic equipment

Country Status (2)

Country Link
CN (1) CN114201738B (en)
WO (1) WO2022048453A1 (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20140081642A (en) * 2012-12-21 2014-07-01 삼성전자주식회사 Method and system for controlling for external apparatus
CN108243281A (en) * 2017-12-27 2018-07-03 深圳信炜科技有限公司 The fingerprint identification method of electronic equipment
CN108958582A (en) * 2018-06-28 2018-12-07 维沃移动通信有限公司 A kind of application program launching method and terminal
CN110597473A (en) * 2019-07-30 2019-12-20 华为技术有限公司 Screen projection method and electronic equipment
CN111459388A (en) * 2020-04-13 2020-07-28 深圳康佳电子科技有限公司 Information screen display method, display equipment and storage medium for smart home information

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
US20140181683A1 (en) * 2012-12-21 2014-06-26 Samsung Electronics Co., Ltd. Method and system for controlling external device
CN111328051B (en) * 2020-02-25 2023-08-29 上海银基信息安全技术股份有限公司 Digital key sharing method and device, electronic equipment and storage medium


Cited By (3)

Publication number Priority date Publication date Assignee Title
CN116700473A (en) * 2022-09-13 2023-09-05 荣耀终端有限公司 Display method and device for screen-extinguishing display and terminal equipment
CN116700473B (en) * 2022-09-13 2024-04-05 荣耀终端有限公司 Display method and device for screen-extinguishing display and terminal equipment
CN117131555A (en) * 2023-04-28 2023-11-28 荣耀终端有限公司 Information display method and electronic equipment

Also Published As

Publication number Publication date
WO2022048453A1 (en) 2022-03-10
CN114201738B (en) 2023-04-21


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant