CN114223139A - Interface switching method and device, wearable electronic equipment and storage medium - Google Patents
- Publication number
- CN114223139A (application number CN201980099239.XA)
- Authority
- CN
- China
- Prior art keywords
- behavior
- wearable electronic
- user interface
- scene
- interface
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The application discloses an interface switching method, an interface switching apparatus, a wearable electronic device, and a storage medium. The interface switching method is applied to a wearable electronic device that includes a sensor for collecting behavior data, and comprises the following steps: acquiring the behavior data collected by the sensor; performing feature extraction on the behavior data to obtain behavior features; inputting the behavior features into a trained preset model to obtain the current behavior scene in which the wearable electronic device is located, where the preset model is trained in advance to output a recognition result of the behavior scene according to the input behavior features; and switching the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene. The method enables automatic switching of the watch face (dial) of the wearable electronic device and improves the user experience.
Description
The present application relates to the field of mobile terminal technologies, and in particular, to an interface switching method and apparatus, a wearable electronic device, and a storage medium.
Wearable electronic devices, such as smart watches, smart glasses, and smart bracelets, have become some of the most common consumer electronic products in daily life. Because they are convenient to wear and can provide more personalized services, wearable electronic devices are increasingly popular with consumers. However, switching the user interface of a traditional wearable electronic device requires cumbersome operations, which is inconvenient for the user.
Disclosure of Invention
In view of the foregoing problems, the present application provides an interface switching method, an interface switching apparatus, a wearable electronic device, and a storage medium.
In a first aspect, an embodiment of the present application provides an interface switching method, which is applied to a wearable electronic device, where the wearable electronic device includes a sensor for collecting behavior data, and the method includes: acquiring behavior data acquired by the sensor; performing feature extraction on the behavior data to obtain behavior features; inputting the behavior characteristics into a trained preset model to obtain a current behavior scene where the wearable electronic device is located, wherein the preset model is trained in advance to output a recognition result of the behavior scene according to the input behavior characteristics; and switching the current interface displayed by the wearable electronic equipment to the user interface corresponding to the current behavior scene.
In a second aspect, an embodiment of the present application provides an interface switching apparatus, which is applied to a wearable electronic device, where the wearable electronic device includes a sensor for collecting behavior data, and the apparatus includes: the system comprises a data acquisition module, a feature acquisition module, a scene recognition module and an interface switching module, wherein the data acquisition module is used for acquiring behavior data acquired by the sensor; the characteristic acquisition module is used for extracting characteristics of the behavior data to acquire behavior characteristics; the scene recognition module is used for inputting the behavior characteristics into a trained preset model to obtain a current behavior scene where the wearable electronic device is located, and the preset model is trained in advance to output a recognition result of the behavior scene according to the input behavior characteristics; the interface switching module is used for switching a current interface displayed by the wearable electronic equipment to a user interface corresponding to the current behavior scene.
In a third aspect, an embodiment of the present application provides a wearable electronic device, including: one or more processors; a memory; and one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more applications being configured to perform the interface switching method provided in the first aspect above.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium, where a program code is stored in the computer-readable storage medium, and the program code may be called by a processor to execute the interface switching method provided in the first aspect.
According to the above scheme, behavior data collected by a sensor of the wearable electronic device is acquired, feature extraction is performed on the behavior data to obtain behavior features, and the behavior features are input into a trained preset model to obtain the current behavior scene in which the wearable electronic device is located, the preset model being trained in advance to output a recognition result of the behavior scene according to the input behavior features. The current interface displayed by the wearable electronic device is then switched to the user interface corresponding to the current behavior scene. In this way, the current behavior scene is automatically recognized from the behavior features corresponding to the sensor data, the displayed interface is switched to the user interface corresponding to that scene, user operations after switching are reduced, and the user experience is improved.
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
FIG. 1 shows a flow chart of an interface switching method according to an embodiment of the present application.
Fig. 2 shows a schematic interface diagram provided by an embodiment of the present application.
FIG. 3 illustrates another interface diagram provided by an embodiment of the present application.
Fig. 4 shows a schematic view of another interface provided by an embodiment of the present application.
FIG. 5 shows a flow diagram of an interface switching method according to another embodiment of the present application.
FIG. 6 is a flow chart of a method for interface switching according to yet another embodiment of the present application.
Fig. 7 shows a flowchart of step S310 in an interface switching method according to another embodiment of the present application.
Fig. 8 is a flowchart illustrating an interface switching method according to still another embodiment of the present application.
FIG. 9 shows a block diagram of an interface switching device according to an embodiment of the present application.
Fig. 10 is a block diagram of a wearable electronic device for executing an interface switching method according to an embodiment of the present application.
Fig. 11 shows a block diagram of a storage unit for storing or carrying program code that implements an interface switching method according to an embodiment of the present application.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application.
With the development of technology and rising living standards, wearable electronic devices (such as smart watches) appear more and more frequently in daily life. A wearable electronic device is not limited to displaying the time; it can also display other information, such as weather, health, and reminder information. The user interfaces (e.g., watch faces, or dials) of wearable electronic devices therefore come in different styles, and different users have different requirements for interface style. A single style of user interface cannot meet every user's requirements, so a wearable electronic device typically provides multiple styles of user interface.
Through long-term research, the inventor found that in conventional technology, when a wearable electronic device offers multiple styles of user interface, switching between them requires cumbersome operations by the user. For example, to switch the watch face, the user needs to enter the watch-face settings interface, browse the existing watch faces, and select the desired style; the whole process takes considerable time.
In view of the above problems, the inventor proposes the interface switching method, interface switching apparatus, wearable electronic device, and storage medium provided in the embodiments of the present application. According to the behavior features corresponding to the behavior data collected by a sensor, the current behavior scene in which the wearable electronic device is located is automatically recognized, and the displayed current interface is switched to the user interface corresponding to that scene, which reduces user operations in the switched interface and improves the user experience. The specific interface switching method is described in detail in the following embodiments.
Referring to fig. 1, fig. 1 is a schematic flowchart of an interface switching method according to an embodiment of the present application. The interface switching method automatically recognizes the current behavior scene in which the wearable electronic device is located according to the behavior features corresponding to the behavior data collected by a sensor, and switches the displayed current interface to the user interface corresponding to the current behavior scene, thereby reducing user operations in the switched user interface and improving the user experience. In a specific embodiment, the interface switching method is applied to the interface switching apparatus 400 shown in fig. 9 and to the wearable electronic device 100 (fig. 10) equipped with the interface switching apparatus 400. The wearable electronic device may include a sensor for collecting behavior data; the sensor may include an acceleration sensor, a gyroscope sensor, a gravity sensor, a heart rate sensor, a brain-wave sensor, a positioning sensor, an infrared sensor, and the like, which are not limited herein. The wearable electronic device applied in this embodiment may be a smart watch, a smart bracelet, and the like, which is also not limited herein. As described in detail with respect to the flow shown in fig. 1, the interface switching method may specifically include the following steps:
step S110: and acquiring the behavior data acquired by the sensor.
In the embodiment of the present application, the wearable electronic device may be provided with various sensors for collecting behavior data, such as an acceleration sensor, a gyroscope sensor, a gravity sensor, a heart rate sensor, a brain-wave sensor, a positioning sensor, and an infrared sensor. Behavior data refers to data that characterizes user behavior. User behaviors may include walking, standing, running, squatting, hand movement, head shaking, and other types; the specific user behavior is not limited. The behavior data collected by these sensors varies with different user behaviors, and the behavior data can reflect characteristics of certain behavior scenes. For example, in the scene where the user walks along a route home, the characteristics of the scene may be reflected by positioning data, motion data, biometric data, environmental data, and time data. Therefore, the behavior scene in which the wearable electronic device is located can be recognized using the behavior data collected by the sensors.
In some embodiments, when performing behavior-scene recognition, the wearable electronic device may acquire the behavior data collected by the sensors used for collecting behavior data. The acquired behavior data may include data from multiple sensors; for example, all behavior data from every sensor capable of collecting it may be acquired, or only data from some of the sensors, which is not limited herein. It can be understood that the more types of behavior data are acquired (i.e., the more types of sensors are used), the more features are available for recognizing the behavior scene and the higher the feature dimensionality, which can improve the accuracy of scene recognition.
In some embodiments, the wearable electronic device may acquire the behavior data collected by the sensors within a preset time period for subsequent recognition of the behavior scene. It can be understood that using behavior data collected over a period of time can improve the accuracy of the subsequent scene recognition.
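The preset-time-period collection described above can be sketched as a fixed-duration sliding window of sensor samples. This is a minimal illustrative helper, not the patent's implementation; the class name and parameters are assumptions.

```python
from collections import deque

class SensorWindow:
    """Hypothetical sliding window buffering behavior data over a preset
    time period before scene recognition is performed."""

    def __init__(self, window_seconds, sample_rate_hz):
        self.maxlen = int(window_seconds * sample_rate_hz)
        self.samples = deque(maxlen=self.maxlen)

    def push(self, sample):
        # Oldest samples are evicted automatically once the window is full.
        self.samples.append(sample)

    def is_full(self):
        return len(self.samples) == self.maxlen

    def snapshot(self):
        # Copy handed to the feature-extraction stage.
        return list(self.samples)
```

For example, a 2-second window sampled at 5 Hz holds the 10 most recent samples; once full, each new sample displaces the oldest.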
Step S120: and performing feature extraction on the behavior data to obtain behavior features.
In the embodiment of the application, after the wearable electronic device acquires the behavior data acquired by the sensor for acquiring the behavior data, the wearable electronic device may perform feature extraction on the behavior data to acquire the behavior feature, so as to perform a subsequent process of identifying a behavior scene according to the behavior feature.
In some embodiments, after the wearable electronic device acquires the behavior data collected by the sensor, feature extraction may be performed on the acquired data. The extracted features may include time-series (temporal) features, frequency-domain features, statistical features, and the like; the specific features extracted are not limited herein.
As one implementation, the behavior data collected by the sensor may be time-series data (i.e., time-domain data). The wearable electronic device may extract statistical features such as the median, mean, maximum, minimum, and peak from the behavior data detected by the sensor to obtain the statistical features of the time-series data; the wearable electronic device may also obtain temporal features, for example by taking the value at a certain point on the time axis and the value a preset time interval before it, which is not limited herein.
As another implementation, the wearable electronic device may further perform a fast Fourier transform on the time-series data to obtain frequency-domain data, separate the high- and low-frequency signals, calculate the overall energy of the frequency-domain signal, and take at least some of the coefficients as frequency-domain features. For example, after performing a fast Fourier transform on the curve of acceleration over time, a curve of acceleration over frequency is obtained, from which frequency-domain features can be calculated. Of course, the specific manner of extracting the behavior features corresponding to the behavior data is not limited herein.
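The two feature families above can be sketched as follows. This is a simplified illustration: the statistical features match those named in the text, while the frequency-domain energy uses a naive discrete Fourier transform as a stand-in for the fast Fourier transform (the function names are assumptions, not the patent's API).

```python
import math

def statistical_features(signal):
    """Time-domain statistics: mean, min, max, median, peak-to-peak."""
    s = sorted(signal)
    n = len(s)
    median = s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2
    return {
        "mean": sum(signal) / n,
        "min": s[0],
        "max": s[-1],
        "median": median,
        "peak_to_peak": s[-1] - s[0],
    }

def spectral_energy(signal):
    """Overall energy of the frequency-domain signal via a naive DFT.
    By Parseval's theorem this equals the sum of squared samples."""
    n = len(signal)
    energy = 0.0
    for k in range(n):
        re = sum(x * math.cos(-2 * math.pi * k * i / n) for i, x in enumerate(signal))
        im = sum(x * math.sin(-2 * math.pi * k * i / n) for i, x in enumerate(signal))
        energy += re * re + im * im
    return energy / n
```

A real implementation would use an FFT library for speed and would typically also separate high- and low-frequency bands before computing band-wise energies.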
Step S130: inputting the behavior characteristics into a trained preset model to obtain a current behavior scene where the wearable electronic device is located, wherein the preset model is trained in advance to output a recognition result of the behavior scene according to the input behavior characteristics.
In some embodiments, after obtaining the behavior features according to the behavior data, the obtained behavior features may be input into a trained preset model to obtain a recognition result of a behavior scene output by the preset model according to the input behavior features, the recognition result includes the behavior scene, and the obtained behavior scene is recognized as the current behavior scene.
In some embodiments, the preset model may be trained on a large number of training samples. The training samples may include input samples and output samples: an input sample may include behavior features, and the corresponding output sample may include the behavior scene corresponding to those features. The trained preset model can therefore output a recognition result according to the input behavior features, and the recognition result may include the current behavior scene of the wearable electronic device. In an optional implementation, the recognition result output by the trained preset model may be a scene identifier of the behavior scene, where different behavior scenes correspond to different scene identifiers; for example, behavior scenes may be identified by a 1-byte integer (i.e., 0 to 255).
In some embodiments, the preset model may include a Support Vector Machine (SVM), a neural network, a decision tree model, and the like, which are not limited herein.
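The preset model's role — mapping a behavior-feature vector to a scene identifier — can be illustrated with a toy classifier. The patent contemplates SVMs, neural networks, or decision trees; a nearest-centroid classifier is substituted here purely to keep the sketch small and self-contained.

```python
class NearestCentroidScenes:
    """Minimal stand-in for the trained preset model: predicts a scene
    identifier (e.g. a 1-byte integer, 0-255) from a feature vector."""

    def __init__(self):
        self.centroids = {}  # scene_id -> mean feature vector

    def fit(self, feature_vectors, scene_ids):
        by_scene = {}
        for vec, sid in zip(feature_vectors, scene_ids):
            by_scene.setdefault(sid, []).append(vec)
        for sid, vecs in by_scene.items():
            dim = len(vecs[0])
            self.centroids[sid] = [
                sum(v[d] for v in vecs) / len(vecs) for d in range(dim)
            ]

    def predict(self, vec):
        # Recognition result: the scene whose centroid is closest.
        def sq_dist(c):
            return sum((a - b) ** 2 for a, b in zip(vec, c))
        return min(self.centroids, key=lambda sid: sq_dist(self.centroids[sid]))
```

Training uses (behavior features, behavior scene) pairs as the (input, output) samples described above; at run time, `predict` returns the scene identifier used to look up the corresponding user interface.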
Step S140: and switching the current interface displayed by the wearable electronic equipment to the user interface corresponding to the current behavior scene.
In some embodiments, after the wearable electronic device identifies the behavior scene where the wearable electronic device is located, the user interface corresponding to the behavior scene may be determined according to the behavior scene where the wearable electronic device is located, so that the current interface of the wearable electronic device is switched to the user interface corresponding to the current behavior scene, automatic switching of the user interface is achieved, and the switched user interface corresponds to the current behavior scene.
In some embodiments, user interfaces corresponding to different behavior scenes may be stored in the wearable electronic device in advance, and the wearable electronic device may determine the user interface to be switched to according to the behavior scene obtained through recognition. For example, when the recognition result includes a scene identifier of the behavior scene, the user interface corresponding to the scene identifier may be read, and the current interface of the wearable electronic device may be switched to the determined user interface.
In some implementations, the user interface may include a dial (watch face) interface, a home screen interface, a lock screen interface, or an application interface. That is to say, the wearable electronic device can automatically switch to any one of the dial interface, the home screen interface, the lock screen interface, and the application interface, which reduces user operations; the interface after switching corresponds to the current behavior scene and meets the user's needs.
In some application scenarios, the user interface may be a dial interface, and the behavior scenes may include a sports scene, a home scene, and a work scene. If the current behavior scene is the sports scene, the current dial can be switched to the dial corresponding to the sports scene; as shown in fig. 2, after the switch, the sports dial can display content related to exercise, such as the time, step count, calorie consumption, and heart rate. If the current behavior scene is the work scene, the current dial can be switched to the dial corresponding to the work scene; as shown in fig. 3, after the switch, the work dial can display content related to work, such as the date, time, and work schedule. If the current behavior scene is the home scene, the current dial can be switched to the dial corresponding to the home scene; as shown in fig. 4, after the switch, the home dial can display content related to home life, such as the date, time, weather, and television program reminders. Of course, the above behavior scenes and dial contents are only examples and do not limit the actual behavior scenes and dials.
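The scene-to-dial lookup described above amounts to a stored mapping from scene identifiers to dial content. A minimal sketch, with hypothetical identifiers and content lists taken from the examples in the text:

```python
# Hypothetical scene identifiers (any distinct 1-byte integers would do).
SPORTS, WORK, HOME = 0, 1, 2

# Stored correspondence between behavior scenes and dial content.
DIAL_FOR_SCENE = {
    SPORTS: ["time", "steps", "calories", "heart_rate"],
    WORK:   ["date", "time", "work_schedule"],
    HOME:   ["date", "time", "weather", "tv_reminders"],
}

def switch_dial(current_scene_id, default=("time",)):
    """Return the dial content for the recognized scene, falling back to a
    default face when no dial is stored for the scene identifier."""
    return DIAL_FOR_SCENE.get(current_scene_id, list(default))
```

The fallback branch handles a recognition result for which no user interface has been stored, so the device always has something to display.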
According to the interface switching method provided by this embodiment of the application, behavior data collected by a sensor of the wearable electronic device is acquired, feature extraction is performed on the behavior data to obtain behavior features, the behavior features are input into a trained preset model to obtain the current behavior scene in which the wearable electronic device is located, and the current interface displayed by the wearable electronic device is then switched to the user interface corresponding to the current behavior scene. In this way, the current behavior scene is automatically recognized according to the behavior features corresponding to the sensor data, and the displayed interface is switched to the corresponding user interface, which meets the user's interface requirements, reduces user operations in the switched interface, and improves the user experience.
Referring to fig. 5, fig. 5 is a schematic flowchart illustrating an interface switching method according to another embodiment of the present application. The interface switching method may be applied to the wearable electronic device, where the wearable electronic device includes a sensor for acquiring behavior data, and as will be described in detail with reference to the flow shown in fig. 5, the interface switching method may specifically include the following steps:
step S210: and acquiring the behavior data acquired by the sensor.
Step S220: and performing feature extraction on the behavior data to obtain behavior features.
In the embodiment of the present application, step S210 and step S220 may refer to the contents of the foregoing embodiments, and are not described herein again.
Step S230: and preprocessing the behavior characteristics to obtain the preprocessed behavior characteristics.
In this embodiment of the application, after obtaining the behavior characteristics, the wearable electronic device may also pre-process the behavior characteristics before inputting the behavior characteristics into the trained preset model.
In some embodiments, preprocessing the behavior features may include performing feature cleaning and feature mining on the behavior features. Feature cleaning includes cleaning the content of the behavior features according to a preset cleaning rule; feature mining includes mining the behavior features to form features of higher dimensionality.
In some embodiments, performing feature cleaning on the behavior features may include removing missing values and abnormal values from the behavior features, for example removing incomplete data and data of the wrong type. As a specific implementation, the feature cleaning may be missing-value processing: for a dimension whose proportion of missing values is smaller than a preset percentage, the missing values may be fitted from the other values of that dimension; if the proportion of missing values is greater than the preset percentage, the feature is considered invalid and the dimension is removed. The preset percentage may be 35%, 40%, or the like; the specific value is not limited herein.
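The missing-value rule above can be sketched as follows. This is an illustrative simplification: missing entries are filled with the dimension mean rather than a fitted value, and the 40% threshold is one of the example preset percentages.

```python
def clean_features(rows, max_missing_ratio=0.4):
    """Feature cleaning sketch: drop any dimension whose fraction of missing
    values (None) exceeds the threshold; otherwise fill missing entries with
    the mean of the observed values in that dimension."""
    n_rows, n_dims = len(rows), len(rows[0])
    kept, cleaned_cols = [], []
    for d in range(n_dims):
        col = [row[d] for row in rows]
        observed = [v for v in col if v is not None]
        if 1 - len(observed) / n_rows > max_missing_ratio:
            continue  # invalid feature: too many missing values
        mean = sum(observed) / len(observed)
        cleaned_cols.append([mean if v is None else v for v in col])
        kept.append(d)
    cleaned_rows = [
        [cleaned_cols[c][r] for c in range(len(kept))] for r in range(n_rows)
    ]
    return cleaned_rows, kept
```

The function returns both the cleaned rows and the indices of the surviving dimensions, so downstream stages know which features remain.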
In some embodiments, performing feature mining on the behavior features may include mining the cleaned behavior features using a boosting tree (gradient-boosted tree) model. Before the behavior features are input into the boosting tree model, the wearable electronic device may also quantize the numerical features in the behavior features into a vector. The quantized vector is then input into the boosting tree model, which outputs a multi-dimensional feature vector according to the input vector, thereby obtaining the preprocessed behavior features.
Step S240: inputting the preprocessed behavior features into a trained preset model to obtain a current behavior scene where the wearable electronic device is located, wherein the preset model is trained in advance to output a recognition result of the behavior scene according to the input behavior features.
Step S250: and switching the current interface displayed by the wearable electronic equipment to the user interface corresponding to the current behavior scene.
In the embodiment of the present application, step S240 and step S250 may refer to the contents of the foregoing embodiments, and are not described herein again.
Step S260: when a switching operation is detected, the user interface displayed by the wearable electronic equipment is switched to a target user interface corresponding to the switching operation.
In the embodiment of the application, after the current interface displayed by the wearable electronic device is switched to the user interface corresponding to the current behavior scene, the user may be dissatisfied with the automatically switched user interface. The wearable electronic device may therefore further detect a switching operation on the displayed user interface, and when a switching operation is detected, switch the current interface to the target user interface corresponding to that operation, so as to meet the user's needs.
In some embodiments, the switching operation on the user interface may be a shaking operation on the wearable electronic device. Specifically, detecting the switching operation may include: acquiring the shaking trajectory of the wearable electronic device; and if the shaking trajectory meets a preset trajectory condition, determining that a switching operation is detected. It can be understood that the wearable electronic device is usually worn on the user's body; after the displayed user interface is automatically switched to the user interface corresponding to the behavior scene, if the user is dissatisfied with the automatically switched interface, the user can switch the interface by shaking the wearable electronic device according to a preset rule, which is also convenient. As an alternative implementation, the wearable electronic device may determine its shaking trajectory from, for example, the angular velocity values detected by a gyroscope sensor. The preset trajectory condition is a preset judgment condition for deciding whether to switch the user interface; for example, the condition may be that the acquired shaking trajectory matches any one of a plurality of preset shaking trajectories.
In this embodiment, step S260 may include: acquiring a target user interface corresponding to the shaking track; and switching the user interface displayed by the wearable electronic device to the target user interface. When the wearable electronic device determines that the shaking track meets the preset track condition, it can determine the user interface corresponding to the shaking track according to the stored correspondence. Therefore, by performing different shaking operations on the wearable electronic device, the user can switch the displayed user interface to the desired user interface, which makes interface switching convenient and improves the user experience.
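The trajectory check and mapping described in step S260 can be pictured with the following sketch, which reduces gyroscope readings to a coarse shake label and looks the label up in a preset table. The trajectory labels, the threshold value, and the interface names are all illustrative assumptions, not values from this application.

```python
# Hypothetical sketch of the shake-trajectory switching of step S260.
# Preset trajectories and their target interfaces are assumed names.
PRESET_TRAJECTORIES = {
    "shake_left_right": "sport_dial",
    "shake_up_down": "sleep_dial",
}

def classify_trajectory(angular_velocities, threshold=2.0):
    """Reduce a series of (x, y, z) gyroscope readings to a coarse label."""
    x = sum(abs(v[0]) for v in angular_velocities)
    y = sum(abs(v[1]) for v in angular_velocities)
    if max(x, y) < threshold:
        return None  # motion too weak: not a deliberate shake
    return "shake_left_right" if x >= y else "shake_up_down"

def detect_switching_operation(angular_velocities):
    """Return the target interface if the shake matches a preset trajectory."""
    label = classify_trajectory(angular_velocities)
    return PRESET_TRAJECTORIES.get(label)  # None means no switching operation
```

A trajectory that matches no preset entry yields no switching operation, matching the preset-track-condition check described above.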
Step S270: and if the switching operation is detected within the preset time and the target user interface corresponds to the target behavior scene, inputting the behavior characteristics marked with the target behavior scene into the preset model, and carrying out correction training on the preset model.
In this embodiment of the application, after the wearable electronic device switches the displayed interface to the corresponding user interface according to the switching operation, it may further determine whether the switching operation was detected within a preset time length, where the preset time length may be a short time length, for example, 1 minute to 5 minutes. If the operation is detected within the preset time length, it indicates that the user interface was switched incorrectly according to the identified behavior scene, that is, the behavior scene was recognized incorrectly. Therefore, the target behavior scene corresponding to the target user interface can be determined, and then the behavior features (the recognized behavior features) labeled with the target behavior scene are input into the preset model to perform correction training on the preset model. The behavior features serve as the input sample and the target behavior scene serves as the output sample; training the preset model with them achieves the purpose of correcting the preset model, so that the output result of the preset model becomes more accurate.
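The correction step of S270 can be sketched as follows: if the manual switch happened within the preset duration, relabel the recognized features with the scene of the target interface and feed them back to the model. The duration value, the model API, and the scene label are illustrative assumptions.

```python
# Illustrative sketch of correction training (step S270); values are assumed.
PRESET_DURATION_S = 120  # e.g. within the 1-5 minute window mentioned above

def maybe_correct_model(model, features, target_scene, seconds_since_switch):
    """Retrain the preset model when an early manual switch signals an error."""
    if seconds_since_switch > PRESET_DURATION_S:
        return False  # late switch: not treated as a recognition error
    model.train_on(features, target_scene)  # features labeled with target scene
    return True

class DummyModel:
    """Stand-in model that just records the correction samples it receives."""
    def __init__(self):
        self.samples = []
    def train_on(self, x, y):
        self.samples.append((x, y))
```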
The interface switching method provided by this embodiment of the application acquires the collected behavior data, extracts behavior features, preprocesses the behavior features to obtain preprocessed behavior features, and inputs the preprocessed behavior features into a trained preset model to obtain the current behavior scene where the wearable electronic device is located. The displayed current interface is then switched to the user interface corresponding to the current behavior scene; when a switching operation is detected, the displayed interface is switched to the target user interface corresponding to the switching operation, and when the switching operation is detected within a preset time length, the preset model is corrected. In this way, preprocessing the behavior features improves recognition accuracy, and correcting the preset model further improves the accuracy of behavior scene recognition.
Referring to fig. 6, fig. 6 is a schematic flowchart illustrating an interface switching method according to another embodiment of the present application. The interface switching method may be applied to the wearable electronic device, where the wearable electronic device includes a sensor for acquiring behavior data, and as will be described in detail with reference to the flow shown in fig. 6, the interface switching method may specifically include the following steps:
step S310: and acquiring the corresponding relation between the behavior scene and the user interface.
In the embodiment of the application, the wearable electronic device may obtain the correspondence between behavior scenes and user interfaces in advance. In this correspondence, different behavior scenes may correspond to different user interfaces, or the same user interface may correspond to multiple behavior scenes; the specific correspondence is not limited here.
In some embodiments, referring to fig. 7, step S310 may include:
step S311: and displaying a setting interface, wherein the setting interface is used for setting the corresponding relation between the behavior scene and the user interface.
In some embodiments, displaying a settings interface may include: acquiring user interfaces of various types currently existing in the wearable electronic equipment and various behavior scenes identifiable by the preset model; and displaying a setting interface comprising a plurality of first options and a plurality of second options, wherein the first options correspond to the user interfaces of the plurality of styles one by one, and the second options correspond to the plurality of behavior scenes one by one. It can be understood that, multiple styles of user interfaces may be stored in the wearable electronic device in advance, multiple behavior scenes recognizable by the preset model are stored, and according to the multiple styles of user interfaces and the multiple behavior scenes, a setting interface including multiple first options and multiple second options may be displayed, so that a user can associate the behavior scenes with the user interfaces through the first options and the second options.
Step S312: and setting the corresponding relation between the behavior scene and the user interface according to the setting operation detected in the setting interface, and storing the corresponding relation.
In some embodiments, the wearable electronic device may detect a setting operation in the setting interface, and store a correspondence between the behavior scene and the user interface after setting the correspondence according to the setting operation, so that the wearable electronic device determines the user interface by using the correspondence when switching the user interface.
In some embodiments, the wearable electronic device may also display a first setting interface, where the first setting interface includes a scene option for selecting a behavior scene, and display a second setting interface after detecting a touch operation on the scene option corresponding to the behavior scene, where the second setting interface includes an interface option for selecting a user interface, and after detecting a touch operation on the interface option, associate a behavior scene corresponding to the operated scene option with the user interface corresponding to the operated interface option, so as to obtain a correspondence relationship between the behavior scene and the user interface.
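The option-based setting flow above (a first option selecting an interface style, a second option selecting a behavior scene) can be stored with a minimal structure such as the one below; the scene and interface-style names are hypothetical.

```python
# Minimal sketch of the stored scene-to-interface correspondence (step S312).
class CorrespondenceStore:
    def __init__(self):
        self.scene_to_ui = {}

    def set_correspondence(self, scene, ui_style):
        # Each scene maps to one interface style; several scenes may share
        # the same style, matching the flexibility described above.
        self.scene_to_ui[scene] = ui_style

    def ui_for(self, scene):
        """Return the configured interface for a scene, or None if unset."""
        return self.scene_to_ui.get(scene)
```

A later lookup (`ui_for`) is what the switching step would use to choose the interface for the recognized scene.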
In other embodiments, step S310 may also include: receiving a corresponding relation between a behavior scene and a user interface sent by a server, wherein the corresponding relation is generated by a mobile terminal according to detected association operation of the behavior scene and the user interface and then sent to the server, and the mobile terminal is associated with the wearable electronic equipment. It can be understood that, by the method, the behavior scene and the user interface can be associated by the user through the mobile phone, the tablet and the like associated with the wearable electronic device, and the corresponding relation between the behavior scene and the user interface is generated.
It should be noted that the wearable electronic device need not re-acquire the correspondence between behavior scenes and user interfaces every time it executes the interface switching method.
Step S320: and acquiring the behavior data acquired by the sensor.
Step S330: and performing feature extraction on the behavior data to obtain behavior features.
Step S340: inputting the behavior characteristics into a trained preset model to obtain a current behavior scene where the wearable electronic device is located, wherein the preset model is trained in advance to output a recognition result of the behavior scene according to the input behavior characteristics.
In the embodiment of the present application, steps S320 to S340 may refer to the contents of the foregoing embodiments, and are not described herein again.
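As one way to picture the feature extraction of step S330, the sketch below computes a per-axis mean and standard deviation over a window of accelerometer samples. The choice of sensor, window, and statistics is an assumption for illustration, not the application's prescribed feature set.

```python
# Illustrative feature extraction over a window of (x, y, z) accelerometer
# samples; the window size and statistics are assumed, not specified here.
import statistics

def extract_features(window):
    """Return [mean_x, std_x, mean_y, std_y, mean_z, std_z] for the window."""
    features = []
    for axis in range(3):
        values = [sample[axis] for sample in window]
        features.append(statistics.fmean(values))   # average motion per axis
        features.append(statistics.pstdev(values))  # variability per axis
    return features
```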
In some embodiments, the wearable electronic device may also augment the behavior scenario according to the user's operations. Therefore, the interface switching method may further include: acquiring scene addition data, wherein the scene addition data comprises a new behavior scene and corresponding behavior characteristics thereof; and updating the preset model according to the new behavior scene and the corresponding behavior characteristics. As can be understood, the wearable electronic device may acquire a new behavior scene and a behavior feature corresponding to the behavior scene, and then input the behavior feature labeled with the behavior scene into the preset model for training, so as to update the preset model.
Further, after the behavior scene is newly added, the corresponding relationship between the behavior scene and the user interface may be updated, and therefore, the interface switching method may further include: and according to the detected updating operation, after the new behavior scene is associated with any one of the user interfaces of the multiple styles, updating the corresponding relation. It can be understood that, by updating the corresponding relationship between the behavior scene and the user interface, when the wearable electronic device identifies that the behavior scene is the new behavior scene, the user interface can be switched to the user interface corresponding to the new behavior scene.
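The two-part update above — add labeled samples for the new behavior scene, then associate the scene with an existing interface style — can be sketched in one step; the data structures and names are illustrative assumptions.

```python
# Hypothetical sketch of adding a new behavior scene and updating the
# scene-to-interface correspondence; structures are assumed for illustration.
def add_scene(model_samples, scene_to_ui, new_scene, features, ui_style):
    """model_samples: list of (features, label) training pairs.
    scene_to_ui: dict mapping behavior scenes to interface styles."""
    model_samples.append((features, new_scene))  # data to retrain the model
    scene_to_ui[new_scene] = ui_style            # updated correspondence
    return scene_to_ui[new_scene]
```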
Step S350: and switching the current interface displayed by the wearable electronic equipment to the user interface corresponding to the current behavior scene.
In this embodiment of the application, the wearable electronic device may determine, according to the correspondence obtained in step S310, a user interface corresponding to the current behavior scenario, and switch the displayed current interface to the user interface corresponding to the current behavior scenario.
According to the interface switching method provided by the embodiment of the application, the corresponding relation between the behavior scene and the user interface is obtained in advance, then the behavior data is obtained, the behavior characteristics are extracted, the behavior characteristics are input into the trained preset model, the current behavior scene is obtained, the user interface corresponding to the current behavior scene is determined according to the corresponding relation between the behavior scene and the user interface, the displayed current interface is switched to the user interface corresponding to the current behavior scene, and the automatic switching of the user interface is achieved. And the corresponding relation between the behavior scene and the user interface can be freely set by the user, so that the requirements of different users are met.
Referring to fig. 8, fig. 8 is a schematic flowchart illustrating an interface switching method according to still another embodiment of the present application. The interface switching method may be applied to the wearable electronic device, where the wearable electronic device includes a sensor for acquiring behavior data, and as will be described in detail with reference to the flow shown in fig. 8, the interface switching method may specifically include the following steps:
step S410: a training data set is obtained, wherein the training data set comprises sample behavior features labeled with behavior scenes.
In this embodiment of the present application, a training method for the preset model of the foregoing embodiments is also provided. It is worth noting that the preset model may be trained in advance according to the acquired training data set; thereafter, the trained preset model can be used each time a behavior scene is recognized, without retraining the model for every recognition.
In the embodiment of the application, behavior data under different behavior scenes can be collected, behavior features corresponding to the behavior data are extracted as sample behavior features, and the sample behavior features are labeled with their behavior scenes. In this way, sample behavior features corresponding to a plurality of behavior scenes can be obtained.
In the training data set, the sample behavior characteristics are input samples for training, the labeled behavior scenes are output samples for training, and each set of training data may include one input sample and one output sample.
Step S420: and inputting the training data set into a neural network, training the neural network to obtain the trained preset model, wherein the preset model can determine a behavior scene corresponding to the behavior characteristic according to the behavior characteristic.
In the embodiment of the application, the training data set can be input into the neural network for training, so as to obtain the preset model. The neural network may be a deep neural network, which is not limited here.
The training of the initial model from the training data set is explained below.
The sample behavior features in a group of data in the training data set serve as input samples of the neural network, and the behavior scene labeled in that group of data serves as the output sample of the neural network. In the neural network, the neurons in the input layer are fully connected with the neurons in the hidden layer, and the neurons in the hidden layer are fully connected with the neurons in the output layer, so that potential features of different granularities can be effectively extracted. There may be multiple hidden layers, so that nonlinear relationships can be fitted better and the trained preset model is more accurate. It is understood that the training process for the preset model may or may not be performed by the wearable electronic device. When the training process is not performed by the wearable electronic device, the wearable electronic device can use the trained model directly or indirectly; that is, the wearable electronic device can send the acquired behavior features to a server in which the preset model is stored, and obtain the behavior scene recognition result from the server.
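The application trains a fully connected neural network; as a compact, dependency-free stand-in that follows the same data flow (labeled sample features in, a behavior-scene recognizer out), the sketch below fits a nearest-centroid classifier instead. The model class, feature values, and scene labels are illustrative substitutions, not the application's network.

```python
# Stand-in for the neural-network training of steps S410-S420: a
# nearest-centroid classifier trained on (features, scene) sample pairs.
def train_preset_model(training_set):
    """training_set: list of (feature_vector, scene_label) pairs."""
    sums, counts = {}, {}
    for features, scene in training_set:
        acc = sums.setdefault(scene, [0.0] * len(features))
        for i, v in enumerate(features):
            acc[i] += v
        counts[scene] = counts.get(scene, 0) + 1
    # The "model" is one mean feature vector (centroid) per behavior scene.
    return {s: [v / counts[s] for v in acc] for s, acc in sums.items()}

def recognize_scene(model, features):
    """Return the scene whose centroid is closest to the input features."""
    def dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(features, centroid))
    return min(model, key=lambda s: dist(model[s]))
```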
In some embodiments, the preset model obtained through training may be stored locally in the wearable electronic device, or may be stored in a server communicatively connected to the wearable electronic device. Storing the preset model in the server reduces the storage space occupied on the wearable electronic device and improves its operating efficiency.
In some embodiments, the preset model may be periodically or aperiodically trained and updated by acquiring new training data.
Step S430: and acquiring the behavior data acquired by the sensor.
Step S440: and performing feature extraction on the behavior data to obtain behavior features.
Step S450: inputting the behavior characteristics into a trained preset model to obtain a current behavior scene where the wearable electronic device is located, wherein the preset model is trained in advance to output a recognition result of the behavior scene according to the input behavior characteristics.
Step S460: and switching the current interface displayed by the wearable electronic equipment to the user interface corresponding to the current behavior scene.
In some embodiments, before switching the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scenario, the interface switching method may further include: determining whether a user interface corresponding to the current behavior scene exists in the wearable electronic device; if the user interface corresponding to the current behavior scene exists, switching the current interface displayed by the wearable electronic equipment to the user interface corresponding to the current behavior scene; if the user interface corresponding to the current behavior scene does not exist, acquiring the use frequency of each user interface in all user interfaces existing in the wearable electronic equipment; and switching the current interface displayed by the wearable electronic equipment to the user interface with the highest use frequency according to the use frequency of each user interface.
It can be understood that, if the user interface corresponding to the identified current behavior scenario is not stored in the wearable electronic device, the wearable electronic device cannot switch the displayed current interface to the user interface corresponding to the current behavior scenario at this time. Under the condition, the current interface displayed by the wearable electronic equipment can be switched to the user interface with the highest use frequency, so that the switched user interface can meet the requirements of users as much as possible.
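The fallback described above can be sketched as follows: use the scene's configured interface if one exists, otherwise fall back to the most frequently used interface. The interface names and usage counts are hypothetical.

```python
# Sketch of the interface selection with usage-frequency fallback.
def choose_interface(scene_to_ui, usage_counts, current_scene):
    """scene_to_ui: scene -> interface; usage_counts: interface -> use count."""
    ui = scene_to_ui.get(current_scene)
    if ui is not None:
        return ui
    # No interface stored for this scene: pick the most frequently used one,
    # so the switched interface meets the user's needs as far as possible.
    return max(usage_counts, key=usage_counts.get)
```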
The interface switching method provided by the embodiment of the application provides a training method of a preset model, the initial model is trained through the sample behavior characteristics marked with the behavior scene, so that the trained preset model is obtained, and the preset model can be used for outputting the recognition result of the behavior scene according to the behavior characteristics corresponding to the collected behavior data. The wearable electronic equipment obtains the current behavior scene by acquiring the behavior data, extracting the behavior characteristics and inputting the behavior characteristics into the trained preset model, and switches the displayed current interface into the user interface corresponding to the current behavior scene, so that the user interface is automatically switched, the operation of a user is reduced, and the user experience is improved.
Referring to fig. 9, a block diagram of an interface switching device 400 according to an embodiment of the present disclosure is shown. The interface switching device 400 is applied to the wearable electronic device, which includes a sensor for acquiring behavior data. The interface switching apparatus 400 includes: a data acquisition module 410, a feature acquisition module 420, a scene recognition module 430, and an interface switching module 440. The data acquisition module 410 is configured to acquire behavior data acquired by the sensor; the feature obtaining module 420 is configured to perform feature extraction on the behavior data to obtain behavior features; the scene recognition module 430 is configured to input the behavior feature into a trained preset model, to obtain a current behavior scene where the wearable electronic device is located, where the preset model is pre-trained to output a recognition result of the behavior scene according to the input behavior feature; the interface switching module 440 is configured to switch a current interface displayed by the wearable electronic device to a user interface corresponding to the current behavior scene.
In some implementations, the scene recognition module 430 includes a feature processing unit and a feature input unit. The characteristic processing unit is used for preprocessing the behavior characteristics to obtain preprocessed behavior characteristics; and the characteristic input unit is used for inputting the preprocessed behavior characteristics into a trained preset model.
In this embodiment, the feature processing unit may be specifically configured to: and carrying out feature cleaning and feature mining on the behavior features.
In some embodiments, the interface switching module 440 may be further configured to, after switching the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene, switch the user interface displayed by the wearable electronic device to a target user interface corresponding to a switching operation when a switching operation is detected.
In some embodiments, the interface switching apparatus 400 may further include: and a model correction module. And the model correction module is used for inputting the behavior characteristics marked with the target behavior scene into the preset model and carrying out correction training on the preset model if the switching operation is detected within the preset duration and the target user interface corresponds to the target behavior scene.
In some embodiments, the interface switching apparatus 400 may further include: the device comprises a track acquisition module and an operation acquisition module. The track acquisition module is used for acquiring a shaking track of the wearable electronic equipment; the operation acquisition module is used for determining that switching operation is detected if the shaking track meets a preset track condition.
Further, the interface switching module 440 switches the user interface displayed by the wearable electronic device to a target user interface corresponding to a switching operation, including: acquiring a target user interface corresponding to the shaking track; switching the user interface displayed by the wearable electronic device to the target user interface.
In some embodiments, the interface switching apparatus 400 may further include: and a corresponding relation obtaining module. The corresponding relation obtaining module is used for obtaining the corresponding relation between the behavior scene and the user interface before the current interface displayed by the wearable electronic equipment is switched to the user interface corresponding to the current behavior scene.
In this embodiment, the interface switching module 440 may include: an interface determining unit, configured to determine, according to the correspondence, a user interface corresponding to the current behavior scene; and the switching execution unit is used for switching the current interface displayed by the wearable electronic equipment to the user interface corresponding to the current behavior scene.
In this embodiment, the correspondence obtaining module may include: the interface display unit is used for displaying a setting interface, and the setting interface is used for setting the corresponding relation between a behavior scene and a user interface; and the corresponding relation setting unit is used for setting the corresponding relation between the behavior scene and the user interface according to the setting operation detected in the setting interface and storing the corresponding relation.
Further, the interface display unit may be specifically configured to: acquiring user interfaces of various types currently existing in the wearable electronic equipment and various behavior scenes identifiable by the preset model; and displaying a setting interface comprising a plurality of first options and a plurality of second options, wherein the first options correspond to the user interfaces of the plurality of styles one by one, and the second options correspond to the plurality of behavior scenes one by one.
In some embodiments, the interface switching apparatus 400 may further include: the device comprises a data acquisition module and a model updating module. The data acquisition module is used for acquiring scene addition data, and the scene addition data comprises a new behavior scene and corresponding behavior characteristics thereof; and the model updating module is used for updating the preset model according to the new behavior scene and the corresponding behavior characteristics.
Further, the interface switching apparatus 400 may further include: and the corresponding relation updating module is used for updating the corresponding relation after the preset model is updated according to the new behavior scene and the corresponding behavior characteristics thereof and the new behavior scene is associated with any one user interface in the user interfaces of the multiple styles according to the detected updating operation.
In this embodiment, the correspondence obtaining module may be specifically configured to: receiving a corresponding relation between a behavior scene and a user interface sent by a server, wherein the corresponding relation is generated by a mobile terminal according to detected association operation of the behavior scene and the user interface and then sent to the server, and the mobile terminal is associated with the wearable electronic equipment.
In some embodiments, the interface switching apparatus 400 may further include: the training data acquisition module is used for acquiring a training data set, and the training data set comprises sample behavior characteristics marked with behavior scenes; and the model training module is used for inputting the training data set into a neural network, training the neural network and obtaining the trained preset model, and the preset model can determine a behavior scene corresponding to the behavior characteristic according to the behavior characteristic.
In some embodiments, the interface switching module 440 may be specifically configured to: determining whether a user interface corresponding to the current behavior scene exists in the wearable electronic device; and if the user interface corresponding to the current behavior scene exists, switching the current interface displayed by the wearable electronic equipment to the user interface corresponding to the current behavior scene.
Further, the interface switching module 440 may further be configured to: if the user interface corresponding to the current behavior scene does not exist, acquiring the use frequency of each user interface in all user interfaces existing in the wearable electronic equipment; and switching the current interface displayed by the wearable electronic equipment to the user interface with the highest use frequency according to the use frequency of each user interface.
In some embodiments, the behavioral data includes: at least one of positioning data, motion data, biometric data, environmental data, and temporal data; the user interface includes a dial interface, a home screen interface, a lock screen interface, or an application interface.
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and modules may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, the coupling between the modules may be electrical, mechanical or other type of coupling.
In addition, functional modules in the embodiments of the present application may be integrated into one processing module, or each of the modules may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode.
In summary, behavior data acquired by a sensor of the wearable electronic device is acquired, feature extraction is performed on the behavior data to acquire behavior features, the behavior features are input into a trained preset model to acquire a current behavior scene where the wearable electronic device is located, the preset model is pre-trained to output a recognition result of the behavior scene according to the input behavior features, and then a current interface displayed by the wearable electronic device is switched to a user interface corresponding to the current behavior scene, so that the current behavior scene where the wearable electronic device is located is automatically recognized according to the behavior features corresponding to the behavior data acquired by the sensor, the displayed current interface is switched to the user interface corresponding to the current behavior scene, user operations in the switched user interface are reduced, and user experience is improved.
Referring to fig. 10, a block diagram of a wearable electronic device according to an embodiment of the present application is shown. The wearable electronic device 100 may be an electronic device capable of running an application, such as a smart watch, a smart bracelet, and smart glasses. Wearable electronic device 100 in the present application may include one or more of the following components: a processor 110, a memory 120, and one or more applications, wherein the one or more applications may be stored in the memory 120 and configured to be executed by the one or more processors 110, the one or more programs configured to perform a method as described in the aforementioned method embodiments.
Processor 110 may include one or more processing cores. Processor 110 connects various parts of the entire wearable electronic device 100 using various interfaces and circuits, and performs various functions of wearable electronic device 100 and processes data by running or executing instructions, programs, code sets, or instruction sets stored in memory 120 and invoking data stored in memory 120. Optionally, the processor 110 may be implemented in hardware using at least one of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 110 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The CPU mainly handles the operating system, user interface, application programs, and the like; the GPU is responsible for rendering and drawing display content; and the modem handles wireless communication. It is understood that the modem may not be integrated into the processor 110, and may instead be implemented by a separate communication chip.
The memory 120 may include a Random Access Memory (RAM) or a Read-Only Memory (ROM). The memory 120 may be used to store instructions, programs, code sets, or instruction sets. The memory 120 may include a program storage area and a data storage area, wherein the program storage area may store instructions for implementing an operating system, instructions for implementing at least one function (such as a touch function, a sound playing function, an image playing function, etc.), instructions for implementing the foregoing method embodiments, and the like. The data storage area may also store data created by the wearable electronic device 100 in use, such as a phonebook, audio and video data, chat log data, and the like.
Referring to fig. 11, a block diagram of a computer-readable storage medium according to an embodiment of the present application is shown. The computer-readable medium 800 has stored therein a program code that can be called by a processor to execute the method described in the above-described method embodiments.
The computer-readable storage medium 800 may be an electronic memory such as a flash memory, an EEPROM (Electrically Erasable Programmable Read-Only Memory), an EPROM, a hard disk, or a ROM. Optionally, the computer-readable storage medium 800 includes a non-volatile computer-readable storage medium. The computer-readable storage medium 800 has storage space for program code 810 for performing any of the method steps of the methods described above. These program codes may be read from or written into one or more computer program products. The program code 810 may, for example, be compressed in a suitable form.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solutions of the present application, not to limit them. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications and replacements do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present application.
Claims (20)
- An interface switching method, applied to a wearable electronic device, wherein the wearable electronic device comprises a sensor for collecting behavior data, and the method comprises the following steps:
acquiring behavior data collected by the sensor;
performing feature extraction on the behavior data to obtain behavior features;
inputting the behavior features into a trained preset model to obtain a current behavior scene in which the wearable electronic device is located, wherein the preset model is trained in advance to output a recognition result of the behavior scene according to the input behavior features;
and switching the current interface displayed by the wearable electronic device to a user interface corresponding to the current behavior scene.
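The four claimed steps can be sketched as a small pipeline. This is an illustrative sketch, not part of the claims: the names (`extract_features`, `classify_scene`, `INTERFACE_FOR_SCENE`), the feature set, and the threshold-based stand-in for the trained preset model are all assumptions made for the example.

```python
# Sketch of the claim-1 pipeline: sensor data -> features -> scene -> interface.
# A real preset model would be a trained classifier; here a threshold rule
# stands in so the flow is runnable end to end.
import statistics

# Hypothetical behavior scene -> user interface correspondence.
INTERFACE_FOR_SCENE = {
    "running": "sport_dial",
    "sleeping": "minimal_dial",
    "commuting": "transit_dial",
}

def extract_features(samples):
    """Feature extraction: reduce a window of accelerometer magnitudes to statistics."""
    return {"mean": statistics.mean(samples), "std": statistics.pstdev(samples)}

def classify_scene(features):
    """Stand-in for the trained preset model's recognition result."""
    if features["std"] > 5.0:
        return "running"
    if features["mean"] < 0.5:
        return "sleeping"
    return "commuting"

def switch_interface(current_interface, samples):
    """Switch the displayed interface to the one bound to the recognized scene."""
    scene = classify_scene(extract_features(samples))
    return INTERFACE_FOR_SCENE.get(scene, current_interface)

print(switch_interface("default_dial", [9.8, 22.0, 1.0, 18.5, 3.2]))  # -> sport_dial
```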
- The method of claim 1, wherein inputting the behavior features into a trained preset model comprises:
preprocessing the behavior features to obtain preprocessed behavior features;
and inputting the preprocessed behavior features into the trained preset model.
- The method of claim 2, wherein the preprocessing the behavior features comprises:
performing feature cleaning and feature mining on the behavior features.
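One illustrative reading of "feature cleaning and feature mining" is: discard invalid readings, then derive aggregate features from what remains. The function names, the validity bounds, and the choice of heart-rate-like samples are assumptions for the example, not the patent's definitions.

```python
# Hypothetical preprocessing for claim 3: clean, then mine.
import math

def clean_features(raw):
    """Feature cleaning: drop None/NaN and physically implausible values."""
    return [v for v in raw
            if v is not None and not math.isnan(v) and 0.0 <= v < 160.0]

def mine_features(values):
    """Feature mining: derive aggregate features from the cleaned series."""
    n = len(values)
    mean = sum(values) / n
    variance = sum((v - mean) ** 2 for v in values) / n
    return {"mean": mean, "variance": variance, "range": max(values) - min(values)}

raw = [72.0, None, 75.0, float("nan"), 300.0, 78.0]  # e.g. heart-rate samples
print(mine_features(clean_features(raw)))  # -> {'mean': 75.0, 'variance': 6.0, 'range': 6.0}
```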
- The method of any one of claims 1 to 3, wherein after switching the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene, the method further comprises:
when a switching operation is detected, switching the user interface displayed by the wearable electronic device to a target user interface corresponding to the switching operation.
- The method of claim 4, wherein after switching the user interface displayed by the wearable electronic device to the target user interface corresponding to the switching operation, the method further comprises:
if the switching operation is detected within a preset time and the target user interface corresponds to a target behavior scene, inputting the behavior features labeled with the target behavior scene into the preset model, and performing correction training on the preset model.
- The method of claim 4 or 5, wherein before switching the user interface displayed by the wearable electronic device to the target user interface corresponding to the switching operation when the switching operation is detected, the method further comprises:
acquiring a shaking track of the wearable electronic device;
and if the shaking track meets a preset track condition, determining that the switching operation is detected.
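Claims 6 and 7 can be sketched together: quantize the shaking motion into a track, and if the track matches a preset condition, look up the target interface it corresponds to. Encoding the track as a string of left/right steps, the threshold value, and the track-to-interface table are all assumptions made for illustration.

```python
# Hypothetical shake-track detection for claims 6-7.

# Preset track condition -> target user interface (illustrative values).
TRACK_TO_INTERFACE = {
    "LRLR": "music_interface",
    "RLRL": "dial_interface",
}

def quantize(ax_samples, threshold=3.0):
    """Quantize x-axis acceleration into 'L'/'R' steps, ignoring small motion."""
    steps = []
    for a in ax_samples:
        if a > threshold:
            steps.append("R")
        elif a < -threshold:
            steps.append("L")
    return "".join(steps)

def detect_switch(track):
    """Return the target interface if the track meets a preset track condition, else None."""
    return TRACK_TO_INTERFACE.get(track)

track = quantize([-8.0, 9.0, -7.0, 10.0])
print(track, "->", detect_switch(track))  # -> LRLR -> music_interface
```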
- The method of claim 6, wherein switching the user interface displayed by the wearable electronic device to the target user interface corresponding to the switching operation comprises:
acquiring a target user interface corresponding to the shaking track;
and switching the user interface displayed by the wearable electronic device to the target user interface.
- The method of any one of claims 1 to 7, wherein before switching the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene, the method further comprises:
acquiring a correspondence between behavior scenes and user interfaces;
and the switching the current interface displayed by the wearable electronic device to the user interface corresponding to the behavior scene comprises:
determining the user interface corresponding to the current behavior scene according to the correspondence;
and switching the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene.
- The method of claim 8, wherein acquiring the correspondence between behavior scenes and user interfaces comprises:
displaying a setting interface, wherein the setting interface is used for setting the correspondence between behavior scenes and user interfaces;
and setting the correspondence between behavior scenes and user interfaces according to a setting operation detected in the setting interface, and storing the correspondence.
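The stored correspondence of claims 8 and 9 behaves like a small key-value store that is written when a setting operation is detected and read at switch time. The class and method names below are illustrative, not the patent's API.

```python
# Hypothetical settings store for the scene -> interface correspondence.

class CorrespondenceStore:
    def __init__(self):
        self._mapping = {}  # behavior scene -> user interface

    def set_correspondence(self, scene, interface):
        """Record a correspondence when a setting operation is detected."""
        self._mapping[scene] = interface

    def interface_for(self, scene):
        """Resolve the user interface for a recognized behavior scene, or None."""
        return self._mapping.get(scene)

store = CorrespondenceStore()
store.set_correspondence("running", "sport_dial")
print(store.interface_for("running"))  # -> sport_dial
```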
- The method of claim 9, wherein displaying the setting interface comprises:
acquiring the user interfaces of multiple styles currently existing in the wearable electronic device and the multiple behavior scenes recognizable by the preset model;
and displaying a setting interface comprising multiple first options and multiple second options, wherein the first options correspond one-to-one to the user interfaces of the multiple styles, and the second options correspond one-to-one to the multiple behavior scenes.
- The method of claim 10, further comprising:
acquiring scene addition data, wherein the scene addition data comprises a new behavior scene and its corresponding behavior features;
and updating the preset model according to the new behavior scene and the corresponding behavior features.
- The method of claim 11, wherein after updating the preset model according to the new behavior scene and the corresponding behavior features, the method further comprises:
according to a detected updating operation, after the new behavior scene is associated with any one of the user interfaces of the multiple styles, updating the correspondence.
- The method of claim 8, wherein acquiring the correspondence between behavior scenes and user interfaces comprises:
receiving the correspondence between behavior scenes and user interfaces sent by a server, wherein the correspondence is generated by a mobile terminal according to a detected association operation between behavior scenes and user interfaces and is then sent to the server, and the mobile terminal is associated with the wearable electronic device.
- The method of any one of claims 1 to 13, wherein the preset model is trained by:
acquiring a training data set, wherein the training data set comprises sample behavior features labeled with behavior scenes;
and inputting the training data set into a neural network and training the neural network to obtain the trained preset model, wherein the preset model can determine, according to a behavior feature, the behavior scene corresponding to that behavior feature.
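The training step of claim 14 can be illustrated at toy scale: labeled behavior features go in, gradient descent fits the weights, and the fitted model maps new features to a scene. A single sigmoid neuron, the two-feature dataset, and the scene labels (0 = still, 1 = running) are all assumptions; a real preset model would be a larger network.

```python
# Toy version of the claim-14 training loop: a one-neuron "network" fitted
# with gradient descent on behavior features labeled with scenes.
import math

def train(samples, labels, epochs=500, lr=0.5):
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            z = w[0] * x[0] + w[1] * x[1] + b
            p = 1.0 / (1.0 + math.exp(-z))  # sigmoid activation
            err = p - y                      # gradient of the cross-entropy loss w.r.t. z
            w[0] -= lr * err * x[0]
            w[1] -= lr * err * x[1]
            b -= lr * err
    return w, b

def predict(model, x):
    """Recognition result: 1 ('running') if the neuron fires, else 0 ('still')."""
    w, b = model
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Sample behavior features: (mean acceleration magnitude, variance) per window.
X = [(0.1, 0.05), (0.2, 0.02), (6.0, 4.0), (7.5, 5.2)]
y = [0, 0, 1, 1]
model = train(X, y)
print(predict(model, (6.8, 4.5)))  # -> 1, the "running" scene
```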
- The method of any one of claims 1 to 14, wherein before switching the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene, the method further comprises:
determining whether a user interface corresponding to the current behavior scene exists in the wearable electronic device;
and if the user interface corresponding to the current behavior scene exists, switching the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene.
- The method of claim 15, wherein after determining whether the user interface corresponding to the current behavior scene exists in the wearable electronic device, the method further comprises:
if the user interface corresponding to the current behavior scene does not exist, acquiring the use frequency of each user interface among all the user interfaces existing in the wearable electronic device;
and switching the current interface displayed by the wearable electronic device to the user interface with the highest use frequency according to the use frequency of each user interface.
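The fallback in claim 16 reduces to picking the maximum of a frequency table. A minimal sketch, with hypothetical interface names and counts:

```python
# Claim-16 fallback: no interface is bound to the recognized scene,
# so switch to the most frequently used interface instead.
from collections import Counter

def fallback_interface(usage_counts):
    """Return the user interface with the highest use frequency."""
    return Counter(usage_counts).most_common(1)[0][0]

usage = {"sport_dial": 12, "minimal_dial": 30, "transit_dial": 7}
print(fallback_interface(usage))  # -> minimal_dial
```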
- The method of any one of claims 1 to 16, wherein the behavior data comprises at least one of positioning data, motion data, biometric data, environmental data, and time data; and the user interface comprises a dial interface, a home screen interface, a lock screen interface, or an application interface.
- An interface switching apparatus, applied to a wearable electronic device, wherein the wearable electronic device comprises a sensor for collecting behavior data, and the apparatus comprises a data acquisition module, a feature acquisition module, a scene recognition module, and an interface switching module, wherein:
the data acquisition module is configured to acquire behavior data collected by the sensor;
the feature acquisition module is configured to perform feature extraction on the behavior data to obtain behavior features;
the scene recognition module is configured to input the behavior features into a trained preset model to obtain the current behavior scene in which the wearable electronic device is located, the preset model being trained in advance to output a recognition result of the behavior scene according to the input behavior features;
and the interface switching module is configured to switch the current interface displayed by the wearable electronic device to the user interface corresponding to the current behavior scene.
- A wearable electronic device, comprising:
one or more processors;
a memory;
and one or more applications, wherein the one or more applications are stored in the memory and configured to be executed by the one or more processors, the one or more applications being configured to perform the method of any one of claims 1 to 17.
- A computer-readable storage medium having program code stored therein, the program code being invoked by a processor to perform the method of any one of claims 1 to 17.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2019/114076 WO2021081768A1 (en) | 2019-10-29 | 2019-10-29 | Interface switching method and apparatus, wearable electronic device and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114223139A true CN114223139A (en) | 2022-03-22 |
CN114223139B CN114223139B (en) | 2023-11-24 |
Family
ID=75714716
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201980099239.XA Active CN114223139B (en) | 2019-10-29 | 2019-10-29 | Interface switching method and device, wearable electronic equipment and storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN114223139B (en) |
WO (1) | WO2021081768A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114675740A (en) * | 2022-03-29 | 2022-06-28 | 西安歌尔泰克电子科技有限公司 | Dial plate switching method, device and system of wrist strap equipment and storage medium |
CN117785353A (en) * | 2023-12-01 | 2024-03-29 | 珠海市杰理科技股份有限公司 | An adaptive watch dial generation method, device and storage medium |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113960911B (en) * | 2021-11-02 | 2022-09-20 | 珠海读书郎软件科技有限公司 | System and method for automatically generating and switching watch dial of sports watch |
CN116173484A (en) * | 2023-03-03 | 2023-05-30 | 乐渊网络科技(上海)有限公司 | Motion data processing method and device and electronic equipment |
CN118536665A (en) * | 2024-05-28 | 2024-08-23 | 广东壹健康健康产业集团股份有限公司 | Intelligent finger ring scene prediction method and device based on user physiological data |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107817891A (en) * | 2017-11-13 | 2018-03-20 | 广东欧珀移动通信有限公司 | Screen control method, device, equipment and storage medium |
CN110010224A (en) * | 2019-03-01 | 2019-07-12 | 出门问问信息科技有限公司 | User movement data processing method, device, wearable device and storage medium |
CN110134316A (en) * | 2019-04-17 | 2019-08-16 | 华为技术有限公司 | Model training method, Emotion identification method and relevant apparatus and equipment |
CN110334497A (en) * | 2019-06-28 | 2019-10-15 | Oppo广东移动通信有限公司 | Switching method and wearable electronic equipment, the storage medium of display interface |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102208110B1 (en) * | 2013-09-11 | 2021-01-27 | 엘지전자 주식회사 | Wearable computing device and user interface method |
CN106598222A (en) * | 2016-11-14 | 2017-04-26 | 上海斐讯数据通信技术有限公司 | Scene mode switching method and system |
CN107422944A (en) * | 2017-06-09 | 2017-12-01 | 广东乐心医疗电子股份有限公司 | Method and device for automatically adjusting menu display mode and wearable device |
CN108764059B (en) * | 2018-05-04 | 2021-01-01 | 南京邮电大学 | Human behavior recognition method and system based on neural network |
CN108831526A (en) * | 2018-05-21 | 2018-11-16 | 四川斐讯信息技术有限公司 | A kind of wearable sports equipment of intelligence with identification function and its recognition methods |
CN108703760A (en) * | 2018-06-15 | 2018-10-26 | 安徽中科智链信息科技有限公司 | Human motion gesture recognition system and method based on nine axle sensors |
2019
- 2019-10-29 WO PCT/CN2019/114076 patent/WO2021081768A1/en active Application Filing
- 2019-10-29 CN CN201980099239.XA patent/CN114223139B/en active Active
Also Published As
Publication number | Publication date |
---|---|
CN114223139B (en) | 2023-11-24 |
WO2021081768A1 (en) | 2021-05-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN114223139B (en) | Interface switching method and device, wearable electronic equipment and storage medium | |
CN110765939B (en) | Identity recognition method and device, mobile terminal and storage medium | |
CN103024529B (en) | The user profile processing method of intelligent television | |
CN110781881B (en) | A method, device, equipment and storage medium for identifying sports scores in video | |
CN111339842A (en) | Video jamming identification method and device and terminal equipment | |
CN112587920B (en) | Equipment control method, device, electronic equipment and storage medium | |
CN111027507A (en) | Training data set generation method and device based on video data identification | |
CN111524513A (en) | Wearable device and voice transmission control method, device and medium thereof | |
CN111507268B (en) | Alarm method and device, storage medium and electronic device | |
CN110109899B (en) | Internet of things data filling method, device and system | |
CN105554373A (en) | Photographing processing method and device and terminal | |
CN111860082A (en) | Information processing method, device and system | |
CN106598222A (en) | Scene mode switching method and system | |
WO2021147473A1 (en) | Model training method, content generation method, and related devices | |
CN111797867A (en) | System resource optimization method, device, storage medium and electronic device | |
CN110275639B (en) | Touch data processing method and device, terminal and storage medium | |
WO2024159914A1 (en) | Weak supervision time sequence boundary positioning method and apparatus, electronic device, and storage medium | |
CN113225676A (en) | Near field communication setting method and device, mobile terminal and storage medium | |
CN106155707A (en) | Information processing method and electronic equipment | |
CN110309740A (en) | Gesture identification method, wearable device and gestural control system | |
JP2010191589A (en) | Action prediction apparatus, action prediction method, and program | |
CN115578752A (en) | Human body recognition method and device, electronic equipment and storage medium | |
CN112820273B (en) | Wake-up judging method and device, storage medium and electronic equipment | |
CN110232393B (en) | Data processing method and device, storage medium and electronic device | |
CN114005174A (en) | Method and device for determining working state, electronic equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||