CN109238306A - Step counting data verification method, device, storage medium and terminal based on wearable device - Google Patents
- Publication number: CN109238306A (application CN201811000964.0A)
- Authority: CN (China)
- Prior art keywords: step counting data, wearable device, user, motion feature
- Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G—PHYSICS; G01—MEASURING; TESTING; G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C25/00—Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
- G01C22/00—Measuring distance traversed on the ground by vehicles, persons, animals or other moving solid bodies, e.g. using odometers, using pedometers; G01C22/006—Pedometers
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Manufacturing & Machinery (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The embodiments of the present application disclose a step counting data verification method, apparatus, storage medium and terminal based on a wearable device. The method comprises: first, determining a motion feature of the current scene according to image information; then, obtaining initial step counting data of a target user; and finally, verifying the initial step counting data according to the motion feature to obtain target step counting data. The initial step counting data can thus be verified based on image information, improving the accuracy of the step counting data.
Description
Technical field
The embodiments of the present application relate to the technical field of mobile terminals, and more particularly to a step counting data verification method, apparatus, storage medium and terminal based on a wearable device.
Background technique
With the continuous development of wearable devices, their functions have become increasingly rich, and wearable devices are gradually developing toward being convenient and small in size. Step counting data is an important input for user health data, so its accuracy is particularly significant.
At present, step counting statistics are usually collected through a smart bracelet or smart phone worn by the user. However, the statistics may deviate from the user's actual steps, so the accuracy of the step counting data is low.
Summary of the invention
The purpose of the embodiments of the present application is to provide a step counting data verification method, apparatus, storage medium and terminal based on a wearable device that can improve the accuracy of step counting data.
In a first aspect, an embodiment of the present application provides a step counting data verification method based on a wearable device, comprising:
determining a motion feature of a current scene according to image information;
obtaining initial step counting data of a target user; and
verifying the initial step counting data according to the motion feature to obtain target step counting data.
In a second aspect, an embodiment of the present application provides a step counting data verification apparatus based on a wearable device, comprising:
a determining module, configured to determine a motion feature of a current scene according to image information;
an obtaining module, configured to obtain initial step counting data of a target user; and
a verification module, configured to verify the initial step counting data obtained by the obtaining module according to the motion feature determined by the determining module, to obtain target step counting data.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium on which a computer program is stored; when the program is executed by a processor, the step counting data verification method based on a wearable device according to the first aspect is implemented.
In a fourth aspect, an embodiment of the present application provides a terminal, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor; when the processor executes the computer program, the step counting data verification method based on a wearable device according to the first aspect is implemented.
In the step counting data verification scheme based on a wearable device provided in the embodiments of the present application, first, the motion feature of the current scene is determined according to image information; then, the initial step counting data of the target user is obtained; finally, the initial step counting data is verified according to the motion feature to obtain target step counting data. The initial step counting data can thus be verified based on image information, improving the accuracy of the step counting data.
Detailed description of the invention
Fig. 1 is a flow diagram of a step counting data verification method based on a wearable device provided by an embodiment of the present application;
Fig. 2 is a flow diagram of another step counting data verification method based on a wearable device provided by an embodiment of the present application;
Fig. 3 is a flow diagram of another step counting data verification method based on a wearable device provided by an embodiment of the present application;
Fig. 4 is a flow diagram of another step counting data verification method based on a wearable device provided by an embodiment of the present application;
Fig. 5 is a flow diagram of another step counting data verification method based on a wearable device provided by an embodiment of the present application;
Fig. 6 is a flow diagram of another step counting data verification method based on a wearable device provided by an embodiment of the present application;
Fig. 7 is a structural diagram of a step counting data verification apparatus based on a wearable device provided by an embodiment of the present application;
Fig. 8 is a structural diagram of a terminal device provided by an embodiment of the present application;
Fig. 9 is a structural diagram of a wearable device provided by an embodiment of the present application;
Fig. 10 is a pictorial diagram of a wearable device provided by an embodiment of the present application.
Specific embodiment
The technical solution of the present application is further described below with reference to the accompanying drawings and specific embodiments. It should be understood that the specific embodiments described herein are only intended to explain the present application, not to limit it. It should also be noted that, for ease of description, the drawings show only the parts relevant to the present application rather than the entire structure.
Before the exemplary embodiments are discussed in greater detail, it should be mentioned that some exemplary embodiments are described as processes or methods depicted as flowcharts. Although a flowchart describes the steps as a sequential process, many of the steps may be implemented in parallel, concurrently or simultaneously. In addition, the order of the steps may be rearranged. A process may be terminated when its operations are completed, but may also have additional steps not included in the drawings. A process may correspond to a method, a function, a procedure, a subroutine, a subprogram and the like.
With the continuous development of wearable devices, their functions have become increasingly rich, and wearable devices are gradually developing toward being convenient and small in size. Meanwhile, a user can obtain step counting data through an electronic device with a step counting function, such as a wearable device, a smart bracelet, a smart watch, a smart phone or a tablet computer. The step counting algorithm of such an electronic device usually determines the user's step count from the vibration of an oscillator or from changes in the pulse signals detected on multiple axes of an acceleration sensor. However, errors can occur, such as step counts being generated while the user is seated.
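For illustration only, the following minimal Python sketch shows the kind of peak counting over the acceleration magnitude that such sensor-only pedometers typically use; the sampling rate, threshold and debounce interval are assumed values, not taken from this application. Because it looks only at the acceleration magnitude, local motion of the arm or wrist while the user is seated can produce the same peaks as walking, which is exactly the error addressed here.

```python
import math

def count_steps(samples, fs_hz=50.0, threshold=1.2, min_interval_s=0.3):
    """Naive pedometer: count peaks in the acceleration magnitude.

    samples: list of (ax, ay, az) readings in units of g.
    A local peak above `threshold` that occurs at least `min_interval_s`
    after the previously counted peak is treated as one step.
    """
    mags = [math.sqrt(ax * ax + ay * ay + az * az) for ax, ay, az in samples]
    steps = 0
    last_step_idx = -int(min_interval_s * fs_hz)
    for i in range(1, len(mags) - 1):
        is_peak = mags[i] > mags[i - 1] and mags[i] >= mags[i + 1]
        if is_peak and mags[i] > threshold and i - last_step_idx >= min_interval_s * fs_hz:
            steps += 1
            last_step_idx = i
    return steps
```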
The embodiments of the present application provide a step counting data verification method based on a wearable device, which can check initial step counting data against image information and correct the step counting data in time when the image information does not match the step counting data. This avoids step counting changes caused by local motion of the user's body when physical motion changes are detected only by sensors, thereby improving the accuracy of the step counting data. The concrete scheme is as follows:
Fig. 1 is a flow diagram of the step counting data verification method based on a wearable device provided by an embodiment of the present application. The method is applicable to the case where step counting is performed while a wearable device is used. When the initial step counting data comes from a step counting module of the wearable device, the method may be executed by the wearable device or by a mobile terminal. When the initial step counting data comes from an electronic device other than the wearable device, such as a smart phone, smart bracelet or smart watch, the method may be executed by the mobile terminal. Optionally, to keep the wearable device lightweight, after the image information is acquired by the wearable device it is sent to the mobile terminal for processing. The mobile terminal may be a smart phone, tablet computer, laptop, etc., and the wearable device may be smart glasses, a smart helmet, etc. The method specifically comprises the following steps:
Step 110, determining the motion feature of the current scene according to image information.
The image information may be obtained by the wearable device according to an enable instruction input by the user; that is, the motion feature of the current scene is determined according to the image information obtained by the wearable device. The enable instruction may be triggered when the user continuously touches a preset sensing component on the wearable device, or it may be determined from a voice instruction output by the user, where voiceprint comparison and speech recognition confirm that the enable instruction was output by the primary user.
The motion feature may indicate the motion conditions corresponding to the current scene. For example, the image information is recognized to obtain identification information of the current scene, and the identification information may be an indoor feature or an outdoor feature. Further, a specific identifier in the current scene, such as a two-dimensional code, can be recognized to identify the motion feature of the current scene accurately. For example, when the user is sitting in an office, the pedometer on a smart bracelet or mobile terminal may miscount steps because of local motion of the user's body; in this case, after the motion feature of the current scene is obtained, a corresponding step counting algorithm can be determined according to the motion feature, so that step counting is performed more accurately. As another example, it can be determined from the image information that the user is on a vehicle, and the erroneous step counts generated while the user rides the vehicle can then be corrected more accurately.
Optionally, the motion feature may also be used to indicate the movement direction of the user in the current scene. The user's current movement direction, such as moving forward, jumping, or simply turning the head, can be obtained from the image information, and the step counting data can be verified according to the user's movement direction.
In one implementation, the wearable device is equipped with a camera, and the image information can be obtained through the camera. After the camera of the smart terminal obtains the image information, the image information is sent to the mobile terminal through a data transmission module. The data transmission module may support short-range communication modes such as Bluetooth, wireless LAN (Wireless-Fidelity, Wi-Fi) and the ZigBee protocol. When the camera of the wearable device stops working, the bandwidth resources occupied by the image information can be released.
Further, when the wearable device is in the wearing state, the initial step counting data of the target user is obtained. If the user moves after taking off the glasses, it may otherwise be determined from the image information that the user is not moving while the pedometer still counts the user's movement.
A wearable device may have an independent operating system and may interact with other intelligent terminals (such as a mobile phone or a computer); it is easy to use and small in size. The wearable device in the embodiments of the present application interacts with the terminal, and based on this interaction the terminal detects whether the user is wearing the wearable device.
In the embodiments of the present application, whether the terminal user is wearing the wearable device may be judged by the wearable device itself: if the terminal user wears the wearable device, information is sent to the terminal, and the terminal detects whether the information is received; if it is received, the wearable device is being worn by the terminal user. Alternatively, the wearable device may be responsible only for collecting some detection data and sending the collected detection data to the terminal; the terminal processes the received detection data and completes the judgement of whether the wearable device is worn by the terminal user. Since the convenience and small size of the wearable device mean that its data processing capability is not very strong, when detecting whether the terminal user wears the wearable device it is preferable to send the detection data collected by the wearable device to the terminal for judgement.
Optionally, the detection data used to judge whether the terminal user wears the wearable device may include distance data and biometric data (for example, temperature, iris or face recognition data). Specifically, sensors for collecting the relevant detection data are installed in the wearable device, and whether the terminal user wears the wearable device is analyzed and determined from the data the sensors detect. For example, a temperature sensor in the wearable device may detect whether the temperature of the object in contact with the wearable device meets a preset temperature (such as the normal body temperature of a human body); if so, the terminal user is wearing the wearable device.
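Illustratively, the temperature-based judgement described above might be sketched on the terminal side as follows; the temperature window and the optional distance threshold are assumptions made for the sketch, not values specified in this application.

```python
def is_worn(contact_temp_c, distance_cm=None,
            temp_range_c=(34.0, 38.0), max_distance_cm=2.0):
    """Judge the wearing state from detection data sent by the wearable device.

    The temperature window approximates normal human body surface temperature,
    and the distance check is optional; both thresholds are illustrative
    assumptions rather than values specified by the patent.
    """
    temp_ok = temp_range_c[0] <= contact_temp_c <= temp_range_c[1]
    dist_ok = distance_cm is None or distance_cm <= max_distance_cm
    return temp_ok and dist_ok
```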
Step 120, obtaining the initial step counting data of the target user.
The target user refers to the user of the mobile terminal, who is also the user wearing the wearable device. The step counting data obtained through the acceleration sensor can serve as the initial step counting data, and the acceleration sensor may be a three-axis acceleration sensor. Further, a corresponding step counting algorithm can be determined according to the motion feature, and the initial step counting data is obtained according to that algorithm. The step counting algorithm can determine a corresponding noise reduction algorithm according to the user's different motion states, such as walking, riding a vehicle, running or sitting still, and different noise reduction algorithms correspond to different step counting algorithms.
Step 130, verifying the initial step counting data according to the motion feature to obtain target step counting data.
The step counting frequency corresponding to the motion feature is obtained, and whether the initial step counting data matches the step counting frequency is judged. If the initial step counting data matches the step counting frequency, the verification succeeds; if it does not match, the initial step counting data is corrected according to the motion feature.
In the step counting data verification method based on a wearable device provided in the embodiments of the present application, first, the motion feature of the current scene is determined according to image information; then, the initial step counting data of the target user is obtained; finally, the initial step counting data is verified according to the motion feature to obtain target step counting data. Compared with obtaining the final step counting data only by filtering the sensor detection results, the method provided by the embodiments of the present application can, when preliminary step counting data is obtained, determine the motion feature of the current scene in combination with the image information captured by the wearable device, check the initial step counting data according to that motion feature, and correct step counting data that does not match the motion feature, thereby improving the accuracy of the step counting data.
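For illustration only, the following Python sketch captures the shape of steps 110 to 130: the observed step frequency is compared with the range expected for the motion feature, and the data is corrected on a mismatch. The feature names, frequency ranges and the clamping correction are assumptions used for the sketch, not values or procedures specified in this application.

```python
def verify_steps(initial_steps, motion_feature, window_s, freq_table=None):
    """Verify initial step counting data against the motion feature.

    freq_table maps a motion feature to an expected step-frequency range
    in steps per second; the entries below are placeholders.
    """
    if freq_table is None:
        freq_table = {
            "stationary": (0.0, 0.1),
            "walking": (1.2, 2.2),
            "running": (2.2, 3.5),
        }
    low, high = freq_table.get(motion_feature, (0.0, float("inf")))
    observed = initial_steps / window_s
    if low <= observed <= high:
        return initial_steps                      # verification succeeds
    # Mismatch: one possible correction is clamping to the expected range.
    return round(max(low, min(observed, high)) * window_s)
```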
Fig. 2 is a flow diagram of another step counting data verification method based on a wearable device provided by an embodiment of the present application. As a further explanation of the above embodiment, the method comprises:
Step 210, determining pixel displacement information according to the image information.
The image information obtained by the wearable device may be image frames, and the pixel displacement information of the same pixels can be determined by comparing adjacent frames. Optionally, specific pixels may be randomly selected for tracking; the specific pixels may be the pixel at the center of the image frame or pixels with prominent pixel values in the image frame. Pixel displacement information is obtained by tracking the specific pixels, and the pixel displacement information may be a motion vector.
Step 220, determining the motion feature according to the pixel displacement information.
If the pixel displacements are small, the amount of movement of the user's head is low; if the pixel displacements are large, the amount of movement of the user's head is high. The determined motion feature may be a stationary state or a motion state.
Step 230, obtaining the initial step counting data of the target user.
Step 240, verifying the initial step counting data according to the motion feature to obtain target step counting data.
If the motion feature is the stationary state, whether the initial step counting data matches the stationary state is detected. If the initial step counting data matches the stationary state, the initial step counting data is determined as the target step counting data; if it does not match the stationary state, the extra step counts are eliminated. Further, the prompt function of the wearable device is used to prompt the user about the additional step counting data caused by local motion of the user's body, and the verification process is optimized according to the user's feedback on the prompt. For example, if the user confirms the prompt and then stops the local body motion, continuous prompting can be avoided; if the user feeds back that the prompt is wrong, the verification algorithm is adjusted according to the content of the user's feedback.
The step counting data verification method based on a wearable device provided by this embodiment of the present application can determine the motion feature based on pixel displacement information. Since pixel displacement can quickly and accurately reflect the user's field of view, step counting data generated while the user's field of view does not change is erroneous data; eliminating such erroneous data improves the accuracy of the step counting data.
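Illustratively, steps 210 and 220 might be sketched as follows, assuming grayscale frames held as numpy arrays and a tracked point that lies well inside the frame; the brute-force patch search and the pixel threshold are stand-ins for whatever tracking the device actually uses.

```python
import numpy as np

def pixel_displacement(prev_frame, next_frame, point, patch=8, search=6):
    """Estimate the motion vector of a tracked pixel between two grayscale
    frames by brute-force patch matching (assumes `point` is far enough from
    the frame border that every patch slice has the same shape).
    """
    r, c = point
    ref = prev_frame[r - patch:r + patch, c - patch:c + patch].astype(float)
    best, best_err = (0, 0), float("inf")
    for dr in range(-search, search + 1):
        for dc in range(-search, search + 1):
            cand = next_frame[r + dr - patch:r + dr + patch,
                              c + dc - patch:c + dc + patch].astype(float)
            err = np.abs(ref - cand).mean()
            if err < best_err:
                best, best_err = (dr, dc), err
    return best  # motion vector of the tracked pixel

def head_is_stationary(vectors, threshold_px=2.0):
    """Small displacements across all tracked points imply a stationary head."""
    return all(abs(dr) + abs(dc) <= threshold_px for dr, dc in vectors)
```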
Fig. 3 is a flow diagram of another step counting data verification method based on a wearable device provided by an embodiment of the present application. As a further explanation of the above embodiments, the method comprises:
Step 310, determining the motion state of the target user according to the image information.
Scene information can be determined according to the content contained in the image information, and the motion state of the user can be determined according to the scene information. The scene information may be identified through a specific identifier appearing in the user's field of view; the specific identifier may be a two-dimensional code, text recognized in the scene such as a company name, or a specific object, such as the texture of the user's door or the user's door lock.
Step 320, determining the motion feature according to the motion state.
The user keeps moving through different motion states in a scene, and the sensor filtering algorithms corresponding to different motion states are not identical. The motion feature of the user in the current scene can be determined according to the user's multiple motion states.
Step 330, obtaining the initial step counting data of the target user.
Optionally, the initial step counting data of the target user is obtained based on the motion feature.
Step 340, verifying the initial step counting data according to the motion feature to obtain target step counting data.
The step counting data verification method based on a wearable device provided by this embodiment of the present application can first determine the scene where the user is located according to the image information, then determine the motion state from the scene and perform step counting according to that motion state, thereby optimizing the initial step counting data and improving the reliability of the step counting data.
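For illustration only, a toy lookup such as the one below could stand in for the mapping from a recognized scene to a motion state and filter selection; the scene labels and filter profile names are invented for the sketch.

```python
# Hypothetical mapping from a recognized scene label (e.g. decoded from a
# two-dimensional code or recognized text) to a motion state and to the
# noise-reduction profile used by the step counting algorithm.
SCENE_TO_MOTION_STATE = {
    "office": "sitting",
    "bus": "riding",
    "gym": "exercising",
    "street": "walking",
}

def motion_feature_from_scene(scene_label):
    """Return (motion_state, filter_profile) for a recognized scene label."""
    state = SCENE_TO_MOTION_STATE.get(scene_label, "unknown")
    filter_profile = {
        "sitting": "suppress_local_motion",
        "riding": "suppress_vehicle_vibration",
        "exercising": "relaxed",
        "walking": "default",
    }.get(state, "default")
    return state, filter_profile
```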
Fig. 4 is a flow diagram of another step counting data verification method based on a wearable device provided by an embodiment of the present application. As a further explanation of the above embodiments, the method comprises:
Step 410, determining the user's displacement speed according to the location information of the target user.
While the user wears the wearable device with eyes closed, asleep and so on, the head rotation is small, and it may not be possible to determine accurately from the pixel displacement information alone whether the user is stationary. The user's movement speed can then be obtained through GPS.
Step 420, judging whether the user's displacement speed is less than a preset displacement threshold.
The preset displacement threshold may be 1 m/min. If the user's displacement speed is less than the preset displacement threshold, indicating that the user is not being displaced, step 430 is executed; otherwise, if the user's displacement speed is greater than the preset displacement threshold, step 460 is executed.
Step 430, if the user's displacement speed is less than the preset displacement threshold, determining the motion feature of the current scene according to the image information.
When the user's displacement speed is less than the preset displacement threshold, it can be preliminarily judged that the user is stationary. If the image information obtained by the wearable device indicates that the current scene is an indoor scene, it is determined that the user is in a stationary state.
Step 440, obtaining the initial step counting data of the target user.
Step 450, verifying the initial step counting data according to the motion feature to obtain target step counting data.
Step 460, if the user's displacement speed is greater than the preset displacement threshold, determining the initial step counting data of the target user as the target step counting data.
The step counting data verification method based on a wearable device provided by this embodiment of the present application can combine the user's displacement speed with the current scene to determine more accurately whether the user's movement should produce steps, thereby improving the accuracy of the step counting data.
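Illustratively, the flow of Fig. 4 can be compressed into the following sketch; the 1 m/min threshold is the example value mentioned above, while the indoor test and the choice to discard counts produced while stationary are simplifications made for the sketch rather than the exact correction claimed.

```python
def target_steps(initial_steps, displacement_m_per_min, scene_is_indoor,
                 threshold_m_per_min=1.0):
    """Combine displacement speed (e.g. from GPS) with the image-based scene
    check, following the flow of Fig. 4."""
    if displacement_m_per_min > threshold_m_per_min:
        return initial_steps      # user is really moving (step 460)
    if scene_is_indoor:           # preliminary stationary judgement (step 430)
        return 0                  # discard counts produced while stationary
    return initial_steps
```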
Fig. 5 is a flow diagram of another step counting data verification method based on a wearable device provided by an embodiment of the present application. As a further explanation of the above embodiments, the method comprises:
Step 510, determining the motion feature of the current scene according to the image information.
The current scene is determined from the image information through image analysis, and the motion feature of the current scene is taken as the motion feature of the user. For example, if the user is in a gym, the user's behavior characteristics are similar to those of other users in the gym; the step counting algorithm can be optimized in combination with the step counting data of multiple users, so that a more accurate motion feature of the current scene is obtained. For example, the exercise positions in different areas of the gym differ, and the smart wearable device worn by the user may miscount under the influence of the equipment.
Step 520, obtaining the initial step counting data of the target user.
Step 530, obtaining the step counting data interval corresponding to the motion feature.
The average step count of the step counting data interval corresponding to a motion feature is greater than the average step count of the step counting data interval corresponding to a static feature.
Step 540, verifying the initial step counting data according to the step counting data interval to obtain target step counting data.
The step counting data verification method based on a wearable device provided by this embodiment of the present application can identify the user behavior characteristics corresponding to the current scene more accurately through image analysis, and the step counting data corresponding to different motion features can be quantified based on the step counting data intervals corresponding to those features, improving the accuracy of step counting data verification.
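For illustration only, verification against a step counting data interval might look like the sketch below; the intervals themselves are assumed, since this application only states that the interval for a motion feature has a higher average than the interval for a static feature.

```python
STEP_INTERVALS = {
    # Assumed step-count intervals (steps per minute) per motion feature;
    # these numbers are placeholders, not values from the patent.
    "static": (0, 5),
    "walking": (60, 140),
    "gym": (20, 200),
}

def verify_against_interval(initial_steps_per_min, motion_feature):
    """Accept the data if it falls in the feature's interval, otherwise clamp."""
    low, high = STEP_INTERVALS.get(motion_feature, (0, float("inf")))
    if low <= initial_steps_per_min <= high:
        return initial_steps_per_min
    return min(max(initial_steps_per_min, low), high)
```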
Fig. 6 is a flow diagram of another step counting data verification method based on a wearable device provided by an embodiment of the present application. As a further explanation of the above embodiments, the method comprises:
Step 610, determining the motion feature of the current scene according to the image information.
Step 620, obtaining the initial step counting data of the target user.
Step 630, when the motion feature is a static feature, judging whether the initial step counting data matches the static feature.
When the motion feature is a static feature, if the initial step counting data exceeds the step count interval corresponding to the static feature, it is determined that the initial step counting data does not match the static feature.
Step 640, if the initial step counting data does not match the static feature, deleting the initial step counting data.
Step 650, if the initial step counting data matches the static feature, determining the initial step counting data as the target step counting data.
The step counting data verification method based on a wearable device provided by this embodiment of the present application can correct erroneous step counts generated while the user is stationary, improving the accuracy of the step counting data.
Fig. 7 is a structural diagram of a step counting data verification apparatus based on a wearable device provided by an embodiment of the present application. As shown in Fig. 7, the apparatus comprises: a determining module 720, an obtaining module 710 and a verification module 730.
The determining module 720 is configured to determine the motion feature of the current scene according to image information;
the obtaining module 710 is configured to obtain the initial step counting data of the target user; and
the verification module 730 is configured to verify the initial step counting data obtained by the obtaining module 710 according to the motion feature determined by the determining module 720, to obtain target step counting data.
Further, the determining module 720 is configured to:
determine pixel displacement information according to the image information; and
determine the motion feature according to the pixel displacement information.
Further, the determining module 720 is configured to:
determine the motion state of the target user according to the image information; and
determine the motion feature according to the motion state.
Further, the determining module 720 is configured to:
determine the user's displacement speed according to the location information of the target user; and
if the user's displacement speed is less than the preset displacement threshold, determine the motion feature of the current scene according to the image information.
Further, the verification module 730 is configured to:
obtain the step counting data interval corresponding to the motion feature; and
verify the initial step counting data according to the step counting data interval.
Further, the obtaining module 710 is configured to:
obtain the initial step counting data of the target user when the wearable device is in the wearing state.
Further, the verification module 730 is configured to:
when the motion feature is a static feature, judge whether the initial step counting data matches the static feature; and
if the initial step counting data does not match the static feature, delete the initial step counting data.
In the step counting data verification apparatus based on a wearable device provided in the embodiments of the present application, first, the determining module 720 determines the motion feature of the current scene according to the image information; then, the obtaining module 710 obtains the initial step counting data of the target user; finally, the verification module 730 verifies the initial step counting data according to the motion feature to obtain target step counting data. Compared with obtaining the final step counting data only by filtering the sensor detection results, the apparatus provided by the embodiments of the present application can, when preliminary step counting data is obtained, determine the motion feature of the current scene in combination with the image information captured by the wearable device, check the initial step counting data according to that motion feature, and correct step counting data that does not match the motion feature, thereby improving the accuracy of the step counting data.
The above apparatus can execute the methods provided by all the foregoing embodiments of the present application, and has the corresponding functional modules and beneficial effects for executing those methods. For technical details not described in detail in this embodiment, refer to the methods provided by the foregoing embodiments of the present application.
Fig. 8 is a structural diagram of a terminal device provided by an embodiment of the present application; the terminal device is one implementation of the above mobile terminal. As shown in Fig. 8, the terminal may include: a housing (not shown), a memory 801, a central processing unit (Central Processing Unit, CPU) 802 (also called a processor, hereinafter referred to as CPU), a computer program stored in the memory 801 and executable on the processor 802, a circuit board (not shown) and a power supply circuit (not shown). The circuit board is disposed in the space enclosed by the housing; the CPU 802 and the memory 801 are disposed on the circuit board; the power supply circuit supplies power to each circuit or device of the terminal; the memory 801 stores executable program code; and the CPU 802 runs the program corresponding to the executable program code by reading the executable program code stored in the memory 801.
The terminal further includes: a peripheral interface 803, an RF (Radio Frequency) circuit 805, an audio circuit 806, a speaker 811, a power management chip 808, an input/output (I/O) subsystem 809, a touch screen 812, other input/control devices 810 and an external port 804. These components communicate through one or more communication buses or signal lines 807.
It should be understood that the illustrated terminal device 800 is only one example of a terminal, and the terminal device 800 may have more or fewer components than shown in the drawings, may combine two or more components, or may have a different component configuration. The various components shown in the drawings may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application-specific integrated circuits.
The terminal device provided in this embodiment is described in detail below, taking a smart phone as an example.
The memory 801 can be accessed by the CPU 802, the peripheral interface 803 and so on. The memory 801 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices or other volatile solid-state storage components.
The peripheral interface 803 can connect the input and output peripherals of the device to the CPU 802 and the memory 801.
The I/O subsystem 809 can connect the input/output peripherals of the device, such as the touch screen 812 and the other input/control devices 810, to the peripheral interface 803. The I/O subsystem 809 may include a display controller 8091 and one or more input controllers 8092 for controlling the other input/control devices 810. The one or more input controllers 8092 receive electrical signals from or send electrical signals to the other input/control devices 810, which may include physical buttons (push buttons, rocker buttons, etc.), dials, slide switches, joysticks and click wheels. It is worth noting that the input controller 8092 may be connected to any of the following: a keyboard, an infrared port, a USB interface, or a pointing device such as a mouse.
According to the working principle of the touch screen and the medium used to transmit information, the touch screen 812 may be resistive, capacitive, infrared or surface acoustic wave. According to the installation method, the touch screen 812 may be external, built-in or integrated. According to the technical principle, the touch screen 812 may be a vector pressure sensing touch screen, a resistive touch screen, a capacitive touch screen, an infrared touch screen or a surface acoustic wave touch screen.
The touch screen 812 is the input interface and output interface between the user terminal and the user, and displays visual output to the user; the visual output may include graphics, text, icons, video and so on. Optionally, the touch screen 812 sends the electrical signal triggered by the user on the touch screen (such as the electrical signal of the contact surface) to the processor 802.
The display controller 8091 in the I/O subsystem 809 receives electrical signals from or sends electrical signals to the touch screen 812. The touch screen 812 detects contact on the touch screen, and the display controller 8091 converts the detected contact into interaction with the user interface objects displayed on the touch screen 812, thereby realizing human-computer interaction. The user interface objects displayed on the touch screen 812 may be icons for running a game, icons for connecting to the corresponding network, and so on. It is worth noting that the device may also include an optical mouse, which is a touch-sensitive surface that does not display visual output, or an extension of the touch-sensitive surface formed by the touch screen.
The RF circuit 805 is mainly used to establish communication between the smart speaker and the wireless network (i.e. the network side), realizing data receiving and sending between the smart speaker and the wireless network, such as sending and receiving short messages and e-mails.
The audio circuit 806 is mainly used to receive audio data from the peripheral interface 803, convert the audio data into an electrical signal, and send the electrical signal to the speaker 811.
The speaker 811 is used to restore the voice signal received by the smart speaker from the wireless network through the RF circuit 805 into sound and play the sound to the user.
The power management chip 808 is used for supplying power to and managing the power of the hardware connected to the CPU 802, the I/O subsystem and the peripheral interface.
In this embodiment, the central processing unit 802 is configured to:
determine the motion feature of the current scene according to image information;
obtain the initial step counting data of the target user; and
verify the initial step counting data according to the motion feature to obtain target step counting data.
Further, determining the motion feature of the current scene according to image information comprises:
determining pixel displacement information according to the image information; and
determining the motion feature according to the pixel displacement information.
Further, determining the motion feature of the current scene according to image information comprises:
determining the motion state of the target user according to the image information; and
determining the motion feature according to the motion state.
Further, determining the motion feature of the current scene according to image information comprises:
determining the user's displacement speed according to the location information of the target user; and
if the user's displacement speed is less than the preset displacement threshold, determining the motion feature of the current scene according to the image information.
Further, verifying the initial step counting data according to the motion feature comprises:
obtaining the step counting data interval corresponding to the motion feature; and
verifying the initial step counting data according to the step counting data interval.
Further, obtaining the initial step counting data of the target user comprises:
obtaining the initial step counting data of the target user when the wearable device is in the wearing state.
Further, verifying the initial step counting data according to the motion feature to obtain target step counting data comprises:
when the motion feature is a static feature, judging whether the initial step counting data matches the static feature; and
if the initial step counting data does not match the static feature, deleting the initial step counting data.
On the basis of the above embodiments, this embodiment provides a wearable device. Fig. 9 is a structural diagram of a wearable device provided by an embodiment of the present application, and Fig. 10 is a pictorial diagram of a wearable device provided by an embodiment of the present application. As shown in Fig. 9 and Fig. 10, the wearable device includes: a memory 201, a processor (Central Processing Unit, CPU) 202, a display component 203, a touch panel 204, a heart rate detection module 205, a distance sensor 206, a camera 207, a bone conduction speaker 208, a microphone 209, a breathing light 213 and an acceleration sensor 212. These components communicate through one or more communication buses or signal lines 211.
It should be understood that the illustrated wearable device is only one example, and the wearable device may have more or fewer components than shown in the drawings, may combine two or more components, or may be configured with different components. The various components shown in the drawings may be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application-specific integrated circuits.
The wearable device used to count step data provided in this embodiment is described in detail below, taking smart glasses as an example.
The memory 201 can be accessed by the CPU 202. The memory 201 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic disk storage devices, flash memory devices or other volatile solid-state storage components.
The display component 203 can be used to display image data and the control interface of the operating system. The display component 203 is embedded in the frame of the wearable device; internal transmission lines 211 are provided inside the frame and are connected to the display component 203.
The touch panel 204 is disposed on the outside of at least one temple of the wearable device and is used to acquire touch data; it is connected to the CPU 202 through the internal transmission lines 211. The touch panel 204 can detect the user's finger slides and clicks, and transmits the detected data to the processor 202 for processing to generate corresponding control instructions, which may illustratively be a left-shift instruction, a right-shift instruction, a move-up instruction or a move-down instruction. Illustratively, the display component 203 can display virtual image data transmitted by the processor 202, and the virtual image can change according to the user operations detected by the touch panel 204; specifically, this may be screen switching, where the previous or next virtual image picture is switched when a left-shift or right-shift instruction is detected. When the display component 203 displays video playback information, the left-shift instruction may rewind the played content and the right-shift instruction may fast-forward it. When the display component 203 displays editable text content, the left-shift, right-shift, move-up and move-down instructions may be displacement operations on the cursor, i.e. the position of the cursor can be moved according to the user's touch operation on the touch panel. When the content displayed by the display component 203 is a game animation picture, these instructions may control an object in the game; for example, in an aircraft game, the flight direction of the aircraft can be controlled by the left-shift, right-shift, move-up and move-down instructions respectively. When the display component 203 can display video pictures of different channels, the left-shift, right-shift, move-up and move-down instructions may switch between channels, where the move-up and move-down instructions may switch to a preset channel (such as a channel commonly used by the user). When the display component 203 displays static pictures, these instructions may switch between pictures, where the left-shift instruction may switch to the previous picture, the right-shift instruction to the next picture, the move-up instruction to the previous atlas, and the move-down instruction to the next atlas. The touch panel 204 can also be used to control the display switch of the display component 203: illustratively, when the touch area of the touch panel 204 is long-pressed, the display component 203 is powered on and displays the graphic interface; when the touch area is long-pressed again, the display component 203 is powered off. After the display component 203 is powered on, the brightness or resolution of the displayed image can be adjusted by sliding up and down on the touch panel 204.
The heart rate detection module 205 is used to measure the user's heart rate data, where heart rate refers to the number of beats per minute; the heart rate detection module 205 is disposed on the inner side of the temple. Specifically, the heart rate detection module 205 may obtain human electrocardiographic data using dry electrodes in an electric pulse measurement manner and determine the heart rate according to the peak amplitudes in the electrocardiographic data; the heart rate detection module 205 may also consist of a light transmitter and a light receiver that measure the heart rate photoelectrically, in which case the heart rate detection module 205 is disposed at the bottom of the temple, at the earlobe of the human auricle. After collecting the heart rate data, the heart rate detection module 205 sends the data to the processor 202 for processing to obtain the wearer's current heart rate value. In one embodiment, after determining the user's heart rate value, the processor 202 can display the heart rate value on the display component 203 in real time. Optionally, when the processor 202 determines that the heart rate value is low (for example, less than 50) or high (for example, greater than 100), it can trigger a corresponding alarm and send the heart rate value and/or the generated warning information to a server through the communication module.
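For illustration only, a generic way to derive a heart rate value and the alarm condition mentioned above is sketched below, assuming peaks have already been detected in the electrocardiographic or photoelectric signal; this is not the module's actual algorithm.

```python
def heart_rate_bpm(peak_times_s):
    """Estimate beats per minute from the timestamps (seconds) of detected peaks."""
    if len(peak_times_s) < 2:
        return None
    intervals = [b - a for a, b in zip(peak_times_s, peak_times_s[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

def check_alarm(bpm, low=50, high=100):
    """Flag an alarm when the value falls outside the range mentioned above."""
    return bpm is not None and (bpm < low or bpm > high)
```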
The distance sensor 206 may be disposed on the frame and is used to sense the distance from the face to the frame; the distance sensor 206 may be implemented using an infrared sensing principle. Specifically, the distance sensor 206 sends the collected distance data to the processor 202, and the processor 202 controls the brightness of the display component 203 according to that data. Illustratively, when it is determined that the distance collected by the distance sensor 206 is less than 5 centimetres, the processor 202 correspondingly controls the display component 203 to be in a lit state; otherwise, it correspondingly controls the display component 203 to be in an off state.
The breathing light 213 may be disposed at the edge of the frame; when the display component 203 turns off its display, the breathing light 213 can be lit with a gradually changing brightness effect under the control of the processor 202.
The camera 207 may be disposed on the upper frame of the spectacle frame as a front camera module that collects image data in front of the user, as a rear camera module that collects user eyeball information, or as a combination of the two. Specifically, when the camera 207 collects a forward image, the collected image is sent to the processor 202 for recognition and processing, and a corresponding trigger event is triggered according to the recognition result. Illustratively, when the user wears the wearable device at home, the collected forward image is recognized; if a furniture item is recognized, a query is made as to whether a corresponding control event exists, and if so, the control interface corresponding to the control event is displayed on the display component 203, and the user can control the furniture item through the touch panel 204, the furniture item and the wearable device being connected through Bluetooth or a wireless ad hoc network. When the user wears the wearable device outdoors, a target recognition mode can be enabled. The target recognition mode can be used to recognize specific persons: the camera 207 sends the collected image to the processor 202 for face recognition, and if a preset face is recognized, a voice announcement can be made through the speaker integrated in the wearable device. The target recognition mode can also be used to identify different plants: for example, according to a touch operation on the touch panel 204, the processor 202 records the current image collected by the camera 207 and sends it through the communication module to a server for recognition; the server identifies the plant in the image, feeds back the plant name and an introduction to the wearable device, and the feedback data is displayed on the display component 203. The camera 207 can also collect images of the user's eye, such as the eyeball, and generate different control instructions by recognizing the rotation of the eyeball: illustratively, upward rotation generates a move-up control instruction, downward rotation generates a move-down control instruction, leftward rotation generates a move-left control instruction, and rightward rotation generates a move-right control instruction. Accordingly, the display component 203 can display virtual image data transmitted by the processor 202, and the virtual image can change according to the control instructions generated from the eyeball movement detected by the camera 207; specifically, this may be screen switching, where the previous or next virtual image picture is switched when a move-left or move-right control instruction is detected. When the display component 203 displays video playback information, the move-left control instruction may rewind the played content and the move-right control instruction may fast-forward it. When the display component 203 displays editable text content, the move-left, move-right, move-up and move-down control instructions may be displacement operations on the cursor. When the content displayed by the display component 203 is a game animation picture, these control instructions may control an object in the game, for example controlling the flight direction of the aircraft in an aircraft game. When the display component 203 can display video pictures of different channels, these control instructions may switch between channels, where the move-up and move-down control instructions may switch to a preset channel (such as a channel commonly used by the user). When the display component 203 displays static pictures, these control instructions may switch between pictures, where the move-left control instruction may switch to the previous picture, the move-right control instruction to the next picture, the move-up control instruction to the previous atlas, and the move-down control instruction to the next atlas.
The bone conduction speaker 208 is disposed on the inner wall of at least one temple and is used to convert the received audio signal sent by the processor 202 into a vibration signal. The bone conduction speaker 208 transmits sound to the inner ear of the human body through the skull: the electrical audio signal is converted into a vibration signal, which is transmitted through the skull to the cochlea and then perceived by the auditory nerve. Using the bone conduction speaker 208 as the sounding device reduces the hardware thickness and weight, produces no electromagnetic radiation and is not affected by electromagnetic radiation, and has the advantages of noise resistance, water resistance and leaving the ears free.
The microphone 209 may be disposed on the lower frame of the spectacle frame and is used to collect external (user, environment) sound and transmit it to the processor 202 for processing. Illustratively, the microphone 209 collects the voice uttered by the user, and the processor 202 performs voiceprint recognition on it; if the voiceprint is recognized as that of an authenticated user, subsequent voice control can be accepted accordingly. Specifically, the user can utter a voice, the microphone 209 sends the collected voice to the processor 202 for recognition, and a corresponding control instruction, such as "power on", "power off", "increase display brightness" or "decrease display brightness", is generated according to the recognition result; the processor 202 then executes the corresponding control processing according to the generated control instruction.
The acceleration sensor 212 may be a three-axis acceleration sensor. The user's initial step counting data can be obtained from the vibration amplitude and frequency of the oscillator.
The wearable device provided in the above embodiment can execute the step counting data verification method based on a wearable device provided by any embodiment of the present invention, and has the corresponding functional modules and beneficial effects for executing that method. For technical details not described in detail in the above embodiment, refer to the step counting data verification method based on a wearable device provided by any embodiment of the present invention.
An embodiment of the present application also provides a storage medium containing instructions executable by a terminal device. When executed by a processor of the terminal device, the instructions are used to execute a step counting data verification method based on a wearable device, the method comprising:
determining the motion feature of the current scene according to image information;
obtaining the initial step counting data of the target user; and
verifying the initial step counting data according to the motion feature to obtain target step counting data.
Further, determining the motion feature of the current scene according to image information comprises:
determining pixel displacement information according to the image information; and
determining the motion feature according to the pixel displacement information.
Further, determining the motion feature of the current scene according to image information comprises:
determining the motion state of the target user according to the image information; and
determining the motion feature according to the motion state.
Further, determining the motion feature of the current scene according to image information comprises:
determining the user's displacement speed according to the location information of the target user; and
if the user's displacement speed is less than the preset displacement threshold, determining the motion feature of the current scene according to the image information.
Further, verifying the initial step counting data according to the motion feature comprises:
obtaining the step counting data interval corresponding to the motion feature; and
verifying the initial step counting data according to the step counting data interval.
Further, obtaining the initial step counting data of the target user comprises:
obtaining the initial step counting data of the target user when the wearable device is in the wearing state.
Further, verifying the initial step counting data according to the motion feature to obtain target step counting data comprises:
when the motion feature is a static feature, judging whether the initial step counting data matches the static feature; and
if the initial step counting data does not match the static feature, deleting the initial step counting data.
The computer storage medium of the embodiments of the present application may adopt any combination of one or more computer-readable media. The computer-readable medium may be a computer-readable signal medium or a computer-readable storage medium. A computer-readable storage medium may be, for example, but is not limited to, an electrical, magnetic, optical, electromagnetic, infrared or semiconductor system, apparatus or device, or any combination of the above. More specific examples (a non-exhaustive list) of the computer-readable storage medium include: an electrical connection with one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In this document, a computer-readable storage medium may be any tangible medium that contains or stores a program, and the program may be used by or in combination with an instruction execution system, apparatus or device.
A computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code. Such a propagated data signal may take various forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium, which can send, propagate, or transmit a program for use by or in connection with an instruction execution system, apparatus, or device. Program code contained on a computer-readable medium may be transmitted over any suitable medium, including but not limited to wireless, wire, optical cable, RF, or any suitable combination of the above.
Computer program code for carrying out the operations of the present application may be written in one or more programming languages or a combination thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a standalone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. Where a remote computer is involved, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
Of course, in the storage medium containing computer-executable instructions provided by the embodiment of the present application, the computer-executable instructions are not limited to the step counting data verification operations based on a wearable device described above; they can also perform relevant operations in the step counting data verification method based on a wearable device provided by any embodiment of the present application.
Note that the above are only preferred embodiments of the present application and the technical principles applied. Those skilled in the art will appreciate that the present application is not limited to the specific embodiments described here, and that various obvious changes, readjustments, and substitutions can be made without departing from the protection scope of the present application. Therefore, although the present application has been described in further detail through the above embodiments, it is not limited to them; it may also include more other equivalent embodiments without departing from the concept of the present application, and the scope of the present application is determined by the scope of the appended claims.
Claims (10)
1. A step counting data verification method based on a wearable device, characterized by comprising:
determining the motion feature of the current scene according to image information;
obtaining the initial step counting data of a target user;
verifying the initial step counting data according to the motion feature to obtain target step counting data.
2. The step counting data verification method based on a wearable device according to claim 1, characterized in that determining the motion feature of the current scene according to image information comprises:
determining pixel displacement information according to the image information;
determining the motion feature according to the pixel displacement information.
3. The step counting data verification method based on a wearable device according to claim 1, characterized in that determining the motion feature of the current scene according to image information comprises:
determining the motion state of the target user according to the image information;
determining the motion feature according to the motion state.
4. The step counting data verification method based on a wearable device according to claim 1, characterized in that determining the motion feature of the current scene according to image information comprises:
determining the user's displacement velocity according to the location information of the target user;
if the user's displacement velocity is less than a preset displacement threshold, determining the motion feature of the current scene according to the image information.
5. The step counting data verification method based on a wearable device according to claim 1, characterized in that verifying the initial step counting data according to the motion feature comprises:
obtaining the step counting data interval corresponding to the motion feature;
verifying the initial step counting data according to the step counting data interval.
6. The step counting data verification method based on a wearable device according to claim 1, characterized in that obtaining the initial step counting data of the target user comprises:
obtaining the initial step counting data of the target user when the wearable device is in a worn state.
7. The step counting data verification method based on a wearable device according to any one of claims 1 to 6, characterized in that verifying the initial step counting data according to the motion feature to obtain target step counting data comprises:
when the motion feature is a static feature, judging whether the initial step counting data match the static feature;
if the initial step counting data do not match the static feature, deleting the initial step counting data.
8. A step counting data verification device based on a wearable device, characterized by comprising:
a determining module, configured to determine the motion feature of the current scene according to image information;
an obtaining module, configured to obtain the initial step counting data of a target user;
a verification module, configured to verify the initial step counting data obtained by the obtaining module according to the motion feature determined by the determining module, to obtain target step counting data.
9. A computer-readable storage medium on which a computer program is stored, characterized in that, when the program is executed by a processor, the step counting data verification method based on a wearable device according to any one of claims 1 to 7 is implemented.
10. A terminal, comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the step counting data verification method based on a wearable device according to claim 1 when executing the computer program.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811000964.0A CN109238306A (en) | 2018-08-30 | 2018-08-30 | Step counting data verification method, device, storage medium and terminal based on wearable device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811000964.0A CN109238306A (en) | 2018-08-30 | 2018-08-30 | Step counting data verification method, device, storage medium and terminal based on wearable device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN109238306A (en) | 2019-01-18 |
Family
ID=65069876
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811000964.0A Pending CN109238306A (en) | 2018-08-30 | 2018-08-30 | Step counting data verification method, device, storage medium and terminal based on wearable device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN109238306A (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2015141591A (en) * | 2014-01-29 | 2015-08-03 | キヤノン株式会社 | Device of measuring number of steps, method of measuring number of steps and program |
US20170303093A1 (en) * | 2016-04-18 | 2017-10-19 | Kyocera Corporation | Mobile device, control method, and non-transitory storage medium |
CN106598222A (en) * | 2016-11-14 | 2017-04-26 | 上海斐讯数据通信技术有限公司 | Scene mode switching method and system |
CN107314777A (en) * | 2017-06-28 | 2017-11-03 | 厦门美图移动科技有限公司 | The method and mobile terminal of a kind of dynamic setting meter step threshold value |
CN108279021A (en) * | 2018-01-26 | 2018-07-13 | 广东欧珀移动通信有限公司 | Step-recording method, electronic device and computer readable storage medium |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110132303A (en) * | 2019-05-21 | 2019-08-16 | 出门问问信息科技有限公司 | Step counting test data collection method, storage medium and electronic equipment |
CN110457411A (en) * | 2019-07-05 | 2019-11-15 | 深圳壹账通智能科技有限公司 | Methods of exhibiting, device, terminal and the storage medium of step counting data |
CN115088016A (en) * | 2020-02-05 | 2022-09-20 | Oppo广东移动通信有限公司 | Method and system for implementing dynamic input resolution for vSLAM systems |
CN115023589A (en) * | 2020-02-11 | 2022-09-06 | Oppo广东移动通信有限公司 | IMU static noise calibration scaling for VISLAM applications |
CN115023589B (en) * | 2020-02-11 | 2024-03-22 | Oppo广东移动通信有限公司 | IMU static noise calibration scaling for VISLAM applications |
US20230101609A1 (en) * | 2021-09-28 | 2023-03-30 | Siemens Healthcare Gmbh | Motion correction method and apparatus in mr imaging, mr imaging system, and storage medium |
CN117033911A (en) * | 2023-10-07 | 2023-11-10 | 深圳市魔样科技有限公司 | Step counting analysis method based on intelligent glasses data |
CN117033911B (en) * | 2023-10-07 | 2024-01-30 | 深圳市魔样科技有限公司 | Step counting analysis method based on intelligent glasses data |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN109238306A (en) | Step counting data verification method, device, storage medium and terminal based on wearable device | |
US10582289B2 (en) | Enhanced biometric control systems for detection of emergency events system and method | |
US10575086B2 (en) | System and method for sharing wireless earpieces | |
US10671231B2 (en) | Electromagnetic interference signal detection | |
US10216474B2 (en) | Variable computing engine for interactive media based upon user biometrics | |
US20170105622A1 (en) | Monitoring pulse transmissions using radar | |
US10747337B2 (en) | Mechanical detection of a touch movement using a sensor and a special surface pattern system and method | |
US20160261268A1 (en) | Processing Electromagnetic Interference Signal Using Machine Learning | |
CN109804641A (en) | The output equipment and its control method of output audio signal | |
CN109348135A (en) | Photographic method, device, storage medium and terminal device | |
CN109240639A (en) | Acquisition methods, device, storage medium and the terminal of audio data | |
US10101869B2 (en) | Identifying device associated with touch event | |
CN109040462A (en) | Stroke reminding method, apparatus, storage medium and wearable device | |
CN109120790A (en) | Call control method, device, storage medium and wearable device | |
CN108475112A (en) | Use the method and apparatus of friction sound | |
KR102067281B1 (en) | Electromagnetic interference signal detection | |
US20200034729A1 (en) | Control Method, Terminal, and System | |
CN109461124A (en) | A kind of image processing method and terminal device | |
CN109036410A (en) | Audio recognition method, device, storage medium and terminal | |
CN109241900A (en) | Control method, device, storage medium and the wearable device of wearable device | |
CN109224432A (en) | Control method, device, storage medium and the wearable device of entertainment applications | |
CN109257490A (en) | Audio-frequency processing method, device, wearable device and storage medium | |
CN109331455A (en) | Movement error correction method, device, storage medium and the terminal of human body attitude | |
CN109120864A (en) | Light filling processing method, device, storage medium and mobile terminal | |
CN109782968A (en) | A kind of interface method of adjustment and terminal device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20190118 |