CN113555132A - Multi-source data processing method, electronic device and computer-readable storage medium
- Publication number: CN113555132A
- Application number: CN202010330890.8A
- Authority: CN (China)
- Prior art keywords: data, physiological data, physiological, data segment, segment
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G16H80/00—ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0002—Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
- A61B5/0205—Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
- A61B5/02055—Simultaneously evaluating both cardiovascular condition and temperature
- A61B5/024—Detecting, measuring or recording pulse rate or heart rate
- A61B5/02438—Detecting, measuring or recording pulse rate or heart rate with portable devices, e.g. worn by the patient
- A61B5/08—Detecting, measuring or recording devices for evaluating the respiratory organs
- A61B5/0816—Measuring devices for examining respiratory frequency
- A61B5/145—Measuring characteristics of blood in vivo, e.g. gas concentration, pH value; Measuring characteristics of body fluids or tissues, e.g. interstitial fluid, cerebral tissue
- A61B5/14542—Measuring characteristics of blood in vivo for measuring blood gases
- A61B5/1455—Measuring characteristics of blood in vivo using optical sensors, e.g. spectral photometrical oximeters
- A61B5/14551—Measuring characteristics of blood in vivo using optical sensors for measuring blood gases
- A61B5/4806—Sleep evaluation
- A61B5/4815—Sleep quality
- A61B5/6805—Vests
- A61B5/681—Wristwatch-type devices
- G06F18/22—Matching criteria, e.g. proximity measures
Abstract
The embodiments of the present application disclose a multi-source data processing method, an electronic device and a computer-readable storage medium. The method includes the following steps: acquiring first physiological data collected by a first electronic device and second physiological data collected by a second electronic device; determining a first target data segment and a second target data segment whose similarity meets a preset condition, where the first target data segment is a signal segment of a preset time length intercepted from the first physiological data, and the second target data segment is a signal segment of a preset time length intercepted from the second physiological data; determining a time deviation between the first electronic device and the second electronic device according to the timestamp information corresponding to the first target data segment and the timestamp information corresponding to the second target data segment; and performing a data alignment operation on the first physiological data and the second physiological data according to the time deviation. According to the embodiments of the present application, multi-source data can be aligned without recording the time deviation between electronic devices in advance.
Description
Technical Field
The present application relates to the field of data processing, and in particular, to a multi-source data processing method, an electronic device, and a computer-readable storage medium.
Background
Multi-source data refers to data having a plurality of data sources, that is, a plurality of data sources exist for the same data object, for example, log data from multiple different network devices, or physiological data of the same person collected by multiple health monitoring devices.
Generally, the data generation time is characterized by a timestamp, and the timestamp of the data is consistent with the device time. However, the clocks of different electronic devices often deviate from each other, that is, at the same actual moment, the times on different electronic devices may differ. For example, the clocks of a mobile phone and a computer may differ by several seconds to several tens of seconds at the same moment. Therefore, before data fusion is performed on multi-source data, the multi-source data are generally aligned according to the time deviation between the different electronic devices, and data fusion is performed afterwards.
At present, multi-source data alignment requires that the time difference between two electronic devices be recorded first, and data alignment is then performed according to the pre-recorded time difference. For example, a master station and each slave station need to perform network synchronization in advance, and the time difference between each slave station and the master station is recorded. Data for which no time difference has been recorded in advance cannot be aligned; moreover, the time difference between electronic devices is not constant but varies over time, so the pre-recorded time difference needs to be updated constantly.
Disclosure of Invention
The present application provides a multi-source data processing method, an electronic device and a computer-readable storage medium, which are used to solve the problem that, in existing multi-source data alignment schemes, data alignment can be carried out only if the time difference between devices has been recorded in advance.
In a first aspect, an embodiment of the present application provides a multi-source data processing method, which first acquires first physiological data acquired by a first electronic device and second physiological data acquired by a second electronic device, where the first physiological data and the second physiological data both include timestamp information used for characterizing data generation time, and the first physiological data and the second physiological data are data of a same physiological parameter in a preset time period; then, determining a first target data segment and a second target data segment, wherein the similarity of the first target data segment and the second target data segment meets a preset condition, the first target data segment is a signal segment of a preset time length intercepted from the first physiological data, and the second target data segment is a signal segment of a preset time length intercepted from the second physiological data; then, according to the timestamp information corresponding to the first target data segment and the timestamp information corresponding to the second target data segment, determining the time deviation between the first electronic device and the second electronic device; and finally, performing data alignment operation on the first physiological data and the second physiological data according to the time deviation.
It can be seen that, according to the multi-source data processing method provided by the embodiments of the present application, based on the uniqueness of each individual's physiological parameter as it changes over time, a first target data segment and a second target data segment whose similarity meets the preset condition are determined from the first physiological data and the second physiological data, the time deviation between the two electronic devices is obtained from the timestamp information of the two target data segments, and data alignment is then performed according to the time deviation. In this way, multi-source data can be aligned without recording the time deviation between the two electronic devices in advance, and data with strong time dependence for which no device time deviation has been recorded in advance can be used normally.
By way of example and not limitation, the physiological parameter is a heart rate; in this case the first physiological data may be heart rate data acquired by a smart band, and the second physiological data may be heart rate data acquired by a polysomnography (PSG) device. Generally, the heart rate data of the same person at any given moment is unique, and if the acquisition precision, the time deviation between devices and other factors are not considered, the heart rate data and the corresponding timestamps collected by the smart band and the PSG device would be identical. In practice, however, because of the acquisition precision and the time deviation between different electronic devices, the collected heart rate data and the corresponding timestamps may not be identical. Although the acquisition precision and the device time of the smart band and the PSG device differ, the heart rate data belonging to the same actual moment (not the device time of the smart band or the PSG device) can still be found according to the similarity of the heart rate data; that is, when the similarity meets the preset condition, the first target data segment and the second target data segment correspond to heart rate data at the same actual moment. The time deviation between the smart band and the PSG device is then obtained from the timestamps of the first target data segment and the second target data segment.
For example, the timestamp information corresponding to the first target data segment is 10s to 15s, the timestamp information corresponding to the second target data segment is 12s to 17s, and at this time, the time offset between the smart band and the PSG device is 2 s.
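By way of illustration only, a minimal Python sketch of this step is given below: the inter-device offset is taken from the matched segments' start timestamps and used to shift the second stream onto the first device's time base. The variable names and sample values are hypothetical and are not part of the claimed method.

```python
# Minimal sketch (not the claimed implementation): derive the inter-device time
# offset from the start timestamps of the two matched target data segments and
# shift the second stream onto the first device's time base.
first_segment_start = 10.0   # s, timestamp info of the first target data segment
second_segment_start = 12.0  # s, timestamp info of the second target data segment

time_offset = second_segment_start - first_segment_start  # 2 s in the example above

# second_physiological_data is assumed to be a list of (timestamp, value) samples
second_physiological_data = [(12.0, 72), (13.0, 74), (14.0, 73)]
aligned_second_data = [(t - time_offset, v) for t, v in second_physiological_data]

print(time_offset)          # 2.0
print(aligned_second_data)  # [(10.0, 72), (11.0, 74), (12.0, 73)]
```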
In a possible implementation manner of the first aspect, the determining the first target data segment and the second target data segment whose similarity satisfies the preset condition may include, but is not limited to, the following two manners.
The first mode is as follows:
respectively setting a first time sliding window with a preset time length in the first physiological data and a second time sliding window with the preset time length in the second physiological data;
calculating a similarity evaluation index between a first data segment corresponding to the first time sliding window and a second data segment corresponding to the second time sliding window;
determining whether the similarity between the first data segment and the second data segment meets a preset condition or not according to the similarity evaluation index;
if the similarity meets a preset condition, taking the first data segment as a first target data segment, and taking the second data segment as a second target data segment;
if the similarity does not meet the preset conditions, after sliding one of the time sliding windows, returning to the step of calculating a similarity evaluation index between a first data segment corresponding to the first time sliding window and a second data segment corresponding to the second time sliding window, and determining whether the similarity between the first data segment and the second data segment meets the preset conditions according to the similarity evaluation index until the first physiological data and the second physiological data are traversed or the data segments with the similarity meeting the preset conditions are found;
the second mode is as follows:
intercepting at least one first data segment with a preset time length from first physiological data, and intercepting at least one second data segment with the preset time length from second physiological data;
respectively calculating similarity evaluation indexes between each first data segment and each second data segment;
determining whether the similarity between the first data segment and the second data segment meets a preset condition or not according to the similarity evaluation index;
and taking the first data segment corresponding to the similarity meeting the preset condition as a first target data segment, and taking the corresponding second data segment as a second target data segment.
It should be noted that, when calculating the similarity evaluation index between the first data segment and the second data segment, data preprocessing, such as filtering and Fourier transform, may first be performed on the first data segment and the second data segment, and the similarity evaluation index may then be calculated from the preprocessed data segments; alternatively, the data preprocessing may be performed on the first data segment and the second data segment to generate a first curve corresponding to the first data segment and a second curve corresponding to the second data segment, and the similarity evaluation index between the first curve and the second curve may then be calculated. The first curve and the second curve are curves of the physiological parameter varying with the device time of the original device; for example, the first curve is a curve of the heart rate varying with the device time of the first electronic device.
The similarity evaluation index is an index used to evaluate the similarity between two data segments, and includes but is not limited to one or more metrics such as the Euclidean distance and the Manhattan distance.
Further, if the similarity evaluation index includes a correlation coefficient and a distance metric index, the distance metric index is used to represent the distance between samples, such as the Euclidean distance or the Manhattan distance;
at this time, the process of determining whether the similarity between the first data segment and the second data segment satisfies the preset condition according to the similarity metric index may include:
if the correlation coefficient between the first data segment and the second data segment is larger than a first preset threshold value and the distance measurement index is smaller than a second preset threshold value, determining that the similarity between the first data segment and the second data segment meets a preset condition;
and if the correlation coefficient between the first data segment and the second data segment is smaller than or equal to a first preset threshold value and/or the distance metric index is larger than or equal to a second preset threshold value, determining that the similarity between the first data segment and the second data segment does not meet a preset condition.
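By way of example and not limitation, the following Python sketch illustrates the first mode together with the correlation-plus-distance criterion above, using the Pearson correlation coefficient and the Euclidean distance; the window length, thresholds and toy heart-rate series are hypothetical, and a practical implementation would typically slide the windows with a configurable step and preprocess the segments first.

```python
import numpy as np

def find_matching_segments(first_data, second_data, window_len,
                           corr_threshold=0.9, dist_threshold=5.0):
    """Slide a window over both series and return the first pair of segments whose
    correlation coefficient exceeds corr_threshold and whose Euclidean distance is
    below dist_threshold, together with their start indices; None if no pair is found."""
    for i in range(len(first_data) - window_len + 1):
        seg1 = np.asarray(first_data[i:i + window_len], dtype=float)
        for j in range(len(second_data) - window_len + 1):
            seg2 = np.asarray(second_data[j:j + window_len], dtype=float)
            corr = np.corrcoef(seg1, seg2)[0, 1]
            dist = np.linalg.norm(seg1 - seg2)
            if corr > corr_threshold and dist < dist_threshold:
                return i, j, seg1, seg2   # first/second target data segments
    return None  # no segment pair meets the preset condition

# toy heart-rate-like series sampled once per second; the second lags the first by two samples
first = [70, 72, 75, 74, 71, 69, 68, 70]
second = [66, 67, 70, 72, 75, 74, 71, 69]
print(find_matching_segments(first, second, window_len=5))  # match at i=0, j=2
```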
In a possible implementation manner of the first aspect, whether the first physiological data and the second physiological data originate from the same individual may also be determined according to whether two target data segments whose similarity satisfies the preset condition can be found. That is, the method may further include: if a first target data segment and a second target data segment whose similarity meets the preset condition exist, determining that the first physiological data and the second physiological data belong to the same individual; and if no first target data segment and second target data segment whose similarity meets the preset condition exist, determining that the first physiological data and the second physiological data do not belong to the same individual.
For example, the first physiological data is heart rate data acquired by the smart band and the second physiological data is heart rate data acquired by the PSG device; if two target data segments whose similarity meets the preset condition can be found, the heart rate data acquired by the smart band and the PSG device are considered to be heart rate data of the same individual. Otherwise, the heart rate data acquired by the smart band and the PSG device are considered to come from two different people.
It is noted that in this implementation, prior to calculating the time offset between the first physiological data and the second physiological data, it is assumed that the two physiological data originate from the same individual. When two target data segments with similarity meeting the preset condition cannot be found, the two physiological data are not considered to be from the same individual.
In other embodiments, it may be determined whether the two physiological data originate from the same individual, and if the two physiological data originate from the same individual, the time offset between the two devices may be calculated.
That is to say, in a possible implementation manner of the first aspect, before determining the first target data segment and the second target data segment whose similarity satisfies the preset condition, the method may further include:
calculating similarity measurement characteristics of data corresponding to the overlapped time periods in the first physiological data and the second physiological data;
extracting linear features, nonlinear features and discrete features of the first physiological data and the second physiological data;
performing linear regression analysis according to the similarity measurement characteristic, the linear characteristic, the nonlinear characteristic and the discrete characteristic to obtain a difference evaluation result of the first physiological data and the second physiological data;
if the difference evaluation result is smaller than a third preset threshold value, determining that the first physiological data and the second physiological data belong to the same individual;
and if the difference evaluation result is larger than a third preset threshold value, determining that the first physiological data and the second physiological data do not belong to the same individual.
In this implementation, not only is the similarity metric feature between the two physiological data calculated, but linear, nonlinear and discrete features of the data are also introduced. Linear regression analysis is then performed on these features to obtain an evaluation result for evaluating the difference between the two physiological data: if the difference is large, the two physiological data are considered not to belong to the same individual; conversely, if the difference is small, the two physiological data are considered to belong to the same individual. The similarity metric features may include, but are not limited to, the Euclidean distance and the like; the linear correlation features may include, but are not limited to, the Vector Length Index (VLI), the Vector Angle Index (VAI) and the like; and the discrete features may include, but are not limited to, the overall variance and mean of the data and the like.
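By way of example and not limitation, the following Python sketch shows one way such a difference evaluation could be assembled, using a small hand-picked feature set and pre-fitted regression weights; the feature choices (Euclidean distance, mean/variance differences, and a correlation-based stand-in for the linear features), the weights and the third preset threshold are all hypothetical.

```python
import numpy as np

def difference_score(x, y, weights, bias):
    """Combine a few similarity/statistical features of two overlapping series with
    pre-fitted linear-regression weights into one difference evaluation result
    (lower means the two series are more likely to come from the same individual)."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    features = np.array([
        np.linalg.norm(x - y),          # similarity metric feature: Euclidean distance
        abs(x.mean() - y.mean()),       # discrete feature: difference of means
        abs(x.var() - y.var()),         # discrete feature: difference of variances
        1.0 - np.corrcoef(x, y)[0, 1],  # stand-in for a linear-correlation feature
    ])
    return float(features @ weights + bias)

# hypothetical pre-fitted regression parameters and the third preset threshold
weights = np.array([0.05, 0.3, 0.1, 2.0])
bias = 0.0
third_threshold = 1.0

score = difference_score([70, 72, 75, 74, 71], [71, 72, 74, 74, 72], weights, bias)
print(score, "same individual" if score < third_threshold else "different individuals")
```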
In a possible implementation manner of the first aspect, if it is determined that the first physiological data and the second physiological data belong to the same individual, the two physiological data may be subjected to data fusion to obtain data-fused physiological data. That is, after performing a data alignment operation on the first physiological data and the second physiological data according to the time offset, the method further comprises: and performing data fusion on the first physiological data and the second physiological data to obtain fused physiological data.
It should be noted that, if data alignment and data fusion operations need to be performed on more than two sets of physiological data, pairwise data alignment may first be performed according to the above data alignment scheme, and after all the physiological data have been aligned, data fusion is performed on the plurality of physiological data to obtain the fused physiological data.
The process of data fusion may be as follows: through data quality comparison, the data segments with higher data quality are retained; and for missing data segments, the data segments of the corresponding time periods in the other physiological data can be used for multi-source compensation. Specifically, in a possible implementation manner of the first aspect, performing data fusion on the first physiological data and the second physiological data to obtain the fused physiological data may include: comparing the quality of the first physiological data with that of the second physiological data through a data quality evaluation index to obtain, for each time period, a third data segment with the higher quality in that period; if one of the first physiological data and the second physiological data has a missing data segment, obtaining a fourth data segment corresponding to the corresponding time period in the other physiological data; and combining the third data segments and the fourth data segments in time order to obtain the fused physiological data.
The data quality evaluation indexes can include, but are not limited to, integrity, consistency, accuracy, redundancy and the like of data. In addition, the data quality may also be evaluated by data availability, data volume, and the like.
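A minimal sketch of this fusion step is shown below, assuming each (already aligned) stream is represented as a dict mapping a time slot to a (segment, quality_score) pair, where the quality score stands in for the completeness/consistency/accuracy indices mentioned above; all names and values are hypothetical.

```python
def fuse(first_stream, second_stream):
    """For each time slot keep the higher-quality segment; a slot missing from one
    stream is filled from the other (multi-source compensation)."""
    fused = {}
    for slot in sorted(set(first_stream) | set(second_stream)):
        a = first_stream.get(slot)
        b = second_stream.get(slot)
        if a is None:
            fused[slot] = b[0]   # segment missing in the first stream: compensate from the second
        elif b is None:
            fused[slot] = a[0]   # segment missing in the second stream: compensate from the first
        else:
            fused[slot] = a[0] if a[1] >= b[1] else b[0]  # keep the higher-quality segment
    return fused

# toy example: slot 0 is missing from the second stream, slot 2 from the first
first_stream = {0: ([70, 72], 0.9), 1: ([75, 74], 0.6)}
second_stream = {1: ([74, 74], 0.8), 2: ([71, 69], 0.7)}
print(fuse(first_stream, second_stream))  # {0: [70, 72], 1: [74, 74], 2: [71, 69]}
```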
In a possible implementation manner of the first aspect, after the fused physiological data is obtained, a corresponding physiological data analysis report may further be generated, and the physiological data analysis report may in turn be displayed on the user terminal. That is, after obtaining the fused physiological data, the method may further include: generating a corresponding physiological data analysis report according to the fused physiological data.
For example, the first physiological data is heart rate data collected by the smart band, the second physiological data is heart rate data collected by the PSG device, and the heart rate data collected by the smart band and the PSG device are both transmitted to the user's mobile phone. The mobile phone can align the heart rate data collected by the smart band and the PSG device, then perform data fusion, generate a heart rate data analysis report from the fused heart rate data, and display the report to the user.
In a possible implementation manner of the first aspect, after data alignment and before data fusion, secondary similarity matching may be performed according to the physiological data after data alignment, and then one physiological data analysis report or multiple physiological data analysis reports may be selected to be generated according to a result of the similarity matching. That is, before the data fusion is performed on the first physiological data and the second physiological data to obtain fused physiological data, the method may further include:
similarity calculation is carried out on the first physiological data and the second physiological data after data alignment, and target similarity is obtained;
if the target similarity is larger than a fourth preset threshold, performing data fusion on the first physiological data and the second physiological data to obtain fused physiological data;
and if the target similarity is smaller than a fourth preset threshold, generating a physiological data analysis report of the first physiological data and a physiological data analysis report of the second physiological data respectively.
In this implementation, the secondary similarity matching is performed on the physiological data after the data alignment, which can further improve the accuracy of the data fusion and the generated data analysis report.
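By way of illustration, a toy Python sketch of this secondary matching gate is given below, assuming a Pearson correlation over the aligned overlap is used as the target similarity; the fourth preset threshold and the data are hypothetical.

```python
import numpy as np

def secondary_gate(first_aligned, second_aligned, fourth_threshold=0.8):
    """Recompute the similarity of the aligned data and decide whether to fuse."""
    target_similarity = np.corrcoef(first_aligned, second_aligned)[0, 1]
    if target_similarity > fourth_threshold:
        return "fuse and generate one report"
    return "generate one report per data source"

print(secondary_gate([70, 72, 75, 74], [71, 72, 74, 74]))  # "fuse and generate one report"
```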
In one possible implementation of the first aspect, the physiological parameter is heart rate, respiratory rate, body temperature, or blood oxygen.
In a second aspect, an embodiment of the present application provides a multi-source data processing apparatus, which may include:
the physiological data acquisition module is used for acquiring first physiological data acquired by first electronic equipment and second physiological data acquired by second electronic equipment, wherein the first physiological data and the second physiological data both comprise timestamp information used for representing data generation time, and the first physiological data and the second physiological data are data of the same physiological parameter in a preset time period;
the target data segment determining module is used for determining a first target data segment and a second target data segment, the similarity of which meets a preset condition, wherein the first target data segment is a signal segment which is intercepted from first physiological data and has a preset time length, and the second target data segment is a signal segment which is intercepted from second physiological data and has a preset time length;
the inter-device time deviation determining module is used for determining the time deviation between the first electronic device and the second electronic device according to the timestamp information corresponding to the first target data segment and the timestamp information corresponding to the second target data segment;
and the data alignment module is used for performing data alignment operation on the first physiological data and the second physiological data according to the time deviation.
In a possible implementation manner of the second aspect, the target data segment determining module is specifically configured to:
respectively setting a first time sliding window with a preset time length in the first physiological data and a second time sliding window with the preset time length in the second physiological data;
calculating a similarity evaluation index between a first data segment corresponding to the first time sliding window and a second data segment corresponding to the second time sliding window;
determining whether the similarity between the first data segment and the second data segment meets a preset condition or not according to the similarity evaluation index;
if the similarity meets a preset condition, taking the first data segment as a first target data segment, and taking the second data segment as a second target data segment;
if the similarity does not meet the preset conditions, after sliding one of the time sliding windows, returning to the step of calculating a similarity evaluation index between a first data segment corresponding to the first time sliding window and a second data segment corresponding to the second time sliding window, and determining whether the similarity between the first data segment and the second data segment meets the preset conditions according to the similarity evaluation index until the first physiological data and the second physiological data are traversed or the data segments with the similarity meeting the preset conditions are found;
or,
intercepting at least one first data segment with a preset time length from first physiological data, and intercepting at least one second data segment with the preset time length from second physiological data;
respectively calculating similarity evaluation indexes between each first data segment and each second data segment;
determining whether the similarity between the first data segment and the second data segment meets a preset condition or not according to the similarity evaluation index;
and taking the first data segment corresponding to the similarity meeting the preset condition as a first target data segment, and taking the corresponding second data segment as a second target data segment.
In a possible implementation manner of the second aspect, if the similarity evaluation index includes a correlation coefficient and a distance metric index, the distance metric index is used to characterize the distance between samples;
the target data segment determination module is specifically configured to:
if the correlation coefficient between the first data segment and the second data segment is larger than a first preset threshold value and the distance measurement index is smaller than a second preset threshold value, determining that the similarity between the first data segment and the second data segment meets a preset condition;
and if the correlation coefficient between the first data segment and the second data segment is smaller than or equal to a first preset threshold value and/or the distance metric index is larger than or equal to a second preset threshold value, determining that the similarity between the first data segment and the second data segment does not meet a preset condition.
In a possible implementation manner of the second aspect, the apparatus may further include:
the first judgment module is used for determining that the first physiological data and the second physiological data belong to the same individual if a first target data segment and a second target data segment with similarity meeting a preset condition exist; and if the first target data segment and the second target data segment with the similarity meeting the preset condition do not exist, determining that the first physiological data and the second physiological data do not belong to the same individual.
In a possible implementation manner of the second aspect, the apparatus may further include:
the second judgment module is used for calculating the similarity measurement characteristics of data corresponding to the overlapped time periods in the first physiological data and the second physiological data; extracting linear features, nonlinear features and discrete features of the first physiological data and the second physiological data; performing linear regression analysis according to the similarity measurement characteristic, the linear characteristic, the nonlinear characteristic and the discrete characteristic to obtain a difference evaluation result of the first physiological data and the second physiological data; if the difference evaluation result is smaller than a third preset threshold value, determining that the first physiological data and the second physiological data belong to the same individual; and if the difference evaluation result is larger than a third preset threshold value, determining that the first physiological data and the second physiological data do not belong to the same individual.
In a possible implementation manner of the second aspect, if it is determined that the first physiological data and the second physiological data belong to the same individual, the apparatus may further include:
and the data fusion module is used for carrying out data fusion on the first physiological data and the second physiological data to obtain fused physiological data.
In a possible implementation manner of the second aspect, the data fusion module is specifically configured to:
comparing the quality of the first physiological data with that of the second physiological data through the data quality evaluation index to obtain a third data segment with higher quality in the same time period;
if one physiological data in the first physiological data and the second physiological data has a missing data segment, obtaining a fourth data segment corresponding to the corresponding time segment in the other physiological data,
and combining the third data segment and the fourth data segment according to the time sequence to obtain the fused physiological data.
In a possible implementation manner of the second aspect, the apparatus may further include:
and the report generation module is used for generating a corresponding physiological data analysis report according to the fused physiological data.
In a possible implementation manner of the second aspect, the apparatus may further include:
the secondary similarity matching module is used for carrying out similarity calculation on the first physiological data and the second physiological data after data alignment to obtain target similarity; if the target similarity is larger than a fourth preset threshold, performing data fusion on the first physiological data and the second physiological data to obtain fused physiological data; and if the target similarity is smaller than a fourth preset threshold, generating a physiological data analysis report of the first physiological data and a physiological data analysis report of the second physiological data respectively.
In one possible implementation of the second aspect, the physiological parameter may be, but is not limited to, heart rate, respiratory rate, body temperature, or blood oxygen, etc.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor, when executing the computer program, implements the method according to any one of the above first aspects.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the method according to any one of the above first aspects.
In a fifth aspect, embodiments of the present application provide a chip system, where the chip system includes a processor, the processor is coupled with a memory, and the processor executes a computer program stored in the memory to implement the method according to any one of the above first aspects. The chip system can be a single chip or a chip module consisting of a plurality of chips.
In a sixth aspect, embodiments of the present application provide a computer program product, which, when run on an electronic device, causes the electronic device to perform the method of any one of the above first aspects.
It is understood that the beneficial effects of the second to sixth aspects can be seen from the description of the first aspect, and are not described herein again.
Drawings
Fig. 1 is a schematic diagram of a possible scenario provided in an embodiment of the present application;
fig. 2 is a schematic diagram of another possible scenario provided in the embodiment of the present application;
FIG. 3 is a schematic block diagram of a flow chart of a multi-source data processing method according to an embodiment of the present application;
fig. 4 is a schematic block diagram of a flow of a similarity matching process provided in an embodiment of the present application;
FIG. 5 is a schematic block diagram illustrating another flowchart of a multi-source data processing method according to an embodiment of the present application;
FIG. 6(a) is a schematic diagram of S1(t) provided in an embodiment of the present application;
FIG. 6(b) is a schematic diagram of S2(t) provided in an embodiment of the present application;
FIG. 7 is a schematic diagram illustrating misalignment of signals provided by an embodiment of the present application;
FIG. 8 is a schematic diagram of signal alignment provided by an embodiment of the present application;
FIG. 9 is a schematic diagram of a multi-source data processing method according to an embodiment of the present application;
FIG. 10 is a block diagram of a multi-source data processing apparatus according to an embodiment of the present application;
fig. 11 is a schematic block diagram of a hardware structure of an electronic device according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application.
The following first illustrates a system architecture and scenarios that may be involved in embodiments of the present application.
Referring to fig. 1, a schematic view of a possible scenario is shown, as shown in fig. 1, an electronic device 11 is communicatively connected to a plurality of electronic devices 12 with physiological data acquisition functions. The electronic device 11 may be, but is not limited to, a mobile phone, a tablet, a computer, or the like. The electronic device 12 is an electronic device having a physiological data acquisition function, and the number of the electronic devices 12 may be arbitrary. The physiological data may include, but is not limited to, one or more of heart rate, body temperature, respiratory rate, and blood oxygen of the human body, i.e., the electronic device 12 may acquire one or more of heart rate, body temperature, respiratory rate, and blood oxygen. The electronic device 12 may be a wearable device, e.g., a smart bracelet; it may also be a non-wearable device, e.g. a PSG device. By way of example and not limitation, electronic device 12 may include, but is not limited to, one or more of a smart bracelet, a PSG device, a smart watch, and a wearable garment, which may be, for example, a vest provided with heart rate acquisition sensors and blood oxygen acquisition sensors.
By way of example and not limitation, the electronic device 11 shown in fig. 1 is a mobile phone, and the electronic devices 12 include a smart band and a PSG device. The smart band and the PSG device have functions such as heart rate acquisition, body temperature acquisition, respiratory rate acquisition and blood oxygen acquisition, and can also monitor the user's sleep quality and the like. The physiological data collected by the smart band and the PSG device can be transmitted to the mobile phone by short-range wireless communication, and the mobile phone receives the physiological data transmitted by the smart band and the PSG device and processes it accordingly. The short-range wireless communication mode may be, but is not limited to, Bluetooth Low Energy, infrared, Wi-Fi peer-to-peer, and the like. Of course, in some other cases, the PSG device may also transmit the acquired physiological data to the mobile phone through a wireless router; for example, in a smart home scenario, both the PSG device and the mobile phone are connected to the wireless router in the home, and the PSG device can forward the acquired physiological data to the mobile phone through the wireless router. That is, the communication mode between the electronic device 11 and the electronic device 12 may differ in different application scenarios.
The smart band and the PSG device each have their own device time, and in general the device times of the two are not completely synchronized. For example, at the same actual moment, the device time on the smart band is 12:00:30 on January 1, 2020, while the device time on the PSG device is 12:00:35 on January 1, 2020. That is, the times on different electronic devices are not synchronized. In addition, the time deviation between different electronic devices is not constant over time. For example, at the current moment, the time deviation between the smart band and the PSG device is 5 s; because the clock precision of the smart band differs from that of the PSG device, the accumulated time deviation of the two devices also changes as time goes on, and 3 months later the time deviation between the smart band and the PSG device may have become 7 s.
When collecting physiological data, the smart band and the PSG device each set a timestamp for the physiological data according to their own device time, and the timestamp is used to characterize the generation time of the physiological data. A timestamp is typically a sequence of characters that uniquely identifies a particular moment in time. For example, the timestamp 1577861668 corresponds to 14:54:28 on January 1, 2020. That is, if the smart band generates physiological data when its device time is 14:54:28 on January 1, 2020, the physiological data is stamped with 1577861668.
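As a quick check of that example, and assuming the stamp is a Unix epoch value interpreted in UTC+8 (China Standard Time), a short Python snippet:

```python
# Convert the timestamp from the example above, assuming it is a Unix epoch value
# and that the device time is China Standard Time (UTC+8).
from datetime import datetime, timedelta, timezone

cst = timezone(timedelta(hours=8))
print(datetime.fromtimestamp(1577861668, tz=cst))  # 2020-01-01 14:54:28+08:00
```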
Since the device times of the smart band and the PSG device are not synchronized, the timestamps for the physiological data at the same time are not consistent. Therefore, when the mobile phone receives the physiological data collected by the smart band and the PSG device, the time deviation between the smart band and the PSG device needs to be determined first. Then, the mobile phone adjusts the time stamps of the physiological data acquired by the two devices to be synchronous according to the time deviation between the two devices so as to align the physiological data between the two devices, and finally, data fusion and other operations can be performed according to the physiological data after data alignment.
Based on the uniqueness of an individual's physiological data as it changes over time, the data segments belonging to the same actual moment (not the same device time) can be found from the physiological data of the two devices according to the similarity of the physiological data, and the time deviation between the two devices can then be determined from the timestamps corresponding to the physiological data at that moment.
For example, the physiological data is heart rate data, and for the same individual the heart rate data at the same actual moment (not the same device time) is the same. In other words, if the time deviation between the devices and differences in acquisition precision are not considered, that is, assuming that the smart band and the PSG device are synchronized in time and have the same acquisition precision, the heart rate data acquired by the smart band and the PSG device would be the same, and the curves of the heart rate data against the device time of the original devices (e.g. the smart band and the PSG device) would be identical. In practical applications, however, there may be an unsynchronized time deviation between the devices, and the acquisition precision may also differ, so the heart rate data acquired by the smart band and the PSG device may differ, that is, the curves of the heart rate data against the device time of the original devices may differ.
The time asynchronism between the electronic devices may cause the heart rate data at the same actual moment to carry different timestamps. For example, the actual time is 12:12:30; at this moment, the device time on the smart band is 12:12:35, so the timestamp information corresponding to the heart rate data acquired by the smart band at this actual time is 12:12:35. On the other hand, since the device time on the PSG device is 12:12:33, the timestamp information corresponding to the heart rate data acquired by the PSG device at this actual time is 12:12:33. In fact, the heart rate data corresponding to 12:12:35 on the smart band and the heart rate data corresponding to 12:12:33 on the PSG device are heart rate data at the same actual moment, but because of the time deviation between the two devices, the heart rate data at the same moment carry different timestamps; put differently, when the timestamps of the two devices are the same, the heart rate data are not data from the same actual moment.
In addition, differences in acquisition precision between the electronic devices may cause the heart rate data acquired by the smart band and the PSG device at the same actual moment to be not exactly the same. However, although the acquisition precision means the heart rate data of the smart band and the PSG device at the same moment are not completely identical, the similarity of the heart rate data of the two devices at the same moment still satisfies a certain condition.
Based on this, heart rate data segments covering the same actual time range can be found according to the similarity of the heart rate data collected by the two devices, and the time deviation between the two devices can then be determined from the timestamps of the two data segments. For example, suppose the two heart rate data segments found according to the similarity are the heart rate data from 12:12:33 to 12:12:38 on the smart band and the heart rate data from 12:12:35 to 12:12:40 on the PSG device. The time offset between the smart band and the PSG device can then be obtained from any pair of corresponding points in the two segments. Specifically, the heart rate data at 12:12:38 on the smart band and the heart rate data at 12:12:40 on the PSG device are heart rate data at the same actual moment, so the time deviation between the smart band and the PSG device is 2 s.
It should be noted that, if time deviations among three or more electronic devices 12 need to be determined, pairwise calculations may be performed in sequence in the above-mentioned manner until the time deviations among the electronic devices are determined. For example, when the electronic devices 12 include the smart band, the PSG device, and a smart watch, the time offset between the smart band and the PSG device is determined first, and then the time offset between the smart band (or the PSG device) and the smart watch is determined, so as to obtain the time offsets among the smart band, the PSG device, and the smart watch.
In addition, it should be noted that the time values, timestamp information, and the like listed above are only for convenience of description and calculation, and do not limit the scheme to those specific time values, timestamps, and the like.
After the mobile phone determines the time deviation between the smart bracelet and the PSG device according to the principle and the process, the physiological data collected by the two devices can be aligned according to the time deviation so as to adjust the time stamps of the two physiological data to be synchronous.
In addition, when the mobile phone searches the physiological data of the two devices for data segments whose similarity meets the preset condition, whether the physiological data acquired by the two devices come from the same individual can be determined according to whether such data segments can be found. Specifically, when no data segments whose similarity meets the preset condition can be found, the physiological data collected by the smart band and the physiological data collected by the PSG device are considered not to belong to the same person. Conversely, when data segments whose similarity meets the preset condition can be found, the physiological data collected by the smart band and the physiological data collected by the PSG device are considered to belong to the same person.
After the mobile phone performs data alignment on the physiological data collected by the smart band and the PSG device, if the two sets of physiological data belong to the same person, the mobile phone performs data fusion on them to obtain fused physiological data, generates a corresponding data analysis report according to the fused physiological data, and displays the data analysis report to the user. If the physiological data collected by the smart band and the PSG device do not belong to the same person, two data analysis reports can be generated separately.
It should be noted that the above-mentioned scheme of determining whether the physiological data acquired by the two devices belong to the same individual according to whether data segments meeting the preset condition can be found assumes in advance that the physiological data acquired by the two devices belong to the same individual. In other embodiments, whether the physiological data acquired by the two devices belong to the same individual may be determined first, and only after it is determined that they belong to the same individual are two data segments whose similarity satisfies the preset condition selected from the physiological data of the two devices. In this case, linear regression analysis can be performed on similarity measurement features, linear features, nonlinear features, discrete features, and the like of the physiological data acquired by the two devices to obtain a difference evaluation result. If the difference between the two sets of physiological data is smaller than a preset threshold, the physiological data acquired by the two devices are considered to come from the same individual; otherwise, if the difference is larger than the preset threshold, they are not considered to come from the same individual.
After the mobile phone performs data alignment, data fusion, and other operations on the physiological data acquired by the smart band and the PSG device, a physiological data analysis report can then be generated and displayed to the user. For example, if the smart band and the PSG device collect heart rate data, the mobile phone can finally display the curve of this heart rate data over time, together with the corresponding analysis report.
Up to this point, the physiological data processing process has been described with reference to fig. 1, taking the mobile phone, the smart band, and the PSG device as examples.
In a specific application, the electronic device 11 may also be a computer or a tablet, and the electronic device 12 may also include wearable clothing, a smart watch, and the like. If the time deviation of the physiological data acquired by more than three electronic devices needs to be determined, two-by-two calculation can be performed according to the above mentioned process to obtain the time deviation among more than three electronic devices. In addition, the physiological data collected by different electronic devices is data of the same physiological parameter, which may be heart rate, body temperature, respiratory rate or blood oxygen. For example, the physiological data collected by the smart band and the PSG device are both heart rate data.
In the scenario corresponding to fig. 1, the physiological data collected by the devices such as the smart band and the PSG is transmitted to the user terminal through the short-range wireless communication technology, or transmitted to the user terminal through the wireless router. In other scenarios, physiological data collected by devices such as the smart band and the PSG may be uploaded to the cloud server, and the cloud server performs operations such as data alignment and data fusion on the physiological data, which will be exemplarily described below with reference to another possible scenario diagram shown in fig. 2.
As shown in fig. 2, the cloud server 21 may be communicatively connected to the electronic device 22 through the internet or the like, and the electronic device 22 may be communicatively connected to the electronic device 23 through short-range wireless communication. For example, the electronic device 22 may be a mobile phone, and the electronic device 23 may include, but is not limited to, wearable devices and non-wearable devices, each of which is capable of detecting a health or physiological parameter. By way of example and not limitation, the electronic device 23 includes a smart band and a PSG device. In this case, devices such as the smart band and the PSG device first transmit the collected physiological data to the mobile phone, and the mobile phone then transmits the physiological data to the cloud server.
In other cases, the cloud server 21 may also be directly connected to the electronic device 23 in a communication manner. For example, in the smart home scene, the smart band and the PSG device may both be connected to the wireless router, and the physiological data collected by the smart band and the PSG device may be uploaded to the cloud server through the wireless router.
It should be noted that, with the spread of full-scenario applications, each user may own terminal devices of various forms, and more and more scenarios involve multi-source data and multi-source data fusion. In some scenarios, if data alignment is not performed on the multi-source data and data from multiple devices is fused directly, the beneficial effects of the multi-source data may cancel each other out, so that subsequent feature extraction, model building, model prediction, and the like deviate severely from the actual situation.
In the field of mobile health, health monitoring devices are becoming increasingly intelligent and smart wearable devices are becoming increasingly widespread, so a user can collect human physiological parameters with wearable devices and household health monitoring devices. A user may also need to check the physiological parameters of multiple individuals; for example, a user who needs to pay attention to the physical health of several family members may need the physiological data of each family member to be transmitted to the user's mobile phone, or transmitted to the cloud server, which processes the physiological data of the family members and then pushes the results to the user's mobile phone. Alternatively, the same user may monitor his or her own physiological parameters with several different wearable devices. In these cases, it is necessary to identify whether the physiological data come from the same individual and to perform operations such as data alignment and data fusion on the multi-source data.
It should be noted that fig. 1 and fig. 2 are only some system architectures and scenarios that may be involved, and in a specific application, the multi-source data processing scheme of the embodiment of the present application may also be applied to other system architectures and scenarios.
After introducing the system architecture and the scenarios that may be involved in the embodiments of the present application, the following describes the solutions provided in the embodiments of the present application in detail.
Referring to FIG. 3, a schematic block flow diagram of a multi-source data processing method is shown, which may include the steps of:
step S301, acquiring first physiological data acquired by a first electronic device and second physiological data acquired by a second electronic device, where the first physiological data and the second physiological data both include timestamp information used for representing data generation time, and the first physiological data and the second physiological data are data of the same physiological parameter in a preset time period.
Specifically, a third electronic device acquires the first physiological data acquired by the first electronic device and the second physiological data acquired by the second electronic device. The third electronic device may be a user terminal; for example, the third electronic device is a mobile phone, the first electronic device is a smart band, and the second electronic device is a PSG device. The third electronic device may also be a cloud server or another terminal device.
The first physiological data and the second physiological data both carry timestamp information, which can represent the generation time of the physiological data; generally, each piece of data corresponds to one piece of timestamp information. The timestamp information is consistent with the time of the original device, that is, the timestamp information of each piece of data in the first physiological data is consistent with the device time of the first electronic device, and the timestamp information of each piece of data in the second physiological data is consistent with the device time of the second electronic device.
In the embodiment of the application, the timestamp information can be embodied as a timestamp, that is, each data carries a timestamp; it can also be embodied as information that can characterize the timestamp, that is, each data carries not the timestamp, but information that can be used to infer the data generation time, that is, information that can be used to infer the timestamp. The specific representation of the time stamp information is not limited herein.
The first physiological data and the second physiological data are data of the same physiological parameter within a certain time period, and the physiological parameter may be heart rate, blood oxygen, body temperature, respiratory rate, or the like. For example, the first physiological data and the second physiological data are both heart rate data, collected within the time period from 9:30 to 10:00.
In addition, the method for acquiring the physiological data may be arbitrary, and in particular, the method for acquiring the physiological data by the electronic device includes, but is not limited to, optical, electrical, magnetic field, imaging, and the like.
Step S302, determining a first target data segment and a second target data segment, where the similarity satisfies a preset condition, where the first target data segment is a signal segment of a preset time length intercepted from the first physiological data, and the second target data segment is a signal segment of a preset time length intercepted from the second physiological data.
It should be noted that the first target data segment and the second target data segment are both data segments of a preset time length, and the preset time length can be set according to actual needs. Generally, the length of the data segments used for similarity matching is determined by the value of the preset time length; for the indexes used in matching evaluation to have a certain degree of discrimination, the length of the data segments used for similarity matching needs to be moderate, so the value of the preset time length needs to be moderate. That is, the intercepted or selected data segments should be neither too long nor too short. For example, when the first physiological data and the second physiological data are both heart rate data, the heart rate of a normal person is generally 60 to 120 beats per minute. If instantaneous (beat-to-beat) heart rate data is used for similarity matching, the preset time length is preferably 120 to 200 s, and the specific length can be determined in combination with the heart rate prediction accuracy and the degree of heart rate variation.
A first target data segment is found from the first physiological data and a second target data segment is found from the second physiological data according to the similarity of the physiological data. The similarity of the first target data segment and the second target data segment meets a preset condition, and the preset condition may differ depending on the similarity evaluation index used. By way of example and not limitation, if the similarity evaluation indexes include a correlation coefficient and a Euclidean distance, the preset condition is that the correlation coefficient is greater than one threshold and the Euclidean distance is less than another threshold.
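By way of example and not limitation, a possible Python sketch of such a similarity check for two equal-length data segments is given below; the correlation-coefficient threshold of 0.85 echoes the value used in the examples later in this description, while the Euclidean-distance threshold is merely a placeholder assumption.

```python
import numpy as np

def similarity_ok(seg1, seg2, corr_threshold=0.85, dist_threshold=10.0):
    """Check a preset condition of the kind described above: the correlation
    coefficient must exceed one threshold and the Euclidean distance must stay
    below another. Threshold values here are illustrative assumptions only."""
    seg1 = np.asarray(seg1, dtype=float)
    seg2 = np.asarray(seg2, dtype=float)
    corr = np.corrcoef(seg1, seg2)[0, 1]   # similarity evaluation index 1
    dist = np.linalg.norm(seg1 - seg2)     # similarity evaluation index 2
    return (corr > corr_threshold and dist < dist_threshold), corr, dist
```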
In a specific application, the process of similarity matching may be expressed in various forms, that is, the process of determining the first target data segment and the second target data segment whose similarities meet the preset condition may be expressed in various forms. Two ways will be exemplarily described below.
The first mode is as follows:
referring to the schematic flow diagram of the similarity matching process shown in fig. 4, the similarity matching process may include the following steps:
step S401, respectively setting a first time sliding window with a preset time length in the first physiological data and a second time sliding window with a preset time length in the second physiological data.
Step S402, calculating a similarity evaluation index between a first data segment corresponding to the first time sliding window and a second data segment corresponding to the second time sliding window.
It should be noted that the similarity evaluation index may include, but is not limited to, one or more of a correlation coefficient, a Euclidean distance, a Manhattan distance, and a Chebyshev distance; that is, similarity matching may be performed using one or more similarity evaluation indexes to evaluate the degree of matching of the two data segments. For example, when the similarity evaluation indexes include a correlation coefficient and a Euclidean distance, the correlation coefficient and the Euclidean distance between the data segments within the two sliding windows are calculated. The calculation of the correlation coefficient and the Euclidean distance is not described in detail here.
When calculating the similarity evaluation index of the data segments, the index can be calculated directly from the data; alternatively, the data can be converted into curves, and the similarity evaluation index between the curves can then be calculated. Specifically, the first data segment and the second data segment may be filtered and Fourier transformed to obtain a first curve corresponding to the first data segment and a second curve corresponding to the second data segment, where the first curve and the second curve are curves of the physiological parameter varying with the device time of the original devices. Finally, the similarity evaluation index of the first curve and the second curve is calculated as the similarity evaluation index of the first data segment and the second data segment.
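By way of example and not limitation, one possible preprocessing sketch is shown below; it assumes uniformly sampled data and uses a simple low-pass filter to stand in for the filtering and Fourier-transform-based smoothing described above, with an assumed sampling rate and cutoff frequency.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def segment_to_curve(segment, fs=1.0, cutoff_hz=0.1):
    """Illustrative preprocessing: smooth a data segment so that it can be compared
    as a curve of the physiological parameter versus the original device time.
    fs (samples per second) and cutoff_hz are assumptions for this sketch."""
    b, a = butter(2, cutoff_hz / (fs / 2.0), btype="low")
    return filtfilt(b, a, np.asarray(segment, dtype=float))
```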
Step S403, determining whether the similarity between the first data segment and the second data segment meets a preset condition according to the similarity evaluation index. If the similarity meets the preset condition, the step S404 is entered; otherwise, if the similarity does not satisfy the preset condition, the process proceeds to step S405.
By way of example and not limitation, if the similarity measure includes a correlation coefficient and a distance measure, the distance measure is used to characterize the inter-sample distance, such as euclidean distance and manhattan distance;
at this time, the process of determining whether the similarity between the first data segment and the second data segment satisfies the preset condition according to the similarity evaluation index may include: if the correlation coefficient between the first data segment and the second data segment is greater than a first preset threshold value and the distance metric index is smaller than a second preset threshold value, it can be determined that the similarity between the first data segment and the second data segment meets a preset condition; if the correlation coefficient between the first data segment and the second data segment is less than or equal to a first preset threshold, or the distance metric index is greater than or equal to a second preset threshold, or the correlation coefficient is less than or equal to the first preset threshold and the distance metric index is greater than or equal to the second preset threshold, it may be determined that the similarity between the first data segment and the second data segment does not satisfy the preset condition.
Of course, in other embodiments, the similarity between data segments may also be evaluated by the amplitude difference instead of, or in addition to, the distance metric. In this case, if the correlation coefficient of the two data segments is greater than a certain threshold and the amplitude difference between the two data segments is smaller than a certain threshold, the similarity between the two data segments may also be considered to satisfy the preset condition.
Step S404, the first data segment is used as a first target data segment, and the second data segment is used as a second target data segment.
And S405, if the first physiological data and the second physiological data are not traversed, after sliding one of the time sliding windows, returning to the steps S402 and S403 until the first physiological data and the second physiological data are traversed or a data segment with the similarity meeting a preset condition is found.
Generally, the first time sliding window and the second time sliding window initially cover the same time period, that is, it is first determined whether the similarity of two data segments of the same time period meets the preset condition. If it does not, one sliding window is fixed and the other sliding window is slid by a certain sliding step; the similarity of the two data segments corresponding to the two sliding windows is calculated during the sliding process, and the first target data segment and the second target data segment are searched for according to the similarity. By keeping one sliding window fixed and sliding the other, one of the physiological data streams is traversed. If the first target data segment and the second target data segment whose similarity meets the preset condition are not found after traversing that physiological data stream, the previously fixed sliding window is changed; specifically, it can be slid by a certain sliding step, and after each such slide the other sliding window is again slid through its data. These steps are repeated until the two physiological data streams are traversed or the first target data segment and the second target data segment whose similarity meets the preset condition are found.
If the first target data segment and the second target data segment which meet the preset condition are not found after the two pieces of physiological data are traversed, the first target data segment and the second target data segment do not exist in the two pieces of physiological data, and at this time, it can be determined that the two pieces of physiological data do not belong to one individual.
In a special case, the first time sliding window and the second time sliding window may not be windows of the same time period at the beginning.
For example, suppose the preset time length is 5 s, that is, the time lengths of the data segments corresponding to the first time sliding window and the second time sliding window are both 5 s, and the acquired first physiological data and second physiological data both cover 0 to 100 s. The data segment corresponding to the first time sliding window is the 10 s to 15 s data in the first physiological data, and the data segment corresponding to the second time sliding window is the 10 s to 15 s data in the second physiological data, that is, the two time sliding windows initially correspond to the same time period. First, the similarity evaluation indexes of the data segments corresponding to the two 10 s to 15 s time sliding windows are calculated, and it is determined whether the similarity of the two data segments meets the preset condition. If it does not, one sliding window needs to be fixed and the other slid; in this example, the sliding window corresponding to the first physiological data is fixed and the sliding window of the second physiological data is slid. That is, the data segment used for similarity matching in the first physiological data is still the data segment corresponding to 10 s to 15 s, while the data segment corresponding to the second time sliding window keeps changing.
Taking the sliding step length as 2s as an example, fixing the first time sliding window, sliding the second time sliding window one step forward, and the data segment corresponding to the second time sliding window after sliding is 12 s-17 s of data. Then, it is determined whether the similarity of the data of 10s to 15s in the first physiological data and the data of 12s to 17s in the second physiological data satisfies a preset condition. If not, the second time sliding window continues to slide. And sliding the second time sliding window one step forward, wherein the data corresponding to the second time sliding window after sliding is 14 s-19 s data. Then, whether the similarity of the data of 10s to 15s in the first physiological data and the data of 14s to 19s in the second physiological data meets a preset condition is determined. If not, continuing to slide the second time sliding window, and repeating the process.
And if the second physiological data is traversed, the first target data segment and the second target data segment with the similarity meeting the preset condition are still not found. At this time, after the first time sliding window is changed, the second time sliding window may be sequentially slid. Taking the sliding step length of the first time sliding window as 1s as an example, after the first time sliding window is slid by one step, the data segment corresponding to the first time sliding window is 11 s-16 s of data in the first physiological data. And then, determining whether the similarity of the data segment corresponding to the first time sliding window after sliding and the data segment corresponding to the second time sliding window meets a preset condition. If not, fixing the first time sliding window unchanged, continuously sliding the second time sliding window, and repeating the process.
And so on, continuously changing the first time sliding window until the first physiological data is traversed. After the first physiological data and the second physiological data are traversed, if two data segments with similarity meeting the preset condition cannot be found, the two physiological data are considered not to belong to the same individual. In this process, if two data segments whose similarity satisfies a preset condition can be found, the two data segments are taken as a first target data segment and a second target data segment.
It should be noted that the preset time length, the sliding step, the time length of the physiological data, and the like mentioned above are only for convenience of example. In practical applications, these values are not limited to the above-mentioned values.
In addition, in the above-mentioned similarity matching process, in addition to the matching based on the physiological data, the matching may also be performed based on a curve corresponding to the physiological data, that is, the physiological data is converted into a corresponding curve, and then the similarity matching is performed based on the curve, so as to find the first target data segment and the second target data segment whose similarities meet the preset condition.
It should also be noted that the step size of the time sliding window may or may not be fixed. When the sliding step corresponding to each time sliding window is fixed, if two data segments whose similarity meets the preset condition are not found with the current sliding step, the current sliding step can be changed and the similarity matching process performed again. For example, the current sliding step is 2 s, and the changed sliding step is 1 s. The specific value of the sliding step can be set according to the actual application requirements.
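By way of example and not limitation, the nested sliding-window search of the first mode can be sketched as follows; it assumes two uniformly sampled series and a similarity check function such as the similarity_ok helper sketched earlier, and the step sizes are illustrative only.

```python
def find_matching_segments(data1, data2, win_len, check, step1=1, step2=1):
    """First mode (sliding windows): fix a window on data1, slide a window over
    data2, and move the data1 window only after data2 has been fully traversed.
    `check(seg1, seg2)` returns a tuple whose first element indicates whether the
    similarity meets the preset condition (e.g. the similarity_ok sketch above).
    Returns the start indices of the first matching pair, or None."""
    for i in range(0, len(data1) - win_len + 1, step1):
        seg1 = data1[i:i + win_len]
        for j in range(0, len(data2) - win_len + 1, step2):
            seg2 = data2[j:j + win_len]
            if check(seg1, seg2)[0]:
                return i, j
    return None
```

In this sketch, the inner loop corresponds to sliding the second time sliding window while the first one is kept fixed, and the outer loop corresponds to changing the previously fixed window once the second physiological data has been traversed.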
In the first mode, similarity matching is performed based on two time sliding windows of a preset time length. In other implementations, the two data segments whose similarity satisfies the preset condition may be searched for without using time sliding windows. The second mode is described below.
The second mode is as follows:
in an implementation manner, the process of determining the first target data segment and the second target data segment whose similarity satisfies the preset condition may include the following steps:
the method comprises the following steps of firstly, intercepting at least one first data segment with a preset time length from first physiological data, and intercepting at least one second data segment with a preset time length from second physiological data.
And secondly, respectively calculating similarity evaluation indexes between each first data segment and each second data segment.
And thirdly, determining whether the similarity between the first data segment and the second data segment meets a preset condition or not according to the similarity evaluation index.
And fourthly, taking the first data segment corresponding to the similarity meeting the preset condition as a first target data segment, and taking the corresponding second data segment as a second target data segment.
Specifically, a plurality of first data segments of a preset time length are intercepted from the first physiological data, and a plurality of second data segments of a preset time length are intercepted from the second physiological data. The similarity evaluation indexes between the first data segments and the second data segments are then calculated, and two data segments whose similarity meets the preset condition are found according to the similarity evaluation indexes.
For example, the preset time length is 10s, and the first physiological data and the second physiological data are data of 0s to 100 s. Intercepting a plurality of first data segments from first physiological data, the first data segments illustratively including: data segments of 1s to 11s, data segments of 2s to 12s, data segments of 3s to 13s, data segments of 5s to 15s, data segments of 20s to 30s, data segments of 85s to 95s, and the like. Intercepting a plurality of second data segments from the second physiological data, the second data segments illustratively including: data segments of 1s to 11s, data segments of 2s to 12s, data segments of 3s to 13s, data segments of 4s to 14s, data segments of 20s to 30s, data segments of 80s to 90s, and the like. Then, similarity evaluation indexes of each first data segment and each second data segment are respectively calculated. And finding out the first target data segment and the second target data segment according to the similarity evaluation index.
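By way of example and not limitation, the second mode can be sketched as follows, assuming the data segments have already been intercepted and are supplied together with their start times; the similarity check is again assumed to be a helper such as similarity_ok above.

```python
def match_precut_segments(first_segments, second_segments, check):
    """Second mode: the segments of preset time length have already been intercepted
    from each physiological data stream. Evaluate every pair and return the first
    pair whose similarity meets the preset condition, or None if no pair does.
    Each input element is assumed to be a (start_time, data_array) tuple."""
    for t1, seg1 in first_segments:
        for t2, seg2 in second_segments:
            if check(seg1, seg2)[0]:
                return (t1, seg1), (t2, seg2)
    return None
```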
It should be noted that, when calculating the similarity evaluation index between the first data segment and the second data segment, data preprocessing, such as filtering and fourier transform, may be performed on the first data segment and the second data segment, and then the similarity evaluation index may be calculated according to the preprocessed data segments; or the data preprocessing may be performed on the first data segment and the second data segment to generate a first curve corresponding to the first data segment and a second curve corresponding to the second data segment, and then the similarity evaluation index between the first curve and the second curve is calculated. The first curve and the second curve are both curves of the physiological parameter changing with time of the original device, for example, the first curve is a curve of the heart rate changing with time of the first electronic device.
The similarity evaluation index is an index used for evaluating the similarity between two data segments, and includes, but is not limited to, one or more metrics such as the Euclidean distance and the Manhattan distance.
In other embodiments, the process of determining the first target data segment and the second target data segment whose similarity satisfies the preset condition may also be embodied in other forms, which are not limited herein.
After the first target data segment and the second target data segment with the similarity satisfying the preset condition are determined, the time offset between the two devices can be obtained according to the timestamp information of the two data segments.
Step S303, determining a time offset between the first electronic device and the second electronic device according to the timestamp information corresponding to the first target data segment and the timestamp information corresponding to the second target data segment.
It is understood that each piece of data has corresponding timestamp information, and that each of the first and second target data segments contains multiple pieces of data. Based on this, after the first target data segment and the second target data segment are aligned, the time deviation between the two devices can be calculated from the data at a certain time point.
For example, the first target data segment is the data corresponding to the 10 s to 20 s time period in the first physiological data, and the second target data segment is the data corresponding to the 14 s to 24 s time period in the second physiological data. Taking the data corresponding to 10 s in the first target data segment and the data corresponding to 14 s in the second target data segment as an example, these two pieces of data are regarded as physiological data at the same actual moment; one is stamped 10 s and the other 14 s because there is a deviation between the device times, and that deviation is 4 s. In this way, the time offset between the first electronic device and the second electronic device is obtained as 4 s.
And step S304, performing data alignment operation on the first physiological data and the second physiological data according to the time deviation.
In particular, after determining the time offset between the first electronic device and the second electronic device, the time stamps of the physiological data of the two devices may be adjusted to be synchronized based on the time offset to align the two physiological data.
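By way of example and not limitation, steps S303 and S304 can be sketched together as follows; the data layout (a list of (timestamp, value) pairs) and the decision to shift the second stream onto the first stream's time base are assumptions made only for this sketch.

```python
def compute_offset_and_align(first_segment_start_ts, second_segment_start_ts, second_data):
    """Derive the time offset from the starting timestamps of the two matched target
    segments (e.g. 14 s - 10 s = 4 s in the example above), then shift the timestamps
    of the second stream so that both streams share a common time base."""
    offset = second_segment_start_ts - first_segment_start_ts
    aligned_second = [(ts - offset, value) for ts, value in second_data]
    return offset, aligned_second
```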
It can be seen from the above that, based on the uniqueness of each individual's physiological parameters changing over time, the first target data segment and the second target data segment whose similarity meets the preset condition are determined from the first physiological data and the second physiological data, the time deviation between the two electronic devices is obtained from the timestamp information of the two target data segments, and data alignment is then performed according to that time deviation. The time deviation between the two electronic devices does not need to be recorded in advance, yet the multi-source data can still be aligned, so strongly time-dependent data whose device time deviation was not pre-recorded can still be used normally. This improves the utilization rate of the data, reduces the cost and time of data acquisition, and shortens the cycle and cost of product development.
For example, in the prior art, if the device time deviation is not recorded in advance, multi-source data collected by the electronic device cannot be subjected to data alignment, and then the data become unusable data. By the scheme of the embodiment of the application, even if the time deviation of the equipment is not recorded in advance, the time deviation can be determined according to the similarity of the data, so that data alignment is carried out, and the data are still usable data. Therefore, the data utilization rate is improved, and the data acquisition cost and time are reduced.
It should be noted that, if data alignment operations of multiple devices need to be performed, the above-mentioned multi-source data processing procedure between two devices may be performed in sequence to achieve data alignment of multiple devices.
In addition, it should also be noted that the physiological data acquired by the different devices may be homologous, e.g., both are electrocardiogram (ECG) signals; or they may not be homologous, e.g., one device acquires a photoplethysmogram (PPG) and the other acquires an electrocardiogram (ECG). The data may also be collected at the same body position or at different positions.
In the process of searching for the first target data segment and the second target data segment according to the similarity, if a first target data segment and a second target data segment whose similarity meets the preset condition can be found, the time deviation of the two devices can be determined from them, and it can also be judged that the two sets of physiological data come from the same individual. If, after traversing the two sets of physiological data, no first target data segment and second target data segment whose similarity meets the preset condition can be found, it can be judged that the two sets of physiological data do not come from the same individual.
That is, in some embodiments, the method may further include: if a first target data segment and a second target data segment with similarity meeting a preset condition exist, determining that the first physiological data and the second physiological data belong to the same individual; and if the first target data segment and the second target data segment with the similarity meeting the preset condition do not exist, determining that the first physiological data and the second physiological data do not belong to the same individual.
For example, the first physiological data is heart rate data acquired by the smart band and the second physiological data is heart rate data acquired by the PSG device. If two target data segments whose similarity meets the preset condition can be found, the heart rate data acquired by the smart band and the PSG device are considered to be heart rate data of the same individual. Otherwise, the heart rate data collected by the smart band and the PSG device are considered to belong to two different people.
In addition to determining whether two physiological data originate from the same individual in the above-mentioned manner, it may also be determined by other means whether two physiological data originate from the same individual.
In other embodiments, it may be determined whether the two physiological data originate from the same individual, and if the two physiological data originate from the same individual, the time offset between the two devices may be calculated. That is, before determining the first target data segment and the second target data segment whose similarity satisfies the preset condition, the method may further include:
the method comprises the steps of firstly, calculating similarity measurement characteristics of data corresponding to overlapped time periods in first physiological data and second physiological data. The similarity metric features may include, but are not limited to, euclidean distance and manhattan distance.
And secondly, extracting linear features, nonlinear features and discrete features of the first physiological data and the second physiological data. The nonlinear features include VLI, VAI, and the like. The discrete features include the variance, the mean, and the like.
And thirdly, performing linear regression analysis according to the similarity measurement characteristic, the linear characteristic, the nonlinear characteristic and the discrete characteristic to obtain a difference evaluation result of the first physiological data and the second physiological data.
In a specific application, a linear regression model can be used to evaluate the difference between the two data.
Fourthly, if the difference evaluation result is smaller than a third preset threshold value, determining that the first physiological data and the second physiological data belong to the same individual;
and fifthly, if the difference evaluation result is larger than a third preset threshold value, determining that the first physiological data and the second physiological data do not belong to the same individual.
The third preset threshold may be set according to actual needs, and is not limited herein.
Specifically, not only is the similarity metric characteristic between two physiological data calculated, but also linear, nonlinear and discrete characteristics of the data and the like are introduced. And then, performing linear regression analysis on the features to obtain an evaluation result for evaluating the difference of the two physiological data, wherein if the difference is larger, the two physiological data are not considered to belong to the same individual, otherwise, if the difference is smaller, the two physiological data are considered to belong to the same individual.
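By way of example and not limitation, the difference evaluation can be sketched as a linear model applied to the extracted features; how the weights are fitted and how features such as VLI and VAI are computed are outside this sketch and are assumed to be handled elsewhere.

```python
import numpy as np

def difference_score(features, weights, bias=0.0):
    """Combine the similarity measurement, linear, nonlinear (e.g. VLI, VAI) and
    discrete (e.g. mean, variance) features with pre-fitted linear-regression
    weights to obtain a difference evaluation result."""
    return float(np.dot(np.asarray(features, dtype=float),
                        np.asarray(weights, dtype=float)) + bias)

def same_individual(score, third_preset_threshold):
    """Compare the difference evaluation result with the third preset threshold."""
    return score < third_preset_threshold
```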
It should be noted that the above exemplary embodiments provide two ways of determining whether physiological data originates from the same individual, the first relies on the process of finding a first target data segment and a second target data segment satisfying a predetermined condition according to similarity, and determining whether the data originates from the same individual based on whether the two target data segments can be found. The second method is to evaluate the difference between two data based on various characteristics of the data, and determine whether the data are from the same individual according to the difference. The first way is to assume that the two physiological data are the same individual before performing the similarity matching process, and then perform similarity matching before determining whether the two physiological data are from the same individual. The second method is to determine whether the two physiological data are from the same individual before similarity matching, and then perform similarity matching after determining that the two physiological data are from the same individual.
It is worth noting that the above-mentioned means for determining whether two or more physiological data originate from the same individual may also be applied to identification in a smart context. Specifically, in a full scenario, each person may have a plurality of different terminal devices, and the different terminal devices may not be used by only one user, so it is necessary to determine which user's data the terminal device transmits. Based on the above, it can be determined which user's data the data transmitted or collected by the terminal device is, so as to realize the identification of the whole scene.
After determining that the plurality of physiological data are from the same individual and performing data alignment on the plurality of physiological data, performing data fusion on the plurality of physiological data to obtain fused physiological data. Further, after the fused physiological data is obtained, a corresponding data analysis report can be generated according to the fused physiological data. This process will be described below.
In some embodiments, after performing the data alignment operation on the first physiological data and the second physiological data according to the time offset, the first physiological data and the second physiological data may be further subjected to data fusion to obtain fused physiological data.
It should be noted that, if data alignment and data fusion operations need to be performed on a plurality of physiological data, two-by-two data alignment may be performed first according to the above data alignment scheme. And then, after all the physiological data are subjected to data alignment, performing data fusion on the plurality of physiological data to obtain fused physiological data.
The process of data fusion may include a data quality comparison process and a missing data compensation process. Data segments of higher quality are retained, and missing data segments can be compensated for using the data segments of the corresponding time periods from the other physiological data, i.e. multi-source compensation.
Specifically, the first physiological data and the second physiological data are compared in quality through a data quality evaluation index, and a third data segment with higher quality in the same time period is obtained. Namely, the data segment with higher quality in the two physiological data is selected.
The data quality evaluation index may include, but is not limited to, the integrity, consistency, accuracy, redundancy, and the like of the data. In addition, the data quality may also be evaluated by data availability, data volume, and the like. For example, for a certain time period, data segment A exists in the first physiological data and data segment B exists in the second physiological data. The quality of data segment A and data segment B is evaluated through the data quality evaluation index; if the quality of data segment A is higher than that of data segment B, data segment A is retained, that is, data segment A is taken as the data of that time period in the fused physiological data. Conversely, if data segment A is of lower quality than data segment B, data segment B is retained. According to this process, the data segments with higher data quality in the first physiological data and the second physiological data are obtained.
Then, if one of the first physiological data and the second physiological data has a missing data segment, a fourth data segment corresponding to the corresponding time segment in the other physiological data can be obtained. For example, in a certain time period, the data segment C corresponding to the time period exists in the first physiological data, but the data segment corresponding to the time period does not exist in the second physiological data, that is, the data of the time period in the second physiological data is absent. At this time, the data segment C corresponding to the time segment in the first physiological data may be used as the data of the time segment in the fused physiological data.
And finally, combining the third data segment and the fourth data segment according to the time sequence to obtain the fused physiological data.
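By way of example and not limitation, the fusion of two aligned streams can be sketched as follows; representing each stream as a mapping from a time-period key to a data segment, and the quality scoring function itself, are assumptions made only for this sketch.

```python
def fuse_physiological_data(first_by_period, second_by_period, quality):
    """For each time period: when both streams have data, keep the higher-quality
    segment (the third data segment); when one stream is missing data, fall back to
    the segment of the other stream (the fourth data segment, i.e. multi-source
    compensation). `quality(segment)` is an assumed score built from the data
    quality evaluation indexes. The result is combined in time order."""
    fused = {}
    for period in sorted(set(first_by_period) | set(second_by_period)):
        a = first_by_period.get(period)
        b = second_by_period.get(period)
        if a is not None and b is not None:
            fused[period] = a if quality(a) >= quality(b) else b
        else:
            fused[period] = a if a is not None else b
    return fused
```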
It is understood that when the physiological data is three or more, the fusion process is similar to the fusion process of the two physiological data, and is not described herein again.
After the plurality of physiological data are subjected to data fusion to obtain fused physiological data, a corresponding physiological data analysis report can be generated according to the fused physiological data.
For example, the first physiological data is heart rate data collected by the smart band, the second physiological data is heart rate data collected by the PSG device, and the heart rate data collected by both devices are transmitted to the user's mobile phone. The mobile phone can perform data alignment on the heart rate data collected by the smart band and the PSG device, then perform data fusion, generate a heart rate data analysis report according to the fused heart rate data, and display this report to the user.
In some embodiments, after the data alignment and before the data fusion, a second similarity matching may be performed according to the physiological data after the data alignment, and then one physiological data analysis report or multiple physiological data analysis reports may be selected to be generated according to a similarity matching result.
Specifically, before data fusion is performed on the first physiological data and the second physiological data to obtain fused physiological data, similarity calculation is performed on the first physiological data and the second physiological data after data alignment to obtain target similarity. The target similarity may be evaluated by euclidean distance.
In specific application, the Euclidean distance is calculated based on data in an overlapping time period of two physiological data after data alignment.
Then, it is judged whether the target similarity is greater than a fourth preset threshold. If it is, data fusion is performed on the first physiological data and the second physiological data to obtain fused physiological data, that is, the multiple physiological data are fused, and a physiological data analysis report is then generated from the fused physiological data. Conversely, if the target similarity is smaller than the fourth preset threshold, a physiological data analysis report for the first physiological data and a physiological data analysis report for the second physiological data are generated separately. That is, if the target similarity is smaller than the fourth preset threshold, it is considered either that the first physiological data and the second physiological data are not data of the same individual or that their data quality is poor; in this case, the data fusion process is not performed, and analysis reports for the first physiological data and the second physiological data are generated separately.
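By way of example and not limitation, this second similarity check can be sketched as follows; both inputs are assumed to already be restricted to an equal-length overlapping period, and the mapping from Euclidean distance to a similarity value, as well as the threshold, are illustrative assumptions.

```python
import numpy as np

def fuse_or_separate(first_overlap, second_overlap, fourth_preset_threshold):
    """Evaluate the target similarity over the overlapping period of the aligned data
    via the Euclidean distance, then decide whether to fuse the streams and produce
    one analysis report or to produce a separate report for each stream."""
    first = np.asarray(first_overlap, dtype=float)
    second = np.asarray(second_overlap, dtype=float)
    target_similarity = 1.0 / (1.0 + np.linalg.norm(first - second))  # assumed mapping
    return "fuse" if target_similarity > fourth_preset_threshold else "separate"
```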
It should be noted that, performing the second similarity matching on the physiological data after the data alignment can further improve the accuracy of the data fusion and the generated data analysis report.
In the above-mentioned technical solution, whether the plurality of physiological data originate from the same individual may be determined according to the similarity matching result, or whether the plurality of physiological data originate from the same individual may be determined before the similarity matching. The overall data processing for these two schemes may be slightly different for a particular application. These two cases will be described separately below.
First case
Taking the physiological data acquired by the two devices as an example, firstly, assuming that the physiological data acquired by the two devices are from the same individual, then carrying out similarity matching, if a first target data segment and a second target data segment with the similarity meeting a preset condition can be found, judging that the two physiological data are from the same individual, otherwise, judging that the two physiological data are not from the same individual.
Specifically, referring to still another flow schematic block diagram of the multi-source data processing method shown in fig. 5, the process may include the following steps:
Step S501, respectively receiving the physiological signals collected by two electronic devices, and recording them as S1(t) and S2(t).
For example, one electronic device is a smart band, the other electronic device is a PSG device, and the smart band and the PSG device transmit the acquired heart rate signal to a mobile phone.
As an example, see S1(t) and S2(t) shown in fig. 6, where fig. 6(a) is a graph of S1(t) with time on the horizontal axis and heart rate on the vertical axis, and fig. 6(b) is a graph of S2(t), likewise with time on the horizontal axis and heart rate on the vertical axis. Fig. 6(a) shows an ECG-derived signal, and fig. 6(b) shows a PPG-derived signal.
It will be understood that S1(t) and S2(t) are time-series data, that is, the physiological signals acquired by the respective devices change over time. The timestamp of each physiological signal is consistent with the time of its original device, and both signals are acquired for the same physiological parameter.
Step S502, intercepting a short signal segment from each of S1(t) and S2(t), denoted as D1(t) and D2(t) respectively.
In addition, D1(t) and D2(t) have the same length, and the length of the intercepted signal segments can be determined by setting a preset time length. Generally, the length of the intercepted signals needs to be moderate, neither too long nor too short.
As an example, referring to fig. 6, the data labeled D1 in fig. 6(a) and D2 in fig. 6(b) are the intercepted signal segments.
Step S503, performing preprocessing operations such as filtering and Fourier transform on D1(t) and D2(t) to obtain the curves corresponding to the two signal segments.
It will be appreciated that each curve is a curve of the physiological parameter, such as heart rate, varying with time within the intercepted time period.
Step S504, calculating the matching degree of the curves corresponding to D1(t) and D2(t).
And step S505, judging whether the signals in the interception time can be aligned or not according to the matching degree.
Specifically, the evaluation index of the curve matching degree described above is a similarity evaluation index, that is, the similarity evaluation index of the curve is used to evaluate the matching degree of the curve. For example, one or more of the correlation coefficient, euclidean distance, and manhattan distance may be used to evaluate the degree of matching of the curves.
Taking the Euclidean distance and the correlation coefficient as the evaluation of the curve matching degree as an example, if the Euclidean distance is smaller than its set threshold and the correlation coefficient is larger than its set threshold, the two signals are considered to be aligned, i.e. the first target data segment and the second target data segment meeting the preset condition can be found. Conversely, if the Euclidean distance is greater than its set threshold, or the correlation coefficient is less than its set threshold, or both, the two signals are considered not to be aligned, that is, the first target data segment and the second target data segment which satisfy the preset condition cannot be found.
Two signal segments when the two signals are aligned are taken as the first target data segment and the second target data segment mentioned above.
In this case, two signal segments in the case of signal alignment need to be found according to the curve matching degree. This process is similar to the similarity matching process mentioned above, i.e., the process of determining the first target data segment and the second target data segment whose similarities satisfy the preset condition, as mentioned above.
Specifically, if the two signal segments D1(t) and D2(t) are not aligned, that is, the two signal segments do not satisfy the preset condition, one segment is fixed and the timestamp of the other signal segment is slid, and steps S502 to S505 are repeated until two aligned signal segments are found. If the physiological signal corresponding to the other signal segment has been traversed and no aligned pair of signal segments has been found, the previously fixed signal segment is changed, the timestamp of the other signal segment is slid again, and steps S502 to S505 are repeated until aligned signal segments are found.
If the two physiological signals have been traversed and no aligned signal segments have been found, the two physiological signals are considered not to be physiological data of the same individual; conversely, if aligned signal segments can be found, the two physiological signals are considered to be physiological signals of the same individual.
For example, referring to fig. 6, D1 in fig. 6(a) is kept unchanged while D2 in fig. 6(b) is changed. After each sliding of D2, the similarity evaluation index between D1 and D2 is calculated to determine whether D1 and D2 satisfy the preset condition. If not, D2 continues to be slid until S2(t) has been traversed. If S2(t) has been traversed and no D1 and D2 satisfying the preset condition have been found, D1 is slid and then D2 continues to be slid; the above process is repeated until S1(t) has been traversed or two signal segments satisfying the preset condition are found.
Step S506, if the signals within the interception time can be aligned, determining that S1(t) and S2(t) are derived from the same individual, and calculating the timestamp offset of the two physiological signals according to the timestamps of D1(t) and D2(t).
And step S507, if the signals within the interception time cannot be aligned, determining whether the two physiological signals have been traversed; if not, returning to step S502 and continuing to search for two signal segments under which the signals are aligned. If the traversal is completed, the process proceeds to step S508.
See, for example, fig. 7 for a schematic diagram of the signals when misaligned. One curve is the PPG instantaneous heart rate curve of the bracelet device, and the other curve is the ECG instantaneous heart rate curve of the PSG device. The correlation coefficient of the two signal segments is 0.79328, i.e., corrcoef = 0.79328. Since the preset threshold of the correlation coefficient is 0.85 and the correlation coefficient of the two signal segments is smaller than this threshold, the two signal segments are considered not aligned within the interception time, that is, the similarity of the two signal segments does not meet the preset condition. At this time, the timestamp deviation of the two signals is 661.2 s, i.e., timediff = t1 - t2 = 661.2 s.
Referring to fig. 8, a schematic diagram of the signals when aligned is shown, with time on the horizontal axis and instantaneous heart rate on the vertical axis. One curve is the PPG instantaneous heart rate curve of the bracelet device, and the other curve is the ECG instantaneous heart rate curve of the PSG device. The correlation coefficient of the two signal segments is 0.9964, i.e., corrcoef = 0.9964. Since the preset threshold of the correlation coefficient is 0.85 and the correlation coefficient of the two signal segments is greater than this threshold, the two signal segments are considered aligned within the interception time, that is, the similarity of the two signal segments meets the preset condition. At this time, the timestamp deviation of the two signals is -9.8 s, i.e., timediff = t1 - t2 = -9.8 s. That is, the time offset between the bracelet device and the PSG device is -9.8 s. After the time offset between the bracelet device and the PSG device is determined, the timestamps of the corresponding physiological data may be adjusted according to the time offset to perform data alignment on the physiological data of the two devices.
Step S508, determining that S1(t) and S2(t) do not originate from the same individual; in this case, the timestamp offset of the two physiological signals cannot be calculated.
After calculating the time offset of the two electronic devices, the time offset can be used to perform data alignment on the two physiological signals, adjusting the time of the two physiological signals to be more synchronous.
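By way of example and not limitation, the alignment step can be sketched as follows, assuming each sample carries a timestamp in seconds; the function name and the convention of shifting the second device's timestamps are assumptions made for illustration.

```python
# A minimal sketch of data alignment: the offset is the difference between the
# start timestamps of the two aligned segments (timediff = t1 - t2), and every
# timestamp of the second device is shifted by that offset.
def align_timestamps(t1_start: float, t2_start: float, timestamps2: list) -> list:
    """Shift the second device's timestamps so that both clocks agree."""
    time_diff = t1_start - t2_start              # timestamp offset between the devices, in seconds
    return [t + time_diff for t in timestamps2]  # aligned timestamps of the second device
```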
After the data alignment operation is performed, operations such as data fusion and generation of a physiological data analysis report can be performed.
Second case
In this case, it is first determined whether the plurality of physiological data are from the same individual; if they are, similarity matching is performed to obtain the timestamp deviations of the devices, and then data alignment, data fusion, data analysis report generation, and other operations are performed according to the timestamp deviations.
Specifically, referring to the schematic diagram of the multi-source data processing method shown in fig. 9, first, features such as the Euclidean distance, VLI, VAI, mean, and variance of the plurality of physiological data are extracted. Then, linear regression analysis is performed according to these features to obtain a linear regression analysis result, which is used to evaluate the degree of difference between the data. If the difference is smaller than a threshold A, the plurality of physiological data are considered to be data of the same individual; if the difference is larger than the threshold A, the plurality of physiological data are not considered to be data of the same individual.
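By way of example and not limitation, the same-individual check can be sketched as a fixed linear combination of a few simple features. The weights, the value of threshold A, and the omission of the VLI and VAI features are assumptions made purely for illustration; in this embodiment the combination is obtained by linear regression analysis.

```python
# A hedged sketch of the difference evaluation: combine the Euclidean distance,
# the mean difference and the variance difference of the overlapping data into
# one score and compare it with threshold A. All numeric values are assumptions.
import numpy as np

THRESHOLD_A = 0.5   # assumed threshold A

def difference_score(x1: np.ndarray, x2: np.ndarray,
                     weights=(0.01, 0.5, 0.5)) -> float:
    """Linear combination of simple difference features of two overlapping signals."""
    feats = np.array([
        np.linalg.norm(x1 - x2),            # Euclidean distance over the overlapping period
        abs(float(x1.mean() - x2.mean())),  # difference of means
        abs(float(x1.var() - x2.var())),    # difference of variances
    ])
    return float(np.dot(np.asarray(weights), feats))

def same_individual(x1: np.ndarray, x2: np.ndarray) -> bool:
    """A difference smaller than threshold A means the data are treated as one individual's."""
    return difference_score(x1, x2) < THRESHOLD_A
```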
After it is determined that the plurality of physiological data are data of the same individual, the timestamp deviations of the plurality of devices can be determined through similarity matching. The process of determining the timestamp deviation between the devices may refer to the process in the first case, i.e., the content corresponding to determining the first target data segment and the second target data segment that satisfy the preset condition, which is not described herein again.
After the time deviation between the devices is calculated, the plurality of physiological data are subjected to data alignment. Then, the Euclidean distance is calculated based on the aligned physiological data, and secondary similarity matching is performed according to the Euclidean distance. If the Euclidean distance is smaller than a threshold B, a data fusion operation is performed, and a fusion report is generated based on the fused physiological data. If the Euclidean distance is greater than the threshold B, the plurality of physiological data are not data of the same individual or the data quality of the plurality of physiological data is poor; in this case, data fusion is not performed, and multiple data analysis reports are generated, with each physiological data corresponding to one report.
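By way of example and not limitation, the secondary similarity matching can be sketched as a single branch on the Euclidean distance of the aligned data; threshold_b and the return labels are placeholders for illustration, not names from this embodiment.

```python
# A hedged sketch of the secondary matching: after alignment, the Euclidean
# distance over the overlapping samples decides between fusion and separate reports.
import numpy as np

def fuse_or_split(aligned1: np.ndarray, aligned2: np.ndarray, threshold_b: float) -> str:
    """Return 'fuse' when the aligned data are close enough, otherwise 'separate'."""
    if np.linalg.norm(aligned1 - aligned2) < threshold_b:
        return "fuse"        # perform data fusion and generate one fusion report
    return "separate"        # generate one data analysis report per physiological data
```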
It should be noted that the data fusion process can refer to the data fusion process mentioned above, and is not described herein again.
It should be noted that, for some details and related descriptions in the first case and the second case, reference may be made to the corresponding contents above, and details are not described herein again.
Fig. 10 shows a block diagram of a multi-source data processing apparatus provided in an embodiment of the present application, and only shows portions related to the embodiment of the present application for convenience of description.
Referring to fig. 10, the apparatus may include:
the physiological data acquisition module 101 is configured to acquire first physiological data acquired by a first electronic device and second physiological data acquired by a second electronic device, where the first physiological data and the second physiological data both include timestamp information used for representing data generation time, and the first physiological data and the second physiological data are data of the same physiological parameter in a preset time period;
a target data segment determining module 102, configured to determine a first target data segment and a second target data segment, where the similarity meets a preset condition, where the first target data segment is a signal segment of a preset time length intercepted from the first physiological data, and the second target data segment is a signal segment of a preset time length intercepted from the second physiological data;
an inter-device time offset determining module 103, configured to determine a time offset between the first electronic device and the second electronic device according to the timestamp information corresponding to the first target data segment and the timestamp information corresponding to the second target data segment;
and a data alignment module 104, configured to perform a data alignment operation on the first physiological data and the second physiological data according to the time offset.
In some embodiments, the target data segment determining module may be specifically configured to: respectively setting a first time sliding window with a preset time length in the first physiological data and a second time sliding window with the preset time length in the second physiological data; calculating a similarity evaluation index between a first data segment corresponding to the first time sliding window and a second data segment corresponding to the second time sliding window; determining whether the similarity between the first data segment and the second data segment meets a preset condition or not according to the similarity evaluation index; if the similarity meets a preset condition, taking the first data segment as a first target data segment, and taking the second data segment as a second target data segment; if the similarity does not meet the preset conditions, after sliding one of the time sliding windows, returning to the step of calculating a similarity evaluation index between a first data segment corresponding to the first time sliding window and a second data segment corresponding to the second time sliding window, and determining whether the similarity between the first data segment and the second data segment meets the preset conditions according to the similarity evaluation index until the first physiological data and the second physiological data are traversed or the data segments with the similarity meeting the preset conditions are found;
or at least one first data segment with a preset time length is intercepted from the first physiological data, and at least one second data segment with the preset time length is intercepted from the second physiological data; respectively calculating similarity evaluation indexes between each first data segment and each second data segment; determining whether the similarity between the first data segment and the second data segment meets a preset condition or not according to the similarity evaluation index; and taking the first data segment corresponding to the similarity meeting the preset condition as a first target data segment, and taking the corresponding second data segment as a second target data segment.
In some embodiments, the similarity evaluation index includes a correlation coefficient and a distance metric index, where the distance metric index is used to characterize the distance between samples. The target data segment determining module may be specifically configured to: if the correlation coefficient between the first data segment and the second data segment is larger than a first preset threshold value and the distance metric index is smaller than a second preset threshold value, determining that the similarity between the first data segment and the second data segment meets a preset condition; and if the correlation coefficient between the first data segment and the second data segment is smaller than or equal to the first preset threshold value and/or the distance metric index is larger than or equal to the second preset threshold value, determining that the similarity between the first data segment and the second data segment does not meet the preset condition.
In some embodiments, the apparatus may further comprise: the first judgment module is used for determining that the first physiological data and the second physiological data belong to the same individual if a first target data segment and a second target data segment with similarity meeting a preset condition exist; and if the first target data segment and the second target data segment with the similarity meeting the preset condition do not exist, determining that the first physiological data and the second physiological data do not belong to the same individual.
In some embodiments, the apparatus may further comprise: the second judgment module is used for calculating the similarity measurement characteristics of data corresponding to the overlapped time periods in the first physiological data and the second physiological data; extracting linear features, nonlinear features and discrete features of the first physiological data and the second physiological data; performing linear regression analysis according to the similarity measurement characteristic, the linear characteristic, the nonlinear characteristic and the discrete characteristic to obtain a difference evaluation result of the first physiological data and the second physiological data; if the difference evaluation result is smaller than a third preset threshold value, determining that the first physiological data and the second physiological data belong to the same individual; and if the difference evaluation result is larger than a third preset threshold value, determining that the first physiological data and the second physiological data do not belong to the same individual.
In some embodiments, if it is determined that the first physiological data and the second physiological data belong to the same individual, the apparatus may further include: and the data fusion module is used for carrying out data fusion on the first physiological data and the second physiological data to obtain fused physiological data.
In some embodiments, the data fusion module may be specifically configured to: comparing the quality of the first physiological data with that of the second physiological data through the data quality evaluation index to obtain a third data segment with higher quality in the same time period; if one physiological data of the first physiological data and the second physiological data has a missing data segment, a fourth data segment corresponding to the corresponding time segment in the other physiological data is obtained, and the third data segment and the fourth data segment are combined according to the time sequence to obtain the fused physiological data.
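By way of example and not limitation, the fusion described by this module can be sketched as follows, assuming the two physiological data have been pre-split into synchronized time slots, where each slot is either None (missing) or a (quality, samples) pair; the quality score itself is an assumed input standing in for the data quality evaluation index.

```python
# A minimal fusion sketch: per time slot, keep the higher-quality segment and
# fill gaps from the other source, then return the slots in time order.
def fuse_physiological_data(slots1, slots2):
    """Merge two slot lists into fused physiological data."""
    fused = []
    for seg1, seg2 in zip(slots1, slots2):   # slots are assumed time-ordered and synchronized
        if seg1 is None and seg2 is None:
            fused.append(None)               # data missing in both sources
        elif seg1 is None:
            fused.append(seg2[1])            # fourth data segment: filled from the other source
        elif seg2 is None:
            fused.append(seg1[1])
        else:                                # third data segment: the higher-quality one
            fused.append(seg1[1] if seg1[0] >= seg2[0] else seg2[1])
    return fused
```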
In some embodiments, the apparatus may further comprise: and the report generation module is used for generating a corresponding physiological data analysis report according to the fused physiological data.
In some embodiments, the apparatus may further comprise: the secondary similarity matching module is used for carrying out similarity calculation on the first physiological data and the second physiological data after data alignment to obtain target similarity; if the target similarity is larger than a fourth preset threshold, performing data fusion on the first physiological data and the second physiological data to obtain fused physiological data; and if the target similarity is smaller than a fourth preset threshold, generating a physiological data analysis report of the first physiological data and a physiological data analysis report of the second physiological data respectively.
In some embodiments, the physiological parameter may be, but is not limited to, heart rate, respiratory rate, body temperature, or blood oxygen, etc.
The multi-source data processing apparatus has the function of implementing the above multi-source data processing method. The function may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules corresponding to the function, and the modules may be software and/or hardware.
It should be noted that the information interaction and execution processes between the above-mentioned devices/modules, as well as their specific functions and technical effects, are based on the same concept as the method embodiments of the present application; for details, reference may be made to the method embodiments, which are not described here again.
In the embodiment of the present application, the type of the electronic device for acquiring the physiological data of the user may be arbitrary; the electronic device may be a wearable device or a home health monitoring device. By way of example and not limitation, a wearable device is a general term for devices designed by applying wearable technology to everyday wear, such as glasses, gloves, watches, apparel, and shoes. A wearable device is a portable device that is worn directly on the body or integrated into the clothing or accessories of the user. A wearable device is not only a hardware device, but also realizes powerful functions through software support, data interaction, and cloud interaction. Generalized wearable intelligent devices include devices that are full-featured and large in size and can realize complete or partial functions without depending on a smartphone, such as smart watches or smart glasses, and devices that concentrate on a single application function and need to be used in conjunction with other devices such as a smartphone, for example, various smart bracelets and smart jewelry for monitoring physical signs. Of course, the wearable device here has a physiological data acquisition function.
The electronic device that acquires the physiological data collected by each device and performs the multi-source data processing operations on the data, such as determining the inter-device time offset, data alignment, data fusion, and generating data reports, may be any type of device having data processing capability, for example, a cell phone, a computer, a tablet, or a cloud server.
By way of example and not limitation, as shown in fig. 11, the electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, keys 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The controller may be, among other things, a neural center and a command center of the electronic device 100. The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 110, thereby increasing the efficiency of the system.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communications. The bus may be a bidirectional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through a UART interface to implement a bluetooth function.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. It can also be used to connect an earphone and play audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices and the like.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140 and provides power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), General Packet Radio Service (GPRS), code division multiple access (code division multiple access, CDMA), Wideband Code Division Multiple Access (WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), Long Term Evolution (LTE), LTE, BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, with N being a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "microphone," is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking the user's mouth near the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headphone interface 170D is used to connect a wired headphone. The headset interface 170D may be the USB interface 130, or may be a 3.5mm open mobile electronic device platform (OMTP) standard interface, a cellular telecommunications industry association (cellular telecommunications industry association of the USA, CTIA) standard interface.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the electronic device 100, calculates a distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used for navigation, somatosensory gaming scenes.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
The magnetic sensor 180D includes a hall sensor. The electronic device 100 may detect the opening and closing of the flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. And then according to the opening and closing state of the leather sheath or the opening and closing state of the flip cover, the automatic unlocking of the flip cover is set.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. The method can also be used for recognizing the posture of the electronic equipment, and is applied to horizontal and vertical screen switching, pedometers and other applications.
A distance sensor 180F for measuring a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, taking a picture of a scene, electronic device 100 may utilize range sensor 180F to range for fast focus.
The proximity light sensor 180G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The electronic device 100 emits infrared light to the outside through the light emitting diode. The electronic device 100 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it can be determined that there is an object near the electronic device 100. When insufficient reflected light is detected, the electronic device 100 may determine that there are no objects near the electronic device 100. The electronic device 100 can utilize the proximity light sensor 180G to detect that the user holds the electronic device 100 close to the ear for talking, so as to automatically turn off the screen to achieve the purpose of saving power. The proximity light sensor 180G may also be used in a holster mode or a pocket mode to automatically unlock and lock the screen.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, electronic device 100 implements a temperature processing strategy using the temperature detected by temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 performs a reduction in performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold to avoid the low temperature causing the electronic device 100 to shut down abnormally. In other embodiments, when the temperature is lower than a further threshold, the electronic device 100 performs boosting on the output voltage of the battery 142 to avoid abnormal shutdown due to low temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys. Or may be touch keys. The electronic apparatus 100 may receive a key input, and generate a key signal input related to user setting and function control of the electronic apparatus 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic apparatus 100 by being inserted into the SIM card interface 195 or being pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, etc. The same SIM card interface 195 can be inserted with multiple cards at the same time. The types of the plurality of cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards. The SIM card interface 195 may also be compatible with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, namely an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from the electronic device 100.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements the steps that can be implemented in the above method embodiments.
The embodiments of the present application further provide a computer program product, which, when run on an electronic device, enables the electronic device to implement the steps in the above method embodiments.
An embodiment of the present application further provides a chip system, where the chip system includes a processor, the processor is coupled with a memory, and the processor executes a computer program stored in the memory to implement the method according to any one of the above first aspects. The chip system can be a single chip or a chip module consisting of a plurality of chips.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application. Furthermore, in the description of the present application and the appended claims, the terms "first," "second," "third," and the like are used for distinguishing between descriptions and not necessarily for describing or implying relative importance. Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise.
Finally, it should be noted that: the above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.
Claims (12)
1. A multi-source data processing method, comprising:
acquiring first physiological data acquired by first electronic equipment and second physiological data acquired by second electronic equipment, wherein the first physiological data and the second physiological data both comprise timestamp information used for representing data generation time, and the first physiological data and the second physiological data are data of the same physiological parameter in a preset time period;
determining a first target data segment and a second target data segment, wherein the similarity of the first target data segment and the second target data segment meets a preset condition, the first target data segment is a signal segment of a preset time length intercepted from the first physiological data, and the second target data segment is a signal segment of the preset time length intercepted from the second physiological data;
determining a time deviation between the first electronic device and the second electronic device according to the timestamp information corresponding to the first target data segment and the timestamp information corresponding to the second target data segment;
and performing data alignment operation on the first physiological data and the second physiological data according to the time deviation.
2. The method of claim 1, wherein determining the first target data segment and the second target data segment whose similarity satisfies a preset condition comprises:
respectively setting a first time sliding window of the preset time length in the first physiological data and setting a second time sliding window of the preset time length in the second physiological data;
calculating a similarity evaluation index between a first data segment corresponding to the first time sliding window and a second data segment corresponding to the second time sliding window;
determining whether the similarity between the first data segment and the second data segment meets the preset condition or not according to the similarity evaluation index;
if the similarity meets the preset condition, taking a first data segment as the first target data segment, and taking a second data segment as the second target data segment;
if the similarity does not meet the preset condition, after sliding one of the time sliding windows, returning to the step of calculating the similarity evaluation index between the first data segment corresponding to the first time sliding window and the second data segment corresponding to the second time sliding window, and determining whether the similarity between the first data segment and the second data segment meets the preset condition according to the similarity evaluation index until the first physiological data and the second physiological data are traversed or the data segment with the similarity meeting the preset condition is found;
or,
intercepting at least one first data segment of the preset time length from the first physiological data, and intercepting at least one second data segment of the preset time length from the second physiological data;
respectively calculating similarity evaluation indexes between each first data segment and each second data segment;
determining whether the similarity between the first data segment and the second data segment meets the preset condition or not according to the similarity evaluation index;
and taking a first data segment with the similarity meeting the preset condition as the first target data segment, and taking a corresponding second data segment as the second target data segment.
3. The method of claim 2, wherein the similarity evaluation index comprises a correlation coefficient and a distance metric index, the distance metric index being used to characterize the inter-sample distance;
determining whether the similarity between the first data segment and the second data segment meets the preset condition according to the similarity evaluation index comprises the following steps:
if the correlation coefficient between the first data segment and the second data segment is larger than a first preset threshold value and the distance metric index is smaller than a second preset threshold value, determining that the similarity between the first data segment and the second data segment meets the preset condition;
and if the correlation coefficient between the first data segment and the second data segment is smaller than or equal to the first preset threshold value and/or the distance metric index is larger than or equal to the second preset threshold value, determining that the similarity between the first data segment and the second data segment does not meet the preset condition.
4. The method of any of claims 1 to 3, further comprising:
if a first target data segment and a second target data segment with similarity meeting the preset condition exist, determining that the first physiological data and the second physiological data belong to the same individual;
and if the first target data segment and the second target data segment with the similarity meeting the preset condition do not exist, determining that the first physiological data and the second physiological data do not belong to the same individual.
5. The method of claim 1, wherein prior to determining the first target data segment and the second target data segment whose similarity satisfies a preset condition, the method further comprises:
calculating similarity metric characteristics of data corresponding to the overlapped time periods in the first physiological data and the second physiological data;
extracting linear, nonlinear and discrete features of the first and second physiological data;
performing linear regression analysis according to the similarity metric feature, the linear feature, the nonlinear feature and the discrete feature to obtain difference evaluation results of the first physiological data and the second physiological data;
if the difference evaluation result is smaller than a third preset threshold value, determining that the first physiological data and the second physiological data belong to the same individual;
and if the difference evaluation result is larger than a third preset threshold value, determining that the first physiological data and the second physiological data do not belong to the same individual.
6. The method of claim 4 or 5, wherein if it is determined that the first physiological data and the second physiological data belong to the same individual, after performing a data alignment operation on the first physiological data and the second physiological data according to the time offset, the method further comprises:
performing data fusion on the first physiological data and the second physiological data to obtain fused physiological data.
7. The method of claim 6, wherein data fusing the first physiological data and the second physiological data to obtain fused physiological data comprises:
comparing the quality of the first physiological data with that of the second physiological data by means of a data quality evaluation index, and obtaining, for each time period, a third data segment of higher quality;
if one of the first physiological data and the second physiological data has a missing data segment, obtaining a fourth data segment corresponding to the same time period from the other physiological data;
and combining the third data segment and the fourth data segment in chronological order to obtain the fused physiological data.
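A minimal sketch of the fusion rule in claim 7, assuming the two sources are already aligned and bucketed into common time periods; the dictionary representation and the `quality` scoring callable are illustrative assumptions, since the claim does not define the data quality evaluation index.

```python
from typing import Callable, Dict, Optional, Sequence

def fuse_by_quality(
    first: Dict[str, Optional[Sequence[float]]],
    second: Dict[str, Optional[Sequence[float]]],
    quality: Callable[[Sequence[float]], float],
) -> Dict[str, Sequence[float]]:
    """Per time period, keep the higher-quality segment; when a segment is missing
    in one source, fall back to the corresponding segment from the other source."""
    fused = {}
    for period in sorted(set(first) | set(second)):    # chronological order by period key
        seg_a, seg_b = first.get(period), second.get(period)
        if seg_a is None:
            fused[period] = seg_b                      # fourth data segment fills the gap
        elif seg_b is None:
            fused[period] = seg_a
        else:
            fused[period] = seg_a if quality(seg_a) >= quality(seg_b) else seg_b
    return fused
```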
8. The method of claim 6, wherein after obtaining the fused physiological data, the method further comprises:
generating a corresponding physiological data analysis report according to the fused physiological data.
9. The method of claim 6, wherein prior to data fusing the first physiological data and the second physiological data to obtain fused physiological data, the method further comprises:
performing similarity calculation on the data-aligned first physiological data and second physiological data to obtain a target similarity;
if the target similarity is greater than a fourth preset threshold, performing data fusion on the first physiological data and the second physiological data to obtain fused physiological data;
and if the target similarity is less than the fourth preset threshold, generating a physiological data analysis report of the first physiological data and a physiological data analysis report of the second physiological data, respectively.
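Claim 9 gates the fusion of claim 6 on a post-alignment similarity check. A minimal sketch, assuming the target similarity is again computed as a correlation coefficient (the claim leaves the concrete similarity measure open):

```python
import numpy as np

def fuse_or_report(aligned_a, aligned_b, fourth_threshold, fuse, make_report):
    """Fuse only when the aligned signals remain sufficiently similar; otherwise
    produce a separate analysis report for each physiological data source."""
    target_similarity = np.corrcoef(aligned_a, aligned_b)[0, 1]   # assumed measure
    if target_similarity > fourth_threshold:
        return fuse(aligned_a, aligned_b)
    return make_report(aligned_a), make_report(aligned_b)
```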
10. The method of claim 1, wherein the physiological parameter is heart rate, respiratory rate, body temperature, or blood oxygen.
11. An electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor implements the method of any of claims 1 to 10 when executing the computer program.
12. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 10.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010330890.8A CN113555132B (en) | 2020-04-24 | 2020-04-24 | Multi-source data processing method, electronic device, and computer-readable storage medium |
PCT/CN2021/084390 WO2021213165A1 (en) | 2020-04-24 | 2021-03-31 | Multi-source data processing method, electronic device and computer-readable storage medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010330890.8A CN113555132B (en) | 2020-04-24 | 2020-04-24 | Multi-source data processing method, electronic device, and computer-readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113555132A true CN113555132A (en) | 2021-10-26 |
CN113555132B CN113555132B (en) | 2024-09-17 |
Family
ID=78129577
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010330890.8A Active CN113555132B (en) | 2020-04-24 | 2020-04-24 | Multi-source data processing method, electronic device, and computer-readable storage medium |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN113555132B (en) |
WO (1) | WO2021213165A1 (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114145759A (en) * | 2021-11-11 | 2022-03-08 | 歌尔股份有限公司 | Myoelectric signal compensation method and device, myoelectric detection device and storage medium |
CN114246572A (en) * | 2021-11-30 | 2022-03-29 | 歌尔科技有限公司 | Earphone set |
CN115049015A (en) * | 2022-07-14 | 2022-09-13 | 北京中科心研科技有限公司 | Method, device and equipment for aligning time sequence data after sliding window and storage medium |
CN115736915A (en) * | 2022-11-21 | 2023-03-07 | 大连理工大学 | Patient physical ability assessment method based on multi-source information fusion |
CN117643461A (en) * | 2024-01-30 | 2024-03-05 | 吉林大学 | Heart rate intelligent monitoring system and method based on artificial intelligence |
WO2024066962A1 (en) * | 2022-09-28 | 2024-04-04 | 华为技术有限公司 | Respiratory health detection method and wearable electronic device |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114670877A (en) * | 2022-05-09 | 2022-06-28 | 中国第一汽车股份有限公司 | Vehicle control method, device, electronic device and storage medium |
CN115034290B (en) * | 2022-05-17 | 2023-02-03 | 医声医事(北京)科技有限公司 | Dynamic fusion system, method, equipment and medium for multi-source data |
CN116725484B (en) * | 2022-09-09 | 2024-04-16 | 荣耀终端有限公司 | Physiological detection method based on wearable device and wearable device |
CN116077050B (en) * | 2023-01-30 | 2024-07-19 | 青岛海尔科技有限公司 | Respiratory diagnosis method, respiratory diagnosis device, storage medium and electronic device |
CN117898692B (en) * | 2024-03-20 | 2024-05-14 | 河北网新数字技术股份有限公司 | Data processing method for heart rate monitoring system |
Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102056026A (en) * | 2009-11-06 | 2011-05-11 | 中国移动通信集团设计院有限公司 | Audio/video synchronization detection method and system, and voice detection method and system |
CN105898502A (en) * | 2016-04-11 | 2016-08-24 | 深圳Tcl新技术有限公司 | Audio and video playing synchronization method and synchronization device |
CN105930631A (en) * | 2015-02-27 | 2016-09-07 | 三星电子株式会社 | Method for measuring biological signal and wearable electronic device for the same |
CN106293033A (en) * | 2015-06-09 | 2017-01-04 | 联想(北京)有限公司 | Information processing method and electronic equipment |
CN106529596A (en) * | 2016-11-11 | 2017-03-22 | 国网上海市电力公司 | Indoor and outdoor scene recognition method based on wearable device |
US20170220854A1 (en) * | 2016-01-29 | 2017-08-03 | Conduent Business Services, Llc | Temporal fusion of multimodal data from multiple data acquisition systems to automatically recognize and classify an action |
US20170224285A1 (en) * | 2016-09-12 | 2017-08-10 | Genomi-K S.A.P.I. De C.V. | Method to obtain and validate physiological data |
US20170249280A1 (en) * | 2015-07-01 | 2017-08-31 | Anhui Huami Information Technology Co.,Ltd. | Data Statistics For Wearable Device |
CN108472489A (en) * | 2016-01-08 | 2018-08-31 | 心脏起搏器股份公司 | Multiple sources of physiological data are made to synchronize |
CN108701495A (en) * | 2016-02-19 | 2018-10-23 | 三星电子株式会社 | Method for integrating and providing the data collected from multiple equipment and the electronic equipment for realizing this method |
CN108833085A (en) * | 2018-04-04 | 2018-11-16 | 深圳大学 | A kind of wearable smart machine matching method and system based on heartbeat signal |
CN109933294A (en) * | 2019-03-26 | 2019-06-25 | 努比亚技术有限公司 | Data processing method, device, wearable device and storage medium |
CN110047587A (en) * | 2018-09-29 | 2019-07-23 | 苏州爱医斯坦智能科技有限公司 | A kind of medical data acquisition method, apparatus, equipment and storage medium |
CN110298409A (en) * | 2019-07-03 | 2019-10-01 | 广东电网有限责任公司 | Multi-source data fusion method towards electric power wearable device |
CN110623652A (en) * | 2019-09-17 | 2019-12-31 | 华为技术有限公司 | Data display method and electronic equipment |
US10575131B1 (en) * | 2019-05-30 | 2020-02-25 | Snap Inc. | Wearable device location accuracy systems |
WO2020039226A1 (en) * | 2018-08-18 | 2020-02-27 | Smartcardia Sa | Method for synchronization of a multitude of wearable devices |
CN110879806A (en) * | 2019-11-25 | 2020-03-13 | 北京优奥创思科技发展有限公司 | Data fusion method, device, equipment and storage medium |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104582564B (en) * | 2012-09-03 | 2017-03-15 | 株式会社日立制作所 | Biological signal measuring device and biosignal measurement set |
CN103607269B (en) * | 2013-12-02 | 2016-05-25 | 常州博睿康科技有限公司 | A kind of brain electric installation and accurate wireless event synchronizing method |
CN107874756A (en) * | 2017-11-21 | 2018-04-06 | 博睿康科技(常州)股份有限公司 | The precise synchronization method of eeg collection system and video acquisition system |
CN110636321A (en) * | 2019-09-30 | 2019-12-31 | 北京达佳互联信息技术有限公司 | Data processing method, device, system, mobile terminal and storage medium |
- 2020-04-24: CN application CN202010330890.8A, granted as CN113555132B (status: Active)
- 2021-03-31: WO application PCT/CN2021/084390, published as WO2021213165A1 (status: Application Filing)
Patent Citations (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102056026A (en) * | 2009-11-06 | 2011-05-11 | 中国移动通信集团设计院有限公司 | Audio/video synchronization detection method and system, and voice detection method and system |
CN105930631A (en) * | 2015-02-27 | 2016-09-07 | 三星电子株式会社 | Method for measuring biological signal and wearable electronic device for the same |
CN106293033A (en) * | 2015-06-09 | 2017-01-04 | 联想(北京)有限公司 | Information processing method and electronic equipment |
US20170249280A1 (en) * | 2015-07-01 | 2017-08-31 | Anhui Huami Information Technology Co.,Ltd. | Data Statistics For Wearable Device |
CN108472489A (en) * | 2016-01-08 | 2018-08-31 | 心脏起搏器股份公司 | Multiple sources of physiological data are made to synchronize |
US20170220854A1 (en) * | 2016-01-29 | 2017-08-03 | Conduent Business Services, Llc | Temporal fusion of multimodal data from multiple data acquisition systems to automatically recognize and classify an action |
CN108701495A (en) * | 2016-02-19 | 2018-10-23 | 三星电子株式会社 | Method for integrating and providing the data collected from multiple equipment and the electronic equipment for realizing this method |
CN105898502A (en) * | 2016-04-11 | 2016-08-24 | 深圳Tcl新技术有限公司 | Audio and video playing synchronization method and synchronization device |
US20170224285A1 (en) * | 2016-09-12 | 2017-08-10 | Genomi-K S.A.P.I. De C.V. | Method to obtain and validate physiological data |
CN106529596A (en) * | 2016-11-11 | 2017-03-22 | 国网上海市电力公司 | Indoor and outdoor scene recognition method based on wearable device |
CN108833085A (en) * | 2018-04-04 | 2018-11-16 | 深圳大学 | A kind of wearable smart machine matching method and system based on heartbeat signal |
WO2020039226A1 (en) * | 2018-08-18 | 2020-02-27 | Smartcardia Sa | Method for synchronization of a multitude of wearable devices |
CN110047587A (en) * | 2018-09-29 | 2019-07-23 | 苏州爱医斯坦智能科技有限公司 | A kind of medical data acquisition method, apparatus, equipment and storage medium |
CN109933294A (en) * | 2019-03-26 | 2019-06-25 | 努比亚技术有限公司 | Data processing method, device, wearable device and storage medium |
US10575131B1 (en) * | 2019-05-30 | 2020-02-25 | Snap Inc. | Wearable device location accuracy systems |
CN110298409A (en) * | 2019-07-03 | 2019-10-01 | 广东电网有限责任公司 | Multi-source data fusion method towards electric power wearable device |
CN110623652A (en) * | 2019-09-17 | 2019-12-31 | 华为技术有限公司 | Data display method and electronic equipment |
CN110879806A (en) * | 2019-11-25 | 2020-03-13 | 北京优奥创思科技发展有限公司 | Data fusion method, device, equipment and storage medium |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114145759A (en) * | 2021-11-11 | 2022-03-08 | 歌尔股份有限公司 | Myoelectric signal compensation method and device, myoelectric detection device and storage medium |
CN114145759B (en) * | 2021-11-11 | 2024-05-14 | 歌尔股份有限公司 | Myoelectric signal compensation method and device, myoelectric detection equipment and storage medium |
CN114246572A (en) * | 2021-11-30 | 2022-03-29 | 歌尔科技有限公司 | Earphone set |
CN114246572B (en) * | 2021-11-30 | 2024-07-02 | 歌尔科技有限公司 | Earphone |
CN115049015A (en) * | 2022-07-14 | 2022-09-13 | 北京中科心研科技有限公司 | Method, device and equipment for aligning time sequence data after sliding window and storage medium |
CN115049015B (en) * | 2022-07-14 | 2023-04-18 | 北京中科心研科技有限公司 | Method, device and equipment for aligning time sequence data after sliding window and storage medium |
WO2024066962A1 (en) * | 2022-09-28 | 2024-04-04 | 华为技术有限公司 | Respiratory health detection method and wearable electronic device |
CN115736915A (en) * | 2022-11-21 | 2023-03-07 | 大连理工大学 | Patient physical ability assessment method based on multi-source information fusion |
CN115736915B (en) * | 2022-11-21 | 2024-04-26 | 大连理工大学 | Patient physical ability assessment method based on multi-source information fusion |
CN117643461A (en) * | 2024-01-30 | 2024-03-05 | 吉林大学 | Heart rate intelligent monitoring system and method based on artificial intelligence |
CN117643461B (en) * | 2024-01-30 | 2024-04-02 | 吉林大学 | Heart rate intelligent monitoring system and method based on artificial intelligence |
Also Published As
Publication number | Publication date |
---|---|
CN113555132B (en) | 2024-09-17 |
WO2021213165A1 (en) | 2021-10-28 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113555132B (en) | Multi-source data processing method, electronic device, and computer-readable storage medium | |
CN113395382B (en) | Method for data interaction between devices and related devices | |
CN111030990B (en) | Method for establishing communication connection, client and server | |
CN113552937B (en) | Display control method and wearable device | |
CN113892920B (en) | Wearing detection method and device of wearable equipment and electronic equipment | |
CN112651510B (en) | Model updating method, working node and model updating system | |
CN111552451A (en) | Display control method and device, computer readable medium and terminal equipment | |
CN113467735A (en) | Image adjusting method, electronic device and storage medium | |
CN114257920B (en) | Audio playing method and system and electronic equipment | |
CN114064571A (en) | Method, device and terminal for determining file storage position | |
CN113509145B (en) | Sleep risk monitoring method, electronic device and storage medium | |
CN114095602B (en) | Index display method, electronic device and computer readable storage medium | |
CN113129916A (en) | Audio acquisition method, system and related device | |
CN115665632B (en) | Audio circuit, related device and control method | |
CN115412678B (en) | Exposure processing method and device and electronic equipment | |
CN113518189A (en) | Shooting method, shooting system, electronic equipment and storage medium | |
WO2023030067A1 (en) | Remote control method, remote control device and controlled device | |
CN114466238B (en) | Frame demultiplexing method, electronic device and storage medium | |
WO2021204036A1 (en) | Sleep risk monitoring method, electronic device and storage medium | |
CN111026285B (en) | Method for adjusting pressure threshold and electronic equipment | |
CN111460942A (en) | Proximity detection method and device, computer readable medium and terminal equipment | |
CN116087930B (en) | Audio ranging method, device, storage medium, and program product | |
CN113749611B (en) | Data measurement method and related device | |
CN113359120B (en) | Method and device for measuring user activity distance and electronic device | |
CN117243640A (en) | Method and device for predicting gestation period and electronic equipment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||