CN113384296A - Method for predicting behavior occurrence time, electronic device, and storage medium - Google Patents
Method for predicting behavior occurrence time, electronic device, and storage medium
- Publication number
- CN113384296A (application CN202010167820.5A)
- Authority
- CN
- China
- Prior art keywords
- target object
- ultrasonic
- echo
- target
- occurrence time
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/08—Detecting organic movements or changes, e.g. tumours, cysts, swellings
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B8/00—Diagnosis using ultrasonic, sonic or infrasonic waves
- A61B8/44—Constructional features of the ultrasonic, sonic or infrasonic diagnostic device
Landscapes
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- Biophysics (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Heart & Thoracic Surgery (AREA)
- Medical Informatics (AREA)
- Molecular Biology (AREA)
- Surgery (AREA)
- Animal Behavior & Ethology (AREA)
- General Health & Medical Sciences (AREA)
- Public Health (AREA)
- Veterinary Medicine (AREA)
- Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
Abstract
An embodiment of the present application discloses a method for predicting behavior occurrence time, an electronic device, and a storage medium. The method comprises the following steps: an ultrasonic device transmits an ultrasonic signal to a first target object; a processing unit calculates, based on the ultrasonic reflection signal received by the ultrasonic device, the deformation amplitude of the first target object produced when it is squeezed by a second target object; and the occurrence time of the target behavior corresponding to the second target object is predicted based on the deformation amplitude. By measuring, with ultrasonic ranging, the deformation amplitude produced after the first target object is squeezed by the second target object, the occurrence time of the target behavior corresponding to the second target object can be predicted more accurately.
Description
Technical Field
The application belongs to the technical field of electronics, and particularly relates to a method for predicting behavior occurrence time, electronic equipment and a storage medium.
Background
In outdoor, business, and similar settings, knowing the time of a toilet visit in advance can effectively avoid embarrassing situations. For elderly people and other special groups, predicting defecation time is of particular significance.
Disclosure of Invention
In view of the above problems, the present application proposes a method for predicting behavior occurrence time, an electronic device, and a storage medium to address these problems.
In a first aspect, an embodiment of the present application provides a method for predicting occurrence time of a behavior, which is applied to an electronic device, where the electronic device includes an ultrasonic device and a processing unit, the ultrasonic device is connected to the processing unit, and the method includes: the ultrasonic device transmits an ultrasonic signal to a first target object; the processing unit calculates the deformation amplitude of the first target object generated by the extrusion of a second target object based on the ultrasonic reflection signal received by the ultrasonic device; and predicting the occurrence time of the target behavior corresponding to the second target object based on the deformation amplitude.
In a second aspect, an embodiment of the present application provides an electronic device, including an ultrasonic device and a processing unit: the ultrasonic device is used for transmitting an ultrasonic signal to the first target object and receiving an ultrasonic reflection signal; the processing unit is used for calculating the deformation amplitude of the first target object generated by the extrusion of the second target object based on the ultrasonic reflection signal received by the ultrasonic device; and predicting the occurrence time of the target behavior corresponding to the second target object based on the deformation amplitude.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium in which program code is stored, and the program code, when executed, performs the above-described method.
The embodiments of the present application provide a method for predicting behavior occurrence time, an electronic device, and a storage medium. First, an ultrasonic device transmits an ultrasonic signal to a first target object; a processing unit then calculates, based on the ultrasonic reflection signal received by the ultrasonic device, the deformation amplitude of the first target object produced when it is squeezed by a second target object, and predicts the occurrence time of the target behavior corresponding to the second target object based on that deformation amplitude. By measuring, with ultrasonic ranging, the deformation amplitude produced after the first target object is squeezed by the second target object, the occurrence time of the target behavior corresponding to the second target object can be predicted more accurately.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed for describing the embodiments are briefly introduced below. The drawings described below obviously show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is a flowchart illustrating a method for predicting behavior occurrence time according to an embodiment of the present application;
FIG. 2 is a flow chart illustrating a method for predicting behavior occurrence times according to another embodiment of the present application;
FIG. 3 is a schematic diagram of an arrangement of a transducer array according to another embodiment of the present application;
FIG. 4 is a schematic diagram of an ultrasonic time delay focused emission scheme according to another embodiment of the present application;
FIG. 5 is a schematic diagram illustrating a time delay parameter calculation of an ultrasound time delay focusing transmission according to another embodiment of the present application;
FIG. 6 is a schematic view of a displacement measurement model of a bladder after being compressed according to another embodiment of the present application;
FIG. 7 is a flow chart illustrating a method for predicting behavior occurrence times according to yet another embodiment of the present application;
FIG. 8 is a schematic diagram illustrating ultrasonic echo signal beamforming according to yet another embodiment of the present application;
FIG. 9 is a schematic view showing the displacement of a bladder after rectal compression in accordance with yet another embodiment of the present application;
FIG. 10 is a flow chart illustrating a method for predicting behavior occurrence times according to yet another embodiment of the present application;
FIG. 11 is a flow chart illustrating a method for predicting behavior occurrence times according to yet another embodiment of the present application;
fig. 12 is a block diagram illustrating an electronic device for predicting occurrence time of a behavior according to an embodiment of the present application;
fig. 13 is a block diagram illustrating an ultrasonic device according to yet another embodiment of the present application;
fig. 14 is a block diagram illustrating a system for predicting occurrence time of a behavior according to an embodiment of the present disclosure;
fig. 15 is a block diagram illustrating a processing unit of an electronic device for executing the method for predicting behavior occurrence time according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
In outdoor, business, and similar settings, knowing the time of a toilet visit in advance can effectively avoid embarrassing situations; for elderly people and other special groups, predicting toilet time allows care to be provided more effectively. Related methods of predicting defecation time include: 1) swallowing a test object and then detecting the rectal condition in an external, non-invasive manner to judge the urge to defecate and predict the defecation time; 2) measuring the thickness of the rectum by ultrasonic ranging and predicting the defecation time from that thickness.
During research on methods for predicting the occurrence time of a target behavior, the inventor found that when defecation time is predicted by measuring the thickness of the rectum, the physiological position of the rectum, being far from the abdomen, causes the ultrasonic waves to undergo multiple large-amplitude echo interferences during propagation, which affects the ranging accuracy. Meanwhile, the excrement in the rectum is complex and the medium is uneven, so the ultrasonic waves are reflected many times while propagating through the rectum; this degrades the signal-to-noise ratio of the echo signals, increases the complexity of signal processing, and affects the stability of the rectum-thickness measurement. In addition, the signal-to-noise ratio of the ultrasonic echo signal in current methods for predicting the occurrence time of a target behavior needs to be improved, which affects the reliability of the measurement result.
Therefore, the inventor proposes the scheme of the present application: an ultrasonic device transmits an ultrasonic signal to a first target object, a processing unit calculates, based on the ultrasonic reflection signal received by the ultrasonic device, the deformation amplitude of the first target object produced when it is squeezed by a second target object, and the occurrence time of the target behavior corresponding to the second target object is predicted based on that deformation amplitude. By measuring, with ultrasonic ranging, the deformation amplitude of the first target object produced when it is squeezed by the second target object, a method, an electronic device, and a storage medium that can predict the occurrence time of the target behavior more accurately are realized.
Embodiments of the present application will be described in detail below with reference to the accompanying drawings.
Referring to fig. 1, a method for predicting occurrence time of a behavior according to an embodiment of the present application is applied to an electronic device, where the electronic device includes an ultrasonic device and a processing unit, and the ultrasonic device is connected to the processing unit, and the method includes:
step S110: triggering the ultrasonic device to transmit an ultrasonic signal to the first target object.
By way of example, the ultrasound device may be a transducer array including a plurality of transducers through which ultrasound signals may be transmitted to the first target.
Step S120: and calculating the deformation amplitude of the first target object generated by the extrusion of the second target object based on the ultrasonic reflection signal received by the ultrasonic device.
As a mode, after the ultrasonic signal reaches the first target object, the first target object may reflect the ultrasonic signal, and after the transducer receives the ultrasonic reflected signal, the processing unit may calculate, according to the ultrasonic reflected signal received by the transducer, a deformation amplitude of the first target object generated by the extrusion of the second target object.
Optionally, the first object may be a bladder, the second object may be a rectum, a spine is present on one side of the rectum, and the other side of the rectum is in close proximity to the bladder, so that when a certain amount of excrement is accumulated in the rectum, the rectum bulges towards the bladder, and one side of the bladder is also squeezed and displaced forwards. Therefore, the displacement of the extruded bladder can be measured to predict the amount of excrement accumulated in the rectum, and the defecation time can be predicted.
Illustratively, the transducer transmits ultrasonic signals toward the bladder, and ultrasonic reflection signals are generated when the signals reach the two sides of the bladder. The processing unit can calculate a time difference from the reception times of the successive ultrasonic reflection signals and then, using the propagation speed of ultrasound in the human body, calculate the deformation amplitude produced when the bladder is squeezed by the rectum.
Step S130: and predicting the occurrence time of the target behavior corresponding to the second target object based on the deformation amplitude.
Illustratively, the defecation occurrence time can be predicted by calculating the deformation amplitude generated after the bladder is extruded by the rectum.
According to the method for predicting the behavior occurrence time, firstly, an ultrasonic device transmits an ultrasonic signal to a first target object, a processing unit calculates the deformation amplitude of the first target object generated by extrusion of a second target object based on an ultrasonic reflection signal received by the ultrasonic device, and the occurrence time of the target behavior corresponding to the second target object is predicted based on the deformation amplitude. Therefore, the deformation amplitude generated after the first target object is extruded by the second target object is measured through the ultrasonic ranging technology to predict the target behavior occurrence time corresponding to the second target object, and the target behavior occurrence time can be predicted more accurately.
Referring to fig. 2, a method for predicting occurrence time of a behavior according to an embodiment of the present application is applied to an electronic device, where the electronic device includes an ultrasonic device and a processing unit, the ultrasonic device includes a plurality of transducers, a transmission control unit, and a reception processing unit, and the plurality of transducers are respectively connected to the transmission control unit and the reception processing unit, and the method includes:
step S210: and triggering the transmitting control unit to control the plurality of transducers to transmit the ultrasonic signals to the first target object in a time-delay transmitting mode, so that the plurality of ultrasonic signals transmitted by the plurality of transducers reach the first target object simultaneously.
The arrangement of the transducers may be linear, matrix, or quincunx (plum-blossom shaped), as shown in fig. 3 (each dot in fig. 3 represents one transducer). All three arrangements, and modes derived from them, improve the signal-to-noise ratio; the quincunx arrangement outperforms the other two because its sound field focuses better spatially, and, taking cost into account, a quincunx array of six transducers already achieves a good signal-to-noise improvement.
Furthermore, the emission control unit controls the plurality of transducers to emit the ultrasonic signals to the first target object in a delayed-emission mode, i.e. each transducer emits with a different delay so that the ultrasonic signals emitted by the plurality of transducers reach the first target object simultaneously. The principle of delayed focusing emission is shown in fig. 4: the emission control unit delays the emission of each transducer so that the transmitted ultrasonic signals arrive at the first target object at the same time; the energy is therefore focused at the first target object, which substantially improves the signal-to-noise ratio of the system.
The delay time parameters are calculated from the distances between the transducers and the first target object, the relative positions of the transducers, and the propagation velocity C of the ultrasonic wave. For example, as shown in fig. 5, there are three transducers whose coordinates in the constructed coordinate system are Z1(0, y1), Z2(0, -y1), and Z3(0, 0), and the first target object is assumed to be at position (x, 0) on the x-axis, with the transducer Z3 at (0, 0) taken as the reference. Since the coordinates of the first target object and of the three transducers are known, the distances d1 and d2 from transducers Z1 and Z2 to the object can be calculated by the distance formula between two points, and the propagation velocity C of the ultrasonic wave is known; taking the coordinates of transducer Z3 as the reference point, the delay parameters for transducers Z1 and Z2 are (d1 - x)/C and (d2 - x)/C, respectively.
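As a concrete illustration of the delay calculation above, the following Python sketch computes per-element firing delays for the geometry of fig. 5 so that all pulses arrive at the focal point simultaneously. The element coordinates and focal position are assumed example values, and the convention of firing the farthest element first is a standard choice consistent with the path differences (d1 - x)/C and (d2 - x)/C given in the text.

```python
import math

C = 1540.0  # assumed propagation speed of ultrasound in tissue, m/s

def focus_delays(elements, focus, c=C):
    """Per-element firing delays (s) for delayed-focus transmission.

    elements: list of (x, y) transducer coordinates in metres
    focus:    (x, y) position of the first target object
    The element farthest from the focus fires first (zero delay); every other
    element is delayed by its path-length difference divided by c, so all
    pulses reach the focus at the same time.
    """
    dists = [math.dist(e, focus) for e in elements]
    d_max = max(dists)
    return [(d_max - d) / c for d in dists]

# Geometry of fig. 5: Z1(0, y1), Z2(0, -y1), Z3(0, 0), target at (x, 0).
# y1 and x below are illustrative values only.
y1, x = 0.02, 0.08
print(focus_delays([(0.0, y1), (0.0, -y1), (0.0, 0.0)], (x, 0.0)))
# Z1 and Z2 fire first; Z3 is delayed by (d1 - x)/C relative to them.
```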
Step S220: calculating an echo time difference based on the received first echo signal and second echo signal.
As one mode, the ultrasonic reflection signal includes a first echo signal and a second echo signal, where the first echo signal is a signal reflected by a side of the first target close to the electronic device, and the second echo signal is a signal reflected by a side of the first target away from the electronic device.
Optionally, the processing unit calculates an echo time difference according to the first echo signal and the second echo signal, and further calculates a deformation amplitude of the first target object generated by the extrusion of the second target object according to the echo time difference and the propagation speed of the ultrasonic wave.
For example, the first target object may be a bladder and the second target object may be a rectum. As shown in fig. 6, fig. 6 is a model for measuring, by ultrasonic ranging, the deformation amplitude produced after the bladder is squeezed by the rectum. An ultrasonic signal is transmitted toward positions A and B on the two sides of the bladder, and an echo is generated at each interface because of the difference in acoustic impedance across it; the receiving processing unit receives the two echoes RA and RB, and the echo time difference Δt can then be obtained from the reception times of the two echoes.
Step S230: and calculating the deformation amplitude of the first target object generated by the extrusion of a second target object based on the echo time difference.
As one mode, the processing unit may calculate the deformation amplitude of the first target object produced when it is squeezed by the second target object, based on the echo time difference obtained above and the propagation speed of the ultrasonic wave in the human body.
For example, as shown in fig. 6, with the propagation velocity of the ultrasonic wave known (approximately 1540 m/s in the human body) and the echo time difference Δt calculated, the distance d1 between the two sides of the bladder can be obtained. When excrement keeps accumulating in the rectum, the rectum begins to press against the bladder and one side of the bladder is squeezed; transmitting ultrasonic signals to the two sides of the bladder again, the interval between the two echoes is shortened, and the new distance d2 is obtained from the propagation speed in the same way. Once d1 and d2 are obtained, the deformation amplitude of the bladder after rectal compression is Δd = d1 - d2.
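The following sketch makes the calculation above concrete, under the assumption that the far-side echo traverses the bladder width twice (out and back), so the width is c·Δt/2; the numeric inputs are illustrative only and are not values from the patent.

```python
C_TISSUE = 1540.0  # assumed ultrasound propagation speed in the human body, m/s

def bladder_width(delta_t, c=C_TISSUE):
    """Distance between the two bladder walls from the A/B echo time difference.

    The far-side echo travels the bladder width twice more than the near-side
    echo (across and back), hence the factor of two.
    """
    return c * delta_t / 2.0

def deformation_amplitude(delta_t_before, delta_t_after, c=C_TISSUE):
    """Deformation amplitude (delta d = d1 - d2) after rectal compression."""
    return bladder_width(delta_t_before, c) - bladder_width(delta_t_after, c)

# Illustrative numbers: a 65 microsecond echo gap shrinking to 52 microseconds
# gives a deformation amplitude of roughly 1 cm.
print(deformation_amplitude(65e-6, 52e-6))
```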
Step S240: and predicting the occurrence time of the target behavior corresponding to the second target object based on the deformation amplitude.
As a mode, with the deformation amplitude of the first target object produced by the squeezing of the second target object obtained as above, the occurrence time of the target behavior corresponding to the second target object can be predicted through the linear relationship between that deformation amplitude and the state of the second target object.
Illustratively, since the deformation amplitude Δd produced when the bladder is squeezed by the rectum has a linear relationship with the accumulated amount M of excrement in the rectum, the linear relationship M = kΔd + b between the deformation amplitude after the bladder is squeezed and the accumulated amount of excrement can be obtained by linear fitting. The defecation time can then be predicted by setting a threshold on the accumulated amount of excrement derived from this linear relationship. For example, a relation table between the accumulated amount of excrement M and the defecation time is established by calibration (for example, when the accumulated amount reaches M, the predicted defecation time is 10 minutes later), the defecation time is determined by table look-up, a threshold M can also be set, and when the defecation time is about to arrive the prediction result is sent out through the wireless unit. Predicting the defecation time through graded threshold judgment provides a better user experience.
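A minimal sketch of the linear fit and table look-up just described is shown below; the calibration pairs, table entries, and threshold values are assumed placeholders rather than data from the patent.

```python
import numpy as np

# Calibration pairs: deformation amplitude (m) vs. accumulated excrement M (ml).
calib_dd = np.array([0.002, 0.005, 0.008, 0.012])
calib_M = np.array([80.0, 160.0, 240.0, 330.0])
k, b = np.polyfit(calib_dd, calib_M, 1)  # linear fit: M = k * delta_d + b

def accumulated_amount(delta_d):
    """Estimate the accumulated excrement amount from the deformation amplitude."""
    return k * delta_d + b

# Relation table between accumulated amount and predicted minutes until defecation,
# listed from the largest threshold down.
relation_table = [(300.0, 5), (240.0, 10), (160.0, 15), (80.0, 20)]

def predict_minutes(delta_d):
    m = accumulated_amount(delta_d)
    for threshold, minutes in relation_table:
        if m >= threshold:
            return minutes
    return None  # below the lowest threshold, no prediction is issued yet

print(predict_minutes(0.009))
```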
According to the method for predicting the behavior occurrence time, the transmission control unit controls the multiple transducers to transmit ultrasonic signals to the first target object in a delayed transmission mode so that the multiple ultrasonic signals transmitted by the multiple transducers can simultaneously reach the first target object, the processing unit calculates echo time differences based on the received first echo signals and the received second echo signals, the processing unit calculates deformation amplitudes of the first target object generated by extrusion of the second target object based on the echo time differences, and the occurrence time of the target behavior corresponding to the second target object is predicted based on the deformation amplitudes. Therefore, the deformation amplitude generated after the first target object is extruded by the second target object is measured through the ultrasonic ranging technology to predict the target behavior occurrence time corresponding to the second target object, and the target behavior occurrence time can be predicted more accurately.
Referring to fig. 7, a method for predicting occurrence time of a behavior according to an embodiment of the present application is applied to an electronic device, where the electronic device includes an ultrasonic device and a processing unit, the ultrasonic device includes a transmission control unit, a reception processing unit, and a plurality of transducers respectively connected to the transmission control unit and the reception processing unit, and the method includes:
step S310: and triggering the transmitting control unit to control the plurality of transducers to transmit the ultrasonic signals to the first target object in a time-delay transmitting mode, so that the plurality of ultrasonic signals transmitted by the plurality of transducers reach the first target object simultaneously.
Step S320: and acquiring first reflection signals reflected by the first target object and received by the plurality of transducers respectively to obtain a plurality of first reflection signals, and acquiring second reflection signals reflected by the first target object and received by the plurality of transducers respectively to obtain a plurality of second reflection signals.
Step S330: and performing beam synthesis on the plurality of first reflection signals to obtain first echo signals, and performing beam synthesis on the plurality of second reflection signals to obtain second echo signals.
It should be noted that, in the process of transmitting and receiving ultrasonic signals, the signal-to-noise ratio of the ultrasonic signals can be improved by controlling the method of ultrasonic transmitting and receiving beam forming.
As one mode, the step of performing beam synthesis on the plurality of first reflection signals to obtain the first echo signal and performing beam synthesis on the plurality of second reflection signals to obtain the second echo signal includes: the processing unit beam-forms the plurality of first reflection signals by weighting to obtain the first echo signal, wherein the echo path lengths of the plurality of first reflection signals differ and, in the weighting, the longer the echo path corresponding to a first reflection signal, the smaller its weighting coefficient; the processing unit likewise beam-forms the plurality of second reflection signals by weighting to obtain the second echo signal, wherein the echo path lengths of the plurality of second reflection signals differ and, in the weighting, the longer the echo path corresponding to a second reflection signal, the smaller its weighting coefficient.
Furthermore, the ultrasonic reflection signals are received by the plurality of transducers, and the processing unit processes the echo signals by beam forming. The specific operation is as follows: assume the echo signals received by the plurality of transducers are represented by the matrix [R1, R2, R3, ..., Ri]; the echo signals received by the individual transducers are weighted and summed to form a beam, and the beam-formed echo signal can be expressed as R = ω1R1 + ω2R2 + ω3R3 + ... + ωiRi, where the weighting coefficients effectively form a window function that enhances the echo signal of the main path and suppresses the echo signals of the bypass paths; for example, the weighting coefficient of the echo signal from a transducer near the first target object is large, and that from a transducer far from the first target object is small. A schematic diagram of the ultrasonic echo signal beam forming is shown in fig. 8. In this way the signal-to-noise ratio of the echo signal is improved considerably, providing a reliable original signal for the subsequent echo signal processing.
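The weighted sum R = ω1R1 + ... + ωiRi described above can be sketched as follows; the inverse-path-length weighting is one possible monotone window consistent with the text (longer path, smaller weight), and the signals here are synthetic placeholders.

```python
import numpy as np

def beamform(echoes, path_lengths):
    """Weighted receive beam forming: R = sum over i of w_i * R_i.

    echoes:       array of shape (n_transducers, n_samples), one echo per transducer
    path_lengths: echo path length of each transducer; a longer path gets a smaller weight
    """
    path_lengths = np.asarray(path_lengths, dtype=float)
    weights = 1.0 / path_lengths  # one possible window: weight falls with path length
    weights /= weights.sum()      # normalise the window function
    return weights @ np.asarray(echoes)

# Synthetic example: three transducers, 1000 samples each.
rng = np.random.default_rng(0)
echoes = rng.standard_normal((3, 1000))
combined = beamform(echoes, path_lengths=[0.10, 0.11, 0.13])
```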
Step S340: and acquiring envelope information of the first echo signal, and obtaining a characteristic point corresponding to the first echo signal based on the envelope information of the first echo signal.
As one mode, envelope information is extracted from the first echo signal obtained above; the envelope can be obtained by the Hilbert transform. The envelope signal is then obtained from the envelope information, and a feature point of the envelope signal is extracted, generally the maximum peak Pmax of the envelope signal.
Step S350: and acquiring envelope information of the second echo signal, and obtaining a characteristic point corresponding to the second echo signal based on the envelope information of the first echo signal.
As one mode, envelope information is extracted from the second echo signal obtained above; the envelope can be obtained by the Hilbert transform. The envelope signal is then obtained from the envelope information, and a feature point of the envelope signal is extracted, generally the maximum peak Pmax of the envelope signal.
Step S360: and taking the difference value of the time corresponding to the characteristic point corresponding to the first echo signal and the time corresponding to the characteristic point corresponding to the second echo signal as the echo time difference.
As one mode, the processing unit records the times corresponding to the feature points of the first echo signal and the second echo signal, and takes the difference between the times corresponding to the two peak points as the echo time difference. Optionally, the time of an echo signal can be obtained from the number of sampling points and the sampling rate of the ADC.
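Steps S340 to S360 can be sketched together as follows: the envelope is obtained with the Hilbert transform, its peak is taken as the feature point, and the sample-index difference is converted into the echo time difference using the ADC sampling rate. The gating windows, sampling rate, and variable names are assumptions for illustration.

```python
import numpy as np
from scipy.signal import hilbert

FS = 40e6  # assumed ADC sampling rate, Hz

def feature_time(record, gate, fs=FS):
    """Time (s, referred to the start of the record) of the envelope peak inside a gate.

    record: the full digitised receive trace
    gate:   (start, stop) sample indices bracketing one echo
    """
    start, stop = gate
    envelope = np.abs(hilbert(record[start:stop]))  # Hilbert-transform envelope
    return (start + int(np.argmax(envelope))) / fs  # peak Pmax as the feature point

def echo_time_difference(record, near_gate, far_gate, fs=FS):
    """Echo time difference between the far-side and near-side feature points."""
    return feature_time(record, far_gate, fs) - feature_time(record, near_gate, fs)
```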
Step S370: and calculating the deformation amplitude of the first target object generated by the extrusion of a second target object based on the echo time difference.
As one mode, the deformation amplitude of the first target object generated by the extrusion of the second target object can be calculated according to the echo time difference obtained in the above mode and the propagation speed of the ultrasonic signal in the human body.
Step S380: and predicting the occurrence time of the target behavior corresponding to the second target object based on the deformation amplitude.
In one approach, the first target may be the bladder and the second target may be the rectum.
Alternatively, as shown in fig. 9, when the bladder is not being compressed by the rectum, the plurality of transducers receive echo signals to obtain the distance between the two sides of the uncompressed bladder, as shown in fig. 9(a); when the bladder is squeezed by the rectum, the plurality of transducers receive the echo signals again to obtain the distance between the two sides of the squeezed bladder, as shown in fig. 9(b). The displacement d produced after the bladder is squeezed is obtained by taking the difference between the two distances; since the displacement d has a linear relationship with the amount of excrement accumulated in the rectum, the accumulated amount can be inferred and the defecation time finally predicted.
According to the method for predicting the behavior occurrence time, the emission control unit controls the multiple transducers to emit ultrasonic signals to the first target object in a time delay emission mode, the processing unit obtains the reflection signals reflected by the first target object and received by the multiple transducers, carries out beam forming on the multiple reflection signals to obtain echo signals, obtains envelope information of the echo signals, extracts feature points corresponding to the envelope information of the echo signals, uses the difference value of the time corresponding to the feature points corresponding to the first echo signals and the time corresponding to the feature points corresponding to the second echo signals as the echo time difference, the processing unit calculates the deformation amplitude of the first target object generated by the extrusion of the second target object based on the echo time difference, and predicts the occurrence time of the target behavior corresponding to the second target object based on the deformation amplitude. The stability and reliability of the distance measurement are improved by the ultrasonic sensor array and the method for controlling the formation of the transmitting and receiving beams of the ultrasonic waves.
Referring to fig. 10, a method for predicting occurrence time of a behavior according to an embodiment of the present application is applied to an electronic device, where the electronic device includes an ultrasonic device and a processing unit, and the ultrasonic device is connected to the processing unit, and the method includes:
step S410: triggering the ultrasonic device to transmit an ultrasonic signal to the first target object.
Step S420: and calculating the deformation amplitude of the first target object generated by the extrusion of the second target object based on the ultrasonic reflection signal received by the ultrasonic device.
Step S430: a plurality of configured amplitude thresholds is obtained.
As a mode, a plurality of amplitude threshold values are set according to the deformation amplitude of the first target object generated by the extrusion of the second target object, and different deformation amplitudes correspond to different amplitude threshold values.
Illustratively, a plurality of amplitude thresholds and corresponding rectal fecal matter metrics are set according to the amplitude of deformation of the bladder caused by rectal compression, as shown in the following table:
| Deformation amplitude due to squeezing | Accumulated amount of excrement in rectum |
| --- | --- |
| d1 | M1 |
| d2 | M2 |
| d3 | M4 |
| d4 | M5 |
It can be understood that when the deformation amplitude of the bladder due to rectal extrusion reaches d1, the accumulation amount of excrement in the rectum is judged to be M1.
Step S440: and predicting the occurrence time of the target behavior corresponding to the second target object according to the amplitude threshold value reached by the deformation amplitude.
As one mode, the occurrence time of the target behavior corresponding to the second target object is predicted based on the amplitude threshold that the deformation amplitude has reached.
Furthermore, a plurality of amplitude threshold values are set according to the deformation amplitude of the bladder caused by rectal extrusion, and meanwhile, the occurrence time of defecation behaviors is predicted according to different setting of the amplitude threshold values, as shown in the following table:
| Deformation amplitude due to squeezing | Accumulated amount of excrement in rectum | Defecation time prediction |
| --- | --- | --- |
| d1 | M1 | Defecation occurs after 20 minutes |
| d2 | M2 | Defecation occurs after 15 minutes |
| d3 | M4 | Defecation occurs after 10 minutes |
| d4 | M5 | Defecation occurs after 5 minutes |
Optionally, the occurrence time of the target behavior corresponding to the second target object may be determined according to the amplitude threshold reached by the deformation amplitude of the first target object squeezed by the second target object. Illustratively, if the deformation amplitude of the bladder due to rectal compression reaches d1, it is predicted that defecation occurs after 20 minutes.
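As a minimal sketch of this graded-threshold judgment, the snippet below maps a measured deformation amplitude to the highest configured threshold it reaches and returns the corresponding prediction; the threshold values and times are placeholders, not data from the patent.

```python
# (amplitude threshold in metres, predicted minutes until defecation), largest first
thresholds = [
    (0.012, 5),   # d4 -> defecation occurs after 5 minutes
    (0.009, 10),  # d3 -> 10 minutes
    (0.006, 15),  # d2 -> 15 minutes
    (0.003, 20),  # d1 -> 20 minutes
]

def predict_from_thresholds(deformation_amplitude):
    """Return the predicted minutes for the highest threshold that has been reached."""
    for amplitude, minutes in thresholds:
        if deformation_amplitude >= amplitude:
            return minutes
    return None  # no configured threshold reached yet

print(predict_from_thresholds(0.007))  # -> 15
```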
Furthermore, the defecation time is predicted by setting a threshold on the accumulated amount of excrement: a relation table between the accumulated amount M and the defecation time is established by calibration (for example, when the accumulated amount reaches M, the predicted defecation time is 10 minutes later), the defecation time is determined by table look-up, a threshold M can be set, and when the defecation time is about to arrive the prediction is sent out through the wireless unit. In this way the user can be informed to make the relevant preparations for defecation; meanwhile the terminal can access a public data interface, recommend a nearby public restroom location, and plan the optimal route for display in the terminal. Through linkage with the terminal device, the defecation time prediction result can be put to good use, such as issuing a defecation alarm, recommending nearby toilet locations, and matching an optimal route.
The method for predicting the behavior occurrence time is applied to electronic equipment, the electronic equipment comprises an ultrasonic device and a processing unit, the ultrasonic device transmits ultrasonic signals to a first target object, the processing unit calculates deformation amplitude of the first target object generated by extrusion of a second target object based on ultrasonic reflection signals received by the ultrasonic device, a plurality of configured amplitude threshold values are obtained, and the occurrence time of a target behavior corresponding to the second target object is predicted according to the amplitude threshold values reached by the deformation amplitude. The pre-judgment result is sent in a threshold value grading mode, and a good prompt effect can be achieved.
Referring to fig. 11, a method for predicting occurrence time of a behavior according to an embodiment of the present application is applied to an electronic device, where the electronic device includes an ultrasonic device and a processing unit, and the ultrasonic device is connected to the processing unit, and the method includes:
step S510: triggering the ultrasonic device to transmit an ultrasonic signal to the first target object.
Step S520: and calculating the deformation amplitude of the first target object generated by the extrusion of the second target object based on the ultrasonic reflection signal received by the ultrasonic device.
Step S530: a plurality of configured amplitude thresholds is obtained.
Step S540: if the current amplitude threshold value is a first threshold value in the multiple amplitude threshold values, and the deformation amplitude exceeds the first threshold value, predicting the occurrence time of the target behavior corresponding to the second target object to be a first occurrence time, and triggering prompt information to prompt whether the occurrence time of the target behavior is known or not.
As a mode, the processing unit obtains a deformation amplitude of the current first target object generated by the extrusion of the second target object, and if the deformation amplitude of the first target object generated by the extrusion of the second target object exceeds a set first amplitude threshold, the occurrence time of the target behavior corresponding to the second target object is determined to be the occurrence time of the target behavior corresponding to the set first amplitude threshold. And meanwhile, sending prediction information to the equipment terminal to prompt the occurrence time of the target behavior of the user.
Step S550: if no feedback information is received that characterizes the occurrence time of the known target behavior, modifying the current amplitude threshold to be a second threshold of the plurality of amplitude thresholds, wherein the second threshold is greater than the first threshold.
As one mode, if no feedback information indicating that the user knows the occurrence time of the target behavior is received, the processing unit modifies the current amplitude threshold to be the second threshold of the configured plurality of amplitude thresholds.
Alternatively, when feedback information indicating that the user knows the occurrence time of the target behavior is received, the device waits for a predetermined time and then performs the next prediction of the occurrence time of the target behavior.
Step S560: and after waiting for the first specified time, acquiring the deformation amplitude after waiting for the first specified time again, and if the deformation amplitude acquired again exceeds the second threshold, predicting the occurrence time of the target behavior corresponding to the second target object as second occurrence time.
As one mode, after waiting the specified time, the processing unit again obtains the deformation amplitude currently produced by the first target object being squeezed by the second target object, and judges whether this deformation amplitude exceeds the second threshold of the plurality of amplitude thresholds; if it exceeds the second threshold, the occurrence time of the target behavior is determined to be the occurrence time corresponding to the second threshold.
Step S570: and if the deformation amplitude exceeds the maximum amplitude threshold value in the plurality of amplitude threshold values, setting the current threshold value as the first threshold value after the occurrence time of the target behavior corresponding to the second target object is predicted and sent for a second designated time.
As one mode, if the obtained deformation amplitude currently produced by the first target object being squeezed by the second target object exceeds the maximum amplitude threshold of the configured plurality of amplitude thresholds, the processing unit continues sending the predicted occurrence time of the target behavior corresponding to the second target object to the user for the second designated time, then sets the current threshold back to the first threshold of the plurality of amplitude thresholds and restarts the prediction process.
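The escalation logic of steps S540 to S570 can be sketched as the following control loop; the waiting times and the measure/notify/acknowledged hooks are assumptions used only to illustrate the flow, not the patent's concrete implementation.

```python
import time

def prediction_loop(measure, notify, acknowledged, thresholds, occurrence_times,
                    first_wait_s=300.0, second_send_s=60.0, max_cycles=1000):
    """Graded-threshold prediction loop.

    measure():      returns the current deformation amplitude
    notify(t):      sends a predicted occurrence time t to the user terminal
    acknowledged(): True if feedback that the user knows the time was received
    thresholds / occurrence_times: ascending amplitude thresholds and the
                    occurrence time predicted for each of them
    """
    level = 0  # start from the first threshold
    for _ in range(max_cycles):
        amplitude = measure()
        if amplitude >= thresholds[-1]:
            # Maximum threshold exceeded: keep sending the prediction for the
            # second designated time, then restart from the first threshold.
            notify(occurrence_times[-1])
            time.sleep(second_send_s)
            level = 0
            continue
        if amplitude >= thresholds[level]:
            notify(occurrence_times[level])  # first (or escalated) occurrence time
            if not acknowledged():
                # No feedback: raise the current threshold to the next level.
                level = min(level + 1, len(thresholds) - 1)
        time.sleep(first_wait_s)  # wait the first specified time before re-measuring
```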
The method for predicting the behavior occurrence time is applied to electronic equipment, the electronic equipment comprises an ultrasonic device and a processing unit, the ultrasonic device transmits ultrasonic signals to a first target object, the processing unit calculates deformation amplitude of the first target object generated by extrusion of a second target object based on ultrasonic reflection signals received by the ultrasonic device, a plurality of configured amplitude threshold values are obtained, and the occurrence time of a target behavior corresponding to the second target object is predicted according to the amplitude threshold values reached by the deformation amplitude. The pre-judgment result is sent in a threshold value grading mode, and a good prompt effect can be achieved.
Referring to fig. 12, an electronic device 600 for predicting a behavior occurrence time according to an embodiment of the present application includes:
the processing unit 620 is configured to trigger the ultrasonic device 610 to transmit an ultrasonic signal to the first target object and receive an ultrasonic reflection signal.
The processing unit 620 is configured to calculate a deformation amplitude of the first target object generated by the extrusion of the second target object based on the ultrasonic reflection signal received by the ultrasonic device; and predicting the occurrence time of the target behavior corresponding to the second target object based on the deformation amplitude.
The processing unit 620 is further configured to calculate an echo time difference based on the received first echo signal and the second echo signal; and calculating the deformation amplitude of the first target object generated by the extrusion of a second target object based on the echo time difference.
The processing unit 620 is further configured to obtain first reflection signals that are reflected by the first target object and received by the multiple transducers respectively, so as to obtain multiple first reflection signals, and obtain second reflection signals that are reflected by the first target object and received by the multiple transducers respectively, so as to obtain multiple second reflection signals; performing beam forming on the plurality of first reflection signals to obtain first echo signals, and performing beam forming on the plurality of second reflection signals to obtain second echo signals; acquiring envelope information of the first echo signal, and obtaining a characteristic point corresponding to the first echo signal based on the envelope information of the first echo signal; acquiring envelope information of the second echo signal, and obtaining a characteristic point corresponding to the second echo signal based on the envelope information of the second echo signal; and taking the difference value of the time corresponding to the characteristic point corresponding to the first echo signal and the time corresponding to the characteristic point corresponding to the second echo signal as the echo time difference.
The processing unit 620 is further configured to perform beam forming on the multiple first reflection signals through a weighting manner to obtain first echo signals, where echo path lengths of the multiple first reflection signals are different, and in the weighting manner, weighting coefficients of the first reflection signals that are longer corresponding to the echo paths are smaller; and performing beam synthesis on the plurality of second reflection signals through a weighting mode to obtain second echo signals, wherein the lengths of echo paths of the plurality of second reflection signals are different, and in the weighting mode, the weighting coefficient of the second reflection signal is smaller when the corresponding echo path is longer.
The processing unit 620 is further configured to obtain a plurality of configured amplitude thresholds; and predicting the occurrence time of the target behavior corresponding to the second target object according to the amplitude threshold value reached by the deformation amplitude.
As shown in fig. 13, the ultrasonic device 610 includes a transmission control unit 612, a reception processing unit 614, and a plurality of transducers 616 connected to the transmission control unit and the reception processing unit, respectively.
The emission control unit 612 is configured to control the plurality of transducers to emit the ultrasonic signals to the first target object in a time-delayed emission manner, so that the ultrasonic signals emitted by the plurality of transducers reach the first target object simultaneously. The ultrasonic emission control unit 612 mainly performs pulse generation and pulse timing control, while ultrasonic reception involves signal amplification, analog-to-digital conversion, and the like.
The plurality of transducers 616 are used to convert electric signals into ultrasonic waves for emission and to convert the returned ultrasonic waves into processable electric signals; they are typically implemented as packaged elements using piezoelectric ceramics, which exhibit the piezoelectric effect, as the base material.
Referring to fig. 14, an electronic device 700 for predicting a behavior occurrence time according to an embodiment of the present application includes:
and a power management unit 710 for power input and voltage conversion, wherein the voltage conversion can be implemented by a voltage conversion chip such as LDO, DCDC, and the like.
And the wireless communication unit 720 is used for uploading data to a cloud end, and the general implementation mode is through wireless protocols such as low-power-consumption Bluetooth, ZigBee, wifi and the like.
The micro-control processor 730 is mainly used for connecting the wireless communication unit 720, controlling the transmission of ultrasonic waves and processing received signals at the same time, and the implementation mode can be realized by a micro-processing controller based on an ARM-M series core, a field programmable logic device (FPGA) and the like.
And the emission control unit 740 is used for driving the transducers to emit ultrasonic waves; it can be built from discrete electronic components or implemented with a dedicated integrated ultrasonic transmit/receive control IC.
And the receiving control unit 750 is used for the transducers to receive ultrasonic waves; it can likewise be built from discrete electronic components or implemented with a dedicated integrated ultrasonic transmit/receive control IC.
It should be noted that the device embodiment and the method embodiment in the present application correspond to each other, and specific principles in the device embodiment may refer to the contents in the method embodiment, which is not described herein again.
The processing unit of an electronic device provided by the present application will be described with reference to fig. 15.
An embodiment of the present invention provides a processing unit of an electronic device for predicting occurrence time of a behavior, where the processing unit of the electronic device for predicting occurrence time of a behavior includes a processor and a memory, where the memory stores at least one instruction, at least one program, a code set, or an instruction set, and the at least one instruction, the at least one program, the code set, or the instruction set is loaded and executed by the processor to implement the method for predicting occurrence time of a behavior provided in the above method embodiment.
The memory may be used to store software programs and modules, and the processor executes various functional applications and performs data processing by running the software programs and modules stored in the memory. The memory may mainly comprise a program storage area and a data storage area, wherein the program storage area may store an operating system, application programs needed for the functions, and the like, and the data storage area may store data created according to the use of the apparatus, and the like. Further, the memory may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid-state storage device. Accordingly, the memory may also include a memory controller to provide the processor with access to the memory.
Fig. 15 is a block diagram of a hardware structure of a processing unit of an electronic device for predicting a behavior occurrence time according to an embodiment of the present invention. As shown in fig. 15, the processing unit 1100 may vary considerably with configuration or performance, and may include one or more processors (CPUs) 1110 (the processors 1110 may include, but are not limited to, processing devices such as a microprocessor MCU or a programmable logic device FPGA), a memory 1130 for storing data, and one or more storage media 1120 (e.g., one or more mass storage devices) for storing applications 1123 or data 1122. The memory 1130 and the storage medium 1120 may be transient storage or persistent storage. The program stored in the storage medium 1120 may include one or more modules, each of which may include a series of instructions operating on the electronic device. Still further, the processor 1110 may be arranged to communicate with the storage medium 1120 and execute, on the processing unit 1100, the series of instruction operations in the storage medium 1120. The processing unit 1100 may also include one or more power supplies 1160, one or more wired or wireless network interfaces 1150, one or more input-output interfaces 1140, and/or one or more operating systems 1121, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, and so on.
The input output interface 1140 may be used to receive or transmit data via a network. Specific examples of the network described above may include a wireless network provided by a communications provider of the processing unit 1100. In one example, i/o interface 1140 includes a network adapter (NIC) that may be coupled to other network devices via a base station to communicate with the internet. In one example, the input/output interface 1140 can be a Radio Frequency (RF) module, which is used for communicating with the internet in a wireless manner.
It will be understood by those skilled in the art that the structure shown in fig. 15 is only an illustration, and does not limit the structure of the processing unit of the electronic device for predicting the occurrence time of the behavior. For example, processing unit 1100 may also include more or fewer components than shown in FIG. 15, or have a different configuration than shown in FIG. 15.
The embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored; when executed by a processor, the computer program implements each process of the above embodiments of the method for predicting behavior occurrence time and can achieve the same technical effect, which is not repeated here to avoid repetition. The computer-readable storage medium may be a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
According to the method, the electronic device, the system and the storage medium for predicting the behavior occurrence time, firstly, the ultrasonic device transmits an ultrasonic signal to the first target object, the processing unit calculates the deformation amplitude of the first target object generated by the extrusion of the second target object based on the ultrasonic reflection signal received by the ultrasonic device, and predicts the occurrence time of the target behavior corresponding to the second target object based on the deformation amplitude. Therefore, the deformation amplitude generated after the first target object is extruded by the second target object is measured through the ultrasonic ranging technology to predict the target behavior occurrence time corresponding to the second target object, and the target behavior occurrence time can be predicted more accurately.
Finally, it should be noted that: the above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not necessarily depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.
Claims (14)
1. A method for predicting behavior occurrence time, applied to an electronic device, wherein the electronic device comprises an ultrasonic device and a processing unit, and the ultrasonic device is connected to the processing unit, the method comprising:
triggering the ultrasonic device to transmit an ultrasonic signal to a first target object;
calculating, based on an ultrasonic reflection signal received by the ultrasonic device, a deformation amplitude of the first target object caused by being pressed by a second target object; and
predicting, based on the deformation amplitude, an occurrence time of a target behavior corresponding to the second target object.
2. The method of claim 1, wherein the ultrasonic device comprises a transmit control unit, a receive processing unit, and a plurality of transducers respectively connected to the transmit control unit and the receive processing unit, and wherein the triggering the ultrasonic device to transmit an ultrasonic signal to the first target object comprises:
triggering the transmit control unit to control the plurality of transducers to transmit ultrasonic signals to the first target object in a delayed-transmission manner, so that the plurality of ultrasonic signals transmitted by the plurality of transducers reach the first target object simultaneously.
3. The method of claim 2, wherein the ultrasonic reflection signal comprises a first echo signal and a second echo signal, the first echo signal being a signal reflected from a side of the first target object close to the electronic device and the second echo signal being a signal reflected from a side of the first target object far from the electronic device, and wherein the calculating, based on the ultrasonic reflection signal received by the ultrasonic device, the deformation amplitude of the first target object caused by being pressed by the second target object comprises:
calculating an echo time difference based on the received first echo signal and second echo signal; and
calculating, based on the echo time difference, the deformation amplitude of the first target object caused by being pressed by the second target object.
4. The method of claim 3, wherein the calculating an echo time difference based on the received first echo signal and second echo signal comprises:
acquiring first reflection signals that are reflected by the first target object and respectively received by the plurality of transducers, to obtain a plurality of first reflection signals, and acquiring second reflection signals that are reflected by the first target object and respectively received by the plurality of transducers, to obtain a plurality of second reflection signals;
performing beam synthesis on the plurality of first reflection signals to obtain the first echo signal, and performing beam synthesis on the plurality of second reflection signals to obtain the second echo signal;
acquiring envelope information of the first echo signal, and obtaining a characteristic point corresponding to the first echo signal based on the envelope information of the first echo signal;
acquiring envelope information of the second echo signal, and obtaining a characteristic point corresponding to the second echo signal based on the envelope information of the second echo signal;
and taking a difference between a time corresponding to the characteristic point of the first echo signal and a time corresponding to the characteristic point of the second echo signal as the echo time difference.
5. The method of claim 4, wherein the performing beam synthesis on the plurality of first reflection signals to obtain the first echo signal comprises:
performing beam synthesis on the plurality of first reflection signals in a weighted manner to obtain the first echo signal, wherein echo path lengths of the plurality of first reflection signals are different, and in the weighted manner, a first reflection signal with a longer echo path is given a smaller weighting coefficient;
and wherein the performing beam synthesis on the plurality of second reflection signals to obtain the second echo signal comprises:
performing beam synthesis on the plurality of second reflection signals in a weighted manner to obtain the second echo signal, wherein echo path lengths of the plurality of second reflection signals are different, and in the weighted manner, a second reflection signal with a longer echo path is given a smaller weighting coefficient.
6. The method of claim 1, wherein the predicting the occurrence time of the target behavior corresponding to the second target object based on the deformation amplitude comprises:
acquiring a plurality of configured amplitude thresholds;
and predicting the occurrence time of the target behavior corresponding to the second target object according to the amplitude threshold reached by the deformation amplitude.
7. The method of claim 6, wherein the predicting the occurrence time of the target behavior corresponding to the second target object according to the amplitude threshold reached by the deformation amplitude comprises:
if a current amplitude threshold is a first threshold among the plurality of amplitude thresholds and the deformation amplitude exceeds the first threshold, predicting the occurrence time of the target behavior corresponding to the second target object as a first occurrence time, and triggering prompt information prompting whether the occurrence time of the target behavior is known;
if no feedback information indicating that the occurrence time of the target behavior is known is received, modifying the current amplitude threshold to a second threshold among the plurality of amplitude thresholds, the second threshold being greater than the first threshold;
and after waiting for a first specified duration, acquiring the deformation amplitude again, and if the re-acquired deformation amplitude exceeds the second threshold, predicting the occurrence time of the target behavior corresponding to the second target object as a second occurrence time.
8. The method of claim 7, further comprising:
and if the deformation amplitude exceeds a maximum amplitude threshold among the plurality of amplitude thresholds, predicting and sending the occurrence time of the target behavior corresponding to the second target object, and setting the current amplitude threshold back to the first threshold after a second specified duration.
9. The method of claim 8, wherein, after the triggering prompt information prompting whether the occurrence time of the target behavior is known, the method further comprises:
if feedback information indicating that the occurrence time of the target behavior is known is received, waiting for a third specified duration and then performing a next prediction operation on the occurrence time of the target behavior.
10. The method of any one of claims 1 to 9, wherein the first target object is a bladder and the second target object is a rectum.
11. An electronic device, comprising an ultrasonic device and a processing unit, wherein:
the processing unit is configured to trigger the ultrasonic device to transmit an ultrasonic signal to a first target object and to receive an ultrasonic reflection signal; and
the processing unit is further configured to calculate, based on the ultrasonic reflection signal received by the ultrasonic device, a deformation amplitude of the first target object caused by being pressed by a second target object, and to predict, based on the deformation amplitude, an occurrence time of a target behavior corresponding to the second target object.
12. The electronic device of claim 11, wherein the ultrasonic device comprises a transmit control unit, a receive processing unit, and a plurality of transducers respectively connected to the transmit control unit and the receive processing unit; and
the processing unit is configured to trigger the transmit control unit to control the plurality of transducers to transmit ultrasonic signals to the first target object in a delayed-transmission manner, so that the plurality of ultrasonic signals transmitted by the plurality of transducers reach the first target object simultaneously.
13. The electronic device of claim 11, wherein the processing unit is further configured to perform the method of any one of claims 3 to 10.
14. A computer-readable storage medium having program code stored thereon, wherein the program code, when executed by a processor, performs the method of any one of claims 1 to 10.
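The following illustrative sketches relate to claims 2, 4 to 5, and 6 to 9 above. Claims 2 and 12 describe firing the plurality of transducers with per-element delays so that all pulses reach the first target object simultaneously, i.e. transmit focusing. The sketch below computes such delays from assumed element positions and a chosen focus point; the array geometry, sound speed, and all names are illustrative assumptions, not the patent's specific implementation.

```python
import math

SOUND_SPEED_TISSUE_M_S = 1540.0  # assumed propagation speed in soft tissue


def transmit_delays(element_positions_m, focus_point_m):
    """Per-transducer firing delays so that every pulse arrives at the focus together.

    The farthest element fires first (zero delay); elements closer to the focus
    are delayed by the difference in travel time. This is a textbook
    transmit-focusing sketch under the stated assumptions.
    """
    distances = [math.dist(p, focus_point_m) for p in element_positions_m]
    t_max = max(distances) / SOUND_SPEED_TISSUE_M_S
    return [t_max - d / SOUND_SPEED_TISSUE_M_S for d in distances]


# Four elements spaced 5 mm apart on a line, focusing 60 mm ahead of the array centre.
elements = [(i * 0.005 - 0.0075, 0.0) for i in range(4)]
delays = transmit_delays(elements, (0.0, 0.06))
print([round(d * 1e9) for d in delays])  # per-element delays in nanoseconds
```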
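Claims 4 and 5 combine the per-transducer reflections into a single echo by weighted summation, with smaller weights for longer echo paths, and then locate a characteristic point on each echo's envelope to form the echo time difference. The sketch below uses inverse-path-length weights and takes the envelope peak (via a Hilbert transform) as the characteristic point; both of these choices, and all names, are illustrative assumptions rather than the patent's stated method.

```python
import numpy as np
from scipy.signal import hilbert


def weighted_beamform(reflections: np.ndarray, path_lengths_m: np.ndarray) -> np.ndarray:
    """Weighted beam synthesis: a longer echo path gets a smaller weight.

    `reflections` is an (n_transducers, n_samples) array of time-aligned echoes.
    Inverse-path-length weighting is one plausible choice; the claims only
    require that the weighting coefficient decreases with echo path length.
    """
    weights = 1.0 / path_lengths_m
    weights = weights / weights.sum()
    return weights @ reflections


def characteristic_time(echo: np.ndarray, sample_rate_hz: float) -> float:
    """Characteristic-point time taken as the peak of the echo envelope (assumption)."""
    envelope = np.abs(hilbert(echo))
    return int(np.argmax(envelope)) / sample_rate_hz


def echo_time_difference(first_echo: np.ndarray, second_echo: np.ndarray,
                         sample_rate_hz: float) -> float:
    """Delay between the far-side echo and the near-side echo characteristic points."""
    return characteristic_time(second_echo, sample_rate_hz) - characteristic_time(first_echo, sample_rate_hz)
```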
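Claims 6 to 9 predict the occurrence time from whichever configured amplitude threshold the deformation amplitude has reached, raise the current threshold when no feedback indicates the time is already known, and reset to the first threshold after a maximum-threshold prediction has been sent. The sketch below is one possible reading of that logic; the threshold values, predicted times, and all names are invented for illustration only.

```python
from dataclasses import dataclass


@dataclass
class BehaviorTimePredictor:
    """Illustrative multi-threshold predictor; all numeric values are made-up examples."""
    thresholds_m: tuple = (0.004, 0.008, 0.012)   # ascending deformation-amplitude thresholds (metres)
    predicted_minutes: tuple = (60, 30, 10)       # predicted occurrence time for each threshold
    current_index: int = 0                        # index of the current amplitude threshold

    def update(self, deformation_m: float, user_knows_time: bool = False):
        """Return a predicted occurrence time in minutes, or None if no threshold is reached."""
        if deformation_m >= self.thresholds_m[-1]:
            self.current_index = 0                # reset after a maximum-threshold prediction
            return self.predicted_minutes[-1]
        if deformation_m >= self.thresholds_m[self.current_index]:
            prediction = self.predicted_minutes[self.current_index]
            if not user_knows_time and self.current_index < len(self.thresholds_m) - 1:
                self.current_index += 1           # raise the threshold and wait for the next sample
            return prediction
        return None


predictor = BehaviorTimePredictor()
print(predictor.update(0.005))   # first threshold reached -> 60
print(predictor.update(0.009))   # second threshold reached -> 30
```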
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010167820.5A CN113384296B (en) | 2020-03-11 | 2020-03-11 | Method for predicting behavior occurrence time, electronic device and storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113384296A (en) | 2021-09-14 |
CN113384296B (en) | 2023-05-23 |
Family
ID=77615422
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010167820.5A Active CN113384296B (en) | 2020-03-11 | 2020-03-11 | Method for predicting behavior occurrence time, electronic device and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113384296B (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113759339A (en) * | 2021-11-10 | 2021-12-07 | 北京一径科技有限公司 | Echo signal processing method, device, equipment and storage medium |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP0271214A2 (en) * | 1986-11-13 | 1988-06-15 | National Aeronautics And Space Administration | Apparatus for measuring the volume of urine in the human bladder |
US20050215896A1 (en) * | 2002-08-09 | 2005-09-29 | Mcmorrow Gerald | Instantaneous ultrasonic echo measurement of bladder volume with a limited number of ultrasound beams |
CN1788685A (en) * | 2004-12-15 | 2006-06-21 | 深圳迈瑞生物医疗电子股份有限公司 | Receiving method and its device based on double beam and synthetic aperture |
US20070123778A1 (en) * | 2003-10-13 | 2007-05-31 | Volurine Israel Ltd. | Bladder measurement |
US20110098565A1 (en) * | 2007-08-27 | 2011-04-28 | Hiroshi Masuzawa | Ultrasound imaging device |
CN107106145A (en) * | 2014-10-03 | 2017-08-29 | 三W日本株式会社 | Defecation prediction meanss and defecation Forecasting Methodology |
US20170258386A1 (en) * | 2014-11-27 | 2017-09-14 | Umc Utrecht Holding B.V. | Wearable ultrasound device for signalling changes in a human or animal body |
CN107530052A (en) * | 2015-05-01 | 2018-01-02 | 三W日本株式会社 | Just estimating device and just method of estimating rate are measured |
CN109069115A (en) * | 2017-06-06 | 2018-12-21 | 深圳迈瑞生物医疗电子股份有限公司 | A kind of method, apparatus and system being imaged in ultrasonic scanning |
CN208492152U (en) * | 2017-12-29 | 2019-02-15 | 成都优途科技有限公司 | A kind of device improving B ultrasound imaging transverse resolving power |
Also Published As
Publication number | Publication date |
---|---|
CN113384296B (en) | 2023-05-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7346029B2 (en) | Pairing a wireless ultrasound probe with a mobile ultrasound system | |
JP4806813B2 (en) | System and method for measuring the height of storage contents | |
JP5575554B2 (en) | Ultrasonic diagnostic equipment | |
Abdelgawad et al. | Structural health monitoring: Internet of things application | |
WO2008060422A2 (en) | Transducer array imaging system | |
CN106580366A (en) | Wireless probe, ultrasonic imaging apparatus, and method for controlling the same | |
JP5812317B2 (en) | Ultrasonic measurement system | |
Horsley et al. | Piezoelectric micromachined ultrasonic transducers in consumer electronics: The next little thing? | |
US20100305450A1 (en) | Ultrasound diagnosis apparatus | |
CN113384296A (en) | Method for predicting behavior occurrence time, electronic device, and storage medium | |
US20100312119A1 (en) | Ultrasonic probe and ultrasonic imaging apparatus | |
JP5442215B2 (en) | Ultrasonic distance measurement system | |
CN105572672A (en) | Ultrasonic pulse-echo ranging device | |
CN1712988B (en) | Adaptive ultrasound imaging system | |
CN109901173A (en) | Ultrasonic ranging method, device and electronic equipment based on duty cycle adjustment | |
KR101654670B1 (en) | Ultrasound system for performing impedence matching | |
US11331083B2 (en) | Ultrasound diagnosis apparatus and method of operating the same | |
JP4974781B2 (en) | Ultrasonic diagnostic equipment | |
JP2003325508A (en) | Ultrasonic diagnostic apparatus | |
CN113574911B (en) | Earphone and wearing state detection method thereof | |
JP2024510429A (en) | Processing circuits, systems and methods for reducing power consumption of ultrasound imaging probes based on interlaced data acquisition and reconstruction algorithms | |
CN102793566B (en) | System and method for generating acoustic radiation force | |
CN218647148U (en) | Air coupling ultrasonic imaging device and robot | |
TW201939062A (en) | Electronic device, ultrasound ranging device and method | |
US20230270408A1 (en) | Ultrasound diagnostic apparatus and control method of ultrasound diagnostic apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||