
CN114334082B - Electronic device - Google Patents

Electronic device

Info

Publication number
CN114334082B
CN114334082B (application CN202111649638.4A)
Authority
CN
China
Prior art keywords
training
user
muscle
action
muscle strength
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111649638.4A
Other languages
Chinese (zh)
Other versions
CN114334082A (en)
Inventor
张正
杨志宝
杜国生
沈忠美
张常安
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Allin Technology Co ltd
Original Assignee
Beijing Ouying Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Ouying Information Technology Co ltd
Priority to CN202111649638.4A
Publication of CN114334082A
Application granted
Publication of CN114334082B
Legal status: Active
Anticipated expiration

Landscapes

  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

Embodiments of the present disclosure provide an electronic device. The electronic device includes: a processor; and a memory coupled with the processor, the memory having instructions stored therein that, when executed by the processor, cause the device to perform operations comprising: obtaining motion data of a first user while performing a muscle strength detection action, the muscle strength detection action comprising the first user, while in a particular pose, moving a joint part to move an action part associated with the joint part, and the motion data comprising position data of the action part and position data of the joint part; and determining a muscle force level of a muscle associated with the joint part based on the motion data. In this way, the muscle force level can be detected with an ordinary electronic device, without special equipment or professionals.

Description

Electronic device
Technical Field
Embodiments of the present disclosure relate generally to the field of smart medicine and, more particularly, to an electronic device for detecting muscle strength of a human body part.
Background
Today, to further promote the integration of the internet with medical care, integrated shared services are receiving much attention as a way to make medical services more convenient, intelligent, and user-friendly. Specifically, the goal is to unify online and offline care: while continuing to improve offline medical services, medical institutions make full use of information technologies such as the internet and big data to expand the space and content of their services, actively provide convenient and efficient online services for patients, and gradually enable home rehabilitation of patients through follow-up management and remote guidance.
Disclosure of Invention
Embodiments of the present disclosure provide a solution to detect a user's muscle strength level and monitor the user's training during a training session.
In a first aspect of the present disclosure, an electronic device is provided, comprising a processor and a memory coupled with the processor. The memory has instructions stored therein that, when executed by the processor, cause the device to perform operations. The operations include: acquiring motion data of a first user while performing a muscle strength detection action, wherein the muscle strength detection action comprises the first user, while in a specific pose, moving a joint part to drive an action part associated with the joint part to move, and the motion data comprises position data of the action part and position data of the joint part; and determining a muscle force level of a muscle associated with the joint part based on the motion data.
In a second aspect of the present disclosure, an electronic device is provided. The electronic device includes a processor and a memory coupled to the processor. The memory has instructions stored therein that, when executed by the processor, cause the device to perform operations. The operations include receiving, from a further device: a motion video of a first user, captured while the first user performs a muscle force detection action in which the first user, while in a particular pose, moves a joint part to move an action part associated with the joint part; motion data including position data of the action part and position data of the joint part; and a muscle force level associated with the joint part and determined based on the motion data. The operations further include presenting the motion video, the motion data, and the muscle force level to a second user.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the disclosure, nor is it intended to be used to limit the scope of the disclosure.
Drawings
The foregoing and other objects, features and advantages of the disclosure will be apparent from the following more particular descriptions of exemplary embodiments of the disclosure as illustrated in the accompanying drawings wherein like reference numbers generally represent like parts throughout the exemplary embodiments of the disclosure. In the drawings:
FIG. 1 shows a schematic block diagram of an example environment in which embodiments of the present disclosure may be implemented;
fig. 2 illustrates a signaling diagram of an example interaction of an electronic device during a stage of diagnosing a muscle force level, in accordance with some embodiments of the present disclosure;
FIG. 3 shows a signaling diagram of an example interaction of the example electronic device of FIG. 2 during a user recovery phase;
FIG. 4 illustrates a block diagram of an example electronic device for detecting a muscle force level of a user, in accordance with some embodiments of the present disclosure;
FIG. 5 shows a block diagram of an example electronic device for detecting a muscle force level of a user, in accordance with other embodiments of the present disclosure; and
fig. 6 illustrates a simplified block diagram of an example device suitable for implementing some embodiments of the present disclosure.
Detailed Description
The principles of the present disclosure will be described below with reference to a number of example embodiments shown in the drawings. While the preferred embodiments of the present disclosure have been illustrated in the accompanying drawings, it is to be understood that these embodiments are described merely for the purpose of enabling those skilled in the art to better understand and to practice the present disclosure, and are not intended to limit the scope of the present disclosure in any way.
In describing embodiments of the present disclosure, the terms "include" and "comprise," and similar language, are to be construed as open-ended, i.e., "including but not limited to". The term "based on" should be understood as "based at least in part on". The term "one embodiment" or "the embodiment" should be understood as "at least one embodiment". The terms "first," "second," and the like may refer to different or the same object. Other explicit and implicit definitions are also possible below.
As discussed above, there is also a need to promote "integrated" shared services in orthopedic medicine. For this purpose, the human body may be detected by a remote device and the received body data may then be evaluated by a professional, such as a doctor. One such existing product is the Anokan-VR 3D virtual scene interaction and evaluation training system. The Anokan-VR system uses a time-of-flight (time-difference ranging) 3D motion capture instrument to create a 3D image of the patient: the motion capture device emits infrared light and receives the "echoes" reflected from the patient, and the details of any body movement can be recorded by the software. During treatment, the system continuously stimulates the patient's responses and thereby encourages movement. However, such solutions are expensive and the equipment is too complex to operate. Although the measured muscle strength can reach a high level of accuracy, the approach is impractical for actual medical treatment and rehabilitation. The existing equipment therefore cannot meet the requirements of rehabilitation evaluation, orthopedic follow-up, and remote medical evaluation in a real medical environment.
In view of the above, the present disclosure is directed to a solution for detecting a patient's muscle strength level using an ordinary electronic device, such as a smartphone. With this scheme, professional advice and guidance can be provided even when no specialist is present. In particular, in certain medical scenarios, the motion state of a patient, or of a person with limited mobility, can be recorded automatically and graded in the medical sense. The evaluation of muscle strength and degree of activity is based on human keypoint acquisition technology: keypoints are collected through the camera, and accurate grading of muscle strength and activity is achieved by adjusting the elastic band (i.e., the resistance) and the examination actions, including body position and limb angle. The system is easy to operate, lowers the practical learning cost for the patient, and saves the travel, economic, time, and family-accompaniment costs of returning to the hospital for follow-up visits. This greatly reduces the patient's overall burden, saves medical resources, lowers the difficulty of follow-up, and improves medical efficiency. Ultimately, the quality of home rehabilitation and the accuracy of remote physical examination can both be improved.
Fig. 1 illustrates a schematic diagram of an example environment 100 in which various embodiments of the present disclosure can be implemented. As shown in FIG. 1, environment 100 includes a first user 110, a first computing device 130, a second computing device 140, and a second user 150. The first computing device 130 and the second computing device 140 may be, for example, devices with computing capabilities, examples of which include, but are not limited to: cloud-side servers, smart phones, laptops, tablets, desktops, edge computing devices, or the like. The first computing device 130 and the second computing device 140 may communicate to exchange data. In the example of fig. 1, the first computing device 130 comprises, for example, a smartphone, and the second computing device 140 comprises, for example, a computer. The first user 110 is for example a patient and the second user 150 is for example a doctor.
In the example of fig. 1, the first computing device 130 may receive body or motion data of the first user 110 detected by a particular means. In some embodiments, the motion data may be obtained by capturing a motion video of the first user 110 with a camera device and then processing the motion video. In some embodiments, the data of the first user 110 may also be detected by having the first user 110 wear a wearable device. The wearable device may be, for example, a sensor worn at the site to be detected. Upon detection with the wearable device, the first computing device 130 may also simultaneously capture motion video of the first user 110 with the camera device.
As shown in fig. 1, the first computing device 130 also includes a screen 131, on which a gray mask is displayed. The middle of the mask contains a human-shaped opening, and a human image 120 of the first user 110 is displayed in the position corresponding to this opening. The position data of the human body parts detected by the first computing device 130 is displayed on the screen. In the example shown in fig. 1, human keypoints of 8 body parts are shown by way of example. These 8 keypoints correspond to the two shoulders, the two wrists, the two sides of the waist, and the two ankles, respectively. When the first computing device 130 receives or detects the motion data of the first user 110, the motion data may be processed and/or transmitted to the second computing device 140. When the second computing device 140 receives the motion data, it may present the motion data directly to the second user 150, or process the motion data and present the processing result to the second user 150. When the second user 150 finishes reviewing the presented motion data or processing results, his or her confirmation or diagnostic information may be input to the second computing device 140. The confirmation or diagnostic information is then presented to the first user 110 via the second computing device 140 and the first computing device 130 in turn.
An interactive flow for detecting a user's muscle strength level and recovering a training phase according to some embodiments of the present disclosure will be described below with reference to fig. 2 and 3. Fig. 2 illustrates a signaling diagram of an example interaction 200 of an electronic device during a stage of diagnosing a muscle force level, in accordance with some embodiments of the present disclosure. The example interaction 200 may be performed, for example, by a first computing device 201 and a second computing device 202. The first computing device 201 may correspond, for example, to the first computing device 130 in fig. 1 discussed above, and the second computing device 202 may correspond, for example, to the second computing device 140.
In the example shown in fig. 2, at 205, the first computing device 130 acquires motion data of the first user 110 (e.g., a patient) while the first user performs a muscle strength detection action. In the muscle strength detection action, the first user 110, while in a specific pose, moves a joint part so as to move the action part associated with that joint part; the motion data includes position data of the action part and position data of the joint part. In some embodiments, if the first user 110 indicates that the muscle strength of the shoulders is to be detected, the first user 110 may first stand to assume a standing pose and then move the shoulders to move the associated arms and wrists. For example, the action performed by the first user 110 may be an action presented in a canonical action video. In some embodiments, the first computing device 130 may select, from a canonical action video library and according to the information of the first user 110, a canonical action video corresponding to the body part to be detected, for the first user 110 to follow.
In some embodiments, the first computing device 130 may acquire a motion video of the first user during the muscle force detection action and process the motion video using a human keypoint capture model (e.g., MediaPipe, developed by Google) to determine the position data of the joint part and the position data of the action part. In this way, the motion data of the human body can be accurately determined from the video. In some examples, the second computing device 140 may also issue a reminder for a follow-up or muscle strength check, or schedule the next follow-up or muscle strength check. Upon receiving the appointment or reminder from the second computing device 140, the first computing device 130 presents it to the first user 110.
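As an illustration of this keypoint-capture step, the following Python sketch uses MediaPipe Pose to extract per-frame positions of one joint part (the left shoulder) and one action part (the left wrist) from a recorded motion video. The function name, the choice of landmarks, and the video-handling details are assumptions made for illustration and are not prescribed by this disclosure.

```python
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

def extract_keypoints(video_path):
    """Return per-frame normalized (x, y) positions of the left shoulder
    (joint part) and the left wrist (action part)."""
    frames = []
    cap = cv2.VideoCapture(video_path)
    with mp_pose.Pose(static_image_mode=False) as pose:
        while True:
            ok, image = cap.read()
            if not ok:
                break
            # MediaPipe expects RGB input; OpenCV delivers BGR frames.
            results = pose.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))
            if results.pose_landmarks is None:
                continue  # no person detected in this frame
            lm = results.pose_landmarks.landmark
            shoulder = lm[mp_pose.PoseLandmark.LEFT_SHOULDER]
            wrist = lm[mp_pose.PoseLandmark.LEFT_WRIST]
            frames.append({"joint": (shoulder.x, shoulder.y),
                           "action": (wrist.x, wrist.y)})
    cap.release()
    return frames
```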
In some examples, the first computing device 130 may present reminders to the first user 110 and receive feedback from the first user 110. When positive feedback is received, the first computing device 130 may automatically pop up the canonical video library, display the mask, open the camera, and begin recording the video.
At 210, the first computing device 130 determines a muscle force level of a muscle associated with the joint site based on the motion data.
In this way, the muscle strength level of the user can be detected without requiring special equipment and professionals, thereby saving cost and improving convenience.
In general, when the patient performs a predetermined normative movement, the muscle strength level can be determined from the magnitude of the movement, i.e., the angle of the action part relative to the body. In some embodiments, the muscle force level may be determined by the first computing device 130 as follows. First, the first computing device 130 determines a reference frame associated with the particular pose. For example, in the standing pose, the reference frame is the torso extending in the vertical direction. The first computing device 130 then determines the relative angle of the action part with respect to the reference frame. Finally, based on the relative angle, the muscle force level of the muscle associated with the joint part is determined. For example, the muscle force level may be determined from a medically predetermined correspondence between the relative angle and the muscle force level. That is, according to the obtained relative angle and a preset relative angle range, a corresponding grading judgment is given. For example, when the relative angle is less than or equal to 20°, the muscle strength of the first user is graded as level 2 or below, and a voice prompt may be given: "your muscle strength diagnosis result is grade 2 or below". When the relative angle is greater than 20°, the muscle strength of the first user is graded as above level 2, and a voice prompt "your movement grading result is above grade 2" may be given.
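A minimal sketch of this angle-based grading, assuming the standing pose (so the reference frame is the vertical torso direction) and the normalized image coordinates produced by the keypoint step above; the vector construction and function names are illustrative, while the 20° cut-off follows the example in the text.

```python
import math

def relative_angle(joint_xy, action_xy):
    """Angle, in degrees, between the joint-to-action-part vector and the
    vertical reference frame (y grows downward in image coordinates)."""
    dx = action_xy[0] - joint_xy[0]
    dy = action_xy[1] - joint_xy[1]
    norm = math.hypot(dx, dy)
    if norm == 0.0:
        return 0.0
    # Dot product with the downward vertical unit vector (0, 1) is just dy.
    cos_angle = max(-1.0, min(1.0, dy / norm))
    return math.degrees(math.acos(cos_angle))

def grade_muscle_strength(max_angle_deg):
    """Map the maximum relative angle reached to the coarse grading above."""
    if max_angle_deg <= 20.0:
        return "grade 2 or below"
    return "above grade 2"
```

For shoulder movement in the standing pose, an arm hanging at the side gives an angle near 0°, and raising the arm away from the torso increases the angle toward 90°.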
In some embodiments, the acquisition of motion data is stopped when the relative angle remains constant for a predetermined period of time. For example, when the relative angle stays constant for a predetermined period (e.g., 10 seconds), it can be concluded that the user has reached the limit of their range of motion; the data required for grading has therefore been acquired, and both data acquisition and video recording can end.
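This stopping rule could be expressed, for example, as a sliding-window check over the per-frame relative angles; the angular tolerance below is an assumed value, since the text only specifies the hold time.

```python
def should_stop(angle_history, fps, hold_seconds=10.0, tolerance_deg=2.0):
    """Return True once the relative angle has stayed within tolerance_deg
    for the last hold_seconds of frames, i.e. the user has reached the
    limit of their range of motion and acquisition can end."""
    window = int(hold_seconds * fps)
    if window <= 0 or len(angle_history) < window:
        return False
    recent = angle_history[-window:]
    return max(recent) - min(recent) <= tolerance_deg
```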
In some embodiments, the pose of the human body may include: standing, sitting, and side-lying; the action part may include: shoulder, elbow, knee, hip, and neck; and the standard actions may include: abduction, adduction, flexion, extension, internal rotation, external rotation, and lateral flexion.
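These configuration values could be represented, for instance, as simple enumerations; the member names below (including the abduction/adduction readings of the translated action terms) are assumptions for illustration only.

```python
from enum import Enum

class Pose(Enum):
    STANDING = "standing"
    SITTING = "sitting"
    SIDE_LYING = "side-lying"

class ActionPart(Enum):
    SHOULDER = "shoulder"
    ELBOW = "elbow"
    KNEE = "knee"
    HIP = "hip"
    NECK = "neck"

class StandardAction(Enum):
    ABDUCTION = "abduction"
    ADDUCTION = "adduction"
    FLEXION = "flexion"
    EXTENSION = "extension"
    INTERNAL_ROTATION = "internal rotation"
    EXTERNAL_ROTATION = "external rotation"
    LATERAL_FLEXION = "lateral flexion"
```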
At 215, the first computing device 201 sends the motion video, the motion data, and the determined muscle strength level to the second computing device 202 for presentation to the second user 150.
At 220, the second computing device 202 receives the motion video, the motion data, and the muscle strength level of the first user 110 from the first computing device 201 that were acquired by the first computing device 201 in the previous steps.
At 225, the second computing device 202 presents the motion video, the motion data, and the muscle strength level to the second user 150 so that the second user 150 (e.g., a physician) can make a diagnosis based on the information.
At 230, the second computing device 202 receives a first diagnostic input for the muscle force level by the second user.
At 235, the second computing device 202 generates a muscle strength training program for the first user based on the first diagnostic input, the muscle strength training program including a plurality of sets of training content, and the training content including a training date, a training duration, and a predetermined training action. In some embodiments, the training content further includes an exemplary training action for the first user 110 to follow. In some embodiments, the second user may review the first user's motion video, check the accuracy of the diagnosis and the suitability of the muscle strength training program, and correct the results and the program at any time before confirming them and entering them into the second computing device.
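To make the structure of such a program concrete, the sketch below shows one possible representation of the training program exchanged between the two devices; all field names and types are illustrative assumptions rather than a format defined by this disclosure.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TrainingContent:
    training_date: str           # e.g. "2022-01-15"
    training_duration_min: int   # length of the session in minutes
    predetermined_action: str    # identifier of the prescribed training action
    demo_video_url: str = ""     # optional exemplary training action to follow

@dataclass
class MuscleStrengthTrainingProgram:
    patient_id: str
    muscle_strength_level: str   # e.g. "grade 2 or below"
    contents: List[TrainingContent] = field(default_factory=list)
```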
At 240, the second computing device 202 transmits the muscle strength training program to the first computing device 201.
At 245, the first computing device 201 receives a muscle strength training program associated with the muscle strength level from the second computing device 202.
At this point, the diagnosis of the first user 110 is complete. Remote muscle strength assessment is accomplished by recognizing multiple human keypoints, which falls within the fields of telemedicine and remote functional evaluation; during the muscle strength examination, remote follow-up by the doctor is realized. Both the assessment and follow-up videos are analyzed into data and stored. In addition, these data can be associated with the patient's case records, ultimately yielding automated assessment results and related training recommendations. The doctor can participate in, supervise, and correct the results throughout the process, and the patient can receive the evaluation report and training suggestions on a mobile phone at home. The patient can thus complete the return visit conveniently at home, which saves the patient's return-visit costs, saves the doctor's follow-up time, and frees up more high-quality medical resources to meet the medical needs of more people.
The process of the first user 110 at the stage of training according to the muscle strength training program will be described below with reference to fig. 3. Fig. 3 shows a signaling diagram of an example interaction 300 of the example electronic device of fig. 2 during a user recovery phase.
At 305, the first computing device 201 obtains a training video of the first user 110 while engaged in training content. In some embodiments, the muscle strength training program received by the first computing device 201 may include a predetermined training time for performing the muscle strength training. The first computing device 201 may automatically initiate video recording when a predetermined training time is reached.
At 310, the first computing device 201 determines the difference between a training action in the training video of the first user and the predetermined training action. At 315, in response to determining that the difference is greater than a predetermined threshold, the first computing device 201 determines that the training action is not qualified. At 320, the first computing device 201 determines the number of times the training action was not qualified during the training duration. At 325, in response to determining that this number is greater than a predetermined number, the first computing device 201 determines training motion data based on the training video. In other words, the first computing device 201 determines that the first user 110 is unable to complete the training program and that the user needs to be re-evaluated and the training program re-determined.
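The qualification check in steps 310 through 325 could be sketched as follows: each repetition recorded during training is compared against the predetermined training action, and too many unqualified repetitions trigger a re-evaluation. The distance metric and both thresholds are assumptions chosen for illustration.

```python
def action_difference(observed_angles, reference_angles):
    """Mean absolute per-frame angle difference between the observed training
    action and the predetermined action (equal-length angle sequences)."""
    pairs = list(zip(observed_angles, reference_angles))
    return sum(abs(a - b) for a, b in pairs) / len(pairs)

def evaluate_session(trials, reference_angles, diff_threshold=15.0, max_failures=3):
    """trials: one angle sequence per repetition recorded during training."""
    failures = sum(
        1 for observed in trials
        if action_difference(observed, reference_angles) > diff_threshold
    )
    # Too many unqualified repetitions: fall back to a full re-evaluation.
    return {"failures": failures, "needs_reevaluation": failures > max_failures}
```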
Thereafter, at 330, the first computing device 201 determines a training muscle strength level of the user based on the training motion data. At 335, the first computing device 201 sends the training video, the training motion data, and the current training muscle force level to the second computing device 202 for presentation to the second user 150.
At 340, the second computing device 202 receives the training video of the first user 110, the training motion data associated with the training video, and the training muscle strength level from the first computing device 201. At 345, the second computing device 202 presents the training video, the training motion data, and the training muscle force level to the second user. At 350, the second computing device 202 receives a second diagnostic input of the second user for the training muscle strength level. At 355, the second computing device 202 generates an updated muscle strength training program based on the second diagnostic input. The updated muscle strength training program comprises at least one of: an updated training date, an updated training duration, and an updated predetermined training action. At 360, the second computing device 202 sends the updated muscle strength training program to the first computing device 201.
At 365, the first computing device 201 receives the updated muscle strength training program from the second computing device 202. In this way, the muscle strength training program is updated for the first user 110, errors are corrected in time, and user injuries are avoided.
The scheme is suitable for non-traditional diagnosis and treatment modes in the medical field, such as postoperative follow-up, remote consultation, and remote rehabilitation. It combines orthopedic postoperative muscle strength examination with portrait recognition and human keypoint capture technologies, mainly addresses the diagnostic needs of patients in remote areas, enables remote diagnosis by doctors, and reduces the risks associated with repeated in-person visits and follow-ups during epidemics. To meet these many evaluation needs, accurate human keypoints are captured at any time through standard muscle strength examination actions, and reasonable algorithms are adopted so that the muscle strength data is quantified accurately and objectively.
Fig. 4 illustrates a block diagram of an example electronic device 400 for detecting a muscle force level of a user, in accordance with some embodiments of the present disclosure. The electronic device 400 corresponds to, for example, the first computing device 130.
In the example shown in fig. 4, the device 400 includes a processor 410 and a memory 420 coupled to the processor 410. Instructions 422 and 424 are stored in the memory 420. The instructions 422, when executed by the processor 410, cause the electronic device 400 to: acquire motion data of a first user while the first user performs a muscle strength detection action, where the muscle strength detection action includes the first user, while in a specific pose, moving a joint part to drive an action part associated with the joint part to move, and the motion data includes position data of the action part and position data of the joint part. The instructions 424, when executed by the processor 410, cause the electronic device 400 to: determine, based on the motion data, a muscle force level of a muscle associated with the joint part.
Fig. 5 illustrates a block diagram of an example electronic device 500 for detecting a muscle force level of a user, in accordance with other embodiments of the present disclosure. The electronic device 500 corresponds to the second computing device 140, for example.
In the example shown in fig. 5, the device 500 includes a processor 510 and a memory 520 coupled to the processor 510. Instructions 522 and 524 are stored in the memory 520. The instructions 522, when executed by the processor 510, cause the electronic device 500 to receive, from a further device: a motion video of the first user, captured while the first user performs a muscle force detection action in which the first user, while in a particular pose, moves a joint part to move an action part associated with the joint part; motion data that includes position data of the action part and position data of the joint part; and a muscle force level associated with the joint part and determined based on the motion data. The instructions 524, when executed by the processor 510, cause the electronic device 500 to: present the motion video, the motion data, and the muscle force level to a second user.
Fig. 6 illustrates a schematic block diagram of an example device 600 that can be used to implement embodiments of the present disclosure. For example, a computing device according to embodiments of the disclosure (e.g., the computing device 130 in FIG. 1) may be implemented by the device 600. As shown, the device 600 includes a Central Processing Unit (CPU) 601 that may perform various appropriate actions and processes according to computer program instructions stored in a Read Only Memory (ROM) 602 or loaded from a storage unit 608 into a Random Access Memory (RAM) 603. The RAM 603 may also store the various programs and data required for operation of the device 600. The CPU 601, the ROM 602, and the RAM 603 are connected to one another via a bus, to which an input/output (I/O) interface 605 is also connected.
A number of components in the device 600 are connected to the I/O interface 605, including: an input unit 606 such as a keyboard, a mouse, or the like; an output unit 607 such as various types of displays, speakers, and the like; a storage unit 608, such as a magnetic disk, optical disk, or the like; and a communication unit 609 such as a network card, modem, wireless communication transceiver, etc. The communication unit 609 allows the device 600 to exchange information/data with other devices via a computer network such as the internet and/or various telecommunication networks.
The various processes described above, such as the interaction 200, may be performed by the processing unit 601. For example, in some embodiments, the interaction 200 may be implemented as a computer software program tangibly embodied in a machine-readable medium, such as the storage unit 608. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 600 via the ROM 602 and/or the communication unit 609. When the computer program is loaded into the RAM 603 and executed by the CPU 601, one or more acts of the interactions 200 and/or 300 described above may be performed.
The present disclosure may be methods, apparatus, systems, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for performing various aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store the instructions for use by the instruction execution device. The computer readable storage medium may be, for example, but not limited to, an electronic memory device, a magnetic memory device, an optical memory device, an electromagnetic memory device, a semiconductor memory device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable compact disc read-only memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanical coding device, such as a punch card or an in-groove protruding structure with instructions stored thereon, and any suitable combination of the foregoing. Computer-readable storage media as used herein is not to be construed as transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission medium (e.g., optical pulses through a fiber optic cable), or electrical signals transmitted through electrical wires.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device over a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
Computer program instructions for carrying out operations of the present disclosure may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine-related instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including object-oriented programming languages such as Smalltalk and C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), can be personalized with state information of the computer-readable program instructions so as to implement aspects of the present disclosure.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processing unit of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processing unit of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the specific embodiments. The terminology used herein was chosen in order to best explain the principles of the embodiments, the practical application, or improvements made to the technology in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (9)

1. An electronic device, comprising:
a processor; and
a memory coupled with the processor, the memory having instructions stored therein that, when executed by the processor, cause the device to perform operations comprising:
acquiring motion data of a first user while the first user autonomously performs a muscle strength detection action, wherein the muscle strength detection action comprises the first user, while in a specific pose, moving a joint part to drive an action part associated with the joint part to move, and the motion data comprises position data of the action part and position data of the joint part, wherein acquiring the motion data of the first user while performing the muscle strength detection action comprises:
acquiring a motion video of the first user during the autonomous performance of the muscle strength detection action;
processing the motion video using a human keypoint capture model to determine the position data of the joint part and the position data of the action part;
determining an included angle between the action part and a reference frame associated with the specific pose, based on the position data of the joint part and the position data of the action part; and
determining a muscle force level of a muscle associated with the joint part based on the included angle and a predetermined correspondence relationship.
2. The electronic device of claim 1, wherein determining a muscle force level of a muscle associated with the joint part comprises:
determining a frame of reference associated with the specific pose;
determining a relative angle of the action part with respect to the reference frame; and
determining, based on the relative angle, a muscle force level of the muscle associated with the joint part.
3. The electronic device of claim 1, the operations further comprising:
transmitting the motion video, the motion data, and the determined muscle strength level to a further device for presentation to a second user; and
receiving, from the further device, a muscle strength training program associated with the muscle strength level, the muscle strength training program comprising a plurality of sets of training content, the training content comprising a training date, a training duration, and a predetermined training action.
4. The electronic device of claim 3, the operations further comprising:
acquiring a training video of the first user during the training content;
determining a difference between a training action in the training video of the first user and the predetermined training action; and
in response to determining that the difference is greater than a predetermined threshold, determining that the training action is not qualified.
5. The electronic device of claim 4, the operations further comprising:
determining the number of times that the training action is not qualified during the training duration;
in response to determining that the number of times is greater than a predetermined number of times, determining training motion data based on the training video; and
determining a training muscle force level of the user based on the training motion data.
6. The electronic device of claim 5, the operations further comprising:
transmitting the training video, the training motion data, and the training muscle strength level to the further device for presentation to the second user; and
receiving an updated muscle strength training program from the further device, the updated muscle strength training program comprising at least one of: an updated training date, an updated training duration, and an updated predetermined training action.
7. An electronic device, comprising:
a processor; and
a memory coupled with the processor, the memory having instructions stored therein that, when executed by the processor, cause the device to perform operations comprising:
receiving, from a further device, a motion video of a first user, motion data, and a muscle force level, wherein the motion video is captured while the first user autonomously performs a muscle force detection action that includes the first user, while in a particular pose, moving a joint part to move an action part associated with the joint part, the motion data includes position data of the action part and position data of the joint part, and the muscle force level is associated with the joint part and is determined based on the motion data, wherein the motion data is determined by processing the motion video using a human keypoint capture model, and the muscle force level is determined based on an included angle between the action part and a reference frame associated with the particular pose and a predetermined correspondence relationship; and
presenting the motion video, the motion data, and the muscle strength level to a second user.
8. The electronic device of claim 7, the operations further comprising:
receiving a first diagnostic input by the second user for the muscle force level;
generating a muscle strength training program for the first user based on the first diagnostic input, the muscle strength training program comprising a plurality of sets of training content, the training content comprising a training date, a training duration, and a predetermined training action; and
transmitting the muscle strength training program to the further device.
9. The electronic device of claim 7, the operations further comprising:
receiving, from the further device, a training video of the first user, training motion data associated with the training video, and a training muscle force level;
presenting the training video, the training motion data, and the training muscle force level to the second user;
receiving a second diagnostic input of the second user for the training muscle force level; generating, based on the second diagnostic input, an updated muscle strength training program comprising at least one of: an updated training date, an updated training duration, and an updated predetermined training action; and
transmitting the updated muscle strength training program to the further device.
CN202111649638.4A 2021-12-30 2021-12-30 Electronic device Active CN114334082B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111649638.4A CN114334082B (en) 2021-12-30 2021-12-30 Electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111649638.4A CN114334082B (en) 2021-12-30 2021-12-30 Electronic device

Publications (2)

Publication Number Publication Date
CN114334082A (en) 2022-04-12
CN114334082B (en) 2023-04-14

Family

ID=81018007

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111649638.4A Active CN114334082B (en) 2021-12-30 2021-12-30 Electronic device

Country Status (1)

Country Link
CN (1) CN114334082B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109599165A (en) * 2019-01-30 2019-04-09 浙江强脑科技有限公司 Rehabilitation exercise training method, system and readable storage medium storing program for executing

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109331453A (en) * 2018-08-07 2019-02-15 燕山大学 The virtual rehabilitation system and training method interacted based on EMG feedback with Kinect
CN109199417A (en) * 2018-09-06 2019-01-15 中山大学 A kind of augmented reality method and system for athletic rehabilitation treatment
CN109550222A (en) * 2019-01-09 2019-04-02 浙江强脑科技有限公司 Electric body building training method, system and readable storage medium storing program for executing
CN110364235A (en) * 2019-06-13 2019-10-22 缤刻普达(北京)科技有限责任公司 User movement plan customizes method, apparatus, computer equipment and storage medium
US11321375B2 (en) * 2020-06-22 2022-05-03 Paypal, Inc. Text object management system

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109599165A (en) * 2019-01-30 2019-04-09 浙江强脑科技有限公司 Rehabilitation exercise training method, system and readable storage medium storing program for executing

Also Published As

Publication number Publication date
CN114334082A (en) 2022-04-12

Similar Documents

Publication Publication Date Title
US11069144B2 (en) Systems and methods for augmented reality body movement guidance and measurement
WO2020249855A1 (en) An image processing arrangement for physiotherapy
US9600934B2 (en) Augmented-reality range-of-motion therapy system and method of operation thereof
US20150320343A1 (en) Motion information processing apparatus and method
US20210304001A1 (en) Multi-head neural network model to simultaneously predict multiple physiological signals from facial RGB video
Pavón-Pulido et al. IoT architecture for smart control of an exoskeleton robot in rehabilitation by using a natural user interface based on gestures
US20200375467A1 (en) Telemedicine application of video analysis and motion augmentation
CN113990440A (en) Human skeleton rehabilitation training method and device, electronic equipment and storage medium
US20240194358A1 (en) Systems and methods for automated pricing, conduction, and transcription of telemedicine encounters
Cotton Kinematic tracking of rehabilitation patients with markerless pose estimation fused with wearable inertial sensors
Yi et al. [Retracted] Home Interactive Elderly Care Two‐Way Video Healthcare System Design
CN111045575A (en) Diagnosis and treatment interaction method and diagnosis and treatment terminal equipment
US11636777B2 (en) System and method for improving exercise performance using a mobile device
KR20140082449A (en) Health and rehabilitation apparatus based on natural interaction
Wang et al. A webcam-based machine learning approach for three-dimensional range of motion evaluation
Lopez et al. Statistical Validation for Clinical Measures: Repeatability and Agreement of Kinect™‐Based Software
CN114334082B (en) Electronic device
AU2019216976A1 (en) Virtual and augmented reality telecommunication platforms
Shaji et al. A real-time IoMT enabled remote cardiac rehabilitation framework
CN113033526A (en) Computer-implemented method, electronic device and computer program product
Narváez et al. Kushkalla: a web-based platform to improve functional movement rehabilitation
CN113869090A (en) Fall risk assessment method and device
Fook et al. Innovative platform for tele-physiotherapy
Cesarini et al. Simplifying tele-rehabilitation devices for their practical use in non-clinical environments
WO2021014149A1 (en) Methods and systems for musculoskeletal rehabilitation

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20240829

Address after: Unit 5B5, 5th Floor, Building 7, Guanghua Road, Chaoyang District, Beijing 100020

Patentee after: BEIJING ALLIN TECHNOLOGY CO.,LTD.

Country or region after: China

Address before: 100020 room 702, 7 / F, building 9, Guanghua Road, Chaoyang District, Beijing

Patentee before: Beijing ouying Information Technology Co.,Ltd.

Country or region before: China

TR01 Transfer of patent right