
CN118213092B - Remote medical supervision system for chronic wound diseases - Google Patents

Remote medical supervision system for chronic wound diseases

Info

Publication number
CN118213092B
CN118213092B (application number CN202410306469.1A)
Authority
CN
China
Prior art keywords
wound
patient
module
chronic wound
filling frame
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410306469.1A
Other languages
Chinese (zh)
Other versions
CN118213092A (en)
Inventor
黄海玲
欧伟强
曹晗宸
刘宏伟
张雅丽
刘晖
曹凤
卜敏
李升红
廖选
周燕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
First Affiliated Hospital of Jinan University
Original Assignee
First Affiliated Hospital of Jinan University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by First Affiliated Hospital of Jinan University filed Critical First Affiliated Hospital of Jinan University
Priority to CN202410306469.1A priority Critical patent/CN118213092B/en
Publication of CN118213092A publication Critical patent/CN118213092A/en
Application granted granted Critical
Publication of CN118213092B publication Critical patent/CN118213092B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H80/00 ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/0464 Convolutional networks [CNN, ConvNet]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/08 Learning methods
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H10/00 ICT specially adapted for the handling or processing of patient-related medical or healthcare data
    • G16H10/60 ICT specially adapted for the handling or processing of patient-related medical or healthcare data for patient-specific data, e.g. for electronic patient records
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00 ICT specially adapted for the handling or processing of medical images
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A90/00 Technologies having an indirect contribution to adaptation to climate change
    • Y02A90/10 Information and communication technologies [ICT] supporting adaptation to climate change, e.g. for weather forecasting or climate simulation

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Public Health (AREA)
  • Primary Health Care (AREA)
  • Biomedical Technology (AREA)
  • Epidemiology (AREA)
  • Computing Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Molecular Biology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computational Linguistics (AREA)
  • Biophysics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Pathology (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The invention discloses a remote medical supervision system for chronic wound diseases, comprising a patient user terminal, a medical user terminal and a cloud server, the two terminals each being in communication connection with the cloud server. The patient user terminal comprises a wound image acquisition module, which acquires and preprocesses preliminary chronic wound image information of a patient, and a basic information acquisition module, which acquires the patient's basic information. The cloud server comprises a sample acquisition module, a wound analysis model construction module, a model training module, a wound analysis module, a storage module and an electronic case module. The medical user terminal comprises a medical diagnosis module, a patient supervision module and a doctor-patient interaction module. The invention can effectively and remotely supervise the wounds of patients with chronic wound diseases, reduce the number of trips patients make to the hospital, and improve patients' quality of life.

Description

Remote medical supervision system for chronic wound diseases
Technical Field
The invention belongs to the technical field of remote medical supervision, and particularly relates to a remote medical supervision system for chronic wound diseases.
Background
At present, medical staff attempt to follow up and monitor patients with chronic wound diseases by telemedicine, so that chronic-wound patients with limited mobility can minimize the number of hospital visits required for daily care, reducing the cost of their treatment course.
In the prior art, remote medical systems for patients with chronic wound diseases lack effective management. Because patients understand their own conditions to different degrees, medical staff find it difficult to judge a patient's chronic wound reliably, which may shorten the wound recurrence period and harm wound healing. A remote medical supervision system that reduces the workload of medical staff, improves the effectiveness of remote wound diagnosis, and raises the level of home care for chronic wounds is therefore urgently needed.
Disclosure of Invention
The invention aims to overcome the defects of the prior art by providing a remote medical supervision system for chronic wound diseases that lets medical staff effectively supervise the wound of a patient with a chronic wound disease remotely, reduces the number of trips the patient makes to the hospital, and thereby lowers the cost of the treatment course and improves the patient's quality of life.
The invention can be achieved by adopting the following technical scheme: the remote medical supervision system for chronic wound diseases comprises a patient user terminal, a medical user terminal and a cloud server, wherein the patient user terminal and the medical user terminal are respectively in communication connection with the cloud server;
The patient user terminal comprises a wound surface image acquisition module and a basic information acquisition module, wherein the wound surface image acquisition module is used for acquiring preliminary chronic wound surface image information of a patient and preprocessing the preliminary chronic wound surface image information of the patient; the basic information acquisition module is used for acquiring basic information of a patient;
The cloud server comprises a sample acquisition module, a wound analysis model construction module, a model training module, a wound analysis module, a storage module, an alarm module and an electronic case module; the sample acquisition module is used for acquiring chronic wound surface image information to be identified and constructing a corresponding chronic wound surface image data set; the wound analysis model construction module is used for constructing a wound analysis network model; the model training module is used for training the wound analysis network model by adopting the chronic wound image data set to obtain a trained wound analysis network model; the storage module is used for storing all interaction data of the cloud server; the wound analysis module is used for analyzing the preprocessed chronic wound image information through the trained wound analysis network model to obtain wound analysis information; the alarm module is used for sending out corresponding alarm signals when the wound analysis information obtained by the wound analysis module is abnormal; the electronic case module is used for storing an electronic medical record of a patient, wherein the electronic medical record comprises the basic information and the historical wound surface analysis information of the patient;
the medical user terminal comprises a medical diagnosis module, a patient supervision module, an alarm receiving module and a doctor-patient interaction module. The medical diagnosis module formulates treatment guidance and personalized intervention measures for the patient according to the wound analysis information and the electronic medical record, and records them in the patient's electronic medical record. The patient supervision module periodically monitors the patient's chronic wound image information, wound analysis information and underlying diseases, and evaluates the patient's health condition from them. The alarm receiving module receives alarm information sent by the alarm module, based on which the doctor recommends the time of the patient's next in-person visit. The doctor-patient interaction module supports communication between doctor and patient about the illness.
Preferably, the wound image acquisition module comprises a wound filling frame unit, a verification model unit and a shooting unit. The wound filling frame unit generates a contour filling frame for the corresponding human body part, and the contour filling frame is displayed in the shooting interface of the shooting unit. In operation, the patient aligns the contour filling frame with the outline of the body part around the wound and then takes the photograph, obtaining the preliminary chronic wound image information.
Preferably, the working steps of the verification model unit are as follows:
S1, after the patient takes the photograph, an initial chronic wound image is generated; a pre-trained verification model segments the initial image based on skin detection to obtain the pixel area SA1 of the human body part in the initial chronic wound image;
S2, the pixel area SA2 of the preset filling frame is acquired through the wound filling frame unit;
S3, the pixel area SA1 and the pixel area SA2 are overlapped to obtain the overlap intersection area SA3;
S4, whether the patient needs to correct posture and retake the photograph is judged, using the accuracies K1 and K2 as the judging condition.
Preferably, the contour filling frame of the human body part comprises a buttock contour filling frame, a left lower limb top view filling frame, a left lower limb left view filling frame, a left lower limb bottom view filling frame, a left lower limb right view filling frame, a left lower limb sole filling frame, a left foot surface filling frame, a right lower limb top view filling frame, a right lower limb left view filling frame, a right lower limb bottom view filling frame, a right lower limb right view filling frame, a right lower limb sole filling frame, a right foot surface filling frame, a prone position head filling frame, a left lateral position head filling frame, a right lateral position head filling frame, a supine position chest and abdomen filling frame, a prone position back filling frame and a special position filling frame.
Preferably, the wound analysis information comprises a chronic wound pixel ratio and chronic wound image information, where the chronic wound pixel ratio is the ratio of the number of chronic-wound pixels to the number of human-body-part pixels output by the wound analysis network model; the pixel ratio and the image information are used to help the physician assess the patient's current chronic wound area and severity.
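To make the definition concrete, here is a minimal NumPy sketch of how such a pixel ratio could be computed from a per-pixel label map such as the segmentation the model outputs. The label codes (0 = background, 1 = limb, 2 = wound) and the function name are illustrative assumptions, not values fixed by the patent:

```python
import numpy as np

# Assumed label codes; the patent does not specify numeric values.
BACKGROUND, LIMB, WOUND = 0, 1, 2

def wound_pixel_ratio(label_map: np.ndarray) -> float:
    """Ratio of chronic-wound pixels to human-body-part pixels.

    The body part is taken as every non-background pixel
    (limb pixels plus wound pixels).
    """
    wound = int(np.count_nonzero(label_map == WOUND))
    body = int(np.count_nonzero(label_map != BACKGROUND))
    return wound / body if body else 0.0

# Toy label map: 3 wound pixels out of 7 body-part pixels.
demo = np.array([[0, 1, 1],
                 [1, 2, 2],
                 [0, 1, 2]])
ratio = wound_pixel_ratio(demo)
```

Tracking this scalar over successive photographs is what lets the physician monitor whether the wound is shrinking relative to the surrounding body part.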
Preferably, the construction process of the wound surface analysis network model is as follows:
(1) Firstly, a VGG-16 deep convolutional neural network model is adopted as an initial model of the wound surface analysis network model, wherein the initial wound surface analysis network model consists of a first convolutional block, a second convolutional block, a third convolutional block, a fourth convolutional block, a fifth convolutional block, a first full-connection layer, a second full-connection layer and a third full-connection layer which are sequentially connected;
(2) Then, the first, second and third full connection layers of the initial wound analysis network model are converted into convolution layers, namely the fourteenth, fifteenth and sixteenth convolution layers. The fourteenth convolution layer is given 512 convolution kernels of scale 1x1; the fifteenth, 1024 kernels of scale 1x1; and the sixteenth, 1024 kernels of scale 1x1, with its number of output channels adjusted to 3;
(3) Improving the network adjusted in the step (2) by adopting a bilinear interpolation method, so that the output image size of the improved network is restored to be the same as the input image size;
(4) The network model adjusted in step (3) is taken as the wound analysis network model.
Preferably, the training process of the wound surface analysis network model is as follows:
(a) Chronic wound image information to be identified is acquired clinically and used to construct a chronic wound image data set; the image data in the data set undergo data enhancement, comprising translation, rotation, mirror symmetry and shearing; a training set and a validation set are then divided from the processed chronic wound image data set;
(b) The background, human body part and chronic wound in the training-set image data are labeled with CVAT, LabelImg or Label Studio image labeling software, completing the semantic annotation and image segmentation of the training set and yielding the processed training set;
(c) The processed image data in the training set are input into the wound analysis network model, and a gradient descent algorithm cyclically adjusts the weight values of each layer's convolution kernels, improving the semantic classification accuracy of each pixel;
(d) After the model's weight values stabilize, the training result is validated by K-fold cross-validation; once the model passes validation, the trained wound analysis network model is obtained.
Preferably, the basic information includes patient name, height, weight, age, patient number, basic disease, duration and number of follow-up visits, current medication, surgical history, risk level, and home address.
Preferably, the basic information acquisition module guides the patient to fill in their basic information in questionnaire form; once the patient finishes, the module sends the basic information to the electronic case module, which records it in the patient's electronic medical record.
Preferably, the medical user terminal further comprises a message pushing module, which pushes to the patient user terminal popular-science articles on chronic wounds, consultation services, the latest treatment techniques, home-care notes, and appointment reminders.
Compared with the prior art, the invention has the following beneficial effects:
(1) The remote medical supervision system for chronic wound diseases guides patients, via the wound image acquisition module, to photograph chronic wound images that better support a doctor's diagnosis, and the wound analysis module analyzes those images to obtain the corresponding chronic wound pixel ratio. Medical staff remotely monitor a patient's electronic medical record, chronic wound images, and changes of the pixel ratio through the medical user terminal, providing scientific disease management for patients with chronic wound diseases. This effectively supervises their conditions, promotes effective care of the chronic wound, reduces the number of hospital visits, lightens the patients' family and economic burdens, and improves their quality of life. For medical institutions, the invention can also raise sickbed turnover and reduce medical resource consumption.
(2) The remote medical supervision system for chronic wound diseases records patients' personal information, underlying diseases, diagnosis information, imaging examinations and laboratory reports through the electronic case module, forming an individual mobile electronic medical record that is continuously updated as follow-up records are uploaded. When a patient returns for review or visits another hospital, clinicians can quickly and accurately learn the patient's condition from this electronic file and formulate a diagnosis and treatment scheme.
(3) The remote medical supervision system for chronic wound diseases analyzes the patient's wound through the trained wound analysis network model: the model counts the pixels belonging to the chronic wound and to the human body part, yielding the pixel ratio of the chronic wound in the patient's wound photograph, and this index is used to supervise changes in the patient's chronic wound condition.
Drawings
Fig. 1 is a schematic structural diagram of a telemedicine supervision system for chronic wound diseases according to an embodiment of the present invention;
Fig. 2 is a schematic structural diagram of a cloud server for chronic wound diseases according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a medical user terminal for chronic wound diseases according to an embodiment of the present invention;
FIG. 4 is a schematic diagram of a patient user terminal for chronic wound disease according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a training process of a wound analysis network model according to an embodiment of the present invention;
fig. 6 is a schematic diagram of a model structure of a wound analysis network model according to an embodiment of the present invention.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present invention, and it is apparent that the described embodiments are some embodiments of the present invention, but not all embodiments, and all other embodiments obtained by those skilled in the art without making any inventive effort based on the embodiments of the present invention are within the scope of protection of the present invention.
The technical scheme of the invention is further described below by the specific embodiments with reference to the accompanying drawings.
Examples:
As shown in fig. 1-4, a remote medical supervision system for chronic wound diseases comprises a patient user terminal, a medical user terminal and a cloud server, wherein the patient user terminal and the medical user terminal are respectively in communication connection with the cloud server;
The patient user terminal comprises a wound surface image acquisition module and a basic information acquisition module, wherein the wound surface image acquisition module is used for acquiring preliminary chronic wound surface image information of a patient and preprocessing the preliminary chronic wound surface image information of the patient; the basic information acquisition module is used for acquiring basic information of a patient; specifically, the basic information includes the patient's name, height, weight, age, patient number, basic disease, duration and number of follow-up visits, current medication, surgical history, risk level, and home address.
Specifically, the basic information acquisition module guides the patient to fill in their basic information in questionnaire form; once the patient finishes, the module sends the basic information to the electronic case module, which records it in the patient's electronic medical record.
The wound image acquisition module shown in fig. 4 comprises a wound filling frame unit, a verification model unit and a shooting unit. The wound filling frame unit generates a contour filling frame for the corresponding human body part, and the contour filling frame is displayed in the shooting interface of the shooting unit. In operation, the patient aligns the contour filling frame with the outline of the body part around the wound and then takes the photograph, obtaining the preliminary chronic wound image information.
Specifically, the contour filling frames for human body parts include a buttock contour filling frame; top-view, left-view, bottom-view, right-view, sole and foot-surface filling frames for each of the left and right lower limbs; prone-position, left-lateral-position, right-lateral-position and supine-position head filling frames; a supine-position chest-and-abdomen filling frame; a prone-position back filling frame; and a special-position filling frame. This arrangement lets the patient select the filling frame matching the affected body part, improves the accuracy of the collected chronic wound images, and thereby improves the supervision effect of the remote medical system on patients with chronic wounds.
Specifically, the working steps of the verification model unit are as follows:
S1, after the patient takes the photograph, an initial chronic wound image is generated; a pre-trained verification model segments the initial image based on skin detection to obtain the pixel area SA1 of the human body part in the initial chronic wound image;
S2, the pixel area SA2 of the preset filling frame is acquired through the wound filling frame unit;
S3, the pixel area SA1 and the pixel area SA2 are overlapped to obtain the overlap intersection area SA3;
S4, whether the patient needs to correct posture and retake the photograph is judged, using the accuracies K1 and K2 as the judging condition;
Specifically, the accuracies K1 and K2 are defined as:
K1 = SA3/SA1
K2 = SA3/SA2
The judging condition is that K1 ≥ A1 and K2 ≥ A2, where A1 = A2 and A1 is greater than 0.9; when the condition is met the photograph is accepted, otherwise the patient is prompted to correct posture and retake it.
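A minimal Python sketch of this S1-S4 judgment, assuming the areas SA1, SA2 and SA3 have already been measured in pixels. The function name and the concrete default threshold of 0.95 are illustrative assumptions; the patent only requires A1 = A2 > 0.9:

```python
def needs_retake(sa1: float, sa2: float, sa3: float,
                 threshold: float = 0.95) -> bool:
    """Return True when the patient should correct posture and retake.

    sa1: pixel area of the body part detected in the photo (SA1)
    sa2: pixel area of the preset contour filling frame (SA2)
    sa3: their overlap intersection area (SA3)
    The photo is accepted only when K1 = SA3/SA1 and K2 = SA3/SA2
    both reach the common threshold A1 = A2 (> 0.9 in the patent).
    """
    k1 = sa3 / sa1
    k2 = sa3 / sa2
    return not (k1 >= threshold and k2 >= threshold)
```

For example, a frame and body part overlapping on 98% of their pixels would be accepted, while an 80% overlap would trigger a retake prompt.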
As shown in fig. 2, the cloud server includes a sample acquisition module, a wound analysis model construction module, a model training module, a wound analysis module, a storage module, an alarm module and an electronic case module; the sample acquisition module is used for acquiring chronic wound surface image information to be identified and constructing a corresponding chronic wound surface image data set; the wound analysis model construction module is used for constructing a wound analysis network model; the model training module is used for training the wound analysis network model by adopting the chronic wound image data set to obtain a trained wound analysis network model; the storage module is used for storing all interaction data of the cloud server; the wound analysis module is used for analyzing the preprocessed chronic wound image information through the trained wound analysis network model to obtain wound analysis information; the alarm module is used for sending out corresponding alarm signals when the wound analysis information obtained by the wound analysis module is abnormal; the electronic case module is used for storing an electronic medical record of a patient, wherein the electronic medical record comprises the basic information and the historical wound surface analysis information of the patient;
Specifically, the construction process of the wound analysis network model is as follows:
(1) Firstly, a VGG-16 deep convolutional neural network model is adopted as an initial model of the wound surface analysis network model, wherein the initial wound surface analysis network model consists of a first convolutional block, a second convolutional block, a third convolutional block, a fourth convolutional block, a fifth convolutional block, a first full-connection layer, a second full-connection layer and a third full-connection layer which are sequentially connected;
Specifically, the pooling layers of the first through fifth convolution blocks all use max pooling; the third, fourth and fifth convolution blocks each contain 3 convolution layers, and the convolution kernels of the fourth and fifth convolution blocks each have 512 channels.
(2) Then, the first, second and third full connection layers of the initial wound analysis network model are converted into convolution layers, namely the fourteenth, fifteenth and sixteenth convolution layers. The fourteenth convolution layer is given 512 convolution kernels of scale 1x1; the fifteenth, 1024 kernels of scale 1x1; and the sixteenth, 1024 kernels of scale 1x1, with its number of output channels adjusted to 3. With this setting the model outputs 3 prediction maps, corresponding to the three labels 'background', 'limb' and 'wound surface'.
(3) Improving the network adjusted in the step (2) by adopting a bilinear interpolation method, so that the output image size of the improved network is restored to be the same as the input image size;
Specifically, the process of step (3) is as follows:
First, a seventeenth and an eighteenth convolution layer are added to the network adjusted in step (2). The seventeenth convolution layer has 512 convolution kernels of scale 1x1 and 3 output channels; the eighteenth has 256 kernels of scale 1x1 and 3 output channels. One path of the feature maps output by the third convolution block is fed to the eighteenth convolution layer for a full-connection convolution operation, producing 3 feature maps R1; one path of the feature maps output by the fourth convolution block is fed to the seventeenth convolution layer, producing 3 feature maps R2; and the 3 feature maps output by the sixteenth convolution layer are recorded as R3. R3 is enlarged 2 times by bilinear interpolation and fused with R2 by addition to generate the prediction map R4; R4 is enlarged 2 times and fused with R1 to generate R5; and R5 is enlarged 8 times by bilinear interpolation to obtain the prediction map R6, whose size equals that of the input image. Finally, a Softmax function computes, for each pixel of R6, the probability of the three labels 'background', 'limb' and 'wound surface', which is taken as the output prediction result.
(4) The network model improved in step (3) is taken as the wound surface analysis network model.
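The 2× enlargement-and-fusion used in step (3) can be sketched as follows (a minimal NumPy sketch of bilinear upsampling and additive skip fusion; the function name and the random score maps are illustrative, with sizes matching the 7×7 and 14×14 maps described above):

```python
import numpy as np

def bilinear_upsample(fm: np.ndarray, scale: int) -> np.ndarray:
    """Upsample a (H, W, C) feature map by an integer factor
    using bilinear interpolation on pixel centres."""
    h, w, c = fm.shape
    out_h, out_w = h * scale, w * scale
    # Map each output pixel centre back to input coordinates.
    ys = (np.arange(out_h) + 0.5) / scale - 0.5
    xs = (np.arange(out_w) + 0.5) / scale - 0.5
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 1)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 1)
    y1 = np.clip(y0 + 1, 0, h - 1)
    x1 = np.clip(x0 + 1, 0, w - 1)
    wy = np.clip(ys - y0, 0.0, 1.0)[:, None, None]
    wx = np.clip(xs - x0, 0.0, 1.0)[None, :, None]
    top = fm[y0][:, x0] * (1 - wx) + fm[y0][:, x1] * wx
    bot = fm[y1][:, x0] * (1 - wx) + fm[y1][:, x1] * wx
    return top * (1 - wy) + bot * wy

# Skip fusion as in step (3): enlarge the coarse score map 2x and
# add it to the finer score map with the same channel count.
r3 = np.random.rand(7, 7, 3)    # coarse 7x7x3 score map (R3)
r2 = np.random.rand(14, 14, 3)  # finer 14x14x3 score map (R2)
r4 = bilinear_upsample(r3, 2) + r2  # fused prediction map R4, 14x14x3
```

The same pattern repeats for the R4 + R1 fusion and the final 8× enlargement to the input size.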
As shown in fig. 5, the training process of the wound surface analysis network model is as follows:
(a) Acquiring chronic wound surface image information to be identified in the clinic, constructing a chronic wound surface image dataset from it, and performing data enhancement on the image data in the dataset, where the enhancement includes translation, rotation, mirror symmetry and shearing of the image data; a training set and a validation set are then divided from the processed chronic wound image dataset;
(b) Labeling the background, the human body part and the chronic wound surface in the image data of the training set using CVAT, LabelImg or Label Studio image annotation software, thereby completing the semantic annotation and image segmentation of the training set and obtaining a processed training set;
Specifically, for the clinical chronic wound surface images in the training set of this embodiment, professional medical staff first mark the approximate position of the wound area with a stylus; a watershed algorithm then divides the image into regions based on these initial marks, and standardized annotations are finally formed.
(c) Inputting the processed image data of the training set into the wound analysis network model and cyclically adjusting the weight values of the convolution kernels of each layer in the model with a gradient descent algorithm, thereby improving the semantic classification accuracy of each pixel;
(d) After the weight values of the model stabilize, the training result is checked by K-fold cross-validation; once the model passes validation, the trained wound analysis network model is obtained.
Specifically, the steps of the K-fold cross-validation model are as follows:
Step 101, randomly dividing the data of the chronic wound image dataset into N parts, denoting the subsets S1, S2, …, SN respectively;
Step 102, taking the subsets other than Si among S1 to SN as the training set and training the wound surface analysis network model;
Step 103, using Si as the validation set and segmenting its images with the model trained in step 102 to obtain a segmentation result Ri, where i is a loop counter;
Step 104, repeating steps 102 and 103 to output i segmentation results; when i = k, proceeding to step 105, where k is an experimentally set value and 1 ≤ k ≤ N;
Step 105, merging the segmentation results Ri obtained in step 104 and comparing them with the ground-truth annotations to complete the evaluation.
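Steps 101 to 103 above can be sketched as follows (a minimal NumPy sketch of the random fold partition and train/validation split; the fold count and random seed are illustrative, not from the patent):

```python
import numpy as np

def k_fold_splits(n_samples: int, n_folds: int, seed: int = 0):
    """Randomly partition sample indices into N folds (step 101),
    then yield (train_idx, val_idx) pairs: fold S_i serves as the
    validation set and the remaining folds form the training set
    (steps 102-103)."""
    rng = np.random.default_rng(seed)
    perm = rng.permutation(n_samples)
    folds = np.array_split(perm, n_folds)
    for i in range(n_folds):
        val_idx = folds[i]
        train_idx = np.concatenate(
            [f for j, f in enumerate(folds) if j != i])
        yield train_idx, val_idx

# Example: 10 images split into 5 folds -> 5 train/validation splits.
splits = list(k_fold_splits(10, 5))
```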
To prevent overfitting when the number of convolution-kernel weights in the whole model is very large, this embodiment improves the model with two methods: data enhancement and Batch Normalization. This keeps the wound analysis model's classification performance good during training even when the training-set data is limited. As in step (a), data enhancement means that the limited clinical chronic wound image data are mathematically transformed to obtain more images, increasing the amount of training data and reducing the overfitting phenomenon in which a model performs poorly on data outside the training and validation sets.
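The data-enhancement transforms named in step (a) can be sketched as follows (a minimal NumPy sketch; the shift and crop sizes are illustrative, and the "shearing" transform is approximated here by a corner crop, which is an interpretation rather than a detail stated in the patent):

```python
import numpy as np

def augment(img: np.ndarray) -> list:
    """Return extra training samples derived from one (H, W, C) image:
    translation, 90-degree rotation, mirror symmetry, and a corner crop
    standing in for the patent's 'shearing' of the image data."""
    return [
        np.roll(img, shift=(8, 8), axis=(0, 1)),  # translation by 8 px
        np.rot90(img, k=1, axes=(0, 1)),          # rotation by 90 degrees
        img[:, ::-1, :],                          # horizontal mirror
        img[16:, 16:, :],                         # corner crop ("shearing")
    ]
```

Each source image thus yields several transformed variants, enlarging the training set without new clinical data collection.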
Specifically, Batch Normalization is also a mature industry tool for accelerating the convergence of model training. In this embodiment it is inserted before the activation function of each convolution layer, so that the convolution outputs of multiple samples are jointly normalized during training, effectively acting as an additional form of data enhancement within the convolution training process. This makes the finally trained wound analysis model more accurate and improves the reliability of the remote medical supervision system for chronic wound diseases.
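The per-channel normalization that Batch Normalization performs before the activation function can be sketched as follows (a minimal NumPy sketch of the training-time computation; gamma and beta stand for the learnable scale and shift, and the running-statistics bookkeeping used at inference is omitted):

```python
import numpy as np

def batch_norm(x: np.ndarray, gamma=1.0, beta=0.0, eps=1e-5) -> np.ndarray:
    """Batch Normalization over a batch of feature maps shaped
    (N, H, W, C): normalize each channel using statistics pooled
    across the whole batch, then apply a scale and shift."""
    mean = x.mean(axis=(0, 1, 2), keepdims=True)
    var = x.var(axis=(0, 1, 2), keepdims=True)
    return gamma * (x - mean) / np.sqrt(var + eps) + beta
```

Because the statistics are pooled across all samples in the batch, each sample's output depends on the others — the coupling the description refers to as a special form of data enhancement.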
As shown in fig. 6, the working process of the trained wound surface analysis network model is as follows. In this embodiment, taking an input image of size 224×224×3 as an example: after the input image enters the trained model, it passes through the first, second and third convolution blocks, and the third convolution block generates a feature map of size 28×28×256. This feature map is fed both to the fourth convolution block and to the eighteenth convolution layer; the eighteenth convolution layer performs a fully connected convolution operation on it to generate a feature map R1 of size 28×28×3, while the fourth convolution block continues convolving it to produce a feature map of size 14×14×512. The output of the fourth convolution block is in turn fed both to the fifth convolution block and to the seventeenth convolution layer; the seventeenth convolution layer performs a fully connected convolution operation on it to generate a feature map R2 of size 14×14×3, while the fifth convolution block continues convolving it to produce a feature map of size 7×7×512. The output of the fifth convolution block then passes through the fourteenth, fifteenth and sixteenth convolution layers to obtain a feature map R3 of size 7×7×3. R3 is enlarged 2 times by bilinear interpolation to 14×14×3 and added to and fused with R2 to generate a prediction map R4; R4 is enlarged 2 times by bilinear interpolation to 28×28×3 and added to and fused with R1 to generate a feature map R5; R5 is enlarged 8 times by bilinear interpolation to obtain a prediction map R6 whose image size equals the input image size. Finally, a Softmax function computes, for each pixel of the prediction map R6, the probability values on the three labels "background", "limb" and "wound surface" as the output prediction result, and each pixel is marked with the highest-probability label to complete the semantic segmentation of the image.
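The final per-pixel Softmax labeling step can be sketched as follows (a minimal NumPy sketch; the function name is illustrative, and the label names follow the three classes above):

```python
import numpy as np

LABELS = ["background", "limb", "wound surface"]

def label_pixels(pred: np.ndarray):
    """Numerically stable per-pixel Softmax over a (H, W, 3)
    prediction map R6, then mark each pixel with the label of
    highest probability to complete the semantic segmentation."""
    e = np.exp(pred - pred.max(axis=-1, keepdims=True))
    probs = e / e.sum(axis=-1, keepdims=True)
    return probs, probs.argmax(axis=-1)  # (H, W, 3) probs, (H, W) ids
```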
Specifically, the wound analysis information comprises a chronic wound pixel ratio and chronic wound image information, where the chronic wound pixel ratio is the ratio of the number of chronic wound pixels to the number of human body part pixels output by the wound analysis network model; the chronic wound pixel ratio and the chronic wound image information are used to assist a physician in analyzing the patient's current chronic wound area and severity.
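The chronic wound pixel ratio can be computed from the segmentation output roughly as follows (a minimal NumPy sketch; here the "human body part" is taken to mean all non-background pixels, i.e. limb plus wound, which is an interpretation rather than a detail stated in the patent):

```python
import numpy as np

# Label ids as produced by the segmentation model.
BACKGROUND, LIMB, WOUND = 0, 1, 2

def wound_pixel_ratio(label_map: np.ndarray) -> float:
    """Chronic wound pixel ratio: wound pixels divided by
    human-body-part pixels (limb + wound) in the label map."""
    wound = np.count_nonzero(label_map == WOUND)
    body = np.count_nonzero(label_map != BACKGROUND)
    return wound / body if body else 0.0
```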
As shown in fig. 3, the medical user terminal includes a medical diagnosis module, a patient supervision module, an alarm receiving module and a doctor-patient interaction module. The medical diagnosis module is used to formulate treatment guidance and personalized intervention measures for the patient according to the wound analysis information and the electronic medical record, and to record them in the patient's electronic medical record; the patient supervision module is used to regularly monitor the patient's chronic wound image information, wound analysis information and basic disease conditions, and to evaluate the patient's health condition from them; the alarm receiving module is used to receive alarm information sent by the alarm module, from which the doctor recommends the patient's next in-person visit time; the doctor-patient interaction module is used for communication between doctor and patient concerning the patient's condition.
Specifically, the medical user terminal further comprises a message pushing module, which is used to send popular-science articles about chronic wounds, consultation services, the latest treatment techniques, home-care notes and treatment reminder information to the patient user terminal.
Specifically, the patient supervision module periodically monitors chronic wound disease patients in the system through a screening index and evaluates each patient's health condition and disease progress with it. The screening index is the time-series change of the chronic wound pixel ratio: when the chronic wound pixel ratio increases rapidly within a short time, the doctor or nurse prompts the patient through the patient supervision module to return for review as soon as possible, gives corresponding medical guidance, and adds a remark to the patient's electronic medical record. This arrangement helps patients discover changes in their condition in time, so that the chronic wound is scientifically nursed and treated, the possibility of recurrence is reduced, and the feasibility of the remote medical supervision system for chronic wound diseases is improved.
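The screening index described above — flagging a fast short-term rise in the chronic wound pixel ratio — can be sketched as follows (a minimal Python sketch; the threshold value and function name are illustrative, not from the patent):

```python
def needs_review(ratios, threshold=0.05):
    """Flag a patient for an early in-person review when the chronic
    wound pixel ratio rises by more than `threshold` between two
    consecutive follow-up measurements."""
    return any(b - a > threshold for a, b in zip(ratios, ratios[1:]))
```

A nurse-facing dashboard could run this over each patient's stored ratio history and surface the flagged patients for contact.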
In the description of the present invention, it should be noted that, unless explicitly stated and agreed otherwise, the terms "disposed," "mounted," and "connected" are to be construed broadly, and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; can be directly connected or indirectly connected through an intermediate medium, and can be communication between two elements. The specific meaning of the above terms in the present invention can be understood by those of ordinary skill in the art according to the specific circumstances.
The above examples are preferred embodiments of the present invention, but the embodiments of the present invention are not limited to them; any other change, modification, substitution, combination or simplification that does not depart from the spirit and principle of the present invention is an equivalent replacement and is included in the protection scope of the present invention.

Claims (9)

1. The remote medical supervision system for the chronic wound diseases is characterized by comprising a patient user terminal, a medical user terminal and a cloud server, wherein the patient user terminal and the medical user terminal are respectively in communication connection with the cloud server;
The patient user terminal comprises a wound surface image acquisition module and a basic information acquisition module, wherein the wound surface image acquisition module is used for acquiring preliminary chronic wound surface image information of a patient and preprocessing the preliminary chronic wound surface image information of the patient; the basic information acquisition module is used for acquiring basic information of a patient;
The cloud server comprises a sample acquisition module, a wound analysis model construction module, a model training module, a wound analysis module, a storage module, an alarm module and an electronic case module; the sample acquisition module is used for acquiring chronic wound surface image information to be identified and constructing a corresponding chronic wound surface image data set; the wound analysis model construction module is used for constructing a wound analysis network model; the construction process of the wound analysis network model is as follows:
(1) Firstly, a VGG-16 deep convolutional neural network model is adopted as an initial model of the wound surface analysis network model, wherein the initial wound surface analysis network model consists of a first convolutional block, a second convolutional block, a third convolutional block, a fourth convolutional block, a fifth convolutional block, a first full-connection layer, a second full-connection layer and a third full-connection layer which are sequentially connected;
(2) Then, the first full connection layer, the second full connection layer and the third full connection layer in the initial wound surface analysis network model are adjusted to be convolution layers, then new convolution layers are a fourteenth convolution layer, a fifteenth convolution layer and a sixteenth convolution layer, the number of convolution kernels of the fourteenth convolution layer is set to be 512, the convolution kernel scale is 1×1, the number of convolution kernels of the fifteenth convolution layer is 1024, the convolution kernel scale is 1×1, then the number of convolution kernels of the sixteenth convolution layer is 1024, the number of output channels is adjusted to be 3, and the convolution kernel scale is set to be 1×1;
(3) Improving the network adjusted in the step (2) by adopting a bilinear interpolation method, so that the output image size of the improved network is restored to be the same as the input image size;
(4) Taking the network model adjusted in the step (3) as the wound analysis network model;
The model training module is used for training the wound analysis network model by adopting the chronic wound image data set to obtain a trained wound analysis network model; the storage module is used for storing all interaction data of the cloud server; the wound analysis module is used for analyzing the preprocessed chronic wound image information through the trained wound analysis network model to obtain wound analysis information; the alarm module is used for sending out corresponding alarm signals when the wound analysis information obtained by the wound analysis module is abnormal; the electronic case module is used for storing an electronic medical record of a patient, wherein the electronic medical record comprises the basic information and the historical wound surface analysis information of the patient;
the medical user terminal comprises a medical diagnosis module, a patient supervision module, an alarm receiving module and a doctor-patient interaction module; the medical diagnosis module is used for formulating treatment guidance and personalized intervention measures for the patient according to the wound analysis information and the electronic medical record, and recording them in the electronic medical record corresponding to the patient; the patient supervision module is used for regularly monitoring the chronic wound image information, wound analysis information and basic disease conditions of the patient, and evaluating the patient's health condition from them; the alarm receiving module is used for receiving alarm information sent by the alarm module, from which a doctor recommends the patient's next in-person visit time; the doctor-patient interaction module is used for communication between doctor and patient concerning the patient's condition.
2. The telemedicine supervision system for chronic wound diseases according to claim 1, wherein the wound image acquisition module comprises a wound filling frame unit, a verification model unit and a shooting unit, wherein the wound filling frame unit is used for generating a contour filling frame of a corresponding human body part, and the contour filling frame is arranged in a shooting interface of the shooting unit; in operation, the patient aligns the contour filling frame with the outline of the human body part around the wound and then shoots, obtaining the preliminary chronic wound image information.
3. Telemedicine monitoring system for chronic wound diseases according to claim 2, characterized in that the working steps of the verification model unit are as follows:
S1, generating an initial chronic wound image after the patient shoots, and segmenting the wound image from it based on skin detection through a pre-trained verification model, to obtain the pixel area SA1 of the human body part in the initial chronic wound image;
S2, acquiring a pixel area SA2 of a preset filling frame through a wound filling frame unit;
s3, overlapping the pixel area SA1 and the pixel area SA2 to obtain an overlapping intersection area SA3;
S4, judging, with the preset accuracies K1 and K2 as judgment conditions, whether the patient needs to correct the posture and shooting angle and retake the image.
4. The telemedicine supervision system for chronic wound diseases of claim 2, wherein the contour filling frames of the human body part include a buttock contour filling frame, a left lower limb top view filling frame, a left lower limb left view filling frame, a left lower limb bottom view filling frame, a left lower limb right view filling frame, a left lower limb foot plate filling frame, a left foot surface filling frame, a right lower limb top view filling frame, a right lower limb left view filling frame, a right lower limb bottom view filling frame, a right lower limb right view filling frame, a right lower limb foot plate filling frame, a right foot surface filling frame, a prone position head filling frame, a left lateral position head filling frame, a right lateral position head filling frame, a supine position chest and abdomen filling frame, a prone position back filling frame, and a special position filling frame.
5. The telemedicine supervisory system for chronic wound diseases according to claim 1, wherein the wound analysis information includes a chronic wound pixel ratio and chronic wound image information, the chronic wound pixel ratio being the ratio of the number of pixels of the chronic wound output by the wound analysis network model to the number of pixels of the human body part; the chronic wound pixel ratio and the chronic wound image information are used to assist a physician in analyzing the current chronic wound area and severity of a patient.
6. The telemedicine monitoring system for chronic wound disease of claim 1, wherein the training process of the wound analysis network model is as follows:
(a) Acquiring chronic wound surface image information to be identified in clinic, constructing a chronic wound surface image data set according to the chronic wound surface image information to be identified, and performing data enhancement processing on image data in the chronic wound surface image data set, wherein the data enhancement processing comprises translation, rotation, mirror symmetry and shearing of the image data; dividing a training set and a verification set from the processed chronic wound image dataset;
(b) Labeling the background, the human body part and the chronic wound surface in the image data in the training set by using CVAT, LabelImg or Label Studio image annotation software, thereby completing semantic annotation and image segmentation of the training set and obtaining a processed training set;
(c) Inputting the processed image data in the training set into the wound analysis network model, and circularly adjusting the weight value of each layer of convolution kernel in the model by adopting a gradient descent algorithm, thereby improving the semantic classification accuracy of each pixel;
(d) And after the weight value of the model tends to be stable, the training result of the K-fold cross validation model is used, and after the model passes the validation, the trained wound analysis network model is obtained.
7. The telemedicine monitoring system for chronic wound disease of claim 1, wherein the basic information includes patient name, height, weight, age, patient number, basic diseases, follow-up duration and number of visits, current medication, surgical history, risk level, and home address.
8. The telemedicine supervision system for chronic wound diseases according to claim 1, wherein the basic information acquisition module guides the patient to fill in his or her basic information in questionnaire form; after the patient has finished filling it in, the basic information acquisition module sends the basic information to the electronic case module, and the electronic case module records it in the electronic medical record corresponding to the patient.
9. The telemedicine monitoring system for chronic wound diseases of claim 1, wherein the medical user terminal further comprises a message pushing module for sending popular-science articles about chronic wounds, consultation services, the latest treatment techniques, home-care notes and visit reminder information to the patient user terminal.
CN202410306469.1A 2024-03-18 2024-03-18 Remote medical supervision system for chronic wound diseases Active CN118213092B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410306469.1A CN118213092B (en) 2024-03-18 2024-03-18 Remote medical supervision system for chronic wound diseases


Publications (2)

Publication Number Publication Date
CN118213092A CN118213092A (en) 2024-06-18
CN118213092B true CN118213092B (en) 2024-09-27

Family

ID=91451441

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410306469.1A Active CN118213092B (en) 2024-03-18 2024-03-18 Remote medical supervision system for chronic wound diseases

Country Status (1)

Country Link
CN (1) CN118213092B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118520508B (en) * 2024-07-23 2024-10-18 长春中医药大学 Retrospective correction-based coronary artery microvascular lesion prediction method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113241196A (en) * 2021-05-17 2021-08-10 中国科学院自动化研究所 Remote medical treatment and grading monitoring system based on cloud-terminal cooperation
CN117393156A (en) * 2023-12-12 2024-01-12 珠海灏睿科技有限公司 Multi-dimensional remote auscultation and diagnosis intelligent system based on cloud computing

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6829378B2 (en) * 2001-05-04 2004-12-07 Biomec, Inc. Remote medical image analysis
TWI614624B (en) * 2017-04-24 2018-02-11 太豪生醫股份有限公司 System and method for cloud medical image analyzing
CN113707301A (en) * 2021-08-30 2021-11-26 康键信息技术(深圳)有限公司 Remote inquiry method, device, equipment and medium based on artificial intelligence
CN116110615B (en) * 2023-04-12 2023-06-30 四川智康科技有限责任公司 Digital medical system for intervention in chronic pain




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant