
CN115205775A - Event prompting method, device, equipment and storage medium - Google Patents

Event prompting method, device, equipment and storage medium

Info

Publication number
CN115205775A
Authority
CN
China
Prior art keywords
pet
image
adverse reaction
vaccine
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210636287.1A
Other languages
Chinese (zh)
Inventor
彭永鹤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
New Ruipeng Pet Healthcare Group Co Ltd
Original Assignee
New Ruipeng Pet Healthcare Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by New Ruipeng Pet Healthcare Group Co Ltd filed Critical New Ruipeng Pet Healthcare Group Co Ltd
Priority to CN202210636287.1A
Publication of CN115205775A
Legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/52 - Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/70 - Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77 - Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80 - Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/40 - Scenes; Scene-specific elements in video content
    • G06V20/44 - Event detection
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The embodiments of the present application provide an event prompting method, apparatus, device and storage medium. The method includes: if a pet image captured after the pet's vaccination is received, acquiring an associated pet image and pet information of the pet; acquiring state characteristics of the pet according to the pet image and the associated pet image, and acquiring preset state characteristics of a vaccine adverse reaction event according to the pet information and the vaccine information of the inoculated vaccine; if the state characteristics of the pet are successfully matched with the preset state characteristics, determining that a vaccine adverse reaction event has occurred in the pet; and sending prompt information about the vaccine adverse reaction event to the object corresponding to the pet. By adopting the embodiments of the present application, the accuracy of judging adverse reactions and the effectiveness of state monitoring can be improved.

Description

Event prompting method, device, equipment and storage medium
Technical Field
The present application relates to the field of internet technologies, and in particular, to an event prompting method, apparatus, device, and storage medium.
Background
As the economy develops, more and more people keep pets. Pets need to be inoculated with different vaccines at different life stages, reactions of varying severity may occur after vaccination, and different pets may react differently; severe adverse reactions can be life-threatening. The pet's owner typically judges whether an adverse reaction has occurred based on the pet's condition, or notifies a professional who makes the judgment. However, human judgment is subjective, and different pets react differently, so misjudgments can occur.
Disclosure of Invention
The embodiments of the present application provide an event prompting method, apparatus, device and storage medium. After a pet is vaccinated, whether a vaccine adverse reaction event has occurred in the pet is determined from the state characteristics obtained from the current pet image and the associated pet image, which can improve the accuracy of judging adverse reactions and the effectiveness of state monitoring.
An aspect of the embodiments of the present application provides an event prompting method, including: if a pet image captured after the pet's vaccination is received, acquiring an associated pet image and pet information of the pet; acquiring state characteristics of the pet according to the pet image and the associated pet image, and acquiring preset state characteristics of a vaccine adverse reaction event according to the pet information and the vaccine information of the inoculated vaccine; if the state characteristics of the pet are successfully matched with the preset state characteristics, determining that a vaccine adverse reaction event has occurred in the pet; and sending prompt information about the vaccine adverse reaction event to the object corresponding to the pet. In this way, after the pet is vaccinated, whether a vaccine adverse reaction event has occurred is determined from the state characteristics obtained from the current pet image and the associated pet image, which can improve the accuracy of judging adverse reactions and the effectiveness of state monitoring.
In one possible example, obtaining the state characteristics of the pet according to the pet image and the associated pet image includes: acquiring a correlation value between the pet image and the associated pet image; acquiring first state data of the pet according to the pet image, and second state data of the pet according to the associated pet image; and fusing the first state data and the second state data according to the correlation value to obtain the state characteristics of the pet. Fusing the first state data obtained from the pet image with the second state data obtained from the associated pet image according to the correlation value improves the accuracy of acquiring the pet's state characteristics, which in turn helps improve the accuracy of judging whether an adverse reaction has occurred.
In one possible example, acquiring the correlation value between the pet image and the associated pet image includes: respectively acquiring scene data and/or physiological parameters for the pet image and the associated pet image; and acquiring the correlation value between the pet image and the associated pet image according to the scene data and/or the physiological parameters. This improves the accuracy of the correlation value, which in turn improves the accuracy of judging whether an adverse reaction has occurred.
In one possible example, after obtaining the state characteristics of the pet according to the pet image and the associated pet image, the method further includes: updating the state characteristics of the pet according to the physiological parameters. This further improves the accuracy of the state characteristics, which helps improve the accuracy of judging whether an adverse reaction has occurred.
In one possible example, before sending the prompt information about the vaccine adverse reaction event to the object corresponding to the pet, the method further includes: obtaining a risk level of the vaccine adverse reaction event according to the state characteristics of the pet; obtaining caution items for the vaccine adverse reaction event according to the risk level; and generating the prompt information about the vaccine adverse reaction event according to the caution items. Generating the prompt information from the risk level improves the accuracy and effectiveness of the prompt.
In one possible example, after obtaining the risk level of the vaccine adverse reaction event according to the state characteristics of the pet, the method further includes: if the risk level is greater than a preset level, sending a call request to the hospital corresponding to the pet, the call request being used to instruct the hospital to provide medical resources. This speeds up the hospital's emergency response and improves the effectiveness of state monitoring.
In one possible example, before sending the call request to the hospital corresponding to the pet, the method further includes: obtaining the location, insurance information and historical visit records of the pet; determining the priority of each of a plurality of hospitals corresponding to the pet's location according to the insurance information and the historical visit records; and determining the hospital corresponding to the pet according to the priorities. Determining the hospital that receives the call request according to hospital priority improves the efficiency and effect of emergency treatment.
An aspect of the present application provides an event prompting apparatus, including:
a communication unit, configured to receive a pet image of a pet;
a processing unit, configured to acquire state characteristics of the pet according to the pet image and the associated pet image, and to acquire preset state characteristics of a vaccine adverse reaction event according to the pet information and the vaccine information of the inoculated vaccine; and, if the state characteristics of the pet are successfully matched with the preset state characteristics, to determine that a vaccine adverse reaction event has occurred in the pet;
the communication unit is further configured to send prompt information about the vaccine adverse reaction event to the object corresponding to the pet.
An aspect of an embodiment of the present application provides a computer device including a memory and a processor connected to the memory. Wherein the memory is configured to store a computer program, and the processor is configured to invoke the computer program to cause the computer device to perform the method provided by the above-mentioned aspect in the embodiments of the present application.
An aspect of an embodiment of the present application provides a computer-readable storage medium. The computer readable storage medium has stored therein a computer program, which is adapted to be loaded and executed by a processor, so as to cause a computer device having the processor to execute the method provided by the above-mentioned aspect in the embodiments of the present application.
According to one aspect of the present application, a computer program product is provided that includes computer instructions stored in a computer-readable storage medium. The processor of the computer device reads the computer instructions from the computer readable storage medium, and the processor executes the computer instructions to cause the computer device to perform the method provided by the above-mentioned aspect.
By adopting the embodiment of the application, if the pet image after the vaccination of the pet is received, the associated pet image and the pet information of the pet are obtained. And then obtaining the state characteristics of the pet according to the pet image of the pet and the related pet image, and obtaining the preset state characteristics of the vaccine adverse reaction event according to the pet information and the vaccine information of the inoculated vaccine. And if the state characteristics of the pet are successfully matched with the preset state characteristics, determining that the pet has the vaccine adverse reaction event. That is, after the pet is vaccinated, whether the pet has a vaccination adverse reaction event is determined by the current pet image and the acquired state characteristics of the associated pet image, so that the accuracy of determining the adverse reaction can be improved. And then after the occurrence of the vaccine adverse reaction event is determined, prompt information of the vaccine adverse reaction event can be sent to the object corresponding to the pet, so that the effectiveness of state monitoring is improved.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic structural diagram of a network architecture according to an embodiment of the present application;
fig. 2 is a scene schematic diagram of an event prompt provided in an embodiment of the present application;
fig. 3 is a schematic flowchart of an event notification method according to an embodiment of the present application;
fig. 4 is a schematic view of a scene for uploading pet images according to an embodiment of the present application;
fig. 5 is a schematic flowchart of a state feature identification method according to an embodiment of the present application;
fig. 6 is a schematic view of another scenario of event notification according to an embodiment of the present application;
fig. 7 is a schematic structural diagram of an event notification apparatus according to an embodiment of the present application;
fig. 8 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
The type of pet is not limited in the present application, and may include common pets such as cats, dogs, birds and turtles, as well as less common ones such as chickens, hognose snakes, lizards, geckos and lions. In the embodiment of the present application, the pet information may include identification information such as the pet's name and nose print, and may further include pet attributes such as the pet's breed, age, sex, skin color and coat color. The pet information may also include historical visit records, historical vaccination records, insurance information, and the like.
The historical visit record records information about each of the pet's visits, and may include the visit information corresponding to each visit time. The visit information may include the type of visit, the present illness history, the past medical history, physical examination data, diagnosis information, and the like, and may also include the visiting hospital, follow-up messages, etc., which are not limited herein. The historical vaccination record records the pet's vaccination history, such as whether a vaccine has been administered and whether an adverse reaction occurred. The insurance information describes the insurance purchased for the pet, such as the insurance name, policy number, insurance company, policyholder, insured party, beneficiary, coverage scope, policy terms, payout amount, claim settlement and the like, which are not limited herein.
Further, the pet information may further include preference characteristics of the pet, such as diet characteristics, work and rest characteristics, exercise preferences, health status, and the like, which are not limited herein.
The object corresponding to the pet is not limited in the present application, and may be the owner or custodian of the pet. The number of objects corresponding to the pet may be 1, 2 or more. In the embodiment of the present application, the object information may include an object attribute of the object. The object attributes may include identification information (e.g., name, identity, account identification, etc.) of the object, basic information such as age, gender, occupation, address, etc., and may also include social data of the object, such as social relationships in a network or real life. Object properties may alternatively include tags of the object, e.g., hobbies, behavior habits, etc.
The object information may be further divided into instant data and historical data according to the occurrence time, and the instant data may include data related to the current use of the user terminal by the object, such as keywords currently searched, contents browsed, and the like. The history data may include shopping records, browsing records, etc. of prior uses of the object with the user terminal or other user terminals. The preference feature of the object may be obtained through the historical data, so that whether to send the corresponding push message to the object may be determined based on whether there is a match between the data feature of the instant scene and the preference feature of the object.
The pet information of the pet and/or the object information of the object may be uploaded to a server for storage, or stored in a Blockchain (Blockchain). The block chain in the embodiment of the application is a novel application mode of computer technologies such as distributed data storage, point-to-point transmission, a consensus mechanism, an encryption algorithm and the like. The blockchain is essentially a decentralized database, which is a string of data blocks associated by using cryptography, each data block contains information of a batch of network transactions, and the information is used for verifying the validity (anti-counterfeiting) of the information and generating the next block. The blockchain may include a blockchain underlying platform, a platform product service layer, an application service layer, and the like. Therefore, data are stored in a distributed mode through the block chains, data security is guaranteed, and meanwhile data sharing of information among different platforms can be achieved.
The scene for storing the pet information and/or the object information is not limited, and the scene can be the scene that the pet sees a doctor, gets a vaccine, beautifies, checks up the body, transacts hospitalization business, transacts insurance business, transacts identity information and the like in a hospital. It can be understood that after the pet information and the object information are stored, the information of the pet can be pushed to the object based on the pet information and the object information, and the accuracy rate of pushing the pet information is improved.
It should be noted that, in the specific implementation manner of the present application, data of objects such as users, enterprises, institutions, etc. may be involved, when the above embodiments of the present application are applied to specific products or technologies, permission or consent of the objects such as users, enterprises, institutions, etc. needs to be obtained, and collection, use and processing of relevant data need to comply with relevant laws and regulations and standards of relevant countries and regions.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Referring to fig. 1, fig. 1 is a schematic structural diagram of a network architecture according to an embodiment of the present disclosure. As shown in fig. 1, the network architecture may include a server 10d and a user terminal cluster, which may include one or more user terminals, where the number of user terminals is not limited. As shown in fig. 1, the user terminal cluster may specifically include a user terminal 10a, a user terminal 10b, a user terminal 10c, and the like.
The server 10d may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, or a cloud server providing basic cloud computing services such as a cloud service, a cloud database, cloud computing, a cloud function, cloud storage, a network service, cloud communication, a middleware service, a domain name service, a security service, a Content Delivery Network (CDN), a big data and artificial intelligence platform, and the like.
The user terminal 10a, the user terminal 10b, the user terminal 10c, and the like may each be an electronic device with a video/image playing function, such as a smartphone, a tablet computer, a notebook computer, a palmtop computer, a mobile internet device (MID), a wearable device (e.g., a smart watch or a smart bracelet), an intelligent voice interaction device, a smart home appliance (e.g., a smart TV), an in-vehicle device, and the like.
As shown in fig. 1, the user terminal 10a, the user terminal 10b, the user terminal 10c, etc. may each be connected to the server 10d via a network, so that each user terminal can exchange data with the server 10d over the network. For example, the first object sends its object information to the server 10d through the user terminal 10a, and the second object receives a push message from the server 10d through the user terminal 10b. As another example, the server 10d sends vaccination guidance information to the target object via the user terminal 10a.
Referring to fig. 2, fig. 2 is a schematic view illustrating an event prompt scenario provided in an embodiment of the present application. As shown in fig. 2, the object corresponding to the pet may upload a pet image 201 through the user terminal 10 a. The server 10d may then obtain the associated pet image and the pet information of the pet and determine whether the pet has a vaccine adverse reaction event according to the pet image 201, the associated pet image and the pet information. If yes, sending prompt information of the vaccine adverse reaction event to the user terminal 10 a.
The associated pet image may include any historical image of the pet, or may include a historical image of the pet image 201 acquired after vaccination and before the pet image 201, or may include a historical image of the pet acquired after last vaccination with the same type of vaccine, and the like, which is not limited herein.
A vaccine adverse reaction event means that the pet has had an adverse reaction. The type of adverse reaction is not limited in the present application and may include mental depression, dyspnea, severe vomiting, eye edema making it difficult to open the eyes, red rash (or urticaria) on the skin, loss of appetite, lethargy, redness and swelling at the injection site, and the like.
It should be noted that the above scenario diagram of event prompt is only used for example. In fact, the corresponding object of the pet can also upload the physiological parameters of the pet. The physiological parameters may include measured data such as body temperature, heart rate, water intake, urine output, respiration, pulse, blood pressure, and the like. The physiological parameters may be data obtained according to the surface phenomena of the pet, such as mood, food amount, sleeping time, etc., but not limited thereto.
Referring to fig. 3, fig. 3 is a schematic flowchart illustrating an event notification method according to an embodiment of the present disclosure. The method may be executed by an event notification apparatus or a computer device, where the computer device may be a server (e.g., the server 10d in the embodiment corresponding to fig. 1), or a user terminal (e.g., any one of the user terminals in the user terminal cluster shown in fig. 1), or a computer program (including program code), etc. As shown in fig. 3, the method includes the following steps S301 to S304, wherein:
s301: and if the pet image after the vaccination of the pet is received, acquiring the associated pet image and the pet information of the pet.
The pet image, the associated pet image and the pet information may refer to the foregoing or the following description, and are not repeated here. The pet to which the present application relates is a vaccinated pet. The receiving period of the post-vaccination pet image is not limited; it may be a window such as one day, 8 hours or 12 hours, or a predetermined fixed interval, for example every half hour or every hour. As another example, images may be received every 5 minutes for the first 30 minutes and every half hour thereafter. It will be appreciated that dangerous adverse reactions are most likely in the first 30 minutes after vaccination, so shorter intervals are used to obtain pet images during that period, and images may then be acquired at slightly longer intervals, thereby increasing the effectiveness of the monitoring through dynamic acquisition intervals.
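As an illustration of such a dynamic schedule, the following minimal Python sketch uses the example intervals from the paragraph above (5 minutes within the first half hour, 30 minutes afterwards); the function name and structure are hypothetical and not part of the patent.

```python
from datetime import timedelta

def next_capture_interval(elapsed_since_vaccination: timedelta) -> timedelta:
    """Return how long to wait before requesting the next pet image.

    Dangerous adverse reactions are most likely in the first 30 minutes
    after vaccination, so images are requested more frequently then
    (example values from the description above).
    """
    if elapsed_since_vaccination < timedelta(minutes=30):
        return timedelta(minutes=5)   # dense monitoring right after vaccination
    return timedelta(minutes=30)      # sparser monitoring afterwards
```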
In one possible example, the time information of uploading the pet image is determined according to the associated pet image and/or the pet information of the pet; and sending push information corresponding to the time information to an object corresponding to the pet.
The time information may include a time or a time interval for uploading the pet image, and the like. The push information may indicate a specific time or time range. It is to be appreciated that the associated pet image may be understood as a previously uploaded pet image that may be indicative of a status of the pet. The pet information may be used to determine the response of different types of pets to vaccination. Therefore, in this example, the time information for uploading the pet image may be determined according to the associated pet image and/or the pet information of the pet, and the push information corresponding to the time information may be sent to the object corresponding to the pet, so that the object may upload the pet image based on the time or time interval in the push information.
Exemplarily, please refer to fig. 4, and fig. 4 is a schematic view of a scene for uploading pet images according to an embodiment of the present application. As shown in fig. 4, the server 10d may send push information corresponding to the time information of uploading the pet image to the user terminal 10a, so that "please upload the pet image within 5 minutes" is displayed in the user terminal 10 a. The object 101 may upload the pet image 201 in the upload page 200 of the user terminal 10a and click the upload component 202 for uploading, so that the server 10d receives the pet image 201 after pet vaccination. If the object 101 clicks the cancel component 203, the uploading of the pet image 201 is canceled, and other pet images may be uploaded or no more pet images may be uploaded.
The pet image can be shot and uploaded by a corresponding object of the pet, or can be shot and uploaded by a monitoring camera, and the like. Or the image of the pet can be extracted from the video collected by the monitoring camera, and then the image of the pet is screened, so that incomplete images are filtered out, clear images are reserved, and the accuracy of the pet image analysis state is improved.
The associated pet image and the pet information can be retrieved based on the nose print features extracted from the pet image. The method for identifying the nose print features is not limited; for example, the pet image may first be input into a nose print recognition platform, which outputs the nose print features. The nose print recognition platform may preprocess the pet image, for example image denoising, image correction, image enhancement, feature completion, and the like. Frame detection is then performed on the preprocessed image to obtain a nose print image, and feature extraction is performed on the nose print image to obtain its nose print features. The frame detection may be implemented with multiple frame detection models and the feature extraction with multiple feature extraction models, where each frame detection model may have a corresponding feature extraction model. Identifying the nose print features with multiple models in this way improves recognition accuracy. Before preprocessing, the pet image may also undergo authenticity verification, that is, liveness detection, to confirm that the pet in the image is a real pet.
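A minimal sketch of this kind of multi-model pipeline is shown below; the preprocessing stub and the detector/extractor callables are hypothetical placeholders, and averaging the per-model features is an assumed fusion step rather than the platform's actual behaviour.

```python
from typing import Callable, List
import numpy as np

def preprocess(image: np.ndarray) -> np.ndarray:
    # Placeholder for denoising, correction, enhancement and feature completion
    return image

def recognize_nose_print(pet_image: np.ndarray,
                         detectors: List[Callable[[np.ndarray], np.ndarray]],
                         extractors: List[Callable[[np.ndarray], np.ndarray]]) -> np.ndarray:
    """Hypothetical nose print pipeline: preprocess the image, crop the nose
    print region with each frame detection model, extract features with the
    corresponding feature extraction model, and fuse the per-model features."""
    image = preprocess(pet_image)
    features = []
    for detect, extract in zip(detectors, extractors):
        nose_region = detect(image)             # frame detection yields the nose print image
        features.append(extract(nose_region))   # per-model nose print feature vector
    return np.mean(np.stack(features), axis=0)  # simple fusion by averaging
```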
S302: and acquiring the state characteristics of the pet according to the pet image and the associated pet image of the pet, and acquiring the preset state characteristics of the vaccine adverse reaction event according to the pet information and the vaccine information of the vaccination.
S303: and if the state characteristics of the pet are successfully matched with the preset state characteristics, determining that the pet has the vaccine adverse reaction event.
In this embodiment, adverse reactions in vaccine adverse reaction events may include mental depression, dyspnea, severe vomiting, eye edema making it difficult to open the eyes, red rash (or hives) on the skin, loss of appetite, lethargy, redness and swelling at the injection site, and the like. The state characteristics and the preset state characteristics may include the following categories: mental state characteristics, gastrointestinal state characteristics, skin state characteristics, and the like, which are not limited herein. It can be understood that if the pet has an adverse reaction, the corresponding state characteristics will appear. The pet's state characteristics are matched against the preset state characteristics of the vaccine adverse reaction event, and if the match succeeds, it is determined that a vaccine adverse reaction event has occurred.
The vaccine information for the vaccination may include the vaccine manufacturer, the vaccine type (e.g., attenuated live vaccine, inactivated vaccine, etc.), the number of doses, the intended recipients, the recommended vaccination time, known adverse reactions, and so on. It can be understood that different pets may react differently to the same vaccine; therefore, in the embodiment of the present application, the preset state characteristics of the vaccine adverse reaction event are obtained according to the pet information and the vaccine information of the vaccination, which improves the accuracy of determining whether an adverse reaction has occurred. Specifically, among the state characteristics that the vaccine information records for vaccine adverse reaction events, the state characteristics corresponding to one or more of the pet's breed, age, pre-vaccination physiological parameters and the like may be looked up and used as the preset state characteristics of the vaccine adverse reaction event.
The method for acquiring the state characteristics of the pet is not limited: the state characteristics of the pet image and of the associated pet image may be acquired separately and then fused, or a state recognition model may be trained in advance, with the pet image and the associated pet image input into the model to obtain the pet's state characteristics. The state recognition model may perform unsupervised learning based on Contrastive Predictive Coding (CPC).
The state recognition model may include a feature recognition network and a feature learning network. The feature recognition network may include a deep residual network (ResNet), which addresses the degradation problem of deep networks through residual learning and allows deeper networks to be trained. The feature learning network may include a recurrent neural network (RNN) for predicting the state information at the next moment from the previous state information; specifically, it may include a long short-term memory network (LSTM), which addresses the long-term dependency problem.
Exemplarily, please refer to fig. 5, and fig. 5 is a schematic flowchart of a state feature identification method according to an embodiment of the present application. As shown in fig. 5, the pet image and the related pet image are respectively input to the feature recognition network, and the first state feature corresponding to the pet image and the second state feature corresponding to the related pet image are obtained. And then inputting the first state characteristic and the second state characteristic into the characteristic learning network to obtain the state characteristic of the pet and the state characteristic at the next moment.
Further, the pet image and the associated pet image may be divided into a plurality of images of different parts, for example, a face image and a limb image, and the face image may be further divided into an eye image, a nose image, a mouth image, and the like. The images of different parts can be respectively input into the feature recognition network to obtain the state features of different parts in a single image. And then the state features of different parts in the single image are input into the feature learning network, and the state features of the pet image and the associated pet image are subjected to feature learning, so that the accuracy of feature recognition can be improved, and the state features at the next moment can be predicted. If the number of the associated pet images is multiple, the second state features of the associated pet images may be sequentially acquired in time order. Alternatively, the number of associated pet images is a fixed number, e.g., 5, etc. Therefore, the state characteristics of the pet can be determined by adopting a certain number of associated pet images and pet images, and the state characteristics at the next moment can be predicted, so that the accuracy and efficiency of judging the adverse reaction can be improved.
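For illustration only, the kind of model described above (a ResNet feature recognition network followed by an LSTM feature learning network that also predicts the state feature at the next moment) could be sketched in PyTorch as follows; the use of torchvision's resnet18, the feature dimension and the prediction head are assumptions, not the patent's actual architecture.

```python
import torch
import torch.nn as nn
from torchvision.models import resnet18

class StateRecognitionModel(nn.Module):
    """Sketch: per-image state features from a ResNet backbone, temporal
    feature learning over the associated images plus the current image
    with an LSTM, and a linear head predicting the next-moment state."""

    def __init__(self, feature_dim: int = 128):
        super().__init__()
        backbone = resnet18(weights=None)
        backbone.fc = nn.Linear(backbone.fc.in_features, feature_dim)
        self.feature_net = backbone                       # feature recognition network
        self.feature_learner = nn.LSTM(feature_dim, feature_dim,
                                       batch_first=True)  # feature learning network
        self.next_state_head = nn.Linear(feature_dim, feature_dim)

    def forward(self, image_sequence: torch.Tensor):
        # image_sequence: (batch, time, 3, H, W), associated images first,
        # current pet image last
        b, t, c, h, w = image_sequence.shape
        per_image = self.feature_net(image_sequence.reshape(b * t, c, h, w))
        outputs, _ = self.feature_learner(per_image.reshape(b, t, -1))
        state_feature = outputs[:, -1]                    # state feature of the pet
        next_state = self.next_state_head(state_feature)  # predicted next-moment state
        return state_feature, next_state
```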
In one possible example, obtaining the state characteristics of the pet according to the pet image and the associated pet image may include the following steps: acquiring a correlation value between the pet image and the associated pet image; acquiring first state data of the pet according to the pet image, and second state data of the pet according to the associated pet image; and fusing the first state data and the second state data according to the correlation value to obtain the state characteristics of the pet.
The correlation value between the pet image and the associated pet image describes the degree of correlation between the two images. It may be determined from a similarity value between the pet image and the associated pet image. Alternatively, in one possible example, obtaining the correlation value between the pet image and the associated pet image may include the following steps: acquiring scene data and/or physiological parameters for the pet image and the associated pet image; and acquiring the correlation value between the two images according to the scene data and/or the physiological parameters.
The scene data may include, among other things, the time at which the pet image and associated pet image were taken, the angle at which the pet was taken, the camera device, etc. The physiological parameters can be referred to the above or the following, and are not described herein again. The correlation value may be obtained from the difference of the scene data and/or the parameter of the physiological parameter. The difference of the scene data may include a time interval between the capturing of the pet image and the associated pet image, and the time interval may be obtained by a difference of times between the capturing of the pet image and the associated pet image, or may be obtained by a product of a preset time interval of the pet image and a number of images between the pet image and the associated pet image.
It will be appreciated that the closer the scene data or physiological parameters of the two images, the greater the reference value of the associated pet image. In this example, the correlation value between the pet image and the associated pet image is obtained from the scene data and/or physiological parameters of the two images, which improves the accuracy of the correlation value and, in turn, the accuracy of judging whether an adverse reaction has occurred.
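One simple way to realise this, shown purely as an assumed sketch, is to let the correlation value decay as the shooting interval (scene data) and the physiological parameter difference grow; the exponential form and the scale constants are illustrative choices, not the patent's formula.

```python
import math

def correlation_value(interval_minutes: float,
                      physio_difference: float = 0.0,
                      time_scale: float = 60.0,
                      physio_scale: float = 1.0) -> float:
    """Closer scene data and physiological parameters give a larger
    correlation value in (0, 1]; the decay constants are assumed."""
    time_term = math.exp(-abs(interval_minutes) / time_scale)
    physio_term = math.exp(-abs(physio_difference) / physio_scale)
    return time_term * physio_term
```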
In the embodiment of the present application, the first state data and the second state data are the state data corresponding to the pet image and the associated pet image respectively, and may include the aforementioned state features (such as the first state feature and the second state feature shown in fig. 5) or the data corresponding to those features. The first state data and the second state data may share the same feature dimensions, e.g. a mental dimension, a skin dimension, and so on. Taking the pet image as an example, in one possible example, obtaining the first state data of the pet according to the pet image may include the following steps: acquiring action data, expression data and color data of the pet according to the pet image; and acquiring mental state data of the pet according to the action data and the expression data, and skin state data of the pet according to the color data.
The motion data can be acquired through the limb motion of the pet in the pet image. The expression data can be acquired through a facial image in the pet image, and specifically, the expression data can be acquired according to the nasal print features of the nasal print image in the pet image, for example, the line features of the nose and the micro-expression corresponding to the line features generated by the micro-motion. The color data may be determined by the color of the hair or the color of the face in the pet image. The mental state data can be obtained through action amplitude corresponding to the action data and emotion vectors corresponding to the expression data, and the skin state data can be obtained through areas with abnormal colors in the color data. It can be understood that the images or image data related to the state data are extracted from the pet images, so that the accuracy of acquiring the state data can be improved, and the accuracy of judging whether adverse reaction occurs can be improved. Further, health status data is determined from the mental status data and the skin status data. That is, determining the health status data according to the mapping relationship between the mental status and the health status and the mapping relationship between the skin status and the health status can improve the accuracy of determining the adverse reaction.
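As a rough, assumed sketch of this step, the mental state value below is combined from the action amplitude and an emotion score, and the skin state value is taken from the abnormally coloured area; the exact combination rules are placeholders, not the patent's method.

```python
import numpy as np

def first_state_data(action_amplitude: float,
                     emotion_vector: np.ndarray,
                     abnormal_color_area: float) -> dict:
    """Derive mental and skin state data from the action, expression and
    color data extracted from a single pet image (rules assumed)."""
    emotion_score = float(np.mean(emotion_vector))   # scalar summary of the emotion vector
    mental_state = action_amplitude * emotion_score  # combine motion amplitude and mood (assumed rule)
    skin_state = abnormal_color_area                 # larger abnormal-color area means worse skin state
    return {"mental": mental_state, "skin": skin_state}
```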
The method of fusing the first state data and the second state data according to the correlation value is not limited in the present application. The correlation value may determine the weights of the first state data and the second state data: the larger the correlation value, the closer the two weights; the smaller the correlation value, the more the weight of the first state data exceeds the weight of the second state data. The state features corresponding to the first state data and the second state data are then weighted accordingly to obtain the state characteristics of the pet. Alternatively, the state feature corresponding to the second state data may be multiplied by the correlation value and added to the state feature corresponding to the first state data to obtain the state characteristics of the pet.
It can be understood that the first state data obtained by the pet image and the second state data obtained by the associated pet image are fused according to the associated value, so that the accuracy of obtaining the state characteristics of the pet can be improved, and the accuracy of judging whether the adverse reaction occurs can be improved.
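A minimal sketch of the correlation-weighted fusion described above follows; the particular weighting rule (an equal split scaled by the correlation value) is an assumption consistent with, but not identical to, the two options the text mentions.

```python
import numpy as np

def fuse_state_data(first_state: np.ndarray,
                    second_state: np.ndarray,
                    correlation: float) -> np.ndarray:
    """Correlation-weighted fusion: a correlation value near 1 makes the two
    weights nearly equal, a small value lets the current pet image dominate
    (weighting scheme assumed)."""
    w_second = 0.5 * correlation            # correlation assumed to lie in (0, 1]
    w_first = 1.0 - w_second
    return w_first * np.asarray(first_state, dtype=float) \
         + w_second * np.asarray(second_state, dtype=float)
```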
In one possible example, after obtaining the status feature of the pet according to the pet image of the pet and the associated pet image, the method further includes: and updating the state characteristics of the pet according to the physiological parameters.
The physiological parameters can reflect the state of the pet and can therefore be used to update its state characteristics. The method of updating is not limited: reference state features corresponding to the physiological parameters may be obtained from the mapping relationship between physiological parameters and the various state features, and the state characteristics are updated according to these reference features, for example by adding sub-features not yet included (e.g., heart rate, blood pressure) and re-weighting sub-features already included (e.g., emotional features, skin features). This further improves the accuracy of the state characteristics, which helps improve the accuracy of judging whether an adverse reaction has occurred.
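A small assumed sketch of this update step: sub-features the image-based state feature lacks (e.g. heart rate) are added, and sub-features it already contains are re-weighted with the reference values; the dictionary representation and the weighting factor are placeholders.

```python
def update_state_feature(state_feature: dict,
                         reference_feature: dict,
                         weight: float = 0.5) -> dict:
    """Merge the reference state feature derived from physiological
    parameters into the image-based state feature (scheme assumed)."""
    updated = dict(state_feature)
    for name, value in reference_feature.items():
        if name in updated:
            # shared sub-feature (e.g. mental, skin): weighted combination
            updated[name] = (1 - weight) * updated[name] + weight * value
        else:
            # missing sub-feature (e.g. heart rate, blood pressure): add it
            updated[name] = value
    return updated
```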
S304: and sending prompt information of the vaccine adverse reaction event to a target corresponding to the pet.
In the embodiment of the present application, the prompt information about the vaccine adverse reaction event can be sent to the object corresponding to the pet by short message, email, in-app push and the like. The content of the prompt information is not limited in the present application; it may include alarm information for the vaccine adverse reaction event, an emergency handling method, caution items and the like. The alarm information informs the object that the pet has had a vaccine adverse reaction event. The emergency handling method informs the object how to deal with the event, such as taking the pet to a pet hospital or reducing outings. The caution items inform the object of matters requiring attention, such as avoiding bathing, avoiding contact with sick animals, strengthening care, and avoiding external stress stimulation.
In one possible example, before step S304, the method may further include the following steps: acquiring a risk level of the vaccine adverse reaction event according to the state characteristics of the pet; acquiring caution items for the vaccine adverse reaction event according to the risk level; and generating prompt information for the vaccine adverse reaction event according to the caution items.
The risk level describes the extent to which the vaccine adverse reaction event affects the pet's life. It can be determined from state grades corresponding to the state characteristics, for example a grade corresponding to mental instability, a gastrointestinal grade corresponding to the vomiting frequency, a pruritus grade corresponding to the area of itching skin, and the like.
It can be understood that in this example the risk level of the vaccine adverse reaction event is obtained from the pet's state characteristics, the caution items are then obtained from the risk level, and the prompt information is generated from the caution items. Generating the prompt information through the risk level in this way improves the accuracy and effectiveness of the prompt.
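The following assumed sketch shows how a risk level could be mapped to caution items and a prompt message; the level values, the mapping table and the message text are illustrative placeholders only.

```python
CAUTION_ITEMS = {  # hypothetical mapping from risk level to caution items
    1: ["enhance care", "avoid bathing"],
    2: ["avoid bathing", "reduce going out", "avoid external stress stimulation"],
    3: ["take the pet to a pet hospital immediately"],
}

def build_prompt(risk_level: int) -> str:
    """Generate prompt information for a vaccine adverse reaction event
    from its risk level (mapping assumed)."""
    items = CAUTION_ITEMS.get(risk_level, CAUTION_ITEMS[3])
    return (f"A vaccine adverse reaction event has occurred (risk level {risk_level}). "
            "Caution items: " + "; ".join(items))
```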
In one possible example, after acquiring the risk level of the vaccine adverse reaction event according to the status characteristics of the pet, the method further comprises the following steps: and if the danger level is greater than the preset level, sending a calling request to a hospital corresponding to the pet.
The call request is used to instruct the hospital to provide medical resources, such as an ambulance, the assignment of attending doctors and nurses, or the preparation of medical devices, which are not limited herein. The hospital corresponding to the pet is not limited either; it may be a hospital near the pet, a hospital where insurance reimbursement can be handled, a hospital the pet often visits, a highly rated hospital, a hospital without a queue, and so on.
In one possible example, before sending the call request to the hospital corresponding to the pet, the method further includes: acquiring the location, insurance information and historical visit records of the pet; determining the priority of each of a plurality of hospitals corresponding to the pet's location according to the insurance information and the historical visit records; and determining the hospital corresponding to the pet according to the priorities.
The location of the pet is its current location, which may be a specific position or an area. The plurality of hospitals corresponding to the pet's location may be the hospitals within an area, or hospitals no farther away than a certain travel time (e.g., 30 minutes), and so on. The insurance information and historical visit records are as described above and are not limited here. The priority describes the order in which hospitals would handle the call request: the higher the priority, the more likely the pet is taken to that hospital. The priority can be computed by weighting the distance between the pet and the hospital, whether the hospital supports insurance reimbursement, and whether the pet has been treated at the hospital before.
It can be understood that, in this example, the priority of each hospital corresponding to the pet's location is determined according to the pet's insurance information and historical visit records, so that the hospital that receives the call request is determined according to hospital priority, which improves the efficiency and effect of emergency treatment.
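To make the weighted priority concrete, here is an assumed sketch; the factors (travel time, insurance support, visit history), their weights and the 30-minute reference are illustrative, not the patent's scoring rule.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Hospital:
    name: str
    travel_minutes: float       # travel time between the pet and the hospital
    supports_insurance: bool    # whether the pet's insurance is reimbursable there
    previously_visited: bool    # whether the pet has been treated there before

def hospital_priority(h: Hospital,
                      w_distance: float = 0.5,
                      w_insurance: float = 0.3,
                      w_history: float = 0.2) -> float:
    """Weighted priority score: nearer, insurance-supporting and previously
    visited hospitals rank higher (weights assumed)."""
    distance_score = max(0.0, 1.0 - h.travel_minutes / 30.0)
    return (w_distance * distance_score
            + w_insurance * float(h.supports_insurance)
            + w_history * float(h.previously_visited))

def choose_hospital(hospitals: List[Hospital]) -> Hospital:
    """Pick the highest-priority hospital to receive the call request."""
    return max(hospitals, key=hospital_priority)
```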
The preset grade is not limited in the application, and can be a fixed grade, or can be determined according to the type of the vaccination, or can be determined according to the distance between the corresponding object of the pet and the pet, and the like. It can be understood that different vaccinations have different adverse reactions, and the accuracy of hospital calling can be improved by determining the preset grade according to the type of the vaccinations.
Exemplarily, please refer to fig. 6, and fig. 6 is a schematic view of another scenario of event prompt according to an embodiment of the present application. As shown in fig. 6, the object corresponding to the pet may upload the pet image 201 through the user terminal 10 a. The server 10d may then obtain status characteristics of the pet based on the pet image 201 and the associated pet image. And if the state characteristics of the pet are successfully matched with the preset state characteristics of the vaccine adverse reaction event, determining that the pet has the vaccine adverse reaction event. The server 10d may also obtain the risk level of the vaccine adverse reaction event according to the status characteristics of the pet. If the risk level is greater than the preset level, the prompt message of the vaccine adverse reaction event generated by the risk level may be sent to the user terminal 10a, and a call request may be sent to the hospital 202.
It should be noted that fig. 6 illustrates the method performed when the pet is not currently in a hospital. If the pet is currently in a hospital, the hospital corresponding to the pet is the hospital where it currently is, and the call request is used to instruct a specific department, doctor or nurse of that hospital to carry out adverse reaction treatment for the pet.
In the method shown in fig. 3, if the pet image after the vaccination of the pet is received, the associated pet image and the pet information of the pet are obtained. And then obtaining the state characteristics of the pet according to the pet image of the pet and the related pet image, and obtaining the preset state characteristics of the vaccine adverse reaction event according to the pet information and the vaccine information of the inoculated vaccine. And if the state characteristics of the pet are successfully matched with the preset state characteristics, determining that the pet has the vaccine adverse reaction event. That is to say, after the pet is vaccinated, whether the pet has a vaccine adverse reaction event is determined by the current pet image and the acquired state characteristics of the associated pet image, so that the accuracy of determining the adverse reaction can be improved. And then after determining that the vaccine adverse reaction event occurs, sending prompt information of the vaccine adverse reaction event to a target corresponding to the pet, so that the effectiveness of state monitoring is improved.
The method of the embodiments of the present application is set forth above in detail and the apparatus of the embodiments of the present application is provided below.
Please refer to fig. 7, fig. 7 is a schematic structural diagram of an event notification device according to an embodiment of the present disclosure. As shown in fig. 7, the event notification apparatus includes a processing unit 701 and a communication unit 702, where:
the communication unit 702 is used for receiving a pet image of a pet;
the processing unit 701 is configured to obtain a status characteristic of the pet according to the pet image of the pet and the associated pet image, and obtain a preset status characteristic of the vaccine adverse reaction event according to the pet information and the vaccine information of the vaccination; if the state characteristics of the pet are successfully matched with the preset state characteristics, determining that the pet has a vaccine adverse reaction event;
the communication unit 702 is further configured to send a prompt message of the vaccine adverse reaction event to a subject corresponding to the pet.
In one possible example, the processing unit 701 is specifically configured to obtain a correlation value between the pet image of the pet and the correlated pet image; acquiring first state data of the pet according to the pet image of the pet, and acquiring second state data of the pet according to the associated pet image of the pet; and carrying out fusion processing on the first state data and the second state data according to the correlation value to obtain the state characteristics of the pet.
In one possible example, the processing unit 701 is specifically configured to obtain scene data and/or physiological parameters of a pet image of the pet and an associated pet image, respectively; and acquiring a correlation value between the pet image of the pet and the correlated pet image according to the scene data and/or the physiological parameters.
In a possible example, the processing unit 701 is further configured to update the status characteristic of the pet according to the physiological parameter.
In one possible example, the processing unit 701 is further configured to obtain a risk level of the vaccine adverse reaction event according to the status characteristics of the pet; acquiring the attention of the vaccine adverse reaction event according to the risk level; and generating prompt information of the vaccine adverse reaction event according to the notice.
In a possible example, the communication unit 702 is further configured to send a call request to a hospital corresponding to the pet, where the call request is used to instruct the hospital to provide medical resources, if the risk level is greater than a preset level.
In one possible example, the processing unit 701 is further configured to obtain a location, insurance application information, and historical visit records of the pet; determining the priority of each hospital in a plurality of hospitals corresponding to the pet position according to the application information and the historical visit record of the pet; and determining the hospital corresponding to the pet according to the priority of the hospital.
Referring to fig. 8, fig. 8 is a schematic structural diagram of a computer device according to an embodiment of the present disclosure. The computer device 800 includes a processor 801, a communication interface 802, and a memory 803. The processor 801, the communication interface 802, and the memory 803 may be connected to each other by a bus 805, or may be connected by other means. The relevant functions performed by the processing unit 701 shown in fig. 7 may be implemented by one or more processors 801. The related functions implemented by the communication unit 702 shown in fig. 7 can be implemented by the communication interface 802.
The processor 801 includes one or more processors, for example, one or more Central Processing Units (CPUs), and in the case that the processor 801 is one Central Processing Unit (CPU), the CPU may be a single-core CPU or a multi-core CPU. In the embodiment of the present application, the processor 801 is used to control the computer apparatus 800 to implement the embodiment shown in fig. 3.
The communication interface 802 is used for realizing communication with other devices, for example, if the computer device 800 is a user terminal, the communication interface 802 can realize communication between the user terminal and devices such as a server; if the computer device 800 is a server, the communication interface 802 can enable communication between the server and a device such as a user terminal.
The memory 803 includes, but is not limited to, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM), or a portable read-only memory (CD-ROM), and the memory 803 is used for storing relevant instructions and data.
In this embodiment, the memory 803 stores a computer program 804, the computer program comprising program instructions, and the processor 801 is configured to invoke the program instructions to perform the following steps (an illustrative end-to-end sketch is given after the listed steps):
if a pet image after the vaccination of the pet is received, acquiring an associated pet image and pet information of the pet;
acquiring state characteristics of the pet according to the pet image and the associated pet image of the pet, and acquiring preset state characteristics of the vaccine adverse reaction event according to the pet information and the vaccine information of the inoculated vaccine;
if the state characteristics of the pet are successfully matched with the preset state characteristics, determining that the pet has a vaccine adverse reaction event;
and sending prompt information of the vaccine adverse reaction event to a subject corresponding to the pet.
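To make the flow of these steps concrete, the following sketch strings them together; every helper passed in (the feature extractor, the vaccine-information lookup, the prompt sender, and so on) is a hypothetical placeholder introduced only for illustration, and modelling the "successful match" as a subset test is likewise an assumption.

```python
# Hypothetical end-to-end sketch of the steps above. None of these names or
# interfaces are defined by this embodiment; they are illustrative placeholders.
from typing import Callable, Dict, Set, Tuple

def handle_post_vaccination_image(
    pet_image: object,
    acquire_associated: Callable[[], Tuple[object, Dict]],   # -> (associated image, pet info)
    extract_features: Callable[[object, object], Set[str]],  # -> observed state characteristics
    lookup_vaccine_info: Callable[[Dict], Dict],             # pet info -> vaccine info
    preset_features_for: Callable[[Dict, Dict], Set[str]],   # (pet info, vaccine info) -> preset features
    send_prompt: Callable[[str], None],                      # delivers the prompt to the subject
) -> bool:
    associated_image, pet_info = acquire_associated()
    state_features = extract_features(pet_image, associated_image)
    vaccine_info = lookup_vaccine_info(pet_info)
    preset = preset_features_for(pet_info, vaccine_info)
    # "Successful match" is modelled here as the preset features being a subset
    # of the observed state characteristics, an illustrative assumption.
    if preset <= state_features:
        send_prompt("A vaccine adverse reaction event has been detected for "
                    f"{pet_info.get('name', 'the pet')}.")
        return True
    return False
```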
In one possible example, in terms of obtaining the state characteristics of the pet according to the pet image of the pet and the associated pet image, the instructions in the program are specifically configured to perform the following operations:
acquiring a correlation value between the pet image of the pet and the associated pet image;
acquiring first state data of the pet according to the pet image of the pet, and acquiring second state data of the pet according to the associated pet image of the pet;
and fusing the first state data and the second state data according to the correlation value to obtain the state characteristics of the pet.
In one possible example, in terms of obtaining the correlation value between the pet image of the pet and the associated pet image, the instructions in the program are specifically configured to perform the following operations:
respectively acquiring scene data and/or physiological parameters of the pet image of the pet and the associated pet image;
and acquiring the correlation value between the pet image of the pet and the associated pet image according to the scene data and/or the physiological parameters.
In one possible example, after the obtaining of the state characteristics of the pet according to the pet image of the pet and the associated pet image, the instructions in the program are further configured to perform the following operation:
and updating the state characteristics of the pet according to the physiological parameters.
In one possible example, prior to the sending of the prompt information of the vaccine adverse reaction event to the subject corresponding to the pet, the instructions in the program are further configured to perform the following operations:
obtaining the risk level of the vaccine adverse reaction event according to the state characteristics of the pet;
acquiring the precautions for the vaccine adverse reaction event according to the risk level;
and generating the prompt information of the vaccine adverse reaction event according to the precautions.
In one possible example, after the obtaining of the risk level of the vaccine adverse reaction event according to the state characteristics of the pet, the instructions in the program are further configured to perform the following operation:
if the risk level is greater than a preset level, sending a call request to a hospital corresponding to the pet, wherein the call request is used for instructing the hospital to provide medical resources.
In one possible example, before the sending of the call request to the hospital corresponding to the pet, the instructions in the program are further configured to perform the following operations:
acquiring the location, the insurance application information, and the historical visit records of the pet;
determining the priority of each of a plurality of hospitals corresponding to the location of the pet according to the insurance application information and the historical visit records of the pet;
and determining the hospital corresponding to the pet according to the priorities of the hospitals.
An embodiment of the present application further provides a computer-readable storage medium storing the computer program executed by the aforementioned event prompting apparatus. The computer program includes program instructions, and when the processor executes the program instructions, it can carry out the event prompting method described in the embodiment corresponding to fig. 3, so the details are not repeated here. Likewise, the beneficial effects of the same method are not described again. For technical details not disclosed in the embodiments of the computer-readable storage medium of the present application, reference is made to the description of the method embodiments of the present application. As an example, the program instructions may be deployed to be executed on one computing device, or on multiple computing devices located at one site, or on multiple computing devices distributed across multiple sites and interconnected by a communication network, and the multiple computing devices distributed across multiple sites and interconnected by a communication network may constitute a blockchain system.
It should further be noted that an embodiment of the present application also provides a computer program product or computer program, which may include computer instructions that may be stored in a computer-readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium and executes them, so that the computer device carries out the event prompting method described in the embodiment corresponding to fig. 3; the details are therefore not repeated here. Likewise, the beneficial effects of the same method are not described again. For technical details not disclosed in the embodiments of the computer program product or computer program of the present application, reference is made to the description of the method embodiments of the present application.
It should be noted that, for simplicity of description, the foregoing method embodiments are described as a series of combinations of actions, but those skilled in the art should understand that the present application is not limited by the described order of actions, as some steps may be performed in other orders or simultaneously. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments, and that the actions and modules involved are not necessarily required by the present application.
The steps in the method of the embodiment of the application can be sequentially adjusted, combined and deleted according to actual needs.
The modules in the device can be combined, divided and deleted according to actual needs.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and can include the processes of the embodiments of the methods described above when the computer program is executed. The storage medium may be a magnetic disk, an optical disk, a ROM, a RAM, an EPROM, a CD-ROM, or the like.
The above disclosure describes only preferred embodiments of the present application and is not intended to limit the scope of the present application; equivalent variations and modifications made in accordance with the present application shall still fall within its scope.

Claims (10)

1. An event prompting method is characterized by comprising the following steps:
if a pet image after the vaccination of the pet is received, acquiring an associated pet image and pet information of the pet;
acquiring state characteristics of the pet according to the pet image and the associated pet image of the pet, and acquiring preset state characteristics of the vaccine adverse reaction event according to the pet information and the vaccine information of the inoculated vaccine;
if the state characteristics of the pet are successfully matched with the preset state characteristics, determining that the pet has a vaccine adverse reaction event;
and sending prompt information of the vaccine adverse reaction event to a subject corresponding to the pet.
2. The method of claim 1, wherein the acquiring of the state characteristics of the pet according to the pet image of the pet and the associated pet image comprises:
acquiring a correlation value between the pet image of the pet and the associated pet image;
acquiring first state data of the pet according to the pet image of the pet, and acquiring second state data of the pet according to the associated pet image of the pet;
and fusing the first state data and the second state data according to the correlation value to obtain the state characteristics of the pet.
3. The method of claim 2, wherein the acquiring of the correlation value between the pet image of the pet and the associated pet image comprises:
respectively acquiring scene data and/or physiological parameters of the pet image of the pet and the associated pet image;
and acquiring the correlation value between the pet image of the pet and the associated pet image according to the scene data and/or the physiological parameters.
4. The method of claim 3, further comprising, after the acquiring of the state characteristics of the pet according to the pet image of the pet and the associated pet image:
and updating the state characteristics of the pet according to the physiological parameters.
5. The method of any one of claims 1 to 4, further comprising, prior to the sending of the prompt information of the vaccine adverse reaction event to the subject corresponding to the pet:
obtaining the risk level of the vaccine adverse reaction event according to the state characteristics of the pet;
acquiring the precautions for the vaccine adverse reaction event according to the risk level;
and generating the prompt information of the vaccine adverse reaction event according to the precautions.
6. The method of claim 5, further comprising, after the obtaining of the risk level of the vaccine adverse reaction event according to the state characteristics of the pet:
if the risk level is greater than a preset level, sending a call request to a hospital corresponding to the pet, wherein the call request is used for instructing the hospital to provide medical resources.
7. The method of claim 6, further comprising, before sending the call request to the hospital corresponding to the pet:
acquiring the location, the insurance application information, and the historical visit records of the pet;
determining the priority of each of a plurality of hospitals corresponding to the location of the pet according to the insurance application information and the historical visit records of the pet;
and determining the hospital corresponding to the pet according to the priorities of the hospitals.
8. An event prompting apparatus, comprising:
a communication unit for receiving a pet image of a pet;
the processing unit is used for acquiring the state characteristics of the pet according to the pet image of the pet and an associated pet image, and acquiring preset state characteristics of a vaccine adverse reaction event according to pet information of the pet and vaccine information of the inoculated vaccine; and, if the state characteristics of the pet are successfully matched with the preset state characteristics, determining that the pet has a vaccine adverse reaction event;
the communication unit is also used for sending prompt information of the vaccine adverse reaction event to a subject corresponding to the pet.
9. A computer device comprising a memory and a processor; the memory is coupled to the processor, the memory for storing a computer program, the processor for invoking the computer program to cause the computer device to perform the method of any of claims 1-7.
10. A computer-readable storage medium, in which a computer program is stored which is adapted to be loaded and executed by a processor to cause a computer device having said processor to perform the method of any of claims 1 to 7.
CN202210636287.1A 2022-06-07 2022-06-07 Event prompting method, device, equipment and storage medium Pending CN115205775A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210636287.1A CN115205775A (en) 2022-06-07 2022-06-07 Event prompting method, device, equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210636287.1A CN115205775A (en) 2022-06-07 2022-06-07 Event prompting method, device, equipment and storage medium

Publications (1)

Publication Number Publication Date
CN115205775A true CN115205775A (en) 2022-10-18

Family

ID=83577194

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210636287.1A Pending CN115205775A (en) 2022-06-07 2022-06-07 Event prompting method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN115205775A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115546839A (en) * 2022-10-20 2022-12-30 星宠王国(北京)科技有限公司 Epidemic prevention system and method based on dog face image recognition technology
CN115546839B (en) * 2022-10-20 2023-05-16 星宠王国(北京)科技有限公司 Epidemic prevention system and method based on dog face image recognition technology

Similar Documents

Publication Publication Date Title
US11854676B2 (en) Providing live first aid response guidance using a machine learning based cognitive aid planner
EP3660855A1 (en) Multi-channel data aggregation system and method for communicating animal breed, medical, and profile information among remote user networks
CN109523067B (en) Cost prediction method and device based on prediction model, server and storage medium
CN111899878B (en) Old person health detection system, method, computer device and readable storage medium
KR102254419B1 (en) METHOD AND APPARATUS FOR CURATING COMPANION ANIMAL HEALTH CARE, and system using the same
CN110462654A (en) Record accessing and management
CN110139598A (en) Monitoring and tracking system, method, article and equipment
JP6912840B1 (en) Information processing method, diagnostic support device and computer program
US10937160B1 (en) Dental images processed with artificial intelligence
CN109492595B (en) Behavior prediction method and system suitable for fixed group
CN110875093A (en) Treatment scheme processing method, device, equipment and storage medium
CN113724859A (en) Disease prompting device, method and device based on artificial intelligence and storage medium
US20190115111A1 (en) System and method for sharing medical test results
KR20200063637A (en) Pet healthcare network service system based on heart rate monitoring database
CN111227789A (en) Human health monitoring method and device
JP2017188089A (en) Behavior prediction
JP2008269188A (en) Support system and method for insurance for pet
CN115205775A (en) Event prompting method, device, equipment and storage medium
JP2023500511A (en) Combining Model Outputs with Combined Model Outputs
KR102277853B1 (en) Control method for system of preventing pet loss
US12008749B2 (en) Dental images correlated to the human genome with artificial intelligence
CN115271132A (en) Vaccine reservation method, device, equipment and storage medium
CN115311100A (en) Event processing method, device, equipment and storage medium
CN109493108B (en) Medical activity information processing method, device, computer equipment and medium
US12094323B2 (en) Mobile collection of sensitive information including tracking system and method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination