CN116703227A - Guest room management method and system based on intelligent service - Google Patents
- Publication number
- CN116703227A (application number CN202310702222.7A)
- Authority
- CN
- China
- Prior art keywords
- gesture
- monitoring
- module
- guest room
- image sequence
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B21/00—Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
- G08B21/18—Status alarms
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q10/00—Administration; Management
- G06Q10/06—Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
- G06Q10/063—Operations research, analysis or management
- G06Q10/0639—Performance analysis of employees; Performance analysis of enterprise or organisation operations
- G06Q10/06395—Quality analysis or management
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q50/00—Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
- G06Q50/10—Services
- G06Q50/12—Hotels or restaurants
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02P—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
- Y02P90/00—Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
- Y02P90/02—Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]
Abstract
The invention discloses a guest room management method and system based on intelligent service, relating to the technical field of data processing. The method comprises the following steps: acquiring guest room basic information, wherein the guest room basic information comprises power-taking state information and registration state information; when the registration state information belongs to an unregistered state, the power-taking state information belongs to a power-taking activation state, and a preset activation duration is met, monitoring the guest room through a gesture monitoring module to obtain a gesture monitoring sequence; inputting the sequence into a gesture feature analysis module to obtain a gesture feature set; inputting the set into a behavior classification module to obtain a behavior classification result; and when the result belongs to fly-room early-warning behavior, converting the power-taking state information into a power-off state, generating fly-room early-warning information, and sending it to the guest-control monitoring platform. The invention solves the technical problem in the prior art that hotel guest rooms are subject to the "fly-room" phenomenon (unauthorized, unregistered occupation of rooms) with large supervision loopholes, and achieves the technical effects of improving the efficiency of guest room management and effectively curbing the fly-room phenomenon.
Description
Technical Field
The invention relates to the technical field of data processing, in particular to a guest room management method and system based on intelligent service.
Background
Efficient management can greatly improve the service quality of a hotel, and guest room management is the core of hotel management: it is not only the service most directly perceived by users, but also has an important influence on the hotel's operating benefits. At present, hotels suffer from a "fly-room" phenomenon caused by various factors, in which a room is occupied without registration. For example, a fly-room behavior is formed when a guest checks in during the daytime and, after the guest leaves, the room is not checked out at the front desk but is instead assigned to another guest at night by modifying the guest information. Manual supervision of the fly-room phenomenon is relatively inefficient and such behavior is not easily discovered, resulting in a loss of hotel revenue. The prior art therefore has the technical problem that hotel guest rooms are subject to the fly-room phenomenon and that supervision loopholes for fly-rooms are large.
Disclosure of Invention
The application provides a guest room management method and system based on intelligent service, which are used for solving the technical problems that, in the prior art, hotel guest rooms are subject to the fly-room phenomenon and supervision loopholes for fly-rooms are large.
In view of the above problems, the present application provides a guest room management method and system based on intelligent service.
In a first aspect of the present application, there is provided a guest room management method based on an intelligent service, the method comprising:
acquiring guest room basic information, wherein the guest room basic information comprises power-taking state information and registration state information;
when the registration state information belongs to an unregistered state, the power-taking state information belongs to a power-taking activation state, and a preset activation duration is met, monitoring the guest room through a gesture monitoring module to obtain a gesture monitoring sequence;
inputting the gesture monitoring sequence into a gesture feature analysis module to obtain a gesture feature set;
inputting the gesture feature set into a behavior classification module to obtain a behavior classification result;
and when the behavior classification result belongs to fly-room early-warning behavior, converting the power-taking state information into a power-off state, generating fly-room early-warning information, and sending the fly-room early-warning information to the guest-control monitoring platform.
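The claimed steps amount to a simple gating-and-response flow. The sketch below illustrates that flow in Python; all names, the 20-minute threshold, and the string labels are illustrative assumptions, not identifiers from the patent itself.

```python
# Minimal sketch of the decision flow in the steps above. All names,
# thresholds, and string labels are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class RoomStatus:
    registered: bool             # registration state information
    power_active: bool           # power-taking activation state (key card inserted)
    power_active_minutes: float  # how long power has been active

PRESET_ACTIVATION_MINUTES = 20.0  # assumed preset activation duration

def should_monitor(room: RoomStatus) -> bool:
    """Gesture monitoring is triggered only for an unregistered room whose
    power has stayed active beyond the preset activation duration."""
    return (not room.registered
            and room.power_active
            and room.power_active_minutes >= PRESET_ACTIVATION_MINUTES)

def handle_classification(room: RoomStatus, behavior: str) -> str:
    """On a fly-room early-warning classification, switch the room to the
    power-off state and notify the guest-control monitoring platform."""
    if behavior == "fly_room_warning":
        room.power_active = False  # convert power-taking state to power-off
        return "warning_sent"
    return "no_action"
```

The gesture monitoring, feature analysis, and behavior classification modules would sit between `should_monitor` and `handle_classification` in a real pipeline.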
In a second aspect of the present application, there is provided a guest room management system based on an intelligent service, the system comprising:
the system comprises: a basic information acquisition module, used for acquiring guest room basic information, wherein the guest room basic information comprises power-taking state information and registration state information;
a monitoring sequence obtaining module, used for monitoring the guest room through the gesture monitoring module to obtain a gesture monitoring sequence when the registration state information belongs to an unregistered state, the power-taking state information belongs to a power-taking activation state, and the preset activation duration is met;
a gesture feature acquisition module, used for inputting the gesture monitoring sequence into the gesture feature analysis module to obtain a gesture feature set;
a behavior classification result obtaining module, used for inputting the gesture feature set into the behavior classification module to obtain a behavior classification result;
and an early-warning information sending module, used for converting the power-taking state information into a power-off state when the behavior classification result belongs to fly-room early-warning behavior, and for generating fly-room early-warning information and sending it to the guest-control monitoring platform.
One or more technical schemes provided by the application have at least the following technical effects or advantages:
the application acquires guest room basic information, wherein the guest room basic information comprises power-taking state information and registration state information. When the registration state information belongs to an unregistered state, the power-taking state information belongs to a power-taking activation state, and the preset activation duration is met, the guest room is monitored through a gesture monitoring module to obtain a gesture monitoring sequence. The gesture monitoring sequence is input into a gesture feature analysis module to obtain a gesture feature set, and the gesture feature set is input into a behavior classification module to obtain a behavior classification result. When the behavior classification result belongs to fly-room early-warning behavior, the power-taking state information is converted into a power-off state, and fly-room early-warning information is generated and sent to the guest-control monitoring platform. The technical effects of improving early warning of fly-room behavior and improving the intelligence level and quality of guest room management are thereby achieved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic flow chart of a guest room management method based on intelligent service according to an embodiment of the present application;
fig. 2 is a schematic flow chart of acquiring an attitude monitoring sequence in a guest room management method based on an intelligent service according to an embodiment of the present application;
fig. 3 is a schematic flow chart of acquiring a gesture feature set in a guest room management method based on an intelligent service according to an embodiment of the present application;
fig. 4 is a schematic diagram of a guest room management system based on intelligent service according to an embodiment of the present application.
Reference numerals illustrate: the system comprises a basic information acquisition module 11, a monitoring sequence acquisition module 12, a gesture feature acquisition module 13, a behavior classification result acquisition module 14 and an early warning information transmission module 15.
Detailed Description
The application provides a guest room management method and system based on intelligent service, which are used for solving the technical problems that, in the prior art, hotel guest rooms are subject to the fly-room phenomenon and supervision loopholes for fly-rooms are large.
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the accompanying drawings in the embodiments of the present application. It will be apparent that the described embodiments are only some, but not all, embodiments of the application. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
It should be noted that the terms "comprises" and "comprising," along with any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or server that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed or inherent to such process, method, article, or apparatus, but may include other steps or modules not expressly listed or inherent to such process, method, article, or apparatus.
Example 1
As shown in fig. 1, the present application provides a guest room management method based on intelligent service, which includes:
step S100: acquiring guest room basic information, wherein the guest room basic information comprises power taking state information and registration state information;
step S200: when the registration state information belongs to an unregistered state, the power-taking state information belongs to a power-taking activation state, and a preset activation duration is met, monitoring the guest room through a gesture monitoring module to obtain a gesture monitoring sequence;
Further, as shown in fig. 2, when the registration state information belongs to an unregistered state, the power-taking state information belongs to a power-taking activation state, and the preset activation duration is met, the guest room is monitored by the gesture monitoring module to obtain a gesture monitoring sequence, and step S200 in the embodiment of the present application further includes:
step S210: the gesture monitoring module comprises a human body recognition module and a gesture sensing module;
step S220: the human body recognition module is used for covering and transmitting millimeter-level electromagnetic waves to a guest room, and collecting first electromagnetic wave monitoring characteristics, wherein the first electromagnetic wave monitoring characteristics comprise target distance characteristics, target breathing characteristics and target heartbeat characteristics of a first monitoring object;
step S230: when the target distance characteristic meets a sensitive distance threshold and the target breathing characteristic and/or the target heartbeat characteristic exist, adding the first monitoring object into a gesture monitoring object;
step S240: and monitoring the gesture monitoring object according to the gesture sensing module to obtain the gesture monitoring sequence.
In one possible embodiment, the guest room basic information is obtained by retrieving guest room data from the hotel guest room management system. The guest room basic information describes the real-time state of the guest room and comprises power-taking state information and registration state information. The power-taking state information describes whether the circuit in the guest room is connected to the hotel circuit, i.e. whether power is being supplied to the guest room. Typically, a guest room uses a key card or the like to connect the room's circuit to the hotel circuit and supply power to guests in the room. When the power-taking state information is in the power-off state, no power is supplied to the guest room; when it is in the power-taking activation state, power is being supplied. The registration state information describes whether a guest has checked in to the room. When the registration state is registered, a guest has checked in; when it is unregistered, no tenant is registered for the room and the room should be vacant. Acquiring the guest room basic information provides the basis for the subsequent analysis of whether fly-room early-warning behavior exists in the guest room.
In one embodiment, rooms with abnormal power taking are determined as monitoring objects by analyzing the power-taking state of each guest room from the guest room basic information in combination with the registration state information; whether the fly-room phenomenon exists in a room is then determined by analyzing the posture of any human body in the room. This achieves multidimensional and efficient analysis and determination of fly-room early-warning behavior.
In one possible embodiment, when the registration state information of the guest room belongs to an unregistered state and the power-taking state information belongs to a power-taking activation state, the activation duration from the power-taking activation time point to the current time point is extracted from the hotel guest room management system. The activation duration is then compared with a preset activation duration; when it meets the preset activation duration, the room has been powered for longer than the preset activation duration without any check-in, so the fly-room phenomenon may exist. At this time, the guest room is monitored over a preset time window through the gesture monitoring module, and the gesture monitoring sequence is obtained in monitoring time order. The preset activation duration is the longest period a guest room may remain in the power-taking activation state while unregistered; it is set by staff as needed and is not limited here. For example, based on the guest room area, e.g. 20 m², and the hotel's room-cleaning procedure, the time required to completely clean a 20 m² guest room is calculated to be 20 minutes, and 20 minutes is set as the preset activation duration.
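The derivation of the preset activation duration from the room area can be sketched as a one-line helper. The cleaning rate of 1 m² per minute below is an assumption chosen only so that the 20 m² room yields the 20-minute duration used as the example above.

```python
def preset_activation_minutes(room_area_m2: float,
                              cleaning_rate_m2_per_min: float = 1.0) -> float:
    """Derive the preset activation duration from the room area.
    The default cleaning rate (1 m^2 per minute) is an illustrative
    assumption, not a figure from the patent text."""
    return room_area_m2 / cleaning_rate_m2_per_min
```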
In one possible embodiment, the gesture monitoring module is a functional module for identifying and monitoring the posture of persons in a guest room, and comprises a human body identification module and a gesture sensing module. The human body identification module performs feature identification and monitoring of persons in the guest room and is optionally implemented with a PCR (pulsed coherent radar) sensor. The PCR radar sensor operates at 61 GHz; by actively transmitting millimeter-wave electromagnetic waves it can monitor a person who is moving, stationary, or sleeping, and can also monitor the person's breathing and heartbeat signals. The gesture sensing module monitors the posture of the human body in the guest room and is optionally also composed of a PCR radar sensor.
In the embodiment of the application, the human body identification module transmits millimeter-wave electromagnetic waves to cover the guest room, and the first electromagnetic wave monitoring feature is obtained from the Doppler information of the waves reflected from the surface of the body and limbs. The first electromagnetic wave monitoring feature distinguishes one monitored object from others after coverage monitoring of the guest room, and comprises a target distance feature, a target breathing feature and a target heartbeat feature of the first monitoring object. The first monitoring object is any human body identified in the guest room by the human body identification module. The target distance feature describes the distance between the first monitoring object and the human body identification module. The target breathing feature describes the breathing condition monitored for the first monitoring object, including the breathing rate. The target heartbeat feature describes the heartbeat condition monitored for the first monitoring object, including features such as the number of heartbeats per unit time and the heartbeat amplitude.
In the embodiment of the application, the sensitive distance threshold is the farthest distance from the human body identification module at which a recognized target is still inside the guest room, determined from the room's area and layout. When the target distance feature meets the sensitive distance threshold, the first monitoring object is inside the guest room rather than an object passing outside it. Furthermore, when the target breathing feature and/or the target heartbeat feature exists, the first monitoring object is a human body, and it is added to the gesture monitoring objects. This achieves the technical effect of improving the accuracy of the monitored objects.
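The two-condition filter described above (inside the sensitive distance threshold, plus at least one vital-sign feature) can be sketched as follows. The 5 m default threshold and all parameter names are illustrative assumptions.

```python
from typing import Optional

def is_gesture_monitoring_object(distance_m: float,
                                 breathing_rate: Optional[float],
                                 heart_rate: Optional[float],
                                 sensitive_distance_m: float = 5.0) -> bool:
    """A first monitoring object is promoted to a gesture monitoring object
    only if it lies within the sensitive distance threshold (i.e. inside
    the room) and exhibits a breathing and/or heartbeat feature (i.e. it
    is a human body). The 5 m default is an illustrative assumption."""
    inside_room = distance_m <= sensitive_distance_m
    is_human = breathing_rate is not None or heart_rate is not None
    return inside_room and is_human
```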
In one possible embodiment, the gesture monitoring object is a target object in a guest room that needs gesture analysis monitoring. And after the gesture monitoring object is obtained, monitoring the gesture monitoring object in a preset time window by utilizing the gesture sensing module to obtain the gesture monitoring sequence. The preset time window is a preset time length for monitoring.
Further, according to the gesture sensing module, the gesture monitoring object is monitored to obtain the gesture monitoring sequence, and step S200 of the embodiment of the present application further includes:
Step S250: transmitting millimeter-level electromagnetic waves to the gesture monitoring object through the gesture sensing module, and collecting second electromagnetic wave monitoring features at a first moment, wherein the second electromagnetic wave monitoring features comprise target distance positioning features and target angle positioning features;
step S260: performing contour positioning on the gesture monitoring object according to the target distance positioning feature and the target angle positioning feature to obtain target contour positioning information;
step S270: matching a human skeleton model according to the target contour positioning information, and identifying gesture monitoring key points for the gesture monitoring object, wherein the gesture monitoring key points are human skeleton joints;
step S280: and obtaining the positioning feature sequence of the gesture monitoring key point, and adding the positioning feature sequence into the gesture monitoring sequence.
In the embodiment of the application, the gesture sensing module is utilized to emit millimeter-level electromagnetic waves to the gesture monitoring object, and the reflected waves of different limb parts of the human body are passed through to obtain second electromagnetic wave monitoring characteristics at a first moment. The first moment is any moment when the attitude sensing module emits millimeter-level electromagnetic waves. The second electromagnetic wave monitoring feature is used for describing information obtained by reflected waves of different limb parts obtained by carrying out gesture monitoring on the gesture monitoring object, and comprises a target distance positioning feature and a target angle positioning feature. The target distance positioning feature is used for describing the distance between different limb parts and the gesture sensing module. The target angle positioning feature is used for describing angles formed by different limb parts and the gesture sensing module.
In one embodiment, the contour of the gesture monitoring object is located according to the target distance positioning feature and the target angle positioning feature. Optionally, a spatial coordinate system with the gesture sensing module as the origin is constructed; the direction of a ray from the origin is determined from the angle in the target angle positioning feature, and the length along that ray is determined from the distance in the target distance positioning feature, which fixes the position of each contour point. The contour of the gesture monitoring object is thereby located from the information described by the target angle positioning feature and the target distance positioning feature, giving the target contour positioning information. The target contour positioning information describes the position of the external contour of the gesture monitoring object.
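The angle-plus-distance localization described above is a polar-to-Cartesian conversion about the sensor origin. A minimal sketch, assuming an azimuth/elevation angle convention (the convention itself is an assumption, not stated in the text):

```python
import math

def contour_point(distance_m: float, azimuth_rad: float,
                  elevation_rad: float) -> tuple:
    """Locate one contour point in a Cartesian frame whose origin is the
    gesture sensing module: the angles fix the direction of the ray from
    the origin, and the measured distance fixes the length along it."""
    horizontal = distance_m * math.cos(elevation_rad)  # projection onto floor plane
    x = horizontal * math.cos(azimuth_rad)
    y = horizontal * math.sin(azimuth_rad)
    z = distance_m * math.sin(elevation_rad)
    return (x, y, z)
```

Collecting such points for all limb reflections at one instant yields the target contour positioning information for that frame.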
Specifically, the external contour presented in the target contour positioning information is matched with a human skeleton model, and gesture monitoring key points are identified on the gesture monitoring object according to the positions of the skeleton joints in the human skeleton model. The gesture monitoring key points are the key nodes, i.e. the human skeleton joints, that give a general description of the behavioral posture of the gesture monitoring object. The monitoring data of the gesture monitoring key points at different moments within the preset time window are arranged in chronological order to form the positioning feature sequence, which constitutes the gesture monitoring sequence.
Step S300: inputting the gesture monitoring sequence into a gesture feature analysis module to obtain a gesture feature set;
further, as shown in fig. 3, the gesture monitoring sequence is input to a gesture feature analysis module to obtain a gesture feature set, and step S300 of the embodiment of the present application further includes:
step S310: connecting, based on the human skeleton model, the keypoint positioning features that belong to the same frame in the positioning feature sequences of the first monitoring object of the gesture monitoring sequence, to generate a gesture monitoring image sequence;
step S320: based on a first step length, extracting the gesture monitoring image sequence and performing s times downsampling processing to generate a gesture monitoring first image sequence;
step S330: extracting the gesture monitoring image sequence based on a second step length to generate a gesture monitoring second image sequence, wherein the second step length is larger than the first step length;
step S340: inputting the gesture monitoring first image sequence and the gesture monitoring second image sequence into the gesture feature analysis module to obtain the gesture feature set.
Further, inputting the first image sequence for gesture monitoring and the second image sequence for gesture monitoring into the gesture feature analysis module to obtain the gesture feature set, where before step S300 in the embodiment of the present application further includes:
Step S350: performing gesture deployment according to gesture description features based on the human skeleton model to generate a gesture simulation image sequence;
step S360: based on a third step length, extracting the gesture simulation image sequence and performing s times downsampling processing to generate a gesture simulation first image sequence;
step S370: extracting the gesture simulation image sequence based on a fourth step length to generate a gesture simulation second image sequence, wherein the fourth step length is larger than the third step length;
step S380: and acquiring a slow channel and a fast channel based on a SLOWFAST model, inputting the gesture simulation first image sequence into the fast channel, inputting the gesture simulation second image sequence into the slow channel, and training the gesture description characteristic as output identification information to generate the gesture characteristic analysis module.
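The stride-based assembly of the two channel inputs in steps S360 through S380 can be sketched as follows, assuming, consistently with step S380, that the smaller (third) step feeds the fast channel and the larger (fourth) step feeds the slow channel; the concrete stride values are example assumptions.

```python
def stride_sample(frames, step):
    """Take every `step`-th frame of the sequence, starting from frame 0."""
    return frames[::step]

frames = list(range(40))                 # stand-in for a gesture simulation image sequence
fast_input = stride_sample(frames, 2)    # third step (small) -> fast channel, dense
slow_input = stride_sample(frames, 10)   # fourth step (large) -> slow channel, sparse
```

The fast channel thus sees five times as many frames as the slow channel here, mirroring the high/low frame-rate pathways of a SLOWFAST-style model.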
Further, the gesture feature analysis module further includes a preprocessing channel, and step S380 of the embodiment of the present application further includes:
step S381: performing hierarchical clustering analysis on the positioning information of the key point adjacent frames of the gesture image sequence according to the preset positioning deviation degree to obtain a plurality of clustering frame number step sizes of a plurality of clustering results;
Step S382: taking the minimum value of the step sizes of the plurality of clustering frame numbers as a first step size;
step S383: taking the maximum value of the step sizes of the plurality of clustering frame numbers as a second step size;
step S384: and performing fast channel input data sampling on the gesture image sequence according to the first step length, and performing slow channel input data sampling on the gesture image sequence according to the second step length.
In one possible embodiment, the gesture feature analysis module is a functional module for intelligently analyzing the gesture feature information described in a gesture monitoring sequence, and it includes a slow channel and a fast channel. Both channels are constructed based on a SLOWFAST model: the fast channel analyzes the rapidly changing gestures in the densely sampled gesture monitoring first image sequence, and the slow channel analyzes the slowly changing gestures in the sparsely sampled gesture monitoring second image sequence. The gesture monitoring first image sequence and the gesture monitoring second image sequence are input into the gesture feature analysis module for gesture feature analysis, so as to obtain the gesture feature set.
In the embodiment of the application, the positioning feature sequences of the same frames of the first monitoring object in the gesture monitoring sequence are connected based on the human skeleton model; that is, the positioning feature sequences belonging to the first monitoring object within the same frames are identified and connected, obtaining the gesture monitoring image sequence. The gesture monitoring image sequence is an image sequence for monitoring the behavior gesture of the first monitoring object.
Specifically, the gesture monitoring image sequence is extracted according to the first step length. The first step length is set small so that more images are obtained: it is the minimum of the clustering frame number step sizes across the clustering results obtained from the cluster analysis of the adjacent key point frames of the gesture image sequence, meaning the number of interval frames between adjacent key point frames is small and more images remain after extraction. For example, the first step length is set to an interval of 2 frames. The extraction result is then downsampled s-fold: for an image of size 16 x 24, s-fold downsampling yields an image of resolution (16/s) x (24/s), where s is a common divisor of 16 and 24, such as 4 or 8. This dimension reduction lowers the analysis load, and the processed images form the gesture monitoring first image sequence, a sequence describing the key points of the first monitored object whose variation amplitude is relatively small.
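A minimal sketch of this construction, using plain nested lists for the 16 x 24 frames and the example values first step = 2 and s = 4; a real pipeline would use tensors, and all names here are illustrative.

```python
def downsample(frame, s):
    """Keep every s-th row and column: a 16x24 frame becomes (16//s) x (24//s)."""
    return [row[::s] for row in frame[::s]]

def build_first_sequence(frames, first_step, s):
    """Extract frames at the first step length, then downsample each s-fold."""
    extracted = frames[::first_step]            # interval of `first_step` frames
    return [downsample(f, s) for f in extracted]

# 8 synthetic 16x24 frames filled with pixel indices
frames = [[[r * 24 + c for c in range(24)] for r in range(16)] for _ in range(8)]
seq = build_first_sequence(frames, first_step=2, s=4)  # 4 frames of 4x6 images
```

With s = 4, each 16 x 24 frame is reduced to 4 x 6, matching the (16/s) x (24/s) resolution stated above.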
In one possible embodiment, the gesture monitoring image sequence is extracted according to the second step length, which is set to obtain the key point images with a larger variation range, namely the gesture monitoring second image sequence. The distance between two adjacent key point frames is larger here, so the second step length is larger than the first step length; for example, the second step length is set to an interval of 10 frames. The gesture monitoring second image sequence is obtained from the extraction result. The gesture monitoring first image sequence is then input into the fast channel and the gesture monitoring second image sequence into the slow channel for gesture feature analysis, obtaining the gesture feature set.
In the embodiment of the application, gesture deployment is performed according to the gesture description features based on the human skeleton model to generate a gesture simulation image sequence. That is, the human skeleton model is used to simulate the poses it can assume, and video images of the pose change process are collected to obtain the gesture simulation image sequence. By analyzing the gesture simulation image sequence, hierarchical cluster analysis is performed on the positioning information of the adjacent key point frames according to the preset positioning deviation degree, obtaining a plurality of simulation clustering frame number step sizes for a plurality of simulation clustering results; the minimum of these step sizes is set as the third step length and the maximum as the fourth step length. The gesture simulation first image sequence is input into the fast channel, the gesture simulation second image sequence into the slow channel, and supervised training is performed with the gesture description features as the output identification information until the output converges, obtaining the gesture feature analysis module.
Specifically, the preset positioning deviation degree is the positioning deviation value at which a gesture monitoring key point of the gesture monitoring object is considered to have moved by a large amount; it is set manually by staff and is not limited here. The distances between adjacent frames in which the positioning deviation values of the different gesture monitoring key points reach the preset positioning deviation degree are acquired from the gesture image sequence, and cluster analysis is performed on the acquired results to obtain a plurality of clustering frame number step sizes for a plurality of clustering results. The clustering results correspond to the gesture monitoring key points, and each clustering result has a plurality of clustering frame number step sizes. Illustratively, when the positioning deviation value of the elbow-joint gesture monitoring key point reaches the preset deviation degree, the acquired distances between adjacent frames are 3, 6, 13, 16 and 21. One of these five interval distances, for example 6, is chosen as the dividing node, and the five interval distances are split into two classes: the first class yields a first clustering frame number step size of 3, and the second class, {6, 13, 16, 21}, is averaged and rounded to give a second clustering frame number step size of 14. The minimum value 3 of the two clustering frame number step sizes is set as the first step size, and the maximum value 14 as the second step size.
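The worked two-way split above can be reproduced as follows; the split rule (values below the dividing node in one class, values at or above it in the other) is an assumption consistent with the example numbers, not a rule stated by the patent.

```python
def two_way_split(distances, node):
    """Split interval distances at `node` and derive the two cluster step sizes."""
    low = [d for d in distances if d < node]
    high = [d for d in distances if d >= node]
    first_step = min(low)                        # sparse class: smallest interval
    second_step = round(sum(high) / len(high))   # dense class: rounded mean
    return first_step, second_step

distances = [3, 6, 13, 16, 21]     # elbow-joint adjacent-frame interval distances
first, second = two_way_split(distances, node=6)
```

Here the mean of {6, 13, 16, 21} is exactly 14, so `first` and `second` reproduce the step sizes 3 and 14 of the text.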
And fast channel input data sampling is performed on the gesture image sequence according to the first step length to obtain the gesture monitoring first image sequence, and slow channel input data sampling is performed on the gesture image sequence according to the second step length to obtain the gesture monitoring second image sequence.
Step S400: inputting the gesture feature set into a behavior classification module to obtain a behavior classification result;
further, the gesture feature set is input into a behavior classification module to obtain a behavior classification result, and step S400 of the embodiment of the present application further includes:
step S410: carrying out strict frequent pattern mining on the gesture feature set to obtain an associated task type and an associated execution tool;
step S420: the guest room basic information further includes tool state information during power-on activation; the associated task types whose associated execution tools do not satisfy the tool state information are deleted, and the remaining associated task types are set as the behavior classification result.
Step S500: when the behavior classification result belongs to room-flying early warning behavior, the power-taking state information is converted into a power-taking off state, and room-flying early warning information is generated and sent to the guest control monitoring platform.
Specifically, strict frequent pattern mining is performed on the gesture feature set: the gesture features in the set are used as indexes to mine associated tasks in big data, a strict frequent pattern is enforced during mining, and tasks that do not conform to the gesture feature set are excluded, yielding the associated task types and associated execution tools. An associated task type describes the category of task associated with a gesture feature, and an associated execution tool is the tool needed to execute the task related to that gesture feature. For example, when the gesture feature shows the hand continuously gripping, the corresponding associated task type is glass wiping and the associated execution tool is a wiping cloth.
In one possible embodiment, the tool state information during power-taking activation is acquired from the guest room basic information, that is, the states of the tools available in the guest room during power-taking activation, such as whether the hair dryer or the refrigerator is powered. When an obtained associated execution tool cannot satisfy the tool state information, that is, the tool is not available in the guest room and the associated task type requiring it cannot be performed, that associated task type is deleted, and the remaining associated task types are set as the behavior classification result. The behavior classification module performs the strict frequent pattern mining, the associated task type deletion, and the setting of the behavior classification result.
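The deletion rule of step S420 can be sketched as a simple filter; the task names, tool names, and dictionary representation are illustrative assumptions.

```python
def filter_task_types(task_to_tool, available_tools):
    """Keep only task types whose associated execution tool is in an
    available (powered-on) state; delete the rest."""
    return {task: tool for task, tool in task_to_tool.items()
            if tool in available_tools}

# Mined associated task types and their associated execution tools
task_to_tool = {"glass wiping": "wiping cloth", "hair drying": "hair dryer"}
# Tool state information during power-taking activation
available_tools = {"hair dryer", "refrigerator"}
behavior_classification = filter_task_types(task_to_tool, available_tools)
```

Since no wiping cloth is available in the room, "glass wiping" is deleted and only "hair drying" remains in the behavior classification result.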
Specifically, when the behavior classification result belongs to room-flying early warning behavior, the power-taking state information is converted into the power-taking off state and the power is cut off, preventing the room-flying behavior from continuing to damage the hotel's interests; room-flying early warning information is then generated and sent to the guest control monitoring platform. The guest control monitoring platform is used to intelligently monitor room-flying early warning behavior.
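A minimal sketch of this response step under assumed names: the classification label, the power-state strings, and the warning record layout are all illustrative, not the patent's actual interfaces.

```python
def respond(classification, power_state):
    """On a room-flying early warning behavior, switch the power-taking state
    off and produce a warning record for the guest control monitoring platform."""
    if classification == "room-flying early warning behavior":
        power_state = "power-taking off"
        warning = {"type": "room-flying early warning",
                   "destination": "guest control monitoring platform"}
        return power_state, warning
    return power_state, None  # other behaviors leave the power state unchanged

state, warning = respond("room-flying early warning behavior", "power-taking activation")
```

Any non-warning classification returns the original power state and no warning record.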
In summary, the embodiment of the application has at least the following technical effects:
according to the application, the power-taking state of the guest room is obtained from the guest room basic information, and whether the guest room needs to be monitored is determined in combination with the registration information. When the registration state information belongs to a non-registration state, the power-taking state information belongs to a power-taking activation state, and the preset activation duration is satisfied, the guest room may be subject to room-flying early warning behavior. The guest room is then monitored by the gesture monitoring module, the monitoring result is input into the intelligent gesture feature analysis module for gesture analysis, and the obtained features are input into the behavior classification module for behavior classification. When the behavior classification result belongs to room-flying early warning behavior, the power-taking state information is converted into the power-taking off state, and room-flying early warning information is generated and sent to the guest control monitoring platform. Intelligent and reliable monitoring of the room-flying phenomenon is thereby achieved, improving the technical effect of guest room management quality.
Example two
Based on the same inventive concept as the guest room management method based on intelligent service in the foregoing embodiments, as shown in fig. 4, the present application further provides a guest room management system based on intelligent service. Wherein the system comprises:
A basic information acquisition module 11, where the basic information acquisition module 11 is configured to acquire guest room basic information, where the guest room basic information includes power taking state information and registration state information;
the monitoring sequence obtaining module 12 is configured to monitor a guest room by using the gesture monitoring module when the registration state information belongs to a non-registration state, the power-taking state information belongs to a power-taking activation state, and a preset activation duration is satisfied, so as to obtain a gesture monitoring sequence;
the gesture feature obtaining module 13 is used for inputting the gesture monitoring sequence into the gesture feature analysis module to obtain a gesture feature set;
the behavior classification result obtaining module 14 is configured to input the gesture feature set into the behavior classification module to obtain a behavior classification result;
the early warning information sending module 15 is used for converting the power-taking state information into a power-taking off state when the behavior classification result belongs to room-flying early warning behavior, generating room-flying early warning information and sending the room-flying early warning information to the guest control monitoring platform.
Further, the system further comprises:
The monitoring module setting unit is used for setting the gesture monitoring module to include a human body recognition module and a gesture sensing module;
the monitoring feature acquisition unit is used for covering and transmitting millimeter-level electromagnetic waves to a guest room through the human body identification module and acquiring first electromagnetic wave monitoring features, wherein the first electromagnetic wave monitoring features comprise target distance features, target breathing features and target heartbeat features of a first monitoring object;
the monitoring object adding unit is used for adding the first monitoring object into the gesture monitoring object when the target distance characteristic meets a sensitive distance threshold and the target breathing characteristic and/or the target heartbeat characteristic exist;
the gesture monitoring sequence obtaining unit is used for monitoring the gesture monitoring object according to the gesture sensing module to obtain the gesture monitoring sequence.
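The admission rule described by the monitoring feature acquisition and monitoring object adding units above can be sketched as follows; the field names, the distance unit, and the 5.0 threshold value are assumptions made for the sketch.

```python
def should_monitor(distance, has_breathing, has_heartbeat, sensitive_threshold):
    """An object becomes a gesture monitoring object when its target distance
    satisfies the sensitive distance threshold and a breathing and/or
    heartbeat feature is present."""
    return distance <= sensitive_threshold and (has_breathing or has_heartbeat)

candidates = [
    {"distance": 1.5, "breathing": True,  "heartbeat": False},  # admitted
    {"distance": 6.0, "breathing": True,  "heartbeat": True},   # beyond threshold
    {"distance": 2.0, "breathing": False, "heartbeat": False},  # no vital sign
]
monitored = [c for c in candidates
             if should_monitor(c["distance"], c["breathing"], c["heartbeat"], 5.0)]
```

Only the first candidate passes both conditions, so it alone is added to the gesture monitoring objects.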
Further, the system further comprises:
the second monitoring feature acquisition unit is used for transmitting millimeter-level electromagnetic waves to the gesture monitoring object through the gesture sensing module and acquiring second electromagnetic wave monitoring features at the first moment, wherein the second electromagnetic wave monitoring features comprise target distance positioning features and target angle positioning features;
The contour positioning information obtaining unit is used for carrying out contour positioning on the gesture monitoring object according to the target distance positioning feature and the target angle positioning feature to obtain target contour positioning information;
the monitoring key point identification unit is used for matching a human skeleton model according to the target contour positioning information and identifying gesture monitoring key points for the gesture monitoring objects, wherein the gesture monitoring key points are human skeleton joints;
and the gesture monitoring sequence adding unit is used for acquiring the positioning feature sequence of the gesture monitoring key point and adding the positioning feature sequence into the gesture monitoring sequence.
Further, the system further comprises:
the monitoring image sequence generating unit is used for connecting the positioning feature sequences of the same frames of the first monitoring object of the gesture monitoring sequence based on the human skeleton model to generate a gesture monitoring image sequence;
the first image sequence generating unit is used for extracting the gesture monitoring image sequence based on a first step length and performing s times downsampling processing to generate a gesture monitoring first image sequence;
The second image sequence generating unit is used for extracting the gesture monitoring image sequence based on a second step length to generate a gesture monitoring second image sequence, wherein the second step length is longer than the first step length;
the gesture feature set obtaining unit is used for inputting the gesture monitoring first image sequence and the gesture monitoring second image sequence into the gesture feature analysis module to obtain the gesture feature set.
Further, the system further comprises:
the gesture simulation image sequence generating unit is used for carrying out gesture deployment according to gesture description characteristics based on the human skeleton model to generate a gesture simulation image sequence;
the gesture simulation first image sequence generation unit is used for extracting the gesture simulation image sequence based on a third step length and performing s times downsampling processing to generate a gesture simulation first image sequence;
the gesture simulation second image sequence generating unit is used for extracting the gesture simulation image sequence based on a fourth step length to generate a gesture simulation second image sequence, wherein the fourth step length is larger than the third step length;
The gesture feature analysis module generating unit is used for acquiring a slow channel and a fast channel based on a SLOWFAST model, inputting the gesture simulation first image sequence into the fast channel, inputting the gesture simulation second image sequence into the slow channel, and training by taking the gesture description features as output identification information to generate the gesture feature analysis module.
Further, the system further comprises:
the hierarchical clustering analysis unit is used for performing hierarchical clustering analysis on the positioning information of the key point adjacent frames of the gesture image sequence according to the preset positioning deviation degree, and obtaining a plurality of clustering frame number step sizes of a plurality of clustering results;
a first step setting unit, configured to use a minimum value of the step sizes of the plurality of cluster frame numbers as a first step;
a second step size setting unit, configured to take the maximum value of the plurality of clustering frame number step sizes as the second step size;
and the data sampling input unit is used for performing fast channel input data sampling on the gesture image sequence according to the first step length and performing slow channel input data sampling on the gesture image sequence according to the second step length.
Further, the system further comprises:
the associated execution tool obtaining unit is used for carrying out strict frequent pattern mining on the gesture feature set to obtain an associated task type and an associated execution tool;
and the behavior classification result setting unit, where the guest room basic information further includes tool state information during power-on activation, is used for deleting the associated task types whose associated execution tools do not satisfy the tool state information and setting the remaining associated task types as the behavior classification result.
It should be noted that the sequence of the embodiments of the present application is only for description, and does not represent the advantages and disadvantages of the embodiments. And the foregoing description has been directed to specific embodiments of this specification. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims can be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
The foregoing description of the preferred embodiments of the application is not intended to limit the application to the precise form disclosed, and any such modifications, equivalents, and alternatives falling within the spirit and scope of the application are intended to be included within the scope of the application.
The specification and figures are merely exemplary illustrations of the present application and are considered to cover any and all modifications, variations, combinations, or equivalents that fall within the scope of the application. It will be apparent to those skilled in the art that various modifications and variations can be made to the present application without departing from the scope of the application. Thus, the present application is intended to include such modifications and alterations insofar as they come within the scope of the application or the equivalents thereof.
Claims (8)
1. A guest room management method based on intelligent service, comprising:
acquiring guest room basic information, wherein the guest room basic information comprises power taking state information and registration state information;
when the registration state information belongs to a non-registration state, the electricity taking state information belongs to an electricity taking activation state and the preset activation time is met, monitoring a guest room through a gesture monitoring module to obtain a gesture monitoring sequence;
Inputting the gesture monitoring sequence into a gesture feature analysis module to obtain a gesture feature set;
inputting the gesture feature set into a behavior classification module to obtain a behavior classification result;
when the behavior classification result belongs to room-flying early warning behavior, converting the power-taking state information into a power-taking off state, generating room-flying early warning information and sending the room-flying early warning information to the guest control monitoring platform.
2. The method of claim 1, wherein when the registration status information belongs to a no-registration status and the power-on status information belongs to a power-on activation status and a preset activation duration is satisfied, monitoring the guest room by a gesture monitoring module to obtain a gesture monitoring sequence, comprising:
the gesture monitoring module comprises a human body recognition module and a gesture sensing module;
the human body recognition module is used for covering and transmitting millimeter-level electromagnetic waves to a guest room, and collecting first electromagnetic wave monitoring characteristics, wherein the first electromagnetic wave monitoring characteristics comprise target distance characteristics, target breathing characteristics and target heartbeat characteristics of a first monitoring object;
when the target distance characteristic meets a sensitive distance threshold and the target breathing characteristic and/or the target heartbeat characteristic exist, adding the first monitoring object into a gesture monitoring object;
And monitoring the gesture monitoring object according to the gesture sensing module to obtain the gesture monitoring sequence.
3. The method of claim 2, wherein monitoring the gesture monitoring object according to the gesture sensing module, obtaining the gesture monitoring sequence, comprises:
transmitting millimeter-level electromagnetic waves to the gesture monitoring object through the gesture sensing module, and collecting second electromagnetic wave monitoring features at a first moment, wherein the second electromagnetic wave monitoring features comprise target distance positioning features and target angle positioning features;
performing contour positioning on the gesture monitoring object according to the target distance positioning feature and the target angle positioning feature to obtain target contour positioning information;
matching a human skeleton model according to the target contour positioning information, and identifying gesture monitoring key points for the gesture monitoring object, wherein the gesture monitoring key points are human skeleton joints;
and obtaining the positioning feature sequence of the gesture monitoring key point, and adding the positioning feature sequence into the gesture monitoring sequence.
4. The method of claim 3, wherein inputting the gesture monitoring sequence into a gesture feature analysis module, obtaining a set of gesture features, comprises:
Connecting the positioning feature sequences of the same frames of the first monitoring object of the gesture monitoring sequence based on the human skeleton model to generate a gesture monitoring image sequence;
based on a first step length, extracting the gesture monitoring image sequence and performing s times downsampling processing to generate a gesture monitoring first image sequence;
extracting the gesture monitoring image sequence based on a second step length to generate a gesture monitoring second image sequence, wherein the second step length is larger than the first step length;
inputting the gesture monitoring first image sequence and the gesture monitoring second image sequence into the gesture feature analysis module to obtain the gesture feature set.
5. The method of claim 4, wherein inputting the pose monitor first image sequence and the pose monitor second image sequence into the pose feature analysis module, obtaining the set of pose features, previously comprises:
performing gesture deployment according to gesture description features based on the human skeleton model to generate a gesture simulation image sequence;
based on a third step length, extracting the gesture simulation image sequence and performing s times downsampling processing to generate a gesture simulation first image sequence;
Extracting the gesture simulation image sequence based on a fourth step length to generate a gesture simulation second image sequence, wherein the fourth step length is larger than the third step length;
and acquiring a slow channel and a fast channel based on a SLOWFAST model, inputting the gesture simulation first image sequence into the fast channel, inputting the gesture simulation second image sequence into the slow channel, and training the gesture description characteristic as output identification information to generate the gesture characteristic analysis module.
6. The method of claim 4 or 5, wherein the gesture feature analysis module further comprises a preprocessing channel comprising:
performing hierarchical clustering analysis on the positioning information of the key point adjacent frames of the gesture image sequence according to the preset positioning deviation degree to obtain a plurality of clustering frame number step sizes of a plurality of clustering results;
taking the minimum value of the step sizes of the plurality of clustering frame numbers as a first step size;
taking the maximum value of the step sizes of the plurality of clustering frame numbers as a second step size;
and performing fast channel input data sampling on the gesture image sequence according to the first step length, and performing slow channel input data sampling on the gesture image sequence according to the second step length.
7. The method of claim 1, wherein inputting the set of gesture features into a behavior classification module to obtain a behavior classification result comprises:
carrying out strict frequent pattern mining on the gesture feature set to obtain an associated task type and an associated execution tool;
the guest room basic information further comprises tool state information during power-on activation; the associated task types whose associated execution tools do not satisfy the tool state information are deleted, and the remaining associated task types are set as the behavior classification result.
8. A guest room management system based on intelligent services, the system comprising:
the system comprises a basic information acquisition module, a control module and a control module, wherein the basic information acquisition module is used for acquiring guest room basic information, and the guest room basic information comprises electricity taking state information and registration state information;
the monitoring sequence obtaining module is used for monitoring the guest room through the gesture monitoring module when the registration state information belongs to a non-registration state, the electricity taking state information belongs to an electricity taking activation state and the preset activation time is met, and obtaining a gesture monitoring sequence;
The gesture feature acquisition module is used for inputting the gesture monitoring sequence into the gesture feature analysis module to acquire a gesture feature set;
the behavior classification result obtaining module is used for inputting the gesture feature set into the behavior classification module to obtain a behavior classification result;
the early warning information sending module is used for converting the power-taking state information into a power-taking off state when the behavior classification result belongs to room-flying early warning behavior, generating room-flying early warning information and sending the room-flying early warning information to the guest control monitoring platform.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310702222.7A CN116703227B (en) | 2023-06-14 | 2023-06-14 | Guest room management method and system based on intelligent service |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310702222.7A CN116703227B (en) | 2023-06-14 | 2023-06-14 | Guest room management method and system based on intelligent service |
Publications (2)
Publication Number | Publication Date |
---|---|
CN116703227A true CN116703227A (en) | 2023-09-05 |
CN116703227B CN116703227B (en) | 2024-05-03 |
Family
ID=87835406
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310702222.7A Active CN116703227B (en) | 2023-06-14 | 2023-06-14 | Guest room management method and system based on intelligent service |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116703227B (en) |
Citations (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102147596A (en) * | 2010-02-04 | 2011-08-10 | 成都光码智能科技有限公司 | Automatic hotel room status monitoring system and method |
US20110210915A1 (en) * | 2009-05-01 | 2011-09-01 | Microsoft Corporation | Human Body Pose Estimation |
CN109523756A (en) * | 2018-12-19 | 2019-03-26 | 南京遍宇联动科技有限公司 | Room abnormality monitoring method and system |
CN111510665A (en) * | 2019-01-30 | 2020-08-07 | 杭州海康威视数字技术股份有限公司 | Monitoring system, monitoring method and device combining millimeter wave radar and camera |
CN111539358A (en) * | 2020-04-28 | 2020-08-14 | 上海眼控科技股份有限公司 | Working state determination method and device, computer equipment and storage medium |
CN112200165A (en) * | 2020-12-04 | 2021-01-08 | 北京软通智慧城市科技有限公司 | Model training method, human body posture estimation method, device, equipment and medium |
WO2021051579A1 (en) * | 2019-09-17 | 2021-03-25 | 平安科技(深圳)有限公司 | Body pose recognition method, system, and apparatus, and storage medium |
CN112906520A (en) * | 2021-02-04 | 2021-06-04 | 中国科学院软件研究所 | Gesture coding-based action recognition method and device |
CN112990137A (en) * | 2021-04-29 | 2021-06-18 | 长沙鹏阳信息技术有限公司 | Classroom student sitting posture analysis method based on template matching |
CN113658412A (en) * | 2021-08-05 | 2021-11-16 | 南京信息职业技术学院 | Millimeter wave radar-based household old people behavior monitoring method and system |
CN114067358A (en) * | 2021-11-02 | 2022-02-18 | 南京熊猫电子股份有限公司 | Human body posture recognition method and system based on key point detection technology |
CN114677758A (en) * | 2022-03-23 | 2022-06-28 | 华南理工大学 | Gait recognition method based on millimeter wave radar point cloud |
Non-Patent Citations (1)
Title |
---|
Mi Huadong (米华东): "Research on Action Recognition Methods Based on Multi-modal Data", China Master's Theses Full-text Database, Information Science and Technology, 15 February 2023 (2023-02-15), pages 138-1344 *
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117255451A (en) * | 2023-10-24 | 2023-12-19 | 快住智能科技(苏州)有限公司 | Intelligent living guest control method and system for hotel guest room management |
CN117421769A (en) * | 2023-10-24 | 2024-01-19 | 快住智能科技(苏州)有限公司 | Hotel data management method and system based on blockchain |
CN117255451B (en) * | 2023-10-24 | 2024-05-03 | 快住智能科技(苏州)有限公司 | Intelligent living guest control method and system for hotel guest room management |
Also Published As
Publication number | Publication date |
---|---|
CN116703227B (en) | 2024-05-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN116703227B (en) | Guest room management method and system based on intelligent service | |
US10058076B2 (en) | Method of monitoring infectious disease, system using the same, and recording medium for performing the same | |
Leroy et al. | A computer vision method for on-line behavioral quantification of individually caged poultry | |
JP6616791B2 (en) | Information processing apparatus, information processing method, and computer program | |
CN110059668B (en) | Behavior prediction processing method and device and electronic equipment | |
CN113164094B (en) | System and method for micropulse radar detection of physiological information | |
CN108760302A (en) | A kind of on-line monitoring and fault diagnosis system of wind power generating set bearing | |
CN109598229A (en) | Monitoring system and its method based on action recognition | |
CN105404849B (en) | Using associative memory sorted pictures to obtain a measure of pose | |
US20210192270A1 | Person identification systems and methods |
Noe et al. | Automatic detection and tracking of mounting behavior in cattle using a deep learning-based instance segmentation model | |
CN111199211A (en) | Intelligent monitoring equipment with infrared awakening function, monitoring method and storage medium | |
CN105138995A (en) | Time-invariant and view-invariant human action identification method based on skeleton information | |
CN112949417A (en) | Tumble behavior identification method, equipment and system | |
Seo et al. | A yolo-based separation of touching-pigs for smart pig farm applications | |
CN110717461A (en) | Fatigue state identification method, device and equipment | |
JP2021144631A (en) | Animal behavior estimation system, animal behavior estimation support device, animal behavior estimation method, and program | |
CN112998697B (en) | Tumble injury degree prediction method and system based on skeleton data and terminal | |
CN111860117A (en) | Human behavior recognition method based on deep learning | |
CN116269355B (en) | Safety monitoring system based on figure gesture recognition | |
CN113569671A (en) | Abnormal behavior alarm method and device | |
US20210089960A1 (en) | Training a machine learning model using a batch based active learning approach | |
CN114972727A (en) | System and method for multi-modal neural symbol scene understanding | |
Maryam et al. | A novel human posture estimation using single depth image from Kinect v2 sensor | |
CN115063752B (en) | Video tracking early warning method and system based on UWB positioning |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||