CN118278822A - Working data acquisition method and system based on image data analysis and electronic equipment
- Publication number: CN118278822A
- Application number: CN202410681246.3A
- Authority: CN (China)
- Prior art keywords: employee, image data, image, staff, feature matrix
- Prior art date
- Legal status: Granted (listed status is an assumption, not a legal conclusion)
Classifications
- G06Q10/06393—Score-carding, benchmarking or key performance indicator [KPI] analysis (under G06Q10/0639—Performance analysis of employees; Performance analysis of enterprise or organisation operations)
- G06Q10/06398—Performance of employee with respect to a job function
- G06Q10/105—Human resources (under G06Q10/10—Office automation; Time management)
- G06T7/0002—Inspection of images, e.g. flaw detection (under G06T7/00—Image analysis)
- G06T2207/30196—Human being; Person (under G06T2207/30—Subject of image; Context of image processing)
Abstract
The disclosure relates to a working data acquisition method, system, and electronic device based on image data analysis, belonging to the field of human resource management. The method comprises the following steps: first, acquiring work area image data; inputting the work area image data into a pre-trained employee image splitting model to obtain independent employee image data of each employee appearing in the work area within a preset time period; inputting the independent employee image data associated with a first employee into a pre-trained work data collection model to obtain work data of the first employee; comparing the work data of the first employee against a preset assessment index to obtain an assessment score for the first employee; and finally, sending prompt information to the first employee in response to the assessment score of the first employee being lower than a preset score threshold. The working condition of employees can thus be assessed accurately and effectively, the accuracy of work data collection and the reliability of work assessment are effectively ensured, and the expenditure of human resources is reduced at the same time.
Description
Technical Field
The disclosure relates to the field of human resource management, in particular to a working data acquisition method and system based on image data analysis and electronic equipment.
Background
In enterprise operation, performance assessment of employees has always been a core link in the enterprise management system. In conventional assessment methods, however, a manager must be assigned to monitor the working conditions of subordinates and record their working data. Such an assessment approach not only increases the cost of human resources, but also suffers from various problems, such as subjective factors interfering with the fairness of assessment results and vague assessment standards.
In recent years, with the rapid development of new-generation information technology, image data processing has played an increasingly important role in the field of human resource management. However, in scenarios where employee work data is collected and assessed, the large number of employees in a work area makes it difficult to guarantee the reliability of data collection and analysis based on image data.
Disclosure of Invention
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
In a first aspect, the present disclosure provides a method for collecting working data based on image data analysis, the method comprising:
acquiring work area image data, wherein the work area image data comprises continuous image frames acquired from a work area within a preset time period;
inputting the work area image data into a pre-trained employee image splitting model to obtain independent employee image data of each employee appearing in the work area within the preset time period;
inputting the independent employee image data associated with a first employee into a pre-trained work data collection model to obtain work data of the first employee, wherein the first employee is any one of the employees;
comparing the work data of the first employee against a preset assessment index to obtain an assessment score for the first employee; and
sending prompt information to the first employee in response to the assessment score of the first employee being lower than a preset score threshold.
In a second aspect, the present disclosure provides a working data acquisition system based on image data analysis, comprising a server for performing the method described in the first aspect.
In a third aspect, the present disclosure provides an electronic device comprising:
a storage device having a computer program stored thereon; and
a processing device configured to execute the computer program in the storage device to carry out the steps of the method of the first aspect.
According to the above technical scheme, the collected work area image data is first acquired and input into a pre-trained employee image splitting model, which splits it into independent employee image data corresponding to each employee. The independent employee image data is then input into the work data collection model to obtain the work data corresponding to each employee, the work data is compared against the preset assessment index to obtain the assessment score of each employee, and whether to send prompt information to an employee is finally determined according to the assessment score. In this way, employee images in the work area can be split accurately, work data can be collected based on each employee's images, and whether each employee meets the assessment index can be judged. The working condition of employees can therefore be assessed accurately and effectively, the accuracy of work data collection and the reliability of work assessment are effectively guaranteed, and the expenditure of human resources is reduced at the same time.
Additional features and advantages of the present disclosure will be set forth in the detailed description which follows.
Drawings
The above and other features, advantages, and aspects of embodiments of the present disclosure will become more apparent by reference to the following detailed description when taken in conjunction with the accompanying drawings. The same or similar reference numbers will be used throughout the drawings to refer to the same or like elements. It should be understood that the figures are schematic and that elements and components are not necessarily drawn to scale. In the drawings:
fig. 1 is a flowchart of a working data collection method based on image data analysis according to an embodiment of the present disclosure.
Fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the disclosure.
Detailed Description
Embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain embodiments of the present disclosure are shown in the accompanying drawings, it should be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided for a more thorough and complete understanding of the present disclosure. It should be understood that the drawings and embodiments of the present disclosure are for illustration purposes only and are not intended to limit the scope of the present disclosure.
It should be understood that the various steps recited in the method embodiments of the present disclosure may be performed in a different order and/or performed in parallel. Furthermore, method embodiments may include additional steps and/or omit performing the illustrated steps. The scope of the present disclosure is not limited in this respect.
The term "including" and variations thereof as used herein are intended to be open-ended, i.e., including, but not limited to. The term "based on" is based at least in part on. The term "one embodiment" means "at least one embodiment"; the term "another embodiment" means "at least one additional embodiment"; the term "some embodiments" means "at least some embodiments. Related definitions of other terms will be given in the description below.
It should be noted that the terms "first," "second," and the like in this disclosure are merely used to distinguish between different devices, modules, or units and are not used to define an order or interdependence of functions performed by the devices, modules, or units.
It should be noted that references to "one" and "a plurality" in this disclosure are illustrative rather than limiting; those of ordinary skill in the art will appreciate that, unless the context clearly indicates otherwise, they should be understood as "one or more".
The names of messages or information interacted between the various devices in the embodiments of the present disclosure are for illustrative purposes only and are not intended to limit the scope of such messages or information.
It will be appreciated that, before the technical solutions disclosed in the embodiments of the present disclosure are used, the user should be informed, in an appropriate manner and in accordance with relevant laws and regulations, of the type, scope of use, and usage scenarios of the personal information involved, and the user's authorization should be obtained.

For example, in response to receiving an active request from a user, prompt information is sent to the user to explicitly inform the user that the operation requested to be performed will require the acquisition and use of the user's personal information. The user can thus autonomously choose, according to the prompt information, whether to provide personal information to the software or hardware, such as an electronic device, application program, server, or storage medium, that executes the operations of the technical solution of the present disclosure.

As an optional but non-limiting implementation, in response to receiving an active request from the user, the prompt information may be sent to the user in the form of, for example, a pop-up window, in which the prompt information may be presented as text. In addition, the pop-up window may carry a selection control for the user to choose whether to "agree" or "disagree" to provide personal information to the electronic device.
It will be appreciated that the above-described notification and user authorization process is merely illustrative and not limiting of the implementations of the present disclosure, and that other ways of satisfying relevant legal regulations may be applied to the implementations of the present disclosure.
Meanwhile, it can be understood that the data (including but not limited to the data itself, the acquisition or the use of the data) related to the technical scheme should conform to the requirements of the corresponding laws and regulations and related regulations.
Fig. 1 is a schematic flow chart of a working data collection method based on image data analysis according to an embodiment of the disclosure; the execution subject of the method may be a server. Optionally, the server may be connected to one or more other electronic devices to obtain data and/or signals sent by those electronic devices, or to send data and/or signals to them. As shown in fig. 1, the method comprises the following steps:
Step S101: acquiring the work area image data.
The work area image data comprises continuous image frames acquired from the work area within a preset time period. The work area may be, for example, an area of a factory where workstations are gathered. Because a plurality of employees may appear in the work area, the work area image data can be regarded as employee image data combined from the image data of the plurality of employees, and is associated with a plurality of independent employee image data.

The preset time period may be a preconfigured image acquisition interval. For example, when employee work data is collected and assessed every day, the preset time period may be set to 9 a.m. to 5 p.m. of each day; when employee work data is collected and assessed every month, the preset time period may be set from the first day to the last day of each month, or another time period may be used, which is not limited by the embodiments of the present disclosure.
In addition, an image pickup device may be disposed in the working area, and the image pickup device may collect an image of the working area in real time within a preset period of time and transmit the collected image data to the server, so that the server analyzes the collected image data.
Step S102: inputting the work area image data into a pre-trained employee image splitting model to obtain independent employee image data of each employee appearing in the work area within the preset time period.
The employee image splitting model may be trained by a server, or may be sent to the server after being trained by other devices, which is not limited in the embodiments of the present disclosure.
The employee image splitting model can be used for determining a plurality of independent employee image data associated with the work area image data, where the independent employee image data is mapped one-to-one with employee identifications. The employee identifications comprise an identification corresponding to each employee appearing in the work area; that is, each item of independent employee image data is associated with exactly one employee.

In some embodiments, the employee image splitting model may be trained from an original employee image splitting model, which has the same model structure as the employee image splitting model, with only the model parameters differing. Updating the model parameters of the original employee image splitting model, i.e., training the model, yields a trained employee image splitting model that can accurately split the independent employee image data out of the work area image data.
The specific training process of the employee image splitting model will be described in detail in the following embodiments, which will not be described herein.
Step S103: inputting the independent employee image data associated with the first employee into a pre-trained work data collection model to obtain the work data of the first employee.
Wherein the first employee is any one of the employees. It can be understood that the image data of each of the plurality of employees can be input into the work data collection model to obtain the work data corresponding to each employee.
In some embodiments, an employee's work data may be used to indicate the employee's work within the preset time period, for example, the time the employee was in a working state during the preset time period, the rest time, the time away from the post, the time on duty, the number of times a target activity is completed per minute in the working state, and so on, which is not limited by the embodiments of the disclosure.

For example, the work data of an employee may indicate that the employee was in the working state for 5 hours, away from the post for 1 hour, and resting for 2 hours, and that during the working state the assembly of an electronic device was completed 30 times: 3 assemblies per hour in the first 2 hours of the working state and 8 assemblies per hour in the last 3 hours.

In some embodiments, the process of collecting work data with the work data collection model may be implemented based on behavior recognition; for example, the behavior of the employee in each employee image may be recognized to determine whether the employee is in a working state or a non-working state in the corresponding image, or whether a target behavior has been completed.
In some embodiments, the work data collection model comprises a behavior recognition unit, and step S103 may comprise the following sub-steps: sampling the independent employee image data corresponding to the first employee to obtain a first image sequence; inputting each employee image frame in the first image sequence into the behavior recognition unit to obtain a behavior data label sequence of the first employee, where the behavior data labels in the behavior data label sequence are associated one-to-one with the employee image frames in the first image sequence; and generating the work data according to the behavior data label associated with each employee image frame.

Optionally, if, among K consecutive behavior data labels, the number of labels characterizing the employee as being in a first state is greater than K-L, it may be determined that the employee is in the first state during the time corresponding to those K labels, where K is greater than 0 and L is greater than or equal to 0. For example, if there are 10 consecutive behavior data labels in the behavior data label sequence, each of which characterizes the employee as working, it may be determined that the employee is working at the corresponding time.

The frequency of sampling the independent employee image data may be, for example, once every 3 seconds, which effectively reduces the amount of computation by avoiding recognition of every employee image.
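As an illustrative, non-limiting sketch of the sub-steps above (not part of the original disclosure), the sampling, per-frame labelling, and K/L window rule might look as follows in Python; the frame container, the recognition callable, and the frame rate are assumptions made only for the example.

```python
from typing import Callable, List, Sequence

def build_label_sequence(frames: Sequence, recognize: Callable[[object], str],
                         fps: float = 1.0, sample_every_s: float = 3.0) -> List[str]:
    """Sample roughly one frame every `sample_every_s` seconds and map each sampled
    frame to a behavior data label via the (assumed) behavior recognition unit."""
    stride = max(1, int(round(fps * sample_every_s)))
    return [recognize(frame) for frame in list(frames)[::stride]]

def window_in_state(labels: Sequence[str], state: str, k: int, l: int) -> bool:
    """Apply the rule above to one window of K consecutive labels: the employee is
    taken to be in `state` if more than K - L labels characterize that state
    (K > 0, L >= 0), following the wording of the disclosure."""
    assert k > 0 and l >= 0 and len(labels) == k
    return sum(1 for lab in labels if lab == state) > k - l
```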
The work data collection model or the behavior recognition unit may adopt an image classification model or a behavior recognition model from the related art, and the training of the work data collection model or the behavior recognition unit is therefore not described in detail in the embodiments of the disclosure. In addition, step S103 may also be implemented by calling a work data collection model deployed on another server or electronic device to process the independent employee image data and obtain the work data.
Step S104: comparing the work data of the first employee against the preset assessment index to obtain the assessment score of the first employee.

The value range of the assessment score may be, for example, 0-100, and a higher assessment score indicates that the employee better meets the preset assessment index. For example, for an assessment index on working duration, the longer the employee's working duration, the higher the corresponding assessment score; for an assessment index on time away from the post, the shorter that time, the higher the corresponding assessment score.

The preset assessment index may include assessment indexes in dimensions such as a target working duration, a target rest duration, a target duration away from the post, an on-duty duration, and a target number of times a target behavior is completed per minute in the working state. In a specific implementation these may be set according to actual assessment requirements, and the embodiments of the disclosure are not limited in this respect.
In some possible implementations, different assessment indicators may correspond to different weights. The method can compare the information of each dimension in the working data with different assessment indexes, and obtain the final assessment score after weighted calculation according to the comparison result.
In some embodiments, step S104 may specifically include: comparing the quantity distribution information of the behavior data labels in the behavior data label sequence with target quantity distribution corresponding to a preset assessment index to obtain an assessment score of the first employee.
The quantity distribution information may be used to indicate the number or proportion of behavior data labels indicating different information within the behavior data label sequence. For example, the quantity distribution information may indicate that the sequence contains 20 behavior data labels indicating that the employee is in the working state, 5 labels indicating that the employee is away from the post, and so on.

For example, suppose the target quantity distribution corresponding to the preset assessment index specifies N behavior data labels characterizing the employee as being in the working state. If the number of such labels in the behavior data label sequence equals N, the assessment score for the working-state dimension may be 60; if that number is 0, the corresponding assessment score may be 0; and if that number is greater than or equal to X, the corresponding assessment score may be 100, where the value of X may be set according to the actual situation, and the embodiments of the disclosure are not limited in this respect.
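A minimal sketch of this scoring scheme is given below, assuming the anchor points from the example (0 labels maps to 0 points, N labels to 60 points, at least X labels to 100 points); the linear interpolation between anchors and the weighted combination across dimensions are assumptions made for illustration, not requirements of the disclosure.

```python
from typing import Dict

def dimension_score(label_count: int, target_n: int, full_x: int) -> float:
    """Map the label count of one assessment dimension to a 0-100 score using the
    anchors from the example; values between anchors are interpolated linearly."""
    if label_count <= 0:
        return 0.0
    if label_count >= full_x:
        return 100.0
    if label_count <= target_n:
        return 60.0 * label_count / target_n                      # 0..N  -> 0..60
    return 60.0 + 40.0 * (label_count - target_n) / (full_x - target_n)  # N..X -> 60..100

def assessment_score(counts: Dict[str, int], targets: Dict[str, int],
                     full_marks: Dict[str, int], weights: Dict[str, float]) -> float:
    """Weighted combination of per-dimension scores (weights assumed to sum to 1);
    lower-is-better dimensions would need an inverted mapping, omitted here."""
    return sum(weights[d] * dimension_score(counts[d], targets[d], full_marks[d])
               for d in weights)
```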
Step S105: sending prompt information to the first employee in response to the assessment score of the first employee being lower than a preset score threshold.

The prompt information may be sent to the first employee by short message or e-mail, or pushed to the employee through an application message, which is not limited by the embodiments of the disclosure.

Alternatively, the preset score threshold may be a preset absolute value, for example 60 points, or it may be set according to the assessment scores of all employees. For example, the score threshold may be set to the highest assessment score among the employees whose scores rank in the bottom 30%, i.e., prompt information is sent to each employee whose assessment score ranks in the bottom 30%.
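The percentile-based alternative can be sketched as follows; the function name and the use of the bottom 30% ratio as a parameter are assumptions for illustration only.

```python
import math
from typing import Dict, List

def employees_to_prompt(scores: Dict[str, float], ratio: float = 0.3) -> List[str]:
    """Return the employees in the bottom `ratio` of assessment scores, i.e. the
    group whose highest score serves as the dynamic threshold described above."""
    ranked = sorted(scores, key=scores.get)   # employee ids, ascending by score
    cutoff = math.ceil(len(ranked) * ratio)   # size of the bottom group
    return ranked[:cutoff]
```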
In the embodiments of the disclosure, the collected work area image data is first acquired and input into a pre-trained employee image splitting model, which splits it into independent employee image data corresponding to each employee. The independent employee image data is then input into the work data collection model to obtain the work data corresponding to each employee, the work data is compared against the preset assessment index to obtain each employee's assessment score, and whether to send prompt information to an employee is finally determined according to the assessment score. Employee images in the work area can thus be split accurately, work data can be collected based on each employee's images, and whether each employee meets the assessment index can be judged. The working condition of employees can therefore be assessed accurately and effectively, the accuracy of work data collection and the reliability of work assessment are effectively guaranteed, and the expenditure of human resources is reduced at the same time.
In some embodiments, training of the employee image splitting model may include the following steps:

Step 1: acquiring sample work area image data.
The sample work area image data is used for iterating the model parameters of an original employee image splitting model. The sample work area image data may comprise a plurality of sample employee image data; each sample employee image data may be employee image data combined from the employee image data of a plurality of employees and contains a plurality of associated sample independent employee image data, where the sample independent employee image data is mapped one-to-one with employee identifications. For example, designated employee image data among the plurality of sample employee image data is obtained by arranging its associated plurality of designated independent employee image data in time sequence; that is, when the designated employee image data is accurately split, the plurality of designated independent employee image data is the splitting result associated with the designated employee image data. The designated employee image data may be any one of the plurality of sample employee image data.

For example, the designated employee image data may be a video containing a plurality of employees, and the plurality of designated independent employee image data may be the employee image data associated with each employee, i.e., with each employee identification, in that video. An employee identification is used to identify an employee, i.e., an employee object, appearing in the work area.
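Purely for illustration, one training sample might be represented as the following hypothetical container; the field names and the use of NumPy arrays are assumptions, not part of the disclosure.

```python
from dataclasses import dataclass
from typing import Dict
import numpy as np

@dataclass
class SampleWorkAreaItem:
    # Combined frames of the work-area clip, e.g. shape (T, H, W, C).
    employee_image_data: np.ndarray
    # Employee identification -> that employee's independent image data (reliable split).
    independent_employee_data: Dict[str, np.ndarray]

    @property
    def real_employee_count(self) -> int:
        # Step 3 below: the number of independent image data items equals the number
        # of employee identifications actually present in this sample.
        return len(self.independent_employee_data)
```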
Step 2: sequentially setting each of the plurality of sample employee image data as the designated employee image data; determining, by using an original employee number analysis unit in the original employee image splitting model, a candidate employee quantity parameter associated with the designated employee image data; and determining, by using an original image splitting unit in the original employee image splitting model, a plurality of first employee images associated with the designated employee image data according to the candidate employee quantity parameter.

To enable the employee image splitting model to split based only on the input employee image data, without being constrained in the employee-identification dimension, the server needs the employee image splitting model to acquire, during training, the capability of analyzing the number of employees, i.e., of determining the employee identification quantity parameter. Accordingly, in the embodiments of the present application, the original employee image splitting model may include an original employee number analysis unit and an original image splitting unit. The original employee number analysis unit is configured to analyze the number of employee identifications included in the employee image data, i.e., the employee identification quantity parameter; the original image splitting unit is configured to split the employee image data according to the analyzed number of employee identifications to obtain a splitting result. With the original employee number analysis unit added, the model gains the capability of autonomous quantity analysis, and the number of employee identifications does not need to be provided to the model at splitting time.

In the model training process, taking the designated employee image data as an example, the server may input the designated employee image data into the original employee image splitting model, determine the candidate employee quantity parameter associated with the designated employee image data by using the original employee number analysis unit, and determine the plurality of first employee images associated with the designated employee image data according to the candidate employee quantity parameter by using the original image splitting unit. The candidate employee quantity parameter indicates the number of employee identifications that the original employee number analysis unit determines to be included in the designated employee image data; the plurality of first employee images is the splitting result output by the original employee image splitting model, and the number of first employee images equals the number of employee identifications corresponding to the candidate employee quantity parameter.
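A hedged sketch of this forward pass is shown below; the module names, the PyTorch framing, and the interface between the two units are assumptions made only to make the data flow concrete.

```python
import torch
import torch.nn as nn

class OriginalEmployeeImageSplittingModel(nn.Module):
    """Illustrative wrapper: a number analysis unit predicts how many employee
    identifications the clip contains, and an image splitting unit then produces
    that many first employee images. Both sub-modules are assumed interfaces."""

    def __init__(self, number_analysis_unit: nn.Module, image_splitting_unit: nn.Module):
        super().__init__()
        self.number_analysis_unit = number_analysis_unit
        self.image_splitting_unit = image_splitting_unit

    def forward(self, work_area_clip: torch.Tensor):
        # Candidate employee quantity parameter (e.g. per-feature-matrix confidences).
        candidate_quantity = self.number_analysis_unit(work_area_clip)
        # Split into as many first employee images as the candidate quantity indicates.
        first_employee_images = self.image_splitting_unit(work_area_clip, candidate_quantity)
        return candidate_quantity, first_employee_images
```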
Step 3: determining the number of real employees associated with the designated employee image data based on the plurality of designated independent employee image data.

Because the plurality of designated independent employee image data is the splitting result associated with the designated employee image data, and the designated independent employee image data is mapped one-to-one with employee identifications, the number of real employee identifications associated with the designated employee image data, and hence the number of real employees, can be determined from the designated independent employee image data. The number of real employees indicates the number of employee identifications actually included in the designated employee image data. For example, the server may directly determine the number of designated independent employee image data items as the number of real employees.
Step 4: updating the parameter values associated with the original employee image splitting model based on the quantity deviation between the candidate employee quantity parameter and the number of real employees, and on the quantity deviation between the plurality of designated independent employee image data and the plurality of first employee images, to obtain the employee image splitting model. The employee image splitting model can be used for determining a plurality of independent employee image data associated with the work area image data, where the independent employee image data is mapped to employee identifications, and the employee identifications include an identification associated with each employee appearing in the work area.

Because the candidate employee quantity parameter identifies the number of employee identifications that the original employee number analysis unit determines to be included in the designated employee image data, while the number of real employees indicates the number of employee identifications actually included in the designated employee image data, the quantity deviation between the candidate employee quantity parameter and the number of real employees reflects the accuracy of the original employee number analysis unit in analyzing the number of employee identifications.

Meanwhile, the plurality of first employee images is the splitting result that the original employee image splitting model associates with the designated employee image data, while the plurality of designated independent employee image data is the reliable splitting result associated with the designated employee image data, so the quantity deviation between the plurality of first employee images and the plurality of designated independent employee image data reflects the overall splitting accuracy of the original employee image splitting model. The parameters of the original employee image splitting model are therefore updated by combining the deviations in these two dimensions. As the deviations in both dimensions are gradually reduced, the original employee number analysis unit learns how to accurately analyze the number of employee identifications in employee image data, and the model as a whole learns to produce more accurate splitting results, yielding an employee image splitting model that can accurately analyze the employees included in employee image data and split it according to the accurately analyzed number of employees. Through this parameter updating, the original employee number analysis unit becomes the employee number analysis unit of the employee image splitting model, and the original image splitting unit becomes the image splitting unit of the employee image splitting model.
The employee image splitting model can be used for determining a plurality of independent employee image data associated with the work area image data, the work area image data can be any one of the employee image data needing image splitting, and the independent employee image data obtained by the employee image splitting model is mapped with employee identifications one by one.
According to the above technical scheme, in order to give the model the capability of autonomously analyzing the number of employees in employee image data, the original employee image splitting model in the present application may include an original employee number analysis unit and an original image splitting unit. The original employee number analysis unit can be used for determining the candidate employee quantity parameter associated with the sample employee image data, where the candidate employee quantity parameter indicates the number of employee identifications that the original employee number analysis unit determines to be associated with the sample employee image data. Using the original image splitting unit, the sample employee image data can be split according to the number of employee identifications corresponding to the candidate employee quantity parameter to obtain a plurality of first employee images. The plurality of sample independent employee image data associated with the sample employee image data embodies both the number of employee identifications associated with the sample employee image data and the plurality of independent employee image data that should be obtained when the sample employee image data is accurately split. Therefore, the number of real employees associated with the sample employee image data can be determined from the plurality of sample independent employee image data; the quantity deviation between the number of real employees and the candidate employee quantity parameter reflects the accuracy of the original employee number analysis unit in analyzing the number of employee identifications, and the quantity deviation between the plurality of sample independent employee image data and the plurality of first employee images reflects the accuracy of the original image splitting unit in splitting employee image data. Updating the original employee image splitting model with the deviations in these two dimensions thus enables the original employee number analysis unit to learn how to accurately analyze the number of employee identifications included in employee image data, and enables the original employee image splitting model to learn how to accurately split employee image data according to the employee quantity parameter, obtaining independent employee image data that forms an accurate one-to-one mapping with the employee identifications.
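The two-dimension parameter update can be pictured with the following hedged loss sketch; the concrete loss functions (an L1 term for the quantity deviation and a simple element-wise term standing in for the deviation between the split results) and their weighting are assumptions, since the disclosure only requires that both deviations be reduced.

```python
import torch
import torch.nn.functional as F

def splitting_model_loss(candidate_count: torch.Tensor,     # predicted employee quantity
                         real_count: torch.Tensor,           # from the independent data
                         first_images: torch.Tensor,         # (M, ...) predicted splits
                         independent_images: torch.Tensor,   # (N, ...) reference splits
                         count_weight: float = 1.0,
                         split_weight: float = 1.0) -> torch.Tensor:
    # Deviation between the candidate employee quantity and the real employee number.
    count_deviation = F.l1_loss(candidate_count.float(), real_count.float())
    # Compare the first min(M, N) predicted splits with the references; a real
    # implementation would need a proper matching, which the text does not specify.
    m = min(first_images.shape[0], independent_images.shape[0])
    split_deviation = F.mse_loss(first_images[:m], independent_images[:m])
    return count_weight * count_deviation + split_weight * split_deviation
```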
The employee quantity parameter may take a plurality of forms. In one possible implementation, the employee quantity parameter may be information constituted by count identifiers. When the original employee number analysis unit is used to determine the candidate employee quantity parameter associated with the designated employee image data, the designated employee image data may be split in the dimension of image feature matrices to determine a plurality of second employee image feature matrices associated with the designated employee image data, i.e., the result of the original employee number analysis unit splitting the designated employee image data in the image-feature-matrix dimension. An image feature matrix refers to features of the employee image data and can identify the information characteristics of that employee image data.

It will be appreciated that the characteristics of the employee image data of different employee identifications usually differ; for example, employee images of different employees typically differ in both time-domain and frequency-domain features, so an image feature matrix can characterize an employee to some extent. Thus, in the present application, the original employee number analysis unit may analyze each of the plurality of second employee image feature matrices to determine whether it is capable of characterizing an employee. The server may determine, by using the original employee number analysis unit and based on the plurality of second employee image feature matrices, a candidate numerical parameter associated with each of them; the candidate numerical parameter is one of the count identifiers and indicates the confidence that the associated image feature matrix is associated with a unique employee identification. That is, by analyzing each second employee image feature matrix, the original employee number analysis unit may determine whether it is capable of characterizing a unique employee identification, i.e., an employee, so as to determine the numerical parameter.

The server may determine the candidate numerical parameters respectively associated with the plurality of second employee image feature matrices as the candidate employee quantity parameter, which indicates the number of employee image feature matrices associated with unique employee identifications. Because a candidate numerical parameter indicates the confidence that its second employee image feature matrix is associated with a unique employee identification, the numerical parameters respectively associated with the plurality of second employee image feature matrices can be combined to determine how many of those matrices are associated with unique employee identifications, thereby identifying the number of employee identifications in the sample.
Similarly, in determining the number of real employees associated with the designated employee image data based on the plurality of designated independent employee image data, the server may perform the following steps:
step 31: second independent employee image data associated with each of the plurality of second employee image feature matrices is determined.
It can be understood that an image feature matrix is identified from employee image data, so employee image data can be restored in reverse from the image feature matrix; that is, the image feature matrix associated with second independent employee image data is the second employee image feature matrix, and the second independent employee image data can therefore be restored from the second employee image feature matrix.

Step 32: sequentially setting each of the plurality of second independent employee image data as designated second independent employee image data, and determining the maximum index among the distance indices respectively associated between the designated second independent employee image data and the plurality of designated independent employee image data as the real numerical index associated with that designated second independent employee image data.

The numerical parameter indicates the confidence that the associated image feature matrix is associated with a unique employee identification, and designated independent employee image data is employee image data associated with a unique employee identification. Therefore, the distance index between the second independent employee image data and the designated independent employee image data can represent the confidence that the second independent employee image data is employee image data associated with a unique employee identification, and hence the confidence that the second employee image feature matrix associated with that second independent employee image data is associated with a unique employee identification.
It will be appreciated that the distance indicator may be used to indicate the similarity of the two, and that a smaller distance indicator may indicate a higher similarity of the two, and vice versa. The distance parameter may be a euclidean distance, a manhattan distance, or the like, which is not limited in the embodiments of the present disclosure.
Accordingly, the server may set each of the plurality of second independent employee image data in turn as the designated second independent employee image data and calculate the distance indices between it and the plurality of designated independent employee image data. The maximum index among these distance indices indicates the designated independent employee image data to which the designated second independent employee image data is most similar, i.e., the employee identification with which the designated second independent employee image data is most likely associated. The server can therefore determine the maximum index among the distance indices respectively associated between the designated second independent employee image data and the plurality of designated independent employee image data as the real numerical index associated with the designated second independent employee image data, so as to measure whether the designated second independent employee image data is associated with a unique employee identification.
Step 33: and determining the real numerical indexes associated with the image feature matrixes of the plurality of second staff as the number of real staff.
Because the real numerical index associated with each second employee image feature matrix reflects the confidence that the matrix is actually associated with a unique employee identification, the server can take the real numerical indices respectively associated with the plurality of second employee image feature matrices as the representation of the number of real employees.
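Steps 31 to 33 can be sketched as follows, under the assumption that the distance index behaves like a similarity whose maximum indicates the closest match (cosine similarity is used here as one such choice); the shapes and names are illustrative only.

```python
import numpy as np

def real_numerical_indices(second_independent: np.ndarray,     # (M, D) restored from the M feature matrices
                           designated_independent: np.ndarray  # (N, D) per-employee reference data
                           ) -> np.ndarray:
    """For each restored second independent employee image datum, return the maximum
    distance index against all designated independent employee image data (step 32);
    the resulting vector stands for the real numerical indices of step 33."""
    a = second_independent / np.linalg.norm(second_independent, axis=1, keepdims=True)
    b = designated_independent / np.linalg.norm(designated_independent, axis=1, keepdims=True)
    similarity = a @ b.T            # (M, N) pairwise distance indices (similarity-style)
    return similarity.max(axis=1)   # one real numerical index per second feature matrix
```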
In some embodiments, step 4 may comprise the following sub-steps:
Step 41: sequentially setting each of the plurality of second employee image feature matrices as the designated second employee image feature matrix, and updating the parameter values associated with the original employee number analysis unit according to the deviation between the candidate numerical parameter associated with the designated second employee image feature matrix and the real numerical index.

The candidate numerical parameter associated with the designated second employee image feature matrix reflects the confidence, as analyzed by the original employee image splitting model, that the matrix is associated with a unique employee identification, while the real numerical index is the confidence that the designated second employee image feature matrix is actually associated with a unique employee identification. The deviation between them therefore reflects the accuracy of the original employee number analysis unit in analyzing whether a second employee image feature matrix is associated with a unique employee identification. Updating the original employee number analysis unit according to this deviation enables the unit to learn how to accurately analyze whether an image feature matrix is associated with a unique employee identification, and hence how to determine an accurate employee quantity parameter.
Step 42: updating the parameter values associated with the original employee image splitting model based on the quantity deviation between the plurality of designated independent employee image data and the plurality of first employee images.
Because the splitting result is obtained by cooperation of a plurality of module units in the original employee image splitting model, the server can update the original employee image splitting model integrally according to the quantity deviation on the splitting result so as to obtain an accurate and effective employee image splitting model.
The number of employee identifications is key information in the image splitting process. The original employee number analysis unit can split the designated employee image data in the image-feature-matrix dimension to obtain the plurality of second employee image feature matrices, and the candidate numerical parameter associated with each second employee image feature matrix identifies the confidence that it is associated with a unique employee identification. Therefore, the number of second employee image feature matrices associated with unique employee identifications can be determined from the candidate numerical parameters respectively associated with the second employee image feature matrices, and this number is the candidate employee quantity analyzed by the original employee number analysis unit.

Accordingly, when the original image splitting unit is used to determine the plurality of first employee images associated with the designated employee image data according to the candidate employee quantity parameter, the server may determine, among the plurality of second employee image feature matrices, the number of those whose associated candidate numerical parameter indicates a confidence higher than a first index threshold, and take this number as the candidate employee quantity. With the number of employee identifications available, the original image splitting unit can use the candidate employee quantity as the splitting basis and determine, from the designated employee image data, the corresponding number of first employee images.
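This counting rule admits a one-line sketch; the value of the first index threshold is an assumption made only for the example.

```python
import numpy as np

def candidate_employee_quantity(candidate_numerical_params: np.ndarray,
                                first_index_threshold: float = 0.5) -> int:
    """Count the second employee image feature matrices whose candidate numerical
    parameter (confidence of association with a unique employee identification)
    exceeds the first index threshold."""
    return int((candidate_numerical_params > first_index_threshold).sum())
```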
From the foregoing, the candidate numerical parameter associated with a second employee image feature matrix is one of the key pieces of information in the image splitting process, and the candidate numerical parameter is determined from the second employee image feature matrix. The accuracy with which the original employee number analysis unit analyzes the second employee image feature matrices is therefore also one of the important factors affecting image splitting accuracy.
Accordingly, in one possible implementation, to further improve the accuracy of the image splitting, the server may improve the accuracy of the analysis of the image feature matrix of the second employee by the original employee number analysis unit.
In some embodiments, step 41 may further comprise the sub-steps of:
Step 441: updating the parameter values associated with the original employee number analysis unit according to the deviation between the second independent employee image data associated with the designated second employee image feature matrix and the designated independent employee image data, and according to the deviation between the candidate numerical parameter associated with the designated second employee image feature matrix and the real numerical index.

Here, the designated independent employee image data is the one, among the plurality of designated independent employee image data, with the largest distance index to the second independent employee image data associated with the designated second employee image feature matrix; that is, the employee identification that the designated second employee image feature matrix most confidently represents is the employee identification associated with that designated independent employee image data. If the original employee number analysis unit extracts the designated second employee image feature matrix more accurately, the matrix is closer to the image feature matrix of that employee identification, and the second independent employee image data associated with the matrix is closer to the designated independent employee image data.

Accordingly, on the one hand, from the deviation between the second independent employee image data associated with the designated second employee image feature matrix and the designated independent employee image data, the server can make the original employee number analysis unit learn, during model training, how to analyze and extract image feature matrices from employee image data so that the extracted image feature matrices can effectively characterize each employee identification. On the other hand, from the deviation between the candidate numerical parameter associated with the designated second employee image feature matrix and the real numerical index, the original employee number analysis unit can learn how to perform accurate numerical-parameter analysis based on the extracted image feature matrices. The accuracy of the determined employee quantity parameter can thus be improved in both dimensions, further improving image splitting accuracy.
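A hedged sketch of the two deviations used in step 441 to update the original employee number analysis unit is given below; the specific loss functions are assumptions, chosen only to make the two terms explicit.

```python
import torch
import torch.nn.functional as F

def number_analysis_unit_loss(restored_independent: torch.Tensor,    # restored from the designated matrix
                              designated_independent: torch.Tensor,  # the matched reference data
                              candidate_numerical: torch.Tensor,     # predicted confidence(s)
                              real_numerical: torch.Tensor           # real numerical index(es)
                              ) -> torch.Tensor:
    # Deviation between the restored second independent employee image data and the
    # designated independent employee image data (feature extraction quality).
    reconstruction_deviation = F.mse_loss(restored_independent, designated_independent)
    # Deviation between the candidate numerical parameter and the real numerical index
    # (confidence analysis quality).
    confidence_deviation = F.l1_loss(candidate_numerical, real_numerical)
    return reconstruction_deviation + confidence_deviation
```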
The image feature matrices of different employee identifications are different, and the image feature matrices can characterize the employee identifications to a certain extent, so that in one possible implementation manner, the server can guide the employee image data splitting process of the original image splitting unit according to the characterization effect of the second employee image feature matrix, which is obtained by analyzing the original image splitting unit, on the employee identifications.
In determining the candidate employee quantity parameter associated with the designated employee image data, the server may also determine a plurality of second employee image feature matrices associated with the designated employee image data, the second employee image feature matrices consistent with the second employee image feature matrices above. The server may determine, based on the plurality of second employee image feature matrices, a candidate employee number parameter associated with the designated employee image data, where the candidate employee number parameter is used to indicate a plurality of reference feature matrices associated with unique employee identifications in the plurality of second employee image feature matrices, that is, the reference feature matrices determine, for the original employee number analysis unit, the second employee image feature matrix associated with the unique employee identifications. For example, the server may analyze the candidate numerical parameters, and determine the second employee image feature matrix with higher confidence level identified by the candidate numerical parameters as a reference feature matrix, where the specified reference feature matrix may be used to indicate employee personal feature information of an employee identifier associated with the specified reference feature matrix, where the employee personal feature information refers to a feature of the employee identifier when the employee identifier sends out employee image data, and the specified reference feature matrix is an image feature matrix associated with the employee identifier, and the specified reference feature matrix may be any reference feature matrix.
When the original image splitting unit is used to determine, according to the candidate employee number parameter, the plurality of first employee images associated with the specified employee image data, the server may set the plurality of reference feature matrices in turn as the specified reference feature matrix. Taking the specified reference feature matrix as an example, the server may determine the first employee image associated with it based on the specified reference feature matrix and the specified employee image data; that is, the server may split images out of the specified employee image data according to the specified reference feature matrix, taking the employee image data whose associated image feature matrix closely matches the specified reference feature matrix as the first employee image associated with that matrix. Because the specified reference feature matrix indicates the employee personal feature information of its associated employee identification, the first employee image associated with the specified reference feature matrix can be determined as the employee image data associated with that employee identification.
In this way, when images are split, the number of employee identifications serves as a constraint on the one hand, so that the split employee image data matches the number of employee identifications contained in the work area image data; on the other hand, the reference feature matrices provide guidance, so that image splitting can obtain the employee image data in the work area image data that is accurately related to each employee identification, improving the overall accuracy of image splitting.
Specifically, when determining the first employee image associated with the specified reference feature matrix based on the specified reference feature matrix and the specified employee image data, in one possible implementation the server may make the determination according to distance indexes between image feature matrices.
First, the server may determine the specified employee image feature matrix associated with the specified employee image data. This matrix is the image feature matrix associated with the specified employee image data as a whole, so it mixes the image feature matrices associated with the employee image data of the plurality of employee identifications. Based on the distance indexes between the specified reference feature matrix and the respective parts of the specified employee image feature matrix, the server may identify, from the specified employee image feature matrix, a first employee image feature matrix associated with the specified reference feature matrix; the distance index between the first employee image feature matrix and the specified reference feature matrix is greater than a second index threshold, which is used to judge whether two image feature matrices have a high distance index. The first employee image feature matrix is therefore associated, with high confidence, with the same employee identification as the specified reference feature matrix, and the server may determine the first employee image associated with the specified reference feature matrix based on that first employee image feature matrix; the first employee image in turn identifies the employee image data associated with the employee identification represented by the specified reference feature matrix.
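A minimal sketch of this selection step is given below. It assumes the distance index behaves like a similarity measure (modeled as cosine similarity), that the "parts" of the specified employee image feature matrix are its columns, and that the second index threshold is a fixed constant; none of these choices are fixed by the disclosure.

```python
# Keep the columns of the specified employee image feature matrix whose distance
# index to the specified reference feature matrix exceeds the second index threshold.
import numpy as np

def first_employee_feature_matrix(specified_matrix, reference_vec, second_threshold=0.7):
    """specified_matrix: (d, n) matrix mixing features of several employee identifications;
    reference_vec: (d,) specified reference feature matrix flattened to a vector."""
    ref = reference_vec / (np.linalg.norm(reference_vec) + 1e-8)
    cols = specified_matrix / (np.linalg.norm(specified_matrix, axis=0, keepdims=True) + 1e-8)
    sims = cols.T @ ref                     # distance index of each column to the reference
    keep = sims > second_threshold          # parts tied to the same employee identification
    return specified_matrix[:, keep], keep
```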
It can be appreciated that, because the reference feature matrices are mapped one to one with the employee identifications, the employee identifications associated with different reference feature matrices differ. Therefore, in one possible implementation, to further improve the accuracy of image splitting, the server may extract similar image feature matrices according to the specified reference feature matrix and then, according to the other reference feature matrices, remove those parts of the extracted image feature matrix that are, with high confidence, associated with other employee identifications.
Using the above manner, the server may extract the first employee image feature matrices respectively associated with the plurality of reference feature matrices; each first employee image feature matrix represents, to some extent, the employee image data in the specified employee image data that is associated with the employee identification represented by its reference feature matrix. Accordingly, when determining the first employee image associated with the specified reference feature matrix based on the first employee image feature matrix associated with it, the server may first take the first employee image feature matrices associated with the reference feature matrices other than the specified reference feature matrix as a plurality of comparison employee image feature matrices. If similar feature parts exist between the first employee image feature matrix associated with the specified reference feature matrix and a comparison employee image feature matrix, those parts are, with high confidence, associated with other employee identifications, and employee image data associated with other employee identifications is more likely to be picked up when employee image data is determined from that first employee image feature matrix.
In this way, based on the plurality of comparison employee image feature matrices, the server can identify, from the first employee image feature matrix associated with the specified reference feature matrix, the first feature matrix associated with the specified reference feature matrix; the distance indexes between the first feature matrix and the plurality of comparison employee image feature matrices are lower than a third index threshold, which is used to judge whether two image feature matrices have a low distance index. In other words, according to the third index threshold, the server can reject those parts of the first employee image feature matrix associated with the specified reference feature matrix that have a high distance index to a comparison employee image feature matrix, thereby rejecting the parts that are, with high confidence, associated with other employee identifications, so that the first feature matrix is more closely tied to the employee identification represented by the specified reference feature matrix.
The server can then determine the first employee image associated with the specified reference feature matrix based on the first feature matrix associated with it, so that the association between the first employee image and the employee identification represented by the specified reference feature matrix is more accurate, interference from employee image data of other employee identifications is reduced, and a more accurate image splitting result is obtained.
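The rejection step described above might look as follows; again the distance index is modeled as cosine similarity over matrix columns, and the third index threshold is an assumed constant rather than a value taken from this disclosure.

```python
# Drop the columns of the first employee image feature matrix whose distance index
# to any comparison employee image feature matrix exceeds the third index threshold.
import numpy as np

def reject_other_identities(first_matrix, comparison_matrices, third_threshold=0.6):
    """first_matrix: (d, k) parts kept for the specified reference feature matrix;
    comparison_matrices: list of (d, m_i) first employee image feature matrices
    associated with the other reference feature matrices."""
    cols = first_matrix / (np.linalg.norm(first_matrix, axis=0, keepdims=True) + 1e-8)
    keep = np.ones(first_matrix.shape[1], dtype=bool)
    for other in comparison_matrices:
        if other.shape[1] == 0:
            continue
        oc = other / (np.linalg.norm(other, axis=0, keepdims=True) + 1e-8)
        max_sim = (cols.T @ oc).max(axis=1)   # closest comparison part for each column
        keep &= max_sim < third_threshold     # drop parts likely tied to other identifications
    return first_matrix[:, keep]
```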
Specifically, in one possible implementation, to further improve the accuracy of image splitting, after extracting the first employee image feature matrix associated with the specified reference feature matrix in the above manner, the server may further use the original image splitting unit to identify, based on the specified reference feature matrix, a second feature matrix associated with it from the first employee image feature matrix associated with the specified reference feature matrix. Similar to the way the first employee image feature matrix is extracted, the server may extract according to the distance indexes between each feature part of the first employee image feature matrix and the specified reference feature matrix, obtaining a second feature matrix that is even closer to the specified reference feature matrix: the distance index between the second feature matrix and the specified reference feature matrix is higher than a fourth index threshold, and the fourth index threshold is higher than the second index threshold. That is, this step further purifies the first employee image feature matrix according to the specified reference feature matrix.
When determining the first employee image associated with the specified reference feature matrix based on the first feature matrix associated with it, the server can combine feature extraction guided by the specified reference feature matrix with feature extraction guided by the other reference feature matrices, merging the first feature matrix and the second feature matrix associated with the specified reference feature matrix into an image feature matrix associated with that reference feature matrix. This merged matrix fits the employee personal feature information indicated by the specified reference feature matrix more closely on the one hand, and on the other hand is less likely to contain image features of employee identifications indicated by the other reference feature matrices. The server can therefore determine the first employee image associated with the specified reference feature matrix based on this merged image feature matrix, further tightening the association between the first employee image and the employee identification indicated by the specified reference feature matrix and further improving the accuracy of image splitting.
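A hedged sketch of this purification-and-merge step follows; the fourth index threshold, the column-wise treatment and the simple concatenation used to merge the two matrices are illustrative assumptions (duplicate columns are not deduplicated here).

```python
# Purify the first employee image feature matrix with a stricter threshold, then
# merge the result with the comparison-filtered first feature matrix.
import numpy as np

def refine_and_merge(first_employee_matrix, first_feature_matrix, reference_vec,
                     fourth_threshold=0.85):
    """first_employee_matrix: (d, k) parts above the second index threshold;
    first_feature_matrix: (d, k') parts surviving rejection against the comparison matrices;
    reference_vec: (d,) specified reference feature matrix flattened to a vector."""
    ref = reference_vec / (np.linalg.norm(reference_vec) + 1e-8)
    cols = first_employee_matrix / (
        np.linalg.norm(first_employee_matrix, axis=0, keepdims=True) + 1e-8)
    second_feature_matrix = first_employee_matrix[:, cols.T @ ref > fourth_threshold]
    # combine both refinements into the image feature matrix used for splitting
    return np.concatenate([first_feature_matrix, second_feature_matrix], axis=1)
```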
As can be seen from the foregoing, the overall image feature matrix associated with the specified employee image data is also one of the key pieces of information for image splitting. Therefore, besides optimizing the employee identification number analysis and the image splitting, the server can also handle the extraction of the image feature matrix of the specified employee image data more accurately, so as to obtain a more accurate image feature matrix that reflects the characteristics of the specified employee image data.
In one possible implementation, the original employee image splitting model may further include an original feature engineering unit configured to identify the specified employee image feature matrix associated with the specified employee image data; the specified employee image feature matrix characterizes information features of the specified employee image data, such as features in the time domain or the frequency domain.
In some embodiments, step 2 may comprise the following sub-steps:
Step 21: determining candidate staff quantity parameters associated with the specified staff image data based on the specified staff image feature matrix by using an original staff quantity analysis unit in the original staff image splitting model, and determining a plurality of first staff images associated with the specified staff image data according to the candidate staff quantity parameters and the specified staff image feature matrix by using an original image splitting unit in the original staff image splitting model.
In the original employee number analysis unit, the specified employee image feature matrix can be analysed to extract the plurality of second employee image feature matrices, and the number of those matrices associated with unique employee identifications is then analysed to determine the candidate employee number parameter; in the original image splitting unit, the plurality of first employee image feature matrices can be identified from the specified employee image feature matrix according to the candidate employee number parameter, so as to determine the plurality of first employee images.
In some embodiments, the original employee number analysis unit may include two convolution layers, a padding-and-reshaping layer, a bidirectional long short-term memory unit and an activation layer. The server may input the specified employee image feature matrix into the first convolution layer of the original employee number analysis unit to obtain a plurality of feature matrices; after these are padded and reshaped to a preset mask matrix dimension, a mask matrix may be obtained through the second convolution layer, where the mask matrix is a feature matrix formed from the plurality of second employee image feature matrices. The length of the mask matrix may be a predetermined length L, in which case the original employee number analysis unit can determine at most L employee identifications; L may be, for example, 256. The dimension of the mask matrix is adjustable and can be controlled by changing the target dimension used in the padding and reshaping stage.
The bidirectional long short-term memory unit and the activation layer can then be used to analyse the candidate numerical parameter associated with each second employee image feature matrix, which indicates the confidence that the second employee image feature matrix is associated with a unique employee identification. A 0/1 code associated with each second employee image feature matrix is determined from its candidate numerical parameter, where 1 indicates that the associated second employee image feature matrix is associated with a unique employee identification and 0 indicates that it is not, so that the candidate employee number can be represented by the number of 1s in the 0/1 codes. For example, the server may set the 0/1 code of a second employee image feature matrix to 1 when the confidence indicated by its candidate numerical parameter is higher than an index threshold, and to 0 otherwise. The unit may then feed the candidate employee number and the second employee image feature matrices coded as 1 directly into the original image splitting unit for image splitting.
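For concreteness, a rough PyTorch sketch of such a unit is given below. The kernel sizes, channel counts, the use of adaptive pooling as a stand-in for the padding-and-reshaping step, and the 0/1 threshold are assumptions of the sketch, not values taken from this disclosure.

```python
# Illustrative employee-number analysis unit: two convolutions around a reshape to
# a fixed mask length L, a bidirectional LSTM with a sigmoid head, and 0/1 coding.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EmployeeCountUnit(nn.Module):
    def __init__(self, in_dim=128, mask_len=256, hidden=128, threshold=0.5):
        super().__init__()
        self.conv1 = nn.Conv1d(in_dim, hidden, kernel_size=3, padding=1)
        self.conv2 = nn.Conv1d(hidden, hidden, kernel_size=3, padding=1)
        self.mask_len = mask_len          # L: at most L employee identifications
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Sequential(nn.Linear(2 * hidden, 1), nn.Sigmoid())
        self.threshold = threshold

    def forward(self, feat):              # feat: (B, in_dim, T) specified employee features
        x = self.conv1(feat)
        # pad/reshape the time axis to the preset mask matrix dimension
        x = F.adaptive_avg_pool1d(x, self.mask_len)
        mask_matrix = self.conv2(x).transpose(1, 2)          # (B, L, hidden): second matrices
        conf = self.head(self.lstm(mask_matrix)[0]).squeeze(-1)  # candidate numerical parameters
        codes = (conf > self.threshold).long()                # 0/1 code per row of the mask matrix
        return mask_matrix, conf, codes, codes.sum(dim=1)     # last item: candidate employee number
```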
In some embodiments, the original image splitting unit is formed from a plurality of parallel data processing units, the number of which equals the candidate employee number analysed by the original employee number analysis unit; that is, the data processing units are mapped one to one with employee identifications, and each data processing unit splits the employee image data associated with one employee identification. The input of each data processing unit is the specified employee image feature matrix associated with the specified employee image data together with a reference feature matrix in the mask matrix that is associated with a unique employee identification.
In some embodiments, when extracting the specified employee image feature matrix associated with the specified employee image data, the server may extract time-domain features from the time-domain information of the specified employee image data and frequency-domain features from its frequency-domain information. The server can determine the specified employee image feature matrix based on the time-domain and frequency-domain features, so that the matrix characterizes both the time-domain and the frequency-domain features of the specified employee image data well, which helps the specified employee image feature matrix to be analysed and processed more accurately.
It will be appreciated that feature engineering is the process of identifying representative information from raw information such as image data; to some extent, the more feature engineering passes are performed, the more detailed the feature identification of the employee image data becomes and the more of its details can be captured. For example, when an image feature matrix is identified by convolution, the related art generally uses a larger convolution kernel and performs feature engineering once to obtain the image feature matrix; in the present application, a plurality of small convolution kernels can be used instead, so that several feature engineering passes are applied to extract the image feature matrix associated with the employee image data.
Therefore, in the present application, the server can combine information of multiple dimensions and use multiple feature engineering passes to extract the specified employee image feature matrix associated with the specified employee image data. For example, when identifying the time-domain features associated with the specified employee image data, the server may perform feature engineering M times to determine those features, M being a positive integer greater than 1.
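The following sketch illustrates the idea of several small-kernel feature engineering passes over time-domain and frequency-domain information; the choice of 3x3 kernels, M = 3 passes and an FFT magnitude as the frequency-domain information are assumptions of this sketch.

```python
# Illustrative feature engineering unit: M stacked small-kernel convolutions per
# domain, with the two domain features concatenated into one feature matrix.
import torch
import torch.nn as nn

class FeatureEngineeringUnit(nn.Module):
    def __init__(self, in_ch=3, ch=32, passes=3):
        super().__init__()
        def branch():
            layers = []
            for i in range(passes):        # M small-kernel passes instead of one large kernel
                layers += [nn.Conv2d(in_ch if i == 0 else ch, ch, 3, padding=1), nn.ReLU()]
            return nn.Sequential(*layers)
        self.time_branch = branch()
        self.freq_branch = branch()

    def forward(self, frames):             # frames: (B, C, H, W) work-area image frames
        time_feat = self.time_branch(frames)
        spectrum = torch.fft.fft2(frames).abs()   # frequency-domain information (assumed form)
        freq_feat = self.freq_branch(spectrum)
        # combine both domains into the specified employee image feature matrix
        return torch.cat([time_feat, freq_feat], dim=1)
```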
In some embodiments, the server may also determine the specified employee image feature matrix associated with the specified employee image data by directly combining a plurality of features.
In some embodiments, in combination with the above embodiments, the application process of the employee image splitting model obtained with the above training manner may include the following steps:
step 31: and acquiring the image data of the work area.
The work area image data is formed from the independent employee image data associated with a plurality of employee identifications; that is, the work area image data mixes the independent employee image data of the plurality of employee identifications.
Step 32: based on the work area image data, determining the employee number parameter associated with the work area image data by using the employee number analysis unit in the employee image splitting model, and, based on the employee number parameter, determining the independent employee image data of each employee identification associated with the work area image data by using the image splitting module in the employee image splitting model.
The independent employee image data, one piece per employee identification, constitute the image splitting result associated with the work area image data; the pieces of independent employee image data are mapped one to one with the employee identifications, and the employee number parameter indicates the number of employee identifications associated with the work area image data. The employee number analysis unit is the module obtained from the original employee number analysis unit through the parameter updating described above, and the image splitting module is the module obtained from the original image splitting unit through the same parameter updating.
The image feature matrices play the role of representing employee identifications in the image splitting process. Similar to the model determination process, in the model application process, when determining the employee number parameter associated with the work area image data, the server may determine one employee image feature matrix per employee identification associated with the work area image data; these employee image feature matrices are mapped one to one with the employee identifications, and each represents the image feature matrix of its associated employee identification. This capability comes from the model determination process, in which the original employee number analysis unit is updated according to the multidimensional numerical deviations, so that the module learns how to determine accurate second employee image feature matrices and how to accurately analyse the numerical parameters associated with them, and can therefore accurately determine the employee image feature matrix associated with each unique employee identification.
When the independent employee image data of each employee identification associated with the work area image data is determined based on the employee number parameter, the image splitting module can determine it based on the employee number parameter and the employee image feature matrices, using the constraint that the employee number parameter places on the amount of employee image data and the characterization of the employee identifications by the employee image feature matrices. With the employee image feature matrices, the server can identify, from the candidate employee image data, the employee image data that matches each employee image feature matrix, and thereby obtain the employee image data associated with the employee identification of each matrix.
It will be appreciated that the number of employee identifications included in the image data of different work areas may vary, and that in the present application, the image splitting process for each employee identification is the same, i.e. the model parameters required for image splitting for each employee identification are the same. Accordingly, in one possible implementation, the server may split employee image data associated with each of the plurality of employee identifications in a parallel manner.
When the independent employee image data of each employee identification associated with the work area image data is determined based on the employee number parameter and the employee image feature matrices, the server can form one splitting submodule per employee identification based on the parameter index values associated with the image splitting module and on the employee number parameter. The parameter index values associated with these splitting submodules are identical and are determined from the parameter index values of the image splitting module; the splitting submodules are mapped one to one with the employee image feature matrices, that is, each splitting submodule splits, from the work area image data, the employee image data associated with the employee identification represented by its employee image feature matrix. In this way, the employee image data associated with a plurality of employee identifications can be split synchronously in the image splitting process, without waiting for the splitting of one employee identification to finish before starting the next, which improves the efficiency of image splitting.
The server can set the splitting submodules in turn as the specified submodule. Taking the specified submodule as an example, the specified submodule can split out the independent employee image data associated with the specified employee identification according to the characterization of the specified employee identification by the specified employee image feature matrix; this independent employee image data is the employee image data that closely matches the specified employee image feature matrix, and the specified employee identification is the employee identification characterized by the specified employee image feature matrix.
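A minimal sketch of such per-identification splitting submodules sharing one set of parameter index values is shown below; the soft-mask formulation and all interfaces are assumptions, and the loop could equally be dispatched concurrently across devices or processes.

```python
# Illustrative splitting submodule: all employee identifications share the same
# weights, so one pass per identification can run independently and in parallel.
import torch
import torch.nn as nn

class SplitSubmodule(nn.Module):
    def __init__(self, feat_dim=64):
        super().__init__()
        self.score = nn.Sequential(nn.Linear(2 * feat_dim, 1), nn.Sigmoid())

    def forward(self, work_feat, employee_feat):   # (T, d) work-area features, (d,) employee matrix
        ref = employee_feat.expand(work_feat.shape[0], -1)
        mask = self.score(torch.cat([work_feat, ref], dim=-1)).squeeze(-1)
        return mask                                # soft per-frame mask for this identification

def split_all(work_feat, employee_feature_matrices, shared_module):
    # identical weights per submodule; the passes are independent of each other
    masks = [shared_module(work_feat, e) for e in employee_feature_matrices]
    return torch.stack(masks)                      # (num_employees, T) split result
```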
Based on the above embodiments, the model training manner of the embodiments of the disclosure can be used to generate an employee image splitting model that automatically analyses the number of employee identifications and accurately splits images according to that number. When the model is applied, an image splitting result can be obtained simply by inputting the work area image data to be split, without labelling employee identifications on the work area image data beforehand, so that image splitting accuracy is ensured while the convenience of image splitting and the efficiency of data analysis are improved.
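Putting the application steps together, the following sketch outlines the inference flow from work area image data to prompt information; every interface here (split_model, work_data_model, and the distribution-overlap score) is hypothetical and merely stands in for the trained modules described above.

```python
# Hedged end-to-end application sketch: split per employee, collect work data,
# score against a target behavior-label distribution, and flag low scores.
from collections import Counter

def compare_with_target(behavior_labels, target_distribution):
    """One simple assessment score: overlap between the observed behavior-label
    distribution and the preset target distribution, scaled to 0-100."""
    counts = Counter(behavior_labels)
    total = sum(counts.values()) or 1
    observed = {k: v / total for k, v in counts.items()}
    return 100.0 * sum(min(observed.get(k, 0.0), p) for k, p in target_distribution.items())

def collect_work_data(work_area_frames, split_model, work_data_model,
                      target_distribution, score_threshold=60.0):
    per_employee_images = split_model(work_area_frames)      # employee identification -> image data
    below_threshold = []
    for employee_id, images in per_employee_images.items():
        behavior_labels = work_data_model(images)             # behavior data label sequence
        score = compare_with_target(behavior_labels, target_distribution)
        if score < score_threshold:
            below_threshold.append(employee_id)                # candidates for prompt information
    return below_threshold
```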
In another aspect, an embodiment of the disclosure further provides a working data collection system based on image data analysis, where the system includes at least a server, and the server may be configured to perform one or more optional implementations of the working data collection method based on image data analysis in the foregoing embodiment.
Optionally, the system may further comprise other electronic devices than a server, such as an image data acquisition device, which may be used to acquire and send work area image data to the server, a communication device, etc., which may be used to send prompt information to one or more employees.
Referring now to fig. 2, a schematic structural diagram of an electronic device 200 suitable for implementing embodiments of the present disclosure is shown; the electronic device 200 may be provided, for example, as the server referred to in the embodiments described above. The electronic device shown in fig. 2 is merely an example and should not be construed to limit the functionality and scope of use of the disclosed embodiments.
As shown in fig. 2, the electronic device 200 may include a processing means (e.g., a central processing unit, a graphics processor, etc.) 201, which may perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 202 or a program loaded from a storage means 208 into a Random Access Memory (RAM) 203. In the RAM 203, various programs and data necessary for the operation of the electronic apparatus 200 are also stored. The processing device 201, ROM 202, and RAM 203 are connected to each other through a bus 204. An input/output (I/O) interface 205 is also connected to bus 204.
In general, the following devices may be connected to the I/O interface 205: input devices 206 including, for example, a touch screen, touchpad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; an output device 207 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 208 including, for example, magnetic tape, hard disk, etc.; and a communication device 209. The communication means 209 may allow the electronic device 200 to communicate with other devices wirelessly or by wire to exchange data. While fig. 2 shows an electronic device 200 having various means, it is to be understood that not all of the illustrated means are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
In particular, according to embodiments of the present disclosure, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a non-transitory computer readable medium, the computer program comprising program code for performing the method shown in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 209, or from the storage means 208, or from the ROM 202. The above-described functions defined in the methods of the embodiments of the present disclosure are performed when the computer program is executed by the processing device 201.
It should be noted that the computer readable medium described in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, fiber optic cables, RF (radio frequency), and the like, or any suitable combination of the foregoing.
In some embodiments, the clients and servers may communicate using any currently known or future developed network protocol, such as HTTP (HyperText Transfer Protocol), and may be interconnected with any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network ("LAN"), a wide area network ("WAN"), an internetwork (e.g., the Internet), and peer-to-peer networks (e.g., ad hoc peer-to-peer networks), as well as any currently known or future developed networks.
The computer readable medium may be contained in the electronic device; or may exist alone without being incorporated into the electronic device.
The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to perform one or more alternative implementations of the working data acquisition method based on image data analysis in the above embodiments.
Or the computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to implement one or more alternative implementations of the working data acquisition method based on image data analysis in the above embodiments.
Computer program code for carrying out operations of the present disclosure may be written in one or more programming languages, including, but not limited to, an object oriented programming language such as Java, Smalltalk or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The modules described in the embodiments of the present disclosure may be implemented in software or hardware. In some cases, the name of a module does not constitute a limitation of the module itself.
The functions described above herein may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a system on a chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
The foregoing description is only of the preferred embodiments of the present disclosure and an explanation of the principles of the technology employed. It will be appreciated by persons skilled in the art that the scope of the disclosure referred to herein is not limited to the specific combinations of features described above, but also covers other embodiments formed by any combination of the above features or their equivalents without departing from the spirit of the disclosure, for example, technical solutions formed by substituting the above features with technical features having similar functions disclosed in the present disclosure (but not limited thereto).
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate embodiments can also be implemented in combination in a single embodiment. Conversely, various features that are described in the context of a single embodiment can also be implemented in multiple embodiments separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims. The specific manner in which the various modules perform operations in the apparatus of the above embodiments has been described in detail in connection with the embodiments of the method and will not be elaborated here.
Claims (10)
1. A method of collecting working data based on image data analysis, the method comprising:
Acquiring work area image data, wherein the work area image data comprises continuous image frames acquired from a work area within a preset time period;
Inputting the work area image data into an employee image splitting model obtained by training in advance, to obtain independent employee image data of each employee appearing in the work area within the preset time period;
Inputting the independent employee image data associated with a first employee into a working data acquisition model obtained by training in advance, to obtain working data of the first employee, wherein the first employee is any one of the employees;
Comparing the working data of the first employee based on a preset assessment index to obtain an assessment score of the first employee;
Sending prompt information to the first employee in response to the assessment score of the first employee being lower than a preset score threshold.
2. A method as defined in claim 1, wherein the training of the employee image splitting model comprises:
Acquiring sample work area image data, wherein the sample work area image data comprises a plurality of sample employee image data, each sample employee image data comprises a plurality of associated sample independent employee image data, the specified employee image data is arranged according to the time sequence of its associated plurality of specified independent employee image data, and the sample independent employee image data is mapped one to one with employee identifications;
Sequentially setting the plurality of sample employee image data as the designated employee image data, determining candidate employee number parameters associated with the designated employee image data by using an original employee number analysis unit in an original employee image splitting model, and determining a plurality of first employee images associated with the designated employee image data according to the candidate employee number parameters by using an original image splitting unit in the original employee image splitting model;
determining a number of actual employees associated with the specified employee image data based on the plurality of specified independent employee image data;
Updating parameter index values associated with the original employee image splitting model based on the quantity deviation between the candidate employee quantity parameter and the real employee quantity and the quantity deviation between the plurality of appointed independent employee image data and the plurality of first employee images to obtain an employee image splitting model, wherein the employee image splitting model is used for determining a plurality of independent employee image data associated with the work area image data, the independent employee image data is mapped with employee identifications one by one, and the employee identifications comprise identifications corresponding to each employee appearing in the work area.
3. A method as defined in claim 2, wherein said determining candidate employee count parameters associated with the specified employee image data comprises:
Determining a plurality of second employee image feature matrices associated with the designated employee image data;
Based on the plurality of second employee image feature matrices, determining candidate numerical parameters respectively associated with the plurality of second employee image feature matrices, wherein a candidate numerical parameter is used for indicating the confidence that its associated second employee image feature matrix is associated with a unique employee identification;
Determining candidate numerical parameters associated with the plurality of second employee image feature matrices as the candidate employee number parameters, wherein the employee number parameters are used for indicating the number of the employee image feature matrices associated with the unique employee identification;
the determining, based on the plurality of specified independent employee image data, a number of actual employees associated with the specified employee image data includes:
determining second independent employee image data associated with each of the plurality of second employee image feature matrices;
Sequentially setting the plurality of second independent employee image data as specified second independent employee image data, and determining the maximum index parameter among the distance indexes respectively between the specified second independent employee image data and the plurality of specified independent employee image data as the real numerical index associated with the specified second independent employee image data;
determining the real numerical indexes respectively associated with the plurality of second employee image feature matrixes as the real employee number;
The updating of the parameter index values associated with the original employee image splitting model based on the number deviation between the candidate employee number parameter and the actual employee number, and based on the number deviation between the plurality of specified independent employee image data and the plurality of first employee images, comprises:
Sequentially setting the plurality of second employee image feature matrices as designated second employee image feature matrices, and updating the parameter index values associated with the original employee number analysis unit according to the number deviation between the candidate numerical parameters and the real numerical indexes associated with the designated second employee image feature matrices;
Updating the parameter index values associated with the original employee image splitting model based on the number deviation between the plurality of specified independent employee image data and the plurality of first employee images.
4. A method as defined in claim 3, wherein said determining a plurality of first employee images associated with said designated employee image data in accordance with said candidate employee quantity parameter comprises:
determining, as the candidate employee number, the number of second employee image feature matrices whose associated candidate numerical parameters indicate a confidence higher than a first index threshold;
Determining a first employee image of the number of candidate employees associated with the designated employee image data according to the number of candidate employees;
the updating the parameter index values associated with the original employee number analysis unit according to the number deviation between the candidate numerical parameter associated with the designated second employee image feature matrix and the real numerical index comprises the following steps:
Updating the parameter index values associated with the original employee number analysis unit according to the number deviation between the second independent employee image data associated with the specified second employee image feature matrix and the specified independent employee image data, and according to the number deviation between the candidate numerical parameter associated with the specified second employee image feature matrix and the real numerical index, wherein the specified independent employee image data is the one, among the plurality of specified independent employee image data, with the largest distance index to the second independent employee image data associated with the specified second employee image feature matrix.
5. A method as defined in claim 2, wherein said determining candidate employee count parameters associated with the specified employee image data comprises:
Determining a plurality of second employee image feature matrices associated with the designated employee image data;
Determining candidate staff quantity parameters associated with the designated staff image data based on the plurality of second staff image feature matrices, wherein the candidate staff quantity parameters are used for indicating a plurality of reference feature matrices associated with unique staff identifications in the plurality of second staff image feature matrices, and the designated reference feature matrices are used for indicating staff personal feature information of the staff identifications associated with the designated reference feature matrices;
The determining the plurality of first employee images associated with the designated employee image data according to the candidate employee quantity parameter includes:
Sequentially setting the multiple reference feature matrixes as the appointed reference feature matrix, and determining a first employee image associated with the appointed reference feature matrix based on the appointed reference feature matrix and the appointed employee image data, wherein the first employee image associated with the appointed reference feature matrix is employee image data associated with the employee identification associated with the appointed reference feature matrix;
the determining a first employee image associated with the specified reference feature matrix based on the specified reference feature matrix and the specified employee image data includes:
determining a designated employee image feature matrix associated with the designated employee image data;
Identifying a first employee image feature matrix associated with the specified reference feature matrix from the specified employee image feature matrices based on the specified reference feature matrix, wherein a distance index between the first employee image feature matrix and the specified reference feature matrix is higher than a second index threshold;
Determining a first employee image associated with the specified reference feature matrix based on the first employee image feature matrix associated with the specified reference feature matrix;
the determining the first employee image associated with the specified reference feature matrix based on the first employee image feature matrix associated with the specified reference feature matrix comprises:
Taking the first employee image feature matrixes respectively associated with the reference feature matrixes except the appointed reference feature matrix in the plurality of reference feature matrixes as a plurality of comparison employee image feature matrixes, and identifying the first feature matrixes associated with the appointed reference feature matrix from the first employee image feature matrixes associated with the appointed reference feature matrixes based on the plurality of comparison employee image feature matrixes, wherein the distance indexes between the first feature matrixes and the plurality of comparison employee image feature matrixes are lower than a third index threshold;
determining a first employee image associated with the specified reference feature matrix based on the first feature matrix associated with the specified reference feature matrix;
the original image splitting unit is further configured to:
Identifying a second feature matrix associated with the specified reference feature matrix from the first employee image feature matrix associated with the specified reference feature matrix based on the specified reference feature matrix, wherein a distance index between the second feature matrix and the specified reference feature matrix is higher than a fourth index threshold, and the fourth index threshold is higher than the second index threshold;
the determining the first employee image associated with the specified reference feature matrix based on the first feature matrix associated with the specified reference feature matrix comprises:
Combining the first feature matrix associated with the appointed reference feature matrix and the second feature matrix associated with the appointed reference feature matrix to generate an image feature matrix associated with the appointed reference feature matrix;
and determining a first employee image associated with the specified reference feature matrix based on the image feature matrix associated with the specified reference feature matrix.
6. A method as defined in claim 2, wherein the original employee image splitting model further includes an original feature engineering unit to identify a designated employee image feature matrix associated with the designated employee image data, the determining, with an original employee quantity analysis unit in the original employee image splitting model, a candidate employee quantity parameter associated with the designated employee image data, and determining, with an original image splitting unit in the original employee image splitting model, a plurality of first employee images associated with the designated employee image data in accordance with the candidate employee quantity parameter, comprising:
Determining candidate staff quantity parameters associated with the specified staff image data based on the specified staff image feature matrix by using an original staff quantity analysis unit in an original staff image splitting model, and determining a plurality of first staff images associated with the specified staff image data according to the candidate staff quantity parameters and the specified staff image feature matrix by using an original image splitting unit in the original staff image splitting model.
7. A method as defined in claim 1, wherein the working data acquisition model includes a behavior recognition unit, and the inputting the independent employee image data associated with a first employee into the working data acquisition model obtained by training in advance to obtain the working data of the first employee comprises:
Sampling the independent employee image data corresponding to the first employee to obtain a first image sequence;
Inputting each employee image frame in the first image sequence into the behavior recognition unit to obtain a behavior data label sequence of the first employee, wherein the behavior data labels in the behavior data label sequence are associated one to one with the employee image frames in the first image sequence;
Generating the working data according to the behavior data labels associated with each employee image frame.
8. The method of claim 7, wherein the comparing the working data of the first employee based on the preset assessment index to obtain the assessment score of the first employee comprises:
Comparing the quantity distribution information of the behavior data labels in the behavior data label sequence with the target quantity distribution corresponding to the preset assessment index to obtain the assessment score of the first employee.
9. A working data acquisition system based on image data analysis, characterized by comprising a server for performing the method of any one of claims 1-8.
10. An electronic device, comprising:
a storage device having a computer program stored thereon;
processing means for executing said computer program in said storage means to carry out the steps of the method according to any one of claims 1-8.