
WO2022092032A1 - Behavior estimation device, behavior estimation method, program, and behavior estimation system - Google Patents

Behavior estimation device, behavior estimation method, program, and behavior estimation system

Info

Publication number
WO2022092032A1
Authority
WO
WIPO (PCT)
Prior art keywords
subject
person
unit
relationship
behavior
Prior art date
Application number
PCT/JP2021/039325
Other languages
English (en)
Japanese (ja)
Inventor
Ryo Tanaka (田中 遼)
Original Assignee
Vaak Inc. (株式会社Vaak)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vaak Inc. (株式会社Vaak)
Publication of WO2022092032A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/20 Analysis of motion

Definitions

  • The present invention relates to a behavior estimation device, a behavior estimation method, a program, and a behavior estimation system for estimating a person's behavior.
  • Because a conventional behavior estimation device estimates behavior based only on the movement of the subject, it has difficulty discriminating between behaviors whose movements are similar.
  • The present invention has been made in view of these points, and its object is to improve the accuracy of estimating human behavior.
  • The behavior estimation device includes an image acquisition unit that acquires captured image data representing a captured image, a person detection unit that detects the movement of a person included in the captured image, a subject detection unit that detects the type of a subject included in the captured image, a relationship specifying unit that specifies the relative relationship between the person and the subject arising from the person's movement, and an estimation unit that estimates the behavior of the person based on the combination of the type of the subject and the relative relationship.
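As an illustration only (it is not part of the patent text), the following minimal Python sketch shows the unit structure just described; all names, strings, and the toy database are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    person_motion: str      # output of the person detection unit, e.g. "put"
    subject_type: str       # output of the subject detection unit, e.g. "canned beverage"
    relative_relation: str  # output of the relationship specifying unit

# Hypothetical action content database: (subject type, relative relationship) -> behavior
ACTION_DB = {
    ("canned beverage", "hand near handbag"): "putting an item in a bag",
    ("canned beverage", "hand near product basket"): "shopping",
}

def estimate_behavior(d: Detection) -> str:
    """Estimation unit: combine the subject type with the relative relationship."""
    return ACTION_DB.get((d.subject_type, d.relative_relation), "unknown")

print(estimate_behavior(Detection("put", "canned beverage", "hand near handbag")))
```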
  • The estimation unit may estimate the behavior of the person detected by the person detection unit by identifying, in an action content database, the reference action content associated with the reference subject corresponding to the type of the subject and with the reference relative relationship corresponding to the relative relationship; that is, the reference action content associated with the combination of the reference subject and the reference relative relationship may be estimated as the behavior of the person.
  • The relationship specifying unit may specify the distance between the person and the subject as the relative relationship, and the estimation unit may estimate the behavior based on the result of comparing the reference distance associated with the type of the subject detected by the subject detection unit with the distance between the person and the subject specified by the relationship specifying unit.
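A hedged sketch of this distance comparison; the per-type reference distances are invented for illustration.

```python
import math

# Hypothetical reference distances (in metres) per subject type
REFERENCE_DISTANCE = {"product shelf": 0.5, "canned beverage": 0.1}

def within_reference(person_pos, subject_pos, subject_type) -> bool:
    """Compare the person-subject distance against the reference distance for that subject type."""
    return math.dist(person_pos, subject_pos) <= REFERENCE_DISTANCE.get(subject_type, float("inf"))

print(within_reference((0.0, 0.0), (0.05, 0.05), "canned beverage"))  # True: within 0.1 m
```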
  • The relationship specifying unit may specify the relationship between the direction of the person's line of sight and the position of the subject as the relative relationship, and the estimation unit may estimate the behavior based on the result of comparing the reference line-of-sight relationship (the relationship between the direction of the line of sight and the position of the subject) associated with the type of the subject detected by the subject detection unit with the relationship between the direction of the person's line of sight and the position of the subject specified by the relationship specifying unit.
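One plausible way to express the gaze relationship is the angle between the gaze direction and the direction that looks straight at the subject; the threshold A2 below is a made-up value.

```python
import math

def gaze_direction_difference(gaze_vec, eye_pos, subject_pos) -> float:
    """Angle in degrees between the gaze direction and the direct line of sight to the subject."""
    to_subject = (subject_pos[0] - eye_pos[0], subject_pos[1] - eye_pos[1])
    dot = gaze_vec[0] * to_subject[0] + gaze_vec[1] * to_subject[1]
    norm = math.hypot(*gaze_vec) * math.hypot(*to_subject)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

A2 = 15.0  # hypothetical reference: "direction difference less than A2"
print(gaze_direction_difference((1, 0), (0, 0), (2, 0.3)) < A2)  # True: roughly looking at it
```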
  • The relationship specifying unit may specify the relationship between the orientation of the person's body and the position of the subject as the relative relationship, and the estimation unit may estimate the behavior based on the result of comparing the reference body orientation relationship (the relationship between the orientation of the body and the position of the subject) associated with the type of the subject detected by the subject detection unit with the relationship between the orientation of the person's body and the position of the subject specified by the relationship specifying unit.
  • The image acquisition unit may acquire a plurality of captured image data created in time series, the relationship specifying unit may identify a change in the relative relationship, and the estimation unit may estimate the behavior of the person based on the combination of the type of the subject and the change in the relative relationship.
  • The estimation unit may further estimate the behavior based on the result of determining whether or not the subject was included in any of the plurality of captured image data before the person detection unit detected the person.
  • The person detection unit may further detect an attribute of the person, and the estimation unit may estimate the behavior based on the attribute of the person detected by the person detection unit.
  • The subject detection unit may detect a first subject and a second subject included in the captured image, and the relationship specifying unit may specify a first relative relationship between the person and the first subject arising from the person's movement, a second relative relationship between the person and the second subject, and a third relative relationship between the first subject and the second subject. The estimation unit may then estimate the behavior of the person based on the combination of the type of the subject and at least two of the first relative relationship, the second relative relationship, and the third relative relationship.
  • A setting reception unit that accepts the setting of an attribute of a subject included in the captured image may further be provided, and the estimation unit may estimate the behavior based on the attribute of the subject.
  • The relationship specifying unit may further specify the positional relationship between the subject detected by the subject detection unit and the subject for which the setting reception unit has received the attribute setting, and the estimation unit may estimate the behavior based on the positional relationship specified by the relationship specifying unit and the attribute of the subject for which the setting reception unit has received the setting.
  • The output unit may output notification information of the type corresponding to the combination of the location specified by the location specifying unit and the behavior.
  • The estimation unit may estimate the behavior further based on at least one of a change in the color of the person or a change in the color of the subject specified by the color specifying unit.
  • The image acquisition unit may acquire a plurality of captured image data, the relationship specifying unit may specify a plurality of relative relationships, and the estimation unit may estimate the behavior of the person based on the combination of each of the plurality of relative relationships with the type of the subject.
  • The relationship specifying unit may specify a first relative relationship and a second relative relationship different from the first relative relationship, and the estimation unit may estimate the behavior of the person based on the combination of the first relative relationship, a change in the second relative relationship identified from a plurality of second relative relationships specified at a plurality of times, and the type of the subject.
  • The behavior estimation method includes steps, executed by a computer, of acquiring captured image data representing a captured image, detecting the movement of a person included in the captured image, detecting the type of a subject included in the captured image, specifying the relative relationship between the person and the subject arising from the person's movement, and estimating the behavior of the person based on the combination of the type of the subject and the relative relationship.
  • The program according to the third aspect of the present invention causes a computer to execute steps of acquiring captured image data representing a captured image, detecting the movement of a person included in the captured image, detecting the type of a subject included in the captured image, specifying the relative relationship between the person and the subject arising from the person's movement, and estimating the behavior of the person based on the combination of the type of the subject and the relative relationship.
  • The behavior estimation system includes a behavior estimation device that estimates a person's behavior based on an image captured by an image pickup device, and an information terminal that displays the behavior of the person estimated by the behavior estimation device. The behavior estimation device includes an image acquisition unit that acquires captured image data representing the captured image, a person detection unit that detects the movement of a person included in the captured image, a subject detection unit that detects the type of a subject included in the captured image, a relationship specifying unit that specifies the relative relationship between the person and the subject arising from the person's movement, an estimation unit that estimates the behavior of the person based on the combination of the type of the subject and the relative relationship, and a transmission unit that transmits behavior content information indicating the behavior estimated by the estimation unit to the information terminal. The information terminal includes an information reception unit that receives the behavior content information, and a display unit that displays the behavior based on the behavior content information received by the information reception unit.
  • The present invention has the effect of improving the accuracy of estimating human behavior.
  • FIG. 1 is a diagram for explaining an outline of the behavior estimation system S according to the present embodiment.
  • FIG. 2 is a diagram showing a configuration of the behavior estimation system S.
  • The behavior estimation system S includes an image pickup device 1, an information terminal 2, and a behavior estimation device 10. As shown in FIG. 2, the image pickup device 1, the information terminal 2, and the behavior estimation device 10 are connected via a network N.
  • The network N is, for example, an intranet or the Internet, and may include a wireless network.
  • FIG. 1A is a view of the space R from above, and FIG. 1B is a view of the space R from the side.
  • The space R is a space in which objects are placed, such as a store sales floor, a store warehouse, a factory warehouse, or a facility storage space.
  • A plurality of shelves T (T1 to T4) are installed in the space R, and a plurality of objects P (P1 to P3) are placed on the shelves T1 to T3, respectively.
  • A person M is staying in the space R and carries a bag B in which an object P can be put.
  • The behavior estimation system S includes a behavior estimation device 10 that estimates the behavior of the person M based on the captured images generated by one or more image pickup devices 1 installed in the space R.
  • The image pickup device 1 is, for example, a surveillance camera, and captures an area for estimating a person's behavior. It transmits the captured image data generated by taking images to the behavior estimation device 10, and may generate moving image data as the captured image data.
  • The moving image data is encoded by a compression standard such as H.264 or H.265.
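As a side note not taken from the patent, footage encoded this way can typically be decoded frame by frame for the units described below, for example with OpenCV; the file name here is illustrative.

```python
import cv2  # pip install opencv-python

cap = cv2.VideoCapture("camera_feed.mp4")  # hypothetical H.264/H.265 file from the image pickup device
while True:
    ok, frame = cap.read()  # one captured image per iteration
    if not ok:
        break
    # ... pass `frame` to the person detection and subject detection units ...
cap.release()
```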
  • The behavior estimation device 10 estimates the behavior of the person M based on the captured image generated by the image pickup device 1, and transmits behavior content information indicating the estimated behavior to the information terminal 2.
  • The behavior estimation device 10 is an information processing device, for example a computer, that estimates the behavior of the person M based on the relative relationship between the person M and a subject included in the captured image. The subject, which is an object or a person, is, for example, one of the shelves T, one of the objects P, the bag B, or another person.
  • The behavior estimation device 10 may be a single information processing device, or may be composed of a plurality of information processing devices such as a cloud-type server.
  • The behavior estimation device 10 specifies the relative relationship between the person M and the subject, and estimates the behavior of the person M based on the combination of the specified relative relationship and the type of the subject.
  • The relative relationship is, for example, the distance between the person M and an object P, the relationship between the direction of the line of sight of the person M and the position of the object P, or the relationship between the orientation of the body of the person M and the position of the object P.
  • The type of the subject is represented by information that can identify the subject based on at least one of its shape or color, such as "canned beverage", "knife", or "product basket". In the present embodiment, the type of the subject may be represented by a product name for the sake of simplicity.
  • For example, the behavior estimation device 10 specifies, based on the distance between the hand of the person M and each subject, the relative relationships "the hand of the person M is in close proximity to the object P2" and "the hand of the person M is in close proximity to the bag B". Subsequently, when the detected type of the object P2 is "canned beverage" and the detected type of the bag B is "handbag", the behavior estimation system S specifies, from the combination of the specified relative relationships with the subject types (the canned beverage and the handbag), that "the person M put the canned beverage in the handbag", and presumes that the behavior of the person M is putting an item in his or her bag.
  • Similarly, when the behavior estimation device 10 specifies, from the combination of the relative relationships with the subject types (the canned beverage and the product basket), that "the person M put the canned beverage in the product basket", it presumes that the behavior of the person M is shopping.
  • The behavior estimation device 10 transmits the estimated result to the information terminal 2.
  • The information terminal 2 is, for example, a smartphone, a tablet, or a personal computer, and is used by, for example, a store clerk who monitors the sales floor of a store or an employee who manages the warehouse of a factory (hereinafter referred to as the "manager").
  • The information terminal 2 displays the behavior of the person estimated by the behavior estimation device 10 on its display. The information terminal 2 may acquire behavior content information including the estimated behavior of the person and the captured image in which the person appears, and may display that captured image in association with the behavior of the person shown in it.
  • By estimating the behavior of the person M based on the combination of the relative relationship between the person M and the subject and the type of the subject, the behavior estimation device 10 can discriminate behaviors with similar movements with high accuracy. The behavior estimation device 10 then notifies the information terminal 2 that a preset action (for example, a shoplifting action) has been detected, so that the manager using the information terminal 2 can grasp with high accuracy that the preset action has occurred.
  • The manager of the information terminal 2 can set various information used by the behavior estimation device 10 to estimate a person's behavior. The information terminal 2 transmits setting information, which is information used for estimating a person's behavior, to the behavior estimation device 10 in response to the manager's operation.
  • The setting information includes, for example, at least one of subject attribute information or subject type information.
  • The subject attribute information is information in which the position or shape of a subject is associated with the subject's attribute. A subject attribute is information indicating a characteristic of the subject, such as its owner or its function. For example, the attribute information of the shelf T1 shown in FIG. 1 is information in which the position or shape of the shelf T1 is associated with "store", which indicates the attribute of the shelf T1.
  • The subject type information is information in which the shape of a subject and the type of the subject are associated with each other. For example, the subject type information associates "canned beverage", which indicates the type of the subject, with the shape of a canned beverage.
  • FIG. 3 is a diagram for explaining the configuration of the information terminal 2.
  • The information terminal 2 has a communication unit 21, a storage unit 22, a control unit 23, a display unit 24, and an operation unit 25. The control unit 23 has an information transmission unit 231 and an information reception unit 232.
  • The communication unit 21 includes a communication device for transmitting and receiving information via the network N. The communication device is, for example, a LAN (Local Area Network) controller or a wireless LAN controller.
  • The storage unit 22 has a storage medium such as a ROM (Read Only Memory), a RAM (Random Access Memory), or an SSD (Solid State Drive), and stores the programs executed by the control unit 23.
  • The control unit 23 is, for example, a CPU (Central Processing Unit), and functions as the information transmission unit 231 and the information reception unit 232 by executing a program stored in the storage unit 22.
  • The information transmission unit 231 transmits the setting information for the behavior estimation device 10 created by the manager who uses the information terminal 2 to the behavior estimation device 10 via the communication unit 21.
  • The information reception unit 232 receives the behavior content information transmitted by the behavior estimation device 10 via the communication unit 21.
  • The display unit 24 has a display device such as a liquid crystal display or an organic EL (electroluminescence) display, and displays the behavior based on the behavior content information received by the information reception unit 232. For example, the display unit 24 extracts the behavior of the person and the captured image in which the person appears from the received behavior content information, displays the extracted captured image, and displays the behavior of the person in association with that image.
  • The operation unit 25 is a device, such as a keyboard, a mouse, or a touch panel, that accepts operations by the manager who uses the information terminal 2, and notifies the information transmission unit 231 of information indicating the content of the manager's operation.
  • FIG. 4 is a diagram for explaining the configuration of the behavior estimation device 10.
  • The behavior estimation device 10 has a communication unit 11, a storage unit 12, and a control unit 13. The control unit 13 includes an image acquisition unit 131, a person detection unit 132, a subject detection unit 133, a setting reception unit 134, a relationship specifying unit 135, an estimation unit 136, a location specifying unit 137, and an output unit 138.
  • The communication unit 11 includes a communication device, such as a LAN controller or a wireless LAN controller, for transmitting and receiving information via the network N.
  • The storage unit 12 has a storage medium such as a ROM, a RAM, or an SSD, and stores the programs executed by the control unit 13 as well as the setting information acquired from the information terminal 2.
  • The storage unit 12 also stores, in databases, various information used by the estimation unit 136 to estimate a person's behavior. For example, the storage unit 12 stores an action content database in which a person's behavior, a person's movement, a subject type, and the relative relationship between the person and the subject are associated with each other. The subject is an object or a person.
  • Specifically, the storage unit 12 stores an action content database in which one or more reference subjects other than a reference person, the reference relative relationship between the reference subject and the reference person, and the reference action content of the reference person are associated with each other.
  • The reference person, the reference subject, the reference relative relationship, and the reference action content are, respectively, information indicating the person, the subject, the relative relationship between the subject and the person, and the behavior of the person that the estimation unit 136 uses as references for estimating a person's behavior. Their details will be described later.
  • FIG. 5 is a diagram showing an example of an action content database.
  • In the action content database, a person's behavior is associated with a plurality of requirements for identifying that behavior. The requirements are the attribute of the person whose behavior is being estimated, the movement of the person, the types of the subjects ("subject 1" and "subject 2" shown in FIG. 5), and the relative relationships between the person and the subjects.
  • The attribute of a person indicates the position or role of the person in the space in which behavior is estimated, for example "purchaser" or "employee".
  • The estimation unit 136 identifies the behavior of the person included in the captured image by identifying the behavior corresponding to the combination of requirements included in the action content database.
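Purely as an illustration of the kind of lookup FIG. 5 implies (all field names, rows, and threshold labels here are hypothetical):

```python
# Hypothetical rows of the action content database of FIG. 5:
# requirements (attribute, movement, subject 1, subject 2, reference relative relationships) -> behavior
ACTION_DB = [
    ({"attribute": "purchaser", "movement": ("take", "put"),
      "subject1": "product", "subject2": "bag",
      "relations": ("hand-subject2 distance < D1", "subject1-subject2 distance < D2")},
     "shoplifting"),
    ({"attribute": "purchaser", "movement": ("take", "put"),
      "subject1": "product", "subject2": "product basket",
      "relations": ("hand-subject2 distance < D1", "subject1-subject2 distance < D2")},
     "shopping"),
]

def lookup(requirements: dict) -> str:
    """Estimation unit 136 sketch: return the behavior whose requirements all match."""
    for row, behavior in ACTION_DB:
        if all(requirements.get(key) == value for key, value in row.items()):
            return behavior
    return "unknown"
```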
  • The control unit 13 is, for example, a CPU. By executing the programs stored in the storage unit 12, the control unit 13 functions as the image acquisition unit 131, the person detection unit 132, the subject detection unit 133, the setting reception unit 134, the relationship specifying unit 135, the estimation unit 136, the location specifying unit 137, and the output unit 138.
  • The image acquisition unit 131 acquires captured image data representing a captured image. Specifically, it acquires, via the communication unit 11, the captured image data generated by the image pickup device 1, and it may acquire a plurality of captured image data created in time series.
  • The person detection unit 132 detects the movement of a person included in the captured image. The movement of a person is, for example, a movement such as those described in the "movement of a person" column shown in FIG. 5.
  • The person detection unit 132 detects the position of the person included in the captured image, and then detects the movement of the person whose position it has detected, for example by identifying changes in the positions of the person's joints and in the direction of the person's line of sight. The person detection unit 132 outputs person detection information indicating the detected movement to the estimation unit 136.
  • The person detection unit 132 may further detect attributes of the person. For example, it detects a person's attribute by determining whether or not the person is an employee based on the pattern or color of the work clothes or uniform the person is wearing. The person detection unit 132 may also detect a person's attribute by determining whether the person is an employee based on the result of comparing employee features stored in the storage unit 12, such as the contour and shape of an employee's face, with the features of the person included in the captured image.
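A toy sketch of the uniform-color idea for attribute detection; the HSV range and threshold are invented, and a real system would likely use a trained classifier instead.

```python
import numpy as np

# Hypothetical HSV range for the store's uniform color
UNIFORM_LOW, UNIFORM_HIGH = np.array([100, 80, 80]), np.array([130, 255, 255])

def looks_like_employee(person_crop_hsv: np.ndarray, min_ratio: float = 0.3) -> bool:
    """Attribute detection sketch: fraction of uniform-colored pixels in the person's bounding box."""
    mask = ((person_crop_hsv >= UNIFORM_LOW) & (person_crop_hsv <= UNIFORM_HIGH)).all(axis=-1)
    return bool(mask.mean() >= min_ratio)

crop = np.full((10, 10, 3), [115, 200, 200])  # a crop that is mostly uniform-colored
print(looks_like_employee(crop))  # True
```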
  • The subject detection unit 133 detects the type of a subject included in the captured image. It detects the position of the subject included in the captured image, and then detects the type of the subject whose position it has detected.
  • For example, the subject detection unit 133 detects a subject included in the captured image based on the subject shapes included in the setting information stored in the storage unit 12, and detects the type of the subject based on the type associated with that shape. The subject detection unit 133 outputs subject detection information indicating the position and type of the detected subject to the estimation unit 136.
  • The subject detection unit 133 may detect a plurality of subjects included in the captured image, for example a first subject and a second subject. The first subject and the second subject are, for example, articles of the types described in the "subject 1" and "subject 2" columns shown in FIG. 5. In this case, the estimation unit 136 estimates the person's behavior based on the types of one or more of the detected subjects.
  • The setting reception unit 134 accepts the setting of attributes of subjects included in the captured image. Specifically, it receives the setting information, including the subject attribute information and the subject type information, transmitted by the information terminal 2. The setting reception unit 134 may store the received setting information, including the subject attribute settings, in the storage unit 12, or may output it to the subject detection unit 133.
  • The relationship specifying unit 135 specifies the relative relationship between the person and the subject arising from the person's movement, based on the person detection information output by the person detection unit 132 and the subject detection information output by the subject detection unit 133.
  • For example, the relationship specifying unit 135 specifies the relative relationship between the subject and the body part corresponding to the movement detected by the person detection unit 132: when the detected movement is "gaze", it specifies the relative relationship between the person's eyes and the subject, and when the detected movement is "take", it specifies the relative relationship between the person's hand and the subject.
  • The relationship specifying unit 135 notifies the estimation unit 136 of the specified relative relationship, which the estimation unit 136 uses to identify the person's behavior.
  • The relationship specifying unit 135 specifies, for example, the distance between the person and the subject as the relative relationship, based on the difference between the positions of the person's joints included in the person detection information and the position of the subject included in the subject detection information.
  • The relationship specifying unit 135 may specify the relationship between the direction of the person's line of sight and the position of the subject as the relative relationship. In that case, it specifies as the relative relationship the relationship between the gaze direction included in the person detection information and the subject position included in the subject detection information, for example the difference between the direction of the line of sight when the person looks directly at the subject and the gaze direction included in the person detection information.
  • The relationship specifying unit 135 may specify the relationship between the orientation of the person's body and the position of the subject as the relative relationship. For example, when the person's movement is "conversation", the relationship specifying unit 135 specifies as the relative relationship the relationship between the body orientation determined from the joint positions included in the person detection information and the subject position included in the subject detection information, for example the difference between the orientation of the body when the person faces the front of the subject and the orientation determined from the person's joint positions.
  • The relationship specifying unit 135 may specify a change in the relative relationship based on a plurality of specified relative relationships. For example, it specifies a plurality of relative relationships between the subject and the person's movements detected in a plurality of captured images created in time series, and identifies the change from those relative relationships.
  • For example, at time T the relationship specifying unit 135 specifies the relative relationship "the distance between the person and the subject is 10 cm", and at time T+1 it specifies the relative relationship "the distance between the person and the subject is 50 cm". In this case, based on the relative relationships specified at times T and T+1, the relationship specifying unit 135 specifies that "the person is moving away from the subject". By specifying changes in the relative relationship in this way, the relationship specifying unit 135 can specify the relative relationship with high accuracy.
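A minimal sketch of classifying such a change, using the 10 cm to 50 cm example above; the tolerance value is invented.

```python
def relation_change(dist_t: float, dist_t_plus_1: float, eps: float = 0.01) -> str:
    """Classify how a distance-type relative relationship changed between times T and T+1."""
    if dist_t_plus_1 > dist_t + eps:
        return "moving away from the subject"
    if dist_t_plus_1 < dist_t - eps:
        return "approaching the subject"
    return "unchanged"

print(relation_change(0.10, 0.50))  # "moving away from the subject"
```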
  • The relationship specifying unit 135 may specify a plurality of relative relationships when the captured image includes the person whose behavior is being estimated and a plurality of subjects. For example, it may specify a first relative relationship between the person and the first subject arising from the person's movement, a second relative relationship between the person and the second subject arising from the person's movement, and a third relative relationship between the first subject and the second subject arising from the person's movement.
  • The first relative relationship is, for example, the relative relationship between the person and subject 1 shown in FIG. 5, the second relative relationship is the relative relationship between the person and subject 2 shown in FIG. 5, and the third relative relationship is the relative relationship between subject 1 and subject 2.
  • For example, the relationship specifying unit 135 specifies the first relative relationship between a purchaser and a product, the second relative relationship between the purchaser and a bag, and the third relative relationship between the product and the bag. When the relationship specifying unit 135 notifies the estimation unit 136 of such a plurality of relative relationships, the estimation unit 136 can identify the person's behavior with improved accuracy.
  • The relationship specifying unit 135 may specify the positional relationship between a subject for which an attribute has been set and another subject among the plurality of subjects included in the captured image. For example, it further specifies the positional relationship between the subject detected by the subject detection unit 133 and the subject for which the setting reception unit 134 has received an attribute setting.
  • For example, a captured image of a store's sales floor includes an image of a fixedly installed product shelf and an image of products placed on that shelf. The relationship between the position of the product shelf and the position of a product is effective for presuming that a person is shoplifting, so the relationship specifying unit 135 specifies the change in the distance between the position of the product detected by the subject detection unit 133 and the position of the product shelf, the subject for which the setting reception unit 134 has received an attribute setting.
  • The estimation unit 136 estimates the person's behavior based on the combination of the type of the subject detected by the subject detection unit 133 and the relative relationship specified by the relationship specifying unit 135. For example, the estimation unit 136 estimates the behavior of the person detected by the person detection unit 132 by identifying, in the action content database, the reference action content associated with the reference subject corresponding to the subject type and with the reference relative relationship corresponding to the relative relationship.
  • Specifically, the estimation unit 136 searches the action content database stored in the storage unit 12 for the combination of the subject type and the relative relationship. When the database contains a combination of a reference subject and a reference relative relationship that is similar to, or matches, the combination of the detected subject type and the specified relative relationship, the estimation unit 136 identifies the reference action content associated with that combination as the person's behavior.
  • For example, the person detection unit 132 detects that the attribute of the person whose behavior is being estimated is "purchaser" and that the person's movement is "putting the product". The subject detection unit 133 detects the first subject "product" and the second subject "container". The relationship specifying unit 135 specifies the relative relationship from the distance between the person's hand and the first subject and the distance between the first subject and the second subject, for example as "the distance between the person's hand and the first subject and the distance between the first subject and the second subject are less than predetermined distances (for example, the distance D1 and the distance D2 shown in FIG. 5)".
  • By referring to the action content database shown in FIG. 5, the estimation unit 136 identifies the reference action content <"take" and "put"> associated with the combination of the reference subjects <"product" and "product basket"> corresponding to the first subject "product" and the second subject "container", and the reference relative relationships <"the distance between the person's hand and subject 2 is less than the distance D1" and "the distance between subject 1 and subject 2 is less than the distance D2"> corresponding to the relative relationships specified by the relationship specifying unit 135. Based on the identified reference action content, the estimation unit 136 estimates that the person's behavior is <"shopping" or "shoplifting"> shown in the behavior column of FIG. 5.
  • When the type of the second subject detected by the subject detection unit 133 is a "bag", the estimation unit 136 estimates the person's behavior as "shoplifting"; when it is a "product basket", the estimation unit 136 estimates the person's behavior as "shopping". In this way, even when people's movements are similar, the estimation unit 136 can estimate a person's behavior with high accuracy by using the type of the subject.
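A sketch of this discrimination with the FIG. 5-style thresholds; the numeric values of D1 and D2 are invented.

```python
D1, D2 = 0.10, 0.15  # hypothetical reference distances in metres ("distance D1", "distance D2")

def estimate(subject2_type: str, hand_to_subject2: float, subject1_to_subject2: float) -> str:
    """Shoplifting/shopping discrimination sketch; subject 1 is assumed to be a product."""
    if hand_to_subject2 < D1 and subject1_to_subject2 < D2:
        if subject2_type == "bag":
            return "shoplifting"
        if subject2_type == "product basket":
            return "shopping"
    return "unknown"

print(estimate("bag", 0.05, 0.08))             # shoplifting
print(estimate("product basket", 0.05, 0.08))  # shopping
```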
  • The estimation unit 136 may estimate the person's behavior based further on the relationship specifying unit 135 identifying the change in the relative relationship "the distance between the first subject and the second subject becomes shorter". By operating in this way, the estimation unit 136 can estimate the person's behavior with higher accuracy.
  • The estimation unit 136 may estimate the behavior based on the result of comparing the reference distance associated with the type of the subject detected by the subject detection unit 133 with the distance between the person and the subject specified by the relationship specifying unit 135. The reference distance is a distance corresponding to the characteristics of a behavior, such as "less than the distance A1" and "less than the distance B1" shown in FIG. 5, and may differ depending on the type of the subject.
  • For example, the relationship specifying unit 135 specifies the relative relationship between the person and the subject as "the distance between the person and the subject is the distance Y1". In that case, the estimation unit 136 compares the reference distance "less than the distance A1" shown in FIG. 5 with the distance Y1 specified by the relationship specifying unit 135. When the distance Y1 is less than the distance A1, the specified relative relationship matches "the distance between the person and subject 1 is less than the distance A1", one of the reference relative relationships associated with the reference behavior "viewing the product", so the estimation unit 136 determines that the person may be looking at the product.
  • The estimation unit 136 may estimate the behavior based on the result of comparing the reference line-of-sight relationship (the relationship between the direction of the line of sight and the position of the subject) associated with the type of the subject detected by the subject detection unit 133 with the relationship between the direction of the person's line of sight and the position of the subject specified by the relationship specifying unit 135.
  • The relationship between the direction of the line of sight and the position of the subject is expressed as the difference between the direction of the person's line of sight and the direction of the line of sight when the person looks directly at the subject. The reference line-of-sight relationship is a direction difference corresponding to the characteristics of a behavior, for example "direction difference less than A2" shown in FIG. 5.
  • For example, the relationship specifying unit 135 specifies the relative relationship between the person and the subject as "the relationship between the direction of the person's line of sight and the position of the subject is the direction difference Y2". The estimation unit 136 compares the reference line-of-sight relationship "direction difference less than A2" shown in FIG. 5 with the direction difference Y2 specified by the relationship specifying unit 135. When the direction difference Y2 is less than A2, the specified relative relationship matches "the direction difference between the person's line of sight and subject 1 is less than the direction difference A2", one of the reference relative relationships associated with the reference behavior "viewing the product", so the estimation unit 136 determines that the person may be looking at the product.
  • The estimation unit 136 may estimate that the person's behavior is "looking at a product" when both the distance between the person and the subject and the direction difference between the person's line of sight and the position of the subject satisfy the criteria. By estimating a person's behavior based on a plurality of relative relationships, the estimation unit 136 further improves the estimation accuracy.
  • The estimation unit 136 may estimate the behavior based on the result of comparing the reference body orientation relationship (the relationship between the orientation of the body and the position of the subject) associated with the type of the subject detected by the subject detection unit 133 with the relationship between the orientation of the person's body and the position of the subject specified by the relationship specifying unit 135.
  • The relationship between the orientation of the body and the position of the subject is expressed as the difference between the orientation of the person's body and the orientation of the body when the person faces the direction of the subject. The reference body orientation relationship is an orientation difference corresponding to the characteristics of a behavior, for example "less than the angle difference B2" shown in FIG. 5.
  • For example, the relationship specifying unit 135 specifies the relative relationship between the person and the subject as "the relationship between the orientation of the person's body and the position of the subject is the angle difference X1". The estimation unit 136 compares the reference body orientation relationship "less than the angle difference B2" shown in FIG. 5 with the angle difference X1 specified by the relationship specifying unit 135. When the angle difference X1 is less than B2, the specified relative relationship matches "the angle difference between the orientation of the person's body and subject 1 is less than the angle difference B2", one of the reference relative relationships associated with the reference behavior "talking to the employee", so the estimation unit 136 determines that the person may be talking to the employee.
  • The estimation unit 136 may estimate that the person's behavior is "talking with an employee" when both the distance between the person and the subject and the angle difference between the orientation of the person's body and the position of the subject satisfy the criteria. By estimating a person's behavior based on a plurality of relative relationships, the estimation unit 136 further improves the estimation accuracy.
  • The estimation unit 136 may estimate the person's behavior based on the combination of the type of the subject and the change in the relative relationship. For example, when a product picked up by a person staying in a store is put into a bag, the relationship specifying unit 135 specifies, at each of a plurality of times, the distance between the product as the first subject and the bag as the second subject as the relative relationship, and identifies the change "the distance between the first subject and the second subject becomes shorter".
  • The estimation unit 136 determines that the person's action is putting a product into a bag based on the combination of this change in the relative relationship with the subject types, the product and the bag, and estimates that the person's behavior is "shoplifting". By using the change in the relative relationship in this way, the estimation unit 136 can improve the accuracy of estimating a person's behavior.
  • When the estimation unit 136 estimates a person's behavior based on the combination of the type of the subject and the change in the relative relationship, it may estimate the behavior based further on the result of determining whether or not the subject was included in any of the plurality of captured image data before the person detection unit 132 detected the person.
  • For example, the estimation unit 136 determines whether an object shown in the captured image data created before the store opened matches the object shown in the captured image data created at the time the person detection unit 132 detects a person picking up an object placed on a shelf. If they match, the estimation unit 136 determines that the type of the object is a product, and estimates that the person's behavior is "shoplifting" when the person puts the object into a bag.
  • By operating in this way, when a customer misplaces a personal belonging on a store shelf, moves away, and returns after a while to pick it up, the estimation unit 136 can use the captured images from before the store opened to distinguish personal belongings from products taken from the shelf by a purchaser. As a result, the estimation unit 136 can avoid mistakenly estimating behavior other than shoplifting as "shoplifting".
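A minimal sketch of the pre-opening check; the object identifiers and the set membership test stand in for the image matching described above.

```python
def is_store_item(object_id: str, pre_opening_objects: set) -> bool:
    """Was this object already on the shelf before the store opened? (hypothetical IDs)"""
    return object_id in pre_opening_objects

pre_opening = {"can_001", "can_002"}          # objects detected before opening
print(is_store_item("can_001", pre_opening))  # True  -> a product; bagging it suggests shoplifting
print(is_store_item("phone_x", pre_opening))  # False -> likely a misplaced personal belonging
```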
  • The estimation unit 136 may estimate the person's behavior based on the combination of each of a plurality of relative relationships with the type of the subject. For example, the relationship specifying unit 135 specifies the distance between the person and a shelf as an example of the first relative relationship, and the distance between the person and a product as an example of the second relative relationship. The estimation unit 136 then estimates that the person's behavior is shoplifting based on the combination of the first relative relationship, the second relative relationship, and the subject types, the product and the bag.
  • The estimation unit 136 may also estimate the person's behavior based on the combination of the first relative relationship, a change in the second relative relationship, and the type of the subject. In this case, the relationship specifying unit 135 specifies a first relative relationship and a second relative relationship different from the first relative relationship, and the estimation unit 136 estimates the person's behavior based on the combination of the first relative relationship, the change in the second relative relationship identified from a plurality of second relative relationships specified at a plurality of times, and the type of the subject.
  • For example, the estimation unit 136 estimates that the person's behavior is shoplifting based on the combination of the first relative relationship "the distance between the person and the shelf is less than the reference distance", the change in the second relative relationship "the distance between the person and the product becomes shorter", and the subject types, the shelf and the product.
  • The estimation unit 136 may estimate the person's behavior based on a combination of a plurality of relative relationships, for example based on the combination of the type of the subject and at least two of the first relative relationship, the second relative relationship, and the third relative relationship.
  • For example, the subject detection unit 133 detects a product and a bag as the subjects, and the relationship specifying unit 135 specifies the relationship between the person and the product as the first relative relationship, the relationship between the person and the bag as the second relative relationship, and the relationship between the product and the bag as the third relative relationship. The estimation unit 136 estimates the person's behavior as "shoplifting" based on the subject types being a bag and a product, the distance between the person's hand and the bag being less than D1, and the distance between the bag and the product being less than D2.
  • Although this example estimates the person's behavior as "shoplifting" using the second relative relationship and the third relative relationship, the estimation unit 136 may further use the first relative relationship, "the person is holding the product", to estimate the person's behavior.
  • The estimation unit 136 may estimate the behavior based on the attribute of the person detected by the person detection unit 132. For example, when a person puts a product into a bag at the sales floor of a store, the estimation unit 136 estimates the behavior as "shoplifting" if the person's attribute is "purchaser", and as "product arrangement" if the person's attribute is "employee". By operating in this way, the estimation unit 136 can estimate behavior correctly even when people's movements are similar, as with "shoplifting" and "product arrangement", both of which involve picking up a product.
  • The estimation unit 136 may further estimate the behavior based on the attribute of the subject received by the setting reception unit 134. The subject attribute is information indicating a characteristic of the subject, such as its owner or its function. For example, when the setting reception unit 134 has accepted "property of the store" as the attribute of an object placed on a shelf installed in the store's warehouse, the estimation unit 136 presumes that the behavior of an employee who moves to pick up the object is "organizing goods or equipment". On the other hand, when the object does not have the attribute "property of the store", the estimation unit 136 estimates the behavior of the employee who picks it up as, for example, "moving personal belongings".
  • The estimation unit 136 may also estimate the behavior based on the positional relationship specified by the relationship specifying unit 135 and the attribute of the subject for which the setting reception unit 134 has received the setting. For example, when the subject specified by the relationship specifying unit 135 is a product shelf and the subject for which the setting reception unit 134 has received the setting is an object placed on that shelf, the estimation unit 136 determines, based on the positional relationship specified by the relationship specifying unit 135, that a person other than an employee is moving to put the object placed on the product shelf into a bag. Subsequently, the estimation unit 136 estimates that the behavior of the person other than the employee is shoplifting when the attribute of the object is "property of the store".
  • By operating in this way, even when a plurality of products and an object different from them (for example, a personal belonging) are placed in the same place, the estimation unit 136 can estimate a person's behavior while distinguishing the products from the other object.
  • The location specifying unit 137 identifies the location shown in the captured image, for example based on the subjects in the captured image, or based on a GPS (Global Positioning System) signal. The location specifying unit 137 notifies the output unit 138 of information indicating the identified location.
  • The output unit 138 functions as a transmission unit that transmits behavior content information indicating the behavior estimated by the estimation unit 136 to the information terminal 2.
  • The output unit 138 may output notification information when the behavior estimated by the estimation unit 136 and the location specified by the location specifying unit 137 satisfy a predetermined condition. The predetermined condition is that the user of the information terminal 2 should receive the notification information and take some action; the user is, for example, an employee or a security guard of the store when the image pickup device is installed in a store.
  • For example, when the estimated behavior is a preset abnormal action (such as shoplifting), the output unit 138 determines that the predetermined condition is satisfied and outputs the notification information. When the estimated behavior is "product arrangement" and the location specified by the location specifying unit 137 is the "store", the estimated behavior is normal employee behavior, so the output unit 138 determines that the predetermined condition is not satisfied and does not output the notification information. By operating in this way, the output unit 138 can output only the information required by the user of the information terminal 2.
  • The output unit 138 may output the type of notification information corresponding to the combination of the location specified by the location specifying unit 137 and the behavior. For example, if the location specified by the location specifying unit 137 is the "store sales floor" and the behavior estimated by the estimation unit 136 is "shoplifting", the output unit 138 outputs notification information indicating that shoplifting has occurred at the store sales floor.
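A sketch of the type-specific notification decision; the rule table and message strings are invented.

```python
# Hypothetical notification rules: (location, behavior) -> message, or None for no notification
NOTIFY_RULES = {
    ("store sales floor", "shoplifting"): "Shoplifting detected at the store sales floor",
    ("store sales floor", "product arrangement"): None,  # normal employee behavior
}

def notification(location: str, behavior: str):
    """Output unit 138 sketch: return type-specific notification info, or None."""
    return NOTIFY_RULES.get((location, behavior))

print(notification("store sales floor", "shoplifting"))
print(notification("store sales floor", "product arrangement"))
```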
  • FIG. 6 is a flowchart showing an example of an operation in which the behavior estimation device 10 estimates the behavior content of a person.
  • Here, an operation will be described in which the behavior estimation device 10 identifies a change in the relative relationship based on the relative relationship at time T1 and the relative relationship at time T2, and estimates the behavior content of the person using the identified change (a condensed code sketch of this loop follows the steps below).
  • the image acquisition unit 131 acquires the first captured image data generated by the image pickup device 1 at time T1 (S11).
  • the location specifying unit 137 specifies a location in the captured image indicated by the first captured image data acquired by the image acquisition unit 131.
  • the person detection unit 132 detects the position of a person included in the captured image indicated by the first captured image data (S12).
  • the subject detection unit 133 detects the position of the subject and the type of the subject in the captured image indicated by the first captured image data (S13).
  • The relationship specifying unit 135 specifies the relative relationship at time T1 based on the position of the person, the position of the subject, and the type of the subject (S14).
  • the image acquisition unit 131 acquires the second captured image data generated by the image pickup device 1 at time T2 (S15).
  • The person detection unit 132 detects the position of a person included in the captured image indicated by the second captured image data (S16).
  • the subject detection unit 133 detects the position of the subject and the type of the subject in the captured image indicated by the second captured image data (S17).
  • The relationship specifying unit 135 specifies the relative relationship at time T2 based on the position of the person, the position of the subject, and the type of the subject (S18).
  • the relationship specifying unit 135 specifies a change in the relative relationship based on the relative relationship at the time T1 specified in S14 and the relative relationship at the time T2 specified in S18 (S19).
  • the estimation unit 136 estimates the human behavior content based on the combination of the change in the relative relationship specified by the relationship specifying unit 135 and the type of the subject detected by the subject detecting unit 133 (S20).
  • the behavior estimation device 10 repeats the processes from S11 to S20 when the operation for terminating the process of estimating the human behavior content is not performed (NO in S21). When the operation for terminating the process of estimating the human behavior content is performed (YES in S21), the behavior estimation device 10 ends the process.
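  • The sketch below condenses steps S11 to S20 for one pair of frames, assuming stubbed detector outputs and a simple near/far relation; the coordinates, threshold, and behavior labels are illustrative assumptions.

```python
# Hypothetical sketch of the two-frame estimation: specify the person-subject
# relation at T1 and T2 (S14, S18), take its change (S19), and map the change
# together with the subject type to a behavior content (S20).
import math

def relation(person_pos, subject_pos, near=50.0):
    dist = math.hypot(person_pos[0] - subject_pos[0], person_pos[1] - subject_pos[1])
    return "near" if dist < near else "far"

def estimate(subject_type, relation_t1, relation_t2):
    change = (relation_t1, relation_t2)
    if subject_type == "product shelf" and change == ("far", "near"):
        return "approaching the shelf"
    if subject_type == "product shelf" and change == ("near", "far"):
        return "leaving the shelf"
    return "unknown"

person_t1, shelf = (0.0, 0.0), (100.0, 0.0)  # detections at T1 (S12, S13)
person_t2 = (80.0, 0.0)                      # detections at T2 (S16, S17)
r1 = relation(person_t1, shelf)              # S14 -> "far"
r2 = relation(person_t2, shelf)              # S18 -> "near"
print(estimate("product shelf", r1, r2))     # S19-S20 -> "approaching the shelf"
```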
  • FIG. 7 is a diagram showing a configuration of a behavior estimation device 10 that estimates a person's behavior content based on a change in the color of the person or the subject.
  • the behavior estimation device 10 shown in FIG. 7 is different from the behavior estimation device 10 shown in FIG. 4 in that it has a color specifying unit 139, and is the same in other respects.
  • the color specifying unit 139 specifies at least one of a change in the color of a person and a change in the color of a subject.
  • The color specifying unit 139 identifies a change in the color of the person or the subject included in the captured image acquired by the image acquisition unit 131, based on the position of the person detected by the person detection unit 132 or the position of the subject detected by the subject detection unit 133.
  • The estimation unit 136 may estimate the action content based on at least one of the change in the color of the person and the change in the color of the subject specified by the color specifying unit 139. For example, when a person is using a knife, the subject detection unit 133 detects the knife, which is the subject. The relationship specifying unit 135 specifies the relative relationship between the person and the knife as "the distance between the person and the subject is less than the reference distance". The color specifying unit 139 identifies, based on the position of the person and the position of the knife, that the person's hand and the knife have turned red. The estimation unit 136 then estimates that the person is injured based on the fact that the person's hand and the knife have turned red.
  • The relationship specifying unit 135 may also specify a change in the relative relationship between the knife, which is the subject, and another person. When it specifies that the distance between the knife and the other person changes greatly over time, the estimation unit 136 estimates the behavior of the person as dangerous behavior. Then, when the color specifying unit 139 identifies that the color of the knife or of the other person has changed to red, the estimation unit 136 estimates that the other person is injured.
  • Since the color specifying unit 139 identifies the change in the color of the person or the subject and the estimation unit 136 estimates the behavior of the person based on that color change, the behavior estimation device 10 can estimate behavior content that requires an urgent response, such as bloodshed. Further, the behavior estimation device 10 can estimate behavior content such as spilling paint by detecting, for example, that the color has changed to a color other than red. A minimal sketch of such a color check follows.
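  • This sketch assumes RGB frames held as NumPy arrays; the region, channel order, and threshold are assumptions for illustration.

```python
# Hypothetical sketch of the color specifying unit: compare the mean color of
# a detected region (e.g. around a person's hand) between two frames and flag
# a shift toward red. The delta threshold of 60 is an assumed value.
import numpy as np

def mean_color(frame: np.ndarray, box) -> np.ndarray:
    x0, y0, x1, y1 = box
    return frame[y0:y1, x0:x1].reshape(-1, 3).mean(axis=0)

def turned_red(before: np.ndarray, after: np.ndarray, box, delta=60.0) -> bool:
    return (mean_color(after, box) - mean_color(before, box))[0] > delta

frame_t1 = np.full((100, 100, 3), 128.0)
frame_t2 = frame_t1.copy()
frame_t2[40:60, 40:60] = [220.0, 30.0, 30.0]             # the region turns red
print(turned_red(frame_t1, frame_t2, (40, 40, 60, 60)))  # True -> possible injury
```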
  • the behavior estimation device 10 may detect a dangerous behavior that has occurred outdoors, for example, based on an image of a surveillance camera installed outdoors.
  • a dangerous act is an act of destroying an object such as a bench installed in a park.
  • For example, the relationship specifying unit 135 identifies the relative relationship between the person and the bench, which is the subject. When the distance between the person and the bench is equal to or greater than a threshold value, the estimation unit 136 specifies the behavior content of the person as "karate practice". The threshold value is a distance indicating the range in which a person moves in the practice of a martial art such as karate. On the other hand, when the distance between the person and the bench is less than the threshold value, the estimation unit 136 estimates that the behavior content of the person is "destroying the bench" (see the sketch after this passage).
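  • A minimal sketch of this threshold test; the 2.0 metre value and the labels are assumptions chosen only for illustration.

```python
# Hypothetical sketch: the same striking motion is classified by the person's
# distance from the bench relative to an assumed practice-range threshold.
PRACTICE_RANGE_M = 2.0  # assumed range a practitioner keeps from nearby objects

def classify_striking(person_bench_distance_m: float) -> str:
    if person_bench_distance_m >= PRACTICE_RANGE_M:
        return "karate practice"    # moving at a safe distance from the bench
    return "destroying the bench"   # striking within reach of the bench

print(classify_striking(3.5))  # karate practice
print(classify_striking(0.4))  # destroying the bench
```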
  • The behavior estimation device 10 may also estimate the behavior of a person entering a restricted area based on the image of a surveillance camera. For example, when a fence is installed at the boundary of the restricted area, the subject detection unit 133 detects the fence, which is the subject. When a person climbs over the fence to intrude, the person detection unit 132 detects the movement of the person climbing over the fence. The relationship specifying unit 135 specifies the relative relationship as "the person is climbing over the subject". The estimation unit 136 estimates that the behavior content of the person is "climbing over the fence" based on the combination of the fence, which is the type of the subject, and the relative relationship.
  • the subject detection unit 133 may detect the area where entry is prohibited as the type of subject.
  • the person detection unit 132 detects that the person is moving.
  • The relationship specifying unit 135 specifies the relative relationship as "the person is moving inside the subject" based on position information indicating the entry-prohibited area, which is the subject, and the position to which the person has moved.
  • the estimation unit 136 estimates that the content of a person's behavior is "invading an area where entry is prohibited".
  • In this way, the behavior estimation device 10 can detect that a person has entered the restricted area. A minimal sketch of such an area check follows.
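  • This sketch assumes the entry-prohibited area is given as a polygon and the person's tracked positions as points; all coordinates are illustrative.

```python
# Hypothetical sketch of intrusion detection: a ray-casting point-in-polygon
# test flags a person whose tracked position moves inside the prohibited area.
def point_in_polygon(pt, polygon) -> bool:
    x, y = pt
    inside = False
    for i in range(len(polygon)):
        x0, y0 = polygon[i]
        x1, y1 = polygon[(i + 1) % len(polygon)]
        if (y0 > y) != (y1 > y) and x < (x1 - x0) * (y - y0) / (y1 - y0) + x0:
            inside = not inside
    return inside

area = [(0, 0), (10, 0), (10, 10), (0, 10)]  # entry-prohibited area (the subject)
track = [(-2, 5), (1, 5), (4, 5)]            # the person's positions over time
if any(point_in_polygon(p, area) for p in track):
    print("invading an area where entry is prohibited")
```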
  • When the behavior estimation device 10 notifies the information terminal 2 that a person has invaded the no-entry area, the manager of the no-entry area can know that the person has invaded and can issue a warning to the person.
  • As described above, the behavior estimation device 10 includes the person detection unit 132 that detects the movement of a person included in the captured image, the subject detection unit 133 that detects the type of the subject included in the captured image, and the relationship specifying unit 135 that specifies the relative relationship, which is the relationship between the person and the subject accompanying the movement of the person. The estimation unit 136 then estimates the behavior content of the person based on the type of the subject detected by the subject detection unit 133 and the relative relationship specified by the relationship specifying unit 135.
  • Because the behavior estimation device 10 estimates the behavior content of a person using the type of the subject and the relative relationship between the person and the subject in this way, it can accurately estimate the behavior content of the person even when movements are similar.
  • By receiving the behavior content information indicating the behavior content of the person estimated by the behavior estimation device 10, the user of the information terminal 2 (for example, a store employee) can accurately know the abnormal behavior of a person. As a result, the user of the information terminal 2 can respond accurately to the behavior of the person.
  • 1 Image pickup device
  • 2 Information terminal
  • 10 Behavior estimation device
  • 11 Communication unit
  • 12 Storage unit
  • 13 Control unit
  • 21 Communication unit
  • 22 Storage unit
  • 23 Control unit
  • 24 Display unit
  • 25 Operation unit
  • 131 Image acquisition unit
  • 132 Person detection unit
  • 133 Subject detection unit
  • 134 Setting reception unit
  • 135 Relationship identification unit
  • 136 Estimating unit
  • 137 Location identification unit
  • 138 Output unit
  • 139 Color specification unit
  • 231 Information transmission unit
  • 232 Information reception unit


Abstract

A behavior estimation device (10) comprises: an image acquisition unit (131) for acquiring captured image data representing a captured image; a person detection unit (132) that detects a movement of a person included in the captured image; a subject detection unit (133) for detecting the type of a subject included in the captured image; a relationship specifying unit (135) for specifying a relative relationship, which is a relationship between the person and the subject accompanying the movement of the person; and an estimation unit (136) for estimating the behavior content of the person on the basis of a combination of the type of the subject and the relative relationship.
PCT/JP2021/039325 2020-10-30 2021-10-25 Behavior estimation device, behavior estimation method, program, and behavior estimation system WO2022092032A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-182477 2020-10-30
JP2020182477A JP6854959B1 (ja) Behavior estimation device, behavior estimation method, program, and behavior estimation system

Publications (1)

Publication Number Publication Date
WO2022092032A1 true WO2022092032A1 (fr) 2022-05-05

Family

ID=75267983

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/039325 WO2022092032A1 (fr) Behavior estimation device, behavior estimation method, program, and behavior estimation system

Country Status (2)

Country Link
JP (2) JP6854959B1 (fr)
WO (1) WO2022092032A1 (fr)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023275968A1 (fr) * 2021-06-29 2023-01-05 日本電信電話株式会社 Anomaly determination device, anomaly determination method, and anomaly determination program
WO2024004209A1 (fr) * 2022-07-01 2024-01-04 日本電信電話株式会社 Estimation device, learning device, method, and program

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015033576A1 (fr) * 2013-09-06 2015-03-12 日本電気株式会社 Security system, security method, and non-transitory computer-readable medium
WO2019171573A1 (fr) * 2018-03-09 2019-09-12 日本電気株式会社 Automatic checkout system, purchased product management method, and purchased product management program
JP2020053019A (ja) * 2018-07-16 2020-04-02 Accel Robotics Corp. Autonomous store tracking system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016201105A (ja) * 2015-04-07 2016-12-01 三菱電機株式会社 Information processing device and information processing method
JP2017033401A (ja) * 2015-08-04 2017-02-09 株式会社 impactTV Customer information collection device, customer information collection system, and customer information collection method


Also Published As

Publication number Publication date
JP6854959B1 (ja) 2021-04-07
JP2022072825A (ja) 2022-05-17
JP2022073882A (ja) 2022-05-17


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21886150

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21886150

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP