
WO2022253750A1 - SYSTEMS FOR INCENTIVIZING SOCIAL DISTANCING USING CONNECTED LIGHTING IoT INFRASTRUCTURE - Google Patents


Info

Publication number
WO2022253750A1
Authority
WO
WIPO (PCT)
Prior art keywords
data set
distancing
masking
area
person
Application number
PCT/EP2022/064600
Other languages
French (fr)
Inventor
Daksha Yadav
Manush Pragneshbhai PATEL
Abhishek MURTHY
Jaehan Koh
Original Assignee
Signify Holding B.V.
Application filed by Signify Holding B.V. filed Critical Signify Holding B.V.
Publication of WO2022253750A1 publication Critical patent/WO2022253750A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0631Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q10/06311Scheduling, planning or task assignment for a person or group
    • G06Q10/063114Status monitoring or status determination for a person or group
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0635Risk analysis of enterprise or organisation activities
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0207Discounts or incentives, e.g. coupons or rebates
    • G06Q30/0224Discounts or incentives, e.g. coupons or rebates based on user history
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q30/00Commerce
    • G06Q30/02Marketing; Price estimation or determination; Fundraising
    • G06Q30/0281Customer communication at a business location, e.g. providing product or service information, consulting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/10Services
    • G06Q50/22Social work or social welfare, e.g. community support activities or counselling services
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/18Status alarms
    • G08B21/22Status alarms responsive to presence or absence of persons
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B47/00Circuit arrangements for operating light sources in general, i.e. where the type of light source is not relevant
    • H05B47/10Controlling the light source
    • H05B47/17Operational modes, e.g. switching from manual to automatic mode or prohibiting specific operations
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02BCLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40Control techniques providing energy savings, e.g. smart controller or presence detection

Definitions

  • the present disclosure is directed generally to systems and methods for incentivizing social distancing using connected lighting Internet of Things (IoT) infrastructure.
  • IoT Internet of Things
  • Sites using connected lighting systems typically include a variety of IoT sensors as part of the connected lighting system infrastructure.
  • By using IoT sensors, the connected lighting system can anonymously track the distancing and mask wearing of occupants in an indoor or outdoor site.
  • the system may generate an anonymized unique identifier for the occupant and use the unique identifier to monitor the person’s distancing and masking within the site.
  • the system may monitor distancing of occupants by tracking the trajectory over time of the occupants within the site to determine proximity alerts and their associated intentionality. Based on the sensed level of intentionality of persons in the site, the system may generate distancing characteristics.
  • the system may also monitor masking of persons as they travel along their trajectory. If a person is unmasked, the system may detect a change in temperature around their mouth and generate a masking alert. Based on the duration of the masking alert, the system may generate masking characteristics.
  • the system may generate a social behavior score based on the distancing and masking characteristics, the pre-defined safety parameters of the site, and a reinforcement learning algorithm.
  • the system may transmit the generated social behavior score to an incentive generation subsystem.
  • the incentive generation subsystem may generate a reward based on the social behavior score, and may provide the reward to the occupant.
  • a connected lighting system for tracking and incentivizing social behavior.
  • the connected lighting system includes a controller.
  • the controller is communicatively coupled to a plurality of sensors.
  • the plurality of sensors are arranged with luminaires in an area.
  • the controller is configured to capture, via the plurality of sensors, a data set.
  • the data set corresponds to the area.
  • the controller is further configured to determine, using machine learning, whether an object in the area is a person.
  • the controller is further configured to generate, based on the data set, a unique identifier for each object in the area determined to be a person.
  • generating the unique identifiers includes: (1) determining, based on the data set, a location and a time stamp for each object in the area determined to be a person; and (2) generating, based on the locations and the time stamps, the unique identifiers for each object in the area determined to be a person.
  • the controller is further configured to determine, based on the data set, one or more distancing characteristics corresponding to each unique identifier.
  • determining the one or more distancing characteristics includes: (1) generating, based on the data set, a trajectory corresponding to each unique identifier; (2) generating, based on the trajectories, one or more proximity alerts, wherein each proximity alert corresponds to at least two of the unique identifiers; (3) generating, based on the trajectories, horizon data corresponding to each unique identifier; and (4) generating, based on the proximity alerts and the horizon data, the one or more distancing characteristics.
  • each of the one or more distancing characteristics includes a distancing characteristic severity score based on the trajectories and the horizon data.
  • Each of the trajectories includes a time data set, a location data set, and a velocity data set.
  • the horizon data is further generated based on the velocity data set.
  • the controller is further configured to determine, based on the data set, one or more masking characteristics corresponding to each unique identifier.
  • determining the one or more masking characteristics includes: (1) retrieving, from the plurality of sensors, a temperature data set; (2) identifying, based on a trajectory for each object determined to be a person, one or more mouth regions, wherein the trajectory includes a time data set, a location data set, and a velocity data set; (3) generating, based on the temperature data corresponding to the mouth regions, one or more masking alerts, wherein each masking alert corresponds to one of the unique identifiers, and wherein each of the masking alerts includes an alert duration; and (4) generating, based on the alert durations of the masking alerts, the one or more masking characteristics.
  • the temperature data set is retrieved from one or more multipixel thermopile (MPT) sensors.
  • MPT multipixel thermopile
  • Each of the one or more masking characteristics can include a masking characteristic severity score based on the alert durations.
  • the controller is further configured to generate, based on the one or more distancing characteristics, the one or more masking characteristics, and pre-defined safety parameters, one or more social behavior scores corresponding to each unique identifier.
  • generating the social behavior scores includes: (1) generating, based on the one or more distancing characteristics and the pre-defined safety parameters, one or more weighted distancing characteristics; (2) generating, based on the one or more masking characteristics and the pre-defined safety parameters, one or more weighted masking characteristics; and (3) generating the social behavior scores based on the one or more weighted distancing characteristics, the one or more weighted masking characteristics, and a reinforcement learning algorithm.
  • the reinforcement learning algorithm is based on area masking violations, area distancing violations, and public health data.
  • the controller is further configured to transmit the generated social behavior scores to an incentive generation subsystem.
  • the incentive generation subsystem includes a controller configured to generate, based on one of the one or more social behavior scores, a reward corresponding to one of the unique identifiers. According to an example, a value of the reward is proportional to the one of the one or more social behavior scores. According to another example, the controller of the incentive generation subsystem is further configured to notify the person corresponding to the one of the one or more unique identifiers of the reward via a personal device in wireless communication with the controller.
  • a method for tracking and incentivizing social behavior includes capturing, via a plurality of sensors arranged with luminaires of a connected lighting system covering an area, a data set corresponding to the area. The method further includes determining, using machine learning, whether an object in the area is a person. The method further includes generating, based on the data set, a unique identifier for each object in the area determined to be a person. The method further includes determining, based on the data set, one or more distancing characteristics corresponding to each unique identifier. The method further includes determining, based on the data set, one or more masking characteristics corresponding to each unique identifier.
  • the method further includes generating, based on the one or more distancing characteristics, the one or more masking characteristics, and pre-defined safety parameters, one or more social behavior scores corresponding to each unique identifier.
  • the method further includes transmitting the generated social behavior scores to an incentive generation subsystem.
  • a processor or controller may be associated with one or more storage media (generically referred to herein as “memory,” e.g., volatile and non-volatile computer memory such as RAM, PROM, EPROM, EEPROM, floppy disks, compact disks, optical disks, magnetic tape, SSD, etc.).
  • the storage media may be encoded with one or more programs that, when executed on one or more processors and/or controllers, perform at least some of the functions discussed herein.
  • program or “computer program” are used herein in a generic sense to refer to any type of computer code (e.g., software or microcode) that can be employed to program one or more processors or controllers.
  • Fig. 1 is a top-level schematic of a system for tracking and incentivizing social behavior, in accordance with an example.
  • Fig. 2 is a further top-level schematic of a system for incentivizing social distancing, in accordance with an example.
  • Fig. 3 is a trajectory map for determining proximity alerts, in accordance with an example.
  • Fig. 4 is a trajectory map for determining masking alerts, in accordance with an example.
  • Fig. 5 is a schematic diagram of a luminaire, in accordance with an example.
  • Fig. 6 is a schematic diagram of a controller of a connected lighting system, in accordance with an example.
  • Fig. 7 is a schematic diagram of an incentive generation subsystem wirelessly connected to personal devices, in accordance with an example.
  • Fig. 8 is a flowchart of a method for tracking and incentivizing social behavior, in accordance with an example.
  • the present disclosure is generally directed to systems and methods for incentivizing social distancing using connected lighting Internet of Things (IoT) infrastructure.
  • Sites using connected lighting systems typically include a variety of IoT sensors as part of the connected lighting system infrastructure.
  • the sites can be indoor (office spaces, residential buildings, shopping centers, schools, etc.), outdoor (parks, stadiums, etc.), or both.
  • IoT sensors can include multipixel thermopile (MPT) sensors, microphones, positioning sensors, red green blue (RGB) cameras, and others.
  • MPT multipixel thermopile
  • RGB red green blue
  • as occupants enter the site, the system may identify them as persons, rather than non-human objects, using machine learning algorithms.
  • the system may generate an anonymized unique identifier for the person.
  • the unique identifier may be a randomized number or character generated at least partly based on the timestamp the person entered the site, the location of entrance, as well as other data collected by the sensors.
  • the system may use the unique identifier to monitor and track the person’s distancing and masking within the site.
  • the system may monitor distancing by tracking the trajectory over time of persons within the site. For each person, the system may generate a trajectory which includes time, location, and velocity information. By comparing trajectories, the system can generate proximity alerts if persons are within a distancing threshold for a predetermined period of time. Further, the system can use the velocity information of the trajectories to determine the field-of-view, or “horizon” of persons for particular points in time. By evaluating the horizons and the velocity vectors of persons during a proximity alert, the system can determine the intentionality of the proximity alert-generating behavior.
  • the system can conclude that the proximity of the two persons was intentional. Based on the sensed level of intentionality of persons in the site, the system may generate distancing characteristics which can be representative of any distancing violations.
  • the distancing characteristics may include scores indicating the severity of the characteristics.
  • the system may also monitor masking of persons as they travel along their trajectory. At various points along the trajectory, the system may identify mouth regions of a person based on location and velocity information. The system may analyze temperature data of the mouth region. The temperature data can be collected by sensors such as MPT sensors. If the person is unmasked, the system may detect the change in temperature in their mouth region and generate a masking alert. Based on the duration of the masking alerts, the system may generate masking characteristics which can be representative of any masking violations. The masking characteristics may include scores indicating the severity of the violation.
  • the system may generate a social behavior score based on the distancing and masking characteristics.
  • the system may weigh certain characteristics more heavily than others, based on the pre-defined safety parameters of the site. For example, masking violations may be weighted more heavily than distancing violations.
  • the social behavior score can also be determined using a reinforcement learning algorithm. The reinforcement learning algorithm may be based on previously determined area violation data, as well as public health data.
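The weighting scheme described above can be sketched as follows. The weight values, 100-point scale, and linear penalty are illustrative assumptions; the disclosure specifies only that the characteristics are weighted according to pre-defined safety parameters (with masking violations possibly weighted more heavily than distancing violations) before a score is produced:

```python
def social_behavior_score(distancing_severities, masking_severities,
                          w_distancing=1.0, w_masking=2.0):
    """Combine violation severity scores into a single behavior score.

    Higher severities lower the score; the weights stand in for the
    pre-defined safety parameters of the site (masking weighted more
    heavily here). The 100-point scale is an illustrative assumption,
    and this fixed rule omits the reinforcement learning component
    described in the disclosure.
    """
    penalty = (w_distancing * sum(distancing_severities)
               + w_masking * sum(masking_severities))
    return max(0.0, 100.0 - penalty)
```

A person with no violations keeps the full score, while each recorded severity reduces it in proportion to its weight.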
  • the system may generate a reward based on the social behavior score.
  • the reward can be in a wide variety of forms depending on the site, such as coupons for grocery stores, discount codes for future events for stadiums, or even cryptocurrency.
  • the reward may be generally proportional to the social behavior score, meaning the higher the social behavior score, the greater the reward.
  • the reward, or a notification of the reward can be provided to a personal device of the occupant in wireless communication with the connected lighting system.
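The proportional reward generation can be sketched as a simple mapping. Proportionality between score and reward value is stated in the disclosure; the eligibility threshold, maximum value, and linear form are illustrative assumptions:

```python
def generate_reward(score, max_value=50.0, threshold=60.0):
    """Map a social behavior score (0-100) to a reward value.

    Scores below the eligibility threshold earn nothing; above it,
    the reward grows linearly toward max_value, so a higher social
    behavior score yields a greater reward. Threshold and maximum
    are illustrative assumptions.
    """
    if score < threshold:
        return 0.0
    return round(max_value * (score - threshold) / (100.0 - threshold), 2)
```

The resulting value could then be delivered as any of the reward forms mentioned above (coupons, discount codes, cryptocurrency) via the occupant's personal device.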
  • FIG. 1 shows an example connected lighting system 100 monitoring an area 200.
  • the area 200 includes a number of objects 202a-202f, including three persons 204a-204c.
  • the connected lighting system includes a controller 102, two luminaires 106a, 106b, a discrete MPT sensor 144c, and an incentive generation subsystem 300.
  • the connected lighting system 100 is configured to illuminate the area 200.
  • the area 200 can be indoor (office spaces, residential buildings, shopping centers, schools, etc.), outdoor (parks, stadiums, etc.), or a combination of both.
  • Non-person objects can include any type of object 202 found in these areas 200, such as furniture, electronics, building infrastructure, personal items, and more.
  • Each of the luminaires 106a, 106b of FIG. 1 include a sensor bundle 104a, 104b, such as an Advanced Sensor Bundle (ASB).
  • ASB Advanced Sensor Bundle
  • Connected lighting systems 100 can use ASBs, as well as discrete, individual sensors, to collect information regarding the area 200 being lit, and use this information to control the output of the light sources 146 of the luminaires 106a, 106b.
  • This information can also be used for a number of other purposes, such as, in this case, tracking and incentivizing social behavior.
  • the sensors 104a, 104b can be embedded within or arranged on the luminaires 106a, 106b, or they can be arranged separately, depending on the application.
  • sensor bundle 104a includes an MPT sensor 144a, a microphone 160a, and a positioning sensor 162.
  • sensor bundle 104b also includes an MPT sensor 144b, an RGB camera 166b, and a passive infrared (PIR) sensor 168b.
  • PIR passive infrared
  • Other types of sensors 104 may be used as required.
  • the sensors 104 can collect information at a wide range of sampling rates corresponding to the area 200. For example, if the area 200 is a busy grocery store with constantly moving traffic, the sensors 104 may capture data every second.
  • the sensors 104 could collect data much less frequently while still accurately representing the movement of persons 204 in the area 200.
  • the sampling rate of the sensors 104 could fluctuate based on expected behavior at certain times. For example, in the outdoor stadium example, the sampling rate could increase at times of expected movement, such as during stoppages in play, or immediately following the conclusion of the event.
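The activity-dependent sampling described above can be sketched as a lookup. The activity levels and interval values are illustrative assumptions; the disclosure states only that a busy area (e.g., a grocery store) may be sampled every second while quieter periods can be sampled less frequently:

```python
def sampling_interval_s(expected_activity):
    """Choose a sensor sampling interval based on expected occupant
    activity in the area.

    The three activity levels and their intervals are illustrative
    assumptions; a real deployment might schedule these by time of
    day or event phase (e.g., stoppages in play at a stadium).
    """
    intervals = {"high": 1.0, "medium": 5.0, "low": 30.0}
    return intervals.get(expected_activity, 5.0)
```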
  • the information collected by the sensors 104 may be compiled into a data set 108.
  • the data set 108 can be subdivided or classified by a number of differentiators, such as a type of data collected or the location of the sensors 104.
  • temperature information collected by the various MPT sensors 144 may be compiled as a temperature data set 136, a subset of the overall data set 108.
  • the example luminaire 106 includes a sensor bundle 104 with an MPT sensor 144, a microphone 160, a positioning sensor 162, an RGB camera 164, and a PIR sensor 166.
  • the luminaire 106 further includes a light source 146, such as one or more light emitting diodes (LEDs).
  • the luminaire 106 further includes a transceiver 410 to wirelessly communicate with the other aspects of the connected lighting system 100.
  • the luminaire 106 further includes a memory 177 to store the data set 108 captured by the sensor bundle 104.
  • the controller 102 of the connected lighting system is communicatively coupled to the luminaires 106a, 106b, the discrete MPT sensor 144c, and the incentive generation subsystem 300.
  • This coupling may be wired or wireless, depending on the application.
  • This coupling allows for data captured by the sensors 104 to be conveyed to the controller 102 for processing.
  • the controller 102 can then communicate with the incentive generation subsystem 300 to generate rewards 304 based on the data.
  • Another example of controller 102 is illustrated in FIG. 6.
  • the controller includes a processor 185, memory 175, and transceiver 420.
  • FIG. 1 further depicts person 204c moving from outside of the area 200 to inside the area 200.
  • person 204c moves into the area 200
  • information corresponding to his presence is captured by the various sensors 104a, 104b, 104c.
  • This information is compiled into data set 108.
  • the controller 102 processes the data set 108 using machine learning 160 to determine if the object 202c is actually a person 204, as opposed to a non-person object 202.
  • the machine learning 160 may consist of an algorithm trained by data corresponding to person 204 and non-person objects 202.
  • the algorithm can be trained by temperature data captured by MPT sensors 144. By analyzing the temperature data, controller 102 can look for heat patterns in the data 108 corresponding to a human heat signature.
  • the controller 102 can generate a heat pattern for an object 202 and compare the heat pattern to a threshold human heat signature (e.g., baseline human heat signature) to determine if object 202 is a person 204 or a non-person object 202.
  • a threshold human heat signature e.g., baseline human heat signature
  • the threshold human heat signature can include a single value or a range of values representing a human heat signature.
  • the threshold human heat signature can include predefined values, previous heat pattern measurements of one or more persons 204 or a combination of predefined values and previous heat pattern measurements of one or more persons 204. Other types of sensor data may be similarly used.
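The threshold comparison described above can be sketched with a simple range check. This fixed temperature band and coverage fraction are illustrative assumptions standing in for the trained machine-learning comparison against a baseline human heat signature:

```python
def is_person(heat_pattern, low_c=30.0, high_c=38.0, min_fraction=0.5):
    """Classify an object's heat pattern against a threshold human
    heat signature.

    heat_pattern is a list of temperature readings (deg C) for the
    object, e.g. from MPT sensor pixels. If enough of the pattern
    falls within a human skin-temperature band, the object is treated
    as a person. Band limits and fraction are illustrative assumptions.
    """
    if not heat_pattern:
        return False
    in_band = sum(1 for t in heat_pattern if low_c <= t <= high_c)
    return in_band / len(heat_pattern) >= min_fraction
```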
  • the person 204 is assigned a unique identifier 110.
  • the unique identifier 110 is used to anonymize the person 204 while the system 100 monitors their social behavior.
  • the unique identifier 110 is calculated based on a location 118 of the person 204 within the area, and the timestamp 120 of the person 204 at the location 118.
  • the unique identifier 110 can also be based on additional aspects of the data set 108 captured by the sensors 104. For example, a unique identifier 110 can be calculated based on equation 1 :
  • the unique identifier 110 value for person 204c is the result of t_i, L(x_i, y_i), and S_b[0] being processed by equation 1.
  • the unique identifier 110 can also be at least partly based on a randomly generated number or character for increased anonymization.
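The identifier generation described above can be sketched as follows. Equation 1 itself is not reproduced in this text, so hashing the timestamp, entry location, and an initial sensor sample together with an optional random salt is an illustrative assumption consistent with the stated inputs:

```python
import hashlib

def unique_identifier(timestamp, x, y, sensor_sample, salt=""):
    """Generate an anonymized unique identifier from the timestamp the
    person entered the site, the entry location (x, y), and a first
    sensor sample.

    SHA-256 hashing and truncation are illustrative assumptions; the
    salt models the randomly generated component for increased
    anonymization. No personally identifying data enters the hash.
    """
    payload = f"{timestamp}|{x:.3f},{y:.3f}|{sensor_sample}|{salt}"
    return hashlib.sha256(payload.encode()).hexdigest()[:16]
```

The same inputs always yield the same identifier, so the person can be tracked across zones without storing who they are.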
  • the area 200 monitored by the connected lighting system 100 can be split up into two or more zones 210 for more efficient monitoring. Each zone 210 is monitored by one or more sensors 104 to ensure coverage of the entire area 200.
  • the system 100 can identify persons 204 as they move across different zones, ensuring that captured information about their movements and masking will correspond to the correct unique identifier 110.
  • the area 200 includes zone 1 210a and zone 2 210b.
  • Zone 1 210a is monitored by sensor bundle 104d, while zone 2 210b is monitored by sensor bundle 104e. Persons 204a, 204d move from zone 1 210a to zone 2 210b.
  • the system 100 may recognize their movement from zone 1 210a to zone 2 210b based on their movement towards the boundary between the zones 210a, 210b. As such, the system 100 can “predict” when persons 204 will travel across the zones 210, and allocate the data collected by the corresponding sensors 104 accordingly.
  • an area 200 may include tens or hundreds of zones 210 of various shapes and sizes.
  • As persons 204 move through the area 200 and across various zones 210, the system 100 generates trajectories 122 corresponding to their movements over a period of time. The trajectories 122 are generated based on the data set 108 (or subsets of the data set 108). The system 100 can then analyze these trajectories 122 to determine if any of the persons 204 in the area 200 have violated social distancing guidelines.
  • FIG. 3 shows three trajectories 122; trajectory 122a corresponding to person 204a, trajectory 122b corresponding to person 204b, and trajectory 122c corresponding to person 204c.
  • Each trajectory 122 includes a time data set 130, a location data set 132, and a velocity data set 134.
  • a trajectory 122 may be defined by equation 2:
  • T[i][j] = (t_i^j, x_i^j, y_i^j, vx_i^j, vy_i^j) (2)
  • i is the sample number of the data
  • j corresponds to the unique identifier 110 for the person 204 being tracked
  • t is the timestamp at which the data of sample i was captured
  • x is the x-coordinate of the person 204 being tracked
  • y is the y-coordinate of the person being tracked
  • vx is the velocity of the person being tracked in the x-direction
  • vy is the velocity of the person being tracked in the y-direction.
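The trajectory samples of equation 2 can be represented and derived from raw position data as sketched below. Using finite differences of successive samples to estimate the velocity components is an illustrative assumption; the disclosure states only that each trajectory carries time, location, and velocity data sets:

```python
from typing import NamedTuple

class TrajectorySample(NamedTuple):
    """One sample T[i][j] of a trajectory, following equation 2."""
    t: float   # timestamp of sample i
    x: float   # x-coordinate of the tracked person
    y: float   # y-coordinate of the tracked person
    vx: float  # velocity in the x-direction
    vy: float  # velocity in the y-direction

def build_trajectory(samples):
    """Turn a list of raw (t, x, y) samples into trajectory samples,
    estimating velocity by finite differences between samples.

    The first sample has no predecessor, so its velocity is zero by
    convention (an illustrative assumption).
    """
    traj = []
    for k, (t, x, y) in enumerate(samples):
        if k == 0:
            vx = vy = 0.0
        else:
            t0, x0, y0 = samples[k - 1]
            dt = (t - t0) or 1e-9  # guard against duplicate timestamps
            vx, vy = (x - x0) / dt, (y - y0) / dt
        traj.append(TrajectorySample(t, x, y, vx, vy))
    return traj
```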
  • the system 100 can determine if two or more persons 204 are within a predefined distancing threshold for a predefined duration threshold.
  • the predefined distancing threshold and the predefined duration threshold may be set based on the area 200 and public health guidance. For example, if the area 200 is an indoor grocery store, the predefined distancing threshold can be set to 6 ft, and the predefined duration threshold can be set to 60 seconds. If two or more persons 204 are within the predefined distancing threshold for a time period exceeding the predefined duration threshold, a proximity alert 124 is generated.
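The threshold check above can be sketched for a pair of trajectories. Assuming two time-aligned lists of (t, x, y) samples is a simplification; the 6 ft and 60 second defaults come from the grocery-store example:

```python
import math

def proximity_alerts(traj_a, traj_b, dist_threshold=6.0,
                     duration_threshold=60.0):
    """Return (start, end) time spans where two time-aligned
    trajectories stay within the distancing threshold for longer
    than the duration threshold.

    traj_a and traj_b are equal-length lists of (t, x, y) tuples
    sampled at the same timestamps (an illustrative assumption).
    """
    alerts, start = [], None
    for (t, xa, ya), (_, xb, yb) in zip(traj_a, traj_b):
        close = math.hypot(xa - xb, ya - yb) <= dist_threshold
        if close and start is None:
            start = t               # start of a candidate violation
        elif not close and start is not None:
            if t - start >= duration_threshold:
                alerts.append((start, t))
            start = None
    if start is not None and traj_a:  # still close at the last sample
        end = traj_a[-1][0]
        if end - start >= duration_threshold:
            alerts.append((start, end))
    return alerts
```

Each returned span would be associated with the unique identifiers of both persons for further intentionality analysis.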
  • the system 100 may transmit the proximity alert 124 to a personal device 208 of the corresponding person 204 or persons 204 to notify the respective person 204 or persons 204 of a potential violation of the predefined distancing threshold or to limit or reduce the time period the persons 204 exceed the distancing threshold.
  • the system 100 can associate the personal device 208 of the user to the unique identifier 110 of the respective person 204 to identify the correct personal device 208 to transmit the proximity alert 124.
  • the proximity alert 124 is associated with the unique identifiers 110 of the two (or more) persons 204 triggering the alert. The system 100 then performs additional analyses to determine a distancing characteristic 112 associated with the proximity alert 124, including a severity score 128 representative of the intentionality of the violation.
  • the system 100 generates horizon data 126 for a trajectory 122.
  • the horizon data 126 represents the field of view of a person 204 at each point on the trajectory 122.
  • the field of view (or “horizon”) at each point i is determined based on the direction of the velocity vector (vx_i, vy_i) of the trajectory 122, as each person 204 is assumed to be looking in the direction of their movement.
  • the horizons 126 may include or correspond to a variety of different shapes (e.g., spherical, circular) and include a variety of different angles (e.g., horizontal field of view, vertical field of view) based at least in part on the field of view of the person 204.
  • the horizons 126 may also be conical if the sensors 104 are capable of collecting three-dimensional data.
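The horizon test can be sketched as a two-dimensional cone centered on the velocity vector. The 120-degree total field of view is an illustrative assumption; as noted above, other shapes and angles (or a three-dimensional cone) could be used:

```python
import math

def in_horizon(vx, vy, dx, dy, half_angle_deg=60.0):
    """Check whether a point at offset (dx, dy) from a person falls
    inside that person's horizon, modeled as a 2D cone centered on
    the velocity vector (vx, vy).

    The person is assumed to look in the direction of movement; a
    stationary person has no inferred gaze direction, so the check
    returns False (an illustrative assumption).
    """
    speed = math.hypot(vx, vy)
    dist = math.hypot(dx, dy)
    if speed == 0 or dist == 0:
        return False
    # cosine of the angle between the gaze direction and the offset
    cos_angle = (vx * dx + vy * dy) / (speed * dist)
    return cos_angle >= math.cos(math.radians(half_angle_deg))
```

Two horizons overlap (and the persons can see each other) when each person's position falls inside the other's cone.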
  • when two horizons 126 overlap, the system 100 determines that the persons 204 corresponding to the two trajectories 122 can see each other.
  • the velocity data 134 and the horizon data 126 are combined to analyze the two (or more) trajectories 122 triggering the proximity alert 124.
  • when two persons 204 are walking towards each other, their velocity vectors are heading towards the same point in space.
  • if the horizons 126 overlap, the persons 204 can see each other. Therefore, these conditions, along with the duration of the proximity alert 124, can be indicative of two persons 204 intentionally walking towards each other. This sensed intentionality can be reflected by the generated distancing characteristics 112 indicative of distancing violations, including the associated severity score 128. An intentional violation of distancing guidelines would generate a higher severity score 128 than an unintentional violation.
  • the distancing characteristics 112 associated with the proximity alert 124 would be assessed a high severity score 128.
  • Consider trajectories 122a and 122b of FIG. 3. The paths of the trajectories 122a and 122b generate proximity alert 124a. Prior to the proximity alert 124a, the trajectories 122a, 122b are travelling towards each other for a significant period of time. Further, the horizons 126a and 126b associated with the trajectories 122a, 122b overlap immediately before the proximity alert 124a. Additionally, the length of the proximity alert 124a is quite long. Accordingly, the distancing violation corresponding to the proximity alert 124a is considered to be highly intentional, and is assigned a high severity score 128.
  • the distancing characteristics 112 would be assessed a low severity score 128.
  • as a counterexample, consider trajectories 122b and 122c. Prior to the occurrence of the proximity alert 124b, the trajectories 122b, 122c are mostly travelling towards different points, often parallel to each other. The trajectories 122b, 122c are not directed towards each other until immediately before the occurrence of the proximity alert 124b, indicative of unintentionality. Similarly, the horizons 126b, 126c do not overlap, partially due to object 202b preventing the two persons 204b, 204c from seeing each other. Accordingly, the distancing violation corresponding to the proximity alert 124b is considered to be unintentional, and is assigned a low severity score 128. In some implementations, the distancing characteristics 112 associated with a low severity score 128 indicative of unintentionality may be omitted from further analysis.
  • in addition to monitoring the distancing of persons 204 in the area 200, the system 100 also monitors their masking.
  • the system 100 first uses the previously described trajectories 122 to determine a mouth region 138 for the monitored person 204 at a point in time.
  • the mouth region 138 of the person 204 is just in front of the person 204 as they move along the trajectory 122.
  • the location of the mouth region 138 can be determined based on the time data set 130, location data set, and velocity data set of the trajectory 122.
  • the system 100 can then analyze a portion of a temperature data set 136 corresponding to the mouth region 138 to determine if the person 204 is masked.
  • This temperature data set 136 can be captured by one or more MPT sensors 144.
  • the temperature of the mouth region 138 of an unmasked person 204 will be significantly higher and/or more variable than the mouth region of a masked person 204.
  • the temperature data sets 136 corresponding to the mouth regions 138 can be analyzed by a machine learning algorithm to determine if the person is properly masked.
  • the machine learning algorithm could be a support vector machine or artificial neural network.
  • the machine learning algorithms can be trained with two groups of data: (1) temperature data near the mouth when an individual is wearing a face mask; and (2) temperature data near the mouth when an individual is not wearing a face mask.
  • the machine learning model can learn the decision boundary between the mask/no-mask scenarios and then perform a binary classification. Another class/category can also be added to detect instances where the face covering is not worn properly.
  • the system 100 can use data collected by RGB cameras 166 to capture images of the mouth regions 138 of the user. These images can be analyzed using various machine learning algorithms to identify the person 204 as masked or unmasked, as described above.
  • if the person 204 is determined to be unmasked or improperly masked, the system generates a masking alert 140.
  • the system also determines an alert duration 142 representing the time period the person 204 is unmasked.
  • the masking alert 140 and alert duration 142 are then used to generate a masking characteristic 114 indicative of a masking violation.
  • the masking characteristic 114 includes a severity score 146 corresponding to the alert duration 142; the longer the alert duration 142, the higher the severity score 146.
  • the severity score 146 may also increase if the person 204 is fully unmasked, as opposed to simply improperly masked.
  • the system 100 may transmit the masking alert 140 and the alert duration 142 to a personal device 208 of the person 204 to remind or notify the respective person 204 of the masking violation.
  • FIG. 4 illustrates three persons 204a, 204b, 204c of various mask wearing states.
  • person 204a is unmasked.
  • person 204b is properly masked.
  • person 204c is improperly masked.
  • the temperature data 136a of mouth region 138a will be significantly higher or more variable than the temperature data 136b of mouth region 138b.
  • the system 100 can then determine a social behavior score 116.
  • the social behavior score 116 represents how well the person 204 corresponding to the unique identifier 110 has followed the social distancing rules of the area 200.
  • the social behavior score 116 can be calculated based on the aforementioned characteristics 112, 114, a set of pre-defined safety parameters 170, and a reinforcement learning algorithm 152.
  • the safety parameters 170 are used to weight the characteristics 112, 114 based on policies and features of the area 200. For instance, an outdoor area 200, such as a public park, may prefer to emphasize distancing over masking. On the other hand, a public transit system, where distancing is not always possible, may prefer to emphasize masking.
  • the parameters 170 can incorporate a wide variety of factors, such as the demographics of persons 204 within the area 200. Based on these parameters 170, the system 100 generates weighted distancing characteristics 148 and weighted masking characteristics 150.
  • the reinforcement learning algorithm 152 adjusts the social behavior score 116 based on factors external to the persons 204 being monitored, including area masking violations 154, area distancing violations 156, and public health data 158.
  • the area masking violations 154 and area distancing violations 156 can include statistics and trends regarding past distancing and/or masking violations recorded in the area 200. For instance, if the history of masking violations 154 in the monitored area 200 indicates a recent increase in such violations, the masking characteristics 114 may be weighted more heavily than the distancing characteristics 112. Accordingly, the reinforcement learning algorithm 152 allows the system 100 to “learn” based on masking and distancing trends in the area 200.
  • public health data 158 showing recent outbreaks may lead to an overall decrease in social distancing scores 116 to encourage persons 204 of the area 200 to strictly follow social distancing guidelines.
  • a regression-based recurrent neural network algorithm can be used to generate social behavior scores 116 between 0 and 100.
  • This algorithm can use weights learned from each of the rule violations in that area 200 and rule violations for that particular person 204 as input to generate a score 116.
  • This model can initially be unsupervised. The model can later receive feedback from user surveys and site preferences. Rules can be added through a rule layer on top of this scoring mechanism.
  • an example incentive generation subsystem 300 includes a control 302, a memory 375, a processor 385, and a transceiver 430.
  • the transceiver 430 is used to communicate with the various personal devices 208a-208c of persons 204 within the area 200.
  • the incentive generation subsystem 300 generates a reward 304 based on the received social behavior score 116.
  • the reward 304 can be in a wide variety of forms depending on the area 200, such as coupons for grocery stores, discount codes for future events for stadiums, or even cryptocurrency.
  • the value of the reward 304 can be proportional to the social behavior score 116, meaning the higher the social behavior score 116, the greater the reward.
  • the reward 304, or a notification of the reward 304, can be provided to a personal device 208 of the person 204 corresponding to the unique identifier 110 in wireless communication with the incentive generation subsystem 300.
  • a method 500 for tracking and incentivizing social behavior includes capturing 502, via a plurality of sensors arranged with luminaires of a connected lighting system covering an area, a data set corresponding to the area.
  • the method 500 further includes determining 504, using machine learning, whether an object in the area is a person.
  • the method 500 further includes generating 506, based on the data set, a unique identifier for each object in the area determined to be a person.
  • the method 500 further includes determining 508, based on the data set, one or more distancing characteristics corresponding to each unique identifier.
  • the method 500 further includes determining 510, based on the data set, one or more masking characteristics corresponding to each unique identifier.
  • the method 500 further includes generating 512, based on the one or more distancing characteristics, the one or more masking characteristics, and pre-defined safety parameters, one or more social behavior scores corresponding to each unique identifier.
  • the method 500 further includes transmitting 514 the generated social behavior scores to an incentive generation subsystem.
  • the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements.
  • This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified.
  • the present disclosure may be implemented as a system, a method, and/or a computer program product at any possible technical detail level of integration
  • the computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure
  • the computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device.
  • the computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing.
  • a non- exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing.
  • a computer readable storage medium is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
  • Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network.
  • the network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers.
  • a network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
  • Computer readable program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, configuration data for integrated circuitry, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++, or the like, and procedural programming languages, such as the “C” programming language or similar programming languages.
  • the computer readable program instructions may execute entirely on the user’s computer, partly on the user's computer, as a stand-alone software package, partly on the user’s computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
  • electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
  • the computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
  • the computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
  • each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s).
  • the functions noted in the blocks may occur out of the order noted in the Figures.
  • two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved.

Landscapes

  • Business, Economics & Management (AREA)
  • Human Resources & Organizations (AREA)
  • Engineering & Computer Science (AREA)
  • Strategic Management (AREA)
  • Economics (AREA)
  • Development Economics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Marketing (AREA)
  • General Business, Economics & Management (AREA)
  • Tourism & Hospitality (AREA)
  • Game Theory and Decision Science (AREA)
  • Accounting & Taxation (AREA)
  • Finance (AREA)
  • Quality & Reliability (AREA)
  • Operations Research (AREA)
  • Educational Administration (AREA)
  • Health & Medical Sciences (AREA)
  • Emergency Management (AREA)
  • Child & Adolescent Psychology (AREA)
  • General Health & Medical Sciences (AREA)
  • Primary Health Care (AREA)
  • Circuit Arrangement For Electric Light Sources In General (AREA)

Abstract

A system for tracking and incentivizing social behavior is provided. The system captures, via sensors arranged with luminaires of a connected lighting system covering an area, a data set corresponding to the area. The system then determines whether an object in the area is a person. The system then generates, based on the data set, a unique identifier for each object in the area determined to be a person. The system then determines, based on the data set, distancing characteristics corresponding to each unique identifier. The system then determines, based on the data set, masking characteristics corresponding to each unique identifier. The system then generates, based on the distancing characteristics, the masking characteristics, and pre-defined safety parameters, social behavior scores corresponding to each unique identifier. The system then transmits the generated social behavior scores to an incentive generation subsystem to generate a reward.

Description

Systems for incentivizing social distancing using connected lighting IoT infrastructure
FIELD OF THE DISCLOSURE
The present disclosure is directed generally to systems and methods for incentivizing social distancing using connected lighting Internet of Things (IoT) infrastructure.
BACKGROUND
During the COVID-19 pandemic, it was observed that a substantial percentage of individuals testing positive for COVID-19 lacked detectable symptoms. Additionally, it was observed that those who eventually developed symptoms could spread the virus to others even before exhibiting symptoms. Due to this, the Centers for Disease Control and Prevention (CDC) in the United States recommended that everyone, sick or healthy, practice social or physical distancing by staying at least 6 feet from other persons from different households in both indoor and outdoor spaces. The CDC further recommended wearing face masks when social distancing could not be observed. However, many of the recommendations and guidelines issued by federal and local agencies were voluntary. Accordingly, many incidents surfaced where these social distancing guidelines were not observed, leading to an increase in the number of infections.
Conventional systems and methods exist for tracing social distancing and masking. For example, some systems rely on the exchange of Bluetooth information between devices worn by the occupants to evaluate distancing and/or track close encounters which could spread disease. Such systems are vulnerable to individuals who do not carry a functional Bluetooth-enabled device, or who have disabled the Bluetooth feature on their device. The Bluetooth systems may also be subject to privacy and security risks due to the sharing of information via a Bluetooth connection. Similarly, other systems which rely on high resolution cameras to track distancing and/or mask wearing may be perceived as invasive. Accordingly, there is a need in the art for improved systems to incentivize social distancing and mask wearing.
SUMMARY OF THE DISCLOSURE
The present disclosure is generally directed to systems and methods for incentivizing social distancing using connected lighting Internet of Things (IoT) infrastructure. Sites using connected lighting systems typically include a variety of IoT sensors as part of the connected lighting system infrastructure. By using IoT sensors, the connected lighting system can anonymously track the distancing and mask wearing of occupants in an indoor or outdoor site.
When a person enters a site, the system may generate an anonymized unique identifier for the occupant and use the unique identifier to monitor the person’s distancing and masking within the site. The system may monitor distancing of occupants by tracking the trajectory over time of the occupants within the site to determine proximity alerts and their associated intentionality. Based on the sensed level of intentionality of persons in the site, the system may generate distancing characteristics. The system may also monitor masking of persons as they travel along their trajectory. If a person is unmasked, the system may detect a change in temperature around their mouth and generate a masking alert. Based on the duration of the masking alert, the system may generate masking characteristics. The system may generate a social behavior score based on the distancing and masking characteristics, the pre-defined safety parameters of the site, and a reinforcement learning algorithm. The system may transmit the generated social behavior score to an incentive generation subsystem. The incentive generation subsystem may generate a reward based on the social behavior score, and may provide the reward to the occupant.
Generally, in one aspect, a connected lighting system for tracking and incentivizing social behavior is provided. The connected lighting system includes a controller. The controller is communicatively coupled to a plurality of sensors. The plurality of sensors are arranged with luminaires in an area.
The controller is configured to capture, via the plurality of sensors, a data set. The data set corresponds to the area. The controller is further configured to determine, using machine learning, whether an object in the area is a person.
The controller is further configured to generate, based on the data set, a unique identifier for each object in the area determined to be a person. According to an example, generating the unique identifiers includes: (1) determining, based on the data set, a location and a time stamp for each object in the area determined to be a person; and (2) generating, based on the locations and the time stamps, the unique identifiers for each object in the area determined to be a person. The controller is further configured to determine, based on the data set, one or more distancing characteristics corresponding to each unique identifier. According to an example, determining the one or more distancing characteristics includes: (1) generating, based on the data set, a trajectory corresponding to each unique identifier; (2) generating, based on the trajectories, one or more proximity alerts, wherein each proximity alert corresponds to at least two of the unique identifiers; (3) generating, based on the trajectories, horizon data corresponding to each unique identifier; and (4) generating, based on the proximity alerts and the horizon data, the one or more distancing characteristics.
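By way of illustration only (not part of the claimed subject matter), the proximity-alert step can be sketched as follows. The 1.8 m threshold, the (t, x, y) sample format, and the consecutive-sample duration rule are assumptions for this sketch, not the claimed implementation:

```python
import math

def proximity_alert(traj_a, traj_b, threshold_m=1.8, min_samples=3):
    """Sketch of proximity-alert generation: each trajectory is a list of
    (t, x, y) samples taken at common timestamps. An alert fires when two
    persons stay within the distancing threshold (assumed ~6 ft) for at
    least min_samples consecutive samples."""
    run = 0
    for (_, xa, ya), (_, xb, yb) in zip(traj_a, traj_b):
        if math.hypot(xa - xb, ya - yb) < threshold_m:
            run += 1
            if run >= min_samples:
                return True
        else:
            run = 0
    return False
```

A real deployment would operate on the sensor-derived trajectories described above rather than pre-aligned samples.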
According to a further example, each of the one or more distancing characteristics includes a distancing characteristic severity score based on the trajectories and the horizon data. Each of the trajectories include a time data set, a location data set, and a velocity data set. The horizon data is further generated based on the velocity data set.
The controller is further configured to determine, based on the data set, one or more masking characteristics corresponding to each unique identifier. According to an example, determining the one or more masking characteristics includes: (1) retrieving, from the plurality of sensors, a temperature data set; (2) identifying, based on a trajectory for each object determined to be a person, one or more mouth regions, wherein the trajectory includes a time data set, a location data set, and a velocity data set; (3) generating, based on the temperature data corresponding to the mouth regions, one or more masking alerts, wherein each masking alert corresponds to one of the unique identifiers, and wherein each of the masking alerts includes an alert duration; and (4) generating, based on the alert durations of the masking alerts, the one or more masking characteristics.
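The mouth-region identification step can be sketched as follows, for illustration only. The fixed offset ahead of the person is an assumed value, not part of the claims:

```python
import math

def mouth_region(location, velocity, offset_m=0.25):
    """Estimate the mouth-region point a fixed offset (meters, assumed)
    ahead of the person along their direction of travel, derived from the
    trajectory's location and velocity data sets."""
    speed = math.hypot(velocity[0], velocity[1])
    if speed == 0:
        return location  # heading unknown; fall back to the tracked location
    return (location[0] + offset_m * velocity[0] / speed,
            location[1] + offset_m * velocity[1] / speed)
```

The temperature data set is then sampled at the returned point to decide whether a masking alert should be raised.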
According to an example, the temperature data set is retrieved from one or more multipixel thermopile (MPT) sensors. Each of the one or more masking characteristics can include a masking characteristic severity score based on the alert durations.
The controller is further configured to generate, based on the one or more distancing characteristics, the one or more masking characteristics, and pre-defined safety parameters, one or more social behavior scores corresponding to each unique identifier. According to an example, generating the social behavior scores includes: (1) generating, based on the one or more distancing characteristics and the pre-defined safety parameters, one or more weighted distancing characteristics; (2) generating, based on the one or more masking characteristics and the pre-defined safety parameters, one or more weighted masking characteristics; and (3) generating the social behavior scores based on the one or more weighted distancing characteristics, the one or more weighted masking characteristics, and a reinforcement learning algorithm. The reinforcement learning algorithm is based on area masking violations, area distancing violations, and public health data.
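The weighting of characteristics by the safety parameters can be sketched as follows (illustration only; the parameter names and example weight values are assumptions):

```python
def weight_characteristics(distancing_scores, masking_scores, params):
    """Apply area-specific safety parameters as per-category weights to the
    raw distancing and masking characteristic severity scores."""
    wd = params.get("distancing_weight", 0.5)
    wm = params.get("masking_weight", 0.5)
    return ([wd * s for s in distancing_scores],
            [wm * s for s in masking_scores])

# Hypothetical parameter sets: a park emphasizes distancing; a transit
# system, where distancing is often impossible, emphasizes masking.
park_params = {"distancing_weight": 0.75, "masking_weight": 0.25}
transit_params = {"distancing_weight": 0.25, "masking_weight": 0.75}
```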
The controller is further configured to transmit the generated social behavior scores to an incentive generation subsystem. The incentive generation subsystem includes a controller configured to generate, based on one of the one or more social behavior scores, a reward corresponding to one of the unique identifiers. According to an example, a value of the reward is proportional to the one of the one or more social behavior scores. According to another example, the controller of the incentive generation subsystem is further configured to notify the person corresponding to the one of the one or more unique identifiers of the reward via a personal device in wireless communication with the controller.
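The proportional reward mapping can be sketched as a simple linear function (illustration only; the maximum reward value and the linear form are assumptions):

```python
def reward_value(social_behavior_score, max_reward=10.0):
    """Assumed linear mapping from a 0-100 social behavior score to a reward
    value (e.g. a coupon or discount amount); the reward is proportional
    to the score, so a higher score yields a greater reward."""
    clamped = max(0.0, min(100.0, social_behavior_score))
    return max_reward * clamped / 100.0
```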
Generally, in another aspect, a method for tracking and incentivizing social behavior is provided. The method includes capturing, via a plurality of sensors arranged with luminaires of a connected lighting system covering an area, a data set corresponding to the area. The method further includes determining, using machine learning, whether an object in the area is a person. The method further includes generating, based on the data set, a unique identifier for each object in the area determined to be a person. The method further includes determining, based on the data set, one or more distancing characteristics corresponding to each unique identifier. The method further includes determining, based on the data set, one or more masking characteristics corresponding to each unique identifier. The method further includes generating, based on the one or more distancing characteristics, the one or more masking characteristics, and pre-defined safety parameters, one or more social behavior scores corresponding to each unique identifier. The method further includes transmitting the generated social behavior scores to an incentive generation subsystem.
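The sequence of method steps can be sketched as a pipeline. Every callable and signature below is a hypothetical stand-in for one claimed step, shown for illustration only:

```python
def track_and_incentivize(data_set, is_person, make_id, distancing, masking,
                          score, transmit):
    """Sketch of the claimed method flow. Each argument after data_set is a
    callable standing in for one step: person detection, identifier
    generation, distancing characteristics, masking characteristics,
    score generation, and transmission to the incentive subsystem."""
    scores = {}
    for obj in data_set["objects"]:      # capture yields sensed objects
        if not is_person(obj):           # machine-learning person check
            continue
        uid = make_id(obj)               # anonymized unique identifier
        d = distancing(data_set, uid)    # distancing characteristics
        m = masking(data_set, uid)       # masking characteristics
        scores[uid] = score(d, m)        # social behavior score
    transmit(scores)                     # send to incentive subsystem
    return scores
```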
In various implementations, a processor or controller may be associated with one or more storage media (generically referred to herein as “memory,” e.g., volatile and non-volatile computer memory such as RAM, PROM, EPROM, EEPROM, floppy disks, compact disks, optical disks, magnetic tape, SSD, etc.). In some implementations, the storage media may be encoded with one or more programs that, when executed on one or more processors and/or controllers, perform at least some of the functions discussed herein.
Various storage media may be fixed within a processor or controller or may be transportable, such that the one or more programs stored thereon can be loaded into a processor or controller so as to implement various aspects as discussed herein. The terms “program” or “computer program” are used herein in a generic sense to refer to any type of computer code (e.g., software or microcode) that can be employed to program one or more processors or controllers.
It should be appreciated that all combinations of the foregoing concepts and additional concepts discussed in greater detail below (provided such concepts are not mutually inconsistent) are contemplated as being part of the inventive subject matter disclosed herein. In particular, all combinations of claimed subject matter appearing at the end of this disclosure are contemplated as being part of the inventive subject matter disclosed herein. It should also be appreciated that terminology explicitly employed herein that also may appear in any disclosure incorporated by reference should be accorded a meaning most consistent with the particular concepts disclosed herein.
These and other aspects of the various embodiments will be apparent from and elucidated with reference to the embodiment(s) described hereinafter.
BRIEF DESCRIPTION OF THE DRAWINGS
In the drawings, like reference characters generally refer to the same parts throughout the different views. Also, the drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles of the various embodiments.
Fig. 1 is a top-level schematic of a system for tracking and incentivizing social behavior, in accordance with an example.
Fig. 2 is a further top-level schematic of a system for incentivizing social distancing, in accordance with an example.
Fig. 3 is a trajectory map for determining proximity alerts, in accordance with an example.
Fig. 4 is a trajectory map for determining masking alerts, in accordance with an example.
Fig. 5 is a schematic diagram of a luminaire, in accordance with an example.
Fig. 6 is a schematic diagram of a controller of a connected lighting system, in accordance with an example.
Fig. 7 is a schematic diagram of an incentive generation subsystem wirelessly connected to personal devices, in accordance with an example.
Fig. 8 is a flowchart of a method for tracking and incentivizing social behavior, in accordance with an example.
DETAILED DESCRIPTION OF EMBODIMENTS
The present disclosure is generally directed to systems and methods for incentivizing social distancing using connected lighting Internet of Things (IoT) infrastructure. Sites using connected lighting systems typically include a variety of IoT sensors as part of the connected lighting system infrastructure. The sites can be indoor (office spaces, residential buildings, shopping centers, schools, etc.), outdoor (parks, stadiums, etc.), or both. Such IoT sensors can include multipixel thermopile (MPT) sensors, microphones, positioning sensors, red green blue (RGB) cameras, and others. By using IoT sensors, the system can anonymously track the distancing and mask wearing of the occupants in an indoor or outdoor site.
When a person enters a site, the system may identify them as a person, rather than a non-human object, using machine learning algorithms. The system may generate an anonymized unique identifier for the person. The unique identifier may be a randomized number or character generated at least partly based on the timestamp the person entered the site, the location of entrance, as well as other data collected by the sensors. The system may use the unique identifier to monitor and track the person’s distancing and masking within the site.
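One way to realize such an anonymized identifier is to hash the entry time stamp and entry location (illustration only; the per-site salt and the 12-character truncation are assumptions, not part of the claims):

```python
import hashlib

def unique_identifier(entry_timestamp, entry_location, site_salt="lobby-7"):
    """Sketch of anonymized identifier generation: hash the entry time stamp
    and entry location so the identifier is stable for tracking within the
    site but carries no personal information."""
    payload = "%s|%.3f|%.2f,%.2f" % (site_salt, entry_timestamp,
                                     entry_location[0], entry_location[1])
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()[:12]
```

The same inputs always yield the same identifier, so the person can be followed through the site without storing identity data.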
The system may monitor distancing by tracking the trajectory over time of persons within the site. For each person, the system may generate a trajectory which includes time, location, and velocity information. By comparing trajectories, the system can generate proximity alerts if persons are within a distancing threshold for a predetermined period of time. Further, the system can use the velocity information of the trajectories to determine the field-of-view, or “horizon” of persons for particular points in time. By evaluating the horizons and the velocity vectors of persons during a proximity alert, the system can determine the intentionality of the proximity alert-generating behavior. For example, if (1) the velocity vectors of two persons are headed towards each other for a first predetermined amount of time and (2) the horizons of the two persons continue to overlap for a second predetermined amount of time, the system can conclude that the proximity of the two persons was intentional. Based on the sensed level of intentionality of persons in the site, the system may generate distancing characteristics which can be representative of any distancing violations. The distancing characteristics may include scores indicating the severity of the characteristics.
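The intentionality test described above, two converging velocity vectors plus overlapping horizons sustained over time, can be sketched as follows. The thresholds, the 0-100 scale, and the scoring constants are assumptions for illustration only:

```python
def heading_convergence(pos_a, vel_a, pos_b, vel_b):
    """True when each person's velocity vector points toward the other's
    current position, i.e. the two trajectories are converging."""
    to_b = (pos_b[0] - pos_a[0], pos_b[1] - pos_a[1])
    dot = lambda u, v: u[0] * v[0] + u[1] * v[1]
    return dot(vel_a, to_b) > 0 and dot(vel_b, (-to_b[0], -to_b[1])) > 0

def distancing_severity(converging_secs, horizon_overlap_secs, alert_secs,
                        t_conv=5.0, t_overlap=3.0):
    """Assumed 0-100 severity score: a violation is treated as intentional
    when the trajectories converge for at least t_conv seconds AND the
    horizons overlap for at least t_overlap seconds; longer proximity
    alerts raise the score further."""
    intentional = converging_secs >= t_conv and horizon_overlap_secs >= t_overlap
    base = 60.0 if intentional else 10.0
    return min(100.0, base + 4.0 * alert_secs)
```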
The system may also monitor masking of persons as they travel along their trajectory. At various points along the trajectory, the system may identify mouth-regions of a person based on location and velocity information. The system may analyze temperature data of the mouth region. The temperature data can be collected by sensors such as MPT sensors. If the person is unmasked, the system may detect the change in temperature in their mouth- region and generate a masking alert. Based on the duration of the masking alerts, the system may generate masking characteristics which can be representative of any masking violations. The masking characteristics may include scores indicating the severity of the violation.
The system may generate a social behavior score based on the distancing and masking characteristics. In generating the social behavior score, the system may weigh certain characteristics more heavily than others, based on the pre-defined safety parameters of the site. For example, masking violations may be weighted more heavily than distancing violations. The social behavior score can also be determined using a reinforcement learning algorithm. The reinforcement learning algorithm may be based on previously determined area violation data, as well as public health data.
The system may generate a reward based on the social behavior score. The reward can be in a wide variety of forms depending on the site, such as coupons for grocery stores, discount codes for future events for stadiums, or even cryptocurrency. The reward may be generally proportional to the social behavior score, meaning the higher the social behavior score, the greater the reward. The reward, or a notification of the reward, can be provided to a personal device of the occupant in wireless communication with the connected lighting system.
FIG. 1 shows an example connected lighting system 100 monitoring an area 200. The area 200 includes a number of objects 202a-202f, including three persons 204a- 204c. The connected lighting system includes a controller 102, two luminaires 106a, 106b, a discrete MPT sensor 144c, and an incentive generation subsystem 300. The connected lighting system 100 is configured to illuminate the area 200. The area 200 can be indoor (office spaces, residential buildings, shopping centers, schools, etc.), outdoor (parks, stadiums, etc.), or a combination of both. Non-person objects can include any type of object 202 found in these areas 200, such as furniture, electronics, building infrastructure, personal items, and more.
Each of the luminaires 106a, 106b of FIG. 1 includes a sensor bundle 104a, 104b, such as an Advanced Sensor Bundle (ASB). Connected lighting systems 100 can use ASBs, as well as discrete, individual sensors, to collect information regarding the area 200 being lit, and use this information to control the output of the light sources 146 of the luminaires 106a, 106b. This information can also be used for a number of other purposes, such as, in this case, tracking and incentivizing social behavior. The sensors 104a, 104b can be embedded within or arranged on the luminaires 106a, 106b, or they can be arranged separately, depending on the application.
The sensors 104a, 104b included in each bundle can vary according to the application. For instance, sensor bundle 104a includes an MPT sensor 144a, a microphone 160a, and a positioning sensor 162. On the other hand, sensor bundle 104b also includes an MPT sensor 144b, an RGB camera 166b, and a passive infrared (PIR) sensor 168b. Other types of sensors 104 may be used as required. The sensors 104 can collect information at a wide range of sampling rates corresponding to the area 200. For example, if the area 200 is a busy grocery store with constantly moving traffic, the sensors 104 may capture data every second. However, if the area 200 is a seating area in an outdoor sports stadium, the sensors 104 could collect data much less frequently while still accurately representing the movement of persons 204 in the area 200. The sampling rate of the sensors 104 could fluctuate based on expected behavior at certain times. For example, in the outdoor stadium example, the sampling rate could increase at times of expected movement, such as during stoppages in play, or immediately following the conclusion of the event.
The information collected by the sensors 104 may be compiled into a data set 108. The data set 108 can be subdivided or classified by a number of differentiators, such as a type of data collected or the location of the sensors 104. For example, temperature information collected by the various MPT sensors 144 may be compiled as a temperature data set 136, a subset of the overall data set 108.
Another example luminaire 106 is illustrated in FIG. 5. In FIG. 5, the example luminaire 106 includes a sensor bundle 104 with an MPT sensor 144, a microphone 160, a positioning sensor 162, an RGB camera 164, and a PIR sensor 166. The luminaire 106 further includes a light source 146, such as one or more light emitting diodes (LEDs). The luminaire 106 further includes a transceiver 410 to wirelessly communicate with the other aspects of the connected lighting system 100. The luminaire 106 further includes a memory 177 to store the data set 108 captured by the sensor bundle 104.
The controller 102 of the connected lighting system is communicatively coupled to the luminaires 106a, 106b, the discrete MPT sensor 144c, and the incentive generation subsystem 300. This coupling may be wired or wireless, depending on the application. This coupling allows for data captured by the sensors 104 to be conveyed to the controller 102 for processing. The controller 102 can then communicate with the incentive generation subsystem 300 to generate rewards 304 based on the data. Another example of controller 102 is illustrated in FIG. 6. The controller includes a processor 185, memory 175, and transceiver 420.
FIG. 1 further depicts person 204c moving from outside the area 200 to inside the area 200. When person 204c moves into the area 200, information corresponding to their presence is captured by the various sensors 104a, 104b, 144c. This information is compiled into data set 108. The controller 102 processes the data set 108 using machine learning 160 to determine if the object 202c is actually a person 204, as opposed to a non-person object 202. The machine learning 160 may consist of an algorithm trained by data corresponding to persons 204 and non-person objects 202. For example, the algorithm can be trained by temperature data captured by MPT sensors 144. By analyzing the temperature data, controller 102 can look for heat patterns in the data 108 corresponding to a human heat signature. The controller 102 can generate a heat pattern for an object 202 and compare the heat pattern to a threshold human heat signature (e.g., baseline human heat signature) to determine if object 202 is a person 204 or a non-person object 202. The threshold human heat signature can include a single value or a range of values representing a human heat signature. The threshold human heat signature can include predefined values, previous heat pattern measurements of one or more persons 204, or a combination of predefined values and previous heat pattern measurements of one or more persons 204. Other types of sensor data may be similarly used.
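By way of a non-limiting illustration, the heat-signature check described above could be sketched as follows. The temperature range and minimum pixel count are assumed values chosen for the example; the disclosure leaves the threshold human heat signature open (a single value, a range, or values learned from prior measurements), and a deployed system would likely use a trained classifier rather than fixed cut-offs.

```python
# Minimal sketch of classifying an MPT frame as person vs. non-person.
# HUMAN_TEMP_RANGE_C and MIN_HOT_PIXELS are illustrative assumptions.
HUMAN_TEMP_RANGE_C = (30.0, 38.0)  # assumed baseline human heat signature
MIN_HOT_PIXELS = 4                 # assumed minimum matching pixels

def is_person(mpt_frame, temp_range=HUMAN_TEMP_RANGE_C,
              min_hot_pixels=MIN_HOT_PIXELS):
    """Classify a flattened MPT frame (pixel temperatures in Celsius)
    as a person (True) or a non-person object (False)."""
    lo, hi = temp_range
    hot_pixels = sum(1 for t in mpt_frame if lo <= t <= hi)
    return hot_pixels >= min_hot_pixels

# A frame with a warm cluster reads as a person; a uniformly
# room-temperature frame (e.g., furniture) does not.
warm_frame = [21.0] * 60 + [34.0] * 4
cool_frame = [21.0] * 64
```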
When the system 100 determines a new person 204 has entered the area 200, the person 204 is assigned a unique identifier 110. The unique identifier 110 is used to anonymize the person 204 while the system 100 monitors their social behavior. In one example, the unique identifier 110 is calculated based on a location 118 of the person 204 within the area, and the timestamp 120 of the person 204 at the location 118. The unique identifier 110 can also be based on additional aspects of the data set 108 captured by the sensors 104. For example, a unique identifier 110 can be calculated based on equation 1 :
ID = f(timestamp, location, other sensor readings) (1)
According to the example of FIG. 1, person 204c is detected entering the area 200 at timestamp t1, at location L1(x1, y1). Further, the sensor bundle 104b captures a set of sensor readings Sb[t1]. In this example, the unique identifier 110 ID value for person 204c is the result of t1, L1(x1, y1), and Sb[t1] being processed by equation 1. In further examples, the unique identifier 110 can also be at least partly based on a randomly generated number or character for increased anonymization. The area 200 monitored by the connected lighting system 100 can be split up into two or more zones 210 for more efficient monitoring. Each zone 210 is monitored by one or more sensors 104 to ensure coverage of the entire area 200. The system 100 can identify persons 204 as they move across different zones, ensuring that captured information about their movements and masking will correspond to the correct unique identifier 110.
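The identifier construction of equation (1) could be sketched as below. The choice of SHA-256 and the 16-character truncation are illustrative assumptions; the disclosure only specifies the inputs (timestamp, location, other sensor readings, and optionally a random component).

```python
import hashlib

def generate_unique_id(timestamp, location, sensor_readings, salt=""):
    """Anonymized unique identifier 110 per equation (1):
    ID = f(timestamp, location, other sensor readings).
    Hashing the inputs yields an identifier that does not reveal the
    person's identity; an optional random salt (the further example in
    the text) adds anonymization."""
    payload = f"{timestamp}|{location}|{sensor_readings}|{salt}"
    return hashlib.sha256(payload.encode()).hexdigest()[:16]

# Person 204c entering at t1 at location (x1, y1) with readings Sb[t1]:
uid = generate_unique_id(1622551200.0, (3.2, 7.5), (21.4, 0.6, 55.0))
```

The same inputs always map to the same identifier, which is what lets the system re-associate sensor data with a person as they move between zones without storing any identifying information.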
As shown in FIG. 2, the area 200 includes zone 1 210a and zone 2 210b. Zone 1 210a is monitored by sensor bundle 104d, while zone 2 210b is monitored by sensor bundle 104e. Persons 204a, 204d move from zone 1 210a to zone 2 210b. The system 100 may recognize their movement from zone 1 210a to zone 2 210b based on their movement towards the boundary between the zones 210a, 210b. As such, the system 100 can “predict” when persons 204 will travel across the zones 210, and allocate the data collected by the corresponding sensors 104 accordingly. In this way, as person 204a travels from zone 1 210a to zone 2 210b, the system 100 “hands off” monitoring of the person 204a from sensor bundle 104d to sensor bundle 104e. In some examples, an area 200 may include tens or hundreds of zones 210 of various shapes and sizes.
As persons 204 move through the area 200 and across various zones 210, the system 100 generates trajectories 122 corresponding to their movements over a period of time. The trajectories 122 are generated based on the data set 108 (or subsets of the data set 108). The system 100 can then analyze these trajectories 122 to determine if any of persons 204 in the area 200 have violated social distancing guidelines.
FIG. 3 shows three trajectories 122; trajectory 122a corresponding to person 204a, trajectory 122b corresponding to person 204b, and trajectory 122c corresponding to person 204c. Each trajectory 122 includes a time data set 130, a location data set 132, and a velocity data set 134. For example, a trajectory 122 may be defined by equation 2:
T[i, j] = (t_i^j, x_i^j, y_i^j, vx_i^j, vy_i^j) (2) where i is the sample number of the data, j corresponds to the unique identifier 110 for the person 204 being tracked, t is the timestamp at which the data of sample i was captured, x is the x-coordinate of the person 204 being tracked, y is the y-coordinate of the person being tracked, vx is the velocity of the person being tracked in the x-direction, and vy is the velocity of the person being tracked in the y-direction.
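A trajectory sample per equation (2) could be represented as a simple record; the field names below mirror the equation, while the concrete values are placeholders.

```python
from dataclasses import dataclass

@dataclass
class TrajectorySample:
    """One sample T[i, j] of a trajectory 122 per equation (2)."""
    j: str      # unique identifier 110 of the tracked person
    i: int      # sample number
    t: float    # timestamp at which sample i was captured
    x: float    # x-coordinate of the person
    y: float    # y-coordinate of the person
    vx: float   # velocity in the x-direction
    vy: float   # velocity in the y-direction

# A trajectory 122 is then a time-ordered list of such samples:
sample = TrajectorySample(j="a3f1", i=0, t=0.0, x=1.5, y=2.0, vx=0.8, vy=0.1)
```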
By analyzing the location data set 132 of the trajectories 122, the system 100 can determine if two or more persons 204 are within a predefined distancing threshold for a predefined duration threshold. The predefined distancing threshold and the predefined duration threshold may be set based on the area 200 and public health guidance. For example, if the area 200 is an indoor grocery store, the predefined distancing threshold can be set to 6 ft, and the predefined duration threshold can be set to 60 seconds. If two or more persons 204 are within the predefined distancing threshold for a time period exceeding the predefined duration threshold, a proximity alert 124 is generated. The system 100 may transmit the proximity alert 124 to a personal device 208 of the corresponding person 204 or persons 204 to notify the respective person 204 or persons 204 of a potential violation of the predefined distancing threshold or to limit or reduce the time period the persons 204 exceed the distancing threshold. The system 100 can associate the personal device 208 of the user to the unique identifier 110 of the respective person 204 to identify the correct personal device 208 to transmit the proximity alert 124.
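The threshold check described above could be sketched as follows, using the grocery-store example values (6 ft, 60 s). The assumption that both trajectories are sampled at the same timestamps is a simplification made for the illustration.

```python
import math

DISTANCING_THRESHOLD_FT = 6.0   # example value from the text
DURATION_THRESHOLD_S = 60.0     # example value from the text

def proximity_alert(traj_a, traj_b,
                    dist_threshold=DISTANCING_THRESHOLD_FT,
                    duration_threshold=DURATION_THRESHOLD_S):
    """Return True when two time-aligned trajectories (lists of
    (t, x, y) samples) remain within the distancing threshold for at
    least the duration threshold."""
    run_start = None
    for (t, xa, ya), (_, xb, yb) in zip(traj_a, traj_b):
        if math.hypot(xa - xb, ya - yb) <= dist_threshold:
            if run_start is None:
                run_start = t          # start of the close-contact run
            if t - run_start >= duration_threshold:
                return True            # sustained violation: alert 124
        else:
            run_start = None           # persons separated; reset run
    return False
```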
The proximity alert 124 is associated with the unique identifiers 110 of the two (or more) persons 204 triggering the alert. The system 100 then performs additional analyses to determine a distancing characteristic 112 associated with the proximity alert 124, including a severity score 128 representative of the intentionality of the violation.
As demonstrated in FIG. 3, the system 100 generates horizon data 126 for a trajectory 122. The horizon data 126 represents the field of view of a person 204 at each point on the trajectory 122. The field of view (or “horizon”) at each point i is determined based on the direction of the velocity vector (vx_i, vy_i) of the trajectory 122, as each person 204 is assumed to be looking in the direction of their movement. While the horizons 126 of FIG. 3 are depicted as triangular, the horizons 126 may include or correspond to a variety of different shapes (e.g., spherical, circular) and include a variety of different angles (e.g., horizontal field of view, vertical field of view) based at least in part on the field of view of the person 204. For example, the horizons 126 may also be conical if the sensors 104 are capable of collecting three-dimensional data. When the horizons 126 of two trajectories 122 overlap at a point in time, the system 100 determines that persons 204 corresponding to the two trajectories 122 can see each other.
The velocity data 134 and the horizon data 126 are combined to analyze the two (or more) trajectories 122 triggering the proximity alert 124. When two persons 204 are walking towards each other, their velocity vectors are heading towards the same point in space. As mentioned above, if their horizons 126 overlap, persons 204 can see each other. Therefore, these conditions, along with the duration of the proximity alert 124, can be indicative of two persons 204 intentionally walking towards each other. This sensed intentionality can be reflected by the generated distancing characteristics 112 indicative of distancing violations, including the associated severity score 128. An intentional violation of distancing guidelines would generate a higher severity score 128 than an unintentional violation.
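A minimal sketch of this intentionality analysis is given below. The dot-product test for "heading towards each other" and the specific time thresholds and scaling are assumptions made for the example; the disclosure only requires that approach time, horizon overlap time, and alert duration together drive the severity score 128.

```python
def heading_towards(pos_a, vel_a, pos_b, vel_b):
    """True when each person's velocity vector points toward the other,
    i.e. the dot product of each velocity with the direction to the
    other person is positive for both."""
    ax, ay = pos_a
    bx, by = pos_b
    to_b = (bx - ax, by - ay)
    to_a = (ax - bx, ay - by)
    return (vel_a[0] * to_b[0] + vel_a[1] * to_b[1]) > 0 and \
           (vel_b[0] * to_a[0] + vel_b[1] * to_a[1]) > 0

def severity_score(approach_s, overlap_s, alert_s,
                   t1=5.0, t2=5.0, scale=60.0):
    """Illustrative severity score 128 in [0, 1]: high only when the
    approach time exceeds the first predetermined time t1 AND the
    horizon overlap exceeds the second predetermined time t2; an
    unintentional encounter is heavily discounted. t1, t2, and scale
    are assumed values."""
    intentional = approach_s >= t1 and overlap_s >= t2
    base = min(alert_s / scale, 1.0)   # longer alerts score higher
    return base if intentional else 0.25 * base
```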
For example, if two persons 204 are walking toward the same point (as indicated by their velocity vectors) for a long period of time, their horizons 126 overlap for a long period of time, and the duration of the proximity alert 124 is long, their distancing violation would be considered intentional. Accordingly, the distancing characteristics 112 associated with the proximity alert 124 would be assessed a high severity score 128.
This situation is demonstrated by trajectories 122a and 122b of FIG. 3. The paths of the trajectories 122a and 122b generate proximity alert 124a. Prior to the proximity alert 124a, the trajectories 122a, 122b are travelling towards each other for a significant period of time. Further, the horizons 126a and 126b associated with the trajectories 122a, 122b overlap immediately before the proximity alert 124a. Additionally, the length of the proximity alert 124a is quite long. Accordingly, the distancing violation corresponding to the proximity alert 124a is considered to be highly intentional, and is assigned a high severity score 128.
On the other hand, if the two persons 204 are not walking towards the same point for a period of time, their horizons 126 briefly overlap, and the duration of the proximity alert 124 is short, the distancing characteristics 112 would be assessed a low severity score 128.
This situation is demonstrated by trajectories 122b and 122c in FIG. 3. Prior to the occurrence of the proximity alert 124b, the trajectories 122b, 122c are mostly travelling towards different points, often parallel to each other. The trajectories 122b, 122c are not directed towards each other until immediately before the occurrence of the proximity alert 124b, indicative of unintentionality. Similarly, the horizons 126b, 126c do not overlap, partially due to object 202b preventing the two persons 204b, 204c from seeing each other. Accordingly, the distancing violation corresponding to the proximity alert 124b is considered to be unintentional, and is assigned a low severity score 128. In some implementations, the distancing characteristics 112 associated with a low severity score 128 indicative of unintentionality may be omitted from further analysis.
In addition to monitoring the distancing of persons 204 in the area 200, the system 100 also monitors their masking. The system 100 first uses the previously described trajectories 122 to determine a mouth region 138 for the monitored person 204 at a point in time. As the person 204 is presumed to be facing forward (i.e., in the direction of their velocity vector) while travelling, the mouth region 138 of the person 204 is just in front of the person 204 as they move along the trajectory 122. Thus, the location of the mouth region 138 can be determined based on the time data set 130, location data set 132, and velocity data set 134 of the trajectory 122.
The system 100 can then analyze a portion of a temperature data set 136 corresponding to the mouth region 138 to determine if the person 204 is masked. This temperature data set 136 can be captured by one or more MPT sensors 144. The temperature of the mouth region 138 of an unmasked person 204 will be significantly higher and/or more variable than the mouth region of a masked person 204.
The temperature data sets 136 corresponding to the mouth regions 138 can be analyzed by a machine learning algorithm to determine if the person is properly masked. In some examples, the machine learning algorithm could be a support vector machine or artificial neural network. The machine learning algorithms can be trained with two groups of data: (1) temperature data near the mouth when an individual is wearing a face mask; and (2) temperature data near the mouth when an individual is not wearing a face mask. The machine learning model can learn the decision boundary between the mask/no-mask scenarios and then perform a binary classification. Another class/category can also be added to detect instances where the face covering is not worn properly.
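As a toy stand-in for the trained classifier, the mask/no-mask decision could be sketched with fixed thresholds on the mouth-region temperature statistics: an unmasked mouth region reads hotter and more variable than a masked one. The cut-off values are assumptions for illustration only; the disclosure proposes a learned decision boundary (SVM or neural network) rather than hand-set thresholds.

```python
import statistics

def classify_masking(mouth_temps, masked_max_c=33.0, var_max=0.5):
    """Classify a list of mouth-region temperature readings 136 (C) as
    'masked' or 'unmasked'. Thresholds are illustrative assumptions."""
    mean_t = statistics.fmean(mouth_temps)
    var_t = statistics.pvariance(mouth_temps)
    # A mask suppresses both the mean temperature and its variability.
    if mean_t <= masked_max_c and var_t <= var_max:
        return "masked"
    return "unmasked"
```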
If temperature data 136 is unavailable, the system 100 can use data collected by RGB cameras 166 to capture images of the mouth regions 138 of the person 204. These images can be analyzed using various machine learning algorithms to identify the person 204 as masked or unmasked, as described above.
If the person 204 is determined to be unmasked or improperly masked, the system generates a masking alert 140. The system also determines an alert duration 142 representing the time period the person 204 is unmasked. The masking alert 140 and alert duration 142 are then used to generate a masking characteristic 114 indicative of a masking violation. The masking characteristic 114 includes a severity score 146 corresponding to the alert duration 142; the longer the alert duration 142, the higher the severity score 146. The severity score 146 may also increase if the person 204 is fully unmasked, as opposed to simply improperly masked. The system 100 may transmit the masking alert 140 and the alert duration 142 to a personal device 208 of the person 204 to remind or notify the respective person 204 of the masking violation.
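The relationship between alert duration 142 and severity score 146 could be sketched as below. The duration scale and the discount applied for an improperly (rather than fully un-) masked person are assumed values.

```python
def masking_severity(alert_duration_s, fully_unmasked,
                     scale_s=300.0, improper_factor=0.5):
    """Severity score 146 in [0, 1]: grows with alert duration 142 and
    is discounted when the person is improperly masked rather than
    fully unmasked. scale_s and improper_factor are assumptions."""
    base = min(alert_duration_s / scale_s, 1.0)
    return base if fully_unmasked else base * improper_factor
```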
FIG. 4 illustrates three persons 204a, 204b, 204c of various mask wearing states. As indicated by the white circle, person 204a is unmasked. As indicated by the black circle, person 204b is properly masked. As indicated by the circle with an interior square grid, person 204c is improperly masked. Thus, the temperature data 136a of mouth region 138a will be significantly higher or more variable than the temperature data 136b of mouth region 138b.
Once the system 100 has determined the distancing characteristics 112 and masking characteristics 114 associated with a unique identifier 110, the system can then determine a social behavior score 116. Broadly, the social behavior score 116 represents how well the person 204 corresponding to the unique identifier 110 has followed the social distancing rules of the area 200.
The social behavior score 116 can be calculated based on the aforementioned characteristics 112, 114, a set of pre-defined safety parameters 170, and a reinforcement learning algorithm 152. The safety parameters 170 are used to weigh the characteristics 112, 114 based on policies and features of the area. For instance, an outdoor area 200, such as a public park, may prefer to emphasize distancing over masking. On the other hand, a public transit system where distancing is not always possible may prefer to emphasize masking. The parameters 170 can incorporate a wide variety of factors, such as the demographics of persons 204 within the area 200. Based on these parameters 170, the system 100 generates weighted distancing characteristics 148 and weighted masking characteristics 150.
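A minimal sketch of the weighting step is given below. The specific weights (masking weighted more heavily, as in the indoor examples) and the linear penalty form are assumptions; in the full system the weights would come from the safety parameters 170 and be further adjusted by the reinforcement learning algorithm 152.

```python
def social_behavior_score(distancing_chars, masking_chars,
                          w_distancing=0.4, w_masking=0.6,
                          max_score=100.0):
    """Combine severity scores into a 0-100 social behavior score 116.
    distancing_chars / masking_chars are lists of severity scores in
    [0, 1] (characteristics 112 / 114); weights are assumed values
    standing in for the pre-defined safety parameters 170."""
    penalty = (w_distancing * sum(distancing_chars) +
               w_masking * sum(masking_chars))
    return max(0.0, max_score * (1.0 - penalty))
```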
The reinforcement learning algorithm 152 adjusts the social behavior score 116 based on factors external to the persons 204 being monitored, including area masking violations 154, area distancing violations 156, and public health data 158. The area masking violations 154 and area distancing violations 156 can include statistics and trends regarding past distancing and/or masking violations recorded in the area 200. For instance, if the history of masking violations 154 in the monitored area 200 indicates a recent increase in such violations, the masking characteristics 114 may be weighted more heavily than the distancing characteristics 112. Accordingly, the reinforcement learning algorithm 152 allows the system 100 to “learn” based on masking and distancing trends in the area 200. In a further example, public health data 158 showing recent outbreaks may lead to an overall decrease in social behavior scores 116 to encourage persons 204 of the area 200 to strictly follow social distancing guidelines.
In some examples, a regression based recurrent neural network algorithm can be used to generate social behavior scores 116 between 0 and 100. This algorithm can use weights learned from each of the rule violations in that area 200 and rule violations for that particular person 204 as input to generate a score 116. This model can initially be unsupervised. The model can later receive feedback from user surveys and site preferences. Rules can be added through a rule layer on top of this scoring mechanism.
Once the social behavior score 116 is generated, the system 100 transmits the social behavior score 116 and a corresponding unique identifier 110 to an incentive generation subsystem 300. As shown in FIG. 7, an example incentive generation subsystem 300 includes a control 302, a memory 375, a processor 385, and a transceiver 430. The transceiver 430 is used to communicate with the various personal devices 208a-208c of persons 204 within the area 200. The incentive generation subsystem 300 generates a reward 304 based on the received social behavior score 116. The reward 304 can be in a wide variety of forms depending on the area 200, such as coupons for grocery stores, discount codes for future events for stadiums, or even cryptocurrency. The value of the reward 304 can be proportional to the social behavior score 116, meaning the higher the social behavior score 116, the greater the reward. The reward 304, or a notification of the reward 304, can be provided to a personal device 208 of the person 204 corresponding to the unique identifier 110 in wireless communication with the incentive generation subsystem 300.
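The score-to-reward mapping could be sketched as a simple tier lookup. The cut-offs and reward names below are hypothetical placeholders for whatever catalogue the site operates (coupons, discount codes, cryptocurrency).

```python
def generate_reward(score, reward_tiers=None):
    """Map a social behavior score 116 (0-100) to a reward 304.
    Tier cut-offs and reward descriptions are illustrative assumptions;
    higher scores earn greater rewards, per the disclosure."""
    tiers = reward_tiers or [
        (90, "20% grocery coupon"),
        (75, "10% grocery coupon"),
        (50, "5% grocery coupon"),
    ]
    for cutoff, reward in tiers:   # tiers ordered best-first
        if score >= cutoff:
            return reward
    return None                    # score too low: no reward issued
```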
Generally, in another aspect, a method 500 for tracking and incentivizing social behavior is provided. The method includes capturing 502, via a plurality of sensors arranged with luminaires of a connected lighting system covering an area, a data set corresponding to the area. The method 500 further includes determining 504, using machine learning, whether an object in the area is a person. The method 500 further includes generating 506, based on the data set, a unique identifier for each object in the area determined to be a person. The method 500 further includes determining 508, based on the data set, one or more distancing characteristics corresponding to each unique identifier. The method 500 further includes determining 510, based on the data set, one or more masking characteristics corresponding to each unique identifier. The method 500 further includes generating 512, based on the one or more distancing characteristics, the one or more masking characteristics, and pre-defined safety parameters, one or more social behavior scores corresponding to each unique identifier. The method 500 further includes transmitting 514 the generated social behavior scores to an incentive generation subsystem.
All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms. The indefinite articles “a” and “an,” as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean “at least one.”
The phrase “and/or,” as used herein in the specification and in the claims, should be understood to mean “either or both” of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with “and/or” should be construed in the same fashion, i.e., “one or more” of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the “and/or” clause, whether related or unrelated to those elements specifically identified.
As used herein in the specification and in the claims, “or” should be understood to have the same meaning as “and/or” as defined above. For example, when separating items in a list, “or” or “and/or” shall be interpreted as being inclusive, i.e., the inclusion of at least one, but also including more than one, of a number or list of elements, and, optionally, additional unlisted items. Only terms clearly indicated to the contrary, such as “only one of” or “exactly one of,” or, when used in the claims, “consisting of,” will refer to the inclusion of exactly one element of a number or list of elements. In general, the term “or” as used herein shall only be interpreted as indicating exclusive alternatives (i.e., “one or the other but not both”) when preceded by terms of exclusivity, such as “either,” “one of,” “only one of,” or “exactly one of.”
As used herein in the specification and in the claims, the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified.
It should also be understood that, unless clearly indicated to the contrary, in any methods claimed herein that include more than one step or act, the order of the steps or acts of the method is not necessarily limited to the order in which the steps or acts of the method are recited.
In the claims, as well as in the specification above, all transitional phrases such as “comprising,” “including,” “carrying,” “having,” “containing,” “involving,” “holding,” “composed of,” and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases “consisting of” and “consisting essentially of” shall be closed or semi-closed transitional phrases, respectively.
The above-described examples of the described subject matter can be implemented in any of numerous ways. For example, some aspects may be implemented using hardware, software or a combination thereof. When any aspect is implemented at least in part in software, the software code can be executed on any suitable processor or collection of processors, whether provided in a single device or computer or distributed among multiple devices/computers.
The present disclosure may be implemented as a system, a method, and/or a computer program product at any possible technical detail level of integration. The computer program product may include a computer readable storage medium (or media) having computer readable program instructions thereon for causing a processor to carry out aspects of the present disclosure.
The computer readable storage medium can be a tangible device that can retain and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. A non- exhaustive list of more specific examples of the computer readable storage medium includes the following: a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), a static random access memory (SRAM), a portable compact disc read-only memory (CD-ROM), a digital versatile disk (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch-cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer readable storage medium, as used herein, is not to be construed as being transitory signals per se, such as radio waves or other freely propagating electromagnetic waves, electromagnetic waves propagating through a waveguide or other transmission media (e.g., light pulses passing through a fiber-optic cable), or electrical signals transmitted through a wire.
Computer readable program instructions described herein can be downloaded to respective computing/processing devices from a computer readable storage medium or to an external computer or external storage device via a network, for example, the Internet, a local area network, a wide area network and/or a wireless network. The network may comprise copper transmission cables, optical transmission fibers, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. A network adapter card or network interface in each computing/processing device receives computer readable program instructions from the network and forwards the computer readable program instructions for storage in a computer readable storage medium within the respective computing/processing device.
Computer readable program instructions for carrying out operations of the present disclosure may be assembler instructions, instruction-set-architecture (ISA) instructions, machine instructions, machine dependent instructions, microcode, firmware instructions, state-setting data, configuration data for integrated circuitry, or either source code or object code written in any combination of one or more programming languages, including an object oriented programming language such as Smalltalk, C++, or the like, and procedural programming languages, such as the “C” programming language or similar programming languages. The computer readable program instructions may execute entirely on the user’s computer, partly on the user's computer, as a stand-alone software package, partly on the user’s computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider). In some examples, electronic circuitry including, for example, programmable logic circuitry, field-programmable gate arrays (FPGA), or programmable logic arrays (PLA) may execute the computer readable program instructions by utilizing state information of the computer readable program instructions to personalize the electronic circuitry, in order to perform aspects of the present disclosure.
Aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to examples of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer readable program instructions.
The computer readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer readable program instructions may also be stored in a computer readable storage medium that can direct a computer, a programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer readable storage medium having instructions stored therein comprises an article of manufacture including instructions which implement aspects of the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other device to cause a series of operational steps to be performed on the computer, other programmable apparatus or other device to produce a computer implemented process, such that the instructions which execute on the computer, other programmable apparatus, or other device implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods, and computer program products according to various examples of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the blocks may occur out of the order noted in the Figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts or carry out combinations of special purpose hardware and computer instructions.
Other implementations are within the scope of the following claims and other claims to which the applicant may be entitled.
While various examples have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the function and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the examples described herein. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the teachings are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific examples described herein. It is, therefore, to be understood that the foregoing examples are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, examples may be practiced otherwise than as specifically described and claimed. Examples of the present disclosure are directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the scope of the present disclosure.

CLAIMS:
1. A connected lighting system (100) for tracking and incentivizing social behavior, the connected lighting system (100) comprising a controller (102) communicatively coupled to a plurality of sensors (104) arranged with luminaires (106) in an area (200), wherein the controller (102) is configured to: capture, via the plurality of sensors (104), a data set (108) corresponding to the area (200); determine, using machine learning (160), whether an object (202) in the area (200) is a person (204); generate, based on the data set (108), a unique identifier (110) for each object (202) in the area (200) determined to be a person (204); determine, based on the data set (108), one or more distancing characteristics (112) corresponding to each unique identifier (110); determine, based on the data set (108), one or more masking characteristics (114) corresponding to each unique identifier (110); generate, based on the one or more distancing characteristics (112), the one or more masking characteristics (114), and pre-defined safety parameters (170), one or more social behavior scores (116) corresponding to each unique identifier (110); and transmit the generated social behavior scores (116) to an incentive generation subsystem (300) having a controller (302) configured to generate, based on one of the one or more social behavior scores (116), a reward (304) corresponding to one of the unique identifiers (110).
2. The connected lighting system (100) of claim 1, wherein a value of the reward (304) is proportional to the one of the one or more social behavior scores (116).
3. The connected lighting system (100) of claim 1, wherein the controller (302) of the incentive generation subsystem (300) is further configured to notify the person (204) corresponding to the one of the one or more unique identifiers (110) of the reward via a personal device (208) in wireless communication with the controller (102).
4. The connected lighting system (100) of claim 1, wherein generating the unique identifiers (110) comprises: determining, based on the data set (108), a location (118) and a time stamp (120) for each object (202) in the area (200) determined to be a person (204); generating, based on the locations (118) and the time stamps (120), the unique identifiers (110) for each object (202) in the area (200) determined to be a person (204).
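By way of illustration only, the identifier generation of claim 4 could be sketched as follows. All function names, the coordinate format, and the use of a truncated SHA-256 hash are assumptions for illustration, not part of the claimed system; the point is that a location (118) plus a time stamp (120) suffices to derive a stable, pseudonymous identifier without storing biometric data.

```python
import hashlib

# Sketch of claim 4: derive a unique identifier (110) from the first
# detected location (118) and time stamp (120) of a tracked person.
def unique_identifier(location, timestamp):
    """location: (x, y) in area coordinates; timestamp: epoch seconds."""
    raw = f"{location[0]:.2f},{location[1]:.2f}@{timestamp}".encode()
    # Truncated hash: deterministic for the same detection, pseudonymous.
    return hashlib.sha256(raw).hexdigest()[:12]

uid = unique_identifier((3.5, 7.25), 1622635200)
print(uid)  # a 12-hex-character pseudonymous identifier
```

The same detection always yields the same identifier, so downstream distancing and masking characteristics can be keyed to it across the data set (108).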
5. The connected lighting system (100) of claim 1, wherein determining the one or more distancing characteristics (112) comprises: generating, based on the data set (108), a trajectory (122) corresponding to each unique identifier (110); generating, based on the trajectories (122), one or more proximity alerts (124), wherein each proximity alert (124) corresponds to at least two of the unique identifiers (110); generating, based on the trajectories (122), horizon data (126) corresponding to each unique identifier (110); and generating, based on the proximity alerts (124) and the horizon data (126), the one or more distancing characteristics (112).
6. The connected lighting system (100) of claim 5, wherein each of the one or more distancing characteristics (112) comprises a distancing characteristic severity score (128) based on the trajectories (122) and the horizon data (126).
7. The connected lighting system (100) of claim 5, wherein each of the trajectories (122) comprises a time data set (130), a location data set (132), and a velocity data set (134).
8. The connected lighting system (100) of claim 7, wherein the horizon data (126) is further generated based on the velocity data set (134).
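The distancing logic of claims 5-8 could be sketched as follows. The data shapes, the pairwise same-timestamp comparison, and the 2.0 m threshold are illustrative assumptions, not claim language: a trajectory (122) carries time, location, and velocity samples, and a proximity alert (124) involves at least two unique identifiers (110).

```python
from dataclasses import dataclass
from itertools import combinations
from math import hypot

@dataclass
class Trajectory:
    uid: str       # unique identifier (110)
    samples: list  # [(t, x, y, vx, vy), ...] — time, location, velocity

def proximity_alerts(trajectories, min_distance=2.0):
    """Return (t, uid_a, uid_b, distance) for each pairwise violation."""
    alerts = []
    for a, b in combinations(trajectories, 2):
        # Index one trajectory by timestamp for O(1) lookup per sample.
        b_by_t = {s[0]: s for s in b.samples}
        for t, x, y, vx, vy in a.samples:
            if t in b_by_t:
                _, bx, by, _, _ = b_by_t[t]
                d = hypot(x - bx, y - by)
                if d < min_distance:
                    alerts.append((t, a.uid, b.uid, d))
    return alerts

# Two people walking toward each other; they violate distancing at t=1.
ta = Trajectory("p1", [(0, 0.0, 0.0, 1.0, 0.0), (1, 1.0, 0.0, 1.0, 0.0)])
tb = Trajectory("p2", [(0, 5.0, 0.0, -1.0, 0.0), (1, 2.0, 0.0, -1.0, 0.0)])
print(proximity_alerts([ta, tb]))  # one alert at t=1, distance 1.0
```

Horizon data (126) would extend this by projecting the velocity components forward to flag *impending* violations; the sketch checks only observed positions.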
9. The connected lighting system (100) of claim 1, wherein determining the one or more masking characteristics (114) comprises: retrieving, from the plurality of sensors (104), a temperature data set (136); identifying, based on a trajectory (122) for each object (202) determined to be a person (204), one or more mouth regions (138), wherein the trajectory (122) comprises a time data set (130), a location data set (132), and a velocity data set (134); generating, based on the temperature data set (136) corresponding to the mouth regions (138), one or more masking alerts (140), wherein each masking alert (140) corresponds to one of the unique identifiers (110), and wherein each of the masking alerts (140) comprises an alert duration (142); and generating, based on the alert durations (142) of the masking alerts (140), the one or more masking characteristics (114).
10. The connected lighting system (100) of claim 9, wherein the temperature data set (136) is retrieved from one or more multipixel thermopile (MPT) sensors (144).
11. The connected lighting system (100) of claim 9, wherein each of the one or more masking characteristics (114) comprises a masking characteristic severity score (146) based on the alert durations (142).
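The masking logic of claims 9-11 could be sketched as follows. The 34.0 °C threshold, the run-length alert detection, and the duration-sum severity score are illustrative assumptions only: the premise is that an exposed mouth reads warmer on a multipixel thermopile (144) than a masked one, so a masking alert (140) with an alert duration (142) can be raised from mouth-region temperatures sampled along a person's trajectory.

```python
# Sketch of claims 9-11: raise a masking alert (140) for each contiguous
# run of mouth-region (138) temperature samples above a threshold.
def masking_alerts(mouth_temps, uid, threshold=34.0):
    """mouth_temps: [(t, temp_c), ...] sampled along the trajectory (122).
    Returns (uid, start_t, duration) for each contiguous violation run."""
    alerts = []
    start = prev_t = None
    for t, temp in mouth_temps:
        if temp > threshold:
            if start is None:
                start = t
            prev_t = t
        elif start is not None:
            alerts.append((uid, start, prev_t - start))
            start = None
    if start is not None:
        alerts.append((uid, start, prev_t - start))
    return alerts

def severity(alerts):
    """Masking characteristic severity score (146): total violation time."""
    return sum(duration for _, _, duration in alerts)

temps = [(0, 32.1), (1, 35.0), (2, 35.2), (3, 31.9), (4, 35.1)]
print(masking_alerts(temps, "p1"))  # [('p1', 1, 1), ('p1', 4, 0)]
print(severity(masking_alerts(temps, "p1")))  # 1
```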
12. The connected lighting system (100) of claim 1, wherein generating the social behavior scores (116) comprises: generating, based on the one or more distancing characteristics (112) and the pre-defined safety parameters (170), one or more weighted distancing characteristics (148); generating, based on the one or more masking characteristics (114) and the pre-defined safety parameters (170), one or more weighted masking characteristics (150); and generating the social behavior scores (116) based on the one or more weighted distancing characteristics (148), the one or more weighted masking characteristics (150), and a reinforcement learning algorithm (152).
13. The connected lighting system (100) of claim 12, wherein the reinforcement learning algorithm (152) is based on area masking violations (154), area distancing violations (156), and public health data (158).
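The scoring step of claims 12-13 could be sketched as a weighted combination of the two characteristic families. The fixed weights, the 100-point base, and the penalty form below are illustrative assumptions; in the claims the weights would instead be tuned by a reinforcement learning algorithm (152) from area masking violations (154), area distancing violations (156), and public health data (158).

```python
# Sketch of claims 12-13: combine weighted distancing (148) and masking
# (150) characteristics into one social behavior score (116) per person.
def social_behavior_score(distancing_severities, masking_severities,
                          w_dist=0.6, w_mask=0.4, base=100.0):
    """Higher score = better behavior; each severity deducts points.
    w_dist/w_mask stand in for pre-defined safety parameters (170)."""
    penalty = (w_dist * sum(distancing_severities)
               + w_mask * sum(masking_severities))
    return max(0.0, base - penalty)

# A person with two mild distancing violations and one masking violation:
print(social_behavior_score([5.0, 3.0], [10.0]))  # higher is better
```

A reward (304) per claim 2 could then simply be made proportional to this score.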
14. A method (500) for tracking and incentivizing social behavior, comprising: capturing (502), via a plurality of sensors arranged with luminaires of a connected lighting system covering an area, a data set corresponding to the area; determining (504), using machine learning, whether an object in the area is a person; generating (506), based on the data set, a unique identifier for each object in the area determined to be a person; determining (508), based on the data set, one or more distancing characteristics corresponding to each unique identifier; determining (510), based on the data set, one or more masking characteristics corresponding to each unique identifier; generating (512), based on the one or more distancing characteristics, the one or more masking characteristics, and pre-defined safety parameters, one or more social behavior scores corresponding to each unique identifier; and transmitting (514) the generated social behavior scores to an incentive generation subsystem that generates, based on one of the one or more social behavior scores, a reward corresponding to one of the unique identifiers.
PCT/EP2022/064600 2021-06-02 2022-05-30 SYSTEMS FOR INCENTIVIZING SOCIAL DISTANCING USING CONNECTED LIGHTING IoT INFRASTRUCTURE WO2022253750A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202163195838P 2021-06-02 2021-06-02
US63/195,838 2021-06-02
EP21179390.6 2021-06-15
EP21179390 2021-06-15

Publications (1)

Publication Number Publication Date
WO2022253750A1 true WO2022253750A1 (en) 2022-12-08

Family

ID=82218438

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2022/064600 WO2022253750A1 (en) 2021-06-02 2022-05-30 SYSTEMS FOR INCENTIVIZING SOCIAL DISTANCING USING CONNECTED LIGHTING IoT INFRASTRUCTURE

Country Status (1)

Country Link
WO (1) WO2022253750A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013188536A1 (en) * 2012-06-12 2013-12-19 Sensity Systems Inc. Lighting infrastructure and revenue model
CN111522073A (en) * 2020-04-26 2020-08-11 北京都是科技有限公司 Method for detecting mask wearing condition of target object and thermal infrared image processor
ES1251769U (en) * 2020-06-04 2020-08-26 Davo Ruben Hernandez SAFETY DISTANCE MARKING DEVICE (Machine-translation by Google Translate, not legally binding)
US20210134146A1 (en) * 2021-01-08 2021-05-06 Kundan Meshram Tracking and alerting traffic management system using iot for smart city

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CONG T NGUYEN ET AL: "Enabling and Emerging Technologies for Social Distancing: A Comprehensive Survey and Open Problems", ARXIV.ORG, CORNELL UNIVERSITY LIBRARY, 201 OLIN LIBRARY CORNELL UNIVERSITY ITHACA, NY 14853, 23 September 2020 (2020-09-23), XP081793919, DOI: 10.1109/ACCESS.2020.3018140 *
EATON: "LumenSafe - Lighting and surveillance solution", 31 March 2019 (2019-03-31), pages 1 - 12, XP055864420, Retrieved from the Internet <URL:https://www.cooperlighting.com/s/connectedlighting/assets/brochures/eaton-lumensafe-brochure.pdf> [retrieved on 20211123] *

Similar Documents

Publication Publication Date Title
Haque et al. Towards vision-based smart hospitals: a system for tracking and monitoring hand hygiene compliance
US11504069B2 (en) Method and apparatus to infer object and agent properties, activity capacities, behaviors, and intents from contact and pressure images
US11037067B2 (en) Apparatus and method for occupancy detection
US10943204B2 (en) Realtime video monitoring applied to reduce customer wait times
Petersen et al. Establishing an image-based ground truth for validation of sensor data-based room occupancy detection
US9811989B2 (en) Event detection system
CN101221621B (en) Method and system for warning a monitored user about adverse behaviors
JP2009519510A (en) Abnormal crowd behavior detection
Sun et al. Indoor occupancy measurement by the fusion of motion detection and static estimation
JP2015523753A (en) Track determination of anomalous objects using variational Bayesian expectation maximization based on Gaussian process
Bernasco et al. Promise into practice: Application of computer vision in empirical research on social distancing
CN111127066A (en) Mining application method and device based on user information
US20220125359A1 (en) Systems and methods for automated monitoring of human behavior
US20210271217A1 (en) Using Real Time Data For Facilities Control Systems
WO2022253750A1 (en) SYSTEMS FOR INCENTIVIZING SOCIAL DISTANCING USING CONNECTED LIGHTING IoT INFRASTRUCTURE
JP6650659B2 (en) Suspicious person detection device and program
Hou et al. A low-cost in-situ system for continuous multi-person fever screening
Ettehadieh Systematic parameter optimization and application of automated tracking in pedestrian-dominant situations
KR20240044162A (en) Hybrid unmanned store management platform based on self-supervised and multi-camera
Abdelwahab et al. Measuring “nigiwai” from pedestrian movement
KR20220154473A (en) System of peventing external intrusion using virtulal detection line in image
Singh et al. Covid-19 mask usage and social distancing in social media images: Large-scale deep learning analysis
Yadav et al. Social distancing detector using deep learning
Gandomkar et al. Method to classify elderly subjects as fallers and non‐fallers based on gait energy image
CN111723616B (en) Personnel correlation measurement method and device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
    Ref document number: 22733903; Country of ref document: EP; Kind code of ref document: A1
NENP Non-entry into the national phase
    Ref country code: DE
122 Ep: pct application non-entry in european phase
    Ref document number: 22733903; Country of ref document: EP; Kind code of ref document: A1