WO2024015238A1 - Smart target co-witnessing hit attribution system and method - Google Patents
- Publication number: WO2024015238A1
- Application: PCT/US2023/026913
- Authority: WIPO (PCT)
- Prior art keywords
- projectile
- target
- impact
- firing
- reality
Classifications
- A — HUMAN NECESSITIES; A63 — SPORTS; GAMES; AMUSEMENTS; A63F — CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR; A63F13/00 — Video games, i.e. games using an electronically generated display having two or more dimensions:
- A63F13/212 — Input arrangements characterised by their sensors, purposes or types, using sensors worn by the player, e.g. for measuring heart beat or leg activity
- A63F13/213 — Input arrangements comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
- A63F13/217 — Input arrangements using environment-related information, i.e. information generated otherwise than by the player, e.g. ambient temperature or humidity
- A63F13/245 — Constructional details, e.g. game controllers with detachable joystick handles, specially adapted to a particular type of game, e.g. steering wheels
- A63F13/837 — Special adaptations for executing a specific game genre or game mode: shooting of targets
- F — MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING; F41 — WEAPONS; F41J — TARGETS; TARGET RANGES; BULLET CATCHERS; F41J5/00 — Target indicating systems; target-hit or score detecting systems:
- F41J5/06 — Acoustic hit-indicating systems, i.e. detecting of shock waves
- F41J5/08 — Infrared hit-indicating systems
Definitions
- the present invention relates generally to projectile-firing gaming systems and more particularly to a smart target co-witnessing hit attribution system and method.
- the present invention includes “video gamifying” a real life paintball/gel blaster/combat experience using augmented reality.
- the present invention in a preferred embodiment includes a system of wirelessly interconnected blasters, targets, wearables, and “battlefield” objects that allow players to engage in precisely tracked and managed projectile sports and games, resulting in an experience of “Video Games Outside”.
- This system contrasts with projectile-only sports such as airsoft and paintball, which provide tactile feedback but lack automatic score tracking and other digital features. It also contrasts with laser tag and other non-projectile-firing sports, which may provide real-time access and control over scoring, game modes, weapon features, etc., but lack tactile feedback and have to make use of non-intuitive and mostly invisible light physics.
- the present invention includes a system utilizing visible projectile physics and methodology to keep track of score and other gameplay features.
- the present invention includes digitizing an outdoor/indoor blaster gameplay experience thereby providing a mixed reality video game experience through the use of augmented reality (AR), virtual reality (VR), and/or extended-reality (XR) eyewear (goggles, glasses, helmet, etc.) while also including blasters that shoot or fire actual projectiles.
- the projectiles cause the human target sufficient sensation to make the experience engaging.
- the use of the blaster shown in the accompanying drawings is only exemplary and not limiting. Any type of blaster, gun, etc., that shoots projectiles (e.g., projectile-firing device) is within the scope of the present invention.
- the AR/VR/XR eyewear and/or other wearable items such as vests, pants, belts, shoes, uniforms, jerseys, suits, and the like and the connected system (including all software, programming, architecture, sensors, wireless connectivity, networking, tracking codes, trackers, etc.) allows the participants to integrate video game type elements and graphics (e.g., having a life bar, different types of ammunition, levels, etc.) within a blaster game, experience or training.
- the present invention includes a vest/jersey or other wearable that can sense or determine when a player is hit or struck by a projectile (in an embodiment, at least within certain areas on the outer surface of the wearable).
- the wearable and connected system can sense or log a hit from the front or the back.
- the wearable can log or determine the strength, level or intensity of the impact of the projectile.
- the application, or a combination of the wearable and the application, can log or determine the strength, level or intensity of the impact of the projectile. Therefore, when a player (or the wearable) is struck from a closer range, the player loses more health points than when struck from a farther range.
- the system can determine which other participant or player shot the player that is hit. This may be done using a location and direction solution in each blaster, e.g., ultra-wide band (UWB) sensors, magnetometer, GPS, WiFi, etc., combined with knowing who is shooting at any given time (so the system can know or determine the approximate time delay between a player firing and when the wearable is hit).
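A minimal sketch of this timing-based attribution, assuming each blaster reports its firing time and position (e.g., from UWB/GPS) and that a nominal projectile speed is known. The function name, tolerance, and the 130 FPS constant are illustrative assumptions, not taken from the patent:

```python
import math

# Hypothetical sketch: attribute a hit to the firing event whose expected
# projectile flight time best explains the observed firing-to-impact delay.
NOMINAL_SPEED_FPS = 130.0  # assumed; midpoint of the 90-170 FPS range described later

def attribute_hit(hit_time, hit_pos, firing_events, tolerance_s=0.25):
    """firing_events: list of (player_id, fire_time, (x, y)) tuples in feet/seconds."""
    best, best_error = None, tolerance_s
    for player_id, fire_time, fire_pos in firing_events:
        expected_delay = math.dist(fire_pos, hit_pos) / NOMINAL_SPEED_FPS
        error = abs((hit_time - fire_time) - expected_delay)
        if hit_time >= fire_time and error < best_error:
            best, best_error = player_id, error
    return best  # None if no firing event plausibly explains the hit

# Example: a hit 0.31 s after player "A" fired from 40 ft away.
shots = [("A", 10.00, (0.0, 0.0)), ("B", 10.05, (90.0, 0.0))]
print(attribute_hit(10.31, (40.0, 0.0), shots))  # -> "A"
```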
- the wearable may include buttons or other switches that when pushed or switched cause actions to be taken within the game whether in the AR/VR/XR/virtual environment or in the physical world. For example, pushing a button or touching a predetermined location on the wearable or on a strap or other worn item may cause the player's health points or level to increase after they were previously earned. Any gaming event can be associated with the switch, button or the like.
- a software application (“app” or “application”) is configured to be usable by all players or participants.
- the app can provide many different features, including app/game-controlled rate of fire on the blaster.
- the rate of fire feature (and any other game or invention related feature described herein) can be changeable or upgradeable based on gameplay and other predetermined features. For example, if the player visits a predetermined location and finds an upgraded weapon or the like, or plays the game for a predetermined length of time, the player may earn a higher rate of fire. If a player has a better or higher level weapon in the game, the player has the ability to cause more damage faster to other players wearing the system connected wearables.
- the system may include a system-on-chip (SOC) or system-on-module in or on a module connectable to the blaster that connects to the app wirelessly (e.g., via Bluetooth).
- the app and system can then control the rate of fire of the blaster.
- This feature allows for a single blaster to represent many different potential digital weapons in the “game.” For example, a starter weapon might only allow for one shot per second in semi-auto mode and require a four second reload time after ten shots, while the top-end weapon in the game allows for ten shots per second in fully automatic mode and does not impose any “digital reload/time requirements.” As a result, a player would only need to physically reload the blaster every time the physical round hopper is emptied.
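A rough illustration of these app-controlled fire profiles: one physical blaster impersonates both the starter and top-end digital weapons via rate and digital-reload limits enforced in software. The class and field names below are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class FireProfile:
    name: str
    shots_per_second: float
    shots_per_magazine: int  # 0 means no digital reload requirement
    reload_seconds: float

STARTER = FireProfile("starter", 1.0, 10, 4.0)   # 1 shot/s, 4 s reload per 10
TOP_END = FireProfile("top_end", 10.0, 0, 0.0)   # full auto, no digital reload

class DigitalWeapon:
    def __init__(self, profile: FireProfile):
        self.profile = profile
        self.last_shot = float("-inf")
        self.shots_in_mag = 0
        self.reload_until = 0.0

    def try_fire(self, now: float) -> bool:
        """Return True if the blaster may fire at time `now` (seconds)."""
        if now < self.reload_until:
            return False  # still in a digital reload
        if now - self.last_shot < 1.0 / self.profile.shots_per_second:
            return False  # rate-of-fire cap
        self.last_shot = now
        self.shots_in_mag += 1
        if self.profile.shots_per_magazine and \
           self.shots_in_mag >= self.profile.shots_per_magazine:
            self.reload_until = now + self.profile.reload_seconds
            self.shots_in_mag = 0
        return True
```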
- the system may allow the ability to download different environments (e.g., AR/VR/XR environments) that can be used during gameplay.
- the environments can include jungles, old west, ships, snow, cities, historic, world monuments or movie, comic book and other related themes (e.g., Star Wars, Marvel, Harry Potter, etc.).
- Any type of AR/VR/XR system or eyewear is within the scope of the present invention.
- the system may connect or combine with Apple’s (or others') AR glasses.
- AR/VR/XR eyewear and the connection to the system, app, etc. may provide for abilities or features for players or participants.
- the eyewear may provide the ability to see where your teammates are through walls, around buildings, etc.
- the eyewear may provide the ability to see the health being lost by another participant as you shoot them, and/or see your own health stats at all times or as you are hit.
- Motion tracking or sensing components, sensors or reflectors on the blaster or other items allow the game to replace the look of the physical blaster or other item in the player's hand.
- the gun will graphically change on the screen (e.g., similar to a first person shooter game).
- the virtual gun shown in AR/VR/XR on the eyewear can change from a pistol to a machine gun when the player upgrades to that gun.
- the eyewear preferably provides the ability to see and go get supply drops (e.g., weapons, health kit, shield, potions, etc.) in the physical world (e.g., around the yard, playing area, arena, etc.).
- Accelerometers, gyroscopes or other motion sensing devices can be included on or within the blaster or other physical item to track movement that can be digitized and displayed on the AR/VR/XR eyewear or on the app.
- the battlefield environment provides inflatable or other barricades that are associated with the game (and may be wirelessly connected within the game or gaming session) and that look like other objects in the game that a player can hide behind.
- an inflatable wall may look like a rock wall in AR/VR/XR.
- QR codes can be used on the barricades and any other object within the game for tracking purposes.
- the eyewear preferably shows a different “skin” or outfit in the game based on your team, your personal preferences, what skins you own or have earned, etc.
- the wearables may include QR codes or other motion trackers for improved tracking performance.
- the AR/VR/XR eyewear provides the ability to find virtual shields, first aid kits, better weapons, ammo, treasures, extra lives, etc. at predetermined locations (e.g., laying on the ground, on a barricade or other object). For example, the player sees an object, such as an ammunition magazine in the eyewear at a distance and has to move to the predetermined location to obtain the magazine. Once the system determines the player is at the proper location (e.g., via UWB, magnetometer, GPS, WiFi, etc.), the player obtains the extra ammunition for use in the game.
- the system preferably provides the ability to create or have a shield around the player so that another player cannot shoot them (the other player's gun may not be able to fire). This can be done via location services and magnetometer data to know someone is pointing at a shielded area and then prevent their blaster from firing.
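A hedged sketch of this shield lockout: given the shooter's location and magnetometer-derived heading, the blaster is prevented from firing into a cone around any shielded player. The cone width, range, and function names are assumed values for illustration:

```python
import math

def bearing_deg(from_pos, to_pos):
    """Compass-style bearing (degrees) from one (x, y) position to another."""
    dx, dy = to_pos[0] - from_pos[0], to_pos[1] - from_pos[1]
    return math.degrees(math.atan2(dy, dx)) % 360.0

def may_fire(shooter_pos, shooter_heading_deg, shielded_players,
             cone_half_angle_deg=5.0, max_range_ft=100.0):
    """Return False if the blaster points into any active shield."""
    for pos in shielded_players:
        if math.dist(shooter_pos, pos) > max_range_ft:
            continue  # shielded player too far away to matter
        off_axis = abs((bearing_deg(shooter_pos, pos)
                        - shooter_heading_deg + 180.0) % 360.0 - 180.0)
        if off_axis <= cone_half_angle_deg:
            return False
    return True

# Shielded player due east of the shooter; aiming east is blocked.
print(may_fire((0, 0), 0.0, [(50.0, 0.0)]))   # False
print(may_fire((0, 0), 90.0, [(50.0, 0.0)]))  # True
```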
- the system can also be used with a mobile device, such as an iPhone, on and/or connected to the blaster. This can be a version of the system that does not include AR/VR/XR eyewear; many features described herein can be done without the eyewear.
- Any or all of the features described herein can be used together to provide a software application experience without the use of AR/VR/XR eyewear.
- the system may include an accompanying PC/Mobile/Console game.
- the applicant may partner with a game like Fortnite or one or more video games can be created or developed that accompany or can be used with the system.
- Any of the items described herein can be sold or provided as a kit or unit.
- a gameplay kit can include a number of blasters, inflatable bases and/or barricades, grenades, ammunition, magazines, AR/VR/XR eyewear, and a downloadable app and/or game that are all sold as a unit.
- Modules that include components that allow connectivity to or within the system can also be sold or provided. The modules can connect or otherwise attach to items such as blasters.
- a module can be added to a blaster and a mobile device with software (e.g., a phone with an app) can be secured to the module such that the blaster can be used within the system.
- the blaster can include the software and/or app and the necessary connectivity, etc. built therein.
- Players may earn experience points (XP) in the system and game by playing outside and entering physical world tournaments.
- XP can be earned based on length of playing time or based on points earned on better or worse game play.
- An embodiment can include remotely controlled blasters that are included at each team's base and that can be controlled by someone in a different location via the game/system/app (camera and computer controlled firing versus the opposing team).
- An embodiment can include players remotely piloting drones that shoot projectiles (e.g., Gel Blaster cannons) or other shooting device and that can attack the other team remotely utilizing a computer or mobile device.
- the system can include features, components or elements for collecting and/or monitoring health data.
- the system can connect or otherwise be associated with Apple HealthKit, Google Health or other systems to track calories burned while playing, monitor heart rate or determine other health data.
- the calories burned and other health data can be used for gamification.
- the system can include daily, weekly and/or monthly contests for longest/highest game play and calories burned to earn XP. The digitization of the game play experience allows for community tournaments, neighborhood tournaments, teams, monthly contests, leaderboards, etc.
- the system includes the ability to broadcast any of the gameplay via cameras mounted on the players or their equipment (e.g., the eyewear) or by connecting to the feed that is broadcast on any or all of the player's AR/VR/XR eyewear or the app.
- This allows games to be broadcast or streamed on television, computers, Twitch, YouTube, etc., thereby allowing for a sport, television show, channel, etc., to be based around gameplay, tournaments, etc.
- the system can also include remotely operated explosives (e.g., a C4 looking Gel Ball bomb) and inflatable smart forts for each team.
- the forts or other location/structure may include the ability for the other team to “defeat” or finish the fort.
- the fort can include sensors or the like so that when the other team shoots a predetermined location or area on the fort (e.g., a “death star hole”) enough times, it will cause the inflatable fort/base to deflate.
- the system may include built in speed loading of gellets or other projectiles.
- the system can include the ability to digitally dispense or load ammunition.
- a smart ammo dispenser may be based on predetermined parameters within the gameplay. For example, a game may be set where players can only refill a certain number of times during the length of the game, and after they have reached the limit the system prevents the player from reloading (e.g., locking the canister, stopping the blaster from working, etc.). The right to refill may be earned, and the blaster may begin working again.
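A minimal sketch of this refill-limit rule, with illustrative names; the actual locking mechanism (canister lock, blaster disable) is electromechanical and not modeled here:

```python
class SmartDispenser:
    """Hypothetical per-game refill accounting for a smart ammo dispenser."""
    def __init__(self, refill_limit: int):
        self.refill_limit = refill_limit
        self.refills_used = 0

    def request_refill(self) -> bool:
        if self.refills_used >= self.refill_limit:
            return False  # canister stays locked until a refill is earned
        self.refills_used += 1
        return True

    def earn_refill(self):
        self.refill_limit += 1  # e.g., awarded through gameplay
```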
- the module or other digitally related components of the present invention may be connectable to the power of the blaster.
- the blaster may include leads or a jack that provides power to the external item (such as a module).
- the wearable can also include infrared (IR) reflectivity that works with or is operatively connected to an IR reader on the gun so that the system can sense which player is shooting another player and to provide other capabilities.
- the system can also provide connection so that other powered accessories (e.g., a projectile velocity changer) on the blaster or the barrel thereof can be operated. For example, if a player earns a higher rate of fire or a higher velocity during gameplay, the system can provide instructions (via connectivity) to move or change something mechanically or physically on the gun (e.g., slide a sleeve) to provide the higher velocity projectile firing.
- the system may include projectile firing devices, such as sniper rifles, that only work in certain areas or predetermined locations in the field of play. For example, the player may earn a "large sniper rifle" within the game, but the rifle can only be used from a designated location (e.g., a certain hill). In use, based on the system sensing the location of the player the rifle cannot be fired unless the player is in a predetermined location (e.g., on the hill).
- the present invention allows or provides for the incorporation of elements of arena survival video games into a mixed- reality AR/VR/XR environment.
- the system includes guns/blasters that include the capability of connecting to a mobile device that includes an app thereon.
- the system includes the ability for players or users to “level up” based on outside playing time, etc. Leveling up can include actions such as the ability to obtain better equipment (e.g., guns, ammunition, etc.), and other aspects desired by the player. This can also utilize or connect to exercise related apps or devices (e.g., Apple Watch, Fitbit, etc.).
- the system includes wearables that have sensors or the like thereon that can determine when the wearable is struck. This may include capacitive indicator(s) on the wearable and communication/connectivity to a processor that provides feedback on AR eyewear/glasses or the equivalent.
- the vest or other armor can also provide sensing of the proximity of a shooter based on impact of the projectile.
- the blaster includes one or more holes or openings that are auto-adjusted to modify the feet-per-second of the gel, and communication/connectivity to the AR interface for different types and levels of blasters.
- the system includes Bluetooth or other wireless connectivity between elements of the system that are not physically connected (i.e., vest, glasses, blaster, etc.). Any of the elements or components of the system described herein can be wirelessly connected (or wired). Any of the physical items, e.g., barricades, magazines, forts, bases, wearables, blasters, eyewear and the like can include connectivity to the system.
- the system includes location or other proximity based features (e.g., so a player can see the location of teammates, or for providing higher assessed damage based on how close the shooter is to the target or the type of blaster).
- the system includes a heads-up-display in AR glasses with mixed reality to accommodate data from vest/blaster and other connected components while engaging with the real, physical environment simultaneously. It will be appreciated that the system can also be utilized for applications in law enforcement, military, training, hunting and other shooting or physical scenarios.
- the present invention generally includes a smart target co-witnessing hit attribution system and method.
- the present invention includes providing accurate counting of proj ectile hits on a smart target integrated with an application to keep track of successful hits in order to provide a more rich, immersive, and accurate gaming or shooting experience.
- the smart target system is a stationary, modular device that uses piezoelectric sensors to detect and positively identify impact strikes from gellets (or other projectiles) while filtering out other types of impacts.
- the smart target includes wireless networking technology to communicate over a mesh network using a particular communications protocol to effectively keep real-time or near-real-time track of successful hits on the smart target.
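A hedged sketch of hit reporting over such a mesh: each node rebroadcasts messages it has not yet seen, so a target's hit report can reach the gaming application even without a direct link. This flooding-with-deduplication scheme and all names are illustrative stand-ins for the patent's actual protocol (see FIGS. 14A-17C):

```python
class MeshNode:
    """Toy mesh node that floods unseen messages to its neighbors."""
    def __init__(self, name):
        self.name = name
        self.neighbors = []
        self.seen = set()
        self.delivered = []

    def receive(self, msg_id, payload):
        if msg_id in self.seen:
            return  # already relayed; avoids broadcast loops
        self.seen.add(msg_id)
        self.delivered.append(payload)
        for n in self.neighbors:
            n.receive(msg_id, payload)

# target -> relay -> hub: the hub still hears the hit report.
target, relay, hub = MeshNode("target"), MeshNode("relay"), MeshNode("hub")
target.neighbors = [relay]
relay.neighbors = [target, hub]
target.receive(1, {"event": "hit", "target": "T1"})
print(hub.delivered)  # [{'event': 'hit', 'target': 'T1'}]
```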
- the smart target is a gameplay element in a projectile-firing gaming system that may be affixed, embedded, or modular with wearables, shields, drones, turrets, or other physical objects. Anything that is a physical object within a gameplay environment that a projectile is intended to impact may include a smart target.
- the smart target preferably includes a piezoelectric (PE) sensor configured to determine an impact of a particular projectile upon (or near) the smart target.
- a PE sensor is an electromechanical device that uses the piezoelectric effect to measure changes in pressure, strain, force or other physical quantities by converting them to an electrical charge. In smart target parlance, the PE sensor is utilized for hit detection.
- a smart target that includes a housing for receiving at least two sensors configured to co-witness an impact of an object at or near a predetermined distance from a predetermined location, a first sensor of the at least two sensors configured to detect an impact of an object at or near a predetermined distance from a predetermined location and generate a first sensor signal corresponding to the detected impact, a second sensor of the at least two sensors configured to receive information relating to the detected impact and generate a second sensor signal corresponding to the detected impact, and control circuitry configured to receive the first sensor signal and the second sensor signal, confirm whether the first sensor and the second sensor signals have co-witnessed the same impact of the object by determining whether they were received within a predetermined period of time, and provide a target state update via a network in accordance with at least one protocol.
- the control circuitry may be configured to confirm, from the first sensor signal, whether the object includes a projectile based on a trained model configured to recognize impacts of projectiles, and generate projectile impact data when the object is confirmed to be a projectile, wherein the target state update includes the projectile impact data.
- the second sensor signal may include one of an infrared signal from an infrared sensor, a brain-computer interface signal from an electroencephalogram sensor, a computer-vision signal from a computer-vision system, a radio-frequency signal from a radio frequency sensor, or a fabric-deformation signal from a smart textile sensor.
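A minimal sketch of the co-witnessing rule itself, assuming a 50 ms window (the patent leaves this as a predetermined period of time); all names are illustrative:

```python
CO_WITNESS_WINDOW_S = 0.050  # 50 ms; assumed value for the predetermined window

def co_witnessed(first_sensor_time_s: float, second_sensor_time_s: float,
                 window_s: float = CO_WITNESS_WINDOW_S) -> bool:
    """True when both sensor signals plausibly observed the same impact."""
    return abs(first_sensor_time_s - second_sensor_time_s) <= window_s

# A piezo strike at t=1.000 s and an IR detection at t=1.020 s co-witness
# the same impact; an IR detection at t=1.300 s does not.
print(co_witnessed(1.000, 1.020))  # True
print(co_witnessed(1.000, 1.300))  # False
```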
- a smart target that includes a first target location, a first piezoelectric sensor configured to detect an impact of an object on or within a predetermined distance from the first target location and generate a first piezoelectric sensor signal corresponding to the detected impact, a first infrared sensor configured to detect infrared radiation at a predetermined wavelength and generate a first infrared sensor signal, control circuitry configured to confirm a successful projectile impact on the first target location when the object is determined to be a projectile by comparing the first piezoelectric sensor signal with a piezoelectric impact signal model trained to detect impact of the projectile and when an impact time of the object and a detection time of the infrared radiation each occurred within a first predetermined period of time, generate first target location state data indicating that the first target location was successfully impacted by the projectile, and transmit the first target location state data via a network in accordance with at least one protocol.
- the first piezoelectric sensor signal may be filtered before it is compared with the piezoelectric impact signal model.
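A hedged sketch of this filter-then-compare step: the raw piezoelectric trace is smoothed and scored against a stored impact template by normalized cross-correlation. The patent's trained model may differ substantially; the smoothing width, threshold, and names here are assumptions:

```python
import numpy as np

MATCH_THRESHOLD = 0.8  # assumed similarity cutoff

def smooth(signal: np.ndarray, width: int = 5) -> np.ndarray:
    """Simple moving-average filter applied before classification."""
    return np.convolve(signal, np.ones(width) / width, mode="same")

def is_gellet_impact(raw: np.ndarray, template: np.ndarray) -> bool:
    """Score a filtered trace against an impact template (len(raw) >= len(template))."""
    filtered = smooth(raw)
    f = (filtered - filtered.mean()) / (filtered.std() + 1e-9)
    t = (template - template.mean()) / (template.std() + 1e-9)
    # Peak normalized cross-correlation as a crude similarity score.
    score = np.max(np.correlate(f, t, mode="valid")) / len(t)
    return score >= MATCH_THRESHOLD
```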
- the first infrared sensor signal may include an angle of detection and an intensity of infrared radiation.
- the first piezoelectric sensor may be further configured to detect the impact of the object on or within a predetermined distance from a second target location.
- the smart target may further include a second target location, a second piezoelectric sensor configured to detect an impact of an object on or within a predetermined distance from the second target location and generate a second piezoelectric sensor signal corresponding to the detected impact, and a second infrared sensor configured to detect infrared radiation at a predetermined wavelength and generate a second infrared sensor signal.
- the control circuitry may be further configured to confirm a successful projectile impact on the second target location when the object is determined to be a projectile by comparing the second piezoelectric sensor signal with the piezoelectric impact signal model trained to detect impact of the projectile and when an impact time of the object and a detection time of the infrared radiation each occurred within the first predetermined period of time, generate second target location state data indicating that the second target location was successfully impacted by the projectile, and transmit the second target location state data via a network in accordance with at least one protocol.
- the smart target may include a frame configured to receive the first target location.
- the first target location may be located on a wearable device.
- the first target location state data may be configured to be received by an extended-reality gaming application configured to control the projectile-firing device and the wearable device in accordance with a gaming session having a gameplay configuration for providing an extended-reality environment, receive projectile-firing device data from the projectile-firing device, the projectile-firing device data comprising an orientation and location of the projectile-firing device and a time at which the projectile-firing device was fired, update gaming metric data to indicate a successful impact of the fired projectile on the first target location and attribute a successful hit to the projectile-firing device on the first target location when the time the impact occurred on the first target location and the time the projectile-firing device was fired occur within a second predetermined period of time.
- the gaming metric data may be updated to indicate the successful impact of the fired projectile on the first target location and the successful hit is attributed to the projectile-firing device on the first target location when the location of the projectile-firing device is within a predetermined range from the first target location and the orientation of the projectile-firing device is aligned at a predetermined area associated with the first target location.
- the updated gaming metric data may be configured to be displayed to a user.
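Combining the conditions above, a hypothetical composite check: the firing and impact times must agree within a window, the blaster must be in range, and its orientation must align with the target area. All thresholds and names are assumed for the sketch:

```python
import math

def attribute(fire_time, impact_time, blaster_pos, blaster_heading_deg,
              target_pos, window_s=0.5, max_range_ft=120.0, align_deg=7.5):
    """True when a hit on target_pos can be attributed to this blaster."""
    if not (0.0 <= impact_time - fire_time <= window_s):
        return False  # firing and impact times do not agree
    if math.dist(blaster_pos, target_pos) > max_range_ft:
        return False  # blaster out of the predetermined range
    bearing = math.degrees(math.atan2(target_pos[1] - blaster_pos[1],
                                      target_pos[0] - blaster_pos[0])) % 360
    off = abs((bearing - blaster_heading_deg + 180) % 360 - 180)
    return off <= align_deg  # orientation aligned with the target area
```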
- a smart target co-witnessing hit attribution system that includes a network configured to provide communication and control connectivity in accordance with at least one protocol, a projectile-firing device comprising a projectile repository configured to carry projectiles for firing and an infrared emitter configured to emit infrared radiation at a predetermined wavelength and beamwidth, a smart target that includes a piezoelectric sensor configured to detect an impact of an object at or near a predetermined distance from a predetermined location relative to the smart target and generate a piezoelectric sensor signal corresponding to the detected impact, an infrared sensor configured to detect infrared radiation at a predetermined wavelength and generate an infrared sensor signal, and control circuitry configured to confirm a successful projectile impact on the smart target when the object is determined to be a projectile by comparing the piezoelectric signal with a predetermined piezoelectric impact signal model trained to detect impact of the projectile and when the impact of the object and the detection of the infrared radiation each occurred within a predetermined period of time, generate target state data indicating that the smart target was successfully impacted by the projectile, and transmit the target state data via the network in accordance with the at least one protocol.
- the gaming metric data may be updated to indicate the successful impact of the fired projectile on the smart target and the successful hit is attributed to the projectile-firing device on the smart target when the location of the projectile-firing device is within a predetermined range from the smart target and the orientation of the projectile-firing device is aligned at a predetermined area associated with the smart target.
- the network may be a target mesh network configured to transmit messages including the target state data to the extended-reality gaming application.
- the at least one protocol may be configured to effectuate transmission and reception of data to nodes within the network using hexadecimal values.
- the hexadecimal values may specify data concerning a status of the target.
- the hexadecimal values specify data concerning a status of a user.
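A speculative encoding sketch in the spirit of this hexadecimal message protocol; the actual field layout and values are defined in the tables of FIGS. 18 and 20 and are not reproduced here, so the message types below are hypothetical:

```python
TARGET_STATUS = 0x01   # hypothetical message type: status of a target
USER_STATUS = 0x02     # hypothetical message type: status of a user

def encode(msg_type: int, status: int) -> bytes:
    """Pack a one-byte type and one-byte status into a frame."""
    return bytes([msg_type, status])

def decode(frame: bytes) -> dict:
    return {"type": frame[0], "status": frame[1]}

frame = encode(TARGET_STATUS, 0x05)
print(frame.hex())    # "0105"
print(decode(frame))  # {'type': 1, 'status': 5}
```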
- the present invention includes a smart target co-witnessing hit attribution method. A first sensor signal is received corresponding to detection of an impact of an object at or near a predetermined distance from a predetermined location relative to a first smart target location; a second sensor signal relating to the detected impact is received; the first sensor signal and the second sensor signal are determined to be received within a first predetermined period of time; and a target state update is provided via a network in accordance with at least one protocol.
- the object may be confirmed to comprise a projectile based on a trained model configured to recognize impacts of projectiles, and projectile impact data is generated, wherein the target state update includes the projectile impact data.
- the method may further include wherein the target state update is received, projectile-firing device data from a projectile-firing device is received, the projectile-firing device data comprising an orientation and location of the projectile-firing device and a time at which the projectile-firing device was fired, and gaming metric data is updated to indicate a successful impact of the fired projectile on the first smart target location and attribute a successful hit to the projectile-firing device on the first smart target location when the time the impact occurred on or near the first smart target location and the time the projectile-firing device was fired occur within a second predetermined period of time.
- the gaming metric data may be updated to indicate the successful impact of the fired projectile on the first smart target location and the successful hit is attributed to the projectile-firing device on the first smart target location when the location of the projectile-firing device is within a predetermined range from the smart target and the orientation of the projectile-firing device is aligned at a predetermined area associated with the first smart target location.
- an extended- reality projectile-firing gaming system includes a projectile-firing device comprising a projectile repository, a battlefield object, a network configured to provide communication and control connectivity in accordance with at least one protocol to the projectile-firing device and the battlefield object, an extended-reality gaming application, and an extended-reality device.
- the extended-reality gaming application is configured to control the projectile-firing device and the battlefield object in accordance with a gaming session having a gameplay configuration for providing an extended-reality environment, receive projectile-firing device data from the projectile-firing device, the projectile-firing device data indicating that a projectile was fired, and receive battlefield object data from the battlefield object, the battlefield object data indicating that an impact occurred on the battlefield object.
- the extended-reality device includes at least one display and is configured to provide computer-aided vision of real and virtual objects and a computer- aided interface for interaction with the real and virtual objects, the computer-aided interface configured to receive commands or instructions associated with the real and virtual objects and corresponding to the gameplay configuration.
- the computer-aided interface may include voice-activation control configured to select one or more of the real and virtual objects and to provide the commands or instructions.
- the computer-aided interface may include an eye sensor configured to monitor eye contact with the real and virtual objects.
- the commands or instructions may be selectable using a control input when the eye contact is aligned with one of the real and virtual objects.
- the eye sensor may identify a selection of the commands or instructions associated with the real and virtual objects when the eye contact is fixed on one of the real and virtual objects for at least a first predetermined time period.
- the eye sensor may identify a selection of the commands or instructions associated with the real and virtual objects when the eye contact is fixed on one of the real and virtual objects for at least a first predetermined time period but does not exceed a second predetermined time period.
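A minimal sketch of this dwell-based selection rule, with assumed values for the two predetermined time periods:

```python
T1_S, T2_S = 0.5, 2.0  # assumed first and second predetermined time periods

def gaze_selects(dwell_start_s: float, dwell_end_s: float) -> bool:
    """Selection registers when dwell lasts at least T1 but no more than T2."""
    dwell = dwell_end_s - dwell_start_s
    return T1_S <= dwell <= T2_S

print(gaze_selects(0.0, 0.8))  # True: within [T1, T2]
print(gaze_selects(0.0, 3.0))  # False: exceeded T2 (e.g., an idle stare)
```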
- the control input may be voice-activated.
- the control input may include at least one sensor configured to monitor hand movements and the commands or instructions are selectable based on the monitored hand movements.
- the control input may be a remote device.
- the remote device may be coupled to the projectile-firing device.
- the remote device may be configured to receive data via the network from at least one of the projectile-firing device and the battlefield object.
- the remote device may be configured to be affixed to the projectile-firing device and provide at least one of near-field communication, radio frequency, infrared, magnetometer, and global positioning satellite sensory data to the extended-reality gaming application.
- the real and virtual objects may be displayed in response to the projectile-firing device data.
- the real and virtual objects may be displayed in response to the battlefield object data.
- the gameplay configuration may include information corresponding to at least one of a game mode, a health meter, a shield meter, a projectile tally, a projectile repository tally, a rate of fire, a reload interval, a virtual power up, a success tally, a failure tally, a success to failure ratio, an alert, a battlefield role, a team affiliation, a battlefield objective, and a set of battlefield parameters.
- the type of the projectile-firing device may be selectable through the computer-aided interface.
- the commands and instructions may include one or more of an amount of projectiles available for firing, a predetermined velocity of the projectiles for firing, a predetermined rate of fire of the projectiles, and a predetermined reload interval of the amount of projectiles.
- the player may be selectable through the computer-aided interface and the extended- reality gaming application may facilitate communication with the player.
- the virtual objects may be sharable through the computer-aided interface with a player.
- the extended- reality device may be configured to display a prompt to provide at least one of the commands or instructions.
- the virtual objects may include a menu of selectable inputs corresponding to the commands or instructions.
- the menu of selectable inputs may be a drop-down menu.
- the virtual objects may depict an application configured to be executed on the extended- reality projectile-firing gaming system.
- the extended-reality gaming application may be further configured to update gaming metric data to indicate a successful impact of the fired projectile on the battlefield object and a successful hit by the projectile-firing device on the battlefield object when a time the impact occurred on the battlefield object and a time the projectile was fired occur within a predetermined time period, wherein the predetermined time period is selected to confirm that the projectile-firing device fired the fired projectile that was detected as impacting the battlefield object, and wherein the updated gaming metric data is displayed on the extended-reality device and configured to modify the real and virtual objects in real or near real-time.
- the extended-reality gaming application may be further configured to receive a location and orientation of the projectile-firing device, wherein the extended-reality gaming application records a successful impact of the fired projectile on the battlefield object and a successfully fired projectile by the projectile-firing device associated with the battlefield object when the location of the projectile-firing device is within a predetermined range from the battlefield object and the orientation of the projectile-firing device is aligned at a predetermined area associated with the battlefield object.
- the battlefield object may include a smart target comprising a target location, a piezoelectric sensor configured to detect an impact of an object on or within a predetermined distance from the target location and generate a piezoelectric sensor signal corresponding to the detected impact, an infrared sensor configured to detect infrared radiation at a predetermined wavelength and generate an infrared sensor signal, and control circuitry configured to confirm a successful projectile impact on the target location when the object is determined to be a projectile by comparing the piezoelectric sensor signal with a piezoelectric impact signal model trained to detect impact of the projectile and when an impact time of the object and a detection time of the infrared radiation each occurred within a predetermined period of time, generate target location state data indicating that the target location was successfully impacted by the projectile, and transmit the target location state data via the network in accordance with at least one protocol, wherein the extended-reality gaming application is configured to modify the real and virtual objects in real or near real-time in response to the target location state data.
- an extended- reality projectile-firing gaming system that includes a projectile-firing device comprising a projectile repository, a battlefield object, a network configured to provide communication and control connectivity in accordance with at least one protocol to the projectile-firing device and the battlefield object, an extended-reality gaming application configured to control the projectile-firing device and the battlefield object in accordance with a gaming session having a gameplay configuration for providing an extended-reality environment, receive projectile-firing device data from the projectile-firing device, the projectile-firing device data indicating that a projectile was fired, receive battlefield object data from the battlefield object, the battlefield object data indicating that an impact occurred on the battlefield object, and a computer-aided interface for interaction with gameplay control options associated with the gameplay configuration and continually updated in real or near real-time during the gaming session, the computer-aided interface configured to receive commands or instructions associated with the real and virtual objects and corresponding to the gameplay configuration.
- the commands or instructions may be received using voice.
- FIG. 1 is a diagram of an extended-reality projectile-firing gaming system in accordance with a preferred embodiment of the present invention.
- FIG. 2 is a diagram of a projectile-firing device in accordance with a preferred embodiment of the present invention.
- FIG. 3A is a perspective view of a standalone target in accordance with a preferred embodiment of the present invention.
- FIG. 3B is a perspective view of a monolithic target in accordance with a preferred embodiment of the present invention.
- FIG. 4 is a perspective view of a battlefield environment depicting exemplary components of the extended-reality projectile-firing system in accordance with a preferred embodiment of the present invention.
- FIG. 5 is a perspective view of a mixed-reality battlefield environment depicting exemplary components of the extended-reality projectile-firing system in accordance with a preferred embodiment of the present invention.
- FIG. 6 is a top-down view of a battlefield environment depicting exemplary components of the extended-reality projectile-firing system in accordance with a preferred embodiment of the present invention.
- FIG. 7 is a flow diagram of an extended-reality projectile-firing gaming method in accordance with a preferred embodiment of the present invention.
- FIG. 8 is a flow diagram of an extended-reality projectile-firing gaming method in accordance with a preferred embodiment of the present invention.
- FIG. 9 is a perspective view of one or more smart targets in accordance with a preferred embodiment of the present invention.
- FIG. 10 is a perspective view of a wearable device including smart targets in accordance with a preferred embodiment of the present invention.
- FIG. 11 is a view of a smart blaster in accordance with a preferred embodiment of the present invention.
- FIG. 12 is a schematic diagram of a circuit for a smart target co-witnessing hit attribution system in accordance with a preferred embodiment of the present invention.
- FIG. 13-1 depicts a first portion of a schematic diagram of a circuit for a smart target co-witnessing hit attribution system in accordance with a preferred embodiment of the present invention.
- FIG. 13-2 depicts a second portion of a schematic diagram of a circuit for a smart target co-witnessing hit attribution system in accordance with a preferred embodiment of the present invention.
- FIG. 13-3 depicts a third portion of a schematic diagram of a circuit for a smart target co-witnessing hit attribution system in accordance with a preferred embodiment of the present invention.
- FIG. 14A is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention.
- FIG. 14B is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention.
- FIG. 15A is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention.
- FIG. 15B is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention.
- FIG. 15C is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention.
- FIG. 16A is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention.
- FIG. 16B is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention.
- FIG. 16C is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention.
- FIG. 17A is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention.
- FIG. 17B is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention.
- FIG. 17C is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention.
- FIG. 18 is a table of a message protocol in accordance with a preferred embodiment of the present invention.
- FIG. 19 is a table of smart target states in accordance with a preferred embodiment of the present invention.
- FIG. 20 is a table of a message protocol in accordance with a preferred embodiment of the present invention.
- FIG. 21A is a table of smart target states in accordance with a preferred embodiment of the present invention.
- FIG. 21B is a table of smart target modes in accordance with a preferred embodiment of the present invention.
- FIG. 22A is a first portion of a flow diagram of a smart target co-witnessing hit attribution method in accordance with a preferred embodiment of the present invention.
- FIG. 22B is a second portion of a flow diagram of a smart target co-witnessing hit attribution method in accordance with a preferred embodiment of the present invention.
- FIG. 23 is a view of an extended-reality device in accordance with a preferred embodiment of the present invention.
- Like numerals refer to like parts throughout the several views of the drawings.
- references in this specification to “one embodiment,” “an embodiment,” “a preferred embodiment” or any other phrase mentioning the word “embodiment” mean that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosure, and also mean that any particular feature, structure, or characteristic described in connection with one embodiment can be included in any embodiment or can be omitted or excluded from any embodiment.
- the appearances of the phrase “in one embodiment” in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments.
- various features are described which may be exhibited by some embodiments and not by others and may be omitted from any embodiment.
- any particular feature, structure, or characteristic described herein may be optional.
- As shown in FIGS. 1-6, the extended-reality projectile-firing gaming system generally includes a system of wirelessly interconnected blasters, targets, wearables, and “battlefield” objects that allow players to engage in precisely tracked and managed projectile sports and games, resulting in an experience of “Video Games Outside”.
- As shown in FIGS. 7 and 8, extended-reality projectile-firing gaming methods further illustrate preferred methods for attributing scoring to particular players or users of the extended-reality projectile-firing gaming system.
- As shown in FIGS. 9-24, a smart target co-witnessing hit attribution system and method are illustrated, depicting particular embodiments of accurately determining “hits” and correctly attributing those hits to a particular projectile-firing device.
- FIG. 1 is a diagram of an extended-reality projectile-firing gaming system 100 in accordance with a preferred embodiment of the present invention.
- FIG. 1 depicts two users utilizing a network 102 in connection with the system 100.
- the system 100 includes a projectile-firing device 104, an extended-reality device 105, a wearable device 106, a standalone target 108, a drone 109, a cloud-based computing system 110, a remote device 112, and a fitness monitoring device 114.
- the network protocol may include Bluetooth, WiFi, radio frequency (RF), or other suitable network protocol.
- the network 102 is configured to utilize one or more network protocols to facilitate communication by and between the various components of the system 100.
- the network 102 may consist of any network type, including but not limited to a local area network (LAN), wide area network (WAN), and/or the internet.
- FIG. 2 is a diagram of a projectile-firing device 104 in accordance with a preferred embodiment of the present invention.
- the projectile-firing device 104, in a preferred embodiment, includes a housing 120, a handle 122, a trigger 124, a battery pack 126 (internal to the handle 122 in FIG. 2), a muzzle 128, and a hopper 130.
- the housing 120 preferably is configured to receive the handle 122, the trigger 124, the battery pack 126, the muzzle 128, and the hopper 130.
- the housing 120 is also preferably configured to embody electromechanical apparatus to receive commands and/or instructions via the network 102, feed projectiles from the hopper 130 to be fired by the projectile-firing device 104, and mechanically fire one or more projectiles from the muzzle 128.
- the electromechanical apparatus may include a gearbox configured to load and fire gellets, small biodegradable projectiles configured to be fired from the muzzle 128.
- the handle 122 is configured to receive the trigger 124 and is used by the user 116, 118, who depresses the trigger 124 to fire the projectiles from the muzzle 128.
- the device 104 may be configured to fire a single projectile, or multiple projectiles (e.g., automatic fire mode).
- the device 104 includes a switch 133 to change the fire mode from single fire to automatic fire modes.
- the device 104 includes a display 152 (shown in FIG. 4).
- the fire mode is configured to fire three projectiles upon the trigger being depressed once by the user 116, 118.
- the fire mode may also be controlled by the application via the network 102 to vary the fire mode depending upon the gaming configuration in accordance with a gaming session.
- the muzzle 128 is configured to permit the projectile to exit the projectile-firing device 104.
- the projectile-firing device, in a preferred embodiment, is configured to vary the velocity of a projectile from 90 feet-per-second (FPS) to 170 FPS. In other embodiments, the velocity is configured to reach 200 FPS or greater.
- the projectile-firing device receives commands and/or instructions from the application via the network 102 to electromechanically adjust the velocity at which a projectile is fired from the projectile-firing device 104. The velocity may be adjusted electromechanically through a variety of methodologies, all within the scope of the present invention.
- the electromechanical functionality of the projectile-firing device 104 includes a barrel 129 (shown internally with dashed lines) to be inserted into the muzzle 128 and/or muzzle area.
- the barrel 129 may be configured to increase or decrease a muzzle velocity of the projectile.
- a first barrel 129 is configured to achieve a muzzle velocity of 170 FPS, while another barrel 129 is configured to achieve a muzzle velocity of 90 FPS.
- a barrel tip 131 is preferably configured to be affixed to the muzzle 128 and/or muzzle area. The barrel tip 131 also may be configured to achieve muzzle velocities within the velocity ranges described herein.
- the barrel tip 131 may be configured to reduce the velocity from a predetermined "default" range; thus, for example, when the barrel tip 131 is not affixed to the muzzle 128, the velocity is 170 FPS, while when the barrel tip 131 is affixed to the muzzle 128, the velocity is 90 FPS.
- the barrel tip 131 may be configured to allow beginners or those desiring a less aggressive gameplay experience to lower the velocity. Still further, in any of the aforementioned embodiments, the mechanical components of the muzzle 128, the barrel 129, and the barrel tip 131 may be combined with electronically adjusted velocity increments so as to achieve FPS velocities greater or lower than the predetermined velocities of the mechanical components.
- the projectile-firing device 104 may include physically adjustable areas alongside the barrel 129 to open or close to vary the velocity mechanically. In an embodiment, the velocity is decreased when the physically adjustable areas are opened further, while the velocity is increased when the physically adjustable areas are closed further. The opening and closing of the physically adjustable areas may be controlled by the application via the network and/or internal electronic components of the projectile-firing device 104.
- blaster spring tension may also be adjusted, meaning that certain weapon types can be made to fire physically longer or shorter distances as determined by the system, allowing for “range” as an attribute; e.g., Sniper vs. Revolver (e.g., Table 1).
- the device 104 preferably includes electronic circuitry, or the like, to provide relatively accurate position and direction of the device 104. This data preferably is utilized to provide confirmation, in connection with the application, of a "hit" by a particular user 116, 118 on another user's 116, 118 wearable device 106 or a particular standalone target 108/monolithic target 148 or other target/battlefield object.
- the device 104 may also include an IR reader so that the system 100 in connection with the application can sense that a particular device 104 has reflected IR radiation on a wearable device 106 or target for further confirmation of a "hit.”
- the battery pack 126 is configured to provide power to device 104.
- the device 104 includes electromechanical apparatus such as internal electronic circuitry configured to be powered by the battery pack 126.
- the battery pack 126 is also preferably configured to operate mechanical firing apparatus of the device 104.
- the battery pack 126 preferably is a rechargeable battery pack 126 configured to receive a USB-C input to recharge the battery pack 126.
- the battery pack 126 may be recharged by a variety of recharge methods, all within the scope of the present invention, including a mini-USB input, or the like.
- the device 104 preferably includes a recharge port 127 configured to receive a recharge input.
- the hopper 130 preferably is configured to hold a "depot" of projectiles to be fed to the electromechanical apparatus within the device 104 for firing.
- the hopper is configured to hold 100, 200, 300, 400, 500, 600, 700, 800, 1000, or greater or fewer projectiles.
- the gaming configuration in accordance with a gaming session may limit the number of projectiles held by the hopper 130.
- the hopper 130 is configured to gravity feed projectiles into the device 104.
- the projectile-firing device 104 is configured to uniquely receive instructions and/or commands from the application via the network 102 to increase or decrease electromechanical functionality of the projectile-firing device 104 when receiving "damage,” receiving points or "health,” or during a variety of other scenarios, preferably in connection with a gaming configuration.
- the physical consequences of this capability impact the first user's 116 ability to inflict "damage" to the second user 118.
- the rate of fire may be increased or decreased, a reload interval may be increased or decreased, a velocity of the device 104 may be increased or decreased, and an amount of "damage" inflicted can be varied depending on the commands and/or instructions, including the type and configuration of the gaming profile (e.g., gaming configuration) that is configured to be set during a gaming session.
- a projectile-firing device 104 may be set in a configuration as a sniper rifle, a pistol, or a shotgun.
- Each of these configurations is preset with specified rates of fire, reload intervals, damage capabilities, range of fire, or the like.
- the projectile-firing device 104 set in sniper rifle mode would have a slower rate of fire, a more frequent reload interval, and a higher damage capability than the projectile-firing device set in pistol mode, for example.
- the projectile-firing device 104 is configured to be electromechanically controlled by the application via the network 102 such that any of the above configurable options may be varied in real- or near-real-time.
- a projectile dispenser may be based on predetermined parameters within the gameplay and may embody or be part of the hopper 130.
- a game may be set where the users 116, 118 (e.g., players) can only refill a certain number of times during the length of the game and after they have reached the limit the system prevents the player from reloading (e.g., locking the canister, stopping the blaster from working, etc.). The right to refill may be earned and the blaster may begin working again.
- a home base/capture point station 172 may be utilized that permits the users 116, 118 to refill projectiles literally (e.g., within the existing hopper 130), as well as virtually per the game mode (e.g., gaming configuration).
- the projectile-firing device 104 is configured to monitor and track the time when a projectile is fired from the device 104, and to provide this information to the application.
- the system 100 is configured to receive the time the projectile was fired and correlate that time with the time a wearable device 106 was impacted by a projectile to determine that the user 116, 118 associated with the wearable device was hit by the user 116, 118 associated with a particular projectile-firing device 104.
- This methodology achieves benefits above and beyond traditional projectile-firing game systems because it can monitor and track who is responsible for a hit, allowing for more interactive gameplay and immersion.
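- The timestamp correlation described above can be sketched in a few lines of code. The following is a minimal illustration only, not the claimed implementation; the names (ShotEvent, ImpactEvent, correlate_hit) and the 250-millisecond window are assumptions introduced here for illustration.

```python
# Minimal sketch of timestamp-based hit correlation. All names and the
# correlation window are illustrative assumptions, not the patent's design.
from dataclasses import dataclass

@dataclass
class ShotEvent:
    blaster_id: str   # projectile-firing device 104 that reported the shot
    fired_at: float   # time (seconds) the projectile left the muzzle

@dataclass
class ImpactEvent:
    target_id: str    # wearable device 106 or standalone target 108
    impacted_at: float

def correlate_hit(impact: ImpactEvent, shots: list[ShotEvent],
                  window_s: float = 0.25) -> ShotEvent | None:
    """Return the most recent shot fired within window_s before the impact."""
    candidates = [s for s in shots
                  if 0.0 <= impact.impacted_at - s.fired_at <= window_s]
    return max(candidates, key=lambda s: s.fired_at) if candidates else None
```

- In practice, the location and orientation data described herein would narrow the candidate set further when several blasters fire within the same window.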
- the device 104 preferably includes a display configured to provide useful game information, such as hits, health, ammo, blaster “type”, team alignment, remaining players, etc.
- the device 104 preferably includes electronic circuitry and a network interface to receive data from the network 102 in accordance with the application and the gaming configuration of that particular gaming session.
- two extended-reality devices 105 are depicted, separately worn by and associated with the users 116, 118.
- the extended-reality devices 105 are configured to communicate via the network 102 with the application.
- the extended-reality devices 105 may be augmented reality (AR), mixed-reality (MR), virtual reality (VR), or extended-reality (XR) devices configured to receive and display data via the network 102 to the users 116, 118.
- the extended-reality device 105 is configured to enhance game modes and game attributes by mixed and/or augmented reality.
- Any suitable head-mounted display, including those of Meta Quest, Nreal, or Apple Vision Pro, now and in the future, may be utilized to display relevant game information in heads-up-display (HUD) style, allowing the users 116, 118 to track gameplay configurations, statistics, etc., in a video-game-like manner.
- the extended-reality device 105 includes a HUD display 142 configured to display current health, ammunition (i.e., projectiles), weapon/blaster/projectile-firing device 104 type, shields, and the like.
- the HUD display 142 may also display "kills,” “deaths,” and a "kill/death” and/or success/failure ratio, including a ranking of users taking part in the gaming session.
- the remaining team members and opponents are displayed.
- alerts, game start/end, timers, and special game states are displayed.
- the accuracy of the users 116, 118 may also be displayed as a shots/hits ratio.
- the HUD display 142 may also include display of the current objectives, game modes, progress, capture/flag/base status/health in accordance with the gameplay configuration.
- the gameplay configuration may utilize battlefield roles (e.g., medic, engineer, tank, sniper, etc.). These roles may include specialized perks and restrictions that may be displayed to the users 116, 118 via the HUD display 142.
- Each of these displayable items may also utilize real-time or near-real-time virtual overlays on the battlefield environment displayed through the HUD display 142.
- health and shield bars, team affiliation, badges, status, role, username, etc. can appear over the users' 116, 118 heads.
- Battlefield objectives can be highlighted in the distance, and capture or base areas can be defined and shown virtually, object pickups can be described and highlighted, weapon skins can be virtualized and overlaid on the device 104, and team colors, player skins, and other aesthetics can be mapped to human users 116, 118 in real-time or near-real-time.
- the HUD display 142 may also display zones of special status, such as neutral or safe, which can be virtually displayed on the battlefield environment, and hits, shield depletion, firing, etc. can be displayed as real-time or near-real-time virtual animations.
- Battlefield objects, skins, the environment, etc. can be themed to match game mode, user preference, and/or brand tie-ins, such as the Marvel Universe, horror, or Sci-Fi themes.
- the application can restrict or unlock certain themes, nomenclatures, modes, etc., based upon player progress, preference, or parental controls. For example, realistic weapon terminology, violent descriptions such as “killed,” or “bloody” animation effects can be restricted to older players or turned off entirely.
- QR code/motion-capture-type patterns can be printed on blasters, targets, and battlefield objects to aid the extended-reality device 105 in overlaying virtualized animations/skins accurately utilizing the HUD display 142.
- the extended-reality projectile-firing gaming system 100 utilizes the extended-reality devices 105 to provide a selection interface configured to work within the system 100 ecosystem and control varied options within the game.
- any of the selectable options described herein may be implemented through a selection interface viewable and usable through the extended- reality devices 105.
- the real and/or virtual objects described herein are associated with commands and instructions that are configured to be selected by the users 116, 118 using a computer-aided interface.
- the commands and instructions preferably include choosing the type of projectile-firing device 104 (e.g., type of blaster 104, firing mode, etc.), choosing with whom to communicate during a gaming session, choosing map pullup and options, and the like.
- controlling the blaster 104 mode may include toggling or selecting a single/multi- shot mode.
- the electromechanical apparatus may be controlled via the commands and instructions to fire in accordance with type/mode of firing, to fire faster and/or harder, or the like.
- a single-fire revolver example may include a single-shot mode of firing, a relatively slower/lighter projectile speed, and a specified number of reloads (e.g., six).
- a multi-shot rifle may include an automatic/burst mode of firing, and may speed up the projectile speed with a higher number of reloads.
- These different blaster 104 types are configured to be selectable using the computer-aided interface in association with the extended-reality devices 105.
- the commands and instructions preferably include controlling rate of fire, such as 8 projectiles per second versus 12 projectiles per second.
- controllable options include other in-game implement/tool selection, ammunition selection, map review and expansion, equipping of different accessories, and communication.
- the controllable options may include a power-up menu selection, such as potions or other items found in the extended- reality environment world. For example, when the user’s 116 health is low, a virtual potion may be selected and used to increase the user’s hit points. Other examples are invincibility shields, increased speed, or the like.
- Map review and expansion preferably uses the location of other headsets (e.g., extended-reality devices 105), blasters 104, or other real objects for which location services are available (e.g., remote device 112) to locate other players within the map.
- the different accessories may include lights, tracer rounds, and night vision, among other like accessories.
- these accessories may be real or virtual items, or may simulate real-world items.
- Night vision goggles, for example, may be simulated by providing a green “glow” through the extended-reality device 105.
- a light, on the other hand, may be a real-world object that illuminates the gaming environment, or may be an effect that lightens the extended-reality environment through the device 105.
- the controllable options also include communication among teammates or other users 116, 118. For example, the user 116 sees a dropdown menu of teammates through the extended- reality device 105 and selects the one (or ones) that the user 116 would like to speak with.
- the computer-aided interface facilitates selection of communication options, including selection of a particular user with whom to communicate. For example, eye contact with a map icon depicting a user 116 for a long enough predetermined time period may be sufficient to open up communications with that user 116.
- the extended-reality devices 105 include the computer-aided interface, but in other embodiments, the interface is separate from the extended reality devices 105.
- the extended-reality devices 105 include one or more eye sensors configured to monitor eye contact of the users 116, 118.
- the eye sensors may be accompanied by an algorithm designed to monitor how long the users’ 116, 118 eyes are maintaining eye contact with a particular real or virtual object displayed by the extended-reality devices 105.
- the algorithm compares the time of eye contact with a predetermined time period (e.g., 1 or 2 seconds) to confirm a selection of the particular real or virtual object (e.g., a command or instruction associated with the real or virtual object).
- the algorithm may include a range such that eye contact greater than a first predetermined time period and less than a second predetermined time period confirms a selection. However, the algorithm may simply monitor eye contact without requiring the eye contact to be maintained within a certain time period. For example, if the user 116, 118 makes eye contact with a real or virtual object, the user 116, 118 may activate a command or instruction associated with the real or virtual object by using voice activation and/or selection by hand movement or remote device, or any combination thereof. For example, the user 116 makes eye contact with a “Revolver” blaster choice and says “Select” to choose the Revolver as the blaster 104 type.
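- The dwell-time comparison described above lends itself to a short sketch. This is a hypothetical illustration only; the class name, the 1.5-second threshold, and the update interface are assumptions, and the actual eye-sensor algorithm may differ.

```python
import time

class DwellSelector:
    """Confirms selection of a gazed-at object after a dwell threshold.

    Hypothetical sketch: the eye sensors of the extended-reality device 105
    would feed update() the identifier of the object currently under gaze.
    """
    def __init__(self, dwell_s: float = 1.5):
        self.dwell_s = dwell_s
        self._target = None   # object currently being gazed at
        self._since = 0.0     # when the gaze settled on that object

    def update(self, gazed_object, now=None):
        now = time.monotonic() if now is None else now
        if gazed_object != self._target:       # gaze moved: restart the clock
            self._target, self._since = gazed_object, now
            return None
        if gazed_object is not None and now - self._since >= self.dwell_s:
            self._since = now                  # prevent immediate re-selection
            return gazed_object                # selection confirmed
        return None
```

- The range variant described above (greater than a first period, less than a second) could be implemented by checking the dwell against both bounds before confirming.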
- the user 116 makes eye contact with a “Revolver” blaster choice and depresses a button on the blaster 104 to choose the Revolver as the blaster 104 type.
- the button on the blaster 104 may be selectable via the remote device 112 that is affixed to the blaster 104 or may be separately provided from the blaster 104.
- the computer-aided interface may consist of separately implemented eye sensors, hand movements, physical controllers, remote devices 112, or combinations thereof.
- a trackpad, ball mouse, or similar physical system may be utilized to control a menu within a game.
- the physical controller may consist merely of up/down/left/right buttons on the blaster itself, or may be a separate standalone controller.
- the hand movement may be recognizable by use of one or more sensors using any of the sensor types described herein, such as LIDAR, cameras, or the like.
- the blaster 104 control via the computer-aided interface may be integrated with hit detection/attribution methods as disclosed herein.
- the methods disclosed herein may output updated gaming metric data and/or target location state data.
- the real and virtual objects may be modified to reflect real or near real-time events.
- the modified real and virtual objects may then provide updated commands or instructions available to be selected by the users 116, 118.
- a user 116 may confirm a hit using the methods provided herein that, when updated via the updated gaming metric data and/or target location state data, triggers an ingame response, such as providing power-ups or additional in-game capabilities.
- the user 116 is then able to select the in-game virtual (or real) item as it appears or is prompted to be utilized according to the computer-aided interface of the extended-reality device 105.
- the extended-reality device 105 utilizes the computer-aided interface to prompt the users 116, 118 to select or input a command or instruction.
- the computer-aided interface of the extended-reality device 105 is designed to minimize or eliminate negative impacts on gameplay or the users’ 116, 118 experience and provide an immersive gameplay environment.
- the extended-reality gaming system 100 utilizes audio to provide the computer-aided interface control options as described herein.
- the system 100 does not rely on display of the real and virtual objects through the extended-reality device 105 to provide selectable options, and such display may not be needed at all.
- the computer-aided interface utilizes verbal commands.
- the system 100 includes microphones and earbuds or headphones (or relies upon external speakers and microphones affixed to the blaster 104, wearable vest 106, and/or remote device 112) to provide the computer-aided interface for commands and instructions.
- the user 116 states “Computer: identify blaster inventory” into a microphone, and the extended-reality gaming application responds through the speaker “Revolver, sniper rifle.” The user 116 may then state into the microphone “Computer: Select revolver,” and the blaster 104 becomes the revolver in that gameplay session.
- the various audio options may be integrated with the extended-reality device 105 such that all or portions of the audio control options may be utilized alongside or in connection with the display of the real and virtual object and associated commands and instructions.
- Referring to FIG. 23, there is depicted a blaster 104 as displayed in an extended-reality device 105.
- the device 105 is a pair of MR goggles.
- the MR goggles display both real and virtual objects.
- the floor tiles 580, the projectile 164, and the barrier 178 are all real objects.
- the ceiling tiles 582 are virtual objects.
- a blaster choice dropdown menu is depicted in FIG. 23.
- the computer-aided interface facilitates interaction with any of the real or virtual objects, including those depicted in FIG. 23.
- the real or virtual objects depicted in FIG. 23 are nonlimiting.
- the user 116 makes eye contact with “Blaster: Revolver,” which drops down three options: Revolver, Shot Gun, and Sniper Rifle. The user 116 selects one by making eye contact with an option, Revolver for example, and using a selection method, as disclosed herein, to select Revolver.
- the computer-aided interface receives the command/instruction to select Revolver and provides it to the application, which is then used to control the gameplay characteristics.
- the selection of Revolver preferably selects limitations and requirements that are communicated to the blaster 104 or the application to appropriately limit the gameplay in connection with the selection criteria.
- One of ordinary skill in the art would understand that a variety of methodologies of selecting, processing, and implementing the selections are within the scope of the present invention.
- wearable devices 106 are depicted.
- the wearable devices 106 can have lighting profiles, sounds, and other controlled actions to integrate with gameplay.
- the first wearable device 106 is associated with the first user 116 and the second wearable device is associated with the second user 118.
- the wearable device 106 includes a torso portion 132 and a shoulder portion 134. As shown in FIG. 1, the torso portion 132 and the shoulder portions 134 are attached via straps 136.
- the torso portion 132 and the shoulder portions 134 of the wearable device 106 include wearable device targets 137 configured to detect impacts of projectiles on the wearable device 106.
- the wearable device targets 137 are in communication with one or more of the device 104, the standalone target 108, the drone 109, the cloud-based computing system 110, the remote device 112, and the fitness monitoring device 114. Each of these components is configured to communicate with one or more of the components of the system 100 via the network 102.
- Upon detection of a "hit," the wearable device 106 is configured to send a message via the network 102 to the application indicating that the user 116, 118 associated with the wearable device 106 has been hit by a projectile. As mentioned herein, when the application receives the message that the wearable device 106 has been hit, the application correlates the time the projectile impacted the wearable device 106 with the time a projectile was fired by a projectile-firing device 104. The user 116, 118 associated with the projectile-firing device 104 that was correlated with the impact on the wearable device 106 is then credited with a hit against the user 116, 118 associated with the wearable device 106.
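- The "hit" message itself might look like the following sketch. The field names here are illustrative assumptions; the actual message protocol is described elsewhere in the specification (see FIG. 20).

```python
# Illustrative hit-report payload from a wearable device 106; the field
# names are assumptions, not the message protocol of FIG. 20.
hit_report = {
    "msg_type": "IMPACT",
    "device_id": "wearable-106-A",   # which target detected the impact
    "zone": "torso",                 # e.g., torso portion 132 vs. shoulder portion 134
    "impacted_at": 1699999999.123,   # time stamp used for correlation
}
```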
- the standalone target 108 is depicted as a "flower" arrangement in a preferred embodiment.
- the standalone target 108 includes a central portion 138 (e.g., "pistil” of flower) and a plurality of outer portions 140 (e.g., "petals” of flower).
- the standalone target 108 includes six outer portions 140, but in other embodiments, the number of outer portions 140 may be greater or fewer than six.
- Each of the central portion 138 and the outer portions 140 are configured to separately determine whether a projectile has impacted that portion of the standalone target 108.
- the same target impact methodology is utilized for the wearable device targets 137 and the central and outer portions 138, 140 of the standalone target 108.
- "gamifying" the standalone target 108 may include a requirement to separately “hit” each of the portions 138, 140 in order to credit a gameplay objective. In other embodiments, one or more of the central portion 138 and outer portions 140 are required to be hit in accordance with the gameplay configuration.
- the standalone target 108 permits a variety of gameplay options and/or objectives, all within the scope of the present invention.
- the standalone target 108 is preferably in communication with the application via the network 102 so that real-time or near-real-time data may be provided to the application, and vice versa. For example, upon a hit of the central portion 138 of the standalone target 108, the application may credit a "hit" using similar methodology as that used to credit a "hit" on the wearable device 106.
- the standalone target 108 may also be configured to include a physical indication (e.g., a red lighted portion) to indicate a hit upon a certain portion 138, 140 of the standalone target 108.
- the application, upon receiving data from the network 102, may be configured to provide a virtual overlay on the target displayed on the display of the extended-reality device 105.
- the standalone target 108 in FIG. 3A is shown in the same configuration as that shown in FIG. 1.
- the standalone target 108 includes support members 144 to support the central portion 138 and the outer portions 140.
- the standalone target 108 also includes a stand 146 configured to support the standalone target 108 in an upright position.
- a monolithic target 148 is shown, which is an alternative configuration of the standalone target 108.
- the monolithic target 148 includes a similar arrangement including a central portion 138 and outer portions 140, each configured to determine whether a projectile has impacted that particular portion of the monolithic target 148.
- the monolithic target 148 is constructed so that the central portion 138 and the outer portions 140 are embedded in the same structure (i.e., monolithic construction) or mold 150.
- the monolithic target 148 may be affixed to a structure such as a barrier or battlefield object, or may be positioned in an upright position using a stand, similar to that shown in FIG. 3A.
- the monolithic target 148 is preferably configured to communicate with the application via the network 102, and includes the same "hit" confirmation methodology as that described herein in connection with the standalone target 108 and the wearable device 106.
- the system 100 as shown in FIG. 1 includes a drone 109.
- the drone 109 is configured to communicate via the network 102 with the application.
- the drone 109 may be preprogrammed to provide a "passive" target similar to the standalone target 108 or the monolithic target 148, and/or may be an "active" target or participant preprogrammed to be controlled remotely.
- the drone 109 includes a projectile-firing device 104 affixed thereto.
- the particular projectile-firing device 104 depicted in FIG. 1 would be modified to be affixed to the drone 109.
- the cloud-based computing system 110 is configured to provide on-demand availability of computer system resources.
- the cloud-based computing system 110 preferably includes data storage and computing processor resources to offload storage and processing into a cloud environment.
- the cloud-based computing system 110 is preferably configured to store and run the application.
- the remote device 112 as depicted in FIG. 1 is preferably a smartphone and includes direction-sensing and location-sensing capabilities.
- the remote device 112 includes GPS and a magnetometer to determine position and direction of the device 112.
- the remote device 112 may be affixed to the projectile-firing device 104 to provide relatively accurate position and direction of the device 104.
- This data preferably is utilized to provide confirmation, in connection with the application, of a "hit" by a particular user 116, 118 on another user's 116, 118 wearable device 106 or a particular standalone target 108/monolithic target 148 or other target/battlefield object.
- the remote device 112 may also include an infrared (IR) reader so that the system 100 in connection with the application can sense that a particular device 104 has reflected IR radiation on a wearable device 106 or target (e.g., the targets 106, 148, the drone 109) for further confirmation of a "hit.”
- the remote device 112 is preferably connected to the network 102 and configured to communicate with the application.
- the remote device 112 in a preferred embodiment is configured to receive real- or near-real-time data from the network 102 to provide commands and/or instructions to the user 116, 118, and to display the commands and/or instructions thereon in accordance with the gaming configuration for that particular gaming session.
- the remote device 112 may display useful game information, such as hits, health, ammo, blaster “type”, team alignment, remaining players, etc.
- the fitness monitoring device 114 as depicted in FIG. 1 is configured to communicate with the application via the network 102.
- the device 114 may be a smart watch such as Apple Watch and Samsung Gear for integration with additional mobile applications.
- the device 114 may be configured to integrate with health apps and features via an API.
- the users 116, 118 may gain XP or other in-game features through increased physical activity and exertion. Users 116, 118 that exhibit more active gameplay, monitored through the fitness monitoring device 114, may earn increased XP.
- the users 116, 118 may use the XP to unlock special items, exclusive products, free items, or the like.
- the application is configured to be stored and run on the cloud-based computing system 110.
- the application is configured to receive, via the network 102, data from all physical components of the system 100, which data is stored in a database associated with the cloud-based computing system 110.
- the application may be configured to receive data via one or more wireless protocols, such as Bluetooth (e.g., Bluetooth Low Energy (BLE)), WiFi, RF, etc.
- the application preferably is configured to allow the users 116, 118 to download new game modes, receive commands and/or instructions, display score and skill-improvement tracking, and the like.
- the application may be configured to communicate with the remote device 112 to receive and display this information to the users 116, 118.
- the application is also configured to provide historical, up-to-date information concerning global and/or local scoreboards, such as those provided in video gaming environments.
- the application is also configured, in an embodiment, to provide control over light and/or sound preferences, customized gaming mode/gaming configuration creation or sharing, and the like.
- the application in a preferred embodiment, is also configured to control and communicate with the various targets and physical objects in the system 100 via the network 102.
- the application is configured to control the start/stop of a gaming session, provide live, real- or near-real-time scoring updates, and visual/audible/haptic user feedback in real- or near-real-time.
- the application configures the various components of the system 100 initially in a gaming session setup. Each of the "nodes" within the system 100 is configured to communicate with nearby nodes such that the application need only be connected via the network 102 to one other component within the system 100.
- sharing and communication of data, information, commands and/or instructions, and the like may be shared between devices, and not necessarily through a "central" hub or routing station.
- Referring to FIG. 4, a battlefield environment depicting exemplary components of the extended-reality projectile-firing system 100 is shown.
- the battlefield environment depicted in FIG. 4 is an outside arena setting showing multiple users 116, 118 each with a projectile-firing device 152.
- the device 152 is the same or substantially the same as that depicted in FIGS. 1 and 2, but includes a display 154 on the housing 120 facing the user 116, 118.
- the display 154 may be placed on other areas of the housing 120 without departing from the scope of the present invention.
- the display 154 is configured to provide useful game information, such as hits, health, ammo, blaster “type” (e.g., projectile-firing device 104, 152), team alignment, remaining players, etc.
- a health/shield meter 156 depicted in FIG. 4 shows that the user 116 has 100% of the available health points remaining, and a fully available shield.
- the inner ring depicted on the health/shield meter 156 may signify the remaining health
- the outer ring depicted on the health/shield meter 156 may signify the remaining shield.
- the upper left corner of the display 154 shows team information 158 including a team affiliation, players on the team, and opponent players.
- the team information 158 may also indicate that one or more players have been eliminated from game play.
- the display 154 also includes, on the lower right corner, device information 160, which includes the current weapon type (e.g., SURGE), the firing mode (e.g., semiautomatic, full automatic), the range (e.g., RNG: 4), the damage capacity per round (e.g., DMG: 5), and the number of rounds remaining (e.g., nine rounds).
- the range, for example, may be determined by a level system wherein a "default" blaster may be given a Level 1 rating.
- Level 1 specifications for the blaster may be represented by a single star next to the type of blaster, and include RNG: 4 (out of 10), DMG: 5 (out of 20), set to semi-automatic only, and 9 rounds available per reload.
- the range may be a limiting range, such that once a blaster fires at a target or user beyond that range, the "hit" will not be credited toward the blaster's user.
- the range may be a dampening factor such that beyond the optimum range the blaster's damage capability is depleted significantly.
- the application can determine the range at which a device 152 was fired at a target 108, 148 or a user 116, 118, and attribute damage to the user associated with that particular device 152 based on the distance between the device 152 and the target 108, 148 or the user 116, 118.
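- One possible dampening rule consistent with this description is a linear falloff beyond the optimum range, sketched below. The function name and the linear curve are assumptions; the actual attenuation is a design choice of the gameplay configuration.

```python
def attenuated_damage(base_dmg: float, distance: float,
                      optimum: float, max_range: float) -> float:
    """Full damage within optimum range, linearly dampened beyond it,
    and no credited damage past the limiting range (illustrative only)."""
    if distance <= optimum:
        return base_dmg
    if distance >= max_range:
        return 0.0                      # beyond limiting range: no hit credited
    return base_dmg * (max_range - distance) / (max_range - optimum)
```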
- upgrades to the blaster level can be acquired in addition to other blaster types. Upgrades may increase the range, damage, and rounds available of a given blaster type. Other blaster types such as “shotguns” could be picked up, and would have specifications balanced differently, such as very high damage ceilings, reduced range ceilings, and varying round capacities.
- the display further includes, on the upper right corner, power/connectivity information 162, which includes icons representing, for example, WiFi and Bluetooth signals, remaining device power, etc.
- WiFi and Bluetooth protocols are utilized to provide the network 102.
- the device 152 depicted in FIG. 4 also shows a hopper 130 with projectiles 164 therein.
- the projectiles 164 may be gellets, paintballs, or any other type of projectile suitable for gameplay.
- the users 116, 118 depicted in FIG. 4 are shown wearing glasses 166.
- the glasses 166 may be simple safety glasses, or in an embodiment, may be extended-reality devices 105.
- FIG. 5 depicts a first-person view looking through an extended-reality device 105.
- the extended-reality device 105 includes the HUD display 142 through which both real-world physical objects such as the users 116, 118, trees, and ground are viewable, as well as virtual objects such as the health/shield meter 156, the team information 158, the device information 160, and the power/connectivity information 162.
- FIG. 5 also includes player information 168 displayed in a pop-up style banner above the user 116, 118. For example, as shown in FIG. 5, the player information 168 identifies the name of the player (e.g., JESSTHETIGER), the player's affiliation (e.g., shown by the icon to the left of the player's name), and the player's health/shield meter 156 within the pop-up banner.
- FIG. 5 further includes an in-game message 170 configured to provide the user 116, 118 associated with the extended-reality device 105 information about the gaming session. For example, as shown in FIG. 5, the user 116 identified as "CASEYTHELION" is shown to be "KNOCKED OUT!", indicating that the user 116 has been eliminated from gameplay.
- Other information and/or data, as described herein, may be displayed to the users 116, 118 through the HUD display 142 without departing from the scope of the present invention.
- Referring to FIG. 6, a top-down view of a battlefield environment in accordance with an exemplary extended-reality projectile-firing system 100 is shown.
- the various components depicted in the battlefield environment of FIG. 6 include projectile-firing devices 104, 152 (e.g., blasters), standalone targets 108 (or monolithic targets 148), a drone 109, home base/capture point stations 172, bunkers 174, turrets 176, and barriers 178.
- the projectile-firing devices 104, 152 are each associated with a user 116, 118 (not shown in FIG. 6).
- the user 116, 118 preferably is associated with an extended-reality device 105, a wearable device 106 (e.g., wearable vest), a remote device 112, and the like.
- a user 116, 118 may utilize physical player-borne shields, designed to shield or deflect projectiles 164. These could be provided to certain “roles” (e.g., Tank, Heavy), or picked up on the battlefield. An iteration of this device could have a collapse/expand function, so that it only activates under certain game conditions, or for certain time intervals with a recharge period, but can be equipped during the whole game.
- Each of these components are capable of being in communication with the application via the network 102, as described more fully herein. Not all components associated with the user 116, 118 must be operatively connected to a "central" routing station, but may communicate data with nearby elements, associated with the user 116, 118, another user 116, 118, or any of the networked components described herein.
- the standalone targets 108, the drone 109, the home base/capture point stations 172, the bunkers 174, and the turrets 176 are all configured to be in communication with the network 102.
- the battlefield environment of FIG. 6 depicts two teams or "sides", each comprising three users 116, 118 (e.g., players).
- the first team comprises three players on the eastern (right) side of the battlefield, while the second team comprises three players on the western (left) side of the battlefield.
- a standalone target 108 is associated with the first team.
- one objective of the second team is to eliminate the standalone target 108 on the first team's side.
- the target 108 may require a team to inflict damage or cause a certain number of hits to win, to unlock bonuses, to "debuff" the enemy team, or as a target training game mode.
- Debuffing an opposing team may include reducing the team's rate of fire, reload times, number of health points, or shields, or a variety of other reduction methodologies.
- the debuff may be configured to be on a timer, such that when the timer expires, the debuff is removed.
- the first team's side includes a drone 109 (which may or may not be associated, or on the same team, as the first team; i.e., neutral, or otherwise), a home base/capture point station 172, two bunkers 174, a turret 176, and two barriers 178.
- the second team's side includes a standalone target 108, a home base/capture point station 172, a bunker 174, two turrets 176, and two barriers 178. All of these battlefield components may be networked via the network 102 and configured and controlled by the application.
- a battlefield objective may include "capturing" the drone 109 to bring it onto the first team's side or the second team's side.
- the drone 109 may include a projectile-firing device 104 mounted and independently remotely controlled by an operator or user 116, 118.
- the drone 109 may include a kit enabling attachment of a projectile-firing device 104, a standalone target 108, or the like, to an existing drone platform.
- the drone 109 may be integrated with these components and provided as an integrated unit.
- the drones 109 may be configured to be “unlocked” as a perk for accomplishing certain objectives.
- the drones 109 may have certain automated features, allowing them to fly patterns remotely, such as hovering over the home base/capture point station 172 acting as defense, or following a given player, such as in certain video games.
- the drone 109 may be configured to execute "air strikes,” to be unlocked and called in, where the drone 109 takes off automatically and goes to a point or target, takes an action such as blasting it, and then flies back to a preset location.
- the motorized objects may implement navigational controls, live streaming video, control of projectile-firing devices 104 installed onto the motorized objects, and hit detection and correlation in connection with standalone targets affixed to the motorized objects.
- the motorized objects are ground-based remote controlled vehicles with mounted blasters (e.g., projectile-firing devices 104) or “explosives” (e.g., mines, grenades, etc.), which can be activated by the user 116, 118 or a remote operator.
- Both rovers and drones may be used for intelligence gathering with or without mounted blasters, where a camera feed is shown on the display 154 of a device 152, the HUD display 142, the remote device 112, or other viewable display. Control over these units could be contested as part of gameplay, where there is only one available rover (or drone 109), and certain successful actions give one player or team access to it for a period of time.
- the home base/ capture point stations 172 are configured to communicate with the application via the network 102.
- the stations 172 allow players to refill projectiles (e.g., ammunition or rounds) physically, as well as digitally, per the gaming mode or configuration. Proximity to the stations 172 can offer "healing," restoring lost health points while in-game.
- the stations 172 may also serve as a respawn point; for example, if a player is eliminated during gameplay, the player must physically return to the player's team station 172 in order to return to the game.
- the game mode or configuration may determine the respawn timer (i.e., how long a player must wait to respawn), as well as the number of respawns possible per player.
- stations 172 can serve as extensible capture points in appropriate game modes, where one team must hold multiple points simultaneously by positioning their players near it and keeping the opposing players away. Proximity can be determined by RF, NFC, GPS, blasting the object within a repeating time window, or a combination of these and/or other methods.
- the bunkers 174 in an embodiment are rapidly inflating and self-storing inflatable fold-out panels composed of rubber sheets, bistable spring sheets, and inflatable structural beams.
- the bunkers 174 include an attached pump or compressed air storage tank whereby the inflatable structural beams are inflated and the air pressure folds out the panels. Fixtures at the base of the unit can assist in maintaining the panel firmly in place. Inflation and deflation of the bunkers 174 may be controlled by the application and/or the users 116, 118 utilizing any of the methods described herein to activate or deactivate battlefield objects.
- the users 116, 118 may deploy the bunkers 174 by activating the bunkers 174 via the remote device 112, the display 154, or through an interface available through the extended-reality device 105 and/or the HUD display 142.
- the bunkers 174 may also be manually inflated/deflated by the users 116, 118.
- the application may also keep track of which bunkers 174 are deployed/inflated, and their location or proximity to other battlefield objects in accordance with several embodiments disclosed herein.
- the turrets 176 are configured to be in communication with the network 102 and controllable through a variety of methods, as disclosed herein. For example, global, connected true “video games” can be augmented into the application to allow for control of a turret 176 via the internet from users worldwide.
- the turret 176 may be controllable via the application by a team's user 116. Controlling the turret 176 may include placing the turret 176 in a "sentry" mode, where if an opposing player is within a range of the turret 176, the turret 176 will fire projectiles 164 at the opposing player.
- the turret 176 may include motion sensors or body-detecting cameras to allow for "sentry” functionality.
- the turrets 176 may be configured to be semi-autonomous (e.g., partially player controlled) or fully autonomous (e.g., player puts the turret 176 into a "sentry" mode).
- the turrets 176 preferably include a standalone target 108 so that hit detection and correlation may occur. This functionality would allow a team to "take out" or deactivate a turret 176 during gameplay, or capture the turret 176 for the opposing team's advantage.
- in an adversarial mode, a device 104 mounted on or near the target is aimed at the user 116. If the user 116 does not hit the target quickly enough, the turret 176 fires back at the player and the user 116 loses the round (e.g., a "wild west quick draw" mode).
- This same mode could also be configured to operate with a standalone target 108.
- the adversarial mode can also be configured to enable a human player to remotely control the adversarial blaster via internet through the network 102.
- the turrets 176 may be player-deployable or “permanent” (e.g., at a venue) turrets 176 that can be controlled directly or remotely, and/or be automated via motion detection, computer vision, or at random.
- the turrets 176 preferably would contain the same hit detection systems as the wearable devices 106, the standalone targets 108, and the monolithic targets 148, allowing players to “destroy” them by hitting them a number of times, and allowing attribution of the turrets’ 176 hits to the appropriate player or team.
- These turrets 176 could allow disabled or remote players to engage with their friends on the battlefield, either via the internet, nearby RF, or hands-on controls.
- the barriers 178 preferably are deployable static barriers, similar to the bunkers 174, but "permanent" for that particular gaming session. For example, teams could be given a number of player-portable barriers to set up before the match, enabling a more interesting playspace on an otherwise flat arena such as a sports field.
- the barriers 178 can be constructed of lightweight but durable fabric on a collapsible frame similar to a folding chair or umbrella. They can also be printed with branded or themed livery.
- the capturable objects preferably can detect their proximity to a team’s home base/capture point station 172 through RF or NFC, or the like, and know when they are in motion via accelerometer and/or GPS, allowing the network to alert players when a “flag” is being captured.
- the battlefield objects illustrated in FIG. 6 may include battlefield "ordnance" and accessories.
- the battlefield objects may include mines that are deployable, proximity-detonated battlefield objects that blast projectiles outward.
- in another embodiment, the mines do not blast projectiles outward, but are controlled by the application to credit a "hit" from the mine in connection with a particular user 116 within range of the mine proximity.
- a player coming too close to a mine is notified via audio/video/haptic feedback that they have been hit, and the hit credited against the player.
- This can also work without projectiles, where a player coming too close is simply notified via A/V/H feedback that they have been hit.
- the projectiles can be ejected outward but not be required to impact the player-worn target in order to count, thus simply providing tactile feedback for the hit.
- the battlefield objects may also include thrown objects by the player. Similar to the mines described above, "grenades” may be utilized and would simulate an explosion.
- the type of weapon or projectile-firing device 104 may be virtual, utilizing the network 102 and NFC or other electronic means to allow for battlefield pickup of different weapon types or accessories.
- a small object preferably can be picked up and placed near or attached to the Smart Blaster (or via interaction with a static battlefield object, e.g., shooting a target), telling it to change configuration and registering it in the system as a new weapon type.
- Table 1 below illustrates potential examples of various weapon types within the scope of the present invention in accordance with a gaming mode or configuration.
- the "Revolver" of Table 1 includes a six-round capacity, requiring a player to reload the Revolver after firing six shots. To reload another six rounds, the Revolver is controlled to wait 3 seconds before being able to fire another round from the next six rounds. Each round inflicts 1 DMG per successful hit. The rate of fire is limited to 1 shot per second.
- the "Automatic" blaster of Table 1 includes a 30-round capacity with a reload time of 5 seconds. The DMG per hit for the Automatic is higher than the Revolver, at 2 DMG per hit. The rate of fire is not limited and may utilize the physical rate of fire available to the projectile-firing device 104. Alternatively, the rate of fire may be limited.
- the Bazooka of Table 1 is a "splash-damage" capable option. Since the Bazooka is a slower-firing weapon type, and inflicts greater damage per hit, its reload time is slower than other types of weapons and does not have more than one round in its capacity.
- the Sniper of Table 1 includes a 1 round capacity with a 4-second reload time, which limits the rate of fire by a 4-second interval. The Sniper has a greater damage capacity of 7 damage than the Revolver or the Automatic.
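- The weapon-type parameters described above can be collected into a small lookup structure, as sketched below. The values follow the Revolver, Automatic, Bazooka, and Sniper descriptions here; the dictionary format itself is an illustrative assumption (Table 1's exact layout is not reproduced), and values the text leaves unspecified are marked None.

```python
# Weapon-type parameters per the descriptions above. The structure is an
# illustrative assumption; None marks values the text leaves unspecified.
WEAPON_TYPES = {
    "Revolver":  {"capacity": 6,  "reload_s": 3,    "dmg": 1,    "max_rps": 1},
    "Automatic": {"capacity": 30, "reload_s": 5,    "dmg": 2,    "max_rps": None},  # unlimited
    "Bazooka":   {"capacity": 1,  "reload_s": None, "dmg": None, "max_rps": None, "splash": True},
    "Sniper":    {"capacity": 1,  "reload_s": 4,    "dmg": 7,    "max_rps": 0.25},  # 1 shot per 4 s
}
```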
- One of ordinary skill in the art would understand that the various weapon types and parameters are fluid and may change depending on the gameplay configuration. For example, a range or "overheat" timer could be specified by weapon type.
- the trigger 124 of the device 104 is electronic; thus, the “reload” and related reload time can be enforced by disabling the trigger until the player hits a reload button, after which a reload timer is activated. Audible, visual, and haptic feedback can be provided for all of these actions and states (e.g., out of "digital" rounds/ammunition).
- the rate of fire can be controlled, only registering a trigger 124 pull or firing a projectile 164 at the appropriate time interval.
- Semi and fully automatic modes may be manually or automatically controlled, as described herein.
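- The trigger gating described above can be sketched as a small state machine. This is a hypothetical illustration of the behavior; the actual firmware is described only functionally in this specification.

```python
class TriggerGate:
    """Sketch of electronic trigger gating for reload time and rate of fire."""
    def __init__(self, capacity: int, reload_s: float, max_rps: float | None):
        self.capacity, self.reload_s, self.max_rps = capacity, reload_s, max_rps
        self.rounds = capacity
        self.ready_at = 0.0            # earliest time the next shot is allowed

    def try_fire(self, now: float) -> bool:
        if self.rounds == 0 or now < self.ready_at:
            return False               # out of digital rounds, reloading, or too soon
        self.rounds -= 1
        if self.max_rps:
            self.ready_at = now + 1.0 / self.max_rps   # enforce rate of fire
        return True

    def reload(self, now: float):
        self.rounds = self.capacity
        self.ready_at = now + self.reload_s   # trigger stays disabled during reload
```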
- the system 100 calculates damage digitally through data received from the network 102.
- the device 104 provides a signal indicating its weapon type; if the target 108, 148 or wearable device 106 is "hit," the target 108, 148 or wearable device 106 will know the weapon type and use the weapon characteristics (e.g., as disclosed in Table 1) to calculate the damage.
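- On the target side, the signaled weapon type could then be applied as in the following sketch, which reuses the hypothetical WEAPON_TYPES structure from the earlier example.

```python
def apply_hit(health: int, weapon_type: str) -> int:
    """Subtract the signaled weapon type's damage from the target's health
    (uses the illustrative WEAPON_TYPES sketch above; names are assumptions)."""
    dmg = WEAPON_TYPES[weapon_type].get("dmg") or 0   # treat unspecified damage as 0
    return max(0, health - dmg)
```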
- the application calculates damage inflicted on a battlefield object by a particular weapon type.
- the application hosted in the cloud-computing system 110, may store a lookup table or other suitable tracking method within a database, or the like, to track and credit hits by/to certain battlefield objects, as more fully described herein.
- the damage inflicted by a particular weapon type may be degraded or decreased at outer limits of the weapon type's range.
- Splash damage, such as from a Bazooka, mine, or grenade, may also be utilized. The splash damage may be greater the closer the battlefield object/target 108, 148 or wearable device 106 is to the epicenter of the "explosion."
- An IR device associated with the projectile-firing device 104 may be utilized to track splash damage. For example, a number of targets may be “pre-painted” with IR, which will hold them as painted for a fixed period of time, and if one is hit with a projectile within that window, they each receive damage.
- Various means of swapping weapon types are within the scope of the present invention.
- a variety of “skins” could be implemented, such as shells that attach or slide over the projectile-firing device 104. These skins could have the necessary electronics to tell the blaster what weapon type to become. The skins could combine both physical and virtual objects in accordance with a particular weapon type.
- stickers containing passive NFCs can be provided as packs, and players can affix them to objects of their choice to place around the battlefield, allowing loot drops including weapon types and upgrades; these objects could be misplaced without much loss, as new stickers can be easily replaced.
- scopes or silencers may be provided as accessories to the device 104 to provide relative effects on gameplay. For example, the scope could increase the range at which a particular weapon type is effective.
- FIG. 7 is a flow diagram of an extended-reality projectile-firing gaming method 200.
- At Step 202, a gaming session in accordance with a predetermined gameplay configuration is initiated.
- the extended-reality gaming application is configured, in a preferred embodiment, to control and communicate instructions and/or messages to initiate the gaming session.
- the remote device 112 is configured to initiate the gaming session.
- At Step 204, projectile-firing device data is received from one or more blasters (e.g., projectile-firing devices 104, 152).
- the projectile-firing device data preferably includes data configured to update gaming metric data in accordance with a gameplay configuration.
- the gameplay configuration may specify requirements for the gaming session, including available health points, shields, blaster types, available rounds, reload interval times, rounds per magazine, and battlefield objectives, among other things.
- the gaming metric data in a preferred embodiment tracks player scores, updating the gaming metric data to confirm successful impacts (e.g., projectile hits) from particular blasters 104 against particular wearable 106 and non-wearable battlefield objects 108, 148, 172, 176, etc.
- the projectile-firing device data includes data indicating that a projectile was fired, the time the projectile was fired, and location and orientation information.
- the projectile-firing device data may also include information obtainable from one or more sensors, such as near-field communication, radio frequency, infrared, magnetometer, and global positioning satellite sensory data.
- battlefield object data is received from one or more battlefield objects such as wearable or non-wearable devices.
- the battlefield object data preferably includes data configured to update gaming metric data in accordance with a gameplay configuration.
- the battlefield object data preferably includes information that the battlefield object 106, 108, 148, 172, 176, etc., was impacted with a fired projectile, the time of impact, and the location and orientation of the battlefield object.
- the battlefield object data may also include information obtainable from one or more sensors, such as near-field communication, radio frequency, infrared, magnetometer, and global positioning satellite sensory data.
- the battlefield object data may wholly, or partly, be provided via an affixed remote device 112 configured to provide the sensory data.
- the projectile-firing device data and the battlefield object data are received.
- these data are received by the cloud-based computing system 110 via the network 102 and usable by the extended-reality gaming application.
- these data are received by a standalone application at the physical location of the gaming session, or may be a combination of cloud-based and physical location-based computing and processing.
- One or more remote computing centers or systems may be utilized to receive and process the data.
- the firing of a projectile is confirmed, and a time stamp is associated with the firing of the projectile.
- the data tracks the projectiles as they are fired by the projectile-firing device 104, and provides time stamps when each projectile was fired in real- or near-real-time.
- the location and orientation of the blaster that fired the projectile are determined.
- the sensory data from the projectile-firing device data is preferably utilized to determine the location and orientation of the blaster.
- Step 212 an impact on a battlefield object is confirmed, and a time stamp is associated with the impact on the battlefield object.
- the battlefield object is tracked to perceive impacts on wearable devices 106 and/or non-wearable devices 108, 148, 172, 176, etc.
- Step 214 the location and orientation of the battlefield object are determined.
- the sensory data from the projectile-firing device data is preferably utilized to determine the location and orientation of the battlefield object.
- Step 216 it is determined whether the time the projectile was fired and the time of the impact on the battlefield object are within a predetermined time period.
- the predetermined time is measured in milliseconds, as the blaster's projectile velocity is preferably at least 90 FPS and can exceed 200 FPS. This determination provides confirmation that the blaster shot the particular projectile that was detected by the particular battlefield object.
- Step 218 it is determined whether the location of the blaster that fired the projectile and the battlefield object detecting the impact are at locations within a predetermined area.
- the physical range of the blaster 104 is limited, but the blaster 104 may be further limited by the application such that certain impacts outside of the predetermined area are outside of the permitted blaster's 104 range.
- the location determination permits further confirmation that the blaster shot the particular projectile that was detected by the particular battlefield object.
- Step 220 it is determined whether the orientation of the blaster that fired the projectile is aligned with the orientation of the battlefield object detecting an impact by a projectile. If the orientation is within a predetermined direction or orientation, the orientation determination permits further confirmation that the blaster shot the particular projectile that was detected by the particular battlefield object.
- the gaming metric data is updated confirming a successful impact on the battlefield object and a successful hit by the blaster 104.
- the updated gaming metric data attributes a decrease in health points to a user or battlefield object that was impacted by the blaster 104, in accordance with the gaming configuration, and attributes damage inflicted upon the user or battlefield object to the user associated with the blaster 104.
- the gaming metric data in a preferred embodiment tracks player scores, updating the gaming metric data to confirm successful impacts (e.g., projectile hits) from particular blasters 104 against particular wearable 106 and non-wearable battlefield objects 108, 148, 172, 176, etc. in real- or near-real-time.
- Steps 216, 218, and 220 and the updating of the gaming metric data in Step 222 are preferably conducted using the gaming application.
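- A minimal sketch of the co-witnessing checks in Steps 216, 218, and 220 follows; the threshold values and the coordinate/heading conventions are assumptions for illustration, not values taken from the disclosure.

```python
import math
from dataclasses import dataclass

@dataclass
class FireEvent:
    t_ms: int           # time stamp when the projectile was fired (Step 208)
    x_ft: float         # blaster location (Step 210)
    y_ft: float
    heading_deg: float  # blaster orientation, measured from the +x axis

@dataclass
class ImpactEvent:
    t_ms: int           # time stamp of the detected impact (Step 212)
    x_ft: float         # battlefield-object location (Step 214)
    y_ft: float

# Hypothetical thresholds; the disclosure leaves the exact values to the
# gameplay configuration.
MAX_DELAY_MS = 700          # Step 216: predetermined time period
MAX_RANGE_FT = 100.0        # Step 218: predetermined area / permitted range
MAX_BEARING_ERR_DEG = 10.0  # Step 220: orientation agreement

def attribute_hit(fire: FireEvent, impact: ImpactEvent) -> bool:
    # Step 216: firing and impact times must fall within the window.
    if not 0 <= impact.t_ms - fire.t_ms <= MAX_DELAY_MS:
        return False
    # Step 218: the impact must lie within the blaster's permitted range.
    dx, dy = impact.x_ft - fire.x_ft, impact.y_ft - fire.y_ft
    if math.hypot(dx, dy) > MAX_RANGE_FT:
        return False
    # Step 220: the blaster must have been oriented toward the target.
    bearing = math.degrees(math.atan2(dy, dx))
    err = abs((fire.heading_deg - bearing + 180.0) % 360.0 - 180.0)
    return err <= MAX_BEARING_ERR_DEG
```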
- FIG. 8 is a flow diagram of an extended-reality projectile-firing gaming method 300.
- blaster data is received from a blaster 104.
- the blaster data is preferably the projectile-firing device data associated with method 200.
- wearable vest data is received from a wearable vest 106.
- the wearable vest data is preferably the battlefield object data associated with method 200.
- battlefield object data is received from a battlefield object 108, 148, 172, 176, etc.
- the battlefield object data is preferably the same battlefield object data associated with method 200.
- the time of firing of a fired projectile from the blaster 104 is recorded. Preferably the time of firing is recorded and stored in connection with the gaming application.
- an impact of the fired projectile on the wearable vest 106 is confirmed.
- the methodology for determining an impact on a wearable vest 106 is disclosed elsewhere herein.
- the time of impact of the fired projectile on the wearable vest 106 is recorded.
- an impact of the fired projectile on the battlefield object 108, 148, 172, 176, etc. is confirmed.
- the methodology for determining an impact on a battlefield object 108, 148, 172, 176, etc. is disclosed elsewhere herein.
- the time of impact of the fired projectile on the battlefield object 108, 148, 172, 176, etc. is recorded.
- Step 318 it is determined whether the time of impact on the wearable vest 106 and the time of firing the fired projectile are within a predetermined time period. This determination provides confirmation that the blaster shot the particular projectile that was detected by the particular wearable vest.
- Step 320 it is determined whether the time of impact on the battlefield object 108, 148, 172, 176, etc. and the time of firing the fired projectile are within a predetermined time period. This determination provides confirmation that the blaster shot the particular projectile that was detected by the particular battlefield object.
- the predetermined time is measured in milliseconds, as the blaster's projectile velocity is preferably at least 90 FPS and can exceed 200 FPS.
- Step 322 a successfully fired projectile attributed to the blaster 104 is recorded.
- the application records the successful hit of the projectile and associates the successful hit with the blaster 104 and the associated user.
- a successful impact on the wearable vest 106 and/or the battlefield object 108, 148, 172, 176, etc. is recorded.
- the application records the successful impact of the projectile on either the wearable vest 106 and/or the battlefield object 108, 148, 172, 176, etc.
- a blaster's 104 projectile will likely only be attributable to a single wearable vest 106 or a single battlefield object 108, 148, 172, 176, etc.
- one or more wearable vests 106 and/or battlefield objects 108, 148, 172, 176, etc. may be impacted by a single "explosion."
- the gaming metric data is updated to record the successful impact and the successful hit and attribute each to corresponding devices and their associated users.
- method 300 does not specifically include location and/or orientation confirmation of successful impacts or hits. However, those confirmations may be implemented in the method 300. Alternatively, any combination of timing, location, and orientation data and/or information may be provided to confirm successful impacts or hits, and are within the scope of the present invention.
- the updated gaming metric data identified in the methods 200, 300 preferably is provided to the projectile-firing devices 104, 152, the display 154, and/or the HUD display 142, and/or the like, to provide updated gaming score information in accordance with the gaming configuration during the gaming session in real- or near-real-time.
- FIG. 9 is a smart target 400 in accordance with a preferred embodiment of the present invention.
- the smart target 400 is configured to be integrated and utilized with a smart target co-witnessing hit attribution system in accordance with a preferred embodiment of the present invention.
- the smart target 400 is a standalone target 108 or a monolithic target 148.
- the smart target 400 preferably includes a piezoelectric (PE) sensor 410 configured to determine an impact of a particular projectile upon (or near) the smart target.
- the smart target 400 is arranged as a "flower.”
- the standalone target 400 includes a pistil portion 402 (e.g., "pistil” of flower) and a plurality of petal portions 404 (e.g., "petals” of flower).
- the smart target 400 includes six petal portions 404, but in other embodiments, the number of petal portions 404 may be greater or fewer than six.
- Each of the pistil portion 402 and the petal portions 404 is configured to separately determine whether a projectile has impacted that portion of the standalone target 400.
- a PE sensor 410 is embedded or integrated and/or operably coupled with each of the pistil portion 402 and the petal portions 404.
- a single PE sensor 410 or two or more PE sensors 410 are utilized to sense impacts on the pistil portion 402 and the petal portions 404.
- a trained model as described further herein, is preferably utilized to enable electronic control circuitry to determine when a projectile has impacted a particular portion of the smart target 400.
- the same target impact methodology is utilized for the wearable device targets 136 and the central and outer portions 138, 140 of the standalone target 108.
- the smart target 400 includes support members 406 to support the pistil portion 402 and the petal portions 404.
- the smart target 400 also includes a stand 408 configured to support the smart target 400 in an upright position.
- a PE sensor 410 is an electromechanical device that uses the piezoelectric effect to measure changes in pressure, strain, force (or other physical effects) by converting them to an electrical charge or signal. In smart target parlance, the PE sensor 410 is utilized for hit detection. In other embodiments, sensors other than the PE sensor 410 may be implemented, as described herein.
- the smart target 400 is affixed to one or more petals of a target system.
- the smart target depicted in FIG. 9 includes a central PE sensor 410 and outer "petal" PE sensors 410 configured to detect hit impacts.
- the same (or similar) modular PE sensor 410 is also configured to be used in connection with wearable vests, for example.
- FIG. 10 is a perspective view of a wearable device 412 including one or more smart targets 400 in accordance with a preferred embodiment of the present invention.
- the PE sensors 410 that form the center (e.g., central PE sensor 410) and petals (outer "petal" PE sensors 410) of the wearable device 412 are affixed in the central torso portion of the wearable vest depicted in FIG. 2.
- the wearable device 412 may include one or more PE sensors 410 on the back of the user's wearable vest or on shoulder portions of the wearable device.
- the wearable device 412 is a smart target 400.
- one or more PE sensors 410 may be trained to determine a successful impact of a projectile (e.g., gellet).
- logic is trained using machine learning (ML) to identify positive strikes against background noise and artifact impacts by identifying the waveform signature (i.e., electrical signal) specific to gellet composition and anticipated material density.
- logic trained through machine learning obtains training data; a robot (or the like) uses a smart blaster 416 and varies distance, capturing data to understand the waveform signature that a gellet (or other projectile) generates upon a successful hit.
- a two-dimensional convolutional ML model may be trained by multiple sensor inputs being provided simultaneously. The captured data provides a model to be utilized by the smart target 400 to detect a successful hit.
- the hit detection trained model accounts for off-angle impacts, variance of impact pressure, and other physical metrics (i.e., relative location adjacent to the smart target 400) to increase or decrease the amount of "damage" points given to the user upon a successful hit.
- the machine learning element obtains data for the trained model used by the PE sensor to understand the angle of impact and range of a gellet impact, as but one example.
- the trained model is generated using machine-learning-assisted noise reduction (e.g., ambient vibration, non-projectile impacts), which may be implemented in hardware, firmware, software, or a combination thereof.
- Machine learning also may include identification of different materials that may be advantageous to eliminate or reduce noise and provide a "cleaner" waveform signature to be used in creating filtering or the like.
- Envelope detection and peak detection circuits may be provided for determining characteristic gellet impact signals. This feature is not limited to gellet projectiles, but can be used for all types of projectiles including airsoft, paintball, bow style projectiles, firearms, etc.
- the trained model also reduces the possibility of "false positive" impacts or cheating, distinguishing non-truthed impacts from real impacts.
- the accuracy embodied by the trained model eliminates or reduces the necessity for a referee to determine which individual was hit by a projectile, and attribute the hit to another individual.
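- A minimal sketch of a two-dimensional convolutional model of the kind described above follows, assuming PyTorch; the channel counts, layer sizes, and stand-in training batch are illustrative, not the actual trained model.

```python
# Sketch, assuming PyTorch: several sensor channels sampled simultaneously are
# stacked into a 2-D "image" (sensors x time samples) and classified as
# background/artifact versus gellet impact.
import torch
import torch.nn as nn

N_SENSORS = 4    # hypothetical number of simultaneous sensor inputs
N_SAMPLES = 256  # hypothetical waveform samples per capture window

class HitClassifier(nn.Module):
    """2-D convolutional model: input (batch, 1, N_SENSORS, N_SAMPLES),
    output logits for {background/artifact, gellet impact}."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=(2, 7), padding=(0, 3)), nn.ReLU(),
            nn.MaxPool2d((1, 4)),
            nn.Conv2d(8, 16, kernel_size=(2, 5), padding=(0, 2)), nn.ReLU(),
            nn.AdaptiveAvgPool2d((1, 8)),
            nn.Flatten(),
            nn.Linear(16 * 8, 2),
        )

    def forward(self, x):
        return self.net(x)

# Training-loop outline: robot-captured waveforms (varying distance and angle)
# supply x; labels mark true gellet hits versus noise/artifacts.
model = HitClassifier()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()
x = torch.randn(32, 1, N_SENSORS, N_SAMPLES)  # stand-in batch
y = torch.randint(0, 2, (32,))                # stand-in labels
opt.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
opt.step()
```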
- the smart target 400 includes a filter and amplifier circuit tuned to impact responses for detecting vibration.
- the filter and amplifier circuit may be configured to eliminate portions of a signal from the PE sensor 410 that are not attributable to a hit by a projectile.
- the trained model permits the designer of the filter and amplifier circuitry to understand, based on many multiple projectiles hitting a target with a PE sensor 410, the waveform signature of a successful hit.
- the filter and amplifier circuit is preferably designed to eliminate background noise and non-projectile impacts, and to differentiate between real impacts by a projectile and everything else that may occur during gameplay.
- the smart target 400 includes an IR detection sensor 414.
- Hit detection may be implemented such that if the PE sensor 410 identifies a hit, but the IR detection sensor 414 does not, then the detected hit is not counted as a real strike.
- the PE sensor 410 (i.e., hit detection) versus IR sensor 414 (i.e., hit recognition) may be expressed as a ratio, and a successful hit is not counted unless the ratio exceeds a particular threshold.
- the IR detection sensor 414 may also confirm the angle at which a projectile hit the target, which in a gameplay environment, may adjust "damage" or “hit points” up or down depending upon the angle of impact, tracked and calculated by the application.
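- The ratio-based co-witnessing gate described above might be sketched as follows; the window length and threshold values are assumptions for illustration.

```python
# Minimal sketch of the ratio gate: PE detections ("hit detection") against
# IR confirmations ("hit recognition") inside a short window.
from collections import deque

WINDOW_S = 0.050       # co-witness window (hypothetical)
RATIO_THRESHOLD = 0.5  # required IR-to-PE ratio for a counted hit (hypothetical)

pe_events: deque = deque()  # timestamps of PE-detected impacts
ir_events: deque = deque()  # timestamps of IR-recognized hits

def _prune(events: deque, now: float) -> None:
    # Drop events that have aged out of the co-witness window.
    while events and now - events[0] > WINDOW_S:
        events.popleft()

def hit_counted(now: float) -> bool:
    """Count a hit only if the IR:PE ratio in the window exceeds the threshold."""
    _prune(pe_events, now)
    _prune(ir_events, now)
    if not pe_events:
        return False
    return len(ir_events) / len(pe_events) > RATIO_THRESHOLD
```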
- the IR sensor 414 can be embedded with the PE sensor 410. In other embodiments, the IR sensor 414 could be in a different location from the PE sensor 410. Depending on the application (i.e., wearable device 412 vs. standalone smart target 400), it may be advantageous to locate the IR sensor 414 near the PE sensor 410, or distanced apart. For example, the distance between the IR sensor 414 and the PE sensor 410 preferably is from ½ centimeter to 2 meters. In addition, there may be advantages to the location of the IR sensor 414 versus the PE sensor 410 at varying ranges. For example, logic trained with machine learning may indicate that optimum distances between the sensors are achievable at ranges in accordance with the application. For example, in a blaster-based gaming environment, it may be advantageous to train logic at distances most likely to be experienced during gameplay (e.g., 50 feet or less).
- IR messaging modules are preferably universal and included to interface with other IR devices.
- IR modules preferably consist of a high-power infrared LED meant for line-of-sight transmission systems, and an infrared receiver module with matched optical characteristics.
- the IR modules combined with application features include IR-based universal interaction with the application for actions such as shoot-to-interact and shoot-to-pair.
- Other sensors or methodologies may be utilized alternatively or in combination with the PE sensor 410 and IR sensor 414 co-witnessing techniques.
- a brain-computer interface (BCI) based impact detection system may include skin-contact electroencephalogram (EEG) sensors (or others) that read a player's physiological response to being hit by a projectile and count it as a hit in the system.
- the BCI technique may be utilized in combination with IR cowitnessing or other confirmation methods.
- computer-vision (CV) aided hit detection may be utilized.
- cameras or other RF sensitive equipment may watch for projectile impact and use machine learning (ML) to filter noise and false positives.
- the CV-aided hit detection methodology may be best suited to controlled arena play where external cameras/sensors can be permanently mounted, and cost concerns are mitigated.
- the CV-aided hit detection methodology utilizes video and/or photo recognition technology to help accuracy and gameplay experience.
- a BCI sensor 458 is depicted encircling the head of the user 116, 118.
- the BCI sensor 458 includes EEG sensors 460 configured to record an electrogram of spontaneous electrical activity of the brain. While the EEG sensors 460 are depicted in certain areas of the BCI sensor 458, they are placed for illustrative purposes only, and one of ordinary skill in the art would recognize that the EEG sensors 460 would be optimally placed in order to accurately monitor and record electrical activity of the brain.
- a MyndBand EEG headset may be configured to be used in connection with the system disclosed herein. Other headsets or devices are within the scope of the present invention.
- a full or partial “helmet” may be utilized to facilitate placement of the EEG sensors 460 on portions of the head of the user 116, 118 that are not covered by the “headband” style headset depicted in FIG. 10.
- the BCI sensor 458 preferably is configured to filter or reduce noise of the EEG sensors 460 signals by utilizing a trained model. Similar to the descriptions of ML-trained models herein, a blaster 104 is shot multiple times from varying distances and angles at a smart target 400 and data is gathered from the BCI sensor 458 in the form of EEG sensors 460 signals.
- a trained model can assist the control circuitry of the smart target 400 (or distributed/remote software application and/or circuitry) to filter out false positives by isolating portions or bands of frequencies. This will assist the system in quickly and efficiently determining whether an object that hits the user is a projectile, and may further allow the system to determine impact angle and distance.
- the signal from the EEG sensors 460 may appear different (e.g., different distribution of frequencies, or reduced intensity at certain frequency bands) at an impact angle of 0° versus 45°.
- the signal from the EEG sensors 460 may appear different at close ranges versus farther ranges (e.g., different intensity or reduced intensity at greater distances).
- the MyndBand uses thin and flexible electrodes (e.g., EEG sensors 460) in direct skin contact with areas of the user’s head.
- the flexible electrodes may be mounted in hats, headbands, Gel Blaster masks, or other wearables.
- the EEG sensors 460 were connected to a small computing device with a Bluetooth transmitter.
- the EEG sensors 460 produced signals (i.e., the resulting data) that were parsed to indicate brainwave changes associated with being hit by a gellet (e.g., projectile).
- the study utilized a surprise impact methodology such that the participants were not anticipating being hit by a gellet.
- the participants were instructed to proceed with routine tasks until a tester used a blaster 104 to hit the users with a gellet.
- the study results showed a significant change in the EEG sensors 460 signal using different participants upon being hit by a gellet. A subset of the data acquired is shown below in TABLE 2.
- the EEG sensors 460 signals may vary in their intensity. It may be advantageous to utilize the BCI sensor 458 data as a primary or secondary co-witnessing element (e.g., along with the PE sensor 410, the IR sensor 414, or other suitable co-witnessing elements) depending on the range and reliability of the EEG sensors 460 signals at those ranges.
- the control circuitry or remote computing device can prioritize the BCI sensor 458 data depending on the gameplay environment, ranges, and other factors.
- a fabric deformation hit detection methodology may be utilized to provide cowitnessing of an impact.
- Smart textile wearables may detect changes in electrical current when deformed from projectile impact.
- optical fibers or conductive fabrics embedded or weaved into a wearable device 412 or other wearable shirt or vest are within the scope of the present invention.
- the fabric deformation hit detection methodology may be an additional form of co-witnessing a hit, and may also be utilized with ML for noise (false positive) reduction.
- the fabric deformation hit detection methodology may be implemented through use of electronic textiles in which microcontrollers, sensors, and actuators may be integrated into wearable items.
- when one player is closer than a threshold close-quarters distance, the application can turn off gellet (projectile) firing and utilize IR only for hit attribution.
- FIG. 11 is a smart blaster 416 in accordance with a preferred embodiment of the present invention.
- the smart blaster 416 includes the mechanical and electronic structures described herein to shoot gellets and project IR radiation.
- the smart blaster 416 preferably includes a wireless networking device configured to transmit and receive data as a node in a wireless network (e.g., mesh network, as described herein).
- Hit attribution, in a preferred embodiment, is a combination of hit detection and hit recognition.
- hit detection includes the ability for a physical object with a PE sensor 410 to detect when the physical object was hit, as described above.
- hit detection determines whether an object that strikes at or near the PE sensor 410 is a projectile.
- Hit detection may also include timer logic between smart blaster 416 and target to associate time of impact to distance of shooter.
- triangulation utilizing a mesh network provides higher accuracy in hit detection.
- the smart target 400 includes embedded infrared (IR) detection sensors 414 that may detect hits from multiple angles.
- the smart blaster 416 (or other projectile-firing device) includes the ability to generate and project (e.g., emit) IR radiation in a beam 420 from an IR emitter 418.
- the IR emitter 418 includes a beam angle at which the IR radiation is projected that provides a spread/angle over distance.
- the range of an IR emitter 418 is 50 feet and the optics allow a spread of about 2 degrees, which preferably provides a beam spread of 2 feet at a distance of 50 ft.
- the IR emitter 418 spread may be defined as a wider diameter meant to illuminate at an angle where the top of the beam cone corresponds to a straight trajectory, and the bottom of the beam cone corresponds to a smart blaster 416 pointed towards the sky at a certain angle that makes the projectiles have a longer range, and thus the IR beam 420 accounts for projectile drop.
- a combination of these techniques could be used to accommodate different game modes and play styles.
- the smart blaster 416 may include a modifiable emitter 418 that includes two beams 420 having different spread angles.
- a point-blank shot may use a wider beam spread (e.g., 3 degrees) with a larger spread angle while a longer-range shot may use a narrower beam spread (e.g., 2 degrees) with a smaller spread angle.
- the emitter 418 may be configured to automatically switch between the different spread angles.
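- The beam-spread figures above follow from basic cone geometry: the beam diameter at distance d for a full spread angle θ is 2·d·tan(θ/2). A short check, consistent with the approximately 2-foot spread quoted at 50 feet:

```python
import math

def beam_diameter_ft(distance_ft: float, spread_deg: float) -> float:
    """Full beam-cone diameter at a given distance for a given spread angle."""
    return 2.0 * distance_ft * math.tan(math.radians(spread_deg / 2.0))

print(beam_diameter_ft(50, 2))  # ~1.75 ft: the ~2 ft spread quoted at 50 ft
print(beam_diameter_ft(50, 3))  # ~2.62 ft: the wider point-blank beam
```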
- IR-based hit detection sensors 414 embedded in the smart target 400 when coupled with smart blasters 416, can maintain bidirectional communication across multiple players and use timer-based co-witnessing to identify, arbitrate, and attribute hits to the corresponding shooter.
- the smart target technology may be implemented on smart blasters 416 (e g., emitters 418) and all other objects, including wearables and standalone targets (e.g., receivers 414).
- FIG. 12 is a schematic diagram of a circuit for the smart target co-witnessing hit attribution system in accordance with a preferred embodiment of the present invention.
- the circuit of FIG. 12 is configured to be utilized in connection with the petal portions 404, the outer petals of the smart target 400 depicted in FIG. 9.
- the circuit includes a microcontroller 422 (e.g., ATtiny404-SSMC SOIC), a PE sensor 410, a programming chip 424 (e.g., UPDI programming chip), interfaces 426 (e.g., SD1 and SM04B-SRSS-TB(LF)(SN) interfaces), an array of addressable LEDs 428 (e.g., WS2812s), and an IR sensor 414.
- the microcontroller 422 depicted in FIG. 12 is a Microchip ATtiny404SSMC-type microcontroller.
- the ATtiny404SSMC microcontroller utilizes high-performance, low- power AVR® RISC architecture.
- the microcontroller 422 is capable of running up to 20 MHz, with up to 2/4 KB Flash, 128/256 bytes of SRAM, and 64/128 bytes of EEPROM in a 14-pin package.
- the microcontroller 422 utilizes a flexible and low-power architecture including Event System and Sleepwalking, accurate analog features, and advanced peripherals.
- the microcontroller 422 includes PIEZO IN and IR SENSOR for piezo sensor 410 and IR sensor 414 inputs.
- the PE sensor 410 is a piezoelectric sensor configured to detect impacts at or near the physical location of the PE sensor 410.
- the programming chip 424 is a Microchip Unified Program and Debug Interface (UPDI) configured for external programming and debugging of the device.
- the interfaces 426 are SD1 and SM04B-SRSS-TB(LF)(SN) interfaces that are configured to receive I2C SCL and SDA signals.
- the SCL and SDA signals are clock and data signals, respectively.
- the array of addressable LEDs 428 is configured to drive LEDs or LED strips to light up portions of the petal portions 404.
- One of ordinary skill in the art would understand that other components may be utilized to achieve the electronic circuit and functionality represented by the circuit of FIG. 12, including a different programming chip 424, interfaces 426, and addressable LEDs 428.
- FIGS. 13-1, 13-2, and 13-3 depict portions of a schematic diagram of a circuit for a smart target co-witnessing hit attribution system in accordance with a preferred embodiment of the present invention.
- the circuit of FIGS. 13-1, 13-2, 13-3 is configured to be utilized in connection with a "pistil," the center portion 402 of the smart target 400 of FIG. 9.
- FIGS. 13-1, 13-2, and 13-3 include a microcontroller 430, a switch 432, a USB driver 434, a fuse 436, a USB receptacle 438, an external battery 440, an alternate microcontroller 442, an IR sensor 414, a PE sensor 410, interfaces 444, and an array of addressable LEDs 446.
- the ESP32-WROOM-32E microcontroller 430 is shown in FIG. 13-1.
- the ESP32-WROOM- 32E microcontroller 430 is a Wi-Fi + Bluetooth + Bluetooth LE MCU module that is configured for low-power sensor networks and voice encoding, music streaming and MP3 decoding.
- the ESP32-WROOM-32E includes a PCB antenna.
- the microcontroller 430 includes PIEZO IN and IR SENSOR inputs for the PE sensor 410 and the IR sensor 414.
- FIG. 13-1 also depicts the switch 432, configured to switch between the microcontroller 430 and the alternate microcontroller 442; the USB driver 434, configured to provide power to the circuit; a fuse 436 in connection with power operation; the USB receptacle 438, configured to receive a USB power input; and an external battery 440, configured to power the device when not connected to USB power.
- Designations of A-G indicate connections of the circuit to the components shown on FIGS. 13-2 and 13-3.
- FIG. 13-2 shows an alternate microcontroller 442 (e.g., ESP32-C3-WROOM-02-N4) in place of the previously described microcontroller 430.
- the ESP32-C3-WROOM-02 microcontroller 442 includes Wi-Fi and Bluetooth LE modules, 4 MB external SPI flash, and an on-board PCB antenna.
- FIG. 13-2 also depicts the PE sensor 410 and the IR sensor 414 as well as the array of interfaces 444 (as opposed to the single SD1 and SM04B-SRSS-TB(LF)(SN) interfaces).
- the portion of the circuit depicted in FIG. 13-2 also includes a filtering resistor 445 to filter the PE sensor 410 signal before it is input into the microcontroller 442.
- the filtering resistor 445 to filter the PE sensor 410 signal in FIG. 13-2 is chosen so that the peak of the detected signal output is set to a certain voltage. If the value of this resistor is changed, the peak voltage shown by the projectile impact will change as well. Filtering to increase detection accuracy may involve including a circuit that allows changing this resistor value in real time to decrease or increase the device sensitivity.
- filtering techniques could be implemented in the circuit or in firmware/software.
- an envelope detector to wrap a gellet waveform may be utilized.
- a peak detector circuit may also be utilized, in which an output only would be provided to the microcontroller as a high signal when the voltage is above a peak voltage threshold. The output of a "high" signal may also reduce sampling requirements.
- a bandpass filter may also be utilized to tune the circuit and isolate the waveform signature. Combinations of these filtering techniques could be utilized.
- the ML trained logic is utilized to determine the best filtration techniques to fit the particular application (e.g., gellet impact on a standalone target).
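- A software analogue of the bandpass, envelope, and peak-detection techniques described above can be sketched with SciPy; the sample rate, passband, and threshold below are assumptions chosen for illustration, not tuned values from the disclosure.

```python
# Sketch, assuming SciPy/NumPy; constants are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 10_000            # sample rate, Hz (assumption)
LOW, HIGH = 300, 2000  # passband bracketing the gellet waveform signature (assumption)
PEAK_THRESHOLD = 0.4   # normalized peak-voltage threshold (assumption)

def is_gellet_hit(pe_samples: np.ndarray) -> bool:
    # Bandpass filter to isolate the waveform signature.
    b, a = butter(4, [LOW, HIGH], btype="bandpass", fs=FS)
    filtered = filtfilt(b, a, pe_samples)
    # Envelope detector: the analytic-signal magnitude wraps the waveform.
    envelope = np.abs(hilbert(filtered))
    # Peak detector: report "high" only above the peak-voltage threshold.
    return bool(envelope.max() > PEAK_THRESHOLD)
```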
- Hit attribution technologies embedded in the smart target and associated circuitry, firmware, and software can be expanded to include other input sources such as smartphone microphones (or standalone microphones). For example, identification and inventory of gellet acoustic signatures may be differentiated against background noise and artifacts for increased accuracy. The acoustic signatures may be added to the PE sensor 410/IR sensor 414 logic to achieve increased accuracy, or any combination of these hit attribution sources of data may be used separately to provide an appropriate level of accuracy, depending on a variety of factors including complexity of the device, cost, manufacturing difficulty, or other factors. A smartphone could also provide other sources of data to the application, including GPS or other sensor data to integrate to achieve increased accuracy of hit attribution.
- a wearable or standalone device could implement the other sensor data to increase accuracy.
- a standalone device at a known location relative to the impact location may be utilized to determine distance (e.g., echoes), to provide additional sensor data for the application to utilize to increase accuracy.
- FIG. 13-3 shows the array of addressable LEDs 446 (e.g., WS2812s) and transistors 447 coupled to the data control input/output of the USB driver 434.
- FIGS. 14-17 depict exemplary smart target 400 mesh networks in accordance with preferred embodiments of the present invention.
- the target mesh networks depicted in FIGS. 14-17 may be utilized independently or in combination with each other.
- the target mesh networks of FIGS. 14A and 14B are built around a Bluetooth Low Energy (BLE) topology/protocol.
- the BLE protocol may utilize mesh networking to implement many-to- many communication.
- FIG. 14A depicts a basic BLE mesh network including a router 448, phone 450, root 452, and a plurality of nodes 454.
- the BLE mesh network in connection with an embodiment of the present invention is configured to control multiple Bluetooth- enabled devices from a single root 452.
- the root 452 controls all three nodes 454 by use of a publish/subscription model.
- the child nodes 454 are subscribed to the parent node 454 and the parent node 454 is publishing messages to the child nodes 454.
- FIG. 14B also depicts a basic BLE mesh network with a router 448, a phone 450, a root 452, nodes 454, and relay nodes 456.
- the relay nodes 456 are utilized to propagate child nodes 454 and move data downstream.
- FIGS. 15A, 15B, 15C are built around an ESP-NOW topology/protocol.
- the ESP-NOW protocol enables multiple devices to communicate with one another without using Wi-Fi, and is similar to low-power 2.4GHz wireless connectivity utilized in certain low-power devices.
- FIG. 15A there is shown a Broadcast Mode in which small packet messages are broadcast to all devices from the root 452 to the nodes 454.
- the message packets include sender ID and payload.
- FIG. 15B there is shown a Direct Message mode in which a direct message is sent from a root 452 to a device (e.g., node 454).
- FIG. 15C there is shown a Game State Broadcast mode in which a game state broadcast is sent from each device (node 454) in accordance with a predetermined interval.
- the current game state payload contains target hit counts and target miss counts.
- the smart target 400 sets its active state after comparing the incoming message with its current state. The higher counts are then set as the target's active state. The target then sends a Game State Broadcast.
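- The Game State Broadcast merge rule described above can be sketched as follows; the payload layout is an assumption beyond the stated hit and miss counts.

```python
# Sketch of the merge rule: compare the incoming game state with the current
# state, keep the higher counts as the active state, then rebroadcast.
current_state = {"hits": [0] * 7, "misses": [0] * 7}  # 7 targets (assumption)

def on_game_state_broadcast(incoming: dict) -> dict:
    """Merge an incoming game-state payload into this target's active state."""
    merged = {
        "hits":   [max(a, b) for a, b in zip(current_state["hits"], incoming["hits"])],
        "misses": [max(a, b) for a, b in zip(current_state["misses"], incoming["misses"])],
    }
    current_state.update(merged)
    return merged  # then sent out as this target's own Game State Broadcast
```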
- the target mesh networks of FIGS. 16A, 16B, 16C consist of a self-organizing and self-healing WiFi mesh network.
- FIG. 16A illustrates a self-organizing mesh network in which a node 454 is configured to connect with other nodes autonomously.
- the nodes 454 depicted in FIG. 16A are communicating with each other within the mesh network.
- the mesh network depicted in FIG. 16A permits multiple devices (i.e., nodes 454) to communicate with each other without the need for a router 448 through a wireless local area network.
- FIG. 16B illustrates a node 454 that is out of range (shown as a shaded node 454), and thus, not connected to the mesh network.
- FIG. 16C shows a connection to a node 454 (shown as a shaded node 454) that was out of range and lost connection, but reestablishes communication with a node 454 when it is within range of that particular node 454.
- This methodology is referred to as "self-healing" and can be maintained within a self-organizing mesh network autonomously.
- the target mesh networks of FIGS. 17A, 17B, 17C are WiFi mesh networks.
- FIG. 17A illustrates a basic WiFi mesh network in which the effective range of a router 448 is extended. Multiple nodes 454 within the network permit data to be transferred across the entire network, even if a particular node 454 may have been out of range of the router 448.
- the line depicted in the graphic illustrates the range of the router 448, in which only the root 452 is within range; however, the nodes 454 are configured to be connected to the root 452 (and other nodes 454) within the network to maintain connectivity and the exchange of data.
- FIG. 17B illustrates a mesh network where the router 448 and root 452 are integrated, thereby eliminating the necessity for a separate router 448.
- FIG. 17C illustrates a mesh network with a backup router 456, configured such that a node 454 of the network is also configured to be a router 448.
- the backup router/node 456 may be utilized to continue to facilitate communication with the nodes 454.
- FIGS. 14-17 illustrate preferred embodiments of the present invention.
- the smart target 400 system will adopt a wireless connection module type that allows for multiple smart targets 400 to be connected to the same network and function as nodes 454 to selectively increase gameplay complexity, as well as expand the network coverage area, thus increasing the gameplay area size.
- Mesh networking will also uniquely allow for the low-latency broadcast of messages and commands to all players 116, 118 and devices in the network despite line-of-sight limitations existent in other systems.
- achieving real-time or near-real-time communication via a mesh network may include WiFi without traditional TCP/IP protocol, simplified message IDs, and/or device IDs.
- simple messaging transmits single packets blindly, or until an acknowledgement is received on its complementary receiver.
- Data link messaging may establish a connection through simple messaging, then begins to transmit continuously through an error-corrected protocol.
- low-latency communications may be implemented through a unique protocol.
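- Simple messaging as described above (transmit single packets blindly, or repeat until acknowledged) might look like the following sketch; the radio object and its transmit/poll_receive methods are hypothetical stand-ins for the mesh transport.

```python
# Sketch of "simple messaging": fire-and-forget transmit, retried until an
# acknowledgement arrives on the complementary receiver.
import time

def send_simple(radio, msg_id: int, payload: bytes,
                retries: int = 5, ack_timeout_s: float = 0.02) -> bool:
    packet = bytes([msg_id]) + payload
    for _ in range(retries):
        radio.transmit(packet)              # blind single-packet transmit
        deadline = time.monotonic() + ack_timeout_s
        while time.monotonic() < deadline:
            ack = radio.poll_receive()      # complementary receiver
            if ack is not None and ack[:1] == bytes([msg_id]):
                return True                 # acknowledged
    return False                            # sent blindly, never acknowledged
```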
- for the smart target system and blasters, the reduction of required bandwidth and the usage of high-throughput wireless networks permit broadcasting information that pertains to gameplay across a number of different factors and controls.
- FIG. 18 is a table of a message protocol 458 in accordance with a preferred embodiment of the present invention.
- the message protocol 458 depicted in FIG. 18 includes "Dev Mode” 460 and "Game Mode” 462 for smart targets 400 and the application 464.
- "Dev Mode” 460 is a developer's mode 460 that includes Commands 466 for the smart targets 400 such as Set Target Index 468, Clear All Hit Count 470, Time Between Cycles 472, Cycles Between Hits 474, Start Slice Min Value 476, Minimum Read Value 478, and Hit Read Value 480.
- Commands for the application 464 include Sensor Hit Count 482, Dev App Settings 484, and Dev Sensor Readings 486.
- the "Game Mode” 462 is a mode used to facilitate communication between nodes 454 during a gameplay session that includes
- Commands 466 for the smart targets 400 such as Reset All Targets 488, Activate Target
- Commands 466 for the application 466 include Hit Message 494, Miss Message 496, and Active State 498.
- the Dev Mode 460 is for developers or programmers of the gameplay environment and development of features for gameplay.
- the Dev Mode 460 is also utilized to develop future physical functions of the smart targets and smart target systems.
- the Dev Mode 460 is preferably configured to develop ML trained logic for use with the PE sensor 410 or other sensors as described herein.
- the Game Mode 462 includes separate Target 400 and App 464 protocols 458.
- in the Game Mode 462 for the App 464, in a preferred embodiment, once a smart target 400 is hit, the target 400 will broadcast a message with an ID 500/Parameters 502 structure (i.e., the Hit Message Command 494).
- the hexadecimal ID 500 for a Hit Message Command 494 is 0xAA, identifying the message type.
- the Parameters 502 portion of the protocol 458 is an array of hexadecimal values (indices) configured to keep track of a hit count for each of the seven targets' hit counts 504 in the gameplay mode (e.g., Target 1 Hit Count . . . Target 7 Hit Count).
- the Player ID 506 portion of the Parameters 502 portion of the protocol 458 identifies a player associated with the target hit.
- the application/client 464 is configured to recognize each ID 500 type of the protocol 458 and configured to receive and recognize a particular number of Parameters 502 and process them accordingly.
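- A minimal sketch of framing and parsing the Hit Message 494 follows; the one-byte-per-field layout is an assumption, as the disclosure specifies only the ID/Parameters structure (ID 0xAA, seven per-target hit counts, and a Player ID).

```python
# Sketch: frame the Hit Message 494 and dispatch on its ID on the client side.
# The exact byte layout is an assumption.
import struct

HIT_MESSAGE_ID = 0xAA

def pack_hit_message(hit_counts: list, player_id: int) -> bytes:
    assert len(hit_counts) == 7  # Target 1 Hit Count . . . Target 7 Hit Count
    return struct.pack("8BB", HIT_MESSAGE_ID, *hit_counts, player_id)

def parse_message(frame: bytes) -> dict:
    """Client side: recognize the ID and unpack its expected Parameters."""
    msg_id = frame[0]
    if msg_id == HIT_MESSAGE_ID:
        *hit_counts, player_id = frame[1:9]
        return {"type": "hit", "hit_counts": list(hit_counts),
                "player_id": player_id}
    raise ValueError(f"unrecognized message ID 0x{msg_id:02X}")
```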
- the Hit Message 494 displays the number of times each designated panel has been hit by gellets on the mobile app.
- the Miss Message 496 displays the number of times gellets were fired but did not hit the designated (and connected) panel on the mobile app.
- the Target Miss Count 508 Command 466 is needed for scoring in individual game modes as some games will take into account the accuracy of a user's 116, 118 shots.
- the Active State 498 shows which of the panels are activated or not.
- App 464 Game Mode 462 therefore is configured to receive and display the Hit Message 494, Miss Message 496, and Active State 498 in accordance with embodiments of the present invention as described herein.
- the Target 400 Game Mode 462 includes Reset All Targets 488, Activate Target 490, and Color Test 492 Commands 466.
- the Reset All Targets 488 Command 466 resets each of the connected panels to the null or a deactivated state.
- the Activate Target 490 Command 466 activates all of the connected panels to be ready for the game to begin.
- the Color Test 492 Command 466 lights up the panels in a connected smart target with a custom color from individual Red, Blue, and Green hex values. In a preferred embodiment, the custom colors are determined in accordance with the Target States of FIG. 19.
- the Activate Target 490 Command 466 includes Target State 510 Parameters 502 (Target 0 State 510, Target 1 State 510 . . . Target 6 State 510).
- FIG. 19 is a table of smart target 400 Target States 510 in accordance with a preferred embodiment of the present invention. In accordance with FIG. 19, there are disclosed 256 possible Target States 510, each specified by an ID 512 in a hex value. For example, the "Off" 516 Target State 510 is associated with the ID 512 0x00 hex value.
- Each ID 512 and Target State 510 is also associated with a Color 514; for example, the ID 512 of 0x00 and Target State 510 "Off" 516 is associated with Color 514 #000000 (i.e., no color).
- FIG. 19 identifies other Target States 510 (i.e., State column) of Hit 518, Miss 520, BT (i.e., Bluetooth) not connected 522, BT connected 524, Player 1 527, and Player 2 528.
- Other Target States 510 are possible in accordance with embodiments of the present invention.
- the Color 514 column of the Target States 510 protocol identifies the hex value (e.g., hex code) of the color, and in parentheses, the actual color associated with the hex value. For example, Magenta 530 is ascribed a hex value of #FF00FF, Green 532 is ascribed a hex value of #00FF00, and Red 534 is ascribed a hex value of #FF0000.
- each hex value indicates amounts of Red/Green/Blue.
- the Color Test 492 Command 466 is a way to test specific colors by providing the red, green, and blue hex values.
- the Target States 510 in FIG. 19 have a predefined color assignment and only give information about the current state of each panel. Target State 510 will tell the app 464 whether the panel has received a miss or a hit; it will also communicate the Bluetooth connection state of the smart target 400 device as a whole, among other states.
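- Resolving a Target State 510 ID to its predefined Color 514 might be sketched as follows; only the Off/0x00/#000000 mapping is given explicitly above, so the remaining IDs and color assignments below are hypothetical placeholders.

```python
# Sketch of a state-to-color lookup; entries other than "Off" are hypothetical.
TARGET_STATES = {
    0x00: ("Off",              "#000000"),  # given above: no color
    0x01: ("Hit",              "#00FF00"),  # hypothetical assignment
    0x02: ("Miss",             "#FF0000"),  # hypothetical assignment
    0x03: ("BT not connected", "#FF00FF"),  # hypothetical assignment
}

def state_color(state_id: int):
    """Resolve a Target State ID to its (R, G, B) components."""
    _, hex_code = TARGET_STATES[state_id]
    value = int(hex_code.lstrip("#"), 16)
    return (value >> 16) & 0xFF, (value >> 8) & 0xFF, value & 0xFF
```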
- the Parameters 502 portion of the App 464 Game Mode 462 for the Activate Target 490 Command 466 includes Target 1 State 510 . . . Target 7 State 510, and Player ID 506.
- the Target 1 State 510 . . . Target 7 State 510 identifies the current Target State 510 in accordance with FIG. 19.
- the Player ID 506 is the identifier for each connected smart blaster 416.
- the following are the target states 510 of a smart target 400 in accordance with a preferred embodiment of the present invention.
- the 'Off' 516 state represents that the panel is turned off and not detecting impacts.
- the 'Hit' 518 state is communicated when the panel is hit by a gellet.
- the 'Miss' 520 state is communicated when the panel is missed by a gellet (by time out or by hitting another panel).
- the 'BT not connected' state 522 is communicated when the smart target 400 is not connected to a user's phone with Bluetooth.
- the 'BT connected' state 524 is communicated when the smart target 400 is connected to a user's phone with Bluetooth.
- FIG. 20 is a table of a message protocol 542 in accordance with a preferred embodiment of the present invention.
- FIG. 20 depicts a message protocol 542 that is an alternative embodiment to the message protocol 458 of FIG. 18.
- the "Game Mode” 462 is a mode used to facilitate communication between nodes 454 during a gameplay session that includes Commands 466 for the smart targets 400 such as Reset All Targets 488, Activate Target 490, similar to those same commands depicted and described in connection with FIG. 18.
- Commands 466 for the application 464 include Hit Message 494, Miss Message 496, and Active State 498.
- the Commands 466, ID 500, and Parameters 502 portions of the message protocol 542 generally function similarly to the same portions of the message protocol 458, as further described herein.
- the Parameters 502 portion of the message protocol 542 uses the nomenclature "Pad" instead of "Target."
- the nomenclature “Pad” can be substituted for the term “Target” as the “Pad” is a portion or area of the smart target 400 such as the pistil portion 402 and the petal portions 404.
- the message protocol 542 includes additional Commands 466 relating to the Target 400 portion of the protocol 542 such as Deactivate Target 544, Activate Pad 546, Activate Hidden Pad 548, Set All Color 550, Set Color State 552, Set Device Name 554, and Set Device Mode 556.
- the Deactivate Target 544 Command 466 is configured to deactivate the smart target 400, conceivably in a scenario in which, according to the gameplay configuration, the target has been taken out by gellet hits and is no longer active.
- the Activate Pad 546 Command 466 is configured to activate a portion or area of the smart target 400, such as only the pistil portion 402 or one of the petal portions 404.
- the Activate Hidden Pad 548 Command 466 is configured to activate a pad (e.g., pistil or petal portions 402, 404) that has previously been hidden in accordance with the gameplay configuration by the software application.
- the Set All Color 550 Command 466 is configured to set the color of each of the pads to a specified RGB hex value.
- the Set Color State 552 is configured to set the color of specified pads (e.g., Pad 0 State) to a particular color.
- the Set Device Name 554 Command 466 is configured to specify a particularized name for a device (e.g., node 454 or smart target 400) using an array of hex values.
- the Set Device Mode 556 Command 466 is configured to set a Device Target Mode 560 in accordance with the Target Modes 560 depicted in the table of FIG. 21B.
- the message protocol 542 includes the additional Command 466 relating to the App 464 portion of the protocol 542 of Display State 558.
- the Display State 558 Command 466 is configured to display the particular Target State 510.
- the Target State 510 is displayed on the application 464 to the user 116, 118.
- FIG. 21A is a table of smart target 400 Target States 510 in accordance with a preferred embodiment of the present invention.
- there are 256 possible Target States 510, each specified by an ID 512 in a hex value.
- the "Off" 516 Target State 510 is associated with the ID 512 0x00 hex value.
- each ID 512 and Target State 510 is also associated with a Color 514; for example, the ID 512 of 0x00 and Target State 510 "Off" 516 is associated with Color 514 #000000 (i.e., no color).
- Other Target States 510 are disclosed as “(Active)” states.
- the Color 514 column of the Target States 510 protocol identifies the hex value (e.g., hex code) of the color, and in parentheses, the actual color associated with the hex value. Each of the colors depicted in FIG. 19 is repeated in FIG. 21A.
- FIG. 21B is a table of smart target 400 Target Modes 560 in accordance with a preferred embodiment of the present invention.
- the Target Modes 560 are configured to operate the colors of the smart targets 400 in accordance with a prescribed animation or gameplay configuration.
- an ID 568 and State 570 indicate the specific hex value and associated information about the state of the Target Mode 560.
- a Rainbow Animation 572 is configured to display a rainbow display of color on the smart target 400.
- a Flipper Game 574, an Eliminator Game 576, and an IR Dev Game 578 are all configured to operate colors and timing of the colors in connection with a particular gameplay configuration.
- One of ordinary skill in the art would understand that a variety of animations and gameplay configurations are within the scope of the present invention and those depicted in FIG. 21B are exemplary.
- FIGS. 22A and 22B depict a flow diagram of a smart target co-witnessing hit attribution method 600 in accordance with a preferred embodiment of the present invention.
- a first sensor signal from a first sensor is received in response to an impact at or near a first smart target location.
- the first sensor is the PE sensor 410.
- the first sensor is an IR sensor 414, an EEG sensor 460 (or BCI sensor 458), a computer-vision sensor, a radio-frequency sensor, or a smart textile sensor. While the first sensor is specified to be a single sensor, other embodiments in which multiple PE sensors 410 or IR sensors 414 or combinations of the above-referenced sensors are provided are within the scope of the present invention.
- the first sensor signal is triggered to be transmitted to and received by control circuitry preferably integrated with the first sensor upon sensing an impact.
- a PE sensor 410 generates an electrical charge or signal that corresponds to the physical act that it sensed, and provides the signal for processing.
- the impact may be the impact of any object, including any type of projectile.
- it is determined whether the impact occurs at or near a predetermined distance (e.g., an area) from a predetermined location (e.g., the first smart target location).
- the predetermined distance may be set such that impacts only on the first smart target location (e.g., a pistil portion 402 or petal portion 404) are provided to the control circuitry.
- the predetermined distance may encompass a larger area than the first smart target location. In a case in which the first sensor senses an impact on a second smart target location, that data may be utilized to confirm impact on the second smart target location.
- the control circuitry may be configured to determine that the first sensor signal corresponds to a successful impact.
- the PE sensor 410 may be configured through circuitry (or otherwise) to provide only signals beyond a peak threshold to the control circuitry.
- the control circuitry is integrated with a peak detector and threshold.
- the first sensor may simply provide all signals in response to an impact at or near the first smart target location.
- other methods such as software or firmware manipulation or filtration may be provided to assure that the first sensor signal is an appropriate detection of a hit in association with the first smart target location.
- filtration techniques as described herein may be utilized to isolate and assist the control circuitry in determining that the impact was a projectile associated with the gameplay configuration and environment rather than another type of impact.
- the projectile is a gellet.
- the first smart target location preferably is a pistil portion 402 or petal portion 404.
- the PE sensor 410 alone, or in combination with control circuitry, may be configured to sense more than one smart target 400 location.
- a single first sensor may be utilized to sense impacts of projectiles at two locations on the smart target 400.
- the first smart target location may also be a smart target 400 located on the wearable device 412.
- a second sensor signal from a second sensor is received in response to an impact at or near the first smart target location.
- Step 604 is configured to co-witness the same impact event as that identified in connection with Step 602.
- the second sensor may be any of a variety of sensors including a PE sensor 410, an IR sensor 414, an EEG sensor 460 (or BCI sensor 458), a computer-vision sensor, a radio-frequency sensor, or a smart textile sensor.
- the first sensor is a PE sensor 410
- the second sensor is an IR sensor 414, or another type of sensor as described above.
- the first and second sensors are both PE sensors 410.
- the second sensor signal is generated in response to an impact at or near the first smart target location.
- the IR sensor 414 may be configured to provide an IR signal should it receive radiation at a particular wavelength.
- the wavelength is 940 nanometers.
- Other wavelengths are suitable for use as an IR sensor 414.
- the IR sensor 414 in an embodiment does not utilize infrared radiation, but rather, utilizes frequencies and wavelengths outside of the infrared band.
- One of ordinary skill in the art would understand that while certain wavelengths are theoretically possible, one should provide safe solutions that prevent dangerous situations for a user’s eyes and other body parts sensitive to certain types of radiation.
- the first sensor signal and the second sensor signal were determined to be received within a predetermined time period.
- the first and second sensor signals are presumed to arrive almost instantly from their respective sensors.
- One of ordinary skill in the art would understand how to provide the first and second sensor signals to control circuitry, or the like, to ensure that the time between sensing the impact and the time the signals are received is the same or substantially the same between the sensors.
- the predetermined time period preferably is set to be sufficiently small (e.g., fractions of seconds/milliseconds) to ensure that the signals were generated at or about the same time. This is important to ensure that the first and second sensors co-witnessed the same impact at the same time.
- the predetermined time period would necessarily be longer due to the time the projectile takes to impact a smart target 400.
- a predetermined time period depends upon several factors including circuitry delay, the relative delays caused by the circuitry for different types of sensors, the time the projectile takes to impact a target, and the velocity of the device 416, all of which are within the scope of the present invention.
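- A minimal sketch of the predetermined-time-period check in method 600 follows; the window and delay-compensation values are assumptions for illustration.

```python
# Sketch of Steps 602-606: credit a hit only when the first (e.g., PE) and
# second (e.g., IR) sensor signals arrive within the predetermined time period.
PREDETERMINED_WINDOW_MS = 5.0  # fractions-of-a-second window (assumption)
IR_CIRCUIT_DELAY_MS = 0.2      # relative sensor-path circuitry delay (assumption)

def co_witnessed(pe_t_ms: float, ir_t_ms: float) -> bool:
    """True when both sensor signals witnessed the same impact event."""
    aligned_ir = ir_t_ms - IR_CIRCUIT_DELAY_MS  # compensate circuitry delay
    return abs(pe_t_ms - aligned_ir) <= PREDETERMINED_WINDOW_MS
```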
- a smart target update is provided indicating a successful hit in connection with the first smart target location of the smart target 400.
- the smart target update is configured to identify the target state of each smart target location of the smart target.
- utilization of the target mesh networks and protocols described herein ensure sufficiently fast communication to ensure real-time or near-real-time updates to the smart target states.
- In Step 610, a third sensor signal from a third sensor is received in response to an impact at or near a second smart target location.
- the third sensor is the PE sensor 410.
- the third sensor is an IR sensor 414, an EEG sensor 460 (or BCI sensor 458), a computer-vision sensor, a radio-frequency sensor, or a smart textile sensor. While the third sensor is specified to be a single sensor, other embodiments in which multiple PE sensors 410 or IR sensors 414, or combinations of the above-referenced sensors, are provided are within the scope of the present invention.
- the third sensor signal is triggered to be transmitted to and received by control circuitry preferably integrated with the third sensor upon sensing an impact.
- a PE sensor 410 generates an electrical charge or signal that corresponds to the physical act that it sensed, and provides the signal for processing.
- the impact may be the impact of any object, including any type of projectile.
- the control circuitry may be configured to determine that the third sensor signal corresponds to a successful impact of a projectile.
- the PE sensor 410 may be configured through circuitry (or otherwise) to provide only signals beyond a peak threshold to the control circuitry.
- the control circuitry is integrated with a peak detector and threshold.
- the third sensor may simply provide all signals in response to an impact at or near the second smart target location.
- other methods such as software or firmware manipulation or filtration may be provided to ensure that the third sensor signal is an appropriate detection of a hit in association with the second smart target location.
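- A minimal sketch of the peak-threshold filtering described above, assuming a hypothetical threshold voltage; only PE sensor samples beyond the peak threshold are passed on to the control circuitry as candidate impacts.

```python
# Hedged sketch: software-side peak filtering of PE sensor samples.
# The threshold value is an assumption for illustration.
PEAK_THRESHOLD_V = 1.5  # hypothetical peak voltage for a projectile strike

def candidate_impacts(samples):
    """Yield (timestamp, voltage) pairs that exceed the peak threshold."""
    for timestamp, volts in samples:
        if volts >= PEAK_THRESHOLD_V:
            yield (timestamp, volts)
```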
- filtration techniques as described herein may be utilized to isolate and assist the control circuitry in determining that the impact was a projectile associated with the gameplay configuration and environment rather than another type of impact.
- the projectile is a gellet.
- the second smart target location preferably is a pistil portion 402 or petal portion 404.
- the PE sensor 410 alone, or in combination with control circuitry, may be configured to sense more than one smart target 400 location.
- a single third sensor may be utilized to sense impacts of projectiles at two locations on the smart target 400.
- the second smart target location may also be a smart target 400 located on the wearable device 412.
- In Step 612, a fourth sensor signal from a fourth sensor is received in response to an impact at or near the second smart target location.
- Step 612 is configured to co-witness the same impact event as that identified in connection with Step 610.
- the fourth sensor may be any of a variety of sensors including a PE sensor 410, an IR sensor 414, an EEG sensor 460 (or BCI sensor 458), a computer-vision sensor, a radio-frequency sensor, or a smart textile sensor.
- the third sensor is a PE sensor 410 and the fourth sensor is an IR sensor 414, or another type of sensor as described above.
- the third and fourth sensors are both PE sensors 410.
- the fourth sensor signal is generated in response to an impact at or near the second smart target location.
- the fourth sensor signal is an IR sensor 414 signal and the third sensor signal is a PE sensor 410 signal.
- the IR sensor 414 may be configured to provide an IR signal should it receive radiation at a particular wavelength.
- the wavelength is 940 nanometers.
- Other wavelengths are suitable for use as an IR sensor 414.
- the IR sensor 414 in an embodiment does not utilize infrared radiation, but rather, utilizes frequencies and wavelengths outside of the infrared band.
- One of ordinary skill in the art would understand that while certain wavelengths are theoretically possible, one should provide safe solutions that prevent dangerous situations for a user’s eyes and other body parts sensitive to certain types of radiation.
- the second smart target location is preferably distinct from the first smart target location, but located on the same smart target 400. However, in other embodiments, the first and second smart target locations are located on different smart targets 400.
- the first through fourth sensors and control circuitry are preferably configured such that a single impact at or near both the first and second smart target locations will generate signals in which either the first and second sensor signals confirm an impact or the third and fourth sensors confirm an impact, but not both. Stated otherwise, the sensors should be sophisticated enough so that impacts of projectiles on first and second smart target locations may be distinguished.
- In Step 614, the third sensor signal and the fourth sensor signal are determined to be received within a predetermined time period.
- the third and fourth sensor signals are presumed to arrive almost instantly from their respective third and fourth sensors.
- One of ordinary skill in the art would understand how to provide the third and fourth sensor signals to control circuitry, or the like, to ensure that the time between sensing the impact and the time the signals are received is the same between the sensors.
- the predetermined time period preferably is sufficiently small (e.g., fractions of seconds/milliseconds) to ensure that the signals were generated at or about the same time. This is important to ensure that the third and fourth sensors co-witnessed the same impact at the same time.
- if the third sensor and the fourth sensor are the same (e.g., two IR sensors 414), then the signals would typically be generated at around the same time.
- if the third sensor is the PE sensor 410 and the fourth sensor is the IR sensor 414, then the predetermined time period would necessarily be longer due to the time the projectile takes to impact a smart target 400.
- a predetermined time period depends upon several factors including circuitry delay, the relative delays caused by the circuitry for different types of sensors, the time the projectile takes to impact a target, and the velocity of the device 416, all of which are within the scope of the present invention.
- In Step 616, a smart target update is provided indicating a successful hit in connection with the second smart target location of the smart target 400.
- the smart target update is configured to identify the target state of each smart target location of the smart target.
- utilization of the target mesh networks and protocols described herein ensures sufficiently fast communication for real-time or near-real-time updates to the smart target states.
- In Step 618, multiple smart target updates are received in real-time or near-real-time.
- the smart target updates are provided in Steps 608 and 616 by control circuitry in connection with a wireless network in accordance with one or more protocols.
- the target mesh networks disclosed herein and the protocols identified as communicating via hexadecimal values may be utilized to provide or transmit the smart target updates.
- the smart target updates are target state updates in accordance with the protocols disclosed in FIGS. 18-21B.
- the smart target updates are configured to be received by an extended-reality gaming application.
- the extended-reality gaming application is, for example, the application 464.
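- As a hedged sketch of a target state update carried as hexadecimal values, the following Python encodes a hypothetical packet; the actual field layout is defined by the protocols of FIGS. 18-21B and is not reproduced here.

```python
# Hedged sketch: packing a smart target state update for the target mesh
# network. The field layout (ids, flag, timestamp) is hypothetical.
import struct
import time

def encode_target_update(target_id: int, location_id: int, hit: bool) -> bytes:
    ts_ms = int(time.time() * 1000) & 0xFFFFFFFF
    # <BBBI: target id, smart target location id, hit flag, 32-bit ms timestamp
    return struct.pack("<BBBI", target_id, location_id, int(hit), ts_ms)

packet = encode_target_update(target_id=0x04, location_id=0x01, hit=True)
print(packet.hex())  # transmitted as hexadecimal values over the mesh
```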
- FIG. 22B is a second portion of a flow diagram of a smart target co-witnessing hit attribution method 600 in accordance with a preferred embodiment of the present invention.
- FIG. 22B shows the continuation of FIG. 22A, indicating “A” from Step 618.
- In Step 620, projectile-firing device data is received, the data including the orientation and location of the device 416 and the time of firing of the device 416.
- the device 416 includes a smart target 400.
- the device 416 preferably includes an IR emitter 418 that emits IR radiation in a beam 420.
- the IR emitter 418 module or other components in the device 416 may supply the orientation and location of the device 416.
- In Step 622, the time of projectile impact and the time of firing of the device 416 are determined to be within a predetermined time period.
- a time stamp may be provided along with the first through fourth sensor signals, or a time stamp may be generated by control circuitry and packetized along with the smart target updates.
- a sensor and network latency may be calculated and predetermined such that receipt of the smart target update and the projectile-firing device data include a predetermined time delay from which it is determined whether the time of projectile impact and the time of firing occurred within the predetermined time period.
- the predetermined time period preferably is sufficiently small (e.g., fractions of seconds/milliseconds) to ensure that the co-witnessing of the impact and the attribution of the hit to the particular device 416 should be correlated. It may be advantageous to determine the predetermined time period based on a relatively immediate time window between emission of IR radiation from the IR emitter 418 and detection by the IR sensor 414.
- an expected time a projectile will travel over the distance from the device 416 to the smart target 400 may be calculated, and the predetermined time period determined based on the time window between firing of the projectile and the detection of an impact of an object by the PE sensor 410.
- the time to target for the projectile is approximately 147 milliseconds.
- the predetermined period of time therefore can be set to 147 milliseconds or greater to provide a margin of error (e.g., 150 ms or 160 ms).
- a timer can be set to expire at the predetermined period of time and a successful hit attributed both to the device 416 and the portion of the smart target 400 hit by the projectile if the time of impact is within the predetermined period of time.
- One of ordinary skill in the art would understand that a variety of methodologies to determine the predetermined time period are within the scope of the present invention.
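- The following Python sketch illustrates one such methodology, deriving the predetermined time period from an expected time-to-target plus a margin of error; the distance and margin values are assumptions for illustration (the 90 FPS velocity and the 147 ms / 160 ms figures come from the description above).

```python
# Hedged sketch: predetermined attribution window = expected projectile
# flight time plus a margin of error.
def attribution_window_s(distance_ft: float, velocity_fps: float = 90.0,
                         margin_s: float = 0.013) -> float:
    """Flight time over the distance, padded by a margin of error."""
    return distance_ft / velocity_fps + margin_s

# e.g., ~13.2 ft at 90 FPS is ~147 ms of flight; with the margin, ~160 ms
window_s = attribution_window_s(13.2)
```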
- impact detection by the PE sensor 410 is turned off, or in “sleep mode,” and the portion of the smart target 400 in question is set to only actively detect IR radiation.
- upon receipt of an IR message, a timer is started on the processor associated with the smart target 400 circuitry and the impact detection is activated. The timer is permitted to run for a predetermined period of time, usually less than a few seconds, with the optimal time determined by the physical distance between the device 416 and the PE sensor 410.
- if the timer expires without an impact being detected, the impact detection is turned off and the IR message received can be counted as a “miss.” If the PE sensor 410 detects an impact of a projectile, then the IR message is correlated with the PE sensor 410 detection and counted as a successful “hit.”
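- A hedged sketch of the arming sequence above: impact detection sleeps until an IR message arrives, a timer then arms the PE sensor for the predetermined period, and an impact within that window is correlated as a “hit,” otherwise the IR message is a “miss.” Class and method names are hypothetical.

```python
# Hedged sketch of IR-armed impact detection with a hit/miss timer.
import threading

class HitArbiter:
    def __init__(self, window_s: float):
        self.window_s = window_s
        self.armed = False   # PE impact detection starts in "sleep mode"
        self.result = None

    def on_ir_message(self):
        """An IR detection arms impact detection and starts the timer."""
        self.armed = True
        threading.Timer(self.window_s, self._expire).start()

    def on_pe_impact(self):
        if self.armed:       # correlate the IR message with the PE detection
            self.result = "hit"
            self.armed = False

    def _expire(self):
        if self.armed:       # timer ran out with no impact detected
            self.result = "miss"
            self.armed = False
```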
- one or more separate sensors are provided to accurately measure distance in real-time or near-real-time.
- a LIDAR sensor or laser-based sensor capable of measuring distances of up to 50 feet (e.g., a device 416 effective range) is provided. This distance measurement can be used by itself, or in combination with an integrated photogate-based chronograph integrated into the device 416.
- the barrel 129 is molded and/or integrated in an infrared wavelength-invisible material with two pairs of infrared LEDs pointed across the barrel, each facing a phototransistor sensitive to that wavelength.
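- A minimal sketch of the photogate chronograph described above: each LED/phototransistor pair timestamps the moment the projectile breaks its beam, and velocity is the gate spacing divided by the time difference; the 2-inch spacing is an assumption for illustration.

```python
# Hedged sketch: two-gate photogate chronograph integrated into the barrel.
GATE_SPACING_FT = 2.0 / 12.0  # hypothetical spacing between the beam pairs

def muzzle_velocity_fps(t_gate1_s: float, t_gate2_s: float) -> float:
    """Feet per second from the two beam-break timestamps."""
    dt = t_gate2_s - t_gate1_s
    if dt <= 0:
        raise ValueError("second gate must trigger after the first")
    return GATE_SPACING_FT / dt
```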
- a hard-coded time threshold can serve as the limiting factor in identifying which shooter achieved the first impact, where two shooters positioned at different distances from the target will have their projectiles arrive at different times, with one arriving after the target’s threshold expires, thus not being counted.
- the target mesh network can relay notices of multiple IR messages (e.g., IR sensor 414 detections) from different devices 416 being received at the same location on the smart target 400 back to the same devices 416 and any other “referee” devices in the target mesh network.
- a central processor handling hit attribution would then digitally stagger the projectile fire from these devices 416, to give the timing algorithm additional gaps between shots and IR messages across the devices 416, thus increasing shooter identification accuracy.
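- A hedged sketch of the digital staggering described above, assuming a hypothetical per-device offset; the central processor assigns each contending device 416 a distinct firing delay so the timing algorithm sees gaps between shots and IR messages.

```python
# Hedged sketch: digitally staggering fire across contending devices.
STAGGER_STEP_S = 0.050  # hypothetical spacing between device firing slots

def assign_fire_offsets(device_ids):
    """Map each device to a distinct firing delay, in seconds."""
    return {device: i * STAGGER_STEP_S for i, device in enumerate(device_ids)}

offsets = assign_fire_offsets(["blaster_A", "blaster_B", "blaster_C"])
```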
- the devices 416 can be outfitted with location-tracking sensors such as ultra-wideband (UWB) or Bluetooth Low Energy (BLE) that can selectively turn on or off the IR sensors physically positioned around the smart target 400 or wearable device 412, correlating the localization sensor data with the IR sensor directional data, further increasing accuracy in detecting the correct shooter.
- In Step 624, the device 416 is determined to be within a predetermined range of the first smart target location.
- the projectile-firing device data includes the location of the device 416, so a calculation of the distance between the device 416 and the first smart target location can be made.
- the location may be fixed in an arena environment or may be dynamic where the location is on a wearable device 412.
- the target mesh network may be utilized to provide triangulation between different nodes 454 so that a location may be determined relative to other nodes within the network.
- Alternative methodologies of location services may be utilized, such as GPS tracking or the like.
- the predetermined range may be preset based on the battlefield environment and gaming characteristics.
- the predetermined range may be 50 feet, 40 feet, 25 feet, 10 feet, or other ranges depending upon the range of the device 416 and the distances of the smart targets 400.
- the predetermined range may also be determined based on a calculation of the time between a time the device 416 is fired and a time the smart target 400 (or a location on the smart target 400) is impacted by a projectile. Using a known device 416 velocity (e.g., 90 FPS), the distance to the target may be calculated. Once the distance is known, a predetermined range may be the distance plus an additional amount of range (e.g., as a margin of error). Other methodologies to determine distance (and therefore, predetermined range) are disclosed herein, such as using a LIDAR sensor. One of ordinary skill in the art would understand that a variety of methodologies to predetermine the range are within the scope of the present invention.
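- The following Python sketch combines two of the methodologies above: the predetermined range is estimated from the fire-to-impact interval at a known velocity plus a margin of error, and the device location is then tested against it; the margin value is an assumption for illustration.

```python
# Hedged sketch: predetermined range from time-of-flight, then a 2D
# distance check between the device 416 and the smart target location.
def predetermined_range_ft(t_fire_s: float, t_impact_s: float,
                           velocity_fps: float = 90.0,
                           margin_ft: float = 5.0) -> float:
    """Estimated distance to the target plus a margin of error."""
    return (t_impact_s - t_fire_s) * velocity_fps + margin_ft

def device_in_range(device_xy, target_xy, range_ft: float) -> bool:
    dx = device_xy[0] - target_xy[0]
    dy = device_xy[1] - target_xy[1]
    return (dx * dx + dy * dy) ** 0.5 <= range_ft
```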
- In Step 626, the device 416 is determined to be oriented towards the first smart target location.
- the application 464 determines if a vector based on the orientation of the device 416 is aligned with a predetermined area of the first smart target location. For example, IR detection of the device 416 IR emitter 418 may occur at the first smart target location if the smart target 400 includes an IR sensor 414. In that case, the IR emitter 418 emits radiation in a beam 420 that spreads over distance, as more fully described herein.
- the predetermined area may be 2 feet, as the typical range of an IR emitter 418 is 50 feet and the optics allow a spread of about 2 degrees, which preferably provides a beam spread of 2 feet at a distance of 50 ft.
- the predetermined area may also be larger or smaller depending on the beam spread of the IR emitter 418. For example, a wider beam spread would typically mean a larger predetermined area while a narrower beam spread would typically mean a smaller predetermined area.
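- A minimal sketch of the orientation check above, assuming a simple planar geometry: the beam width at the target distance follows from the ~2-degree spread (about 2 feet at 50 feet), and the device is considered oriented at the target if its aim error keeps the beam over the predetermined area.

```python
# Hedged sketch: orientation alignment test sized by the IR beam spread.
import math

def beam_spread_ft(distance_ft: float, spread_deg: float = 2.0) -> float:
    """Full beam width at the given distance for a small angular spread."""
    return 2.0 * distance_ft * math.tan(math.radians(spread_deg / 2.0))

def oriented_at_target(aim_deg: float, bearing_deg: float,
                       distance_ft: float) -> bool:
    """True if the aim error stays within half the beam width."""
    off_axis_ft = distance_ft * math.tan(math.radians(abs(aim_deg - bearing_deg)))
    return off_axis_ft <= beam_spread_ft(distance_ft) / 2.0
```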
- In Step 628, gaming metric data is updated to indicate a successful hit by the device 416 on the first smart target location of the smart target 400.
- the updated gaming metric data preferably is configured to be updated and usable by the application 464.
- Steps 630, 632, and 634 are substantively similar to Steps 624, 626, and 628, except that instead of the first smart target location a second smart target location is an object of the determinations. It should be understood that the same device 416 may be utilized to hit both the first smart target location and the second smart target location. However, preferably the same projectile may not be determined to hit both the first and second smart target locations simultaneously.
- Step 622 may be utilized to match projectile impacts with a particular smart target location.
- if Step 622 is utilized in combination with Steps 624, 626, 628 or Steps 630, 632, 634, then higher accuracy may be accomplished.
- a variety of other sensors (e.g., three or more) may be provided to achieve higher accuracy. It is also the case that certain sensors may be better suited for particular ranges, gaming configurations, or devices.
- the BCI sensor 458 may be advantageous to use only at certain ranges to ensure that sufficient impacts occur on the user’s 116, 118 body.
- when the barrel tip 131 is in a reduced velocity mode, it may be advantageous to utilize sensor types other than the BCI sensor 458.
- a player state is updated to attribute the hit to a user associated with the device 416.
- the application 464 is configured to update the gaming metric data and display the data to a user (and on particular smart targets 400). For example, the hit may increment a hit count by 1 unit; in other embodiments, as described herein, the hit may count for a particular amount of hit points.
- the application 464 is configured to take the angle of impact and range into account, as well as the gaming configuration and “ammo” type used, to calculate and attribute a certain amount of hit points to the user.
- a smart target state is updated to attribute the hit against an appropriate target location of the smart target 400.
- the application 464 is configured to update the gaming metric data and display the data to a user (and on particular smart targets 400). For example, the hit may increment a hit count against the first and/or second smart target locations associated with a smart target 400.
- the application 464 is configured to take the angle of impact and range into account, as well as the gaming configuration and “ammo” type used, to calculate and attribute a certain amount of hit points against a particular location on a target.
- the locations may be pistil portions 402, petal portions 404, or standalone smart targets 400 affixed to the wearable device 412. Other embodiments with smart targets 400 affixed or located on other battlefield objects are within the scope of the present invention.
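- As a hedged illustration of the hit-point attribution described above, the following Python scales a base value by range, angle of impact, and “ammo” type; all multipliers and the base value are assumptions, not values from the application.

```python
# Hedged sketch: hit points scaled by range, impact angle, and ammo type.
import math

AMMO_MULTIPLIER = {"standard": 1.0, "heavy": 2.0}  # hypothetical ammo types

def hit_points(range_ft: float, impact_angle_deg: float,
               ammo: str = "standard", base_points: float = 10.0,
               max_range_ft: float = 50.0) -> float:
    range_factor = max(0.0, 1.0 - range_ft / max_range_ft)       # closer hits score more
    angle_factor = abs(math.cos(math.radians(impact_angle_deg)))  # glancing hits score less
    return base_points * range_factor * angle_factor * AMMO_MULTIPLIER.get(ammo, 1.0)
```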
- an IR sensor 414 is configured to provide an IR signal that represents a particular angle of detection and an intensity of the infrared radiation. Based on the angle of detection and the intensity of the infrared radiation, the method 600 may further determine whether the device 416 is within a predetermined margin of error with respect to the location of the first smart target location and the orientation of the device 416. Stated otherwise, the IR sensor 414 may provide the pertinent information correlating the first smart target location and the device 416 to increase the accuracy of the hit attribution between a user and a smart target 400. In addition, it may be possible to eliminate a need to obtain orientation and location information from the device 416 each time it is fired, so long as the angle of detection and intensity information is sufficient to gauge the orientation and location of the device 416 relative to the first smart target location.
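- A hedged sketch of the correlation described above, assuming an inverse-square intensity model (an assumption, as is every constant below): the IR sensor's angle of detection and intensity are checked for consistency with the device's reported bearing and range, within predetermined margins of error.

```python
# Hedged sketch: gauging device bearing/distance from IR angle and intensity.
import math

REFERENCE_INTENSITY = 1.0  # hypothetical reading at a distance of 1 foot

def estimated_distance_ft(intensity: float) -> float:
    """Inverse-square estimate of emitter distance from received intensity."""
    return math.sqrt(REFERENCE_INTENSITY / max(intensity, 1e-9))

def consistent_with_device(angle_deg: float, intensity: float,
                           reported_bearing_deg: float,
                           reported_range_ft: float,
                           angle_margin_deg: float = 5.0,
                           range_margin_ft: float = 5.0) -> bool:
    """True if the IR data agrees with the device data within the margins."""
    return (abs(angle_deg - reported_bearing_deg) <= angle_margin_deg
            and abs(estimated_distance_ft(intensity) - reported_range_ft)
                <= range_margin_ft)
```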
- functionality is implemented as software executing on a server that is in connection, via a network, with other portions of the system, including databases and external services.
- the server comprises a computer device capable of receiving input commands, processing data, and outputting the results for the user.
- the server consists of RAM (memory), a hard disk, a network connection, and a central processing unit (CPU).
- the server could be replaced with, or augmented by, any number of other computer device types or processing units, including but not limited to a desktop computer, laptop computer, mobile or tablet device, or the like.
- the hard disk could be replaced with any number of computer storage devices, including flash drives, removable media storage devices (CDs, DVDs, etc.), or the like.
- the network can consist of any network type, including but not limited to a local area network (LAN), wide area network (WAN), and/or the internet.
- the server can consist of any computing device or combination thereof, including but not limited to the computing devices described herein, such as a desktop computer, laptop computer, mobile or tablet device, as well as storage devices that may be connected to the network, such as hard drives, flash drives, removable media storage devices, or the like.
- the storage devices may include, e.g., a hard disk, another server, a NAS, or other devices known to persons of ordinary skill in the art.
- the various components of the present invention are stored or recorded on a hard disk or other like storage devices described above, which may be accessed and utilized by a web browser, mobile app, the server (over the network), or any of the peripheral devices described herein.
- One or more of the modules or steps of the present invention also may be stored or recorded on the server, and transmitted over the network, to be accessed and utilized by a web browser, a mobile app, or any other computing device that may be connected to one or more of the web browser, mobile app, the network, and/or the server.
- references to a “database” or to “database table” are intended to encompass any system for storing data and any data structures therein, including relational database management systems and any tables therein, non-relational database management systems, document- oriented databases, NoSQL databases, or any other system for storing data.
- Such software code, modules, or elements may be implemented with any programming or scripting language such as C, C++, C#, Java, Cobol, assembler, PERL, Python, PHP, or the like, or macros using Excel or other similar or related applications with various algorithms being implemented with any combination of data structures, objects, processes, routines or other programming elements.
Abstract
An extended-reality projectile-firing gaming system includes a projectile-firing device that includes a projectile repository, a battlefield object, a network configured to provide communication and control connectivity in accordance with at least one protocol to the projectile-firing device and the battlefield object, an extended-reality gaming application, and an extended-reality device. The extended-reality device includes at least one display and is configured to provide computer-aided vision of real and virtual objects and a computer-aided interface for interaction with the real and virtual objects, the computer-aided interface configured to receive commands or instructions associated with the real and virtual objects and corresponding to the gameplay configuration.
Description
SMART TARGET CO-WITNESSING HIT ATTRIBUTION SYSTEM AND METHOD
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application is a continuation-in-part of U.S. Patent Application No. 18/095,911 filed January 11, 2023, which is a continuation-in-part of U.S. Patent Application No. 17/862,001 filed July 11, 2022, which claims the benefit of U.S. Provisional Patent Application No. 63/220,343, filed on July 9, 2021. U.S. Patent Application No. 18/095,911 also claims the benefit of U.S. Provisional Patent Application No. 63/409,582, filed on September 23, 2022. All applications listed above are incorporated by reference in their entireties herein.
FIELD OF THE INVENTION
[0002] The present invention relates generally to projectile-firing gaming systems and more particularly to a smart target co-witnessing hit attribution system and method.
BACKGROUND OF THE INVENTION
[0003] Indoor or outdoor shooting, blaster or skirmish sports, such as paintball, laser tag, airsoft and the like are popular games and hobbies that allow people to run, exercise and have fun. Other shooting exercises, such as hunting, military and police training also involve physical activity. One example of a popular shooting device and game is Gel Blaster. Many people also enjoy video games and, in particular, video games that include guns or shooting (e.g., first-person shooter video games). Unlike video games, however, it is difficult to accurately keep track of hits and to appropriately attribute a particular hit to a particular shooting device or user. There therefore exists a need to accurately determine when a target or user is hit during a gameplay scenario, particularly when utilizing projectiles. U.S. Patent Nos. 7,796,155, 9,355,583 and 9,573,064 are incorporated by reference herein in their entireties.
[0004] The background description disclosed anywhere in this patent application includes information that may be useful in understanding the present invention. It is not an admission that any of the information provided herein is prior art or relevant to the presently claimed invention, or that any publication specifically or implicitly referenced is prior art.
SUMMARY OF THE PREFERRED EMBODIMENTS
[0005] The present invention includes “video gamifying” a real-life paintball/gel blaster/combat experience using augmented reality. The present invention in a preferred embodiment includes a system of wirelessly interconnected blasters, targets, wearables, and “battlefield” objects that allow players to engage in precisely tracked and managed projectile sports and games, resulting in an experience of “Video Games Outside”. This system contrasts with projectile-only sports such as airsoft and paintball, which provide tactile feedback but lack automatic score tracking and other digital features. It also contrasts with laser tag or non-projectile-firing sports, which may provide real-time access and control over scoring, game modes, weapon features, etc., but lack tactile feedback, and have to make use of non-intuitive and mostly invisible light physics. In contrast, the present invention includes a system utilizing visible projectile physics and methodology to keep track of score and other gameplay features.
[0006] In a preferred embodiment, the present invention includes digitizing an outdoor/indoor blaster gameplay experience thereby providing a mixed reality video game experience through the use of augmented reality (AR), virtual reality (VR), and/or extended-reality (XR) eyewear (goggles, glasses, helmet, etc.) while also including blasters that shoot or fire actual projectiles. It will be appreciated that the projectiles cause the human target a sufficient amount of sensation such that it makes the experience engaging. It will be appreciated that the use of the blaster shown in the accompanying drawings is only
exemplary and not limiting. Any type of blaster, gun, etc., that shoots projectiles (e.g., projectile-firing device) is within the scope of the present invention.
[0007] The AR/VR/XR eyewear and/or other wearable items, such as vests, pants, belts, shoes, uniforms, jerseys, suits, and the like and the connected system (including all software, programming, architecture, sensors, wireless connectivity, networking, tracking codes, trackers, etc.) allow the participants to integrate video game type elements and graphics (e.g., having a life bar, different types of ammunition, levels, etc.) within a blaster game, experience or training.
[0008] In a preferred embodiment, the present invention includes a vest/jersey or other wearable that can sense or determine when a player is hit or struck by a projectile (in an embodiment, at least within certain areas on the outer surface of the wearable). For example, the wearable and connected system can sense or log a hit from the front or the back. In a preferred embodiment, the wearable can log or determine the strength, level or intensity of the impact of the projectile. Alternatively, the application, or a combination of the wearable and the application, can log or determine the strength, level or intensity of the impact of the projectile. Therefore, when a player (or the wearable) is struck from a closer range the player loses a greater amount of health points than from a further range.
[0009] In a preferred embodiment, the system can determine which other participant or player shot the player that is hit. This may be done using a location and direction solution in each blaster, e.g., ultra-wide band (UWB) sensors, magnetometer, GPS, WiFi, etc., combined with knowing who is shooting at any given time (so the system can know or determine the approximate time delay between a player firing and when the wearable is hit). The wearable may include buttons or other switches that when pushed or switched cause actions to be taken within the game whether in the AR/VR/XR/virtual environment or in the physical world. For example, pushing a button or touching a predetermined location on
the wearable or on a strap or other worn item may cause the player's health points or level to increase after they were previously earned. Any gaming event can be associated with the switch, button or the like.
[0010] In a preferred embodiment, a software application ("app" or "application") is configured to be usable by all players or participants. The app can provide many different features, including app/game-controlled rate of fire on the blaster. The rate of fire feature (and any other game or invention related feature described herein) can be changeable or upgradeable based on gameplay and other predetermined features. For example, if the player visits a predetermined location and finds an upgraded weapon or gun or the like or plays the game for a predetermined length of time, the player may earn a higher rate of fire. If a player has a better or higher level weapon in the game, the player has the ability to cause more damage faster to other players wearing the system connected wearables. The system may include a system-on-chip (SOC) or system-on-module in or on a module connectable to the blaster that connects to the app wirelessly (e.g., via Bluetooth).
[0011] The app and system can then control the rate of fire of the blaster. This feature allows for a single blaster to represent many different potential digital weapons in the “game.” For example, a starter weapon might only allow for one shot per second in semi-auto mode and require a four second reload time after ten shots, while the top-end weapon in the game allows for ten shots per second in fully automatic mode and does not impose any “digital reload/time requirements.” As a result, a player would only need to physically reload the blaster every time the physical round hopper is emptied. This will also allow for when a player gets eliminated, their blaster does not fire anymore until a new game starts, they go re-spawn at their base, find an extra life somewhere within the battlefield, or otherwise, or other digital gameplay that is utilized during that game or round. It will be appreciated that there can be multiple scenarios where a player may not be able to use their blaster. For
example, in a preferred embodiment of the invention, the system may allow the ability to download different environments (e.g., AR/VR/XR environments) that can be used during gameplay. For example, the environments can include jungles, old west, ships, snow, cities, historic, world monuments or movie, comic book and other related themes (e.g., Star Wars, Marvel, Harry Potter, etc.). These environments may provide reasons or the ability to turn off blasting or provide other features within the system gameplay. For example, in a Harry Potter-related environment, there may be a spell or the like that can be "cast" and prevents all the other teams blasters or weapons from working. In a Star Wars environment, the blaster or other handheld item or weapon may become a lightsaber.
[0012] Any type of AR/VR/XR system or eyewear is within the scope of the present invention. For example, the system may connect or combine with Apple’s (or others') AR glasses.
The use of AR/VR/XR eyewear and the connection to the system, app, etc. may provide for abilities or features for players or participants. For example, the eyewear may provide the ability to see where your teammates are through walls, around buildings, etc. The eyewear may provide the ability to see the health being lost by another participant as you shoot them, and/or see your own health stats at all times or as you are hit. Motion tracking or sensing components, sensors or reflectors on the blaster or other items allow the game to replace the look of the physical blaster or other item in the player's hand. In other words, if a player has a pistol in the game, the player (and other players) will see a pistol in the AR/VR/XR eyewear and when the player upgrades to, for example, a bazooka, the gun will graphically change on the screen (e.g., similar to a first person shooter game). For example, the virtual gun shown in AR/VR/XR on the eyewear can change from a pistol to a machine gun when the player upgrades to that gun. The eyewear preferably provides the ability to see and go get supply drops (e.g., weapons, health kit, shield, potions, etc.) in the physical world (e.g., around the yard, playing area, arena, etc.). Accelerometers,
gyroscopes or other motion sensing devices can be included on or within the blaster or other physical item to track movement that can be digitized and displayed on the AR/VR/XR eyewear or on the app.
[0013] In a preferred embodiment, the battlefield environment provides inflatable or other barricades that are associated with the game (and may be wirelessly connected within the game or gaming session) look like other objects in the game that a player is hiding behind. For example, an inflatable wall may look like a rock wall in AR/VR/XR. QR codes can be used on the barricades and any other object within the game for tracking purposes. The eyewear preferably shows different “skin” or outfit in the game based on your team, your personal preferences, what skins you own or have earned, etc. The wearables may include QR codes or other motion trackers for improved tracking performance.
[0014] In a preferred embodiment, the AR/VR/XR eyewear provides the ability to find virtual shields, first aid kits, better weapons, ammo, treasures, extra lives, etc. at predetermined locations (e.g., laying on the ground, on a barricade or other object). For example, the player sees an object, such as an ammunition magazine in the eyewear at a distance and has to move to the predetermined location to obtain the magazine. Once the system determines the player is at the proper location (e.g., via UWB, magnetometer, GPS, WiFi, etc.), the player obtains the extra ammunition for use in the game. The system preferably provides the ability to create or have a shield around the player so that another player cannot shoot them (the other player's gun may not be able to fire). This can be done via location services and magnetometer data to know someone is pointing at a shielded area and then prevent their blaster from firing.
[0015] In an embodiment, a mobile device, such as an iPhone, can be mounted on the side or other location on the physical blaster to allow certain of the system features discussed herein. This can be a version of the system that does not include AR/VR/XR eyewear. Many
features described herein can be done without the eyewear. For example, the system connection between the app (via a mobile device on and/or connected to the blaster) can provide the system with features such as app controlled firing, earning of experience and health points, other health kit data, sensing or determination of proximity when firing (i.e., how hard the player is hit by the projectiles), etc. Any or all of the features described herein can be used together to provide a software application experience without the use of AR/VR/XR eyewear.
[0016] The system may include an accompanying PC/Mobile/Console game. For example, the applicant may partner with a game like Fortnite or one or more video games can be created or developed that accompany or can be used with the system. Any of the items described herein can be sold or provided as a kit or unit. For example, a gameplay kit can include a number of blasters, inflatable bases and/or barricades, grenades, ammunition, magazines, AR/VR/XR eyewear, downloadable app and/or game that are all sold as a unit. Modules that include components that allow connectivity to or within the system can also be sold or provided. The modules can connect or otherwise attach to items such as blasters. This allows the blaster to be used without being connected to the system (without the module) or within the system. For example, a module can be added to a blaster and a mobile device with software (e.g., a phone with an app) can be secured to the module such that the blaster can be used within the system. In another embodiment, the blaster can include the software and/or app and the necessary connectivity, etc. built therein.
[0017] Players may earn experience points (XP) in the system and game by playing outside and entering physical world tournaments. For example, the XP can be earned based on length of playing time or based on points earned on better or worse game play.
[0018] An embodiment can include remotely controlled blasters that are included at each team's base that can be controlled by someone in a different location via the game/system/app (camera and computer controlled firing versus the opposing team).
[0019] An embodiment can include players remotely piloting drones that shoot projectiles (e.g., Gel Blaster cannons) or other shooting device and that can attack the other team remotely utilizing a computer or mobile device.
[0020] In a preferred embodiment, the system can include features, components or elements for collecting and/or monitoring health data. For example, the system can connect or otherwise be associated with Apple HealthKit, Google Health or other systems to track calories burned while playing, monitor heart rate or determine other health data. In a preferred embodiment, the calories burned and other health data can be used for gamification. For example, the system can include daily, weekly and/or monthly contests for longest/highest game play and calories burned to earn XP. The digitization of the game play experience allows for community tournaments, neighborhood tournaments, teams, monthly contests, leaderboards, etc. In a preferred embodiment, the system includes the ability to broadcast any of the gameplay via cameras mounted on the players or their equipment (e.g., the eyewear) or by connecting to the feed that is broadcast on any or all of the player's AR/VR/XR eyewear or the app. This allows games to be broadcast or streamed on television, computers, Twitch, YouTube, etc., thereby allowing for a sport, television show, channel, etc. to be based around gameplay, tournaments, etc.
[0021] The system can also include remotely operated explosives (e.g., a C4-looking Gel Ball bomb) and inflatable smart forts for each team. The forts or other location/structure may include the ability for the other team to "defeat" or finish the fort. For example, the fort can include sensors or the like so that when the other team shoots a predetermined location or
area on the fort (e.g., a “death star hole”) enough times it will cause the inflatable fort/base to deflate.
[0022] The system may include built-in speed loading of gellets or other projectiles. The system can include the ability to digitally dispense or load ammunition. A smart ammo dispenser may be based on predetermined parameters within the gameplay. For example, a game may be set where players can only refill a certain number of times during the length of the game and after they have reached the limit the system prevents the player from reloading (e.g., locking the canister, stopping the blaster from working, etc.). The right to refill may be earned and the blaster may begin working again.
[0023] In a preferred embodiment, the module or other digitally related components of the present invention may be connectable to the power of the blaster. For example, the blaster may include leads or a jack that provides power to the external item (such as a module). This allows a module or other system related component to be retrofitted on a blaster or other similar device that includes a power source. This can provide power so that the direction sensing capability, localization, magnetometer, wideband, IR reader and/or reflector and other components can be powered. The wearable can also include infrared (IR) reflectivity that works or is operatively connected to an IR reader on the gun so that the system can sense what player is shooting another player and to provide other capabilities. The system can also provide connection so that other powered accessories (e.g., projectile velocity changer) on the blaster or the barrel thereof can be operated. For example, if a player earns a higher rate of fire or a higher velocity during gameplay, the system can provide instructions (via connectivity) to move or change something mechanically or physically on the gun (e.g., slide a sleeve) to provide the higher velocity projectile firing.
[0024] The system may include projectile firing devices, such as sniper rifles, that only work in certain areas or predetermined locations in the field of play. For example, the player may
earn a "large sniper rifle" within the game, but the rifle can only be used from a designated location (e.g., a certain hill). In use, based on the system sensing the location of the player the rifle cannot be fired unless the player is in a predetermined location (e.g., on the hill).
[0025] It will be appreciated by those of ordinary skill in the art that the present invention allows or provides for the incorporation of elements of arena survival video games into a mixed-reality AR/VR/XR environment. In a preferred embodiment, the system includes guns/blasters that include the capability of connecting to a mobile device that includes an app thereon. In a preferred embodiment, the system includes the ability for players or users to “level up” based on outside playing time, etc. Leveling up can include actions such as the ability to obtain better equipment (e.g., guns, ammunition, etc.), and other aspects desired by the player. This can also utilize or connect to exercise related apps or devices (e.g., Apple Watch, Fitbit, etc.). In a preferred embodiment, the system includes wearables that have sensors or the like thereon that can determine when they are struck. This may include capacitive indicator(s) on the wearable and communication/connectivity to a processor that provides feedback on AR eyewear/glasses or the equivalent. The vest or other armor can also provide sensing of the proximity of a shooter based on impact of the projectile.
[0026] In a preferred embodiment, the blaster includes one or more holes or openings that are auto-adjusted to modify feet-per-second of gel and communication/connectivity to an AR interface for different types and levels of blasters. In a preferred embodiment, the system includes Bluetooth or other wireless connectivity between elements of the system that are not physically connected (i.e., vest, glasses, blaster, etc.). Any of the elements or components of the system described herein can be wirelessly connected (or wired). Any of the physical items, e.g., barricades, magazines, forts, bases, wearables, blasters, eyewear and the like can include connectivity to the system.
[0027] In a preferred embodiment, the system includes location or other proximity based features (e.g., so a player can see the location of teammates, for providing higher assessed damage based on how close the shooter is to the target, the type of blaster). In a preferred embodiment, the system includes a heads-up-display in AR glasses with mixed reality to accommodate data from vest/blaster and other connected components while engaging with the real, physical environment simultaneously. It will be appreciated that the system can also be utilized for applications in law enforcement, military, training, hunting and other shooting or physical scenarios.
[0028] The present invention generally includes a smart target co-witnessing hit attribution system and method. In a preferred embodiment, the present invention includes providing accurate counting of projectile hits on a smart target integrated with an application to keep track of successful hits in order to provide a more rich, immersive, and accurate gaming or shooting experience.
[0029] In an embodiment, the smart target system is a stationary, modular device that uses piezoelectric sensors to detect and positively identify impact strikes from gellets (or other projectiles) while filtering out other types of impacts. The smart target includes wireless networking technology to communicate over a mesh network using a particular communications protocol to effectively keep "real-time" or near-real-time track of successful hits on the smart target.
[0030] In an embodiment, the smart target is a gameplay element in a projectile-firing gaming system that may be affixed, embedded, or modular with wearables, shields, drones, turrets, or other physical objects. Anything that is a physical object within a gameplay environment that a projectile is intended to impact may include a smart target. The smart target preferably includes a piezoelectric (PE) sensor configured to determine an impact of a particular projectile upon (or near) the smart target. One of ordinary skill in the art would
understand that a PE sensor is an electromechanical device that uses the piezoelectric effect to measure changes in pressure, strain, force (or other physical effects) by converting them to an electrical charge. In smart target parlance, the PE sensor is utilized for hit detection.
[0031] In accordance with another aspect of the present invention there is provided a smart target that includes a housing for receiving at least two sensors configured to co-witness an impact of an object at or near a predetermined distance from a predetermined location, a first sensor of the at least two sensors configured to detect an impact of an object at or near a predetermined distance from a predetermined location and generate a first sensor signal corresponding to the detected impact, a second sensor of the at least two sensors configured to receive information relating to the detected impact and generate a second sensor signal corresponding to the detected impact, and control circuitry configured to receive the first sensor signal and the second sensor signal, confirm whether the first sensor and the second sensor signals have co-witnessed the same impact of the object by determining whether they were received within a predetermined period of time, and provide a target state update via a network in accordance with at least one protocol.
[0032] The control circuitry may be configured to confirm, from the first sensor signal, whether the object includes a projectile based on a trained model configured to recognize impacts of projectiles, and generate projectile impact data when the object is confirmed to be a projectile, wherein the target state update includes the projectile impact data.
[0033] The second sensor signal may include one of an infrared signal from an infrared sensor, a brain-computer interface signal from an electroencephalogram sensor, a computer-vision signal from a computer-vision system, a radio-frequency signal from a radio frequency sensor, or a fabric-deformation signal from a smart textile sensor.
[0034] In accordance with another aspect of the present invention there is provided a smart target that includes a first target location, a first piezoelectric sensor configured to detect an impact of an object on or within a predetermined distance from the first target location and generate a first piezoelectric sensor signal corresponding to the detected impact, a first infrared sensor configured to detect infrared radiation at a predetermined wavelength and generate a first infrared sensor signal, control circuitry configured to confirm a successful projectile impact on the first target location when the object is determined to be a projectile by comparing the first piezoelectric sensor signal with a piezoelectric impact signal model trained to detect impact of the projectile and when an impact time of the object and a detection time of the infrared radiation each occurred within a first predetermined period of time, generate first target location state data indicating that the first target location was successfully impacted by the projectile, and transmit the first target location state data via a network in accordance with at least one protocol.
[0035] The first piezoelectric sensor signal may be filtered before it is compared with the piezoelectric impact signal model. The first infrared sensor signal may include an angle of detection and an intensity of infrared radiation.
[0036] The first piezoelectric sensor may be further configured to detect the impact of the object on or within a predetermined distance from a second target location.
[0037] The smart target may further include a second target location, a second piezoelectric sensor configured to detect an impact of an object on or within a predetermined distance from the second target location and generate a second piezoelectric sensor signal corresponding to the detected impact, and a second infrared sensor configured to detect infrared radiation at a predetermined wavelength and generate a second infrared sensor signal.
[0038] The control circuitry may be further configured to confirm a successful projectile impact on the second target location when the object is determined to be a projectile by comparing
the second piezoelectric sensor signal with the piezoelectric impact signal model trained to detect impact of the projectile and when an impact time of the object and a detection time of the infrared radiation each occurred within the first predetermined period of time, generate second target location state data indicating that the second target location was successfully impacted by the projectile, and transmit the second target location state data via a network in accordance with at least one protocol.
[0039] The smart target may include a frame configured to receive the first target location. The first target location may be located on a wearable device.
[0040] The first target location state data may be configured to be received by an extended-reality gaming application configured to control the projectile-firing device and the wearable device in accordance with a gaming session having a gameplay configuration for providing an extended-reality environment, receive projectile-firing device data from the projectile-firing device, the projectile-firing device data comprising an orientation and location of the projectile-firing device and a time at which the projectile-firing device was fired, update gaming metric data to indicate a successful impact of the fired projectile on the first target location and attribute a successful hit to the projectile-firing device on the first target location when the time the impact occurred on the first target location and the time the projectile-firing device was fired occur within a second predetermined period of time.
[0041] The gaming metric data may be updated to indicate the successful impact of the fired projectile on the first target location and the successful hit is attributed to the projectile-firing device on the first target location when the location of the projectile-firing device is within a predetermined range from the first target location and the orientation of the projectile-firing device is aligned at a predetermined area associated with the first target location. The updated gaming metric data may be configured to be displayed to a user.
[0042] In accordance with another aspect of the present invention there is provided a smart target co-witnessing hit attribution system that includes a network configured to provide communication and control connectivity in accordance with at least one protocol, a projectile-firing device comprising a projectile repository configured to carry projectiles for firing and an infrared emitter configured to emit infrared radiation at a predetermined wavelength and beamwidth, a smart target that includes a piezoelectric sensor configured to detect an impact of an object at or near a predetermined distance from a predetermined location relative to the smart target and generate a piezoelectric sensor signal corresponding to the detected impact, an infrared sensor configured to detect infrared radiation at a predetermined wavelength and generate an infrared sensor signal, control circuitry configured to confirm a successful projectile impact on the smart target when the object is determined to be a projectile by comparing the piezoelectric signal with a predetermined piezoelectric impact signal model trained to detect impact of the projectile and the impact of the object and the detection of the infrared radiation occurred within a first predetermined period of time, generate target state data comprising the successful projectile impact, and provide the target state data via the network, an extended-reality gaming application configured to control the projectile-firing device and the wearable device in accordance with a gaming session having a gameplay configuration for providing an extended-reality environment, receive projectile-firing device data from the projectile-firing device, the projectile-firing device data comprising an orientation and location of the projectile-firing device and a time at which the projectile-firing device was fired, receive the target state data from the smart target, the target state data comprising a time the smart target was impacted by the projectile, update gaming metric data to indicate a successful impact of the fired projectile on the smart target and attribute a successful hit to the projectile-firing device on the smart target when the time the impact occurred on the smart
target and the time the projectile-firing device was fired occur within a second predetermined period of time.
[0043] The gaming metric data may be updated to indicate the successful impact of the fired projectile on the smart target and the successful hit is attributed to the projectile-firing device on the smart target when the location of the projectile-firing device is within a predetermined range from the smart target and the orientation of the projectile-firing device is aligned at a predetermined area associated with the smart target.
[0044] The network may be a target mesh network configured to transmit messages including the target state data to the extended-reality gaming application. The at least one protocol may be configured to effectuate transmission and reception of data to nodes within the network using hexadecimal values. The hexadecimal values may specify data concerning a status of the target. The hexadecimal values may specify data concerning a status of a user.
[0045] In accordance with another aspect of the present invention there is provided a smart target co-witnessing hit attribution method. A first sensor signal is received corresponding to detection of an impact of an object at or near a predetermined distance from a predetermined location relative to a first smart target location, a second sensor signal relating to the detected impact is received; the first sensor signal and the second sensor signal are determined to be received within a first predetermined period of time, and a target state update is provided via a network in accordance with at least one protocol.
[0046] From the first sensor signal, it is determined that the object comprises a projectile based on a trained model configured to recognize impacts of projectiles, and projectile impact data is generated, wherein the target state update includes the projectile impact data.
[0047] The method may further include wherein the target state update is received, projectile-firing device data from a projectile-firing device is received, the projectile-firing device data comprising an orientation and location of the projectile-firing device and a time at which the projectile-firing device was fired, and gaming metric data is updated to indicate a successful impact of the fired projectile on the first smart target location and attribute a successful hit to the projectile-firing device on the first smart target location when the time the impact occurred on or near the first smart target location and the time the projectile-firing device was fired occur within a second predetermined period of time.
[0048] The gaming metric data may be updated to indicate the successful impact of the fired projectile on the first smart target location, and the successful hit may be attributed to the projectile-firing device on the first smart target location, when the location of the projectile-firing device is within a predetermined range from the smart target and the orientation of the projectile-firing device is aligned at a predetermined area associated with the first smart target location.
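The range and orientation gate might be computed as in the following sketch, where the maximum range, the angular tolerance, and the planar geometry are illustrative assumptions rather than parameters from the disclosure.

```python
import math

MAX_RANGE_M = 30.0        # assumed predetermined range
AIM_TOLERANCE_DEG = 10.0  # assumed angular window around the target

def is_plausible_shot(blaster_xy, blaster_heading_deg, target_xy):
    """True when the blaster is in range and roughly aimed at the target."""
    dx = target_xy[0] - blaster_xy[0]
    dy = target_xy[1] - blaster_xy[1]
    distance = math.hypot(dx, dy)
    bearing = math.degrees(math.atan2(dy, dx)) % 360.0
    # Smallest signed angle between the blaster heading and the bearing.
    error = abs((bearing - blaster_heading_deg + 180.0) % 360.0 - 180.0)
    return distance <= MAX_RANGE_M and error <= AIM_TOLERANCE_DEG

# Blaster at the origin heading ~45 degrees toward a target 14 m away.
print(is_plausible_shot((0.0, 0.0), 44.0, (10.0, 10.0)))  # -> True
```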
[0049] In accordance with another aspect of the present invention there is provided an extended-reality projectile-firing gaming system. The system includes a projectile-firing device comprising a projectile repository, a battlefield object, a network configured to provide communication and control connectivity in accordance with at least one protocol to the projectile-firing device and the battlefield object, an extended-reality gaming application, and an extended-reality device. The extended-reality gaming application is configured to control the projectile-firing device and the battlefield object in accordance with a gaming session having a gameplay configuration for providing an extended-reality environment, receive projectile-firing device data from the projectile-firing device, the projectile-firing device data indicating that a projectile was fired, and receive battlefield object data from the battlefield object, the battlefield object data indicating that an impact occurred on the battlefield object. The extended-reality device includes at least one display and is configured to provide computer-aided vision of real and virtual objects and a computer-aided interface for interaction with the real and virtual objects, the computer-aided
interface configured to receive commands or instructions associated with the real and virtual objects and corresponding to the gameplay configuration.
[0050] The computer-aided interface may include voice-activation control configured to select one or more of the real and virtual objects and to provide the commands or instructions.
[0051] The computer-aided interface may include an eye sensor configured to monitor eye contact with the real and virtual objects.
[0052] The commands or instructions may be selectable using a control input when the eye contact is aligned with one of the real and virtual objects.
[0053] The eye sensor may identify a selection of the commands or instructions associated with the real and virtual objects when the eye contact is fixed on one of the real and virtual objects for at least a first predetermined time period.
[0054] The eye sensor may identify a selection of the commands or instructions associated with the real and virtual objects when the eye contact is fixed on one of the real and virtual objects for at least a first predetermined time period but does not exceed a second predetermined time period.
[0055] The control input may be voice-activated. The control input may include at least one sensor configured to monitor hand movements and the commands or instructions are selectable based on the monitored hand movements.
[0056] The control input may be a remote device. The remote device may be coupled to the projectile-firing device. The remote device may be configured to receive data via the network from at least one of the projectile-firing device and the battlefield object. The remote device may be configured to be affixed to the projectile-firing device and provide at least one of near-field communication, radio frequency, infrared, magnetometer, and global positioning satellite sensory data to the extended-reality gaming application.
[0057] The real and virtual objects may be displayed in response to the projectile-firing device data. The real and virtual objects may be displayed in response to the battlefield object data.
[0058] The gameplay configuration may include information corresponding to at least one of a game mode, a health meter, a shield meter, a projectile tally, a projectile repository tally, a rate of fire, a reload interval, a virtual power up, a success tally, a failure tally, a success to failure ratio, an alert, a battlefield role, a team affiliation, a battlefield objective, and a set of battlefield parameters.
[0059] The type of the projectile-firing device may be selectable through the computer-aided interface. The commands and instructions may include one or more of an amount of projectiles available for firing, a predetermined velocity of the projectiles for firing, a predetermined rate of fire of the projectiles, and a predetermined reload interval of the amount of projectiles.
[0060] The player may be selectable through the computer-aided interface and the extended-reality gaming application may facilitate communication with the player. The virtual objects may be sharable through the computer-aided interface with a player. The extended-reality device may be configured to display a prompt to provide at least one of the commands or instructions. The virtual objects may include a menu of selectable inputs corresponding to the commands or instructions. The menu of selectable inputs may be a drop-down menu.
[0061] The virtual objects may depict an application configured to be executed on the extended-reality projectile-firing gaming system.
[0062] The extended-reality gaming application may be further configured to update gaming metric data to indicate a successful impact of the fired projectile on the battlefield object and a successful hit by the projectile-firing device on the battlefield object when a time the
impact occurred on the battlefield object and a time the projectile was fired occur within a predetermined time period, wherein the predetermined time period is selected to confirm that the projectile-firing device fired the fired projectile that was detected as impacting the battlefield object, and wherein the updated gaming metric data is displayed on the extended-reality device and configured to modify the real and virtual objects in real or near real-time.
[0063] The extended-reality gaming application may be further configured to receive a location and orientation of the projectile-firing device, wherein the extended-reality gaming application records a successful impact of the fired projectile on the battlefield object and a successfully fired projectile by the projectile-firing device associated with the battlefield object when the location of the projectile-firing device is within a predetermined range from the battlefield object and the orientation of the projectile-firing device is aligned at a predetermined area associated with the battlefield object.
[0064] The battlefield object may include a smart target comprising a target location, a piezoelectric sensor configured to detect an impact of an object on or within a predetermined distance from the target location and generate a piezoelectric sensor signal corresponding to the detected impact, an infrared sensor configured to detect infrared radiation at a predetermined wavelength and generate an infrared sensor signal, and control circuitry configured to confirm a successful projectile impact on the target location when the object is determined to be a projectile by comparing the piezoelectric sensor signal with a piezoelectric impact signal model trained to detect impact of the projectile and when an impact time of the object and a detection time of the infrared radiation each occurred within a predetermined period of time, generate target location state data indicating that the target location was successfully impacted by the projectile, and transmit the target location state data via the network in accordance with at least one protocol, wherein the extended-
reality gaming application is configured to modify the real and virtual objects in real or near real-time in response to the target location state data and display the modified real and virtual objects using the extended-reality device.
[0065] In accordance with another aspect of the present invention, there is provided an extended-reality projectile-firing gaming system that includes a projectile-firing device comprising a projectile repository, a battlefield object, a network configured to provide communication and control connectivity in accordance with at least one protocol to the projectile-firing device and the battlefield object, an extended-reality gaming application configured to control the projectile-firing device and the battlefield object in accordance with a gaming session having a gameplay configuration for providing an extended-reality environment, receive projectile-firing device data from the projectile-firing device, the projectile-firing device data indicating that a projectile was fired, and receive battlefield object data from the battlefield object, the battlefield object data indicating that an impact occurred on the battlefield object, and a computer-aided interface for interaction with gameplay control options associated with the gameplay configuration and continually updated in real or near real-time during the gaming session, the computer-aided interface configured to receive commands or instructions associated with the real and virtual objects and corresponding to the gameplay configuration. The commands or instructions may be received using voice activation.
BRIEF DESCRIPTION OF THE DRAWINGS
[0066] The invention may be more readily understood by referring to the accompanying drawings in which:
[0067] FIG. 1 is a diagram of an extended-reality projectile-firing gaming system in accordance with a preferred embodiment of the present invention;
[0068] FIG. 2 is a diagram of a projectile-firing device in accordance with a preferred embodiment of the present invention;
[0069] FIG. 3A is a perspective view of a standalone target in accordance with a preferred embodiment of the present invention;
[0070] FIG. 3B is a perspective view of a monolithic target in accordance with a preferred embodiment of the present invention;
[0071] FIG. 4 is a perspective view of a battlefield environment depicting exemplary components of the extended-reality projectile-firing system in accordance with a preferred embodiment of the present invention;
[0072] FIG. 5 is a perspective view of a mixed-reality battlefield environment depicting exemplary components of the extended-reality projectile-firing system in accordance with a preferred embodiment of the present invention;
[0073] FIG. 6 is a top-down view of a battlefield environment depicting exemplary components of the extended-reality projectile-firing system in accordance with a preferred embodiment of the present invention;
[0074] FIG. 7 is a flow diagram of an extended-reality projectile-firing gaming method in accordance with a preferred embodiment of the present invention;
[0075] FIG. 8 is a flow diagram of an extended-reality projectile-firing gaming method in accordance with a preferred embodiment of the present invention;
[0076] FIG. 9 is a perspective view of one or more smart targets in accordance with a preferred embodiment of the present invention;
[0077] FIG. 10 is a perspective view of a wearable device including smart targets in accordance with a preferred embodiment of the present invention;
[0078] FIG. 11 is a view of a smart blaster in accordance with a preferred embodiment of the present invention;
[0079] FIG. 12 is a schematic diagram of a circuit for a smart target co-witnessing hit attribution system in accordance with a preferred embodiment of the present invention;
[0080] FIG. 13-1 depicts a first portion of a schematic diagram of a circuit for a smart target co-witnessing hit attribution system in accordance with a preferred embodiment of the present invention;
[0081] FIG. 13-2 depicts a second portion of a schematic diagram of a circuit for a smart target co-witnessing hit attribution system in accordance with a preferred embodiment of the present invention;
[0082] FIG. 13-3 depicts a third portion of a schematic diagram of a circuit for a smart target co-witnessing hit attribution system in accordance with a preferred embodiment of the present invention;
[0083] FIG. 14A is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention;
[0084] FIG. 14B is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention;
[0085] FIG. 15A is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention;
[0086] FIG. 15B is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention;
[0087] FIG. 15C is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention;
[0088] FIG. 16A is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention;
[0089] FIG. 16B is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention;
[0090] FIG. 16C is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention;
[0091] FIG. 17A is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention;
[0092] FIG. 17B is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention;
[0093] FIG. 17C is a diagram of an exemplary smart target mesh network in accordance with a preferred embodiment of the present invention;
[0094] FIG. 18 is a table of a message protocol in accordance with a preferred embodiment of the present invention;
[0095] FIG. 19 is a table of smart target states in accordance with a preferred embodiment of the present invention;
[0096] FIG. 20 is a table of a message protocol in accordance with a preferred embodiment of the present invention;
[0097] FIG. 21A is a table of smart target states in accordance with a preferred embodiment of the present invention;
[0098] FIG. 21B is a table of smart target modes in accordance with a preferred embodiment of the present invention;
[0099] FIG. 22A is a first portion of a flow diagram of a smart target co-witnessing hit attribution method in accordance with a preferred embodiment of the present invention;
[00100] FIG. 22B is a second portion of a flow diagram of a smart target co-witnessing hit attribution method in accordance with a preferred embodiment of the present invention; and
[00101] FIG. 23 is a view of an extended-reality device in accordance with a preferred embodiment of the present invention.
[00102] Like numerals refer to like parts throughout the several views of the drawings.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[00103] The following description and drawings are illustrative and are not to be construed as limiting. Numerous specific details are described to provide a thorough understanding of the disclosure. However, in certain instances, well-known or conventional details are not described in order to avoid obscuring the description. References to one embodiment or an embodiment in the present disclosure can be, but are not necessarily, references to the same embodiment; such references mean at least one of the embodiments. If a component is not shown in a drawing, then this provides support for a negative limitation in the claims stating that that component is "not" present. However, the above statement is not limiting and, in another embodiment, the missing component can be included in a claimed embodiment.
[00104] Reference in this specification to "one embodiment," "an embodiment," "a preferred embodiment" or any other phrase mentioning the word "embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one embodiment of the disclosure and also means that any particular feature, structure, or characteristic described in connection with one embodiment can be included in any embodiment or can be omitted or excluded from any embodiment. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. Moreover, various features are described which may be exhibited by some embodiments and not by others and may be omitted from any embodiment. Furthermore, any particular feature, structure, or characteristic described herein may be optional. Similarly, various requirements are described which may be requirements for some embodiments but not other embodiments.
Where appropriate any of the features discussed herein in relation to one aspect or embodiment of the invention may be applied to another aspect or embodiment of the invention. Similarly, where appropriate any of the features discussed herein in relation to one aspect or embodiment of the invention may be optional with respect to and/or omitted from that aspect or embodiment of the invention or any other aspect or embodiment of the invention discussed or disclosed herein.
[00105] The terms used in this specification generally have their ordinary meanings in the art, within the context of the disclosure, and in the specific context where each term is used. Certain terms that are used to describe the disclosure are discussed below, or elsewhere in the specification, to provide additional guidance to the practitioner regarding the description of the disclosure. For convenience, certain terms may be highlighted, for example using italics and/or quotation marks. The use of highlighting has no influence on the scope and meaning of a term; the scope and meaning of a term is the same, in the same context, whether or not it is highlighted.
[00106] It will be appreciated that the same thing can be said in more than one way. Consequently, alternative language and synonyms may be used for any one or more of the terms discussed herein. No special significance is to be placed upon whether or not a term is elaborated or discussed herein. Synonyms for certain terms are provided. A recital of one or more synonyms does not exclude the use of other synonyms. The use of examples anywhere in this specification including examples of any terms discussed herein is illustrative only, and is not intended to further limit the scope and meaning of the disclosure or of any exemplified term. Likewise, the disclosure is not limited to various embodiments given in this specification.
[00107] Without intent to further limit the scope of the disclosure, examples of instruments, apparatus, methods and their related results according to the embodiments of the present
disclosure are given below. Note that titles or subtitles may be used in the examples for convenience of a reader, which in no way should limit the scope of the disclosure. Unless otherwise defined, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure pertains. In the case of conflict, the present document, including definitions, will control.
[00108] It will be appreciated that terms such as "front," "back," "top," "bottom," "side," "short," "long," "up," "down," "aft," "forward," "inboard," "outboard" and "below" used herein are merely for ease of description and refer to the orientation of the components as shown in the figures. It should be understood that any orientation of the components described herein is within the scope of the present invention.
[00109] Referring now to the drawings, which are for purposes of illustrating the present invention and not for purposes of limiting the same, the drawings show devices and components (and related methods) therein in accordance with preferred embodiments of an extended-reality projectile-firing gaming system and method. As shown in FIGS. 1-6, the extended-reality projectile-firing gaming system generally includes a system of wirelessly interconnected blasters, targets, wearables, and “battlefield” objects that allow players to engage in precisely tracked and managed projectile sports and games, resulting in an experience of “Video Games Outside”. FIGS. 7-8 show extended-reality projectile-firing gaming methods, further illustrating preferred methods for attributing scoring to particular players or users of the extended-reality projectile-firing gaming system. As shown in FIGS. 9-23, a smart target co-witnessing hit attribution system and method are shown, illustrating particular embodiments of accurately determining “hits” and correctly attributing those hits to a particular projectile-firing device.
[00110] Referring now to FIG. 1, FIG. 1 is a diagram of an extended-reality projectile-firing gaming system 100 in accordance with a preferred embodiment of the present invention.
FIG. 1 depicts two users utilizing a network 102 in connection with the system 100. The system 100 includes a projectile-firing device 104, an extended-reality device 105, a wearable device 106, a standalone target 108, a drone 109, a cloud-based computing system 110, a remote device 112, and a fitness monitoring device 114. The network protocol may include Bluetooth, WiFi, radio frequency (RF), or other suitable network protocol.
[00111] The network 102 is configured to utilize one or more network protocols to facilitate communication by and between the various components of the system 100. For example, as disclosed herein, the network 102 may consist of any network type, including but not limited to a local area network (LAN), wide area network (WAN), and/or the internet. One of ordinary skill in the art would understand that a variety of networks and/or network protocols may be implemented to provide connectivity between the various components of the system 100.
[00112] Referring to FIG. 1, two projectile-firing devices 104 are depicted. The first projectile-firing device 104 is associated with a first user 116 and the second projectile-firing device 104 is associated with a second user 118. One of ordinary skill in the art would understand that more than two devices 104 may be utilized and that the system 100 is scalable to include more than two users 116, 118. FIG. 2 is a diagram of a projectile-firing device 104 in accordance with a preferred embodiment of the present invention. The projectile-firing device 104, in a preferred embodiment, includes a housing 120, a handle 122, a trigger 124, a battery pack 126 (internal to the handle 122 in FIG. 2), a muzzle 128, and a hopper 130. The housing 120 preferably is configured to receive the handle 122, the trigger 124, the battery pack 126, the muzzle 128, and the hopper 130. The housing 120 is also preferably configured to embody electromechanical apparatus to receive commands and/or instructions via the network 102, feed projectiles from the hopper 130 to be fired by the
projectile-firing device 104, and mechanically fire one or more projectiles from the muzzle
128. The electromechanical apparatus may include a gearbox configured to load and fire gellets, small biodegradable projectiles configured to be fired from the muzzle 128. The handle 122 is configured to receive the trigger 124 and is used by the user 116, 118 to depress the trigger 124 to fire the projectiles from the muzzle 128. When the trigger 124 is depressed by the user 116, 118, the device 104 may be configured to fire a single projectile, or multiple projectiles (e.g., automatic fire mode). In an embodiment, the device 104 includes a switch 133 to change the fire mode from single fire to automatic fire modes. Alternatively, or in combination with the switch 133, the device 104 includes a display 154 (shown in FIG. 4) that is configured to change the fire mode from single fire to automatic fire modes. In an embodiment, the fire mode is configured to fire three projectiles upon the trigger being depressed once by the user 116, 118. The fire mode may also be controlled by the application via the network 102 to vary the fire mode depending upon the gaming configuration in accordance with a gaming session.
[00113] The muzzle 128 is configured to permit the projectile to exit the projectile-firing device 104. The projectile-firing device, in a preferred embodiment, is configured to vary the velocity of a projectile from 90 feet-per-second (FPS) to 170 FPS. In other embodiments, the velocity is configured to reach 200 FPS or greater. For example, the projectile-firing device receives commands and/or instructions from the application via the network 102 to electromechanically adjust the velocity at which the projectile-firing device 104 fires a projectile. The velocity may be adjusted electromechanically through a variety of methodologies, all within the scope of the present invention.
[00114] In an embodiment, the electromechanical functionality of the projectile-firing device 104 includes a barrel 129 (shown internally with dashed lines) to be inserted into the muzzle 128 and/or muzzle area. The barrel 129 may be configured to increase or decrease a
velocity at which the projectile-firing device 104 fires a projectile. For example, a first barrel 129 is configured to achieve a muzzle velocity of 170 FPS, while another barrel 129 is configured to achieve a muzzle velocity of 90 FPS. Alternatively, or in combination with the barrel 129, a barrel tip 131 is preferably configured to be affixed to the muzzle 128 and/or muzzle area. The barrel tip 131 also may be configured to achieve muzzle velocities within the velocity ranges described herein. In yet another embodiment, the barrel tip 131 may be configured to reduce the velocity from a predetermined "default" range; thus, for example, when the barrel tip 131 is not affixed to the muzzle 128, the velocity is 170 FPS, while when the barrel tip 131 is affixed to the muzzle 128, the velocity is 90 FPS. The barrel tip 131 may be configured to allow beginners or those desiring a less aggressive gameplay experience to lower the velocity. Still further, in any of the aforementioned embodiments, the mechanical components of the muzzle 128, the barrel 129, and the barrel tip 131 may be combined with electronically adjusted velocity increments so as to achieve FPS velocities greater or lower than the predetermined velocities of the mechanical components. Still further, the projectile-firing device 104 may include physically adjustable areas alongside the barrel 129 to open or close to vary the velocity mechanically. In an embodiment, the velocity is decreased when the physically adjustable areas are opened further, while the velocity is increased when the physically adjustable areas are closed further. The opening and closing of the physically adjustable areas may be controlled by the application via the network and/or internal electronic components of the projectile-firing device 104.
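As a hedged summary of the velocity options above, the following sketch resolves a muzzle velocity from an assumed barrel choice, barrel tip, and electronic trim; the combination rules are assumptions consistent with the ranges described, not a specification of the actual electromechanics.

```python
# Assumed default velocities per barrel, consistent with the 90-170 FPS range.
BASE_FPS = {"standard_barrel": 170, "low_velocity_barrel": 90}

def muzzle_velocity(barrel, tip_attached, electronic_trim_fps=0):
    """The barrel sets the default; an attached barrel tip caps velocity at
    90 FPS; an electronic trim then nudges the result within 90-200 FPS."""
    fps = BASE_FPS[barrel]
    if tip_attached:
        fps = min(fps, 90)
    return max(90, min(200, fps + electronic_trim_fps))

print(muzzle_velocity("standard_barrel", tip_attached=False))              # -> 170
print(muzzle_velocity("standard_barrel", tip_attached=True))               # -> 90
print(muzzle_velocity("standard_barrel", False, electronic_trim_fps=30))   # -> 200
```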
[00115] Other methods of lengthening or shortening ranges mechanically are within the scope of the present invention. In an embodiment, electronic adjustment of blaster spring tension may be utilized, meaning that certain weapon types can be made to fire physically longer
or shorter distances as determined by the system, allowing for “range” as an attribute; e.g., Sniper vs Revolver (e.g., Table 1).
[00116] The device 104 preferably includes electronic circuitry, or the like, to provide relatively accurate position and direction of the device 104. This data preferably is utilized to provide confirmation, in connection with the application, of a "hit" by a particular user 116, 118 on another user's 116, 118 wearable device 106 or a particular standalone target 108/monolithic target 148 or other target/battlefield object. The device 104 may also include an IR reader so that the system 100 in connection with the application can sense that a particular device 104 has reflected IR radiation on a wearable device 106 or target for further confirmation of a "hit."
[00117] Referring still to FIG. 2, the battery pack 126 is configured to provide power to the device 104. For example, the device 104 includes electromechanical apparatus such as internal electronic circuitry configured to be powered by the battery pack 126. The battery pack 126 is also preferably configured to operate mechanical firing apparatus of the device 104. The battery pack 126 preferably is a rechargeable battery pack 126 configured to receive a USB-C input to recharge the battery pack 126. In other embodiments, the battery pack 126 may be recharged by a variety of recharge methods, all within the scope of the present invention, including a mini-USB input, or the like. The device 104 preferably includes a recharge port 127 configured to receive a recharge input.
[00118] The hopper 130 preferably is configured to hold a "depot" of projectiles to be fed to the electromechanical apparatus within the device 104 for firing. In various embodiments, the hopper is configured to hold 100, 200, 300, 400, 500, 600, 700, 800, 1000, or greater or fewer projectiles. The gaming configuration in accordance with a gaming session may limit the number of projectiles held by the hopper 130. The hopper 130 is configured to gravity feed projectiles into the device 104.
[00119] In a preferred embodiment, the projectile-firing device 104 is configured to uniquely receive instructions and/or commands from the application via the network 102 to increase or decrease electromechanical functionality of the projectile-firing device 104 when receiving "damage," receiving points or "health," or during a variety of other scenarios, preferably in connection with a gaming configuration. The physical consequences of this capability impact the first user's 116 ability to inflict "damage" on the second user 118. As a result, the rate of fire may be increased or decreased, a reload interval may be increased or decreased, a velocity of the device 104 may be increased or decreased, and an amount of "damage" inflicted can be varied depending on the commands and/or instructions, including the type and configuration of the gaming profile (e.g., gaming configuration) that is configured to be set during a gaming session. A projectile-firing device 104, for example, may be set in a configuration as a sniper rifle, a pistol, or a shotgun. Each of these configurations, in a preferred embodiment, is preset with specified rates of fire, reload intervals, damage capabilities, range of fire, or the like. The projectile-firing device 104 set in sniper rifle mode, for example, would have a slower rate of fire, a more frequent reload interval, and a higher damage capability than the projectile-firing device set in pistol mode, for example. The projectile-firing device 104 is configured to be electromechanically controlled by the application via the network 102 such that any of the above configurable options may be varied in real- or near-real-time.
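The preset configurations might be represented as data pushed to the blaster over the network, as in the following sketch; the numeric values are placeholders that merely preserve the qualitative ordering described above (sniper: slow fire, frequent reload, high damage), not figures from the disclosure.

```python
from dataclasses import dataclass

@dataclass
class BlasterProfile:
    name: str
    rounds_per_second: float
    reload_interval_s: float
    damage_per_hit: int

# Placeholder presets ordered as described: sniper slow/heavy, pistol fast/light.
PROFILES = {
    "sniper": BlasterProfile("sniper", rounds_per_second=0.5,
                             reload_interval_s=5.0, damage_per_hit=20),
    "pistol": BlasterProfile("pistol", rounds_per_second=3.0,
                             reload_interval_s=2.0, damage_per_hit=5),
    "shotgun": BlasterProfile("shotgun", rounds_per_second=1.0,
                              reload_interval_s=3.0, damage_per_hit=12),
}

def apply_profile(device_state, profile_name):
    """The application would push these parameters to the blaster's
    electromechanical controller over the network."""
    device_state.update(vars(PROFILES[profile_name]))

blaster = {}
apply_profile(blaster, "sniper")
print(blaster["damage_per_hit"])  # -> 20
```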
[00120] The system 100, in accordance with the application gameplay configuration, will allow or disallow the refill or usage of projectiles according to gameplay modes, where a "reload" can be physically controlled by an electromechanical assembly connected via the network 102 and given instructions from the application. A projectile dispenser may be based on predetermined parameters within the gameplay and may embody or be part of the hopper 130. For example, a game may be set where the users 116, 118 (e.g., players) can
only refill a certain number of times during the length of the game and after they have reached the limit the system prevents the player from reloading (e.g., locking the canister, stopping the blaster from working, etc.). The right to refill may be earned and the blaster may begin working again. In an embodiment, a home base/capture point station 172 may be utilized that permits the users 116, 118 to refill projectiles literally (e.g., within the existing hopper 130), as well as virtually per the game mode (e.g., gaming configuration).
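A minimal sketch of such a reload quota, assuming a per-player counter and an earn-back rule, is shown below; the locking action itself (locking the canister, stopping the blaster from working) would be a hardware response triggered by the returned flag.

```python
class ReloadController:
    def __init__(self, max_refills):
        self.max_refills = max_refills  # quota set by the gameplay configuration
        self.used = 0

    def request_refill(self):
        """Allow the refill and count it, or refuse once the quota is spent
        (at which point the blaster would be locked out)."""
        if self.used >= self.max_refills:
            return False
        self.used += 1
        return True

    def earn_refill(self):
        """An in-game objective can restore the right to refill."""
        self.used = max(0, self.used - 1)

ctrl = ReloadController(max_refills=2)
print(ctrl.request_refill(), ctrl.request_refill(), ctrl.request_refill())
# -> True True False
```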
[00121] The projectile-firing device 104 is configured to monitor and track the time when a projectile is fired from the device 104, and to provide this information to the application. As described herein, the system 100 is configured to receive the time the projectile was fired and correlate the time with the time a wearable device 106 was impacted by a projectile to determine that the user 116, 118 associated with the wearable device was hit by the user 116, 118 associated with a particular projectile-firing device 104. This methodology achieves benefits above and beyond traditional projectile-firing game systems because it can monitor and track who is responsible for a hit, allowing for more interactive gameplay and immersion.
[00122] The device 104 preferably includes a display configured to provide useful game information, such as hits, health, ammo, blaster “type”, team alignment, remaining players, etc. The device 104 preferably includes electronic circuitry and a network interface to receive data from the network 102 in accordance with the application and the gaming configuration of that particular gaming session.
[00123] Referring again to FIG. 1, two extended-reality devices 105 are depicted, separately worn by and associated with the users 116, 118. The extended-reality devices 105 are configured to communicate via the network 102 with the application. The extended-reality devices
105 may be augmented reality (AR), mixed-reality (MR), virtual reality (VR), or extended- reality (XR) devices configured to receive and display data via the network 102 to the
users 116, 118. The extended-reality device 105 is configured to enhance game modes and game attributes by mixed and/or augmented reality. Any suitable head-mounted display, including those of Meta Quest, Nreal, or Apple Vision Pro, now and in the future, may be utilized to display relevant game information in heads-up-display (HUD) style, allowing the users 116, 118 to track gameplay configurations, statistics, etc., in a video-game-like manner. For example, the extended-reality device 105 includes a HUD display 142 configured to display current health, ammunition (i.e., projectiles), weapon/blaster/projectile-firing device 104 type, shields, and the like. The HUD display 142 may also display "kills," "deaths," and a "kill/death" and/or success/failure ratio, including a ranking of users taking part in the gaming session. In a gaming session where certain of the users 116, 118 have been eliminated from the gaming session in accordance with the gameplay configuration, the remaining team members and opponents are displayed. In an embodiment, alerts, game start/end, timers, and special game states are displayed. The accuracy of the users 116, 118 may also be displayed as a shots/hits ratio. The HUD display 142 may also include display of the current objectives, game modes, progress, capture/flag/base status/health in accordance with the gameplay configuration. In certain embodiments, the gameplay configuration may utilize battlefield roles (e.g., medic, engineer, tank, sniper, etc.). These roles may include specialized perks and restrictions that may be displayed to the users 116, 118 via the HUD display 142.
[00124] Each of these displayable items may also utilize real-time or near-real-time virtual overlays on the battlefield environment displayed through the HUD display 142. For example, health and shield bars, team affiliation, badges, status, role, username, etc. can appear over the users' 116, 118 heads. Battlefield objectives can be highlighted in the distance, and capture or base areas can be defined and shown virtually, object pickups can be described and highlighted, weapon skins can be virtualized and overlaid on the device 104, and team
colors, player skins, and other aesthetics can be mapped to human users 116, 118 in real-time or near-real-time. The HUD display 142 may also display zones of special status, such as neutral or safe, which can be virtually displayed on the battlefield environment, and hits, shield depletion, firing, etc. can be displayed as real-time or near-real-time virtual animations. Battlefield objects, skins, the environment, etc. can be themed to match game mode, user preference, and/or brand tie-ins, such as the Marvel Universe, WWII, or Sci-Fi themes. The application can restrict or unlock certain themes, nomenclatures, modes, etc., based upon player progress, preference, or parental controls. For example, realistic weapon terminology, violent descriptions such as "killed," or "bloody" animation effects can be restricted to older players or turned off entirely.
[00125] QR code/motion-capture-type patterns can be printed on blasters, targets, and battlefield objects to aid the extended-reality device 105 in overlaying virtualized animations/skins accurately utilizing the HUD display 142.
[00126] In accordance with various embodiments, the extended-reality projectile-firing gaming system 100 utilizes the extended-reality devices 105 to provide a selection interface configured to work within the system 100 ecosystem and control varied options within the game. In an embodiment, any of the selectable options described herein may be implemented through a selection interface viewable and usable through the extended- reality devices 105. For example, the real and/or virtual objects described herein are associated with commands and instructions that are configured to be selected by the users 116, 118 using a computer-aided interface.
[00127] The commands and instructions preferably include choosing the type of projectile-firing device 104 (e.g., type of blaster 104, firing mode, etc.), choosing with whom to communicate during a gaming session, choosing map pullup and options, and the like. For example, controlling the blaster 104 mode may include toggling or selecting a single/multi-
shot mode. In a preferred embodiment, the electromechanical apparatus may be controlled via the commands and instructions to fire in accordance with type/mode of firing, to fire faster and/or harder, or the like. A single-fire revolver example may include a single-shot mode of firing, a relatively slower/lighter projectile speed, and a specified number of reloads (e.g., six). A multi-shot rifle, on the other hand, may include an automatic/burst mode of firing, and may speed up the projectile speed with a higher number of reloads. These different blaster 104 types are configured to be selectable using the computer-aided interface in association with the extended-reality devices 105. In addition, the commands and instructions preferably include controlling rate of fire, such as 8 projectiles per second versus 12 projectiles per second.
[00128] Besides blaster 104 control, other controllable options include other in-game implement/tool selection, ammunition selection, map review and expansion, equipping of different accessories, and communication. In an embodiment, the controllable options may include a power-up menu selection, such as potions or other items found in the extended-reality environment world. For example, when the user's 116 health is low, a virtual potion may be selected and used to increase the user's hit points. Other examples are invincibility shields, increased speed, or the like. Map review and expansion preferably uses the location of other headsets (e.g., extended-reality devices 105), blasters 104, or other real objects for which location services are available (e.g., remote device 112) for locations of other players within the map. The different accessories may include lights, tracer rounds, and night vision, among other like accessories. For example, these accessories may be real or virtual items, or may simulate real-world items. Night vision goggles, for example, may be simulated by providing a green "glow" through the extended-reality device 105. A light, on the other hand, may be a real-world object that illuminates the gaming environment, or may be an effect that lightens the extended-reality environment through the device 105.
[00129] The controllable options also include communication among teammates or other users 116, 118. For example, the user 116 sees a dropdown menu of teammates through the extended-reality device 105 and selects the one (or ones) that the user 116 would like to speak with. The computer-aided interface facilitates selection of communication options, including selection of a particular user with whom to communicate. For example, eye contact with a map icon depicting a user 116 for a long enough predetermined time period may be sufficient to open up communications with that user 116.
[00130] In an embodiment, the extended-reality devices 105 include the computer-aided interface, but in other embodiments, the interface is separate from the extended-reality devices 105. For example, the extended-reality devices 105 include one or more eye sensors configured to monitor eye contact of the users 116, 118. The eye sensors may be accompanied by an algorithm designed to monitor how long the users' 116, 118 eyes are maintaining eye contact with a particular real or virtual object displayed by the extended-reality devices 105. In an embodiment, the algorithm compares the time of eye contact with a predetermined time period (e.g., 1 or 2 seconds) to confirm a selection of the particular real or virtual object (e.g., a command or instruction associated with the real or virtual object). The algorithm may include a range such that eye contact greater than a first predetermined time period and less than a second predetermined time period confirms a selection. However, the algorithm may simply monitor eye contact without requiring the eye contact to be maintained within a certain time period. For example, if the user 116, 118 makes eye contact with a real or virtual object, the user 116, 118 may activate a command or instruction associated with the real or virtual object by using voice activation and/or selection by hand movement or remote device, or any combination thereof. For example, the user 116 makes eye contact with a "Revolver" blaster choice and says "Select" to choose the Revolver as the blaster 104 type. Or, the user 116 makes eye contact with a
"Revolver" blaster choice and depresses a button on the blaster 104 to choose the Revolver as the blaster 104 type. For example, the button on the blaster 104 may be selectable via the remote device 112 that is affixed to the blaster 104 or may be separately provided from the blaster 104. In other embodiments, the computer-aided interface may consist of separately implemented eye sensors, hand movements, physical controllers, remote devices 112, or combinations thereof. For example, a trackpad, ball mouse, or similar physical system may be utilized to control a menu within a game. The physical controller can consist merely of up/down/left/right buttons on the blaster itself, or may be a separate standalone controller. The hand movement may be recognizable by use of one or more sensors using any of the sensor types described herein, such as LIDAR, cameras, or the like.
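The dwell-time selection rule described above reduces to a simple interval test, sketched below with illustrative thresholds standing in for the first and second predetermined time periods.

```python
MIN_DWELL_S = 1.0  # assumed first predetermined time period
MAX_DWELL_S = 3.0  # assumed second predetermined time period

def gaze_selects(fixation_start_s, fixation_end_s):
    """Confirm a selection only when the fixation lasts at least the first
    period but does not exceed the second period."""
    dwell = fixation_end_s - fixation_start_s
    return MIN_DWELL_S <= dwell <= MAX_DWELL_S

print(gaze_selects(5.0, 6.5))  # -> True  (1.5 s fixation)
print(gaze_selects(5.0, 5.3))  # -> False (glance too short)
print(gaze_selects(5.0, 9.0))  # -> False (stare too long to count as intent)
```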
[00131] In an embodiment, the blaster 104 control via the computer-aided interface may be integrated with hit detection/attribution methods as disclosed herein. For example, the methods disclosed herein may output updated gaming metric data and/or target location state data. In response to that data, the real and virtual objects may be modified to reflect real or near real-time events. The modified real and virtual objects may then provide updated commands or instructions available to be selected by the users 116, 118. For example, a user 116 may confirm a hit using the methods provided herein that, when updated via the updated gaming metric data and/or target location state data, triggers an in-game response, such as providing power-ups or additional in-game capabilities. The user 116 is then able to select the in-game virtual (or real) item as it appears or is prompted to be utilized according to the computer-aided interface of the extended-reality device 105.
[00132] In an embodiment, the extended-reality device 105 utilizes the computer-aided interface to prompt the users 116, 118 to select or input a command or instruction.
[00133] The computer-aided interface of the extended-reality device 105 is designed to minimize or eliminate negative impacts on gameplay or the users’ 116, 118 experience and provide an immersive gameplay environment.
[00134] In an embodiment, the extended-reality gaming system 100 utilizes audio to provide the computer-aided interface control options as described herein. In this embodiment, the system 100 does not rely on display of the real and virtual objects through the extended-reality device 105 to provide selectable options, and the device 105 may not be needed at all. In this embodiment, the computer-aided interface utilizes verbal commands. For example, the system 100 includes microphones and earbuds or headphones (or relies upon external speakers and microphones affixed to the blaster 104, wearable vest 106, and/or remote device 112) to provide the computer-aided interface for commands and instructions. For example, the user 116 states "Computer: identify blaster inventory" into a microphone, and the extended-reality gaming application responds through the speaker "Revolver, sniper rifle." The user 116 may then state into the microphone "Computer: Select revolver," and the blaster 104 becomes the revolver in that gameplay session. As described herein, the various audio options may be integrated with the extended-reality device 105 such that all or portions of the audio control options may be utilized alongside or in connection with the display of the real and virtual objects and associated commands and instructions.
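The spoken-command flow might be routed as in the following toy sketch, in which the "Computer:" grammar and the two-item inventory are assumptions taken from the example above, not a defined command set.

```python
INVENTORY = ["revolver", "sniper rifle"]  # assumed session inventory

def handle_utterance(utterance, session):
    """Strip the assumed 'Computer:' prefix and dispatch the command."""
    command = utterance.lower().removeprefix("computer:").strip()
    if command == "identify blaster inventory":
        return ", ".join(INVENTORY)
    if command.startswith("select "):
        choice = command.removeprefix("select ").strip()
        if choice in INVENTORY:
            session["blaster_type"] = choice  # pushed to the blaster in practice
            return f"{choice} selected"
    return "command not recognized"

session = {}
print(handle_utterance("Computer: identify blaster inventory", session))
print(handle_utterance("Computer: Select revolver", session))
print(session)  # -> {'blaster_type': 'revolver'}
```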
[00135] Referring now to FIG. 23, there is depicted a blaster 104 as displayed in an extended-reality device 105. For example, the device 105 is a pair of MR goggles. The MR goggles display both real and virtual objects. For example, the floor tiles 580, the projectile 164, and the barrier 178 are all real objects. However, the ceiling tiles 582 are virtual objects. In addition, there is depicted in FIG. 23 a blaster choice dropdown menu
584 and a communication dropdown menu 586. These are also virtual objects within the extended-reality environment. The computer-aided interface facilitates interaction with any
of the real or virtual objects, including those depicted in FIG. 23. One of ordinary skill in the art would understand that the real or virtual objects depicted in FIG. 23 are non-limiting. In practice, in an embodiment, the user 116 makes eye contact with "Blaster Choice", which drops down three options: Revolver, Shot Gun, and Sniper Rifle. The user 116 then selects one by making eye contact with Revolver, for example, and using a selection method, as disclosed herein, to select Revolver. The computer-aided interface receives the command/instruction to select Revolver and provides it to the application, which is then used to control the gameplay characteristics. The selection of Revolver preferably selects limitations and requirements that are communicated to the blaster 104 or the application to appropriately limit the gameplay in connection with the selection criteria. One of ordinary skill in the art would understand that a variety of methodologies of selecting, processing, and implementing the selections are within the scope of the present invention.
[00136] Referring again to FIG. 1, two wearable devices 106 are depicted. One of ordinary skill in the art would understand that while the wearable devices 106 are depicted as wearable vests, other wearables or body adornments affixed to other parts of the body may be utilized without departing from the scope of the present invention. The wearable devices 106 can have lighting profiles, sounds, and other controlled actions to integrate with gameplay. The first wearable device 106 is associated with the first user 116 and the second wearable device is associated with the second user 118. The wearable device 106 includes a torso portion 132 and a shoulder portion 134. As shown in FIG. 1, the torso portion 132 and the shoulder portions 134 are attached via straps 136. The torso portion 132 and the shoulder portions 134 of the wearable device 106 include wearable device targets 137 configured to detect impacts of projectiles on the wearable device 106. The wearable device targets 137 are in communication with one or more of the device 104, the
standalone target 108, the drone 109, the cloud-based computing system 110, the remote device 112, and the fitness monitoring device 114. Each of these components is configured to communicate with one or more of the components of the system 100 via the network
102.
[00137] Upon detection of a "hit," the wearable device 106 is configured to send a message via the network 102 to the application indicating that the user 116, 118 associated with the wearable device 106 has been hit by a projectile. As mentioned herein, when the application receives the message that the wearable device 106 has been hit, the application correlates the time the projectile impacted the wearable device 106 with a time a projectile was fired by a projectile-firing device 104. The user 116, 118 associated with the projectile-firing device 104 that was correlated with the impact on the wearable device 106 is then credited with a hit against the user 116, 118 associated with the wearable device 106.
[00138] The standalone target 108 is depicted as a "flower" arrangement in a preferred embodiment. The standalone target 108 includes a central portion 138 (e.g., "pistil" of flower) and a plurality of outer portions 140 (e.g., "petals" of flower). As shown in FIG. 1, the standalone target 108 includes six outer portions 140, but in other embodiments, the number of outer portions 140 may be greater or fewer than six. Each of the central portion 138 and the outer portions 140 is configured to separately determine whether a projectile has impacted that portion of the standalone target 108. For example, the same target impact methodology is utilized for the wearable device targets 137 and the central and outer portions 138, 140 of the standalone target 108. In accordance with the gameplay configuration, "gamifying" the standalone target 108 may include a requirement to separately "hit" each of the portions 138, 140 in order to credit a gameplay objective. In other embodiments, one or more of the central portion 138 and outer portions 140 are
required to be hit in accordance with the gameplay configuration. The standalone target 108 permits a variety of gameplay options and/or objectives, all within the scope of the present invention.
[00139] The standalone target 108 is preferably in communication with the application via the network 102 so that real-time or near-real-time data may be provided to the application, and vice versa. For example, upon a hit of the central portion 138 of the standalone target 108, the application may credit a "hit" using similar methodology as that used to credit a "hit" on the wearable device 106. The standalone target 108 may also be configured to include a physical indication (e.g., a red lighted portion) to indicate a hit upon a certain portion 138, 140 of the standalone target 108. In addition, or alternatively, the application, upon receiving data from the network 102, may be configured to provide a virtual overlay on the target displayed on the display of the extended-reality device 105.
[00140] Referring now to FIGS. 3A and 3B, the standalone target 108 in FIG. 3A is shown in the same configuration as that shown in FIG. 1. The standalone target 108 includes support members 144 to support the central portion 138 and the outer portions 140. The standalone target 108 also includes a stand 146 configured to support the standalone target 108 in an upright position. As shown in FIG. 3B, a monolithic target 148 is shown, which is an alternative configuration of the standalone target 108. The monolithic target 148 includes a similar arrangement including a central portion 138 and outer portions 140, each configured to determine whether a projectile has impacted that particular portion of the monolithic target 148. The monolithic target 148 is constructed so that the central portion 138 and the outer portions 140 are embedded in the same structure (i.e., monolithic construction) or mold 150. The monolithic target 148 may be affixed to a structure such as a barrier or battlefield object, or may be positioned in an upright position using a stand, similar to that shown in FIG. 3A. Similarly to the standalone target 108, the monolithic
target 148 is preferably configured to communicate with the application via the network 102, and includes the same "hit" confirmation methodology as that described herein in connection with the standalone target 108 and the wearable device 106.
[00141] While specific seven-panel-shaped targets (e.g., standalone target 108, monolithic target 148) have been shown, other shapes or configurations are within the scope of the present invention.
[00142] The system 100 as shown in FIG. 1 includes a drone 109. The drone 109 is configured to communicate via the network 102 with the application. The drone 109 may be preprogrammed to provide a "passive" target similar to the standalone target 108 or the monolithic target 148, and/or may be an "active" target or participant configured to be controlled remotely. In an embodiment, the drone 109 includes a projectile-firing device 104 affixed thereto. One of ordinary skill in the art would understand that the particular projectile-firing device 104 depicted in FIG. 1 would be modified to be affixed to the drone 109.
[00143] Referring still to FIG. 1, the cloud-based computing system 110 is configured to provide on-demand availability of computer system resources. The cloud-based computing system 110 preferably includes data storage and computing processor resources to offload storage and processing into a cloud environment. One of ordinary skill in the art would be familiar with cloud-based computing systems and would understand that a variety of configurations of components are within the scope of the present invention. The cloud-based computing system 110 is preferably configured to store and run the application.
[00144] The remote device 112 as depicted in FIG. 1 is preferably a smartphone and includes direction-sensing and location-sensing capabilities. For example, the remote device 112 includes GPS and a magnetometer to determine position and direction of the device 112. As disclosed herein, the remote device 112 may be affixed to the projectile-firing device
104 to provide relatively accurate position and direction of the device 104. This data preferably is utilized to provide confirmation, in connection with the application, of a "hit" by a particular user 116, 118 on another user's 116, 118 wearable device 106 or a particular standalone target 108/monolithic target 148 or other target/battlefield object. The remote device 112 may also include an infrared (IR) reader so that the system 100 in connection with the application can sense that a particular device 104 has reflected IR radiation on a wearable device 106 or target (e.g., the targets 108, 148, the drone 109) for further confirmation of a "hit."
[00145] The remote device 112 is preferably connected to the network 102 and configured to communicate with the application. The remote device 112 in a preferred embodiment is configured to receive real- or near-real-time data from the network 102 to provide commands and/or instructions to the user 116, 118, and to display the commands and/or instructions thereon in accordance with the gaming configuration for that particular gaming session. The remote device 112 may display useful game information, such as hits, health, ammo, blaster "type", team alignment, remaining players, etc.
[00146] The fitness monitoring device 114 as depicted in FIG. 1 is configured to communicate with the application via the network 102. The device 114 may be a smart watch such as Apple Watch and Samsung Gear for integration with additional mobile applications. For example, the device 114 may be configured to integrate with health apps and features via an API. In an embodiment, the users 116, 118 may gain XP or other in-game features through increased physical activity and exertion. Users 116, 118 that exhibit more active gameplay, monitored through the fitness monitoring device 114, may earn increased XP. The users 116, 118 may use the XP to unlock special items, exclusive products, free items, or the like.
[00147] As mentioned herein, the application is configured to be stored and run on the cloud-based computing system 110. For example, the application is configured to receive, via the network 102, data from all physical components of the system 100, which is stored in a database associated with the cloud-based computing system 110. The application may be configured to receive data via one or more wireless protocols such as Bluetooth (e.g., Bluetooth Low Energy (BLE)), WiFi, RF, etc. The application preferably is configured to allow the users 116, 118 to download new game modes, receive commands and/or instructions, display score and skill-improvement tracking, and the like. The application may be configured to communicate with the remote device 112 to receive and display this information to the users 116, 118. The application is also configured to provide historical, up-to-date information concerning global and/or local scoreboards, such as those provided in video gaming environments.
[00148] The application is also configured, in an embodiment, to provide control over light and/or sound preferences, customized gaming mode/gaming configuration creation or sharing, and the like.
[00149] The application, in a preferred embodiment, is also configured to control and communicate with the various targets and physical objects in the system 100 via the network 102. For example, the application is configured to control the start/stop of a gaming session, provide live, real- or near-real-time scoring updates, and visual/audible/haptic user feedback in real- or near-real-time. In accordance with a gaming configuration, the application configures the various components of the system 100 initially in a gaming session setup. Each of the "nodes" within the system 100 is configured to communicate with nearby
"nodes" such that the application need only be connected via the network 102 to one other component within the system 100. Thus, sharing and communication of data, information,
commands and/or instructions, and the like may be shared between devices, and not necessarily through a "central" hub or routing station.
[00150] Referring now to FIG. 4, a battlefield environment depicting exemplary components of the extended-reality projectile-firing system 100 is shown. The battlefield environment depicted in FIG. 4 is an outside arena setting showing multiple users 116, 118 each with a projectile-firing device 152. The device 152 is the same or substantially the same as that depicted in FIGS. 1 and 2, but includes a display 154 on the housing 120 facing the user 116, 118. One of ordinary skill in the art would understand that the display 154 may be placed on other areas of the housing 120 without departing from the scope of the present invention. The display 154 is configured to provide useful game information, such as hits, health, ammo, blaster “type” (e.g., projectile-firing device 104, 152), team alignment, remaining players, etc. For example, a health/shield meter 156 depicted in FIG. 4 shows that the user 116 has 100% of the available health points remaining, and a fully available shield. For example, the inner ring depicted on the health/shield meter 156 may signify the remaining health, while the outer ring depicted on the health/shield meter 156 may signify the remaining shield. The upper left corner of the display 154 shows team information 158 including a team affiliation, players on the team, and opponent players. The team information 158 may also indicate that one or more players have been eliminated from game play. The display 154 also includes, on the lower right corner, device information 160, which includes the current weapon type (e.g., SURGE), the firing mode (e.g., semiautomatic, full automatic), the range (e.g., RNG: 4), the damage capacity per round (e.g., DMG: 5), and the number of rounds remaining (e.g., nine rounds). The range, for example, may be determined by a level system wherein a "default" blaster may be given a Level 1 rating. For example, the Level 1 specifications for the blaster may be represented by a single star next to the type of blaster, and include RNG: 4 (out of 10), DMG: 5 (out of 20),
set to semi-automatic only, and 9 rounds available per reload. The range may be a limiting range, such that once a blaster fires at a target or user beyond that range, the "hit" will not be credited toward the blaster's user. Alternatively, the range may be a dampening factor such that beyond the optimum range the blaster's damage capabilities are depleted significantly. Through the networked system 100, the application can determine the range at which a device 152 was fired at a target 108, 148 or a user 116, 118, and attribute damage to the user associated with that particular device 152 based on the distance between the device 152 and the target 108, 148 or the user 116, 118.
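The two range treatments just described lend themselves to a simple calculation. The following is a minimal Python sketch of how the application might credit a hit under either a limiting range or a dampening factor; the function name, the exponential falloff curve, and all numeric values are illustrative assumptions rather than the claimed implementation.

```python
import math
from typing import Optional

def credit_hit(distance_m: float, base_damage: float,
               optimum_range_m: float, hard_limit_m: Optional[float] = None,
               falloff_per_m: float = 0.15) -> float:
    """Return the damage to credit for a hit at the given distance."""
    if hard_limit_m is not None and distance_m > hard_limit_m:
        return 0.0  # limiting range: the hit is not credited at all
    if distance_m <= optimum_range_m:
        return base_damage  # full damage inside the optimum range
    # dampening factor: damage decays past the optimum range
    excess = distance_m - optimum_range_m
    return base_damage * math.exp(-falloff_per_m * excess)

# Example: a Level 1 blaster (DMG 5) fired from 25 m with a 15 m optimum range
print(round(credit_hit(25.0, 5.0, optimum_range_m=15.0), 2))  # 1.12
```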
[00151] Through gameplay, upgrades to the blaster level can be acquired in addition to other blaster types. Upgrades may increase the range, damage, and rounds available of a given blaster type. Other blaster types such as “shotguns” could be picked up, and would have specifications balanced differently, such as very high damage ceilings, reduced range ceilings, and varying round capacities.
[00152] The display further includes, on the upper right corner, power/connectivity information 162, which includes icons representing, for example, WiFi and Bluetooth signals, remaining device power, etc. In this embodiment, both the WiFi and Bluetooth protocols are utilized to provide the network 102.
[00153] The device 152 depicted in FIG. 4 also shows a hopper 130 with projectiles 164 therein. As described herein, the projectiles 164 may be gellets, paintballs, or any other type of projectile suitable for gameplay. The users 116, 118 depicted in FIG. 4 are shown wearing glasses 166. The glasses 166 may be simple safety glasses, or in an embodiment, may be extended-reality devices 105.
[00154] Referring now to FIG. 5, a mixed-reality battlefield environment depicting exemplary components of the extended-reality projectile-firing system 100 is shown. FIG. 5 depicts a first-person view looking through an extended-reality device 105. As shown, the extended-
reality device 105 includes the HUD display 142 through which both real-world physical objects such as the users 116, 118, trees, and ground are viewable, as well as virtual objects such as the health/shield meter 156, the team information 158, the device information 160, and the power/connectivity information 162. FIG. 5 also includes player information 168 displayed in a pop-up style banner above the user 116, 118. For example, as shown in FIG.
5, the player information 168 identifies the name of the player (e.g., JESSTHETIGER), the player's affiliation (e.g., shown by the icon to the left of the player's name), and the player's health/shield meter 156 within the pop-up banner. FIG. 5 further includes an in-game message 170 configured to provide the user 116, 118 associated with the extended-reality device 105 information about the gaming session. For example, as shown in FIG. 5, the user 116 identified as "CASEYTHELION" is shown to be "KNOCKED OUT!", indicating that the user 116 has been eliminated from gameplay. Other information and/or data, as described herein, may be displayed to the users 116, 118 through the HUD display 142 without departing from the scope of the present invention.
[00155] Referring now to FIG. 6, a top-down view of a battlefield environment in accordance with an exemplary extended-reality projectile-firing system 100 is shown. The various components depicted in the battlefield environment of FIG. 6 include projectile-firing devices 104, 152 (e.g., blasters), standalone targets 108 (or monolithic targets 148), a drone 109, home base/capture point stations 172, bunkers 174, turrets 176, and barriers 178.
[00156] As described herein, the projectile-firing devices 104, 152 are each associated with a user 116, 118 (not shown in FIG. 6). The user 116, 118 preferably is associated with an extended-reality device 105, a wearable device 106 (e.g., wearable vest), a remote device
112 (e.g., a smartphone), and a fitness monitoring device 114. In addition to these components, a user 116, 118 may utilize physical player-borne shields, designed to shield or deflect projectiles 164. These could be provided to certain “roles” (e.g., Tank, Heavy),
or picked up on the battlefield. An iteration of this device could have a collapse/expand function, so that it only activates under certain game conditions, or for certain time intervals with a recharge period, but can be equipped during the whole game.
[00157] Each of these components is capable of being in communication with the application via the network 102, as described more fully herein. Not all components associated with the user 116, 118 must be operatively connected to a "central" routing station, but may communicate data with nearby elements, associated with the user 116, 118, another user 116, 118, or any of the networked components described herein. For example, the standalone targets 108, the drone 109, the home base/capture point stations 172, the bunkers 174, and the turrets 176 are all configured to be in communication with the network 102.
[00158] The battlefield environment of FIG. 6 depicts two teams or "sides", each comprising three users 116, 118 (e.g., players). The first team comprises three players on the eastern (right) side of the battlefield, while the second team comprises three players on the western (left) side of the battlefield. On the first team's side of the battlefield, a standalone target 108 is associated with the first team. Thus, in an embodiment, one objective of the second team is to eliminate the standalone target 108 on the first team's side. For example, the target 108 may require a team to inflict damage or cause a certain number of hits to win, to unlock bonuses, to "debuff" the enemy team, or as a target training game mode. Debuffing an opposing team may include reducing the team's rate of fire, slowing reload times, decreasing the number of health points, reducing shields, or a variety of other reduction methodologies. The debuff may be configured to be on a timer, such that when the timer expires, the debuff is removed.
[00159] In addition, the first team's side includes a drone 109 (which may or may not be associated with, or on the same team as, the first team; i.e., it may be neutral or otherwise), a home base/capture point station 172, two bunkers 174, a turret 176, and two barriers 178. The second team's
side includes a standalone target 108, a home base/capture point station 172, a bunker 174, two turrets 176, and two barriers 178. All of these battlefield components may be networked via the network 102 and configured and controlled by the application.
[00160] A battlefield objective may include "capturing" the drone 109 to bring it onto the first team's side or the second team's side. The drone 109 may include a projectile-firing device 104 mounted and independently remotely controlled by an operator or user 116, 118. The drone 109 may include a kit enabling attachment of a projectile-firing device 104, a standalone target 108, or the like, to an existing drone platform. In an embodiment, the drone 109 may be integrated with these components and provided as an integrated unit.
[00161] If risk of physical damage to the drone 109 is a factor, actual projectile firing could be disabled when the projectile-firing device 104 (e.g., a blaster held by a user 116) is aimed at the drone 109, counting as being “shot down” when the trigger is depressed while aimed at the drone 109. The drones 109 may be configured to be “unlocked” as a perk for accomplishing certain objectives. The drones 109 may have certain automated features, allowing them to fly patterns remotely, such as hovering over the home base/capture point station 172 acting as defense, or following a given player, such as in certain video games. The drone 109 may be configured to execute "air strikes," to be unlocked and called in, where the drone 109 takes off automatically and goes to a point or target, takes an action such as blasting it, and then flies back to a preset location.
[00162] While the drone 109 is shown in FIG. 6, other mobile platforms such as rovers or other motorized objects may be utilized for enhanced gameplay. For example, the motorized objects may implement navigational controls, live streaming video, control of projectile-firing devices 104 installed onto the motorized objects, and hit detection and correlation in connection with standalone targets affixed to the motorized objects.
[00163] In an embodiment, the motorized objects are ground-based remote controlled vehicles with mounted blasters (e.g., projectile-firing devices 104) or “explosives” (e.g., mines, grenades, etc.), which can be activated by the user 116, 118 or a remote operator. Both rovers and drones may be used for intelligence gathering with or without mounted blasters, where a camera feed is shown on the display 154 of a device 152, the HUD display 142, the remote device 112, or other viewable display. Control over these units could be contested as part of gameplay, where there is only one available rover (or drone 109), and certain successful actions give one player or team access to it for a period of time.
[00164] The home base/capture point stations 172 are configured to communicate with the application via the network 102. The stations 172 allow players to refill projectiles (e.g., ammunition or rounds) physically, as well as digitally, per the gaming mode or configuration. Proximity to the stations 172 can offer "healing," restoring lost health points while in-game. The stations 172 may also serve as a respawn point; for example, if a player is eliminated during gameplay, the player must physically return to the player's team station 172 in order to return to the game. The game mode or configuration may determine the respawn timer (i.e., how long a player must wait to respawn), as well as the number of respawns possible per player. In addition, these stations 172 can serve as extensible capture points in appropriate game modes, where one team must hold multiple points simultaneously by positioning their players near them and keeping the opposing players away. Proximity can be determined by RF, NFC, GPS, blasting the object within a repeating time window, or a combination of these and/or other methods.
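By way of illustration only, the following Python sketch models the station behaviors described above (proximity-based healing and a respawn timer). The class names, radii, rates, and timer values are assumptions for demonstration, not values taken from the specification.

```python
from dataclasses import dataclass

@dataclass
class Player:
    hp: float
    max_hp: float
    eliminated: bool = False
    eliminated_at: float = 0.0  # game clock time of elimination

class Station:
    """Home base/capture point: heals nearby players and gates respawns."""
    def __init__(self, team: str, heal_radius_m: float = 3.0,
                 heal_rate_hp_s: float = 5.0, respawn_wait_s: float = 10.0):
        self.team = team
        self.heal_radius_m = heal_radius_m
        self.heal_rate_hp_s = heal_rate_hp_s
        self.respawn_wait_s = respawn_wait_s

    def tick(self, player: Player, distance_m: float, dt_s: float, now_s: float):
        in_range = distance_m <= self.heal_radius_m
        if player.eliminated:
            # player must be physically at the station and the timer expired
            if in_range and now_s - player.eliminated_at >= self.respawn_wait_s:
                player.eliminated = False
                player.hp = player.max_hp
        elif in_range:
            player.hp = min(player.max_hp, player.hp + self.heal_rate_hp_s * dt_s)

p = Player(hp=40.0, max_hp=100.0)
Station(team="red").tick(p, distance_m=2.0, dt_s=1.0, now_s=0.0)
print(p.hp)  # 45.0 (healed at 5 HP/s while inside the 3 m radius)
```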
[00165] The bunkers 174 in an embodiment are rapidly inflating and self-storing inflatable fold-out panels composed of rubber sheets, bistable spring sheets, and inflatable structural beams.
The bunkers 174 include an attached pump or compressed air storage tank whereby the inflatable structural beams are inflated and the air pressure folds out the panels. Fixtures at
the base of the unit can assist in maintaining the panel firmly in place. Inflation and deflation of the bunkers 174 may be controlled by the application and/or the users 116, 118 utilizing any of the methods described herein to activate or deactivate battlefield objects.
For example, the users 116, 118 may deploy the bunkers 174 by activating the bunkers 174 via the remote device 112, the display 154, or through an interface available through the extended-reality device 105 and/or the HUD display 142. The bunkers 174 may also be manually inflated/deflated by the users 116, 118. The application may also keep track of which bunkers 174 are deployed/inflated, and their location or proximity to other battlefield objects in accordance with several embodiments disclosed herein.
[00166] The turrets 176 are configured to be in communication with the network 102 and controllable through a variety of methods, as disclosed herein. For example, global, connected true “video games” can be augmented into the application to allow for control of a turret 176 via the internet from users worldwide. The turret 176 may be controllable via the application by a team's user 116. Controlling the turret 176 may include placing the turret 176 in a "sentry" mode, where if an opposing player is within a range of the turret 176, the turret 176 will fire projectiles 164 at the opposing player. The turret 176 may include motion sensors or body-detecting cameras to allow for "sentry" functionality. The turrets 176 may be configured to be semi-autonomous (e.g., partially player controlled) or fully autonomous (e.g., the player puts the turret 176 into "sentry" mode). The turrets 176 preferably include a standalone target 108 so that hit detection and correlation may occur. This functionality would allow a team to "take out" or deactivate a turret 176 during gameplay, or capture the turret 176 for the opposing team's advantage. The ability to detect hits and correlate the hits with particular players in the battlefield scenario may enhance gameplay with rewards, loot, or other asymmetrical advantages to players for completing interactions according to the active gameplay mode or configuration.
[00167] In an embodiment, according to an "adversarial" target mode, a device 104 mounted on or near the target is aimed at the user 116. Under certain game modes, the turret 176 would fire back at the player (e.g., a "wild west quick draw" mode). If the user 116 does not hit the target quickly enough, the turret 176 shoots back and the user 116 loses the round. This same mode could also be configured to operate with a standalone target 108. The adversarial mode can also be configured to enable a human player to remotely control the adversarial blaster via the internet through the network 102.
[00168] As with the bunkers 174, the turrets 176 may be player-deployable or “permanent” (e.g., at a venue) turrets 176 that can be controlled directly or remotely, and/or be automated via motion detection, computer vision, or at random. The turrets 176 preferably would contain the same hit detection systems as the wearable devices 106, the standalone targets 108, and the monolithic targets 148, allowing players to “destroy” them by hitting them a number of times, and allowing attribution of the turrets’ 176 hits to the appropriate player or team. These turrets 176 could allow disabled or remote players to engage with their friends on the battlefield, either via the internet, nearby RF, or hands-on controls.
[00169] The barriers 178 preferably are deployable static barriers, similar to the bunkers 174, but "permanent" for that particular gaming session. For example, teams could be given a number of player-portable barriers to set up before the match, enabling a more interesting playspace on an otherwise flat arena such as a sports field. The barriers 178 can be constructed of lightweight but durable fabric on a collapsible frame similar to a folding chair or umbrella. They can also be printed with branded or themed livery.
[00170] As described herein, global, connected true “video games” can be augmented into the application to allow for control via the internet from users worldwide. Semi-autonomous or fully autonomous extensions may include "swarms" or "strikes" from mobile platforms for tournaments or larger-scale battles.
[00171] In place of the permanent or semi-permanent battlefield objects illustrated in FIG. 6, capturable objects may be deployed in accordance with certain game modes. For example, in "capture-the-flag" gaming modes, the capturable objects preferably can detect their proximity to a team’s home base/capture point station 172 through RF or NFC, or the like, and know when they are in motion via accelerometer and/or GPS, allowing the network to alert players when a “flag” is being captured.
[00172] Beyond capturable objects, the battlefield objects illustrated in FIG. 6 may include battlefield "ordnance" and accessories. For example, the battlefield objects may include mines that are deployable, proximity-detonated battlefield objects that blast projectiles outward. In an embodiment, the mines do not blast projectiles outward, but are controlled by the application to credit a "hit" from the mine in connection with a particular user 116 within range of the mine's proximity. In this case, a player coming too close to a mine is simply notified via audio/video/haptic (A/V/H) feedback that they have been hit, and the hit is credited against the player. Alternately, the projectiles can be ejected outward but not be required to impact the player-worn target in order to count, thus simply providing tactile feedback for the hit.
[00173] The battlefield objects may also include objects thrown by the player. Similar to the mines described above, "grenades" may be utilized and would simulate an explosion.
[00174] As described herein, the type of weapon or projectile-firing device 104 may be virtual, utilizing the network 102 and NFC or other electronic means to allow for battlefield pickup of different weapon types or accessories. For example, a small object preferably can be picked up and placed near or attached to the Smart Blaster (or via interaction with a static battlefield object, e.g., shooting a target), telling it to change configuration and registering it in the system as a new weapon type. Table 1 below illustrates potential examples of
various weapon types within the scope of the present invention in accordance with a gaming mode or configuration.
TABLE 1

Weapon Type | Capacity | Reload Time | DMG per Hit | Rate of Fire
Revolver | 6 rounds | 3 seconds | 1 | 1 shot per second
Automatic | 30 rounds | 5 seconds | 2 | physical rate of fire (optionally limited)
Bazooka | 1 round | slower than other weapon types | greater per hit, splash-damage capable | limited by reload
Sniper | 1 round | 4 seconds | 7 | limited by the 4-second reload
[00175] The "Revolver" of Table 1 includes a six-round capacity, requiring a player to reload the Revolver after firing six shots. To reload another six rounds, the Revolver is controlled to wait 3 seconds before being able to fire another round from the next six rounds. Each round inflicts 1 DMG per successful hit. The rate of fire is limited to 1 shot per second. The "Automatic" blaster of Table 1 includes a 30-round capacity with a reload time of 5 seconds. The DMG per hit for the Automatic is higher than the Revolver, at 2 DMG per hit. The rate of fire is not limited and may utilize the physical rate of fire available to the projectile-firing device 104. Alternatively, the rate of fire may be limited. The Bazooka of Table 1 is a "splash-damage" capable option. Since the Bazooka is a slower-firing weapon type, and inflicts greater damage per hit, its reload time is slower than other types of weapons and does not have more than one round in its capacity. The Sniper of Table 1 includes a 1 round capacity with a 4-second reload time, which limits the rate of fire by a 4-second interval. The Sniper has a greater damage capacity of 7 damage than the Revolver or the Automatic. One of ordinary skill in the art would understand that the various weapon types and parameters are fluid and may change depending on the gameplay configuration. For example, a range or "overheat" timer could be specified by weapon type.
[00176] In an embodiment, the trigger 124 of the device 104 is electronic; thus, the “reload” and related reload time can be enforced by disabling the trigger until the player hits a reload button, after which a reload timer is activated. Audible, visual, and haptic feedback can be provided for all of these actions and states (e.g., out of "digital" rounds/ammunition).
[00177] Similarly, the rate of fire can be controlled, only registering a trigger 124 pull or firing a projectile 164 at the appropriate time interval. Semi-automatic and fully automatic modes may be manually or automatically controlled, as described herein.
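A minimal sketch of the electronic trigger gating described in these paragraphs follows, using the Revolver values discussed in connection with Table 1 (six rounds, 3-second reload, one shot per second); the class and method names are hypothetical, not part of the disclosed firmware.

```python
class DigitalWeapon:
    """Gates an electronic trigger by digital rounds, reload, and rate of fire."""
    def __init__(self, capacity=6, reload_time_s=3.0, min_shot_interval_s=1.0):
        self.capacity = capacity
        self.reload_time_s = reload_time_s
        self.min_shot_interval_s = min_shot_interval_s
        self.rounds = capacity
        self.reload_done_at = 0.0
        self.last_shot_at = -1e9

    def pull_trigger(self, now_s: float) -> bool:
        """Return True if a projectile may be fired at time now_s."""
        if now_s < self.reload_done_at:       # still reloading
            return False
        if self.rounds == 0:                  # out of "digital" rounds
            return False                      # (A/V/haptic feedback here)
        if now_s - self.last_shot_at < self.min_shot_interval_s:
            return False                      # rate-of-fire limit
        self.rounds -= 1
        self.last_shot_at = now_s
        return True

    def reload(self, now_s: float):
        self.reload_done_at = now_s + self.reload_time_s
        self.rounds = self.capacity  # rounds usable once the timer expires

w = DigitalWeapon()
print(w.pull_trigger(0.0), w.pull_trigger(0.5), w.pull_trigger(1.0))
# True False True: the second pull violates the 1 shot/second limit
```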
[00178] The system 100 calculates damage digitally through data received from the network 102.
For example, the device 104 provides a signal indicating its weapon type; if the target 108, 148 or wearable device 106 is "hit," the target 108, 148 or wearable device 106 will know the weapon type and use the weapon characteristics (e.g., as disclosed in Table 1) to calculate the damage. Alternatively, or in combination, the application calculates damage inflicted on a battlefield object by a particular weapon type. The application, hosted in the cloud-computing system 110, may store a lookup table or other suitable tracking method within a database, or the like, to track and credit hits by/to certain battlefield objects, as more fully described herein. In an embodiment, the damage inflicted by a particular weapon type may be degraded or decreased at outer limits of the weapon type's range.
[00179] Splash damage, such as from a Bazooka, mine, or grenade, can be calculated by a proximity system, as more fully described herein. The splash damage may be greater the closer the battlefield object/target 108, 148 or wearable device 106 is to the epicenter of the "explosion." An IR device associated with the projectile-firing device 104 may be utilized to track splash damage. For example, a number of targets may be “pre-painted” with IR, which will hold them as painted for a fixed period of time, and if one is hit with a projectile within that window, they each receive damage.
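As one possible reading of the splash-damage and IR "pre-painting" mechanics above, the following Python sketch applies distance-scaled damage to any target still inside its paint window. The linear falloff, the window length, and the attribute names are assumptions for illustration.

```python
import time

def splash_damage(distance_m: float, max_damage: float, blast_radius_m: float) -> float:
    """Damage falls off linearly with distance from the epicenter."""
    if distance_m >= blast_radius_m:
        return 0.0
    return max_damage * (1.0 - distance_m / blast_radius_m)

def apply_splash(targets, epicenter_xy, max_damage=10.0, blast_radius_m=4.0,
                 paint_window_s=2.0, now_s=None):
    """targets: iterable of objects with .xy, .painted_at, and .hp attributes."""
    now_s = time.monotonic() if now_s is None else now_s
    for t in targets:
        if now_s - t.painted_at > paint_window_s:
            continue  # IR paint expired; no splash credit for this target
        dx = t.xy[0] - epicenter_xy[0]
        dy = t.xy[1] - epicenter_xy[1]
        t.hp -= splash_damage((dx * dx + dy * dy) ** 0.5, max_damage, blast_radius_m)

from types import SimpleNamespace
t = SimpleNamespace(xy=(1.0, 1.0), painted_at=0.0, hp=20.0)
apply_splash([t], epicenter_xy=(0.0, 0.0), now_s=1.0)
print(round(t.hp, 2))  # 13.54: ~6.46 of the 10.0 maximum at ~1.41 m
```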
[00180] Various means of swapping weapon types are within the scope of the present invention. For example, a variety of “skins” could be implemented, such as shells that attach or slide over the projectile-firing device 104. These skins could have the necessary electronics to tell the blaster what weapon type to become. The skins could combine both physical and virtual objects in accordance with a particular weapon type.
[00181] Other objects could be utilized to provide users 116, 118 with varying attributes for ingame performance. For example, stickers containing passive NFCs can be provided as packs, and players can affix them to objects of their choice to place around the battlefield, allowing loot drops including weapon types and upgrades; these objects could be misplaced without much loss, as new stickers can be easily replaced. In addition, scopes or silencers may be provided as accessories to the device 104 to provide relative effects on gameplay. For example, the scope could increase the range at which a particular weapon type is effective.
[00182] FIG. 7 is a flow diagram of an extended-reality projectile-firing gaming method 200. At Step 202, a gaming session in accordance with a predetermined gameplay configuration is initiated. The extended-reality gaming application is configured, in a preferred embodiment, to control and communicate instructions and/or messages to initiate the gaming session. Alternatively, or in combination, the remote device 112 is configured to initiate the gaming session.
[00183] At Step 204, projectile-firing device data is received from one or more blasters (e.g., projectile-firing devices 104, 152). The projectile-firing device data preferably includes data configured to update gaming metric data in accordance with a gameplay configuration. The gameplay configuration may specify requirements for the gaming session, including available health points, shields, blaster types, available rounds, reload interval times, rounds per magazine, and battlefield objectives, among other things. The
gaming metric data in a preferred embodiment tracks player scores, updating the gaming metric data to confirm successful impacts (e.g., projectile hits) from particular blasters 104 against particular wearable 106 and non-wearable battlefield objects 108, 148, 172, 176, etc. Thus, the projectile-firing device data includes data indicating that a projectile was fired, the time the projectile was fired, and location and orientation information. The projectile-firing device data may also include information obtainable from one or more sensors, such as near-field communication, radio frequency, infrared, magnetometer, and global positioning satellite sensory data.
[00184] At Step 206, battlefield object data is received from one or more battlefield objects such as wearable or non-wearable devices. The battlefield object data preferably includes data configured to update gaming metric data in accordance with a gameplay configuration. Thus, the battlefield object data preferably includes information that the battlefield object 106, 108, 148, 172, 176, etc., was impacted with a fired projectile, the time of impact, and the location and orientation of the battlefield object. The battlefield object data may also include information obtainable from one or more sensors, such as near-field communication, radio frequency, infrared, magnetometer, and global positioning satellite sensory data. The battlefield object data may wholly, or partly, be provided via an affixed remote device 112 configured to provide the sensory data.
[00185] As described above, at Steps 204 and 206, the projectile-firing device data and the battlefield object data are received. In a preferred embodiment, these data are received by the cloud-based computing system 110 via the network 102 and usable by the extended- reality gaming application. In other embodiments, these data are received by a standalone application at the physical location of the gaming session, or may be a combination of cloud-based and physical location-based computing and processing. One or more remote computing centers or systems may be utilized to receive and process the data.
[00186] At Step 208, the firing of a projectile is confirmed, and a time stamp is associated with the firing of the projectile. For example, the data tracks the projectiles as they are fired by the projectile-firing device 104, and provides time stamps when each projectile was fired in real- or near-real-time. At Step 210, the location and orientation of the blaster that fired the projectile are determined. As described above, the sensory data from the projectile-firing device data is preferably utilized to determine the location and orientation of the blaster.
[00187] At Step 212, an impact on a battlefield object is confirmed, and a time stamp is associated with the impact on the battlefield object. For example, the battlefield object is tracked to perceive impacts on wearable devices 106 and/or non-wearable devices 108, 148, 172, 176, etc. At Step 214, the location and orientation of the battlefield object are determined. As described above, the sensory data from the battlefield object data is preferably utilized to determine the location and orientation of the battlefield object.
[00188] At Step 216, it is determined whether the time the projectile was fired and the time of the impact on the battlefield object are within a predetermined time period. For example, the predetermined time is measured in milliseconds, as the blaster's projectile velocity is preferably at least 90 FPS and can exceed 200 FPS. This determination provides confirmation that the blaster shot the particular projectile that was detected by the particular battlefield object.
[00189] At Step 218, it is determined whether the location of the blaster that fired the projectile and the battlefield object detecting the impact are at locations within a predetermined area. As described herein, the physical range of the blaster 104 is limited, but the blaster 104 may be further limited by the application such that certain impacts outside of the predetermined area are outside of the permitted blaster's 104 range. The location determination permits further confirmation that the blaster shot the particular projectile that was detected by the particular battlefield object.
[00190] At Step 220, it is determined whether the blaster that fired the projectile is oriented toward the battlefield object detecting an impact by a projectile. If the orientation is within a predetermined direction or orientation, the orientation determination permits further confirmation that the blaster shot the particular projectile that was detected by the particular battlefield object.
[00191] At Step 222, the gaming metric data is updated confirming a successful impact on the battlefield object and a successful hit by the blaster 104. The updated gaming metric data attributes a decrease in health points to a user or battlefield object that was impacted by the blaster 104, in accordance with the gaming configuration, and attributes damage inflicted upon the user or battlefield object to the user associated with the blaster 104. The gaming metric data in a preferred embodiment tracks player scores, updating the gaming metric data to confirm successful impacts (e.g., projectile hits) from particular blasters 104 against particular wearable 106 and non-wearable battlefield objects 108, 148, 172, 176, etc. in real- or near-real-time.
[00192] The determinations in Steps 216, 218, and 220 and the updating of the gaming metric data in Step 222 are preferably conducted using the gaming application.
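The determinations of Steps 216, 218, and 220 can be summarized in a short co-witnessing check. The Python sketch below assumes each firing event and impact event carries a timestamp, a planar position, and (for the blaster) a heading; the thresholds and field names are illustrative assumptions, not claimed values.

```python
import math

def attribute_hit(fire, impact, max_dt_s=0.5, max_range_m=30.0,
                  max_bearing_error_deg=10.0) -> bool:
    # Step 216: firing time and impact time within a predetermined period
    if abs(impact["t"] - fire["t"]) > max_dt_s:
        return False
    # Step 218: blaster and battlefield object within a predetermined area
    dx = impact["x"] - fire["x"]
    dy = impact["y"] - fire["y"]
    if math.hypot(dx, dy) > max_range_m:
        return False
    # Step 220: blaster orientation points toward the impacted object
    bearing = math.degrees(math.atan2(dy, dx)) % 360.0
    error = abs((fire["heading_deg"] - bearing + 180.0) % 360.0 - 180.0)
    return error <= max_bearing_error_deg

fire = {"t": 10.00, "x": 0.0, "y": 0.0, "heading_deg": 45.0}
impact = {"t": 10.12, "x": 8.0, "y": 8.0}
print(attribute_hit(fire, impact))  # True: time, range, and bearing all agree
```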
[00193] FIG. 8 is a flow diagram of an extended-reality projectile-firing gaming method 300. At Step 302, blaster data is received from a blaster 104. The blaster data is preferably the projectile-firing device data associated with method 200. At Step 304, wearable vest data is received from a wearable vest 106. The wearable vest data is preferably the battlefield object data associated with method 200. At Step 306, battlefield object data is received from a battlefield object 108, 148, 172, 176, etc. The battlefield object data is preferably the same battlefield object data associated with method 200.
[00194] At Step 308, the time of firing of a fired projectile from the blaster 104 is recorded. Preferably the time of firing is recorded and stored in connection with the gaming
application. At Step 310, an impact of the fired projectile on the wearable vest 106 is confirmed. The methodology for determining an impact on a wearable vest 106 is disclosed elsewhere herein. At Step 312, the time of impact of the fired projectile on the wearable vest 106 is recorded. At Step 314, an impact of the fired projectile on the battlefield object 108, 148, 172, 176, etc. is confirmed. The methodology for determining an impact on a battlefield object 108, 148, 172, 176, etc. is disclosed elsewhere herein. At Step 316, the time of impact of the fired projectile on the battlefield object 108, 148, 172, 176, etc. is recorded.
[00195] At Step 318, it is determined whether the time of impact on the wearable vest 106 and the time of firing the fired projectile are within a predetermined time period. This determination provides confirmation that the blaster shot the particular projectile that was detected by the particular wearable vest. At Step 320, it is determined whether the time of impact on the battlefield object 108, 148, 172, 176, etc. and the time of firing the fired projectile are within a predetermined time period. This determination provides confirmation that the blaster shot the particular projectile that was detected by the particular battlefield object. For example, the predetermined time is measured in milliseconds, as the blaster's projectile velocity is preferably at least 90 FPS and can exceed 200 FPS.
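The predetermined window can be sized from the stated projectile velocities. A back-of-the-envelope sketch follows (the drag-free model is a simplifying assumption):

```python
def flight_time_window(distance_ft: float, v_min_fps=90.0, v_max_fps=200.0):
    """Bound the fire-to-impact interval for a given blaster-to-target distance."""
    return distance_ft / v_max_fps, distance_ft / v_min_fps  # (fastest, slowest)

lo, hi = flight_time_window(30.0)
print(f"30 ft: {lo * 1000:.0f}-{hi * 1000:.0f} ms")  # 30 ft: 150-333 ms
```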
[00196] At Step 322, a successfully fired projectile attributed to the blaster 104 is recorded.
Preferably the application records the successful hit of the projectile and associates the successful hit with the blaster 104 and the associated user. At Step 324, a successful impact on the wearable vest 106 and/or the battlefield object 108, 148, 172, 176, etc. is recorded. Preferably the application records the successful impact of the projectile on either the wearable vest 106 and/or the battlefield object 108, 148, 172, 176, etc. In practice, a blaster's 104 projectile will likely only be attributable to a single wearable vest 106 or a single battlefield object 108, 148, 172, 176, etc. However, when another projectile-firing
device such as a Bazooka, mine, or grenade involves splash damage, one or more wearable vests 106 and/or battlefield objects 108, 148, 172, 176, etc. may be impacted by a single "explosion."
[00197] At Step 326, the gaming metric data is updated to record the successful impact and the successful hit and attribute each to corresponding devices and their associated users.
[00198] It will be appreciated by one of ordinary skill in the art that method 300 does not specifically include location and/or orientation confirmation of successful impacts or hits. However, those confirmations may be implemented in the method 300. Alternatively, any combination of timing, location, and orientation data and/or information may be provided to confirm successful impacts or hits, and are within the scope of the present invention.
[00199] It will also be appreciated that the updated gaming metric data identified in the methods 200, 300 preferably is provided to the projectile-firing devices 104, 152, the display 154, and/or the HUD display 142, and/or the like, to provide updated gaming score information in accordance with the gaming configuration during the gaming session in real- or near-real-time.
[00200] Referring now to FIG. 9, FIG. 9 is a smart target 400 in accordance with a preferred embodiment of the present invention. The smart target 400 is configured to be integrated and utilized with a smart target co-witnessing hit attribution system in accordance with a preferred embodiment of the present invention. In an embodiment, the smart target 400 is a standalone target 108 or a monolithic target 148. The smart target 400 preferably includes a piezoelectric (PE) sensor 410 configured to determine an impact of a particular projectile upon (or near) the smart target. In an embodiment, the smart target 400 is arranged as a "flower." The standalone target 400 includes a pistil portion 402 (e.g., "pistil" of flower) and a plurality of petal portions 404 (e.g., "petals" of flower). As shown in FIG. 9, the smart target 400 includes six petal portions 404, but in other embodiments, the number of
petal portions 404 may be greater or fewer than six. Each of the pistil portion 402 and the petal portions 404 are configured to separately determine whether a projectile has impacted that portion of the standalone target 400.
[00201] In an embodiment, a PE sensor 410 is embedded or integrated and/or operably coupled with each of the pistil portion 402 and the petal portions 404. In another embodiment, a single PE sensor 410 or two or more PE sensors 410 are utilized to sense impacts on the pistil portion 402 and the petal portions 404. A trained model, as described further herein, is preferably utilized to enable electronic control circuitry to determine when a projectile has impacted a particular portion of the smart target 400. Depending on the application it may be advantageous to provide multiple PE sensors 410 for a single smart target 400. In an embodiment, the same target impact methodology is utilized for the wearable device targets 136 and the central and outer portions 138, 140 of the standalone target 108. The smart target 400 includes support members 406 to support the pistil portion 402 and the petal portions 404. The smart target 400 also includes a stand 408 configured to support the smart target 400 in an upright position. One of ordinary skill in the art would understand that a PE sensor 410 is an electromechanical device that uses the piezoelectric effect to measure changes in pressure, strain, force (or other physiological effects) by converting them to an electrical charge or signal. In smart target parlance, the PE sensor 410 is utilized for hit detection. In other embodiments, sensors other than the PE sensor 410 may be implemented, as described herein.
[00202] As an example, the smart target 400 is affixed to one or more petals of a target system. In an embodiment, the smart target depicted in FIG. 9 includes a central PE sensor 410 and outer "petal" PE sensors 410 configured to detect hit impacts. The same (or similar) modular PE sensor 410 is also configured to be used in connection with wearable vests, for example. Referring now to FIG. 10, FIG. 10 is a perspective view of a wearable device 412
including one or more smart targets 400 in accordance with a preferred embodiment of the present invention. One can see that the PE sensors 410 that form the center (e.g., the central PE sensor 410) and the petals (the outer “petal” PE sensors 410) of the wearable device 412 are affixed in the central torso portion of the wearable vest depicted in FIG. 2. In addition, the wearable device 412 may include one or more PE sensors 410 on the back of the user's wearable vest or on shoulder portions of the wearable device. In an embodiment, the wearable device 412 is a smart target 400. Depending on the size and sensitivity of the PE sensors, as well as electronic, firmware, or software filtering of the PE sensor's electrical charge (i.e., electrical signal), one or more PE sensors 410 may be trained to determine a successful impact of a projectile (e.g., gellet).
[00203] In a preferred embodiment, logic is trained using machine learning (ML) to identify positive strikes against background noise and artifact impacts by identifying the waveform signature (i.e., electrical signal) specific to gellet composition and anticipated material density. For example, logic trained through machine learning obtains training data; a robot (or the like) uses a smart blaster 416 and varies distance, capturing data to understand the waveform signature that a gellet (or other projectile) generates upon a successful hit. In an embodiment, a two-dimensional convolutional ML model may be trained by multiple sensor inputs being provided simultaneously. The captured data provides a model to be utilized by the smart target 400 to detect a successful hit. The hit detection trained model accounts for off-angle impacts, variance of impact pressure, and other physical metrics (i.e., relative location adjacent to the smart target 400) to increase or decrease the amount of "damage" points given to the user upon a successful hit. The machine learning element obtains data for the trained model used by the PE sensor to understand the angle of impact and range of a gellet impact, as but one example.
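One way to realize the two-dimensional convolutional model described above is to stack the simultaneous sensor waveforms into a (sensors × samples) array and classify it as hit or no-hit. The following PyTorch sketch is purely illustrative; the layer sizes, the seven-sensor/256-sample input shape (e.g., pistil plus six petals), and the two-class output are assumptions.

```python
import torch
import torch.nn as nn

class HitClassifier(nn.Module):
    """2-D CNN over simultaneous PE-sensor waveforms: hit vs. not-hit."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=(3, 7), padding=(1, 3)), nn.ReLU(),
            nn.MaxPool2d((1, 4)),                 # downsample along time only
            nn.Conv2d(8, 16, kernel_size=(3, 7), padding=(1, 3)), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(16, 2),                     # logits: [no hit, hit]
        )

    def forward(self, x):                         # x: (batch, 1, sensors, samples)
        return self.net(x)

batch = torch.randn(4, 1, 7, 256)  # 4 windows of 7 sensors x 256 samples
print(HitClassifier()(batch).shape)  # torch.Size([4, 2])
```

Training such a model would use the robot-collected waveform captures described above, labeled by whether a gellet actually struck the target.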
[00204] More specifically, the trained model is generated using machine-learning-assisted noise reduction (e.g., ambient vibration, non-projectile impacts), which may be implemented in hardware, firmware, software, or a combination thereof. Machine learning also may include identification of different materials that may be advantageous to eliminate or reduce noise and provide a "cleaner" waveform signature to be used in creating filtering or the like. Envelope detection and peak detection circuits may be provided for determining characteristic gellet impact signals. This feature is not limited to gellet projectiles, but can be used for all types of projectiles including airsoft, paintball, bow-style projectiles, firearms, etc.
[00205] The trained model also reduces the possibility of "false positive" impacts or cheating, distinguishing non-truthed impacts from real impacts. The accuracy embodied by the trained model eliminates or reduces the necessity for a referee to determine which individual was hit by a projectile, and to attribute the hit to another individual. In an embodiment, the smart target 400 includes a filter and amplifier circuit tuned to impact responses for detecting vibration. For example, the filter and amplifier circuit may be configured to eliminate portions of a signal from the PE sensor 410 that are not attributable to a hit by a projectile. As described above, the trained model permits the designer of the filter and amplifier circuitry to understand, based on many projectiles hitting a target with a PE sensor 410, the waveform signature of a successful hit. Thus, the filter and amplifier circuit is preferably designed to eliminate background noise and non-projectile impacts, and is able to differentiate between real impacts by a projectile and everything else that may occur during gameplay.
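A software analogue of the tuned filter, envelope detection, and peak detection chain might look like the following, assuming SciPy is available; the pass band, filter order, and threshold are placeholders to be tuned against the trained model, not disclosed values.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert, find_peaks

def detect_impacts(signal, fs_hz, band_hz=(500.0, 5000.0), threshold=0.2):
    """Band-pass the PE signal, extract its envelope, and find impact peaks."""
    sos = butter(4, band_hz, btype="bandpass", fs=fs_hz, output="sos")
    filtered = sosfiltfilt(sos, signal)
    envelope = np.abs(hilbert(filtered))                # envelope detection
    peaks, _ = find_peaks(envelope, height=threshold)   # peak detection
    return peaks / fs_hz                                # impact times (seconds)

# Synthetic demo: background noise plus one decaying 1.5 kHz "tap" at t = 0.5 s
fs = 20_000
t = np.arange(0, 1.0, 1 / fs)
noise = 0.02 * np.random.randn(t.size)
tap = np.exp(-200 * np.abs(t - 0.5)) * np.sin(2 * np.pi * 1500 * (t - 0.5))
print(detect_impacts(noise + tap, fs))  # approximately [0.5]
```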
[00206] In a preferred embodiment, the smart target 400 includes an IR detection sensor 414. Hit detection may be implemented such that if the PE sensor 410 identifies a hit, but the IR detection sensor 414 does not, then the detected hit is not counted as a real strike. In
another embodiment, the PE sensor 410 (i.e., hit detection) versus IR sensor 414 (i.e., hit recognition) may be expressed as a ratio and a successful hit is not counted unless the ratio exceeds a particular threshold.
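The two co-witnessing policies just described (strict IR confirmation, or a detection-to-recognition ratio compared against a threshold) reduce to a small decision function. The 0-to-1 confidence scale, the ratio formula, and the threshold below are assumptions for illustration:

```python
def co_witnessed(pe_confidence: float, ir_confidence: float,
                 require_ir: bool = True, ratio_threshold: float = 0.6) -> bool:
    """Decide whether a PE-detected hit counts as a real strike."""
    if require_ir:
        # strict mode: a PE detection without IR recognition is discarded
        return pe_confidence > 0.5 and ir_confidence > 0.5
    # ratio mode: compare hit recognition to hit detection against a threshold
    ratio = ir_confidence / max(pe_confidence, 1e-6)
    return pe_confidence > 0.5 and ratio >= ratio_threshold
```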
[00207] In an embodiment, the IR detection sensor 414 may also confirm the angle at which a projectile hit the target, which in a gameplay environment, may adjust "damage" or "hit points" up or down depending upon the angle of impact, tracked and calculated by the application.
[00208] In an embodiment, the IR sensor 414 can be embedded with the PE sensor 410. In other embodiments, the IR sensor 414 could be in a different location from the PE sensor 410. Depending on the application (i.e., wearable device 412 vs. standalone smart target 400), it may be advantageous to locate the IR sensor 414 near the PE sensor 410, or distanced apart. For example, the distance between the IR sensor 414 and the PE sensor 410 preferably is from ½ centimeter to 2 meters. In addition, there may be advantages to the location of the IR sensor 414 versus the PE sensor 410 at varying ranges. For example, logic trained with machine learning may indicate that optimum distances between the sensors are achievable at ranges in accordance with the application. For example, in a blaster-based gaming environment, it may be advantageous to train logic at distances most likely to be experienced during gameplay (e.g., 50 feet or less).
[00209] Referring still to the IR sensor 414, any combination of carrier frequency, emitter LED, and receiver module can be used. IR messaging modules are preferably universal and included to interface with other IR devices. IR modules preferably consist of a high-power infrared LED meant for line-of-sight transmission systems, and an infrared receiver module with matched optical characteristics. The IR modules combined with application features include IR-based universal interaction with the application for actions such as shoot-to-interact and shoot-to-pair.
[00210] Other sensors or methodologies may be utilized alternatively or in combination with the PE sensor 410 and IR sensor 414 co-witnessing techniques. For example, a brain-computer interface (BCI) based impact detection system may include skin-contact electroencephalogram (EEG) sensors (or others) that read players’ physiological response to being hit by a projectile and count it as a hit in the system. The BCI technique may be utilized in combination with IR co-witnessing or other confirmation methods. In addition, computer-vision (CV) aided hit detection may be utilized. For example, cameras or other RF-sensitive equipment may watch for projectile impact and use machine learning (ML) to filter noise and false positives. The CV-aided hit detection methodology may be best suited to controlled arena play where external cameras/sensors can be permanently mounted, and cost concerns are mitigated. The CV-aided hit detection methodology utilizes video and/or photo recognition technology to help accuracy and gameplay experience.
[00211] Referring still to FIG. 10, a BCI sensor 458 is depicted encircling the head of the user 116, 118. The BCI sensor 458 includes EEG sensors 460 configured to record an electrogram of spontaneous electrical activity of the brain. While the EEG sensors 460 are depicted in certain areas of the BCI sensor 458, they are placed for illustrative purposes only, and one of ordinary skill in the art would recognize that the EEG sensors 460 would be optimally placed in order to accurately monitor and record electrical activity of the brain. For example, a MyndBand EEG headset may be configured to be used in connection with the system disclosed herein. Other headsets or devices are within the scope of the present invention. In other embodiments, a full or partial “helmet” may be utilized to facilitate placement of the EEG sensors 460 on portions of the head of the user 116, 118 that are not covered by the “headband” style headset depicted in FIG. 10.
[00212] As with the PE sensor 410, the BCI sensor 458 preferably is configured to filter or reduce noise of the EEG sensor 460 signals by utilizing a trained model. Similar to the descriptions of ML-trained models herein, a blaster 104 is shot multiple times from varying distances and angles at a smart target 400, and data is gathered from the BCI sensor 458 in the form of EEG sensor 460 signals. A trained model can assist the control circuitry of the smart target 400 (or distributed/remote software application and/or circuitry) to filter out false positives by isolating portions or bands of frequencies. This will assist the system in quickly and efficiently determining whether an object that hits the user is a projectile, and may further allow the system to determine impact angle and distance. For example, the signal from the EEG sensors 460 may appear different (e.g., different distribution of frequencies, or reduced intensity at certain frequency bands) at an impact angle of 0° versus 45°. Likewise, the signal from the EEG sensors 460 may appear different at close ranges versus farther ranges (e.g., different intensity or reduced intensity at greater distances).
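To illustrate the band-isolation idea, the sketch below band-passes the EEG signal, computes instantaneous band power, and flags samples that deviate sharply from a robust baseline; the 13-30 Hz band, the z-score test, and the threshold are assumptions for illustration rather than findings of the study described next.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt

def band_power(eeg, fs_hz, lo_hz, hi_hz):
    """Instantaneous power of the EEG signal within one frequency band."""
    sos = butter(4, (lo_hz, hi_hz), btype="bandpass", fs=fs_hz, output="sos")
    return sosfiltfilt(sos, eeg) ** 2

def eeg_hit_flags(eeg, fs_hz=256, lo_hz=13.0, hi_hz=30.0, z_threshold=4.0):
    """Flag samples whose band power spikes far above a robust baseline."""
    power = band_power(eeg, fs_hz, lo_hz, hi_hz)
    baseline = np.median(power)
    spread = np.median(np.abs(power - baseline)) + 1e-12
    return (power - baseline) / spread > z_threshold

rng = np.random.default_rng(0)
eeg = rng.standard_normal(2560)   # 10 s of synthetic background at 256 Hz
eeg[1280:1300] += 25.0            # injected impact-like burst at t = 5 s
print(np.where(eeg_hit_flags(eeg))[0][:3])  # flagged samples near index 1280
```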
[00213] In a study using a MyndBand EEG headset, EEG monitoring as a means of hit tracking and co-witnessing was researched. The MyndBand, as well as other EEG systems, uses thin and flexible electrodes (e.g., EEG sensors 460) in direct skin contact with areas of the user’s head. The flexible electrodes may be mounted in hats, headbands, Gel Blaster masks, or other wearables. The EEG sensors 460 were connected to a small computing device with a Bluetooth transmitter. The resulting data (e.g., signals from the EEG sensors 460) were read and displayed in real-time on a computer or mobile device. The EEG sensors 460 produced signals that were parsed to indicate brainwave changes associated with being hit by a gellet (e.g., projectile).
[00214] The study utilized a surprise impact methodology such that the participants were not anticipating being hit by a gellet. The participants were instructed to proceed with routine tasks until a tester used a blaster 104 to hit the users with a gellet. The study results showed
a significant change in the EEG sensors 460 signal using different participants upon being hit by a gellet. A subset of the data acquired is shown below in TABLE 2.
TABLE 2
[00215] In a second phase of the study, the participants were again struck by gellets by surprise, but only half of the time. The other half of the time, the participants would experience the tester shooting the gellet near the participant, but not actually hit the participant. The data from the EEG sensors 460 signals reveals that additional significant brain activity occurred only when the gellet actually made contact. A subset of the data acquired is shown below in TABLE 3.
TABLE 3
[00216] In addition, the participants were test shot both to the front of the body and the back of the body to ensure the element of surprise. The EEG sensor 460 signals were recorded in real-time. The results indicated that almost all shots captured at short range over a significant amount of time (e.g., 1-2 minutes) successfully showed significant brain activity.
[00217] At varying ranges and thus, varying intensities, the EEG sensors 460 signals may vary in their intensity. It may be advantageous to utilize the BCI sensor 458 data as a primary or secondary co-witnessing element (e.g., along with the PE sensor 410, the IR sensor 414, or other suitable co-witnessing elements) depending on the range and reliability of the EEG sensors 460 signals at those ranges. The control circuitry or remote computing device can prioritize the BCI sensor 458 data depending on the gameplay environment, ranges, and other factors.
[00218] Still further, a fabric deformation hit detection methodology may be utilized to provide co-witnessing of an impact. “Smart” textile wearables may detect changes in electrical current when deformed from projectile impact. For example, optical fibers or conductive fabrics embedded or woven into a wearable device 412 or other wearable shirt or vest are within the scope of the present invention. As with the above methodologies, the fabric deformation hit detection methodology may be an additional form of co-witnessing a hit, and may also be utilized with ML for noise (false positive) reduction. In other embodiments, the fabric deformation hit detection methodology may be implemented through use of electronic textiles in which microcontrollers, sensors, and actuators may be integrated into wearable items.
[00219] In an embodiment, when one player is closer than a threshold close-quarters distance, the application can turn off gellet (projectile) firing and utilize IR only for hit attribution.
[00220] FIG. 11 is a smart blaster 416 in accordance with a preferred embodiment of the present invention. The smart blaster 416 includes the mechanical and electronic structures
described herein to shoot gellets and project IR radiation. The smart blaster 416 preferably includes a wireless networking device configured to transmit and receive data as a node in a wireless network (e.g., mesh network, as described herein). In other embodiments, other wireless networking configurations may be utilized.
[00221] Hit attribution, in a preferred embodiment, is a combination of hit detection and hit recognition. For example, hit detection includes the ability for a physical object with a PE sensor 410 to detect when the physical object was hit, as described above. In an embodiment, hit detection determines whether an object that strikes at or near the PE sensor 410 is a projectile. Hit detection may also include timer logic between the smart blaster 416 and the target to associate time of impact to distance of the shooter. In addition, triangulation utilizing a mesh network (as more fully described herein) provides higher accuracy in hit detection. These features are used either independently or in combination to verify a correct strike by a correct individual and/or smart blaster 416 associated with the individual.
[00222] Hit recognition, in a preferred embodiment, enables co-witnessing of hit detection such that even greater accuracy in hit attribution may be achieved. For example, the smart target 400 includes embedded infrared (IR) detection sensors 414 that may detect hits from multiple angles. The smart blaster 416 (or other projectile-firing device) includes the ability to generate and project (e.g., emit) IR radiation in a beam 420 from an IR emitter 418. The IR emitter 418 includes a beam angle at which the IR radiation is projected that provides a spread/angle over distance. For example, the range of an IR emitter 418 is 50 feet and the optics allow a spread of about 2 degrees, which preferably provides a beam spread of 2 feet at a distance of 50 ft. In another embodiment, the IR emitter 418 spread may be defined as a wider diameter meant to illuminate at an angle where the top of the beam cone corresponds to a straight trajectory, and the bottom of the beam cone corresponds to a smart blaster 416 pointed towards the sky at a certain angle that makes the
projectiles have a longer range, and thus the IR beam 420 accounts for projectile drop. A combination of these techniques could be used to accommodate different game modes and play styles. In an embodiment, the smart blaster 416 may include a modifiable emitter 418 that includes two beams 420 having different spread angles. A point-blank shot may use a wider beam spread (e.g., 3 degrees) with a larger spread angle, while a longer-range shot may use a narrower beam spread (e.g., 2 degrees) with a smaller spread angle. The emitter 418 may be configured to automatically switch between the different spread angles.
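The stated figures can be checked with simple geometry: treating the 2-degree spread as a full cone angle (an assumption), the beam diameter at 50 feet works out to roughly 1.75 feet, consistent with the approximately 2-foot spread described above.

```python
import math

def beam_diameter_ft(range_ft: float, spread_deg: float) -> float:
    """Beam width at a given range for a cone with the given full spread angle."""
    return 2.0 * range_ft * math.tan(math.radians(spread_deg / 2.0))

print(round(beam_diameter_ft(50.0, 2.0), 2))  # 1.75 ft, i.e., about 2 ft at 50 ft
```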
[00223] IR-based hit detection sensors 414 embedded in the smart target 400, when coupled with smart blasters 416, can maintain bidirectional communication across multiple players and use timer-based co-witnessing to identify, arbitrate, and attribute hits to the corresponding shooter. The smart target technology may be implemented on smart blasters 416 (e.g., emitters 418) and all other objects, including wearables and standalone targets (e.g., receivers 414).
[00224] FIG. 12 is a schematic diagram of a circuit for the smart target co-witnessing hit attribution system in accordance with a preferred embodiment of the present invention. The circuit of FIG. 12 is configured to be utilized in connection with the petal portion 404, the outer petals of the smart target 400 depicted in FIG. 9. The circuit depicted in FIG. 12 includes a microcontroller 422 (e.g., ATTINY404SSMCSOIC), a PE sensor 410, a programming chip (e.g., UPDI programming chip) 424, interfaces 426 (e.g., SD1 and SM04B-SRSS-TB(LF)(SN) interfaces), an array of addressable LEDs 428 (e.g., WS2812s), and an IR sensor 414.
[00225] The microcontroller 422 depicted in FIG. 12 is a Microchip ATtiny404SSMC-type microcontroller. One of ordinary skill in the art would understand that other microcontrollers may be utilized without departing from the present invention. The ATtiny404SSMC microcontroller utilizes high-performance, low-
power AVR® RISC architecture. The microcontroller 422 is capable of running up to 20 MHz, with up to 2/4 KB Flash, 128/256 bytes of SRAM, and 64/128 bytes of EEPROM in a 14-pin package. The microcontroller 422 utilizes a flexible and low-power architecture including Event System and Sleepwalking, accurate analog features, and advanced peripherals. The microcontroller 422 includes PIEZO IN and IR SENSOR for piezo sensor 410 and IR sensor 414 inputs.
[00226] The PE sensor 410 is a piezoelectric sensor configured to detect impacts at or near the physical location of the PE sensor 410. The programming chip 424 is a Microchip Unified Program and Debug Interface (UPDI) configured for external programming and debugging of the device. The interfaces 426 are SD1 and SM04B-SRSS-TB(LF)(SN) interfaces that are configured to receive I2C SCL and SDA signals. The SCL and SDA signals are clock and data signals, respectively. The array of addressable LEDs 428 is configured to drive LEDs or LED strips to light up portions of the petal portions 404. One of ordinary skill in the art would understand that other components may be utilized to achieve the electronic circuit and functionality represented by the circuit of FIG. 12, including a different programming chip 424, interfaces 426, and addressable LEDs 428.
[00227] FIGS. 13-1, 13-2, and 13-3 depict portions of a schematic diagram of a circuit for a smart target co-witnessing hit attribution system in accordance with a preferred embodiment of the present invention. The circuit of FIGS. 13-1, 13-2, 13-3 is configured to be utilized in connection with a "pistil," the center portion 402 of the smart target 400 of FIG. 9. The circuit of FIGS. 13-1, 13-2, and 13-3 includes a microcontroller 430, a switch 432, a USB driver 434, a fuse 436, a USB receptacle 438, an external battery 440, an alternate microcontroller 442, an IR sensor 414, a PE sensor 410, interfaces 444, and an array of addressable LEDs 446. Instead of an ATTINY404SSMCSOIC microcontroller 422, an
ESP32-WROOM-32E microcontroller 430 is shown in FIG. 13-1. The ESP32-WROOM-
32E microcontroller 430 is a Wi-Fi + Bluetooth + Bluetooth LE MCU module that is configured for low-power sensor networks and for voice encoding, music streaming, and MP3 decoding. The ESP32-WROOM-32E includes a PCB antenna. One of ordinary skill in the art would understand that other microcontrollers may be utilized without departing from the present invention. The microcontroller 430 includes PIEZO IN and IR SENSOR inputs for the PE sensor 410 and the IR sensor 414. FIG. 13-1 also depicts the switch 432, configured to switch between the microcontroller 430 and the alternate microcontroller 442, the USB driver 434 configured to provide power to the circuit, a fuse 436 in connection with the power circuitry, the USB receptacle 438 configured to receive a USB power input, and an external battery 440 configured to power the device when not connected to USB power. Designations of A-G indicate connections of the circuit to the components shown on FIGS. 13-2 and 13-3.
[00228] FIG. 13-2 shows an alternate microcontroller 442 (e.g., ESP32-C3-WROOM-02-N4) in place of the previously described microcontroller 430. The ESP32-C3-WROOM-02 microcontroller 442 includes Wi-Fi and Bluetooth LE modules, 4 MB external SPI flash, and an on-board PCB antenna. FIG. 13-2 also depicts the PE sensor 410 and the IR sensor 414 as well as the array of interfaces 444 (as opposed to the single SD1 and SM04B-SRSS-TB(LF)(SN) interfaces).
[00229] The portion of the circuit depicted in FIG. 13-2 also includes a filtering resistor 445 to filter the PE sensor 410 signal before it is input into the microcontroller 442. In an embodiment, the filtering resistor 445 is chosen so that the peak of the detected signal output is set to a certain voltage. If the value of this resistor is changed, the peak voltage shown by the projectile impact will change as well. Filtering to increase detection accuracy may involve including a circuit that allows changing this resistor value in real time to decrease or increase the device sensitivity.
[00230] Other filtering techniques could be implemented in the circuit or in firmware/software. For example, an envelope detector to wrap a gellet waveform may be utilized. A peak detector circuit may also be utilized, in which an output would be provided to the microcontroller as a high signal only when the voltage is above a peak voltage threshold. The output of a "high" signal may also reduce sampling requirements. A bandpass filter may also be utilized to tune the circuit and isolate the waveform signature. Combinations of these filtering techniques could be utilized. The ML-trained logic is utilized to determine the best filtration techniques to fit the particular application (e.g., gellet impact on a standalone target).
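As a non-limiting sketch of the firmware-side techniques above, the following C++ class combines an envelope follower with a peak threshold so that a single "high" event is reported per impact. The threshold and decay constants are assumed values that would be tuned (e.g., by the ML-trained logic) for a given application.

```cpp
#include <cstdint>

// Illustrative firmware-side peak/threshold filter for a piezo (PE) channel.
class PeakDetector {
public:
    PeakDetector(uint16_t threshold, uint16_t decayPerSample)
        : threshold_(threshold), decay_(decayPerSample) {}

    // Feed one raw ADC sample; returns true once per impact whose envelope
    // crosses the threshold (the single "high" signal described above).
    bool feed(uint16_t sample) {
        // Envelope follower: track rises instantly, decay slowly.
        if (sample > envelope_) envelope_ = sample;
        else envelope_ = (envelope_ > decay_) ? envelope_ - decay_ : 0;

        const bool above = envelope_ >= threshold_;
        const bool risingEdge = above && !wasAbove_;
        wasAbove_ = above;
        return risingEdge;
    }

private:
    uint16_t threshold_;
    uint16_t decay_;
    uint16_t envelope_ = 0;
    bool wasAbove_ = false;
};

int main() {
    PeakDetector det(/*threshold=*/600, /*decayPerSample=*/8);
    const uint16_t samples[] = {12, 15, 900, 870, 400, 14, 13};  // synthetic impact
    int hits = 0;
    for (uint16_t s : samples) hits += det.feed(s) ? 1 : 0;
    return (hits == 1) ? 0 : 1;  // exactly one "high" event per impact
}
```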
[00231] Hit attribution technologies embedded in the smart target and associated circuitry, firmware, and software can be expanded to include other input sources such as smartphone microphones (or standalone microphones). For example, gellet acoustic signatures may be identified, inventoried, and differentiated from background noise and artifacts for increased accuracy. The acoustic signatures may be added to the PE sensor 410/IR sensor 414 logic to achieve increased accuracy, or any combination of these hit attribution sources of data may be used separately to provide an appropriate level of accuracy, depending on a variety of factors including complexity of the device, cost, manufacturing difficulty, or other factors. A smartphone could also provide other sources of data to the application, including GPS or other sensor data to be integrated to achieve increased accuracy of hit
attribution. In place of a smartphone, or in combination, a wearable or standalone device could supply the other sensor data to increase accuracy. For example, a standalone device at a known location relative to the impact location may be utilized to determine distance (e.g., via echoes), providing additional sensor data for the application to utilize to increase accuracy.
[00232] FIG. 13-3 shows the array of addressable LEDs 446 (e.g., WS2812s) and transistors 447 coupled to the data control input/output of the USB driver 434.
[00233] In a preferred embodiment, to provide real-time or near-real-time score tracking, it is advantageous to categorize the trained model from ML datasets as a key from a database and efficiently broadcast the key as a hexadecimal identifier over low-latency communications. This "minimal" approach achieves efficient transmission and reception of data to be utilized by the application to track scores and provide scoring data to players in a gaming environment.
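By way of illustration, such a "minimal" broadcast may serialize to only a few bytes. In the following C++ sketch the field layout and names are assumptions; only the hexadecimal-key idea and the 0xAA hit-message identifier (per FIG. 18, described below) come from the disclosure.

```cpp
#include <cstdint>
#include <cstdio>
#include <vector>

// "Minimal" scoring broadcast: an ML classification result is reduced to a
// one-byte database key and sent as a tiny packet over the mesh network.
struct ScorePacket {
    uint8_t messageId;   // e.g., 0xAA for a hit message (per FIG. 18)
    uint8_t modelKey;    // hexadecimal key naming the matched ML signature
    uint8_t playerId;    // assumed field
    uint8_t targetIndex; // assumed field
};

std::vector<uint8_t> encode(const ScorePacket& p) {
    return {p.messageId, p.modelKey, p.playerId, p.targetIndex};
}

int main() {
    ScorePacket p{0xAA, 0x07, 0x01, 0x03};
    for (uint8_t b : encode(p)) printf("%02X ", b);  // AA 07 01 03
    printf("\n");
}
```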
[00234] FIGS. 14-17 depict exemplary smart target 400 mesh networks in accordance with preferred embodiments of the present invention. The target mesh networks depicted in FIGS. 14-17 may be utilized independently or in combination with each other. The target mesh networks of FIGS. 14A and 14B are built around a Bluetooth Low Energy (BLE) topology/protocol. The BLE protocol may utilize mesh networking to implement many-to-many communication. FIG. 14A depicts a basic BLE mesh network including a router 448, phone 450, root 452, and a plurality of nodes 454. The BLE mesh network in connection with an embodiment of the present invention is configured to control multiple Bluetooth-enabled devices from a single root 452. In the network topology represented in FIG. 14A, the root 452 controls all three nodes 454 by use of a publish/subscribe model. The child nodes 454 are subscribed to the parent node 454, and the parent node 454 publishes messages to the child nodes 454.
[00235] FIG. 14B also depicts a basic BLE mesh network with a router 448, a phone 450, a root 452, nodes 454, and relay nodes 456. The relay nodes 456 are utilized to propagate messages to child nodes 454 and move data downstream.
[00236] The target mesh networks of FIGS. 15A, 15B, 15C are built around an ESP-NOW topology/protocol. The ESP-NOW protocol enables multiple devices to communicate with one another without using Wi-Fi, and is similar to the low-power 2.4 GHz wireless connectivity utilized in certain low-power devices. In FIG. 15A, there is shown a Broadcast Mode in which small packet messages are broadcast to all devices from the root 452 to the nodes 454. The message packets include sender ID and payload. In FIG. 15B, there is shown a Direct Message mode in which a direct message is sent from a root 452 to a device (e.g., node 454). In FIG. 15C, there is shown a Game State Broadcast mode in which a game state broadcast is sent from each device (node 454) at a predetermined interval. The current game state payload contains target hit counts and target miss counts. When receiving a Game State Broadcast message, the smart target 400 sets its active state after comparing the incoming message with its current state. The higher counts are then set as the target's active state. The target then sends a Game State Broadcast.
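The Game State Broadcast comparison described above amounts to an element-wise maximum, which makes the merge independent of message arrival order. A minimal C++ sketch, assuming the seven targets of the FIG. 18 protocol:

```cpp
#include <algorithm>
#include <array>
#include <cstddef>
#include <cstdint>

// Per-target hit/miss counters carried in a Game State Broadcast payload.
struct GameState {
    std::array<uint8_t, 7> hitCount{};
    std::array<uint8_t, 7> missCount{};
};

// Merge an incoming broadcast into the local active state: keep the
// higher count per target, as described above.
void mergeGameState(GameState& local, const GameState& incoming) {
    for (std::size_t i = 0; i < local.hitCount.size(); ++i) {
        local.hitCount[i] = std::max(local.hitCount[i], incoming.hitCount[i]);
        local.missCount[i] = std::max(local.missCount[i], incoming.missCount[i]);
    }
}

int main() {
    GameState local, incoming;
    local.hitCount[0] = 2;
    incoming.hitCount[0] = 5;
    incoming.missCount[3] = 1;
    mergeGameState(local, incoming);
    return (local.hitCount[0] == 5 && local.missCount[3] == 1) ? 0 : 1;
}
```

Because the maximum is idempotent, commutative, and associative, repeated or out-of-order broadcasts converge all nodes to the same active state.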
[00237] The target mesh networks of FIGS. 16A, 16B, 16C consist of a self-organizing and self-healing WiFi mesh network. FIG. 16A illustrates a self-organizing mesh network in which a node 454 is configured to connect with other nodes autonomously. In an example, the nodes 454 depicted in FIG. 16A are communicating with each other within the mesh network. The mesh network depicted in FIG. 16A permits multiple devices (i.e., nodes 454) to communicate with each other through a wireless local area network without the need for a router 448. FIG. 16B illustrates a node 454 that is out of range (shown as a shaded node 454), and thus, not connected to the mesh network. When the node 454 comes back into range, communication with the node 454 is reestablished because of the constant
provisioning protocol (as more fully described herein). FIG. 16C shows a connection to a node 454 (shown as a shaded node 454) that was out of range and lost connection, but reestablishes communication with a node 454 when it is within range of that particular node 454. Once communication is reestablished, the formerly out-of-range node 454 is able to transmit and receive data with the mesh network. This methodology is referred to as "self-healing" and can be maintained within a self-organizing mesh network autonomously.
[00238] The target mesh networks of FIGS. 17A, 17B, 17C are WiFi mesh networks. FIG. 17A illustrates a basic WiFi mesh network in which the effective range of a router 448 is extended. Multiple nodes 454 within the network permit data to be transferred across the entire network, even if a particular node 454 may have been out of range of the router 448. The line depicted in the graphic illustrates the range of the router 448, in which only the root 452 is within range; however, the nodes 454 are configured to be connected to the root 452 (and other nodes 454) within the network to maintain connectivity and the exchange of data. FIG. 17B illustrates a mesh network where the router 448 and root 452 are integrated, thereby eliminating the necessity for a separate router 448. FIG. 17C illustrates a mesh network with a backup router 456, configured such that a node 454 of the network is also configured to be a router 448. Thus, if the first router 448 in the network loses functionality, the backup router/node 456 may be utilized to continue to facilitate communication with the nodes 454.
[00239] FIGS. 14-17 illustrate preferred embodiments of the present invention. Preferably, the smart target 400 system will adopt a wireless connection module type that allows for multiple smart targets 400 to be connected to the same network and function as nodes 454 to selectively increase gameplay complexity, as well as expand the network coverage area, thus increasing the gameplay area size. Mesh networking will also uniquely allow for the
low-latency broadcast of messages and commands to all players 116, 118 and devices in the network despite line-of-sight limitations existent in other systems.
[00240] In preferred embodiments, achieving real-time or near-real-time communication via a mesh network may include WiFi without traditional TCP/IP protocol, simplified message IDs, and/or device IDs. For example, simple messaging transmits single packets blindly, or until an acknowledgement is received on its complementary receiver. Data link messaging may establish a connection through simple messaging, and then begin to transmit continuously through an error-corrected protocol.
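A minimal sketch of the simple-messaging behavior, with placeholder callbacks standing in for the actual radio layer:

```cpp
#include <cstddef>
#include <cstdint>
#include <cstdio>
#include <functional>

// Simple messaging: transmit a single packet blindly and retransmit until an
// acknowledgement arrives on the complementary receiver or retries run out.
bool sendWithAck(const uint8_t* data, std::size_t len,
                 const std::function<void(const uint8_t*, std::size_t)>& transmit,
                 const std::function<bool()>& ackReceived,
                 int maxRetries = 5) {
    for (int attempt = 0; attempt <= maxRetries; ++attempt) {
        transmit(data, len);             // fire the packet "blindly"
        if (ackReceived()) return true;  // acknowledgement received
    }
    return false;  // caller may fall back to data-link messaging or drop
}

int main() {
    const uint8_t msg[] = {0xAA, 0x01};
    int airtime = 0;
    bool ok = sendWithAck(msg, sizeof msg,
        [&](const uint8_t*, std::size_t) { ++airtime; },  // stub radio
        [&]() { return airtime >= 3; });                  // ack after 3rd try
    printf("delivered=%d attempts=%d\n", ok, airtime);    // delivered=1 attempts=3
}
```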
[00241] In a preferred embodiment, low-latency communications may be implemented through a unique protocol. For example, the smart target system (and blasters) make use of a unique protocol to transmit information throughout the network with minimal data package size. The reduction of required bandwidth and the usage of high-throughput wireless networks allow the broadcast of information that pertains to gameplay across a number of different factors and controls.
[00242] FIG. 18 is a table of a message protocol 458 in accordance with a preferred embodiment of the present invention. The message protocol 458 depicted in FIG. 18 includes "Dev Mode" 460 and "Game Mode" 462 for smart targets 400 and the application 464. "Dev Mode" 460 is a developer's mode 460 that includes Commands 466 for the smart targets 400 such as Set Target Index 468, Clear All Hit Count 470, Time Between Cycles 472, Cycles Between Hits 474, Start Slice Min Value 476, Minimum Read Value 478, and Hit Read Value 480. Commands for the application 464 include Sensor Hit Count 482, Dev App Settings 484, and Dev Sensor Readings 486. The "Game Mode" 462 is a mode used to facilitate communication between nodes 454 during a gameplay session that includes
Commands 466 for the smart targets 400 such as Reset All Targets 488, Activate Target
490, and Color Test 492. Commands 466 for the application 464 include Hit Message 494,
Miss Message 496, and Active State 498. The Dev Mode 460 is for developers or programmers of the gameplay environment and development of features for gameplay. The Dev Mode 460 is also utilized to develop future physical functions of the smart targets and smart target systems. The Dev Mode 460 is preferably configured to develop ML trained logic for use with the PE sensor 410 or other sensors as described herein.
[00243] The Game Mode 462, as described herein, includes separate Target 400 and App 464 protocols 458. Referring to the Game Mode 462 for the App 464, in a preferred embodiment, once a smart target 400 is hit, the target 400 will broadcast a message with an ID 500/Parameters 502 structure (i.e., the Hit Message Command 494). For example, the hexadecimal ID 500 for a Hit Message Command 494 is 0xAA, identifying the message type. The Parameters 502 portion of the protocol 458 is an array of hexadecimal values (indices) configured to keep track of the hit count 504 for each of the seven targets in the gameplay mode (e.g., Target 1 Hit Count . . . Target 7 Hit Count; Target 1 Miss Count . . . Target 7 Miss Count). The Player ID 506 portion of the Parameters 502 portion of the protocol 458 identifies a player associated with the target hit. The application/client 464 is configured to recognize each ID 500 type of the protocol 458 and configured to receive and recognize a particular number of Parameters 502 and process them accordingly. For example, the Hit Message 494 displays the number of times each designated panel has been hit by gellets on the mobile app. The Miss Message 496 displays the number of times gellets were fired but did not hit the designated (and connected) panel on the mobile app. The Target Miss Count 508 Command 466 is needed for scoring in individual game modes as some games will take into account the accuracy of a user's 116,
118 fired gellets. The Active State 498 shows which of the panels are activated or not. The
App 464 Game Mode 462 therefore is configured to receive and display the Hit Message
494, Miss Message 496, and Active State 498 in accordance with embodiments of the present invention as described herein.
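As a non-limiting sketch, the Hit Message of FIG. 18 might be laid out on the wire as follows. The 0xAA identifier, the per-target hit and miss counts, and the Player ID come from the disclosure; the exact byte ordering is an assumption.

```cpp
#include <array>
#include <cstddef>
#include <cstdint>
#include <optional>

// One plausible 16-byte layout for the Hit Message:
// [0]      ID (0xAA, from the disclosure)
// [1..7]   Target 1..7 hit counts
// [8..14]  Target 1..7 miss counts
// [15]     Player ID
struct HitMessage {
    std::array<uint8_t, 7> hitCounts{};
    std::array<uint8_t, 7> missCounts{};
    uint8_t playerId = 0;
};

constexpr uint8_t kHitMessageId = 0xAA;

std::array<uint8_t, 16> encode(const HitMessage& m) {
    std::array<uint8_t, 16> out{};
    out[0] = kHitMessageId;
    for (std::size_t i = 0; i < 7; ++i) {
        out[1 + i] = m.hitCounts[i];
        out[8 + i] = m.missCounts[i];
    }
    out[15] = m.playerId;
    return out;
}

std::optional<HitMessage> decode(const std::array<uint8_t, 16>& in) {
    if (in[0] != kHitMessageId) return std::nullopt;  // not a Hit Message
    HitMessage m;
    for (std::size_t i = 0; i < 7; ++i) {
        m.hitCounts[i] = in[1 + i];
        m.missCounts[i] = in[8 + i];
    }
    m.playerId = in[15];
    return m;
}

int main() {
    HitMessage m;
    m.hitCounts[2] = 4;  // Target 3 has been hit four times
    m.playerId = 0x01;
    auto back = decode(encode(m));
    return (back && back->hitCounts[2] == 4 && back->playerId == 0x01) ? 0 : 1;
}
```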
[00244] Referring still to FIG. 18, the Target 400 Game Mode 462 includes Reset All Targets 488, Activate Target 490, and Color Test 492 Commands 466. The Reset All Targets 488 Command 466 resets each of the connected panels to the null or a deactivated state. The Activate Target 490 Command 466 activates all of the connected panels to be ready for the game to begin. The Color Test 492 Command 466 lights up the panels in a connected smart target with a custom color from individual Red, Blue, and Green hex values. In a preferred embodiment, the custom colors are determined in accordance with the Target States of FIG. 19.
[00245] The Activate Target 490 Command 466 includes Target State 510 Parameters 502 Target 0 State 510, Target 1 State 510 . . . Target 6 State 510. FIG. 19 is a table of smart target 400 Target States 510 in accordance with a preferred embodiment of the present invention. In accordance with FIG. 19, there are disclosed 256 possible Target States 510, each specified by an ID 512 as a hex value. For example, the "Off" 516 Target State 510 is associated with the ID 512 0x00 hex value. Each ID 512 and Target State 510 are also associated with a Color 514; for example, the ID 512 of 0x00 and Target State 510 "Off" 516 is associated with Color 514 #000000 (i.e., no color). FIG. 19 identifies other Target States 510 (i.e., State column) of Hit 518, Miss 520, BT (i.e., Bluetooth) not connected 522, BT connected 524, Player 1 526, and Player 2 528. Other Target States 510 are possible in accordance with embodiments of the present invention. The Color 514 column of the Target States 510 protocol identifies the hex value (e.g., hex code) of the color, and in parentheses, the actual color associated with the hex value. For example, Magenta 530 is ascribed a hex value of #FF00FF, Green 532 is ascribed a hex value of #00FF00, Red 534 is ascribed a hex value of
#FF0000, Yellow 536 is ascribed a hex value of #FFFF00, Cyan 538 is ascribed a hex
value of #00FFFF, and Blue 540 is ascribed a hex value of #0000FF. For example, each hex value indicates amounts of Red/Green/Blue. The hex value #FF00FF specifies an RGB value of R=255, G=0, and B=255. In an embodiment, the Color Test 492 Command 466 is a way to test specific colors by providing the red, green, and blue hex values. The Target States 510 in FIG. 19 have a predefined color assignment and only give information about the current state of each panel. The Target State 510 will tell the app 464 whether the panel has received a miss or a hit; it will also communicate the Bluetooth connection state of the smart target 400 device as a whole, among other states.
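The hex-to-RGB decomposition described above is a byte-wise split, sketched here using the Magenta example from FIG. 19:

```cpp
#include <cstdint>
#include <cstdio>

// Decompose a #RRGGBB hex value into its red/green/blue bytes.
void splitRgb(uint32_t hex, uint8_t& r, uint8_t& g, uint8_t& b) {
    r = (hex >> 16) & 0xFF;
    g = (hex >> 8) & 0xFF;
    b = hex & 0xFF;
}

int main() {
    uint8_t r, g, b;
    splitRgb(0xFF00FF, r, g, b);          // Magenta from FIG. 19
    printf("R=%u G=%u B=%u\n", r, g, b);  // R=255 G=0 B=255
}
```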
[00246] The Parameters 502 portion of the App 464 Game Mode 462 for the Activate Target 490 Command 466 includes Target 1 State 510 . . . Target 7 State 510, and Player ID 506. The Target 1 State 510 . . . Target 7 State 510 identifies the current Target State 510 in accordance with FIG. 19. The Player ID 506 is the identifier for each connected smart blaster 416.
[00247] Still referring to FIG. 19, there is illustrated the target states 510 of a smart target 400 in accordance with a preferred embodiment of the present invention. In an embodiment, the 'Off' 516 state represents that the panel is turned off and not detecting impacts. The 'Hit' 518 state is communicated when the panel is hit by a gellet. The 'Miss' 520 state is communicated when the panel is missed by a gellet (by time out or by hitting another panel). The 'BT not connected' state 522 is communicated when the smart target 400 is not connected to a user's phone with Bluetooth. The 'BT connected' state 524 is communicated when the smart target 400 is connected to a user's phone with Bluetooth. The 'Player 1' state 526 is communicated when the smart target 400 is connected to a user's phone and the user is denoted as Player 1 in the mobile app. The 'Player 2' state 528 is communicated when the smart target is connected to a user's phone and the user is denoted as Player 2 in the mobile app.
[00248] FIG. 20 is a table of a message protocol 542 in accordance with a preferred embodiment of the present invention. FIG. 20 depicts a message protocol 542 that is an alternative embodiment to the message protocol 458 of FIG. 18. In FIG. 20, the "Game Mode" 462 is a mode used to facilitate communication between nodes 454 during a gameplay session that includes Commands 466 for the smart targets 400 such as Reset All Targets 488 and Activate Target 490, similar to those same commands depicted and described in connection with FIG. 18. Commands 466 for the application 464 include Hit Message 494, Miss Message 496, and Active State 498. The Commands 466, ID 500, and Parameters 502 portions of the message protocol 542 generally function similarly to the same portions of the message protocol 458, as further described herein. Notably, however, the Parameters 502 portion of the message protocol 542 uses the nomenclature "Pad" instead of "Target." However, the nomenclature "Pad" can be substituted for the term "Target" as the "Pad" is a portion or area of the smart target 400 such as the pistil portion 402 and the petal portions 404.
[00249] In addition, the message protocol 542 includes additional Commands 466 relating to the Target 400 portion of the protocol 542 such as Deactivate Target 544, Activate Pad 546, Activate Hidden Pad 548, Set All Color 550, Set Color State 552, Set Device Name 554, and Set Device Mode 556. The Deactivate Target 544 Command 466 is configured to deactivate the smart target 400, conceivably in a scenario in which, according to the gameplay configuration, the target has been taken out by gellet hits and is no longer active. The Activate Pad 546 Command 466 is configured to activate a portion or area of the smart target 400, such as only the pistil portion 402 or one of the petal portions 404. The Activate
Hidden Pad 548 Command 466 is configured to activate a pad (e.g., pistil or petal portions 402, 404) that has previously been hidden in accordance with the gameplay configuration by the software application. The Set All Color 550 Command 466 is configured to set the
color of each of the pads to a specified RGB hex value. The Set Color State 552 is configured to set the color of specified pads (e.g., Pad 0 State) to a particular color. The Set Device Name 554 Command 466 is configured to specify a particularized name for a device (e.g., node 454 or smart target 400) using an array of hex values. The Set Device Mode 556 Command 466 is configured to set a Device Target Mode 560 in accordance with the Target Modes 560 depicted in the table of FIG. 21B.
[00250] The message protocol 542 includes the additional Command 466 relating to the App 464 portion of the protocol 542 of Display State 558. The Display State 558 Command 466 is configured to display the particular Target State 510. In an embodiment, the Target State 510 is displayed on the application 464 to the user 116, 118.
[00251] FIG. 21A is a table of smart target 400 Target States 510 in accordance with a preferred embodiment of the present invention. In accordance with FIG. 21A, there are disclosed 256 possible Target States 510, each specified by an ID 512 as a hex value. For example, the "Off" 516 Target State 510 is associated with the ID 512 0x00 hex value. Each ID 512 and Target State 510 are also associated with a Color 514; for example, the ID 512 of 0x00 and Target State 510 "Off" 516 is associated with Color 514 #000000 (i.e., no color). FIG. 21A identifies other Target States 510 (i.e., State column) of Hit 518, Miss 520, BT (i.e., Bluetooth) not connected 522, BT connected 524, Player 1 526, and Player 2 528. Other Target States 510 are disclosed as "(Active)" states. The Color 514 column of the Target States 510 protocol identifies the hex value (e.g., hex code) of the color, and in parentheses, the actual color associated with the hex value. Each of the colors depicted in FIG. 19 is repeated in FIG. 21A. In addition, Orange 562 is ascribed a hex value of #FF4000, Light Green 564 is ascribed a hex value of #C9FF00, and Teal 566 is ascribed a hex value of #00FF2A.
[00252] FIG. 21B is a table of smart target 400 Target Modes 560 in accordance with a preferred embodiment of the present invention. The Target Modes 560 are configured to operate the colors of the smart targets 400 in accordance with a prescribed animation or gameplay configuration. For example, an ID 568 and State 570 indicate the specific hex value and associated information about the state of the Target Mode 560. A Rainbow Animation 572 is configured to display a rainbow display of color on the smart target 400. A Flipper Game 574, an Eliminator Game 576, and an IR Dev Game 578 are all configured to operate colors and timing of the colors in connection with a particular gameplay configuration. One of ordinary skill in the art would understand that a variety of animations and gameplay configurations are within the scope of the present invention and those depicted in FIG. 21B are exemplary.
[00253] FIGS. 22A and 22B depict a flow diagram of a smart target co-witnessing hit attribution method 600 in accordance with a preferred embodiment of the present invention. At Step 602, a first sensor signal from a first sensor is received in response to an impact at or near a first smart target location. For example, the first sensor is the PE sensor 410. In other embodiments, the first sensor is an IR sensor 414, an EEG sensor 460 (or BCI sensor 458), a computer-vision sensor, a radio-frequency sensor, or a smart textile sensor. While the first sensor is specified to be a single sensor, other embodiments provide multiple PE sensors 410, multiple IR sensors 414, or combinations of the above-referenced sensors. The first sensor signal is triggered to be transmitted to and received by control circuitry preferably integrated with the first sensor upon sensing an impact. As disclosed herein, a PE sensor 410 generates an electrical charge or signal that corresponds to the physical act that it sensed, and provides the signal for processing. The impact may be the impact of any object, including any type of projectile. In an embodiment, it is determined whether the impact occurs at or near a predetermined distance (e.g., an area) from a predetermined
location (e.g., the first smart target location). The predetermined distance may be set such that impacts only on the first smart target location (e.g., a pistil portion 402 or petal portion 404) are provided to the control circuitry. In another embodiment, the predetermined distance may encompass a larger area than the first smart target location. In a case in which the first sensor senses an impact on a second smart target location, that data may be utilized to confirm impact on the second smart target location.
[00254] The control circuitry may be configured to determine that the first sensor signal corresponds to a successful impact. For example, the PE sensor 410 may be configured through circuitry (or otherwise) to provide only signals beyond a peak threshold to the control circuitry. In an embodiment, the control circuitry is integrated with a peak detector and threshold. Alternatively, the first sensor may simply provide all signals in response to an impact at or near the first smart target location. In this embodiment, other methods such as software or firmware manipulation or filtration may be provided to assure that the first sensor signal is an appropriate detection of a hit in association with the first smart target location. Still further, filtration techniques as described herein may be utilized to isolate and assist the control circuitry in determining that the impact was a projectile associated with the gameplay configuration and environment rather than another type of impact. For example, the projectile is a gellet.
[00255] The first smart target location preferably is a pistil portion 402 or petal portion 404. The PE sensor 410 alone, or in combination with control circuitry, may be configured to sense more than one smart target 400 location. For example, a single first sensor may be utilized to sense impacts of projectiles at two locations on the smart target 400. The first smart target location may also be a smart target 400 located on the wearable device 412.
[00256] At Step 604, a second sensor signal from a second sensor is received in response to an impact at or near the first smart target location. Step 604 is configured to co-witness the
same impact event as that identified in connection with Step 602. As with Step 602, the second sensor may be any of a variety of sensors including a PE sensor 410, an IR sensor 414, an EEG sensor 460 (or BCI sensor 458), a computer-vision sensor, a radio-frequency sensor, or a smart textile sensor. Preferably, if the first sensor is a PE sensor 410 then the second sensor is an IR sensor 414, or another type of sensor as described above. However, in an embodiment the first and second sensors are both PE sensors 410.
[00257] As with Step 602, at Step 604, the second sensor signal is generated in response to an impact at or near the first smart target location. If the second sensor signal is an IR sensor 414 signal, and the first sensor signal is a PE sensor 410 signal, then the IR sensor 414 may be configured to provide an IR signal should it receive radiation at a particular wavelength. For example, the wavelength is 940 nanometers. Other wavelengths are suitable for use with an IR sensor 414. The IR sensor 414 in an embodiment does not utilize infrared radiation, but rather, utilizes frequencies and wavelengths outside of the infrared band. One of ordinary skill in the art would understand that while certain wavelengths are theoretically possible, one should provide safe solutions that prevent dangerous situations for a user's eyes and other body parts sensitive to certain types of radiation.
[00258] At Step 606, the first sensor signal and the second sensor signal are determined to have been received within a predetermined time period. In an embodiment, the first and second sensor signals are presumed to arrive almost instantly from their respective sensors. One of ordinary skill in the art would understand how to provide the first and second sensor signals to control circuitry, or the like, to ensure that the time between sensing the impact and the time the signals are received is the same or substantially the same between the sensors. The predetermined time period preferably is set to be sufficiently small (e.g., fractions of seconds/milliseconds) to ensure that the signals were generated at or about the same time. This is important to ensure that the first and second sensors co-witnessed the
same impact at the same time. For example, if the first sensor and the second sensor are the same (e.g., two IR sensors 414), then the signals would typically be generated at around the same time. However, if the first sensor is the PE sensor 410 and the second sensor is the IR sensor 414, then the predetermined time period would necessarily be longer due to the time the projectile takes to impact a smart target 400. Preferably, a predetermined time period depends upon several factors including circuitry delay, the relative delays caused by the circuitry for different types of sensors, the time the projectile takes to impact a target, and the projectile velocity of the device 416, all of which are within the scope of the present invention.
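A minimal sketch of the co-witness timing check, assuming millisecond timestamps supplied by the control circuitry; the window values are placeholders consistent with the guidance above and with the flight-time example discussed below in connection with FIG. 22B:

```cpp
#include <cstdint>
#include <cstdlib>

// Two sensor signals co-witness the same impact only if their receipt times
// fall within a predetermined window. A PE+IR pairing uses a longer window,
// since the projectile lags the IR beam.
bool coWitnessed(uint32_t firstMs, uint32_t secondMs, uint32_t windowMs) {
    const uint32_t delta = (firstMs > secondMs) ? firstMs - secondMs
                                                : secondMs - firstMs;
    return delta <= windowMs;
}

int main() {
    // Same sensor type (e.g., two IR sensors): near-simultaneous window.
    bool sameType = coWitnessed(1000, 1004, 10);
    // PE + IR pairing: allow for projectile time of flight (~147 ms example).
    bool mixed = coWitnessed(1000, 1140, 160);
    return (sameType && mixed) ? EXIT_SUCCESS : EXIT_FAILURE;
}
```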
[00259] At Step 608, a smart target update is provided indicating a successful hit in connection with the first smart target location of the smart target 400. For example, the smart target update is configured to identify the target state of each smart target location of the smart target. Preferably, utilization of the target mesh networks and protocols described herein ensure sufficiently fast communication to ensure real-time or near-real-time updates to the smart target states.
[00260] At Step 610, a third sensor signal from a third sensor is received in response to an impact at or near a second smart target location. For example, the third sensor is the PE sensor 410. In other embodiments, the third sensor is an IR sensor 414, an EEG sensor 460 (or BCI sensor 458), a computer-vision sensor, a radio-frequency sensor, or a smart textile sensor. While the third sensor is specified to be a single sensor, other embodiments provide multiple PE sensors 410, multiple IR sensors 414, or combinations of the above-referenced sensors. The third sensor signal is triggered to be transmitted to and received by control circuitry preferably integrated with the third sensor upon sensing an impact. As disclosed herein, a PE sensor 410 generates an electrical charge or signal that corresponds
to the physical act that it sensed, and provides the signal for processing. The impact may be the impact of any object, including any type of projectile.
[00261] The control circuitry may be configured to determine that the third sensor signal corresponds to a successful impact of a projectile. For example, the PE sensor 410 may be configured through circuitry (or otherwise) to provide only signals beyond a peak threshold to the control circuitry. In an embodiment, the control circuitry is integrated with a peak detector and threshold. Alternatively, the third sensor may simply provide all signals in response to an impact at or near the second smart target location. In this embodiment, other methods such as software or firmware manipulation or filtration may be provided to assure that the third sensor signal is an appropriate detection of a hit in association with the second smart target location. Still further, filtration techniques as described herein may be utilized to isolate and assist the control circuitry in determining that the impact was a projectile associated with the gameplay configuration and environment rather than another type of impact. For example, the projectile is a gellet.
[00262] The second smart target location preferably is a pistil portion 402 or petal portion 404. The PE sensor 410 alone, or in combination with control circuitry, may be configured to sense more than one smart target 400 location. For example, a single third sensor may be utilized to sense impacts of projectiles at two locations on the smart target 400. The second smart target location may also be a smart target 400 located on the wearable device 412.
[00263] At Step 612, a fourth sensor signal from a fourth sensor is received in response to an impact at or near the second smart target location. Step 612 is configured to co-witness the same impact event as that identified in connection with Step 610. As with Step 610, the fourth sensor may be any of a variety of sensors including a PE sensor 410, an IR sensor 414, an EEG sensor 460 (or BCI sensor 458), a computer-vision sensor, a radio-frequency sensor, or a smart textile sensor. Preferably, if the third sensor is a PE sensor 410 then the
fourth sensor is an IR sensor 414, or another type of sensor as described above. However, in an embodiment the third and fourth sensors are both PE sensors 410.
[00264] As with Step 610, at Step 612, the fourth sensor signal is generated in response to an impact at or near the second smart target location. If the fourth sensor signal is an IR sensor 414 signal, and the third sensor signal is a PE sensor 410 signal, then the IR sensor 414 may be configured to provide an IR signal should it receive radiation at a particular wavelength. For example, the wavelength is 940 nanometers. Other wavelengths are suitable for use with an IR sensor 414. The IR sensor 414 in an embodiment does not utilize infrared radiation, but rather, utilizes frequencies and wavelengths outside of the infrared band. One of ordinary skill in the art would understand that while certain wavelengths are theoretically possible, one should provide safe solutions that prevent dangerous situations for a user's eyes and other body parts sensitive to certain types of radiation.
[00265] The second smart target location is preferably distinct from the first smart target location, but located on the same smart target 400. However, in other embodiments, the first and second smart target locations are located on different smart targets 400. The first through fourth sensors and control circuitry are preferably configured such that a single impact at or near both the first and second smart target locations will generate signals in which either the first and second sensor signals confirm an impact or the third and fourth sensors confirm an impact, but not both. Stated otherwise, the sensors should be sophisticated enough so that impacts of projectiles on first and second smart target locations may be distinguished.
[00266] At Step 614, the third sensor signal and the fourth sensor signal are determined to have been received within a predetermined time period. In an embodiment, the third and fourth sensor signals are presumed to arrive almost instantly from their respective third and fourth sensors. One of ordinary skill in the art would understand how to provide the third and
fourth sensor signals to control circuitry, or the like, to ensure that the time between sensing the impact and the time the signals are received is the same between the sensors. The predetermined time period preferably is sufficiently small (e.g., fractions of seconds/milliseconds) to ensure that the signals were generated at or about the same time. This is important to ensure that the third and fourth sensors co-witnessed the same impact at the same time. For example, if the third sensor and the fourth sensor are the same (e.g., two IR sensors 414), then the signals would typically be generated at around the same time. However, if the third sensor is the PE sensor 410 and the fourth sensor is the IR sensor 414, then the predetermined time period would necessarily be longer due to the time the projectile takes to impact a smart target 400. Preferably, a predetermined time period depends upon several factors including circuitry delay, the relative delays caused by the circuitry for different types of sensors, the time the projectile takes to impact a target, and the projectile velocity of the device 416, all of which are within the scope of the present invention.
[00267] At Step 616, a smart target update is provided indicating a successful hit in connection with the second smart target location of the smart target 400. For example, the smart target update is configured to identify the target state of each smart target location of the smart target. Preferably, utilization of the target mesh networks and protocols described herein ensure sufficiently fast communication to ensure real-time or near-real-time updates to the smart target states.
[00268] At Step 618, multiple smart target updates are received in real-time or near-real-time. In an embodiment, the smart target updates are provided in Steps 608 and 616 by control circuitry in connection with a wireless network in accordance with one or more protocols.
For example, the target mesh networks disclosed herein and the protocols identified as communicating via hexadecimal values may be utilized to provide or transmit the smart target updates. Preferably, the smart target updates are target state updates in accordance
with the protocols disclosed in FIGS. 18-21B. The smart target updates are configured to be received by an extended-reality gaming application. The extended-reality gaming application is, for example, the application 464.
[00269] FIG. 22B is a second portion of a flow diagram of a smart target co-witnessing hit attribution method 600 in accordance with a preferred embodiment of the present invention. FIG. 22B shows the continuation of FIG. 22A, indicating "A" from Step 618. At Step 620, projectile-firing device data is received, the data including orientation and location of the device 416, and the time of firing of the device 416. In an embodiment, the device 416 includes a smart target 400. The device 416 preferably includes an IR emitter 418 that emits IR radiation in a beam 420. The IR emitter 418 module or other components in the device 416 may supply the orientation and location of the device 416.
[00270] At Step 622, the time of projectile impact and the time of firing the device 416 are determined to be within a predetermined time period. Depending on the speed of the network, it is possible to determine at Step 622 whether the particular smart target update and the projectile-firing device data were received within a predetermined time period, particularly where a target mesh network is utilized in connection with the protocols of FIGS. 18-22B. In other embodiments, a time stamp may be provided along with the first through fourth sensor signals, or a time stamp may be generated by control circuitry and packetized along with the smart target updates. Still further, in other embodiments, a sensor and network latency may be calculated and predetermined such that receipt of the smart target update and the projectile-firing device data includes a predetermined time delay from which it is determined whether the time of projectile impact and the time of firing occurred within the predetermined time period. The predetermined time period preferably is sufficiently small (e.g., fractions of seconds/milliseconds) to ensure that the co-witnessing of the impact and the attribution of the hit to the particular device 416 can be
correlated. It may be advantageous to determine the predetermined time period based on a relatively immediate time window between emission of IR radiation from the IR emitter 418 and detection by the IR sensor 414. In other embodiments, it may also be advantageous to calculate an expected time a projectile will travel over the distance from the device 416 to the smart target 400 and determine the predetermined time period based on the time window between firing of the projectile and the detection of an impact of an object by the PE sensor 410. For example, with a known distance to the target (e.g., 25 feet) and a known velocity (e.g., 170 FPS), the time to target for the projectile is approximately 147 milliseconds. The predetermined period of time therefore can be set to 147 milliseconds or greater to provide a margin of error (e.g., 150 ms or 160 ms). A timer can be set to expire at the predetermined period of time and a successful hit attributed both to the device 416 and the portion of the smart target 400 hit by the projectile if the time of impact is within the predetermined period of time. One of ordinary skill in the art would understand that a variety of methodologies to determine the predetermined time period are within the scope of the present invention.
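The worked example above reduces to a one-line time-of-flight computation; the margin added below is an assumed figure:

```cpp
#include <cstdio>

// Worked example from the text: 25 ft at 170 FPS gives ~147 ms of flight,
// so the predetermined period can be set to that value plus a margin
// (the text suggests, e.g., 150 ms or 160 ms).
int main() {
    const double distanceFt = 25.0;
    const double velocityFps = 170.0;
    const double flightMs = distanceFt / velocityFps * 1000.0;  // ~147.06 ms
    const double marginMs = 10.0;                               // assumed margin
    printf("time to target: %.1f ms\n", flightMs);
    printf("timer with margin: %.0f ms\n", flightMs + marginMs);
}
```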
[00271] A variety of methods are within the scope of the present invention to determine shooter identity. In an embodiment, impact detection by the PE sensor 410 is turned off, or in "sleep mode," and the portion of the smart target 400 in question is set to only actively detect IR radiation. When the IR radiation is received, a timer is started on the processor associated with the smart target 400 circuitry and the impact detection is activated. The timer is permitted to run for a predetermined period of time, usually less than a few seconds, with the most optimal time determined by the physical distance between the device 416 and the PE sensor 410. If the timer runs over the limit set in the code, the impact detection is turned off and the IR message received can be counted as a "miss." If
the PE sensor 410 detects an impact of a projectile, then the IR message is correlated with the PE sensor 410 detection and counted as a successful “hit.”
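The IR-first arbitration just described maps naturally onto a small state machine. The following C++ sketch assumes a millisecond tick supplied by the target's processor; names and values are illustrative:

```cpp
#include <cstdint>

// IR-first arbitration: the PE channel sleeps until IR radiation is
// received, then a timer window opens during which a PE impact is scored
// as a hit; expiry of the timer scores a miss.
class HitArbiter {
public:
    enum class Result { Pending, Hit, Miss };

    void onIrReceived(uint32_t nowMs, uint32_t windowMs) {
        armed_ = true;                 // wake impact detection
        deadlineMs_ = nowMs + windowMs;
    }

    Result onTick(uint32_t nowMs, bool peImpactDetected) {
        if (!armed_) return Result::Pending;
        if (peImpactDetected) {        // IR message correlated with impact
            armed_ = false;
            return Result::Hit;
        }
        if (nowMs >= deadlineMs_) {    // timer ran over the coded limit
            armed_ = false;
            return Result::Miss;
        }
        return Result::Pending;
    }

private:
    bool armed_ = false;
    uint32_t deadlineMs_ = 0;
};

int main() {
    HitArbiter arb;
    arb.onIrReceived(/*nowMs=*/1000, /*windowMs=*/160);  // IR beam detected
    // Projectile lands 147 ms later and is sensed by the PE channel.
    return (arb.onTick(1147, true) == HitArbiter::Result::Hit) ? 0 : 1;
}
```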
[00272] In another embodiment, to time the IR message reception with the impact detection, one or more separate sensors are provided to accurately measure distance in real-time or near-real-time. For example, a LIDAR sensor or laser-based sensor capable of measuring distances of up to 50 feet (e.g., a device 416 effective range) is provided. This distance measurement can be used by itself, or in combination with a photogate-based chronograph integrated into the device 416. In a preferred embodiment, the barrel 129 is molded and/or integrated in a material invisible at the infrared wavelength, with two pairs each consisting of an infrared LED pointed across the barrel at a phototransistor sensitive to that wavelength. Each pair triggers as a projectile passes through it; a timer is started when the first photogate is triggered and stopped when the second photogate is triggered. This permits individual projectile muzzle velocity measurement, from which trajectory characteristics such as horizontal velocity can be calculated; together with the distance measurement, this produces an estimate of the time that the gellet is in flight. This estimate can be used to fine-tune the IR detection vs. impact detection time window (e.g., the predetermined time period).
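The chronograph arithmetic above is sketched below. The gate spacing and trigger times are assumed figures chosen to reproduce a muzzle velocity of approximately 170 FPS, and the target distance stands in for a LIDAR reading:

```cpp
#include <cstdio>

// Photogate chronograph: two gates a known distance apart timestamp the
// projectile; muzzle velocity and an in-flight time estimate follow.
int main() {
    const double gateSpacingFt = 0.10;            // assumed spacing in the barrel
    const double t1s = 0.000000, t2s = 0.000588;  // gate trigger times (s)
    const double muzzleFps = gateSpacingFt / (t2s - t1s);  // ~170 FPS
    const double targetDistanceFt = 25.0;         // e.g., from a LIDAR reading
    const double flightMs = targetDistanceFt / muzzleFps * 1000.0;  // ~147 ms
    printf("muzzle velocity: %.0f FPS, est. flight: %.0f ms\n", muzzleFps, flightMs);
}
```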
[00273] In another embodiment, in order to identify individual shooters aiming at the same target, a variety of these techniques can be applied. In one embodiment, a hard-coded time threshold can serve as the limiting factor in identifying which shooter achieved the first impact, where two shooters positioned at different distances from the target will have their projectiles arrive at different times, with one arriving after the target’s threshold expires, thus not being counted.
[00274] In yet another embodiment, if both (or more) shooters are aiming from the same distance at the same target, the target mesh network can relay notices of multiple IR messages (e.g.,
IR sensor 414 detections) from different devices 416 being received at the same location on the smart target 400 back to the same devices 416 and any other “referee” devices in the target mesh network. A central processor handling hit attribution would then digitally stagger the projectile fire from these devices 416, to give the timing algorithm additional gaps between shots and IR messages across the devices 416, thus increasing shooter identification accuracy.
[00275] In another preferred embodiment, the devices 416 can be outfitted with location-tracking sensors such as ultra-wideband (UWB) or Bluetooth Low Energy (BLE) that can selectively turn on or off the IR sensors physically positioned around the smart target 400 or wearable device 412, correlating the localization sensor data with the IR sensor directional data, further increasing accuracy in detecting the correct shooter.
[00276] At Step 624, the device 416 is determined to be within a predetermined range of the first smart target location. Given that the projectile-firing device data includes the location of the device 416, a calculation of distance between the device 416 and the first smart target location can be made. The location may be fixed in an arena environment or may be dynamic where the location is on a wearable device 412. The target mesh network may be utilized to provide triangulation between different nodes 454 so that a location may be determined relative to other nodes within the network. Alternative methodologies of location services may be utilized, such as GPS tracking or the like. The predetermined range may be preset based on the battlefield environment and gaming characteristics. For example, the predetermined range may be 50 feet, 40 feet, 25 feet, 10 feet, or other ranges depending upon the range of the device 416 and the distances of the smart targets 400. The predetermined range may also be determined based on a calculation of time between a time the device 416 is shot and a time the smart target 400 (or a location on the smart target 400) is impacted by a projectile. Using a known projectile velocity of the device 416 (e.g., 90 FPS), a
calculation of the distance to the target may be made. Once the distance is known, a predetermined range may be the distance plus an additional amount of range (e.g., as a margin of error). Other methodologies to determine distance (and therefore, predetermined range) are disclosed herein, such as using a LIDAR sensor. One of ordinary skill in the art would understand that a variety of methodologies to predetermine the range are within the scope of the present invention.
[00277] At Step 626, the device 416 is determined to be oriented towards the first smart target location. Using the orientation information from the projectile-firing device data, and the relative location of the first smart target location, the application 464 determines if a vector based on the orientation of the device 416 is aligned with a predetermined area of the first smart target location. For example, IR detection of the device 416 IR emitter 418 may occur at the first smart target location if the smart target 400 includes an IR sensor 414. In that case, the IR emitter 418 emits radiation in a beam 420 that spreads over distance, as more fully described herein. Thus, if the beam 420 is detected by the IR sensor 414 of the smart target 400 at the first smart target location, then a match between the particular device 416 and the first smart target location of the smart target 400 is achieved. The predetermined area may be 2 feet, as the typical range of an IR emitter 418 is 50 feet and the optics allow a spread of about 2 degrees, which preferably provides a beam spread of 2 feet at a distance of 50 ft. The predetermined area may also be larger or smaller depending on the beam spread of the IR emitter 418. For example, a wider beam spread would typically mean a larger predetermined area while a narrower beam spread would typically mean a smaller predetermined area.
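Step 626's orientation determination can be expressed as a cone test between the aim vector of the device 416 and the device-to-target vector. A minimal sketch with illustrative vectors, using the approximately 2-degree spread figure from above:

```cpp
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

double dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }
double norm(Vec3 a) { return std::sqrt(dot(a, a)); }

// True when the angle between the aim vector and the device-to-target
// vector is within the emitter's half-angle (i.e., inside the beam cone).
bool aimedAtTarget(Vec3 aim, Vec3 deviceToTarget, double spreadDeg) {
    const double kPi = 3.14159265358979323846;
    const double cosHalf = std::cos((spreadDeg / 2.0) * kPi / 180.0);
    const double cosAngle = dot(aim, deviceToTarget) / (norm(aim) * norm(deviceToTarget));
    return cosAngle >= cosHalf;
}

int main() {
    Vec3 aim{1.0, 0.0, 0.0};
    Vec3 toTarget{50.0, 0.5, 0.0};  // ~0.57 degrees off the aim axis
    printf("%s\n", aimedAtTarget(aim, toTarget, 2.0) ? "aligned" : "not aligned");
}
```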
[00278] At Step 628, gaming metric data is updated to indicate a successful hit by the device 416 on the first smart target location of the smart target 400. The updated gaming metric data preferably is configured to be usable by the application 464. Steps 630, 632,
and 634 are substantively similar to Steps 624, 626, and 628, except that instead of the first smart target location a second smart target location is an object of the determinations. It should be understood that the same device 416 may be utilized to hit both the first smart target location and the second smart target location. However, preferably the same projectile may not be determined to hit both the first and second smart target locations simultaneously. If different projectiles hit the first and second smart target locations, respectively, then those impacts will have occurred at slightly different times, thereby matching the correct projectile impact with the correct smart target location. One of skill in the art would understand that only Step 622 may be utilized to match projectile impacts with a particular smart target location. However, when Step 622 is utilized also with Steps 624, 626, 628 or Steps 630, 632, 634, then higher accuracy may be accomplished. One of ordinary skill in the art would also understand that a variety of other sensors (e.g., three or more) may be provided to achieve higher accuracy. It is also the case that certain sensors may be better suited for particular ranges, gaming configurations, or devices. For example, it may be advantageous to use the BCI sensor 458 only at certain ranges to ensure that sufficient impacts occur on the user's 116, 118 body. In addition, at low-impact gaming configurations in which the barrel tip 131 is in a reduced velocity mode, it may be advantageous to utilize sensor types other than the BCI sensor 458. In addition, it may be advantageous to co-witness a first impact with an IR sensor 414, since certain gaming modes may automatically turn off the ability to fire a projectile at a distance closer than 5 feet from any users 116, 118, for example.
[00279] At Step 636, a player state is updated to attribute the hit to a user associated with the device 416. The application 464 is configured to update the gaming metric data and display the data to a user (and on particular smart targets 400). For example, the hit may increment a hit count by 1 unit; in other embodiments, as described herein, the hit may count for a
particular amount of hit points. The application 464 is configured to take the angle of impact and range into account, as well as the gaming configuration and "ammo" type used, to calculate and attribute a certain amount of hit points to the user.
[00280] At Step 638, a smart target state is updated to attribute the hit against an appropriate target location of the smart target 400. The application 464 is configured to update the gaming metric data and display the data to a user (and on particular smart targets 400). For example, the hit may increment a hit count against the first and/or second smart target locations associated with a smart target 400. As above with Step 636, in Step 638, the application 464 is configured to take the angle of impact and range into account, as well as the gaming configuration and "ammo" type used, to calculate and attribute a certain amount of hit points against a particular location on a target. As described herein, the locations may be pistil portions 402, petal portions 404, or standalone smart targets 400 affixed to the wearable device 412. Other embodiments with smart targets 400 affixed or located on other battlefield objects are within the scope of the present invention.
[00281] In addition to the foregoing, an IR sensor 414 is configured to provide an IR signal that represents a particular angle of detection and an intensity of the infrared radiation. Based on the angle of detection and the intensity of the infrared radiation, the method 600 may further determine whether the device 416 is within a predetermined margin of error with respect to the location of the first smart target location and the orientation of the device 416. Stated otherwise, the IR sensor 414 may provide the pertinent information correlating the first smart target location and the device 416 to increase the accuracy of the hit attribution between a user and a smart target 400. In addition, it may be possible to eliminate a need to obtain orientation and location information from the device 416 each time it is fired, so long as the angle of detection and intensity information is sufficient to
gauge the orientation and location of the device 416 relative to the first smart target location.
[00282] In a preferred embodiment of the present invention, functionality is implemented as software executing on a server that is in connection, via a network, with other portions of the system, including databases and external services. The server comprises a computer device capable of receiving input commands, processing data, and outputting the results for the user. Preferably, the server consists of RAM (memory), a hard disk, a network interface, and a central processing unit (CPU). It will be understood and appreciated by those of skill in the art that the server could be replaced with, or augmented by, any number of other computer device types or processing units, including but not limited to a desktop computer, laptop computer, mobile or tablet device, or the like. Similarly, the hard disk could be replaced with any number of computer storage devices, including flash drives, removable media storage devices (CDs, DVDs, etc.), or the like.
[00283] The network can consist of any network type, including but not limited to a local area network (LAN), wide area network (WAN), and/or the internet. The server can consist of any computing device or combination thereof, including but not limited to the computing devices described herein, such as a desktop computer, laptop computer, mobile or tablet device, as well as storage devices that may be connected to the network, such as hard drives, flash drives, removable media storage devices, or the like.
[00284] The storage devices (e.g., hard disk, another server, a NAS, or other devices known to persons of ordinary skill in the art) are intended to be nonvolatile, computer-readable storage media to provide storage of computer-executable instructions, data structures, program modules, and other data for the mobile app, which are executed by the CPU/processor (or the corresponding processor of such other components). The various components of the present invention are stored or recorded on a hard disk or other like storage devices
described above, which may be accessed and utilized by a web browser, mobile app, the server (over the network), or any of the peripheral devices described herein. One or more of the modules or steps of the present invention also may be stored or recorded on the server, and transmitted over the network, to be accessed and utilized by a web browser, a mobile app, or any other computing device that may be connected to one or more of the web browser, mobile app, the network, and/or the server.
[00285] References to a “database” or to “database table” are intended to encompass any system for storing data and any data structures therein, including relational database management systems and any tables therein, non-relational database management systems, document- oriented databases, NoSQL databases, or any other system for storing data.
[00286] Software and web or internet implementations of the present invention could be accomplished with standard programming techniques with logic to accomplish the various steps of the present invention described herein. It should also be noted that the terms “component,” “module,” or “step,” as may be used herein, are intended to encompass implementations using one or more lines of software code, macro instructions, hardware implementations, and/or equipment for receiving manual inputs, as will be well understood and appreciated by those of ordinary skill in the art. Such software code, modules, or elements may be implemented with any programming or scripting language such as C, C++, C#, Java, Cobol, assembler, PERL, Python, PHP, or the like, or macros using Excel or other similar or related applications with various algorithms being implemented with any combination of data structures, objects, processes, routines or other programming elements.
[00287] Although the operations of the method(s) herein are shown and described in a particular order, the order of the operations of each method may be altered so that certain operations may be performed in an inverse order or so that certain operations may be performed, at
least in part, concurrently with other operations. In another embodiment, instructions or sub-operations of distinct operations may be implemented in an intermittent and/or alternating manner.
[00288] Unless the context clearly requires otherwise, throughout the description and the claims, the words "comprise," "comprising," and the like are to be construed in an inclusive sense, as opposed to an exclusive or exhaustive sense; that is to say, in the sense of "including, but not limited to." As used herein, the terms "connected," "coupled," or any variant thereof, means any connection or coupling, either direct or indirect, between two or more elements; the coupling or connection between the elements can be physical, logical, or a combination thereof. Additionally, the words "herein," "above," "below," and words of similar import, when used in this application, shall refer to this application as a whole and not to any particular portions of this application. Where the context permits, words in the above Detailed Description of the Preferred Embodiments using the singular or plural number may also include the plural or singular number respectively. The word "or" in reference to a list of two or more items, covers all of the following interpretations of the word: any of the items in the list, all of the items in the list, and any combination of the items in the list.
[00289] The above-detailed description of embodiments of the disclosure is not intended to be exhaustive or to limit the teachings to the precise form disclosed above. While specific embodiments of and examples for the disclosure are described above for illustrative purposes, various equivalent modifications are possible within the scope of the disclosure, as those skilled in the relevant art will recognize. Further, any specific numbers noted herein are only examples: alternative implementations may employ differing values, measurements or ranges.
[00290] Although the operations of any method(s) disclosed or described herein either explicitly or implicitly are shown and described in a particular order, the order of the operations of each method may be altered so that certain operations may be performed in an inverse order or so that certain operations may be performed, at least in part, concurrently with other operations. In another embodiment, instructions or sub-operations of distinct operations may be implemented in an intermittent and/or alternating manner.
[00291] The teachings of the disclosure provided herein can be applied to other systems, not necessarily the system described above. The elements and acts of the various embodiments described above can be combined to provide further embodiments. Any measurements or dimensions described or used herein are merely exemplary and not a limitation on the present invention. Other measurements or dimensions are within the scope of the invention.
[00292] Any patents and applications and other references noted above, including any that may be listed in accompanying filing papers, are incorporated herein by reference in their entirety. Aspects of the disclosure can be modified, if necessary, to employ the systems, functions, and concepts of the various references described above to provide yet further embodiments of the disclosure.
[00293] These and other changes can be made to the disclosure in light of the above Detailed Description of the Preferred Embodiments. While the above description describes certain embodiments of the disclosure, and describes the best mode contemplated, no matter how detailed the above appears in text, the teachings can be practiced in many ways. Details of the system may vary considerably in its implementation details, while still being encompassed by the subject matter disclosed herein. As noted above, particular terminology used when describing certain features or aspects of the disclosure should not be taken to imply that the terminology is being redefined herein to be restricted to any
specific characteristics, features or aspects of the disclosure with which that terminology is associated. In general, the terms used in the following claims should not be construed to limit the disclosures to the specific embodiments disclosed in the specification unless the above Detailed Description of the Preferred Embodiments section explicitly defines such terms. Accordingly, the actual scope of the disclosure encompasses not only the disclosed embodiments, but also all equivalent ways of practicing or implementing the disclosure under the claims.
[00294] While certain aspects of the disclosure are presented below in certain claim forms, the inventors contemplate the various aspects of the disclosure in any number of claim forms. For example, while only one aspect of the disclosure is recited as a means-plus-function claim under 35 U.S.C. §112, ¶6, other aspects may likewise be embodied as a means-plus-function claim, or in other forms, such as being embodied in a computer-readable medium. (Any claims intended to be treated under 35 U.S.C. §112, ¶6 will include the words "means for".) Accordingly, the applicant reserves the right to add additional claims after filing the application to pursue such additional claim forms for other aspects of the disclosure.
[00295] Accordingly, although exemplary embodiments of the invention have been shown and described, it is to be understood that all the terms used herein are descriptive rather than limiting, and that many changes, modifications, and substitutions may be made by one having ordinary skill in the art without departing from the spirit and scope of the invention.
Claims
1. A smart target comprising: a housing for receiving at least two sensors configured to co-witness an impact of an object at or near a predetermined distance from a predetermined location, a first sensor of the at least two sensors configured to detect an impact of an object at or near a predetermined distance from a predetermined location and generate a first sensor signal corresponding to the detected impact, a second sensor of the at least two sensors configured to receive information relating to the detected impact and generate a second sensor signal corresponding to the detected impact, and control circuitry configured to receive the first sensor signal and the second sensor signal, confirm whether the first sensor and the second sensor signals have co-witnessed the same impact of the object by determining whether they were received within a predetermined period of time, and provide a target state update via a network in accordance with at least one protocol.
2. The smart target of claim 1 wherein the control circuitry is further configured to confirm, from the first sensor signal, whether the object comprises a projectile based on a trained model configured to recognize impacts of projectiles, and generate projectile impact data when the object is confirmed to be a projectile, wherein the target state update includes the projectile impact data.
3. The smart target of claim 2 wherein the second sensor signal comprises one of an infrared signal from an infrared sensor, a brain-computer interface signal from an
electroencephalogram sensor, a computer-vision signal from a computer-vision system, a radio-frequency signal from a radio frequency sensor, or a fabric-deformation signal from a smart textile sensor.
4. A smart target comprising: a first target location, a first piezoelectric sensor configured to detect an impact of an object on or within a predetermined distance from the first target location and generate a first piezoelectric sensor signal corresponding to the detected impact, a first infrared sensor configured to detect infrared radiation at a predetermined wavelength and generate a first infrared sensor signal, control circuitry configured to confirm a successful projectile impact on the first target location when the object is determined to be a projectile by comparing the first piezoelectric sensor signal with a piezoelectric impact signal model trained to detect impact of the projectile and when an impact time of the object and a detection time of the infrared radiation each occurred within a first predetermined period of time, generate first target location state data indicating that the first target location was successfully impacted by the projectile, and transmit the first target location state data via a network in accordance with at least one protocol.
5. The smart target of claim 4 wherein the first piezoelectric sensor signal is filtered before it is compared with the piezoelectric impact signal model.
6. The smart target of claim 4 wherein the first infrared sensor signal comprises an angle of detection and an intensity of infrared radiation.
7. The smart target of claim 4 wherein the first piezoelectric sensor is further configured to detect the impact of the object on or within a predetermined distance from a second target location.
8. The smart target of claim 4 further comprising a second target location, a second piezoelectric sensor configured to detect an impact of an object on or within a predetermined distance from the second target location and generate a second piezoelectric sensor signal corresponding to the detected impact, and a second infrared sensor configured to detect infrared radiation at a predetermined wavelength and generate a second infrared sensor signal.
9. The smart target of claim 8 wherein the control circuitry is further configured to confirm a successful projectile impact on the second target location when the object is determined to be a projectile by comparing the second piezoelectric sensor signal with the piezoelectric impact signal model trained to detect impact of the projectile and when an impact time of the object and a detection time of the infrared radiation each occurred within the first predetermined period of time, generate second target location state data indicating that the second target location was successfully impacted by the projectile, and transmit the second target location state data via a network in accordance with at least one protocol.
10. The smart target of claim 4 further comprising a frame configured to receive the first target location.
11. The smart target of claim 4 wherein the first target location is located on a wearable device.
12. The smart target of claim 4 wherein the first target location state data is configured to be received by an extended-reality gaming application configured to control the projectile-firing device and the wearable device in accordance with a gaming session having a gameplay configuration for providing an extended-reality environment, receive projectile-firing device data from the projectile-firing device, the projectile-firing device data comprising an orientation and location of the projectile-firing device and a time at which the projectile-firing device was fired, update gaming metric data to indicate a successful impact of the fired projectile on the first target location and attribute a successful hit to the projectile-firing device on the first target location when the time the impact occurred on the first target location and the time the projectile-firing device was fired occur within a second predetermined period of time.
13. The smart target of claim 12 wherein the gaming metric data is updated to indicate the successful impact of the fired projectile on the first target location and the successful hit is attributed to the projectile-firing device on the first target location when the location of the projectile-firing device is within a predetermined range from the first target location and the orientation of the projectile-firing device is aligned at a predetermined area associated with the first target location.
14. The smart target of claim 12 wherein the updated gaming metric data is configured to be displayed to a user.
15. A smart target co-witnessing hit attribution system comprising: a network configured to provide communication and control connectivity in accordance with at least one protocol, a projectile-firing device comprising a projectile repository configured to carry projectiles for firing and an infrared emitter configured to emit infrared radiation at a predetermined wavelength and beamwidth, a smart target comprising a piezoelectric sensor configured to detect an impact of an object at or near a predetermined distance from a predetermined location relative to the smart target and generate a piezoelectric sensor signal corresponding to the detected impact, an infrared sensor configured to detect infrared radiation at a predetermined wavelength and generate an infrared sensor signal, control circuitry configured to confirm a successful projectile impact on the smart target when the object is determined to be a projectile by comparing the piezoelectric signal with a predetermined piezoelectric impact signal model trained to detect impact of the projectile and the impact of the object and the detection of the infrared radiation occurred within a first predetermined period of time, generate target state data comprising the successful projectile impact, and provide the target state data via the network, an extended-reality gaming application configured to control the projectile-firing device and the wearable device in accordance with a gaming session having a gameplay configuration for providing an extended-reality environment,
receive projectile-firing device data from the projectile-firing device, the projectile-firing device data comprising an orientation and location of the projectile-firing device and a time at which the projectile-firing device was fired, receive the target state data from the smart target, the target state data comprising a time the smart target was impacted by the projectile, update gaming metric data to indicate a successful impact of the fired projectile on the smart target and attribute a successful hit to the projectile-firing device on the smart target when the time the impact occurred on the smart target and the time the projectile-firing device was fired occur within a second predetermined period of time.
16. The smart target co-witnessing hit attribution system of claim 15 wherein the gaming metric data is updated to indicate the successful impact of the fired projectile on the smart target and the successful hit is attributed to the projectile-firing device on the smart target when the location of the projectile-firing device is within a predetermined range from the smart target and the orientation of the projectile-firing device is aligned at a predetermined area associated with the smart target.
17. The smart target co-witnessing hit attribution system of claim 15 wherein the network is a target mesh network configured to transmit messages including the target state data to the extended-reality gaming application.
18. The smart target co-witnessing hit attribution system of claim 15 wherein the at least one protocol is configured to effectuate transmission and reception of data to nodes within the network using hexadecimal values.
19. The smart target co-witnessing hit attribution system of claim 18 wherein the hexadecimal values specify data concerning a status of the target.
20. The smart target co-witnessing hit attribution system of claim 18 wherein the hexadecimal values specify data concerning a status of a user.
21. A smart target co-witnessing hit attribution method, the method comprising the steps of: receiving a first sensor signal corresponding to detection of an impact of an object at or near a predetermined distance from a predetermined location relative to a first smart target location, receiving a second sensor signal relating to the detected impact, determining that the first sensor signal and the second sensor signal were received within a first predetermined period of time, and providing a target state update via a network in accordance with at least one protocol.
22. The method of claim 21 further comprising determining, from the first sensor signal, that the object comprises a projectile based on a trained model configured to recognize impacts of projectiles, and generating projectile impact data, wherein the target state update includes the projectile impact data.
23. The method of claim 21 wherein the second sensor signal comprises one of an infrared sensor signal, a brain-computer interface sensor signal, a computer-vision sensor signal, a radio-frequency sensor signal, or a fabric-deformation sensor signal.
24. The method of claim 21 further comprising receiving the target state update, receiving projectile-firing device data from a projectile-firing device, the projectile-firing device data comprising an orientation and location of the projectile-firing device and a time at which the projectile-firing device was fired, updating gaming metric data to indicate a successful impact of the fired projectile on the first smart target location and attribute a successful hit to the projectile-firing device on the first smart target location when the time the impact occurred on or near the first smart target location and the time the projectile-firing device was fired occur within a second predetermined period of time.
25. The method of claim 24 wherein the gaming metric data is updated to indicate the successful impact of the fired projectile on the first smart target location and the successful hit is attributed to the projectile-firing device on the first smart target location when the location of the projectile-firing device is within a predetermined range from the smart target and the orientation of the projectile-firing device is aligned at a predetermined area associated with the first smart target location.
26. The method of claim 21 wherein the network is a target mesh network configured to transmit messages including the target state update.
27. The method of claim 21 wherein the at least one protocol is configured to effectuate transmission and reception of data to nodes within the network using hexadecimal values.
28. The method of claim 27 wherein the hexadecimal values specify data concerning a status of the target.
29. The method of claim 27 wherein the hexadecimal values specify data concerning a status of a user.
30. An extended-reality projectile-firing gaming system comprising: a projectile-firing device comprising a projectile repository, a battlefield object, a network configured to provide communication and control connectivity in accordance with at least one protocol to the projectile-firing device and the battlefield object, an extended-reality gaming application configured to control the projectile-firing device and the battlefield object in accordance with a gaming session having a gameplay configuration for providing an extended-reality environment, receive projectile-firing device data from the projectile-firing device, the projectile-firing device data indicating that a projectile was fired, receive battlefield object data from the battlefield object, the battlefield object data indicating that an impact occurred on the battlefield object, and
an extended-reality device comprising at least one display and configured to provide computer-aided vision of real and virtual objects and a computer-aided interface for interaction with the real and virtual objects, the computer-aided interface configured to receive commands or instructions associated with the real and virtual objects and corresponding to the gameplay configuration.
31. The extended-reality projectile-firing gaming system of claim 30 wherein the computer-aided interface comprises voice-activation control configured to select one or more of the real and virtual objects and to provide the commands or instructions.
32. The extended-reality projectile-firing gaming system of claim 30 wherein the computer-aided interface comprises an eye sensor configured to monitor eye contact with the real and virtual objects.
33. The extended-reality projectile-firing gaming system of claim 32 wherein the commands or instructions are selectable using a control input when the eye contact is aligned with one of the real and virtual objects.
34. The extended-reality projectile-firing gaming system of claim 32 wherein the eye sensor identifies a selection of the commands or instructions associated with the real and virtual objects when the eye contact is fixed on one of the real and virtual objects for at least a first predetermined time period.
35. The extended-reality projectile-firing gaming system of claim 32 wherein the eye sensor identifies a selection of the commands or instructions associated with the real
and virtual objects when the eye contact is fixed on one of the real and virtual objects for at least a first predetermined time period but does not exceed a second predetermined time period.
36. The extended-reality projectile-firing gaming system of claim 32 wherein the control input is voice-activated.
37. The extended-reality projectile-firing gaming system of claim 32 wherein the control input is at least one sensor configured to monitor hand movements and the commands or instructions are selectable based on the monitored hand movements.
38. The extended-reality projectile-firing gaming system of claim 32 wherein the control input is a remote device.
39. The extended-reality projectile-firing gaming system of claim 38 wherein the remote device is coupled to the projectile-firing device.
40. The extended-reality projectile-firing gaming system of claim 38 wherein the remote device is configured to receive data via the network from at least one of the projectile-firing device and the battlefield object.
41. The extended-reality projectile-firing gaming system of claim 38 wherein the remote device is configured to be affixed to the projectile-firing device and provide at least one of near-field communication, radio frequency, infrared, magnetometer, and global positioning satellite sensory data to the extended-reality gaming application.
42. The extended-reality projectile-firing gaming system of claim 30 wherein at least one of the real and virtual objects are displayed in response to the projectile-firing device data.
43. The extended-reality projectile-firing gaming system of claim 30 wherein at least one of the real and virtual objects is displayed in response to the battlefield object data.
44. The extended-reality projectile-firing gaming system of claim 30 wherein the gameplay configuration comprises information corresponding to at least one of a game mode, a health meter, a shield meter, a projectile tally, a projectile repository tally, a rate of fire, a reload interval, a virtual power up, a success tally, a failure tally, a success to failure ratio, an alert, a battlefield role, a team affiliation, a battlefield objective, and a set of battlefield parameters.
45. The extended-reality projectile-firing gaming system of claim 30 wherein a type of the projectile-firing device is selectable through the computer-aided interface.
46. The extended-reality projectile-firing gaming system of claim 30 wherein the commands and instructions comprise one or more of an amount of projectiles available for firing, a predetermined velocity of the projectiles for firing, a predetermined rate of fire of the projectiles, and a predetermined reload interval of the amount of projectiles.
47. The extended-reality projectile-firing gaming system of claim 30 wherein a player is selectable through the computer-aided interface and the extended-reality gaming application facilitates communication with the player.
48. The extended-reality projectile-firing gaming system of claim 30 wherein at least one of the virtual objects is sharable through the computer-aided interface with a player.
49. The extended-reality projectile-firing gaming system of claim 30 wherein the extended-reality device is configured to display a prompt to provide at least one of the commands or instructions.
50. The extended-reality projectile-firing gaming system of claim 30 wherein at least one of the virtual objects is a menu of selectable inputs corresponding to the commands or instructions.
51. The extended-reality projectile-firing gaming system of claim 50 wherein the menu of selectable inputs is a drop-down menu.
52. The extended-reality projectile-firing gaming system of claim 30 wherein at least one of the virtual objects depicts an application configured to be executed on the extended-reality projectile-firing gaming system.
53. The extended-reality projectile-firing gaming system of claim 30 wherein the extended-reality gaming application is further configured to update gaming metric data to
indicate a successful impact of the fired projectile on the battlefield object and a successful hit by the projectile-firing device on the battlefield object when a time the impact occurred on the battlefield object and a time the projectile was fired occur within a predetermined time period, wherein the predetermined time period is selected to confirm that the projectile-firing device fired the fired projectile that was detected as impacting the battlefield object, and wherein the updated gaming metric data is displayed on the extended-reality device and configured to modify the real and virtual objects in real or near real-time.
54. The extended-reality projectile-firing gaming system of claim 53 wherein the updated gaming metric data is further configured to receive a location and orientation of the projectile-firing device, wherein the extended-reality gaming application records a successful impact of the fired projectile on the battlefield object and a successfully fired projectile by the projectile-firing device associated with the battlefield object when the location of the projectile-firing device is within a predetermined range from the battlefield object and the orientation of the projectile-firing device is aligned at a predetermined area associated with the battlefield object.
55. The extended-reality projectile-firing gaming system of claim 30 wherein the battlefield object comprises a smart target comprising a target location, a piezoelectric sensor configured to detect an impact of an object on or within a predetermined distance from the target location and generate a piezoelectric sensor signal corresponding to the detected impact, an infrared sensor configured to detect infrared radiation at a predetermined wavelength and generate an infrared sensor signal, and control circuitry configured to confirm a successful projectile impact on the target location when the object
is determined to be a projectile by comparing the piezoelectric sensor signal with a piezoelectric impact signal model trained to detect impact of the projectile and when an impact time of the object and a detection time of the infrared radiation each occurred within a predetermined period of time, generate target location state data indicating that the target location was successfully impacted by the projectile, and transmit the target location state data via the network in accordance with at least one protocol, wherein the extended-reality gaming application is configured to modify the real and virtual objects in real or near real-time in response to the target location state data and display the modified real and virtual objects using the extended-reality device.
56. An extended-reality projectile-firing gaming system comprising: a projectile-firing device comprising a projectile repository, a battlefield object, a network configured to provide communication and control connectivity in accordance with at least one protocol to the projectile-firing device and the battlefield object, an extended-reality gaming application configured to control the projectile-firing device and the battlefield object in accordance with a gaming session having a gameplay configuration for providing an extended-reality environment, receive projectile-firing device data from the projectile-firing device, the projectile-firing device data indicating that a projectile was fired, receive battlefield object data from the battlefield object, the battlefield object data indicating that an impact occurred on the battlefield object, and
a computer-aided interface for interaction with gameplay control options associated with the gameplay configuration and continually updated in real or near real-time during the gaming session, the computer-aided interface configured to receive commands or instructions associated with the real and virtual objects and corresponding to the gameplay configuration.
57. The extended-reality projectile-firing gaming system of claim 56 wherein the commands or instructions are received using voice-activation.