
US20070160960A1 - System and method for calculating a projectile impact coordinates - Google Patents


Info

Publication number
US20070160960A1
Authority
US
United States
Prior art keywords
images
impact
projectile
computer
coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/581,918
Inventor
Paige Manard
Charles Doty
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Laser Shot Inc
Original Assignee
Laser Shot Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Laser Shot Inc
Priority to US11/581,918
Assigned to LASER SHOT, INC. Assignment of assignors interest (see document for details). Assignors: DOTTY, CHARLES; MANARD, PAIGE
Assigned to LASER SHOT, INC. Corrective assignment to correct the assignor's name, previously recorded at reel 018426, frame 0801. Assignors: DOTY, CHARLES; MANARD, PAIGE
Publication of US20070160960A1
Priority to US11/931,059 (patent US8360776B2)
Status: Abandoned

Classifications

    • F - MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F41 - WEAPONS
    • F41A - FUNCTIONAL FEATURES OR DETAILS COMMON TO BOTH SMALL ARMS AND ORDNANCE, e.g. CANNONS; MOUNTINGS FOR SMALL ARMS OR ORDNANCE
    • F41A33/00 - Adaptations for training; Gun simulators
    • F41J - TARGETS; TARGET RANGES; BULLET CATCHERS
    • F41J5/00 - Target indicating systems; Target-hit or score detecting systems
    • F41J5/08 - Infrared hit-indicating systems
    • F41J5/10 - Cinematographic hit-indicating systems
    • F41J9/00 - Moving targets, i.e. moving when fired at
    • F41J9/14 - Cinematographic targets, e.g. moving-picture targets

Definitions

  • the system further comprises simulated training scenarios 12 that are triggered by the computer 5 upon the calculation of the actual projectile impact coordinates 9 .
  • These training scenarios 12 comprise video, digital animation or other virtual compilations of one or more situations that simulate real-life conditions. These situations comprise hostage scenarios, courthouse encounters, traffic stops and terrorist attacks.
  • Each scenario comprises a compilation of one or more scenes. The scenes are compiled in such a manner that any given scene may further branch into one or more scenes based on input from the computer regarding the calculated impact coordinates. The branching simulates expected outcomes in similar real life situations.
  • the impact coordinates 9 may further be superimposed against, say, a graphic of a target's body, and the coordinates “frozen” for the trainee to visually inspect the extent of any deviation from the expected shot location.
  • the training scenarios 12 may also be used to display collateral damage that may be expected in real life situations.
  • the system further comprises one or more projectile launching devices comprising laser-triggering devices. These laser-triggering devices may be used to fire one or more projectiles comprising lasers at the screens 3 .
  • the system further comprises software to detect the location of the laser device that launched a particular laser at the screens 3 .
  • the system comprises a thermal sensor 4 comprising a thermal camera directed at the one or more screens 3 .
  • the thermal camera 4 comprises software to detect and isolate thermal images of the one or more projectiles impacting 10 the one or more screens 3 .
  • the thermal camera 4 transmits the impact images to a connected computer 5 .
  • the computer 5 is connected to the thermal camera 4 through a USB2 or comparable interface.
  • the thermal camera 4 is calibrated so that the attached computer 5 can compute impact coordinates 9 relative to predetermined logical screen coordinates.
  • the impact coordinates 9 are sent to feedback devices comprising projectors 6 , printers 8 , monitors 7 or other electronic devices capable of receiving a digital signal from the computer 5 .
  • the feedback devices can visually or graphically illustrate the impact coordinates.
  • the system further comprises training scenarios 12 that comprise a compilation of imagery comprising video and animation figures.
  • the scenes are compiled to simulate real-life incidents, such as hostage situations and traffic stops, which are encountered by the law enforcement and military personnel.
  • the system comprises software that upon notification of the impact coordinates further branches into one or more possible outcome based scenarios. These outcome-based scenarios simulate real life responses.
  • the system further comprises a video editor. The trainee can film their own video clips and import them into the editor. The imported video is converted into MPEG-4 or comparable format. The trainee can then create scenarios comprising branching points as desired. Branching conditions that are correlated to the coordinates of the projectile impact may also be defined. The trainee may ultimately group multiple scenarios together to present diverse training situations in a single training session.
  • the thermal camera 4 continually captures current thermal images of the screen surface 3 a .
  • the computer 5 connected to the thermal camera 4 receives these thermal images as mouse clicks.
  • the computer 5 processes these images against baseline thermal images of the screen surface. If the computer 5 detects a deviation from the baseline, an impact is registered.
  • the computer 5 further comprises software to calculate the projectile impact coordinates 9 from the impact images. Once the coordinates have been calculated, they are sent to feedback devices connected to the computer 5 .
  • one or more projectiles 2 are launched at one or more projected targets.
  • a thermal camera 4 is directed at one or more screens 3 comprising the projected targets.
  • the thermal camera 4 continually detects and captures thermal images of the screen surface.
  • the thermal camera 4 registers a projectile impact 10 , by comparing current thermal images of the screen surface with previously captured baseline thermal images of the screen. Any deviation from the baseline is attributable to the energy change caused by the projectile impact.
  • the thermal camera 4 isolates the impact images and transmits them to a computer 5 .
  • the computer 5 is connected to the thermal camera 4 through a USB2 or comparable interface.
  • the thermal camera 4 is calibrated so that the computer 5 can calculate the actual impact coordinates 9 relative to the projected target.
  • the computer 5 further comprises software to convert the impact coordinates 9 into digital signals.
  • Feedback devices comprising a monitor 7 , printer 8 or any other electronic device that can receive a digital signal from the computer 5 can be used to visually or graphically depict the impact coordinates.
  • the impact coordinates can be displayed along a virtual X-axis 9 and a Y-axis 11 projected on the screen surface.
  • a projector 6 may be used to project the impact coordinates images onto the screens 3 for immediate visual feedback to the trainee.
  • the software comprising outcome based training scenarios is triggered. These scenarios comprise a compilation of scenes that simulate real life responses or outcomes to a projectile impact.
  • a projector 6 or monitor may further be used to project these scenarios onto the screen 3 .

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Aiming, Guidance, Guns With A Light Source, Armor, Camouflage, And Targets (AREA)

Abstract

A training system and method to calculate the actual coordinates of a projectile impact at one or more screens is disclosed. A projectile is launched at a screen. One or more targets are projected onto the screen. A calibrated sensor is directed at the screen surface. The sensor continually captures thermal images of the screen surface. The sensor comprises software to detect and isolate thermal images of the projectile impacting the screen. These impact images are transmitted to a computer connected to the sensor. The computer comprises software to calculate the actual impact coordinates relative to a projected target. The calculated coordinates are digitally sent to feedback devices for display purposes. The system further comprises virtual training scenarios that are triggered upon notification of actual impact coordinates. These training scenarios simulate real-life situations.

Description

    PRIORITY
  • This application claims the benefit of priority to U.S. provisional patent application No. 60/776,002 filed Oct. 21, 2005.
  • FIELD OF INVENTION
  • The present invention relates to a system and method for determining the actual coordinates of a projectile impact. Particularly, the invention is directed to firearms and weapons training systems.
  • BACKGROUND
  • Military personnel, police and other law enforcement officers, hunters, sportsmen and especially ordinary citizens need extensive training prior to handling weapons or firearms. When training military and law enforcement personnel, in particular, it is also important for the training systems to employ live weapons and for the immediate conditions to mimic or simulate real-life conditions. In real-life situations, these personnel have very little reaction time to respond to multiple stimuli. A bullet or projectile that accurately hits its intended target may reduce, or even eliminate, collateral civilian and property losses. Interactive training systems, which aid in improving shot accuracy, have become very popular. To simulate realistic conditions, any such training system must also provide multiple true-to-life scenarios without artificially enforced interruptions to identify the impact location.
  • Current training systems use a simulated weapon firing a simulated projectile at traditional or virtual targets. The targets are then imaged on a video projection screen. The location of a projectile impact is determined visually or is roughly estimated. These simulators use a beam of light to simulate the projectile and the path of the projectile. The light beam is a narrowly focused beam of visible light or near-infrared light, such as those wavelengths produced by low-energy laser diodes, which can then be imaged by conventional video cameras or imagers. Sometimes a filter is used to enhance the ability of these cameras to discern the normal reflected light and the light from the simulated projectile. These simulators do not allow for the use of live projectiles, such as bullets. Live projectiles can be used in shooting ranges with virtual targets projected on the backstop or targeting screen. The hit or impact locations can be determined; however, the shooter has to constantly stop to gauge shot accuracy.
  • Targets are typically made of paper, plastic, cardboard, polystyrene, wood and other tangible materials. Softer materials, such as paper, allow for easy monitoring of impact location as shown by the hole created in the material, but the projectiles quickly destroy these materials. Metal targets are more durable; however, their intrinsic hardness creates difficulty in determining the actual impact location. Self-healing elastomeric materials, like rubber, fall somewhere in between: they are more durable than the softer materials, but determining the exact impact coordinates is not easy. Training simulators were developed to simulate continuous action and overcome some of the disadvantages associated with shooting at traditional targets. However, these simulators require the use of simulated weapons. Simulated weapons do not accurately convey the feel and recoil action of firearms. Trainees not used to extensive target practice with live firearms may be disadvantaged when required to handle firearms in combat situations. Current training simulators use technology that limits realism and the ability for thorough performance measurement.
  • A variety of methods have been disclosed in the prior art to detect the impact location of live projectiles. Most of these methods require direct or visual inspection by the shooter or trainee. Prior art methods detect holes, cold spots, spots of light or supersonic waves. Other methods calculate trajectories or monitor changes in electrical properties at the impact zone in order to estimate the impact location. The impact location of a projectile can be determined directly by locating the point of impact or penetration visually on the target itself. For example, paper or cardboard targets would show a hole in the target corresponding to the location of penetration of the projectile. Metal targets may show a hole, indentation, or surface mark where the projectile impacted or penetrated. These methods have limitations. The targets may only be used a limited number of times before they are destroyed. If a target is impacted multiple times, it becomes difficult to determine which shot corresponds to which hole. To observe the target holes from a distance, telescopic optical means must be employed by the user or a spotter to detect hit location. To directly observe the impact location, the target must be observed up close, by approaching the target or by mechanically retrieving it. This requires stopping the training and increases the safety risk to the trainee. Furthermore, all systems using a fixed target are limited in size and maneuverability, in either side-to-side or front-to-back motion. To get around these limitations, several alternative methods have been suggested in the prior art to detect the impact location of a projectile on a target without having to observe the target at close range. These methods include employing a backlit screen which, when penetrated by a projectile, shows a bright spot from the backlight; using acoustic sensors which detect the shock wave from the passing projectile; or using thermal means of heating the target to a uniform temperature and then looking for cold holes left by the penetrating projectile.
  • However, these methods only estimate impact coordinates. In addition, the fixed targets used in these training methods possess limited maneuverability. Finally, the trainee does not get to realistically experience the possible after-effects of a projectile impact.
  • This invention relates to a system and method for calculating the actual pixel coordinates of a projectile launched from a projectile launching device, such as a firearm. In one embodiment, a sensor is used to capture images of the energy changes, or spikes, across a planar surface. The planar surface comprises one or more screens capable of displaying one or more targets. In this embodiment, the screen comprises a self-healing, elastomeric material. The targets can comprise live video, computer graphics, digital animation, three-dimensional images, two-dimensional images, virtual targets and moving targets. When a projectile impacts or penetrates the one or more screens, the sensor registers the impact by virtue of a corresponding change in energy across the screen surface. In one embodiment, the sensor is a thermal camera.
  • The sensor is connected to a computer. The system is calibrated such that the computer has enough information to translate coordinates from a three-dimensional plane defined by the target to logical virtual screen coordinates that can be used by the computer's operating system. The computer further comprises software to calculate the exact pixel coordinates of the projectile impact from the logical virtual screen coordinates. Once the pixel coordinates have been calculated, the computer relays this information to the trainee using feedback mechanisms comprising a projector, monitor or any other electronic device capable of receiving and visually or graphically displaying this information. The process of calculating the impact coordinates and relaying the information back to the trainee is limited only by the computer's processing speed, and the process is virtually instantaneous.
  • In another embodiment, the system comprises a device such as a video player capable of recording and playing back true-to-life simulated training scenarios. The computer transmits information about the impact coordinates to the video player. The video player selects a scenario that depicts the after-effects or outcome of a projectile accurately hitting, nearly hitting or missing the target. The scenarios can be projected on to a screen or displayed on a monitor or any other feedback device.
  • The invention does not involve detecting holes or damage to the target to determine impact location. Nor is the impact estimated from a determination of the projectile trajectory. Sensors comprising image sensors or thermal sensors are used to detect an impact based on changes in energy at the screen surface. In another embodiment, the sensor comprises software to isolate thermal images of a projectile impacting the screen surface from continually captured thermal images of the screen surface. The isolated thermal images are sent to a computer attached to the sensor. The computer receives the corresponding impact coordinates as mouse clicks. The computer can calculate actual projectile impact coordinates, relative to a projected target on the screen surface, from the impact images transmitted by the sensor.
  • The invention can also be adapted to assist users of other types of projectile launchers such as bows, crossbows, spears, darts, balls, rocket launchers or other projectile launching devices, by detecting the heat energy transferred to the target upon impact or penetration.
  • This combination of accurately measuring the impact coordinates and conveying potential outcomes using training scenarios aids in creating a realistic training experience. The invention improves the effectiveness and realism of training the military, police officers, marksmen, sportsmen or other firearm users, in a simulated environment using real weapons with real ammunition, by detecting the heat transferred to the target upon impact or penetration of the target by the projectile. The invention is effective because the training does not need to be halted to determine the impact location. The realism is improved because the trainee does not have to use a simulated or demilitarized weapon in training. Since actual weapons and ammunition can be adapted for use with the system, the trainee experiences the sounds, recoil and discharge associated with the trainee's own weapon. The trainee is thus better able to handle real-life situations. The invention allows the trainee to determine the impact location without approaching the target. This aids in safer training because the trainee is not required to be within the range of fire to view where the projectile impacted a target.
  • BRIEF DESCRIPTION OF DRAWINGS
  • FIG. 1 shows a schematic of a training system to detect the actual projectile impact coordinates.
  • FIG. 2 shows a schematic of the actual impact coordinates projected onto the screen.
  • FIG. 3 shows a simulated training scenario.
  • DETAILED DESCRIPTION
  • The invention comprises a training system and a method to detect actual coordinates of a projectile launched at one or more targets projected onto one or more screens. The one or more targets comprise virtual targets, live video, computer graphics, digital animation, three-dimensional images, two-dimensional images and moving targets for receiving the projectile impact. FIG. 1 shows the system comprising a calibrated sensor 4 capable of detecting energy changes at the point of impact on a screen when the projectile 2 impacts the screen 3. The sensor 4 captures images of the energy spikes on the surface 3 a of the screen and relays them to an attached computer 5. The computer 5 comprises software to calculate the actual coordinates of a projectile impact 10 based on the images transmitted by the sensor 4. FIG. 1 further depicts one or more feedback devices. The feedback devices can comprise a projector 6 for displaying the coordinates onto the one or more screens, a monitor 7 connected to the computer, a printer 8 connected to the computer, or any other electronic device capable of receiving digital signals from the computer. Feedback devices such as the monitor, the projector and the printer immediately translate the digital signals into visual or graphical representations of the calculated projectile impact coordinates. FIG. 2 depicts the impact coordinates 10 of the projectile impact along a virtual X-axis 9 and a Y-axis 11 projected onto the screen 3. The system further comprises software that can display simulated training scenarios 12 on the screen 3, as depicted in FIG. 3. The training scenarios depend upon the calculated impact coordinates. For example, where the impact coordinates 10 reflect that a moving target was missed, the training scenario 12 would then show the target as continuing to move rather than immobilized. The training scenarios 12 are selected according to the further actions required. The system is portable and can be used in indoor shooting ranges or in limited spaces where the ambient lighting is not easily reflected. Alternatively, the system can comprise a portable shooting range comprising a housing comprising a container. The containerized housing further comprises a screen for displaying projected targets, a thermal camera, a computer, a projector, and a monitor for providing immediate feedback. Advantageously, the containerized system can be transported for on-site training. The system finds application in various law enforcement training situations such as sniper, artillery, weapons and sharpshooter training.
  • Any projectile launching device 1 can be adapted for use with the invention. These devices include chemical or explosive powered devices such as firearms, pneumatic or compressed gas or spring-piston powered devices, elastic or spring tension powered devices, laser guns and bows, and any other device capable of launching projectiles.
  • Various types of projectiles 2 may be deployed with this invention. The type of projectile used depends on the training requirements. The projectiles comprise bullets, including lead bullets, copper jacketed bullets, steel jacketed bullets, tracer bullets, frangible bullets, plastic bullets, shotgun shot of various sizes and materials, and shotgun slugs. Softair pellets, metal or plastic pellets, metal or plastic BBs, frangible pellets, arrows, spears, darts, stones, balls and hockey pucks, lasers, rockets, missiles, grenades and other objects, now known or later developed, that can leave a heat signature upon impact may be used as projectiles.
  • The projectiles 2 are launched at one or more screens 3. The screens 3 can be constructed from any of several materials comprising paper, cloth, plastic, metal or rubber. In the preferred embodiment, the screen comprises an elastomeric material such as rubber, vinyl, silicone or plastic. The flexible nature of elastomeric materials allows various projectile types to impact the material and either bounce off or penetrate the screen while doing minimal damage to the screen. Upon impact 10 or penetration by a projectile 2, certain types of elastomeric materials such as rubber will allow the projectile 2 to open a hole the size of the projectile 2, allow the projectile 2 to pass through the material, and then close back up due to the elastic nature of the material. Even while the hole is still present in the material, the front surface of the screen 3 still presents a relatively smooth surface. The front surface 3 a of the one or more screens 3 is coated with a white or light-colored reflective coating to allow one or more targets to be projected upon it. The back surface of the screen is preferably set up against a bullet trap or ballistic material. The screens 3 are compact and can be hung on the walls of a shooting range, or inside a containerized shooting range, for instance. The screens 3 may comprise spring-roller pull-down models, electrically operated types or portable models. The screens 3 may be operated with remote controls or may be manually controlled. The screen sizes depend upon the distance between the screen and the projector. In an alternative embodiment, any planar surface that can receive one or more projected images can act as a “screen.” Examples of such surfaces are rock walls, concrete walls, etc.
  • The projectiles 2 are launched at targets projected on to the screen surface 3 a. These projected targets can comprise digital animation, live videos, computer graphics, three-dimensional images, two-dimensional images, moving targets and other pictorial representations. The projected targets further comprise one or more virtual targets for receiving the projectile impact.
  • As illustrated in FIG. 1, the training system comprises a sensor 4, preferably a thermal imaging sensor, for capturing thermal images of the screen surface 3 a. The sensor 4 is directed at the front surface 3 a of the screen 3. However, the sensor 4 may be placed at an angle to the screen 3, that is, to the left of the front of the screen 3, directly in front of the screen 3, looking down at the screen 3, or at positions other than perpendicular to the front of the screen. The sensor 4 does not have to be able to see the entire projected target. In one aspect of this invention, the sensor 4 continually captures thermal images of the screen 3. In one embodiment, the sensor comprises software that can detect a projectile impact 10 on the screen 3 by comparing current thermal images of the screen surface 3 a with previously captured baseline thermal images of the screen surface 3 a. The sensor 4 registers an impact 10 when the current thermal images of the screen show a deviation from the baseline images. The deviation from the baseline is caused by the energy transferred to the screens during the projectile impacting 10 or penetrating the screens. The sensor 4 transmits only the impact images to the computer 5 for processing. Since the sensor 4 does not transmit multiple thermal image frames to the computer for analysis of impact 10 coordinates, the efficiency of the system is enhanced.
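A minimal sketch of this baseline-comparison step, assuming the camera delivers thermal frames as 2-D NumPy intensity arrays; the frame format, the `IMPACT_THRESHOLD` value, and the single-hottest-pixel readout are illustrative choices, not details taken from the patent:

```python
import numpy as np

IMPACT_THRESHOLD = 25  # hypothetical intensity jump that counts as an energy spike

def detect_impact(baseline: np.ndarray, current: np.ndarray):
    """Return the (x, y) pixel of the strongest deviation from baseline, or None."""
    # Signed difference so cooler regions do not mask a hot spot.
    diff = current.astype(np.int32) - baseline.astype(np.int32)
    y, x = np.unravel_index(np.argmax(diff), diff.shape)
    if diff[y, x] < IMPACT_THRESHOLD:
        return None  # no spike above baseline: nothing to transmit
    return int(x), int(y)
```

Only frames for which `detect_impact` returns a coordinate would be forwarded to the computer, which is the traffic reduction the passage describes.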
  • In another embodiment, the sensor 4 comprises a thermal camera. The thermal camera 4 comprises an infrared core that can detect heat across the energy spectrum, including the infrared region. In one embodiment, the thermal camera 4 comprises a frame rate of at least 30 frames per second to capture images of the energy spike due to the projectile impact. In another embodiment, the thermal camera 4 comprises a frame rate of at least 60 frames per second. There are several commercially available thermal cameras 4 that can be used with the training system. One commercial example is the M3000 Thermal Imaging Module manufactured by DRS Nytech Imaging Systems, Inc. The thermal camera 4 contains a software interface manufactured by Lumenera, Inc.
  • The system further comprises a computer 5 to interpret and analyze the thermal images detected by the sensor. Preferably, the computer comprises 512 MB DDR memory, a 40 GB hard drive and a processing speed of 3 GHz. The computer 5 is connected to the sensor 4 through a USB2 or comparable interface. The computer 5 comprises software to receive the images captured by the sensor 4 as mouse clicks. The computer 5 further comprises distortion calculation software libraries to calculate the actual pixel coordinates 9 of a projectile impact 10. Once the computer calculates the actual pixel coordinates 9, its software programs can digitally illustrate the impact coordinates. These illustrations are digitally transmitted to one or more feedback devices comprising a projector, monitor, printer or any other device capable of receiving digital signals. The computer further comprises software programs that trigger virtual training scenarios 12.
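The patent does not say how the images arrive "as mouse clicks"; one plausible reading is that the sensor software synthesizes an OS-level click at the impact location. A sketch using the third-party `pyautogui` library as a stand-in for whatever input interface the actual system uses:

```python
import pyautogui  # stand-in input-injection library, not named in the patent

def relay_as_click(x: int, y: int) -> None:
    # Synthesize a mouse click at the logical screen coordinate so any
    # application listening for mouse input receives the impact location.
    pyautogui.click(x, y)
```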
  • The sensor 4 is calibrated so that the computer 5 connected to the sensor 4 uses only the images relayed by the sensor 4 to determine impact coordinates 9. Calibration also compensates for the distortions produced by the sensor 4 lens and extrinsic factors such as the placement of the sensor 4 relative to the screen 3. The computer 5 can relate the pixel coordinates from a projected target 9 to calibrated logical virtual screen coordinates that can then be used by the computer's 5 operating system to determine actual impact coordinates 9.
  • The sensor 4 may be placed at an angle to the screen 3, that is, in front of the screen 3 and to the left, directly in front of the screen 3, looking down at the screen 3, etc. The sensor 4 does not have to be able to see the entire projected target. The computer 5 can actually define its own viewable area within the area defined by the screen 3. If the entire projected target is not viewable, then only the viewable areas of the screen 3 are calibrated. But, for instance, if the projected target is on a screen 3 that has borders containing materials that do not reflect light well, a projectile impact 10 in that border space may nevertheless be detected by the sensor 4.
  • The calculation software can also calculate and compensate for the radial and tangential distortions caused by the sensor lens. To find the coordinates to be used in the distortion calculation software library, the system projects onto the screen 3 an arbitrary number of evenly spaced vertical lines and horizontal lines, one at a time. The system attempts to create these lines so that they encompass the entire projected area. This ensures accuracy in calculating the impact coordinates. If the coordinates cannot be found, then the system adjusts the size, position, and pixel width of the lines until an arbitrary accuracy error percentage threshold is reached.
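A sketch of generating the calibration line images, one line at a time; the resolution, line count, and starting pixel width are placeholders, since the text leaves them arbitrary:

```python
import numpy as np

W, H = 1024, 768   # assumed projector resolution
N_LINES = 8        # arbitrary number of evenly spaced lines

def line_image(index: int, vertical: bool, width: int = 2) -> np.ndarray:
    """Black frame containing the index-th evenly spaced calibration line."""
    img = np.zeros((H, W), dtype=np.uint8)
    span = W if vertical else H
    pos = (index + 1) * span // (N_LINES + 1)  # evenly spaced positions
    if vertical:
        img[:, pos:pos + width] = 255
    else:
        img[pos:pos + width, :] = 255
    return img

# If an intersection cannot be recovered, the caller would re-project with a
# larger width or adjusted position, as the passage describes.
```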
  • The system next projects a “black” image onto the screen. The pixel values from the black projected image are subtracted from the pixel values of the vertical projected image and the horizontal projected image. If both images produced by the subtraction contain pixels at the same place and their values are greater than an experimental threshold, their intersection defines one pixel coordinate. After all coordinates have been calculated in this manner, they are stored and processed in the one or more distortion calculation software libraries. The system also captures and stores thermal images comprising information on the baseline temperatures of each logical screen coordinate. When a projectile impacts the screen, energy is transferred to the screen. Thermal images of the screen are continually captured by the sensor and processed against the stored baseline screen images. If the current thermal images show a deviation from the captured thermal images, a projectile impact is registered.
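A sketch of the black-frame subtraction and intersection test, with `THRESH` standing in for the experimentally chosen threshold mentioned in the text; the captured frames are assumed to be 2-D uint8 arrays:

```python
import numpy as np

THRESH = 40  # placeholder for the experimental threshold

def intersection_coordinate(vert_img, horiz_img, black_img):
    """Return the pixel coordinate where the projected lines cross, or None."""
    v = vert_img.astype(np.int32) - black_img.astype(np.int32)
    h = horiz_img.astype(np.int32) - black_img.astype(np.int32)
    both = (v > THRESH) & (h > THRESH)        # lit in both difference images
    ys, xs = np.nonzero(both)
    if xs.size == 0:
        return None                           # not found: adjust the lines and retry
    return int(xs.mean()), int(ys.mean())     # centroid of the overlap region
```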
  • Once the intrinsic parameters of the sensor are known, the extrinsic parameters of the system can be determined. Two vertical lines and two horizontal lines are projected onto the one or more screens, with each line in each set of lines being as far apart as possible. The same process described above is used to determine the intersection between the set of lines. These coordinates are then undistorted using the distortion calculation software library with the parameters found above. This process results in the determination of four undistorted corner coordinates of the projected image.
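Using OpenCV as a stand-in for the "distortion calculation software library", the corner undistortion might look like this; `camera_matrix` and `dist_coeffs` are the intrinsic parameters found earlier:

```python
import numpy as np
import cv2

def undistort_corners(corners, camera_matrix, dist_coeffs):
    """Map the four detected corner intersections to undistorted pixel coordinates."""
    pts = np.asarray(corners, dtype=np.float32).reshape(-1, 1, 2)
    out = cv2.undistortPoints(pts, camera_matrix, dist_coeffs, P=camera_matrix)
    return out.reshape(-1, 2)  # P=camera_matrix keeps the result in pixel units
```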
  • The corner coordinates and the coordinates contained in the quadrilateral defined by the four corners must also be related to coordinates within the surface area of the screen. A matrix capable of translating each coordinate to satisfy the above condition is created. The matrix is created as follows: The variables required consist of the captured corner coordinates determined above and the “ideal” coordinates defined by the surface area of the screen. Starting with the ideal coordinates, the two-dimensional perspective matrix defined by those coordinates is calculated. The matrix is used to transform the captured coordinates. Next, the deviation between each transformed captured coordinate and the relative ideal coordinate is calculated. This deviation is the absolute value of the difference between each relative X and Y coordinate. Each deviation is added to the appropriate component of the last set of coordinates used to find the perspective matrix. Those coordinates are then used in the next calculation of the perspective matrix, and this process is carried out until an arbitrary combined deviation is reached or a maximum number of iterations have been run.
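One reading of this iteration, sketched with OpenCV. The text says the absolute deviation is added at each step; the sketch feeds back the signed deviation, which is the variant that actually converges, and leaves the tolerance and iteration cap arbitrary, as the text does:

```python
import numpy as np
import cv2

def refine_perspective(captured, ideal, tol=1e-3, max_iter=100):
    """Iteratively find the matrix relating captured corners to ideal screen corners."""
    captured = np.asarray(captured, dtype=np.float32).reshape(4, 2)
    ideal = np.asarray(ideal, dtype=np.float32).reshape(4, 2)
    coords = ideal.copy()                # start from the ideal coordinates
    M = np.eye(3)
    for _ in range(max_iter):
        # Perspective matrix defined by the current coordinate estimate.
        M = cv2.getPerspectiveTransform(coords, ideal)
        mapped = cv2.perspectiveTransform(captured.reshape(-1, 1, 2), M).reshape(4, 2)
        dev = mapped - ideal             # per-axis deviation from the ideal corners
        if np.abs(dev).sum() < tol:      # combined deviation threshold reached
            break
        coords += dev                    # feed the deviation into the next estimate
    return M
```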
  • The logical screen position for each coordinate from a captured image may be determined by “undistorting” it using the distortion calculation software library, and then transforming the undistorted coordinate by the matrix found above. The undistorted and transformed coordinate may be out of bounds of the virtual screen space.
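Putting the two corrections together, a captured coordinate might be mapped to logical screen space as sketched below; the function and parameter names are illustrative, carried over from the sketches above rather than taken from the patent.

```python
import cv2
import numpy as np

def to_screen_coords(pt, camera_matrix, dist_coeffs, M, screen_w, screen_h):
    """Undistort a captured pixel, apply the matrix, and bounds-check it."""
    p = np.asarray(pt, dtype=np.float32).reshape(1, 1, 2)
    p = cv2.undistortPoints(p, camera_matrix, dist_coeffs, P=camera_matrix)
    x, y = cv2.perspectiveTransform(p, M).ravel()
    # The result may legitimately fall outside the virtual screen space.
    in_bounds = 0.0 <= x < screen_w and 0.0 <= y < screen_h
    return (float(x), float(y)), in_bounds
```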
  • The system further comprises an image-generating device comprising a liquid crystal display (LCD) projector, a digital projector, a digital light processing projector, a rear projection device, or a front projection device. In one embodiment, the system comprises an LCD projector 6. An image is formed on the liquid crystal panel of the LCD projector from a digital signal sent by the computer 5, for instance. The formed image is then projected onto the screen 3.
  • The system further comprises a plurality of training scenarios 12 that aid in skills training. These training scenarios 12 comprise video scenarios, digital animation, two- and three-dimensional pictures and other electronic representations that may be projected onto the one or more screens 3. Depending on the projectile impact coordinates 9, the training scenarios 12 can lead or branch into several possible outcomes beginning from one initial scene. The trainees may pause or replay the completed scene to show the precise impact time and projectile impact coordinates 9 and thereby allow for detailed discussion of the trainee's actions. The training scenarios comprise anticipated real-life situations comprising arrests by law enforcement personnel, investigative scenarios, courthouse scenarios, hostage scenarios and traffic stops. The training scenarios also aid in judging when the use of force may be justified and/or necessary by showing the expected outcomes from a projectile impact 10.
  • In one embodiment, one or more targets are projected onto the one or more screens 3 or display surfaces using a projection device such as a projector 6 or any other graphics-generating device that can project a target or scenario. The targets can comprise virtual targets. A projectile 2 launched from a projectile launching device 1 penetrates or impacts 10 the targets. A calibrated sensor 4 is directed at the one or more screens 3. When a projectile 2 impacts 10 the front surface of the screen 3, an energy spike or change in temperature is detected at the screen surface 3 a. The sensor 4 continually captures thermal images of the one or more screens 3. The sensor 4 processes these thermal images against baseline thermal images of the screen surface. The sensor registers an impact when a deviation from the baseline is observed. The sensor 4 then isolates the impact images from the other captured screen images. The isolated impact images are transmitted to the computer 5 connected to the sensor 4. Since the computer 5 only receives images of the actual impact 10, it does not have to process superfluous thermal images of the screen surface in order to detect an impact 10. This greatly improves processing speed. The sensor 4 is calibrated so that the computer 5 is able to detect actual pixel coordinates 9 of the projectile impact 10 relative to the projected target. The computer 5 further comprises software to digitally illustrate the impact coordinates 9. Feedback devices comprising monitors 7, printers 8 or other electronic devices capable of receiving a digital signal from the computer may be used to visually or graphically depict the impact coordinates 9. The impact coordinates 9 may also be projected, using the projector 6, onto the one or more screens 3.
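The capture-compare-isolate loop might look like the following sketch, assuming a camera object with a read() method returning grayscale thermal frames and a send_to_computer() transport; both names, and the deviation threshold, are illustrative rather than taken from the patent.

```python
import cv2
import numpy as np

DELTA = 25  # minimum thermal deviation registered as an impact (illustrative)

def watch_for_impacts(camera, baseline, send_to_computer):
    """Transmit only isolated impact regions, never full-screen images."""
    while True:
        frame = camera.read()
        diff = cv2.absdiff(frame, baseline)       # deviation from the baseline
        mask = (diff > DELTA).astype(np.uint8)
        if cv2.countNonZero(mask) == 0:
            continue                              # no impact: nothing is sent
        x, y, w, h = cv2.boundingRect(cv2.findNonZero(mask))
        # Only the isolated impact image travels to the computer, which is
        # why the computer never processes superfluous thermal images.
        send_to_computer(frame[y:y + h, x:x + w], (x, y))
```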
  • The system further comprises simulated training scenarios 12 that are triggered by the computer 5 upon the calculation of the actual projectile impact coordinates 9. These training scenarios 12 comprise video, digital animation or other virtual compilations of one or more situations that simulate real-life conditions. These situations comprise hostage scenarios, courthouse encounters, traffic stops and terrorist attacks. Each scenario comprises a compilation of one or more scenes. The scenes are compiled in such a manner that any given scene may further branch into one or more scenes based on input from the computer regarding the calculated impact coordinates. The branching simulates expected outcomes in similar real-life situations. The impact coordinates 9 may further be superimposed on, say, a graphic of a target's body, and the coordinates “frozen” for the trainee to visually inspect the extent of any deviation from the expected shot location. The training scenarios 12 may also be used to display collateral damage that may be expected in real-life situations.
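One way to realize the branching is sketched below, with invented scene names, clips, and rectangular hit regions; the patent does not specify a data structure, so a scene graph keyed by name is purely an illustrative choice.

```python
# Regions are (x0, y0, x1, y1) in logical screen coordinates; all names,
# clips, and regions here are invented for illustration.
SCENARIO = {
    "traffic_stop": {
        "clip": "traffic_stop.mpg",
        "branches": [
            {"region": (200, 100, 320, 260), "next": "suspect_stopped"},
            {"region": (0, 0, 1024, 768), "next": "collateral_damage"},
        ],
    },
    "suspect_stopped": {"clip": "stopped.mpg", "branches": []},
    "collateral_damage": {"clip": "collateral.mpg", "branches": []},
}

def next_scene(scene_name, impact_xy):
    """Pick the next scene from the calculated impact coordinates."""
    x, y = impact_xy
    for branch in SCENARIO[scene_name]["branches"]:
        x0, y0, x1, y1 = branch["region"]
        if x0 <= x <= x1 and y0 <= y <= y1:  # first matching region wins
            return branch["next"]
    return None  # no branch defined: the scene simply plays out
```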
  • The system further comprises one or more projectile launching devices comprising laser-triggering devices. These laser-triggering devices may be used to fire one or more projectiles comprising lasers at the screens 3. The system further comprises software to detect the location of the laser device that launched a particular laser at the screens 3.
  • In yet another embodiment, the system comprises a thermal sensor 4 comprising a thermal camera directed at the one or more screens 3. The thermal camera 4 comprises software to detect and isolate thermal images of the one or more projectiles impacting 10 the one or more screens 3. The thermal camera 4 transmits the impact images to a connected computer 5. The computer 5 is connected to the thermal camera 4 through a USB2 or comparable interface. The thermal camera 4 is calibrated so that the attached computer 5 can compute impact coordinates 9 relative to predetermined logical screen coordinates. The impact coordinates 9 are sent to feedback devices comprising projectors 6, printers 8, monitors 7 or other electronic devices capable of receiving a digital signal from the computer 5. The feedback devices can visually or graphically illustrate the impact coordinates. The system further comprises training scenarios 12 that comprise a compilation of imagery comprising video and animation figures. The scenes are compiled to simulate real-life incidents, such as hostage situations and traffic stops, that are encountered by law enforcement and military personnel. The system comprises software that, upon notification of the impact coordinates, further branches into one or more possible outcome-based scenarios. These outcome-based scenarios simulate real-life responses. The system further comprises a video editor. The trainee can film their own video clips and import them into the editor. The imported video is converted into MPEG-4 or a comparable format. The trainee can then create scenarios comprising branching points as desired. Branching conditions that are correlated to the coordinates of the projectile impact may also be defined. The trainee may ultimately group multiple scenarios together to present diverse training situations in a single training session.
  • In another embodiment, the thermal camera 4 continually captures current thermal images of the screen surface 3. The computer 5 connected to the thermal camera 4 receives these thermal images. The computer 5 processes these images against baseline thermal images of the screen surface. If the computer 5 detects a deviation from the baseline, an impact is registered. The computer 5 further comprises software to calculate the projectile impact coordinates 9 from the impact images. Once the coordinates have been calculated, they are sent to feedback devices connected to the computer 5.
  • During the method for calculating the actual projectile impact coordinates 9, one or more projectiles 2 are launched at one or more projected targets. A thermal camera 4 is directed at one or more screens 3 comprising the projected targets. The thermal camera 4 continually detects and captures thermal images of the screen surface. The thermal camera 4 registers a projectile impact 10 by comparing current thermal images of the screen surface with previously captured baseline thermal images of the screen. Any deviation from the baseline is attributable to the energy change caused by the projectile impact. The thermal camera 4 isolates the impact images and transmits them to a computer 5. The computer 5 is connected to the thermal camera 4 through a USB2 or comparable interface. The thermal camera 4 is calibrated so that the computer 5 can calculate the actual impact coordinates 9 relative to the projected target. The computer 5 further comprises software to convert the impact coordinates 9 into digital signals. Feedback devices comprising a monitor 7, printer 8 or any other electronic device that can receive a digital signal from the computer 5 can be used to visually or graphically depict the impact coordinates. The impact coordinates can be displayed along a virtual X-axis 10 and a Y-axis 11 projected on the screen surface. A projector 6 may be used to project the impact coordinate images onto the screens 3 for immediate visual feedback to the trainee. Upon notification of the calculated projectile impact coordinates 9 by the computer 5, the software comprising outcome-based training scenarios is triggered. These scenarios comprise a compilation of scenes that simulate real-life responses or outcomes to a projectile impact. A projector 6 or monitor may further be used to project these scenarios onto the screen 3.
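Tying these method steps together, a session loop could be sketched as follows. This reuses the illustrative helpers named earlier in this description (SCENARIO, to_screen_coords, next_scene); wait_for_impact, projector.play, and projector.mark are likewise invented stand-ins, not interfaces defined by the patent.

```python
def run_training_session(camera, projector, baseline, camera_matrix,
                         dist_coeffs, M, wait_for_impact,
                         screen=(1024, 768), first_scene="traffic_stop"):
    """Illustrative driver: calibration and baseline capture happen beforehand."""
    scene = first_scene
    while scene is not None:
        projector.play(SCENARIO[scene]["clip"])             # project scenario
        crop, (ox, oy) = wait_for_impact(camera, baseline)  # detect an impact
        cx, cy = ox + crop.shape[1] / 2, oy + crop.shape[0] / 2
        (x, y), ok = to_screen_coords((cx, cy), camera_matrix,
                                      dist_coeffs, M, *screen)
        if ok:
            projector.mark(x, y)               # immediate visual feedback
            scene = next_scene(scene, (x, y))  # branch on the coordinates
        # out-of-bounds impacts produce no feedback in this sketch
```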
  • The foregoing description is illustrative and explanatory of several embodiments of the invention. It will be understood by those skilled in the art that various changes and modifications in form, materials and detail may be made therein without departing from the spirit and scope of the invention.

Claims (42)

1. A training system to detect impact coordinates of one or more projectiles launched from one or more projectile launchers, the system comprising:
one or more targets displayed on one or more screens;
a sensor directed at the one or more screens, the sensor capable of capturing thermal images of a projectile impact on the one or more targets;
means for calculating actual coordinates of the projectile impact from the thermal images captured by the sensor; and
means for providing immediate feedback of the projectile impact coordinates.
2. The system of claim 1, wherein the screens comprise a surface capable of receiving one or more targets projected onto the screen.
3. The system of claim 1, wherein the screens comprise an elastomeric material.
4. The system of claim 1, wherein the projectiles comprise bullets, lead bullets, copper jacketed bullets, steel jacketed bullets, plastic bullets, frangible bullets, rockets, missiles, BB pellets, softair pellets and arrows.
5. The system of claim 1, wherein the one or more projectile launchers are adapted to fire one or more projectiles comprising lasers.
6. The system of claim 5, wherein the system comprises means to determine the location of the projectile launcher from which the lasers were fired.
7. The system of claim 1, wherein the sensor further comprises means to compare currently captured thermal images of the screen with previously captured baseline thermal images of the screen to detect deviations from the baseline due to a heat signature left by the projectile impact.
8. The system of claim 7, wherein the sensor further comprises software to isolate images comprising deviations from the previously captured images.
9. The system of claim 7, wherein each image detected by the sensor comprises a plurality of pixels.
10. The system of claim 1, wherein the sensor comprises means to transmit thermal images of the projectile impact to a computer connected to the sensor.
11. The system of claim 10, wherein the computer comprises software to calculate actual pixel coordinates of the projectile impact relative to the projected targets from the isolated impact images transmitted by the sensor.
12. The system of claim 11, wherein the computer further comprises software to trigger simulated training scenarios on the one or more screens based on the calculated pixel coordinates.
13. The system of claim 1, wherein the computer comprises software for digitally illustrating the projectile impact coordinates relative to the one or more projected targets.
14. The system of claim 13, wherein the computer transmits digital images of the impact coordinates to a projector or monitor for immediate visual feedback.
15. A training system to detect impact coordinates of a projectile launched from a projectile launching apparatus, the system comprising:
one or more targets displayed on one or more screens, the screens comprising an elastomeric material for receiving a projectile impact;
a thermal camera facing the screens for capturing thermal images of the screen;
a computer for determining projectile impact coordinates by calculating pixel coordinates of the projectile impact from the thermal images captured by the thermal camera, the computer connected to the thermal camera; and
a projector for projecting visual images of the calculated pixel coordinates relative to the one or more targets, the projector connected to the computer.
16. The system of claim 15, wherein the one or more projected targets comprise live video, computer graphics, digital animation, three-dimensional images, two-dimensional images, virtual targets and moving targets.
17. The system of claim 15, wherein the thermal camera comprises a frame rate of at least 30 frames per second to capture thermal images of the projectile impact.
18. The system of claim 15, wherein the thermal camera comprises a frame rate of at least 60 frames per second to capture thermal images of the projectile impacting the screen.
19. The system of claim 15, wherein the thermal camera is calibrated to compensate for radial distortion and tangential distortion in the captured images caused by the camera lens.
20. The system of claim 15, further comprising software to compensate for screen distortion.
21. The system of claim 15, wherein the thermal camera comprises software to detect a projectile impact by comparing current thermal images of the screen with baseline thermal images of the screen.
22. The system of claim 21, wherein the thermal camera further comprises means to isolate images of the projectile impact from other captured screen images.
23. The system of claim 22, wherein the thermal camera comprises means to transmit the isolated projectile impact images to the computer.
24. The system of claim 15, wherein the computer comprises software to calculate actual pixel coordinates of the projectile impact relative to the one or more projected targets.
25. The system of claim 24, wherein the computer comprises means to convert the calculated projectile impact coordinates into digital signals for transmission to one or more feedback devices.
26. A system for detecting actual coordinates of a projectile impact, the system comprising:
one or more targets, the targets projected onto one or more elastomeric screens adapted to receive the projectile impact;
a thermal camera directed at the screen, the thermal camera continually capturing thermal images of the one or more screens;
means for a computer to receive images captured by the thermal camera, the computer connected to the thermal camera;
means for the computer to calculate actual impact coordinates relative to the projected targets from the images received from the thermal camera;
means for the computer to digitally illustrate the impact coordinates;
a projector for visually illustrating the impact coordinates on the screen, the projector connected to the computer; and
one or more simulated training scenarios to be displayed on the screen and one or more feedback devices.
27. The system of claim 26, wherein the projectiles comprise bullets, lead bullets, copper jacketed bullets, steel jacketed bullets, plastic bullets, frangible bullets, rockets, missiles, BB pellets, softair pellets and arrows.
28. The system of claim 26, wherein the one or more projected targets comprise live video, computer graphics, digital animation, three-dimensional images, two-dimensional images, moving targets and virtual targets.
29. The system of claim 26, wherein the computer further comprises means to process the thermal images received from the thermal camera to detect images of a projectile impact.
30. The system of claim 29, wherein the computer detects a projectile impact by comparing the thermal images of the screen with previously captured baseline thermal images of the screen to observe deviations from the baseline.
31. The system of claim 26, wherein the system further comprises software to display the pixel coordinates of the projectile impact along a virtual X-axis and a Y-axis superimposed on the screen.
32. The system of claim 26, wherein the system further comprises means for creating one or more simulated training scenarios.
33. The system of claim 32, wherein the computer comprises software means to select and run one or more training scenarios, the training scenarios selected dependent upon the calculated pixel coordinates of the projectile impact.
34. The system of claim 33, wherein the system further comprises means to project the one or more training scenarios on the one or more screens.
35. A method for determining the position of a live projectile impact, the method comprising:
(a) calibrating a thermal camera to compensate for lens distortion;
(b) capturing baseline thermal images of screen coordinates;
(c) launching the projectile at one or more targets projected on a screen;
(d) detecting a heat signature left by a projectile impact on the screen using a thermal camera;
(e) calculating actual pixel coordinates of the projectile impact;
(f) displaying visual images of the pixel coordinates along an X-axis and a Y-axis transposed on the screen; and
(g) projecting simulated training scenarios onto the screen based upon the position of the pixel coordinates.
36. The method of claim 35, further comprising the step of the thermal camera detecting and isolating images of the one or more projectiles impacting the one or more screens.
37. The method of claim 36, further comprising the step of the thermal camera transmitting the isolated projectile impact images to an attached computer.
38. The method of claim 35, wherein the computer calculates the actual coordinates of the projectile impact relative to the projected targets.
39. The method of claim 38, wherein the computer creates digital images of the calculated projectile impact coordinates relative to the screen.
40. The method of claim 35 further comprising providing immediate feedback of the projectile impact coordinates.
41. The method of claim 40, wherein feedback is provided through a computer monitor.
42. A containerized training system to detect impact coordinates of a projectile launched from a projectile launching apparatus, the system comprising:
a housing comprising a metallic surface;
one or more targets displayed on one or more screens, the screens located within the housing;
a thermal camera facing the screens for capturing thermal images of the screens;
a computer for determining projectile impact coordinates by calculating pixel coordinates of the projectile impact from the thermal images captured by the thermal camera, the computer connected to the thermal camera; and
a projector for projecting visual images of the calculated pixel coordinates relative to the one or more targets, the projector connected to the computer.
US11/581,918 2005-10-21 2006-10-17 System and method for calculating a projectile impact coordinates Abandoned US20070160960A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US11/581,918 US20070160960A1 (en) 2005-10-21 2006-10-17 System and method for calculating a projectile impact coordinates
US11/931,059 US8360776B2 (en) 2005-10-21 2007-10-31 System and method for calculating a projectile impact coordinates

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US77600205P 2005-10-21 2005-10-21
US11/581,918 US20070160960A1 (en) 2005-10-21 2006-10-17 System and method for calculating a projectile impact coordinates

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US11/931,059 Continuation-In-Part US8360776B2 (en) 2005-10-21 2007-10-31 System and method for calculating a projectile impact coordinates

Publications (1)

Publication Number Publication Date
US20070160960A1 true US20070160960A1 (en) 2007-07-12

Family

ID=38233118

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/581,918 Abandoned US20070160960A1 (en) 2005-10-21 2006-10-17 System and method for calculating a projectile impact coordinates

Country Status (1)

Country Link
US (1) US20070160960A1 (en)

Patent Citations (56)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2925582A (en) * 1956-02-22 1960-02-16 Oflice Nat D Etudes Et De Rech Acoustical firing indicator
US3402933A (en) * 1964-01-16 1968-09-24 George E. De Vogelaere Marksmanship training target film
US3778059A (en) * 1970-03-13 1973-12-11 Singer Co Automatic gunnery shock wave scoring apparatus using metallic conductors as shock wave sensors
US3849910A (en) * 1973-02-12 1974-11-26 Singer Co Training apparatus for firearms use
US4281241A (en) * 1977-02-21 1981-07-28 Australasian Training Aids (Pty.) Ltd. Firing range
US4514621A (en) * 1977-02-21 1985-04-30 Australasian Training Aids (Pty.) Limited Firing range
US4425500A (en) * 1977-02-21 1984-01-10 Australasian Training Aids (Pty.) Ltd. Firing range
US4261579A (en) * 1978-05-30 1981-04-14 Australasian Training Aids (Pty.), Ltd. Shock wave triggered target indicating system
US4349728A (en) * 1978-12-07 1982-09-14 Australasian Training Aids Pty. Ltd. Target apparatus
US4260160A (en) * 1979-03-05 1981-04-07 Saab-Scania Ab Target device for practice shooting in darkness
US4253670A (en) * 1979-08-07 1981-03-03 The United States Of America As Represented By The Secretary Of The Army Simulated thermal target
US4353887A (en) * 1979-08-16 1982-10-12 Ciba-Geigy Corporation Divisible tablet having controlled and delayed release of the active substance
US4553943A (en) * 1983-04-08 1985-11-19 Noptel Ky Method for shooting practice
US4595587A (en) * 1983-04-18 1986-06-17 Boehringer Ingelheim Kg Divisible pharmaceutical tablet with delayed active ingredient release
US4547359A (en) * 1983-04-18 1985-10-15 Boehringer Ingelheim Kg Divisible pharmaceutical tablet with delayed active ingredient release
US4683131A (en) * 1983-04-18 1987-07-28 Boehringer Ingelheim Kg Divisible pharmaceutical tablet with delayed active ingredient release
US4824677A (en) * 1986-12-18 1989-04-25 The Unjohn Company Grooved tablet for fractional dosing of sustained release medication
US4799688A (en) * 1987-01-27 1989-01-24 Eastman Kodak Company Live fire target system
US5132116A (en) * 1987-07-16 1992-07-21 Pierre Fabre Medicament Tablets of the hydrophilic matrix type based on salbutamol and a process for their preparation
US5126145A (en) * 1989-04-13 1992-06-30 Upsher Smith Laboratories Inc Controlled release tablet containing water soluble medicament
US5025424A (en) * 1990-05-21 1991-06-18 Rohrbaugh George W Shock wave scoring apparatus employing curved rod sensors
US5281142A (en) * 1991-05-15 1994-01-25 Zaenglein Jr William Shooting simulating process and training device
US5366229A (en) * 1992-05-22 1994-11-22 Namco Ltd. Shooting game machine
US5566951A (en) * 1992-08-04 1996-10-22 Dart International, Inc. Method and apparatus enabling archery practice
US5501467A (en) * 1993-05-03 1996-03-26 Kandel; Walter Highly visible, point of impact, firearm target-shatterable face sheet embodiment
US6245356B1 (en) * 1993-09-09 2001-06-12 Edward Mendell Co., Inc. Sustained release heterodisperse hydrogel systems-amorphous drugs
US5451409A (en) * 1993-11-22 1995-09-19 Rencher; William F. Sustained release matrix system using hydroxyethyl cellulose and hydroxypropyl cellulose polymer blends
US20030035837A1 (en) * 1993-11-23 2003-02-20 Sackler Richard S. Method of treating pain by administering 24 hour oral opioid formulations exhibiting rapid rate of initial rise of plasma drug level
US5551876A (en) * 1994-02-25 1996-09-03 Babcock-Hitachi Kabushiki Kaisha Target practice apparatus
US5888545A (en) * 1994-07-01 1999-03-30 Arzneimittelwerk Dresden Gmbh Carbamazepine medicament with retarded active substance release
US5649706A (en) * 1994-09-21 1997-07-22 Treat, Jr.; Erwin C. Simulator and practice method
US6743442B2 (en) * 1994-11-04 2004-06-01 Euro-Celtique, S.A. Melt-extruded orally administrable opioid formulations
US5958452A (en) * 1994-11-04 1999-09-28 Euro-Celtique, S.A. Extruded orally administrable opioid formulations
US5965161A (en) * 1994-11-04 1999-10-12 Euro-Celtique, S.A. Extruded multi-particulates
US6261599B1 (en) * 1994-11-04 2001-07-17 Euro-Celtique, S.A. Melt-extruded orally administrable opioid formulations
US6335033B2 (en) * 1994-11-04 2002-01-01 Euro-Celtique, S.A. Melt-extrusion multiparticulates
US20040185096A1 (en) * 1994-11-04 2004-09-23 Euro-Celtique S.A. Melt-extrusion multiparticulates
US6012980A (en) * 1995-12-01 2000-01-11 Kabushiki Kaisha Sega Enterprises Coordinates detecting device, method for same and game device
US5980254A (en) * 1996-05-02 1999-11-09 Advanced Interactive Systems, Inc. Electronically controlled weapons range with return fire
US5999210A (en) * 1996-05-30 1999-12-07 Proteus Corporation Military range scoring system
US6198501B1 (en) * 1996-05-30 2001-03-06 Proteus Corporation Military range scoring system
US6260466B1 (en) * 1996-10-03 2001-07-17 Barr & Stroud Limited Target aiming system
US5924694A (en) * 1997-05-12 1999-07-20 Kent; Howard Daniel Ballistic target material
US20020052411A1 (en) * 1998-09-03 2002-05-02 Dagmar Gobel Valproate compositions and processes for making
US6840772B1 (en) * 1999-05-14 2005-01-11 Dynamit Nobel Gmbh Explosivstoff-Und Systemtechnik Method for the impact or shot evaluation in a shooting range and shooting range
US6367800B1 (en) * 1999-06-07 2002-04-09 Air-Monic Llc Projectile impact location determination system and method
US6536907B1 (en) * 2000-02-08 2003-03-25 Hewlett-Packard Development Company, L.P. Aberration compensation in image projection displays
US20020051953A1 (en) * 2000-06-09 2002-05-02 John Clark Firearm laser training system and method facilitating firearm training with various targets and visual feedback of simulated projectile impact locations
US20020107069A1 (en) * 2000-12-06 2002-08-08 Nikon Corporation Game machine, method of performing game and computer-readable medium
US20030157463A1 (en) * 2002-02-15 2003-08-21 Nec Corporation Shooting training system with device allowing instructor to exhibit example to player in real-time
US20030228557A1 (en) * 2002-06-07 2003-12-11 Nec Corporation Electronic competition network system, electronic competition method, a server, and a computer program for operating the server
US20060115530A1 (en) * 2002-10-16 2006-06-01 Anders Pettersson Gastric acid secretion inhibiting composition
US20080192129A1 (en) * 2003-12-24 2008-08-14 Walker Jay S Method and Apparatus for Automatically Capturing and Managing Images
US20070026364A1 (en) * 2005-01-13 2007-02-01 Jones Giles D Simulation devices and systems for rocket propelled grenades and other weapons
US20060193911A1 (en) * 2005-02-28 2006-08-31 Penwest Pharmaceuticals Co., Controlled release venlafaxine formulations
US20090238870A1 (en) * 2008-03-21 2009-09-24 Les Laboratoires Servier Dividable galenical form allowing modified release of the active ingredient

Cited By (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090194942A1 (en) * 2006-09-11 2009-08-06 Bruce Hodge Thermal target system
US8985585B2 (en) * 2006-09-11 2015-03-24 Bruce Hodge Thermal target system
US8721460B2 (en) 2007-01-04 2014-05-13 Jakks Pacific, Inc. Toy laser gun and laser target system
US20080188314A1 (en) * 2007-01-04 2008-08-07 Brian Rosenblum Toy laser gun and laser target system
WO2008085906A3 (en) * 2007-01-04 2008-11-13 Jakks Pacific Inc Toy laser gun and laser target system priority claim
WO2008085906A2 (en) * 2007-01-04 2008-07-17 Jakks Pacific, Inc. Toy laser gun and laser target system priority claim
US20090274373A1 (en) * 2008-04-30 2009-11-05 Quanta Computer Inc. Image processing apparatus and method for generating coordination calibration points
TWI383334B (en) * 2008-04-30 2013-01-21 Quanta Comp Inc Image processing apparatus and method for generating coordination calibration points
US8275191B2 (en) * 2008-04-30 2012-09-25 Quanta Computer Inc. Image processing apparatus and method for generating coordination calibration points
US20100092925A1 (en) * 2008-10-15 2010-04-15 Matvey Lvovskiy Training simulator for sharp shooting
US10495416B2 (en) * 2013-01-10 2019-12-03 Brian Donald Wichner Methods and systems for determining a gunshot sequence or recoil dynamics of a gunshot for a firearm
US10712133B2 (en) * 2017-08-01 2020-07-14 nTwined LLC Impact indication system
CN108012132A (en) * 2017-12-21 2018-05-08 中国人民解放军总参谋部第六十研究所 A kind of projection acquisition all-in-one machine
JP2021032425A (en) * 2019-08-19 2021-03-01 株式会社日立国際電気 Shooting training system
JP7444819B2 (en) 2021-07-21 2024-03-06 株式会社日立国際電気 shooting training system
CN114577059A (en) * 2022-04-06 2022-06-03 神州凯业(广东)科技有限公司 Police actual combat law enforcement integrated intelligent training system

Legal Events

Date Code Title Description
AS Assignment

Owner name: LASER SHOT, INC., TEXAS

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DOTTY, CHARLES;MANARD, PAIGE;REEL/FRAME:018426/0801

Effective date: 20061013

AS Assignment

Owner name: LASER SHOT, INC., TEXAS

Free format text: CORRECTIVE ASSIGNMENT TO CORRECT THE ASSIGNOR'S NAME, PREVIOUSLY RECORDED AT REEL 018426 FRAME 0801;ASSIGNORS:DOTY, CHARLES;MANARD, PAIGE;REEL/FRAME:018912/0630

Effective date: 20061013

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION