US20080261192A1 - Synchronous multi-media recording and playback with end user control of time, data, and event visualization for playback control over a network - Google Patents
Synchronous multi-media recording and playback with end user control of time, data, and event visualization for playback control over a network
- Publication number
- US20080261192A1 (Application No. US 12/116,472; also published as US 2008/0261192 A1)
- Authority
- US
- United States
- Prior art keywords
- data
- simulator
- video
- playback
- training
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09B—EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
- G09B9/00—Simulators for teaching or training purposes
Definitions
- FIG. 6A is a diagram of an example assessment display screen 600 .
- the annotation and assessment tool 40 displays video windows 602 and a window 604 for displaying a checklist 606 listing predefined training tasks to be completed by a trainee during the training session, and allows the end-user to indicate which of the listed tasks were completed during synchronous playback, thereby providing real-time annotation and assessment of the trainee.
- the end-user may provide answers to the checklist by simply clicking “Yes” or “No”.
- some of the tasks on the checklist 606 may be associated with a timestamp that specifies at what time during the training session a particular item should have been performed.
- the annotation and assessment tool 40 compares the specified time with the time the reviewer checks off the task and indicates whether the task was performed within the specified time. Point values may be assigned based on proper or improper execution of the predefined tasks. Providing a predefined list of training tasks for evaluators to complete offers a more objective approach to skills assessment.
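A minimal sketch of this timed-checklist logic follows; the task fields, point values, and the rule that a late completion earns no points are assumptions used only to make the comparison concrete.

```ts
// checklist-scoring.ts — sketch of the timed-checklist logic: compare when the reviewer
// checked off a task against the time by which it should have been performed.
interface ChecklistTask {
  id: string;
  description: string;
  dueBySeconds?: number; // offset into the session by which the task should be done
  points: number;        // value awarded for proper execution (assumption)
}

interface TaskResult {
  taskId: string;
  completed: boolean;
  checkedAtSeconds?: number; // when the reviewer marked it during synchronous playback
}

function scoreTask(task: ChecklistTask, result: TaskResult): number {
  if (!result.completed) return 0;
  const onTime =
    task.dueBySeconds === undefined ||
    (result.checkedAtSeconds !== undefined && result.checkedAtSeconds <= task.dueBySeconds);
  return onTime ? task.points : 0; // late completion earns no points in this sketch
}

// Example: a task due within the first two minutes, worth 5 points.
const airway: ChecklistTask = { id: "t1", description: "Check airway", dueBySeconds: 120, points: 5 };
console.log(scoreTask(airway, { taskId: "t1", completed: true, checkedAtSeconds: 95 })); // 5
```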
- FIG. 6B is a diagram showing another example of an assessment display screen 620 .
- the screen 620 includes three video windows 622 and a simulation window 624 showing a trend of simulation data.
- a dialog box 626 is displayed that allows the reviewer to identify a predefined event, and to enter a comment about the event to define points of interest that occurred during the training session.
- a rating can be associated with the event, and the event can be associated with the trainee(s).
- Assessment sessions may be performed both in real time and subsequent to the training session. Because the recorded training session is provided over the Internet, annotation sessions may be performed by multiple reviewers at the same or different times. In addition, assessments can easily be done remotely.
- the annotation and assessment tool 40 stores the annotation and assessment data entered by the reviewers in the session data archive 30 in association with the training session and by the trainee(s). The annotation and assessment tool 40 may also automatically tally the annotations and assessments entered by the reviewers to create a composite assessment/score.
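The automatic tally could be as simple as averaging the scores entered by each reviewer for a trainee, as in the sketch below; the averaging rule and record fields are assumptions rather than anything prescribed by the patent.

```ts
// composite-score.ts — sketch of tallying the assessments entered by several reviewers
// into a single composite score per trainee.
interface Assessment {
  sessionId: string;
  traineeId: string;
  reviewerId: string;
  score: number; // e.g. total checklist points awarded by that reviewer
}

function compositeScore(assessments: Assessment[], traineeId: string): number {
  const mine = assessments.filter((a) => a.traineeId === traineeId);
  if (mine.length === 0) return 0;
  const total = mine.reduce((sum, a) => sum + a.score, 0);
  return total / mine.length; // average across reviewers
}

// Example: two reviewers scored the same trainee on one session.
const entered: Assessment[] = [
  { sessionId: "session-1234", traineeId: "trainee-07", reviewerId: "reviewer-a", score: 18 },
  { sessionId: "session-1234", traineeId: "trainee-07", reviewerId: "reviewer-b", score: 22 },
];
console.log(compositeScore(entered, "trainee-07")); // 20
```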
- trainees participating in a training session and the checklist of tasks to be completed during the training session are established prior to the training session.
- the trainer or the trainee can log into the assessment tool 40 to perform a pre-assessment in which the trainee enters information identifying the subject matter and participants of the training session (integrated trainee identification).
- the trainer can log in to the system and create a checklist of tasks to be completed by the trainees during the training session.
- the checklists can establish the order in which the tasks must be performed, as well as when during the training session the tasks must be performed.
- Data defining the training session (e.g., date/time, case number, room number, and so on), checklist data, trainee data, and assessment data are all stored in the session data archive 30.
- Data in the session data archive 30 is preferably indexed by the training session, but can also be indexed by a group of trainees or by individual trainees. Because the recorded training session is stored in association with the group of trainees and the individual trainees in the session data archive 30, the assessment data can be segmented out by individual trainees and viewed to see that trainee's performance. This allows all training sessions and assessments entered for a particular trainee, i.e., the trainee's entire training portfolio, to be easily retrieved and viewed with a few mouse clicks.
- the simulation capture tool 36 detects the deactivation of the simulator data source, and turns off the A/V sources 18 and encoders/recorders 20 to end the recording (block 510). This also ends any live assessment sessions.
- training centers 12 typically hold debriefing sessions where the recorded training session is reviewed by one or more evaluators with the trainees as a group or individually.
- the skills assessment tool 26 further includes a debrief tool 38 for enabling automated debrief sessions.
- the debrief tool 38 is invoked in response to the end-user, typically the trainer/evaluator, choosing to start a debrief session from the browser 44 (block 512 ).
- the debrief tool 38 allows the end-user to select which recorded training session to view, and which videos in the training session to view, and the selected videos are synchronously played back, as described above and as shown in FIGS. 4A-4C , at which point the end-user begins to review the training scenario with the trainee(s) (block 514 ).
- a debriefing session is similar to an annotation session in that all the video feeds and simulation data may be displayed on one screen in the browser 44 , but the debriefing session does not include a checklist or other area for the viewers to enter annotations.
- the end-user may choose to display only certain components from the recorded training session and how the simulator data should be visualized.
- the end-user can jump to segments of interest (block 516), choose to review faculty/peer assessments (block 518), or perform the review based on just the simulator trend data (block 520).
- the trainer may only choose to display video streams 1 and 2 as well as the simulation data, and at the same time view the assessment data entered by Dr. Johnson, for instance.
- FIG. 7 is a block diagram illustrating a process performed by the debrief tool 38 and its interaction with the end-user for conducting an individual debriefing session between a trainer and a trainee.
- the process begins by the debrief tool 38 receiving the trainer's selection of a trainee's portfolio (block 700 ).
- the debrief tool 38 retrieves all the annotations related to the trainee and combines them in real time into absolute and comparative reports from which the trainer may select to view (block 702). From the displayed reports, the trainer can identify problem areas (block 704), and then select from the interface of the debrief tool 38 a particular recorded training session to review with the trainee (block 706). Once the recorded training session is played back (block 714), the trainer is allowed to jump to any particular video clip.
- the trainer can show a timeline of events (block 708), show a list of related annotations and jump to a particular annotation point (block 710), and show a list of assessments related to the trainee (block 712), including any combination of faculty assessments, self-assessment, and peer assessments. The trainer then discusses the data with the trainee (block 716).
- the combination of the annotation and assessment tool 40 and debrief tool 38 enables a discussion and review of a trainee's performance to be based on absolute and comparative metrics in combination with multiple evaluator assessments, all linked to video, thereby providing more objective feedback to the trainee and an overall improved training process.
- a method and system for providing synchronous multimedia recording and playback has been disclosed.
- the present invention has been described in accordance with the embodiments shown, and one of ordinary skill in the art will readily recognize that there could be variations to the embodiments, and any variations would be within the spirit and scope of the present invention.
- the embodiments can be implemented using hardware, software, a computer readable medium containing program instructions, or a combination thereof.
- although the debrief tool 38 and the annotation and assessment tool 40 are shown as separate components, the functionality of each may be combined into a lesser or greater number of components.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Business, Economics & Management (AREA)
- Physics & Mathematics (AREA)
- Educational Administration (AREA)
- Educational Technology (AREA)
- General Physics & Mathematics (AREA)
- Electrically Operated Instructional Devices (AREA)
Abstract
A method and system are provided for providing synchronous multimedia recording and playback. Aspects of the exemplary embodiment include, in response to a training session being conducted, synchronously recording in real-time simulator data from a simulator captured by a simulator capture tool, and video of the training session captured by a plurality of A/V sources; encoding each of the videos captured by the plurality of A/V sources as respective digital media files formatted as streaming media; and transmitting both the simulator data and the video media files from a server to a client over a network, such that when the client receives the simulator data and the stream, the respective videos are synchronously played back with the simulator data on the client.
Description
- This application is a continuation of co-pending patent application Ser. No. 11/611,792, filed Dec. 15, 2006, assigned to the assignee of the present application, and incorporated herein by reference.
- The use of simulation training is growing rapidly. A simulation training session is one in which training of personnel is performed through the use of a simulator device that outputs real-time data in response to interactions of the trainees. In the medical industry, for example, medical training centers conduct simulation training that generally involves students performing simulated medical procedures and/or examinations on a mannequin simulator, which exhibits symptoms of various ailments of a patient during simulated examination sessions. Other types of medical simulators include EKG machines, blood pressure monitors, and virtual reality endoscopic, laparoscopic, and endovascular simulators. During each simulated examination session, which usually takes place in an assigned examination room, the student interacts with the patient during an appointed time period to make a diagnosis of the patient's ailment and to prescribe a proposed treatment plan or perform a procedure. Each examination room is equipped with monitoring equipment, including audio, visual and time recording devices, so that the student's simulated encounter with the patient can be monitored in real time by an evaluator, such as a faculty member or upper class person. Typically, simulation training sessions are also recorded on video for subsequent analysis and teaching purposes. A similar configuration is used in other industries for other types of training sessions.
- The monitoring equipment in the examination rooms may include multiple audio/video (A/V) sources, e.g. video cameras, to provide various camera angles of the training session. A typical recorded training session may have three video feeds, for instance, taken from different camera angles, and one of the video feeds might show a machine that displays data from a simulator, such as EKG, heart rate, or blood pressure data. The data from each of the A/V sources is sent to a respective recording/playback device, e.g., a digital video (DV) recorder, for recording onto some type of hard recording medium, such as DVDs or DV tapes. This results in the output of each of the video cameras, for example, being stored on a separate medium during the training session. Optionally, the system may have the recording/playback devices synced together by a time sync generator. The recording media produced by the multiple recording/playback devices, whether DVDs or DV tapes, are typically stored as a tape archive in a multimedia library.
- In addition, the video recorded by each of the A/V sources may also be input to a video/audio mixer or processor of some type. Typically, the mixer merges the video feeds from the A/V sources and the output is recorded onto a recording medium as a merged video with multiple windows, one for each video feed. Another method is to overlay the simulator data as a composite image with the video feeds, like a picture-in-picture. One problem with this method, however, is that typically the overlay picture obscures part of the underlying image. The recording media, e.g., DVD or DV tape, may then also be archived in the multimedia library. After all the video is stored and edited, each of the videos needs to be manually associated with each of the trainees for later retrieval.
- While viewing the training session during the recording, a trainer manually identifies performance events/issues, and manually notes the time during the video at which the event occurred. Once the simulation exercise is completed, the recording is stopped. Thereafter, the trainer conducts a debriefing session with the trainee(s) to evaluate the trainee's performance. Debriefing sessions can be performed right after the training session using the unmixed recordings, during which the trainer plays back portions of the recordings and analyzes the performance of the trainee using their notes as a guide. Since unmixed recordings are used, this process often involves lots of rewinding and fast-forwarding to get to points of interest.
- Sometimes the trainer may desire to perform what is known as a highlighted debriefing session. In a highlighted debriefing session, the trainer plays back just a portion of the training session(s) for a detailed analysis with the trainee. To enable the highlighted debrief session, a post-video editing process is required to extract clips of specific examples from the various media stored in the multimedia library. For example, assume that a trainer wants to make a video of where the trainee makes the most mistakes. This would require finding and retrieving the videos that contain the key clips from the multimedia library. During the video editing process, the clips are extracted from the video, and then either merged to create a video overlay, or the clips are mixed and alternated. After the video editing process is completed, the modified video is stored in the multimedia library, and then retrieved by the trainer in order to conduct the highlighted debrief session.
- Although recording simulation training sessions has definite advantages in terms of being a useful teaching tool, the conventional system described above has several problems. One problem is that the system does not allow quantifiable individual feedback. Instead, the training session is often subjectively evaluated or graded based on what the trainer or reviewer thought they saw or didn't see during the recorded exercise. A related problem is that to comment on a specific event that occurred during the training session, the trainer or reviewer must either remember where in the recording the event occurred, or note a time index in their notes. In addition, if the trainer wants to highlight a specific area of interest in the recording, the trainer may have to have the recording played and the segment of interest recorded separately during the editing process. For training centers that have a large number of training sessions and a large number of recordings per training session, such constant editing can be a significant burden in terms of manpower and cost.
- Another problem is that the training sessions are stored on media such as DVD or DV tape that must be manually indexed and stored for archival. Since this is a manual process, problems may arise when attempting to find a particular recording for a particular training session or trainee. In addition, the recordings archived in the multimedia library may not be readily accessible to reviewers, particularly if the reviewers are not in the same location as the multimedia library. For these reasons, access to the recordings may not be possible, or may be highly cumbersome, as the number of recorded training sessions increases, which in turn may limit the number of users who can view the recordings for evaluation or grading.
- Accordingly, a need exists for an improved method and system for providing synchronous multimedia recording and playback, particularly in the area of simulation training as one example.
- A method and system are provided for providing synchronous multimedia recording and playback. Aspects of the exemplary embodiment include, in response to a training session being conducted, synchronously recording in real-time simulator data from a simulator captured by a simulator capture tool, and video of the training session captured by a plurality of A/V sources; encoding each of the videos captured by the plurality of A/V sources as respective digital media files formatted as streaming media; and transmitting both the simulator data and the video media files from a server to a client over a network, such that when the client receives the simulator data and the stream, the respective videos are synchronously played back with the simulator data on the client.
- According to the method and system disclosed herein, all available training session video and simulator sources may be encoded into a streamable format as independent video files and streamed onto the network. This facilitates remote viewing of activity and real-time performance assessments of the training session. The assessment information is instantly tabulated and available to help drive a more objective feedback/debrief session using actual simulation trend data and multiple assessments, all synchronously tied to multiple video feeds.
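As an illustration of the encoding step, the sketch below (TypeScript, assuming the widely used ffmpeg encoder and hypothetical RTSP feed addresses and archive paths, none of which are specified by the patent) shows how each A/V source could be turned into an independent, streamable MPEG-4 file during a session.

```ts
// encode-feeds.ts — invoke ffmpeg (an assumption; the patent does not name an encoder) to
// turn each captured A/V feed into an independent, streamable MPEG-4 file.
import { spawn } from "node:child_process";

// Hypothetical capture inputs for three camera angles in one training room.
const feeds = [
  { id: "cam-1", input: "rtsp://capture-host/room1/cam1" },
  { id: "cam-2", input: "rtsp://capture-host/room1/cam2" },
  { id: "cam-3", input: "rtsp://capture-host/room1/cam3" },
];

// Encode one feed to fragmented MP4 so it can be consumed while it is still being written.
function encodeFeed(sessionId: string, feed: { id: string; input: string }) {
  const output = `/archive/multimedia/${sessionId}/${feed.id}.mp4`;
  const args = [
    "-i", feed.input,                          // source A/V feed
    "-c:v", "libx264",                         // H.264 video inside an MPEG-4 container
    "-c:a", "aac",                             // AAC audio
    "-movflags", "frag_keyframe+empty_moov",   // fragmented MP4 => streamable
    output,
  ];
  return spawn("ffmpeg", args, { stdio: "inherit" });
}

// Start all encoders for a training session; each feed remains an independent file.
feeds.forEach((feed) => encodeFeed("session-1234", feed));
```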
-
FIG. 1 is a block diagram of a simulation training system in accordance with an exemplary embodiment. -
FIG. 2 is a flow diagram illustrating a process implemented by the simulation training system for providing synchronous multimedia recording and playback in accordance with the exemplary embodiment. -
FIG. 3A is a flow diagram illustrating a process performed by the skills assessment tool for synchronous display of the simulation data and streaming video media files on the client. -
FIG. 3B is a diagram graphically illustrating a composite media file according to an exemplary embodiment. -
FIGS. 4A through 4D are example screenshots of the interface of the skills assessment tool when playing the stream. -
FIG. 5 is a flow diagram illustrating a workflow process implemented by the simulated training system and its interaction with the end-user for recording, annotating, and debriefing a recorded training session. -
FIG. 6A is a diagram of an example assessment display screen. -
FIG. 6B is a diagram showing another example of an assessment display screen. -
FIG. 7 is a block diagram illustrating a process performed by the debrief tool and its interaction with the end-user for conducting an individual debriefing session between a trainer and a trainee. - The present invention relates to synchronous multimedia recording and playback. The following description is presented to enable one of ordinary skill in the art to make and use the invention and is provided in the context of a patent application and its requirements. Various modifications to the preferred embodiments and the generic principles and features described herein will be readily apparent to those skilled in the art. The end-user is also allowed to select what simulator variables are displayed. Thus, the present invention is not intended to be limited to the embodiments shown, but is to be accorded the widest scope consistent with the principles and features described herein.
- The embodiments disclosed herein are mainly described in terms of a particular device and system provided in particular implementations. However, one of ordinary skill in the art will readily recognize that this method and system will operate effectively in other implementations. For example, the devices usable with the present invention can take a number of different forms. The present invention will also be described in the context of particular methods having certain steps. However, the method and system operate effectively for other methods having different and/or additional steps not inconsistent with the present invention.
- The exemplary embodiment provides a web-based simulation training system for providing synchronous multimedia recording and playback of recorded training sessions. Training sessions are recorded, and the recordings include not only synchronized video from multiple cameras, but also simulation data recorded from a simulator. Streaming technology is then utilized to allow end-users to access the recordings over the Internet via a browser and view and configure the recorded training sessions in real-time. The end-user is allowed to select which video streams are played back and to jump to any point along the recording time line, at which point all the videos and simulation data automatically play back at that point in time. Thus, the system provides synchronous multimedia recording and playback with user playback control of time, data, and event visualization over a network.
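A minimal sketch of the "jump to any point" behavior is shown below; it assumes modern HTML5 video elements and hypothetical element IDs rather than the Flash/QuickTime players named later, and simply drives every feed and the simulator-data cursor from one timeline position.

```ts
// timeline-seek.ts — browser-side sketch that jumps every video feed and the
// simulator-data cursor to the same offset in the recorded session.
const feedVideos = Array.from(
  document.querySelectorAll<HTMLVideoElement>("video.feed"),
);
const cursor = document.getElementById("sim-data-cursor") as HTMLElement; // hypothetical element

// Jump all video windows and the simulator-data cursor to one point in the session.
function seekAll(secondsFromStart: number, sessionDuration: number): void {
  for (const video of feedVideos) {
    video.currentTime = secondsFromStart;
  }
  // Move the cursor over the telemetry trend in proportion to the session timeline.
  cursor.style.left = `${(secondsFromStart / sessionDuration) * 100}%`;
}

// Example: a click on the timeline strip maps pixel position to a session offset.
const timeline = document.getElementById("timeline") as HTMLElement;
timeline.addEventListener("click", (event) => {
  const rect = timeline.getBoundingClientRect();
  const fraction = (event.clientX - rect.left) / rect.width;
  const duration = feedVideos[0]?.duration ?? 0;
  if (!duration) return; // metadata not loaded yet
  seekAll(fraction * duration, duration);
});
```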
-
FIG. 1 is a block diagram of a simulation training system in accordance with an exemplary embodiment. The exemplary embodiment provides a web-based simulation training system 10 for providing synchronous multimedia recording and playback of recorded training sessions. The simulation training system 10 includes a training center 12 that includes equipment for communicating with a simulation training website 14 over a network 16, such as the Internet. The training center 12 conducts and records simulation training sessions in one or more training rooms equipped with multiple audio/video (A/V) sources 18, multiple encoder/recorders 20, a time sync generator 22, and a simulator data source 24. - The training sessions are recorded using the A/
V sources 18 and the data is sent to respective encoders/recorders 20. The A/V sources 18 in an exemplary environment will be described as video cameras, but A/V sources 18 include any type of capture device, such as an auxiliary microphone or a still camera, and the like. The training sessions involve one or more trainees (not shown) who perform simulated procedures on, or otherwise interact with, at least one simulator data source 24 that outputs real-time data in response. The type of training conducted by the training center 12 will be described in terms of medical training that would be suitable for doctors, nurses, and emergency response personnel, but the exemplary embodiments are applicable to any type of training that involves the use of any type of simulator. Example types of simulator data sources 24 in the medical industry, for instance, include full-body mannequin simulators, virtual reality simulators, EKG machines, and blood pressure monitors. - The online
simulation training website 14 includes a software suite referred to as a skills assessment tool 26, a web server 28 a and a video-on-demand server 28 b (collectively referred to as server 28), a session data archive 30, a simulation data archive 32, and a multimedia archive 34. The skills assessment tool 26 hosted on the simulation training website 14 includes a debrief tool 38 and annotation and assessment tool 40. The server 28 hosting the simulation training website 14 may be implemented as one server or any number of servers. - In another embodiment, the encoders/
recorders 20 and the simulation capture tool 36 may be located remote from the training center, e.g., at the physical location of the simulation training website 14. In another embodiment, all the components shown in the simulation training website 14, including the encoders/recorders 20 and the simulation capture tool 36, may be implemented as a single physical device. Further, the simulation training website 14 may be implemented as a custom application that is installed at the training center 12, and accessed directly by clients 42 over a network. -
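For illustration only, the following sketch shows one plausible shape for the records held in the session data archive 30, simulation data archive 32, and multimedia archive 34; the field names are assumptions, not part of the patent.

```ts
// session-index.ts — hypothetical record shapes for the archives described above.
interface MediaFileRef {
  feedId: string;          // e.g. "cam-1"
  url: string;             // location of the streamable MPEG-4 file
  isPrimaryAudio: boolean; // one feed supplies the audio used for synchronization
}

interface SimulatorDataRef {
  url: string;             // telemetry captured by the simulation capture tool
  variables: string[];     // e.g. ["heartRate", "bloodPressure"]
}

interface TrainingSessionRecord {
  sessionId: string;       // archives are indexed by an ID of the training session
  recordedAt: string;      // ISO date/time
  roomNumber: string;
  traineeIds: string[];    // lets assessments be segmented out per trainee
  media: MediaFileRef[];
  simulatorData: SimulatorDataRef;
}

// Example record, purely illustrative.
const example: TrainingSessionRecord = {
  sessionId: "session-1234",
  recordedAt: "2006-12-15T09:30:00Z",
  roomNumber: "3",
  traineeIds: ["trainee-07", "trainee-11"],
  media: [
    { feedId: "cam-1", url: "/archive/multimedia/session-1234/cam-1.mp4", isPrimaryAudio: true },
    { feedId: "cam-2", url: "/archive/multimedia/session-1234/cam-2.mp4", isPrimaryAudio: false },
  ],
  simulatorData: { url: "/archive/simdata/session-1234.json", variables: ["heartRate"] },
};
```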
FIG. 2 is a flow diagram illustrating a process implemented by the simulation training system 10 for providing synchronous multimedia recording and playback in accordance with the exemplary embodiment. Referring to both FIGS. 1 and 2, the process begins in response to a training session being conducted by synchronously recording in real-time both simulator data from a simulator data source 24 captured by a simulator capture tool, and video of the training session captured by a plurality of the A/V sources 18 (block 200). - According to the exemplary embodiment, the simulator data is captured by
simulation capture tool 36. The time sync generator 22 is coupled to the encoders/recorders 20 and to the simulation capture tool 36 to control the synchronization of the recordings. In one embodiment, the simulation capture tool 36 is provided by the simulation training website 14, but is installed and executed on a computer (not shown) at the training center 12 that communicates with the simulator data source 24. In another embodiment, the simulation capture tool 36 may be located remote from the simulator data source 24, such as at the simulation training website 14. - During the recording, the videos captured by the A/
V sources 18 are encoded as respective digital media files in streaming media format (block 202). As used herein, streaming media is media that is consumed (heard and/or viewed) while the media is being delivered. The videos captured by the A/V sources 18 may be encoded by the encoders/recorders 20. In one embodiment, the digital media files are encoded as MPEG-4 files, but other formats may also be used. - In the exemplary embodiment, the simulator data may be captured by the simulation capture tool 36 as telemetry values in raw and/or compressed format. The telemetry values can then be visualized using a thin client, such as Flash Player™, as a function of time. In another embodiment, the simulator data can be captured using one of the A/
V sources 18 by recording a video of the output of the simulator itself, e.g., by capturing a video of an EKG display. The simulation data may be encoded by the simulation capture tool 36. - During recording of the training session, the simulation data and the digital media files of the video feeds are transmitted to the
simulation training website 14. The simulation data is sent to the simulation training website 14 by the simulation capture tool 36, where it is stored in the simulation data archive 32 and indexed by an ID of the training session. The video media files are sent to the simulation training website 14 by the encoders/recorders 20 and are stored in the multimedia archive 34. - After all the captured data is transmitted to the
simulation training website 14 and stored, the server 28 transmits both the simulator data and a stream of the digital media files to the client 42 over the network 16, such that when the client 42 receives the simulator data and the stream, the respective videos are synchronously played back with the simulator data on the client 42 (block 204). - In one embodiment, the
skills assessment tool 26 causes the server 28 to transmit the simulator data and the stream of the video media files in response to receiving a request to view a recorded training session. Referring to FIG. 1, according to the exemplary embodiment, an end-user of a client 42 may access the assessment tool 40 using a browser 44 and submit a request to view the training session. The end-user request can be made prior to, or during, a live training session for real-time viewing of the recording, or after the training session is complete. If the request is for a real-time viewing of the recording, then the end-user may represent a trainer/faculty member, a trainee/student observer, or other type of evaluator/reviewer. If the request is for a prerecorded training session, then the end-user may also represent the trainers and trainees that took part in the training session. In another embodiment, the skills assessment tool 26 causes the server 28 to transmit the simulator data and the stream of the video media files to the client 42 automatically based on some preconfigured settings. -
FIG. 3A is a flow diagram illustrating a process performed by the skills assessment tool 26 for synchronous display of the simulation data and the stream of video media files on the client 42. The process begins in response to the skills assessment tool 26 receiving a request to view a training session (block 300). The skills assessment tool 26 then dynamically generates a single composite media file referencing as sources the video media files stored for a training session (block 302). In one embodiment, the composite media file is encoded as a streaming MPEG-4 file. -
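The composite file is described as an MPEG-4 movie whose tracks merely reference the stored source videos. The sketch below substitutes a plain JSON manifest for that container (an assumption made for brevity) while preserving the key idea: a dynamically generated description that lists the source files and marks one of them as the primary audio source.

```ts
// composite-manifest.ts — sketch of the dynamically generated composite reference; a JSON
// manifest stands in for the MPEG-4 reference movie described in the patent.
interface CompositeManifest {
  sessionId: string;
  sources: { url: string; primaryAudio: boolean }[];
}

function buildCompositeManifest(
  sessionId: string,
  videoUrls: string[],
): CompositeManifest {
  return {
    sessionId,
    // The first referenced file is designated the primary audio source by default;
    // the audio of the remaining files is left muted on the client.
    sources: videoUrls.map((url, index) => ({ url, primaryAudio: index === 0 })),
  };
}

// Example: three stored camera files become one composite reference.
const manifest = buildCompositeManifest("session-1234", [
  "/archive/multimedia/session-1234/cam-1.mp4",
  "/archive/multimedia/session-1234/cam-2.mp4",
  "/archive/multimedia/session-1234/cam-3.mp4",
]);
console.log(JSON.stringify(manifest, null, 2));
```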
FIG. 3B is a diagram graphically illustrating a composite media file according to an exemplary embodiment. In this example, the skills assessment tool 26 created a composite media file 320 in MPEG-4 movie format that includes three links referencing three video media files 322. - Referring again to
FIG. 3A, after the composite media file 320 is generated, one of the referenced video media files 322 is designated as a primary audio source (block 304) to control audio playback during synchronous video playback. In one embodiment, the first video media file 322 referenced in the composite media file 320 may be designated the primary audio source by default. The audio portion of the primary audio source may be used to synchronize the videos at the time of playback. The audio portions of the other video media files 322 may be turned off. - The
skills assessment tool 26 has a user interface that is displayed in the browser 44 of the client 42. To fulfill the end-user's request to view the training session, the skills assessment tool 26 sends the composite media file 320 to the video-on-demand server 28 b. The video-on-demand server 28 b streams the composite media file 320 to the browser of the designated client 42 by using the links to pull the source media files 322 and 324 from the archives 32 and 34 and sending them through the stream (block 306). In this manner, the exemplary embodiment provides a system for premixing the video, rather than a mixing of the video during playback. The skills assessment tool 26 retrieves the related simulation data from the simulation data archive 32 and sends the simulation data to the browser 44 (block 308), in one embodiment, via the web server 28 a. - When the simulation data and the streamed composite media file 320 are received on the
client 42, a media player 46 compatible with the format of the stream is automatically invoked. Example types of media players 46 for playing streaming media include Apple QuickTime™ and Flash Player™, for instance. The media player 46 then visualizes the simulation data and plays it in synchronization with the videos based on the time of the recordings. In one embodiment, the media player 46 opens within the browser 44. In another embodiment, the media player 46 opens outside the browser 44. In another embodiment, the videos may be streamed individually from the server 28 b, and then synchronized on the client 42 by the media player 46. - Because the
media player 46 receives a single streamed media file referencing each of the videos, the simulator data and each of the videos (with audio) can be synchronously played in the single interface of the assessment tool 40 as displayed by the media player 46. More specifically, the display screen of the assessment tool 40 is divided into separate windows corresponding to each of the source files included in the composite media file 320 and the simulation data. -
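A hedged sketch of client-side synchronization is shown below: the feed designated as the primary audio source acts as the playback clock, and the remaining muted feeds are re-aligned whenever they drift beyond a small tolerance. The element IDs, class names, and drift threshold are assumptions.

```ts
// sync-playback.ts — browser-side sketch: the primary-audio feed is the clock and the
// other muted feeds are nudged whenever they drift too far from it.
const master = document.getElementById("feed-primary") as HTMLVideoElement;
const followers = Array.from(
  document.querySelectorAll<HTMLVideoElement>("video.feed-secondary"),
);
const MAX_DRIFT_SECONDS = 0.2; // tolerance before a follower is re-aligned (assumption)

followers.forEach((video) => (video.muted = true)); // only the primary source is audible

master.addEventListener("timeupdate", () => {
  for (const follower of followers) {
    const drift = Math.abs(follower.currentTime - master.currentTime);
    if (drift > MAX_DRIFT_SECONDS) {
      follower.currentTime = master.currentTime; // snap back to the master clock
    }
  }
});
```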
FIGS. 4A through 4D are example screenshots of the interface of the skills assessment tool 26 when playing the stream. In the example shown in FIG. 4A, a skills assessment tool display screen 400 is shown that is divided into respective panes or windows 402 for playing each of the source video files, and a window 404 for visualizing the simulator data. The simulator data may be displayed as a trend of telemetry values output from the simulator data source 24 as a function of time. Also shown is a timeline 406 of the recorded training session. In one embodiment, in the case where a prerecorded training session is played, the timeline 406 may also display points of interest along the timeline that were flagged by reviewers/evaluators during an annotation session, as described further below. - In this particular example, the simulator data shows physiological data that may have been captured by a video screen capture, or by a visualization created from raw data captured from the
simulator data source 24. The videos in each of the fourwindows timeline 406 are played back synchronously, but each video is an independent and fully editable video file. By interacting with thetimeline 406, the end-user is allowed to advance or return to any point in time in the synchronous playback of the videos. -
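A hedged sketch of the timeline interaction follows: one seek request is clamped to the session length and fanned out to every video window so the independent files stay aligned. The player objects are placeholders invented for the example, not an API defined by the embodiment.

```python
# Sketch of the timeline interaction: a single seek request is applied to every
# video window so the independent files stay aligned. Player objects are
# placeholders for the example.
from typing import List


class VideoPlayer:
    def __init__(self, name: str) -> None:
        self.name = name
        self.position = 0.0

    def seek(self, seconds: float) -> None:
        self.position = seconds
        print(f"{self.name} seeked to {seconds:.1f}s")


def seek_all(players: List[VideoPlayer], seconds: float, duration: float) -> None:
    """Clamp the requested position to the session length, then fan it out."""
    target = max(0.0, min(seconds, duration))
    for player in players:
        player.seek(target)


if __name__ == "__main__":
    windows = [VideoPlayer(f"camera-{i}") for i in range(1, 5)]
    seek_all(windows, 312.5, duration=600.0)  # jump every window to the same point
```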
- FIG. 4B is a skills assessment tool display screen 420 synchronously playing three video windows 422, and one simulator data window 424 displaying the simulation data in a manner that identifies transition points of the telemetry values. An example of a transition point is summation data showing that a mannequin patient simulator went into cardiac arrest. Transition points are often an indicator of where during the training the trainee needs to perform a particular procedure within a specified time frame.
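One plausible way to flag such transition points is to mark samples where a telemetry value jumps by more than a threshold between consecutive readings, as in the sketch below. The field name and threshold are assumptions for the example; the embodiment does not prescribe a particular detection rule.

```python
# Sketch of one possible transition-point rule: flag timestamps where a telemetry
# value changes by more than a threshold between consecutive samples.
from typing import Dict, List, Tuple

Sample = Tuple[float, Dict[str, float]]


def find_transitions(samples: List[Sample], field: str, threshold: float) -> List[float]:
    """Return timestamps where `field` jumps by at least `threshold`."""
    points = []
    for (_, prev), (t_cur, cur) in zip(samples, samples[1:]):
        if field in prev and field in cur and abs(cur[field] - prev[field]) >= threshold:
            points.append(t_cur)
    return points


if __name__ == "__main__":
    telemetry = [
        (0.0, {"heart_rate": 80}),
        (30.0, {"heart_rate": 82}),
        (60.0, {"heart_rate": 30}),  # e.g. a simulated cardiac event
    ]
    print(find_transitions(telemetry, "heart_rate", threshold=25))  # [60.0]
```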
- FIG. 4C is a skills assessment tool display screen 450 synchronously playing one video window 452 with another video window 454 playing video of the display screen of a medical device data source.
- FIG. 4D is a skills assessment tool display screen 460 synchronously playing three video windows 464 and a simulation data window 464. The simulation data window shows a graph of telemetry information as well as a list 466 of variables that may be displayed in the graph. In this embodiment, the end-user can select from the list which variables to display to selectively control visualization of the telemetry information. The end-user may not only select which variables to visualize, but may also specify an actual value for a selected variable to see where the value appears along the timeline.
- The above examples show that the exemplary embodiments enable synchronous multimedia recording and playback, where the viewer is provided with remote playback control of the recording over a network, including control of time, data, and event visualization.
-
FIG. 5 is a flow diagram illustrating a workflow process implemented by the simulated training system 100 and its interaction with the end-user for recording, annotating, and debriefing a recorded training session. The process begins when a training session is initiated and the simulator data source 24 is started (block 502). In response, the simulation capture tool 36 detects the starting of the simulator data source 24 and automatically starts training session recording by invoking the A/V sources 18 and encoders/recorders 20 (block 504).
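The start/stop behavior around blocks 502 through 510 can be sketched as a simple polling loop that starts the recorders when the simulator data source becomes active and stops them when it deactivates. The classes below are placeholders assumed for the example, not components specified by the embodiment.

```python
# Sketch of the automatic start/stop behavior: poll the simulator data source and
# toggle the recorders when it comes up or goes down. All classes here are
# placeholders assumed for the example.
import time
from typing import List


class Recorder:
    def start(self) -> None:
        print("recorder started")

    def stop(self) -> None:
        print("recorder stopped")


class SimulatorSource:
    """Interface stand-in; a real implementation would query the simulator."""

    def is_active(self) -> bool:
        raise NotImplementedError


class ScriptedSource(SimulatorSource):
    """Demo source that reports active for a fixed number of polls."""

    def __init__(self, active_polls: int) -> None:
        self.remaining = active_polls

    def is_active(self) -> bool:
        self.remaining -= 1
        return self.remaining >= 0


def monitor(source: SimulatorSource, recorders: List[Recorder],
            poll_seconds: float = 1.0) -> None:
    recording = False
    while True:
        active = source.is_active()
        if active and not recording:
            for r in recorders:
                r.start()        # start recording when the simulator starts
            recording = True
        elif not active and recording:
            for r in recorders:
                r.stop()         # stop recording when the simulator deactivates
            break
        time.sleep(poll_seconds)


if __name__ == "__main__":
    monitor(ScriptedSource(active_polls=3), [Recorder()], poll_seconds=0.01)
```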
- As stated above, an end-user can access the skills assessment tool 26 and submit a request to view the training session live, and in response, the recorded training session is streamed to the client with the simulation data. According to a further aspect of the exemplary embodiment, the annotation and assessment tool 40 also enables the end-user to enter annotation and assessment data of the trainee's performance while the videos and simulation data are synchronously played back (block 506). This process is referred to as an annotation session.
- In operation, once the end-user logs in to the web server 28 a from a browser 44 and accesses the assessment tool 40 (FIG. 1), the web server 28 a displays a list of live or prerecorded training sessions to access, or the end-user can search for a training session by entering metadata such as a case identification or trainee identification.
- In response to the user selecting a recorded training session, an assessment screen for the training session is displayed. In the case of a live training session, the training session is played once the training session begins. In the case of a pre-recorded training session, the pre-recorded training session is played automatically. Based on the session ID of the selected session, the session data from the session data archive 30 and the simulation data from the simulation data archive are retrieved and transmitted over the network by the
servers 28 a and 28 b.
-
FIG. 6A is a diagram of an example assessment display screen 600. In this embodiment, the annotation and assessment tool 40 displays video windows 602 and a window 604 for displaying a checklist 606 listing predefined training tasks to be completed by a trainee during the training session, and allows the end-user to indicate which of the listed tasks were completed during synchronous playback, thereby providing real-time annotation and assessment of the trainee. In this example, the end-user may provide answers to the checklist by simply clicking "Yes" or "No". In one embodiment, some of the tasks on the checklist 606 may be associated with a timestamp that specifies at what time during the training session a particular item should have been performed. As a reviewer checks off tasks on the checklist, the annotation and assessment tool 40 compares the specified time with the time the reviewer checks off the task and indicates whether the task was performed within the specified time. Point values may be assigned based on proper or improper execution of the predefined tasks. Providing a predefined list of training tasks for evaluators to complete provides a more objective approach to skills assessment.
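The timed-checklist comparison can be illustrated with a short scoring sketch: the time a reviewer checks off a task is compared against the time the task was expected to occur, and points are awarded only when the task was completed and performed within an allowed window. The field names, tolerance, and point values are assumptions for the example.

```python
# Illustrative scoring sketch for the timed checklist: compare the reviewer's
# check-off time against the expected time and award points only when the task
# was completed within the allowed window. Values here are assumptions.
from dataclasses import dataclass
from typing import Optional


@dataclass
class ChecklistTask:
    description: str
    due_seconds: Optional[float] = None   # when in the session the task should occur
    tolerance_seconds: float = 30.0
    points: int = 1


def score_task(task: ChecklistTask, completed: bool,
               checked_at_seconds: Optional[float]) -> int:
    if not completed:
        return 0
    if task.due_seconds is None or checked_at_seconds is None:
        return task.points
    on_time = abs(checked_at_seconds - task.due_seconds) <= task.tolerance_seconds
    return task.points if on_time else 0


if __name__ == "__main__":
    task = ChecklistTask("Begin chest compressions", due_seconds=120.0)
    print(score_task(task, completed=True, checked_at_seconds=135.0))  # on time -> 1
    print(score_task(task, completed=True, checked_at_seconds=300.0))  # too late -> 0
```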
- FIG. 6B is a diagram showing another example of an assessment display screen 620. In this example, the screen 620 includes three video windows 622 and a simulation window 624 showing a trend of simulation data. In addition, a dialog box 626 is displayed that allows the reviewer to identify a predefined event, and to enter a comment about the event to define points of interest that occurred during the training session. A rating can also be associated with the event, and the event can be associated with the trainee(s).
- Assessment sessions may be performed both in real time and subsequent to the training session. Because the recorded training session is provided over the Internet, annotation sessions may be performed by multiple reviewers at the same or different times. In addition, assessments can easily be done remotely. The annotation and
assessment tool 40 stores the annotation and assessment data entered by the reviewers in the session data archive 30 in association with the training session and the trainee(s). The annotation and assessment tool 40 may also automatically tally the annotations and assessments entered by the reviewers to create a composite assessment/score.
- In one embodiment, trainees participating in a training session and the checklist of tasks to be completed during the training session are established prior to the training session. For example, prior to the training session, the trainer or the trainee can log into the
assessment tool 40 to perform a pre-assessment in which the trainee enters information identifying the subject matter and participants of the training session (integrated trainee identification). In addition, the trainer can log in to the system and create a checklist of tasks to be completed by the trainees during the training session. The checklists can establish the order in which the tasks must be performed as well as when during the training session the tasks must be performed. Data defining the training session (e.g., date/time, case number, room number, and so on), checklist data, trainee data, and assessment data are all stored in the session data archive. Data in the session data archive 30 is preferably indexed by the training session, but can also be indexed by a group of trainees or by individual trainees. Because the recorded training session is stored in association with the group of trainees and the individual trainees in the session data archive 30, the assessment data can be segmented out by individual trainees and viewed to assess that trainee's performance. This allows all training sessions and assessments entered for a particular trainee, i.e., the trainee's entire training portfolio, to be easily retrieved and viewed with a few mouse clicks.
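The indexing described above can be pictured as a session store keyed by session ID with a secondary index by trainee, so that a trainee's entire portfolio can be pulled in one lookup. The in-memory structure below is only an illustration of that idea, not the archive implementation.

```python
# Sketch of the archive indexing idea: sessions keyed by session ID with a
# secondary index by trainee, so a trainee's whole portfolio is one lookup.
# This in-memory structure only illustrates the described behavior.
from collections import defaultdict
from typing import Dict, List


class SessionArchive:
    def __init__(self) -> None:
        self.sessions: Dict[str, dict] = {}
        self.by_trainee: Dict[str, List[str]] = defaultdict(list)

    def add_session(self, session_id: str, record: dict) -> None:
        self.sessions[session_id] = record
        for trainee in record.get("trainees", []):
            self.by_trainee[trainee].append(session_id)

    def portfolio(self, trainee: str) -> List[dict]:
        """All recorded sessions (with their assessment data) for one trainee."""
        return [self.sessions[sid] for sid in self.by_trainee.get(trainee, [])]


if __name__ == "__main__":
    archive = SessionArchive()
    archive.add_session("case-17", {"trainees": ["alice", "bob"], "date": "2007-06-01"})
    archive.add_session("case-23", {"trainees": ["alice"], "date": "2007-06-08"})
    print(len(archive.portfolio("alice")))  # -> 2
```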
- Referring again to FIG. 5, when the training session is completed (block 508), the simulation capture tool 36 detects the deactivation of the simulator data source, and turns off the A/V sources 18 and encoders/recorders 20 to end the recording (block 510). This also ends any live assessment sessions.
- As described above, training centers 12 typically hold debriefing sessions where the recorded training session is reviewed by one or more evaluators with the trainees as a group or individually. According to a further aspect of the exemplary embodiment, the
skills assessment tool 26 further includes a debrief tool 38 for enabling automated debrief sessions. - The
debrief tool 38 is invoked in response to the end-user, typically the trainer/evaluator, choosing to start a debrief session from the browser 44 (block 512). The debrief tool 38 allows the end-user to select which recorded training session to view, and which videos in the training session to view, and the selected videos are synchronously played back, as described above and as shown in FIGS. 4A-4C, at which point the end-user begins to review the training scenario with the trainee(s) (block 514). A debriefing session is similar to an annotation session in that all the video feeds and simulation data may be displayed on one screen in the browser 44, but the debriefing session does not include a checklist or other area for the viewers to enter annotations. The end-user may choose to display only certain components from the recorded training session and choose how the simulator data should be visualized. The end-user can jump to segments of interest (block 516), choose to review faculty/peer assessments (block 518), or perform the review based on just the simulator trend data (block 520). For example, the trainer may choose to display only certain of the video streams.
-
FIG. 7 is a block diagram illustrating a process performed by the debrief tool 38 and its interaction with the end-user for conducting an individual debriefing session between a trainer and a trainee. The process begins by the debrief tool 38 receiving the trainer's selection of a trainee's portfolio (block 700). In response, the debrief tool 38 retrieves all the annotations related to the trainee and combines them in real time into absolute and comparative reports from which the trainer may select to view (block 702). From the displayed reports, the trainer can identify problem areas (block 704), and then select from the interface of the debrief tool 38 a particular recorded training session to review with the trainee (block 706). Once the recorded training session is played back (block 714), the trainer is allowed to jump to any particular video clip. In addition, the trainer can show a timeline of events (block 708), show a list of related annotations and jump to a particular annotation point (block 710), and show a list of assessments related to the trainee (block 712), including any combination of faculty assessments, self-assessments, and peer assessments. The trainer then discusses the data with the trainee (block 716).
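A rough sketch of combining a trainee's assessment results into absolute and comparative reports (blocks 700-702) is shown below; the scoring scheme and cohort comparison are invented for the example and are not taken from the embodiment.

```python
# Sketch of combining per-session scores into absolute and comparative reports.
# The scoring scheme and cohort comparison are invented for the example.
from statistics import mean
from typing import Dict, List


def absolute_report(scores: List[float]) -> Dict[str, float]:
    return {"sessions": len(scores), "average": mean(scores), "best": max(scores)}


def comparative_report(trainee_scores: List[float],
                       cohort_scores: Dict[str, List[float]]) -> Dict[str, float]:
    cohort_avg = mean(s for scores in cohort_scores.values() for s in scores)
    return {
        "trainee_average": mean(trainee_scores),
        "cohort_average": cohort_avg,
        "delta": mean(trainee_scores) - cohort_avg,
    }


if __name__ == "__main__":
    cohort = {"alice": [82.0, 90.0], "bob": [70.0, 75.0], "carol": [88.0]}
    print(absolute_report(cohort["alice"]))
    print(comparative_report(cohort["alice"], cohort))
```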
- The combination of the annotation and assessment tool 40 and debrief tool 38 enables a discussion and review of a trainee's performance to be based on absolute and comparative metrics in combination with multiple evaluator assessments, all linked to video, thereby providing more objective feedback to the trainee and an overall improved training process.
- A method and system for providing synchronous multimedia recording and playback has been disclosed. The present invention has been described in accordance with the embodiments shown, and one of ordinary skill in the art will readily recognize that there could be variations to the embodiments, and any variations would be within the spirit and scope of the present invention. In addition, the embodiments can be implemented using hardware, software, a computer readable medium containing program instructions, or a combination thereof. In addition, although the
debrief tool 38 and the annotation and assessment tool 40 are shown as separate components, the functionality of each may be combined into a lesser or greater number of components.
- Software written according to the present invention is to be either stored in some form of computer-readable medium such as memory or CD-ROM, or is to be transmitted over a network, and is to be executed by a processor. Consequently, a computer-readable medium is intended to include a computer readable signal, which may be, for example, transmitted over a network. Accordingly, many modifications may be made by one of ordinary skill in the art without departing from the spirit and scope of the appended claims.
Claims (20)
1. A method for providing synchronous multimedia recording and playback, comprising:
in response to a training session being conducted, synchronously recording in real time simulator data from a simulator captured by a simulator capture tool, and video of the training session captured by a plurality of A/V sources;
encoding each of the videos captured by the plurality of A/V sources as respective digital media files formatted as streaming media; and
transmitting both the simulator data and the video media files from a server to a client over a network, such that when the client receives the simulator data and the stream, the respective videos are synchronously played back with the simulator data on the client.
2. The method of claim 1 wherein the streaming further comprises dynamically generating a single media file that references as sources the respective streaming media files, and streaming the single media file.
3. The method of claim 1 further comprising playing back the simulator data as a trend of the telemetry values along a timeline.
4. The method of claim 3 further comprising identifying transition points of the telemetry values during playback.
5. The method of claim 1 further comprising playing back the videos on the client in a streaming media player, and playing back the simulator data in a flash player.
6. The method of claim 1 further comprising allowing an end-user of the client to advance or return to any point in time in the synchronous playback of the videos.
7. The method of claim 1 further comprising streaming the synchronous playback in real-time during the training session.
8. The method of claim 1 further comprising enabling an end-user to enter annotation and assessment data while the videos and simulation data are synchronously played back.
9. The method of claim 8 further comprising during the synchronous playback, displaying a checklist of training tasks to be completed by a trainee during the training session, and allowing an end-user to indicate through the interface which tasks were completed during the synchronous playback for real-time annotation and assessment of the trainee.
10. A system for providing synchronous multimedia recording and playback, comprising:
a plurality of A/V sources for capturing video of a training session, the training session including a use of a simulator;
a simulation capture tool for capturing real-time simulator data from the simulator, wherein the simulator data and the video of the training session are recorded synchronously;
one or more encoders for encoding each of the videos captured by the plurality of A/V sources as respective digital media files formatted as streaming media; and
means for transmitting both the simulator data and the video media files from a server to a client over a network, such that when the client receives the simulator data and the stream, the respective videos are synchronously played back with the simulator data on the client.
11. The system of claim 10 wherein the means for transmitting include a skills assessment tool and a video on demand server.
12. The system of claim 11 wherein the skills assessment tool includes:
an annotation and assessment tool for enabling an end-user to enter annotation and assessment data of a trainee's performance while the videos and simulation data are synchronously played back; and
a debrief tool for allowing the end-user to select which recorded training session to view for enabling automated debrief sessions.
13. The system of claim 10 wherein a single media file is dynamically generated that references as sources the respective streaming media files, and the single media file is streamed.
14. The system of claim 10 wherein the simulator data is played back as a trend of the telemetry values along a timeline.
15. The system of claim 14 wherein transition points of the telemetry values are identified during playback.
16. The system of claim 10 wherein the videos are played back on the client in a streaming media player, and the simulator data is played back in a flash player.
17. The system of claim 10 wherein an end-user of the client is allowed to advance or return to any point in time in the synchronous playback of the videos.
18. The system of claim 10 wherein the synchronous playback is streamed in real-time during the training session.
19. The system of claim 10 wherein during the synchronous playback, a checklist of training tasks to be completed by a trainee during the training session is displayed, and an end-user is allowed to indicate through the interface which tasks were completed during the synchronous playback for real-time annotation and assessment of the trainee.
20. An executable software product stored on a computer-readable medium containing program instructions for providing synchronous multimedia recording and playback, wherein in response to a training session being conducted, video of the training session is captured by a plurality of A/V sources, the program instructions for:
synchronously recording with the video, real-time simulator data from a simulator captured by a simulator capture tool;
encoding each of the videos captured by the plurality of A/V sources as respective digital media files formatted as streaming media; and
transmitting both the simulator data and the video media files from a server to a client over a network, such that when the client receives the simulator data and the stream, the respective videos are synchronously played back with the simulator data on the client.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US12/116,472 US20080261192A1 (en) | 2006-12-15 | 2008-05-07 | Synchronous multi-media recording and playback with end user control of time, data, and event visualization for playback control over a network |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/611,792 US8113844B2 (en) | 2006-12-15 | 2006-12-15 | Method, system, and computer-readable recording medium for synchronous multi-media recording and playback with end user control of time, data, and event visualization for playback control over a network |
US12/116,472 US20080261192A1 (en) | 2006-12-15 | 2008-05-07 | Synchronous multi-media recording and playback with end user control of time, data, and event visualization for playback control over a network |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/611,792 Continuation US8113844B2 (en) | 2006-12-15 | 2006-12-15 | Method, system, and computer-readable recording medium for synchronous multi-media recording and playback with end user control of time, data, and event visualization for playback control over a network |
Publications (1)
Publication Number | Publication Date |
---|---|
US20080261192A1 true US20080261192A1 (en) | 2008-10-23 |
Family
ID=39527754
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/611,792 Active 2030-01-06 US8113844B2 (en) | 2006-12-15 | 2006-12-15 | Method, system, and computer-readable recording medium for synchronous multi-media recording and playback with end user control of time, data, and event visualization for playback control over a network |
US12/116,472 Abandoned US20080261192A1 (en) | 2006-12-15 | 2008-05-07 | Synchronous multi-media recording and playback with end user control of time, data, and event visualization for playback control over a network |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/611,792 Active 2030-01-06 US8113844B2 (en) | 2006-12-15 | 2006-12-15 | Method, system, and computer-readable recording medium for synchronous multi-media recording and playback with end user control of time, data, and event visualization for playback control over a network |
Country Status (1)
Country | Link |
---|---|
US (2) | US8113844B2 (en) |
Cited By (56)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090106003A1 (en) * | 2007-10-23 | 2009-04-23 | Universal Systems And Technology, Inc. | System, method and apparatus for management of simulations |
US20090319916A1 (en) * | 2008-06-24 | 2009-12-24 | Microsoft Corporation | Techniques to auto-attend multimedia conference events |
US20100030785A1 (en) * | 2005-07-12 | 2010-02-04 | Wilson Andrew S | Distributed capture and aggregation of dynamic application usage information |
US20110131208A1 (en) * | 2009-09-23 | 2011-06-02 | Verint Systems Ltd. | Systems and methods for large-scale link analysis |
US20110142217A1 (en) * | 2009-12-10 | 2011-06-16 | Verint Systems Ltd. | Methods and systems for mass link analysis using rule engines |
US20110238723A1 (en) * | 2010-01-31 | 2011-09-29 | Verint Systems Ltd. | Systems and methods for web decoding |
US20120215507A1 (en) * | 2011-02-22 | 2012-08-23 | Utah State University | Systems and methods for automated assessment within a virtual environment |
US8364147B2 (en) | 2010-04-28 | 2013-01-29 | Verint Americas, Inc. | System and method for determining commonly used communication terminals and for identifying noisy entities in large-scale link analysis |
US8665728B2 (en) | 2010-10-31 | 2014-03-04 | Verint Systems, Ltd. | System and method for IP target traffic analysis |
US8681640B2 (en) | 2010-06-08 | 2014-03-25 | Verint Systems, Ltd. | Systems and methods for extracting media from network traffic having unknown protocols |
US8745647B1 (en) * | 2006-12-26 | 2014-06-03 | Visible Measures Corp. | Method and system for internet video and rich media behavioral measurement |
US8767551B2 (en) | 2011-01-27 | 2014-07-01 | Verint Systems, Ltd. | System and method for flow table management |
WO2014110280A2 (en) | 2013-01-11 | 2014-07-17 | Zoll Medical Corporation | Ems decision support interface, event history, and related tools |
US20140308631A1 (en) * | 2008-07-28 | 2014-10-16 | Breakthrough Performancetech, Llc | Systems and methods for computerized interactive skill training |
US8959329B2 (en) | 2011-04-14 | 2015-02-17 | Verint Sytems, Ltd.. | System and method for selective inspection of encrypted traffic |
US8959025B2 (en) | 2010-04-28 | 2015-02-17 | Verint Systems Ltd. | System and method for automatic identification of speech coding scheme |
US8990238B2 (en) | 2011-04-27 | 2015-03-24 | Verint Systems Ltd. | System and method for keyword spotting using multiple character encoding schemes |
US9060029B2 (en) | 2011-10-31 | 2015-06-16 | Verint Systems Ltd. | System and method for target profiling using social network analysis |
US9223848B2 (en) | 2011-10-31 | 2015-12-29 | Verint Systems Ltd. | System and method of combined database system |
US9253261B2 (en) | 2011-07-31 | 2016-02-02 | Verint Systems Ltd. | System and method for main page identification in web decoding |
US9264446B2 (en) | 2011-01-27 | 2016-02-16 | Verint Systems Ltd. | System and method for efficient classification and processing of network traffic |
US9363667B2 (en) | 2012-10-21 | 2016-06-07 | Verint Systems Ltd. | System and method for user-privacy-aware communication monitoring and analysis |
US9386028B2 (en) | 2012-10-23 | 2016-07-05 | Verint Systems Ltd. | System and method for malware detection using multidimensional feature clustering |
US9491069B2 (en) | 2012-07-29 | 2016-11-08 | Verint Systems Ltd. | System and method of high volume rule engine |
US9497167B2 (en) | 2012-07-29 | 2016-11-15 | Verint Systems Ltd. | System and method for automatic provisioning of multi-stage rule-based traffic filtering |
US9589073B2 (en) | 2013-04-28 | 2017-03-07 | Verint Systems Ltd. | Systems and methods for keyword spotting using adaptive management of multiple pattern matching algorithms |
US9628580B2 (en) | 2013-10-30 | 2017-04-18 | Verint Systems Ltd. | System and method for conditional analysis of network traffic |
US9639520B2 (en) | 2013-01-29 | 2017-05-02 | Verint Systems Ltd. | System and method for keyword spotting using representative dictionary |
US9641444B2 (en) | 2014-01-30 | 2017-05-02 | Verint Systems Ltd. | System and method for extracting user identifiers over encrypted communication traffic |
US9646245B2 (en) | 2012-10-29 | 2017-05-09 | Verint Systems Ltd. | System and method for identifying contacts of a target user in a social network |
US9692730B2 (en) | 2011-01-27 | 2017-06-27 | Verint Systems Ltd. | System and method for decoding traffic over proxy servers |
US9690873B2 (en) | 2013-01-31 | 2017-06-27 | Verint Systems Ltd. | System and method for bit-map based keyword spotting in communication traffic |
US20170221372A1 (en) * | 2007-01-30 | 2017-08-03 | Breakthrough Performancetech, Llc | Systems and methods for computerized interactive skill training |
US9740915B2 (en) | 2011-10-31 | 2017-08-22 | Verint Systems Ltd. | System and method for link analysis based on image processing |
US9767279B2 (en) | 2012-04-23 | 2017-09-19 | Verint Systems Ltd. | Systems and methods for combined physical and cyber data security |
US9785701B2 (en) | 2014-01-30 | 2017-10-10 | Verint Systems Ltd. | Systems and methods for keyword spotting using alternating search algorithms |
US9871715B2 (en) | 2013-07-04 | 2018-01-16 | Verint Systems Ltd. | System and method for automated generation of web decoding templates |
US9923913B2 (en) | 2013-06-04 | 2018-03-20 | Verint Systems Ltd. | System and method for malware detection learning |
US10061922B2 (en) | 2012-04-30 | 2018-08-28 | Verint Systems Ltd. | System and method for malware detection |
US10142426B2 (en) | 2015-03-29 | 2018-11-27 | Verint Systems Ltd. | System and method for identifying communication session participants based on traffic patterns |
US10298622B2 (en) | 2012-07-29 | 2019-05-21 | Verint Systems Ltd. | System and method for passive decoding of social network activity using replica database |
US10491609B2 (en) | 2016-10-10 | 2019-11-26 | Verint Systems Ltd. | System and method for generating data sets for learning to identify user actions |
US10546008B2 (en) | 2015-10-22 | 2020-01-28 | Verint Systems Ltd. | System and method for maintaining a dynamic dictionary |
US10560842B2 (en) | 2015-01-28 | 2020-02-11 | Verint Systems Ltd. | System and method for combined network-side and off-air monitoring of wireless networks |
US10614107B2 (en) | 2015-10-22 | 2020-04-07 | Verint Systems Ltd. | System and method for keyword searching using both static and dynamic dictionaries |
US10630588B2 (en) | 2014-07-24 | 2020-04-21 | Verint Systems Ltd. | System and method for range matching |
US10958613B2 (en) | 2018-01-01 | 2021-03-23 | Verint Systems Ltd. | System and method for identifying pairs of related application users |
US10972558B2 (en) | 2017-04-30 | 2021-04-06 | Verint Systems Ltd. | System and method for tracking users of computer applications |
US10999295B2 (en) | 2019-03-20 | 2021-05-04 | Verint Systems Ltd. | System and method for de-anonymizing actions and messages on networks |
US11138617B2 (en) | 2014-04-28 | 2021-10-05 | Verint Systems Ltd. | System and method for demographic profiling of mobile terminal users based on network-centric estimation of installed mobile applications and their usage patterns |
US11381977B2 (en) | 2016-04-25 | 2022-07-05 | Cognyte Technologies Israel Ltd. | System and method for decrypting communication exchanged on a wireless local area network |
US11399016B2 (en) | 2019-11-03 | 2022-07-26 | Cognyte Technologies Israel Ltd. | System and method for identifying exchanges of encrypted communication traffic |
US11403559B2 (en) | 2018-08-05 | 2022-08-02 | Cognyte Technologies Israel Ltd. | System and method for using a user-action log to learn to classify encrypted traffic |
US11575625B2 (en) | 2017-04-30 | 2023-02-07 | Cognyte Technologies Israel Ltd. | System and method for identifying relationships between users of computer applications |
US11925439B2 (en) | 2018-10-23 | 2024-03-12 | Zoll Medical Corporation | Data playback interface for a medical device |
US12073928B2 (en) | 2019-03-22 | 2024-08-27 | Zoll Medical Corporation | Handling of age transmitted data in medical device system |
Families Citing this family (49)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7165927B2 (en) | 2002-06-19 | 2007-01-23 | Brooks Automation, Inc. | Automated material handling system for semiconductor manufacturing based on a combination of vertical carousels and overhead hoists |
WO2004034438A2 (en) | 2002-10-11 | 2004-04-22 | Brooks Automation, Inc. | Access to one or more levels of material storage shelves by an overhead hoist transport vehicle from a single track position |
US10878646B2 (en) | 2005-12-08 | 2020-12-29 | Smartdrive Systems, Inc. | Vehicle event recorder systems |
US20070150138A1 (en) | 2005-12-08 | 2007-06-28 | James Plante | Memory management in event recording systems |
US9201842B2 (en) | 2006-03-16 | 2015-12-01 | Smartdrive Systems, Inc. | Vehicle event recorder systems and networks having integrated cellular wireless communications systems |
US8996240B2 (en) | 2006-03-16 | 2015-03-31 | Smartdrive Systems, Inc. | Vehicle event recorders with integrated web server |
WO2007109162A2 (en) * | 2006-03-17 | 2007-09-27 | Viddler, Inc. | Methods and systems for displaying videos with overlays and tags |
US8989959B2 (en) * | 2006-11-07 | 2015-03-24 | Smartdrive Systems, Inc. | Vehicle operator performance history recording, scoring and reporting systems |
US8649933B2 (en) | 2006-11-07 | 2014-02-11 | Smartdrive Systems Inc. | Power management systems for automotive video event recorders |
US8868288B2 (en) | 2006-11-09 | 2014-10-21 | Smartdrive Systems, Inc. | Vehicle exception event management systems |
US10410676B2 (en) | 2006-11-27 | 2019-09-10 | Kbport Llc | Portable tablet computer based multiple sensor mount having sensor input integration with real time user controlled commenting and flagging and method of using same |
US9640089B2 (en) * | 2009-09-15 | 2017-05-02 | Kbport Llc | Method and apparatus for multiple medical simulator integration |
US8239092B2 (en) | 2007-05-08 | 2012-08-07 | Smartdrive Systems Inc. | Distributed vehicle event recorder systems having a portable memory data transfer system |
CN101802731A (en) * | 2007-09-11 | 2010-08-11 | Abb公司 | A system and a computer implemented method for automatically displaying process information in an industrial control system |
US20110191809A1 (en) | 2008-01-30 | 2011-08-04 | Cinsay, Llc | Viral Syndicated Interactive Product System and Method Therefor |
US8312486B1 (en) | 2008-01-30 | 2012-11-13 | Cinsay, Inc. | Interactive product placement system and method therefor |
US11227315B2 (en) | 2008-01-30 | 2022-01-18 | Aibuy, Inc. | Interactive product placement system and method therefor |
US8170976B2 (en) * | 2008-10-17 | 2012-05-01 | The Boeing Company | Assessing student performance and providing instructional mentoring |
US9495885B2 (en) * | 2008-12-26 | 2016-11-15 | Kbport Llc | Method and apparatus for illumination and recording of internal cavity of medical simulator and integrating simulation data |
US9418568B2 (en) | 2009-09-29 | 2016-08-16 | Advanced Training System Llc | System, method and apparatus for driver training system with dynamic mirrors |
US11875707B2 (en) | 2009-09-29 | 2024-01-16 | Advanced Training Systems, Inc. | System, method and apparatus for adaptive driver training |
US8469711B2 (en) | 2009-09-29 | 2013-06-25 | Advanced Training System Llc | System, method and apparatus for driver training of shifting |
US8699566B2 (en) * | 2010-01-27 | 2014-04-15 | International Business Machines Corporation | Adaptive and integrated visualization of spatiotemporal data from large-scale simulations |
US8683337B2 (en) * | 2010-06-09 | 2014-03-25 | Microsoft Corporation | Seamless playback of composite media |
US20120179039A1 (en) * | 2011-01-07 | 2012-07-12 | Laurent Pelissier | Methods and apparatus for producing video records of use of medical ultrasound imaging systems |
US9786193B2 (en) | 2011-09-01 | 2017-10-10 | L-3 Communications Corporation | Adaptive training system, method and apparatus |
WO2013033723A2 (en) | 2011-09-01 | 2013-03-07 | L-3 Communications Corporation | Adaptive training system, method and apparatus |
US9728228B2 (en) * | 2012-08-10 | 2017-08-08 | Smartdrive Systems, Inc. | Vehicle event playback apparatus and methods |
US20140047371A1 (en) * | 2012-08-10 | 2014-02-13 | Smartdrive Systems Inc. | Vehicle Event Playback Apparatus and Methods |
US10692591B2 (en) * | 2013-02-01 | 2020-06-23 | B-Line Medical, Llc | Apparatus, method and computer readable medium for tracking data and events |
US9501878B2 (en) | 2013-10-16 | 2016-11-22 | Smartdrive Systems, Inc. | Vehicle event playback apparatus and methods |
US9610955B2 (en) | 2013-11-11 | 2017-04-04 | Smartdrive Systems, Inc. | Vehicle fuel consumption monitor and feedback systems |
KR101475632B1 (en) * | 2013-12-20 | 2014-12-22 | 엘에스산전 주식회사 | Method for playing operating record data of ems |
EP3089139A4 (en) * | 2013-12-26 | 2017-06-14 | Japan Science And Technology Agency | Movement learning support device and movement learning support method |
US8892310B1 (en) | 2014-02-21 | 2014-11-18 | Smartdrive Systems, Inc. | System and method to detect execution of driving maneuvers |
JP6132801B2 (en) * | 2014-03-31 | 2017-05-24 | 富士フイルム株式会社 | Data output apparatus, method and program |
JP6196575B2 (en) * | 2014-03-31 | 2017-09-13 | 富士フイルム株式会社 | Data output apparatus, method and program |
US10693736B2 (en) * | 2014-10-16 | 2020-06-23 | International Business Machines Corporation | Real time simulation monitoring |
US9663127B2 (en) | 2014-10-28 | 2017-05-30 | Smartdrive Systems, Inc. | Rail vehicle event detection and recording system |
US11069257B2 (en) | 2014-11-13 | 2021-07-20 | Smartdrive Systems, Inc. | System and method for detecting a vehicle event and generating review criteria |
US9679420B2 (en) | 2015-04-01 | 2017-06-13 | Smartdrive Systems, Inc. | Vehicle event recording system and method |
US20170116870A1 (en) * | 2015-10-21 | 2017-04-27 | Duolingo, Inc. | Automatic test personalization |
GB2555377A (en) * | 2016-10-13 | 2018-05-02 | Thermoteknix Systems Ltd | Monitoring system with interactive display interface |
WO2018118858A1 (en) | 2016-12-19 | 2018-06-28 | National Board Of Medical Examiners | Medical training and performance assessment instruments, methods, and systems |
US11462121B2 (en) * | 2017-02-15 | 2022-10-04 | Cae Inc. | Visualizing sub-systems of a virtual simulated element in an interactive computer simulation system |
US10755748B2 (en) | 2017-12-28 | 2020-08-25 | Sling Media L.L.C. | Systems and methods for producing annotated class discussion videos including responsive post-production content |
CN113272910A (en) * | 2018-11-15 | 2021-08-17 | 直观外科手术操作公司 | Training a user using an index to a motion picture |
FR3088751B1 (en) * | 2018-11-16 | 2021-02-12 | Blade | PROCESS FOR CAPTURING AND BROADCASTING A USER'S COMPUTER SESSION |
CN115412754A (en) * | 2022-08-16 | 2022-11-29 | 郑州小鸟信息科技有限公司 | Method for synchronously recording and replaying scenes of multiple signal sources based on same time axis |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5077666A (en) * | 1988-11-07 | 1991-12-31 | Emtek Health Care Systems, Inc. | Medical information system with automatic updating of task list in response to charting interventions on task list window into an associated form |
US5441047A (en) * | 1992-03-25 | 1995-08-15 | David; Daniel | Ambulatory patient health monitoring techniques utilizing interactive visual communication |
US5769640A (en) * | 1992-12-02 | 1998-06-23 | Cybernet Systems Corporation | Method and system for simulating medical procedures including virtual reality and control method and system for use therein |
US5553609A (en) * | 1995-02-09 | 1996-09-10 | Visiting Nurse Service, Inc. | Intelligent remote visual monitoring system for home health care service |
US5882206A (en) * | 1995-03-29 | 1999-03-16 | Gillio; Robert G. | Virtual surgery system |
US20020002562A1 (en) * | 1995-11-03 | 2002-01-03 | Thomas P. Moran | Computer controlled display system using a graphical replay device to control playback of temporal data representing collaborative activities |
US6430997B1 (en) * | 1995-11-06 | 2002-08-13 | Trazer Technologies, Inc. | System and method for tracking and assessing movement skills in multidimensional space |
US6077082A (en) * | 1998-02-02 | 2000-06-20 | Mitsubishi Electric Information Technology Center America, Inc. (Ita) | Personal patient simulation |
US7860725B2 (en) * | 1998-05-26 | 2010-12-28 | Ineedmd.Com, Inc. | Method for remote medical consultation and care |
US7134074B2 (en) * | 1998-12-25 | 2006-11-07 | Matsushita Electric Industrial Co., Ltd. | Data processing method and storage medium, and program for causing computer to execute the data processing method |
US6739877B2 (en) * | 2001-03-06 | 2004-05-25 | Medical Simulation Corporation | Distributive processing simulation method and system for training healthcare teams |
US7231135B2 (en) * | 2001-05-18 | 2007-06-12 | Pentax Of American, Inc. | Computer-based video recording and management system for medical diagnostic equipment |
US20030105558A1 (en) * | 2001-11-28 | 2003-06-05 | Steele Robert C. | Multimedia racing experience system and corresponding experience based displays |
US6957392B2 (en) * | 2002-01-16 | 2005-10-18 | Laszlo Systems, Inc. | Interface engine providing a continuous user interface |
CN100379391C (en) * | 2002-05-07 | 2008-04-09 | 国立大学法人京都大学 | Medical cockpit system |
US20050264472A1 (en) * | 2002-09-23 | 2005-12-01 | Rast Rodger H | Display methods and systems |
US7082572B2 (en) * | 2002-12-30 | 2006-07-25 | The Board Of Trustees Of The Leland Stanford Junior University | Methods and apparatus for interactive map-based analysis of digital video content |
US20040223054A1 (en) * | 2003-05-06 | 2004-11-11 | Rotholtz Ben Aaron | Multi-purpose video surveillance |
US8393905B2 (en) * | 2004-12-17 | 2013-03-12 | Board Of Supervisors Of Louisiana State University And Agricultural And Mechanical College | Medical simulation computer system |
US7479967B2 (en) * | 2005-04-11 | 2009-01-20 | Systems Technology Inc. | System for combining virtual and real-time environments |
US7930419B2 (en) * | 2005-12-04 | 2011-04-19 | Turner Broadcasting System, Inc. | System and method for delivering video and audio content over a network |
2006
- 2006-12-15 US US11/611,792 patent/US8113844B2/en active Active
2008
- 2008-05-07 US US12/116,472 patent/US20080261192A1/en not_active Abandoned
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6370457B1 (en) * | 1996-03-12 | 2002-04-09 | Training Innovations Group, Llc | Debriefing systems and methods for retrieving and presenting multiple datastreams with time indication marks in time synchronism |
US7265663B2 (en) * | 2001-11-28 | 2007-09-04 | Trivinci Systems, Llc | Multimedia racing experience system |
Cited By (97)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100030785A1 (en) * | 2005-07-12 | 2010-02-04 | Wilson Andrew S | Distributed capture and aggregation of dynamic application usage information |
US8135827B2 (en) | 2005-07-12 | 2012-03-13 | Visible Measures Corp. | Distributed capture and aggregation of dynamic application usage information |
US8745647B1 (en) * | 2006-12-26 | 2014-06-03 | Visible Measures Corp. | Method and system for internet video and rich media behavioral measurement |
US20170221372A1 (en) * | 2007-01-30 | 2017-08-03 | Breakthrough Performancetech, Llc | Systems and methods for computerized interactive skill training |
US10152897B2 (en) * | 2007-01-30 | 2018-12-11 | Breakthrough Performancetech, Llc | Systems and methods for computerized interactive skill training |
US20090106003A1 (en) * | 2007-10-23 | 2009-04-23 | Universal Systems And Technology, Inc. | System, method and apparatus for management of simulations |
US20090319916A1 (en) * | 2008-06-24 | 2009-12-24 | Microsoft Corporation | Techniques to auto-attend multimedia conference events |
US20140308631A1 (en) * | 2008-07-28 | 2014-10-16 | Breakthrough Performancetech, Llc | Systems and methods for computerized interactive skill training |
US9495882B2 (en) * | 2008-07-28 | 2016-11-15 | Breakthrough Performancetech, Llc | Systems and methods for computerized interactive skill training |
US10127831B2 (en) * | 2008-07-28 | 2018-11-13 | Breakthrough Performancetech, Llc | Systems and methods for computerized interactive skill training |
US11227240B2 (en) | 2008-07-28 | 2022-01-18 | Breakthrough Performancetech, Llc | Systems and methods for computerized interactive skill training |
US11636406B2 (en) | 2008-07-28 | 2023-04-25 | Breakthrough Performancetech, Llc | Systems and methods for computerized interactive skill training |
US20170116881A1 (en) * | 2008-07-28 | 2017-04-27 | Breakthrough Performancetech, Llc | Systems and methods for computerized interactive skill training |
US9135630B2 (en) | 2009-09-23 | 2015-09-15 | Verint Systems Ltd. | Systems and methods for large-scale link analysis |
US20110131208A1 (en) * | 2009-09-23 | 2011-06-02 | Verint Systems Ltd. | Systems and methods for large-scale link analysis |
US20110142217A1 (en) * | 2009-12-10 | 2011-06-16 | Verint Systems Ltd. | Methods and systems for mass link analysis using rule engines |
US9154640B2 (en) | 2009-12-10 | 2015-10-06 | Verint Systems Ltd. | Methods and systems for mass link analysis using rule engines |
US20110238723A1 (en) * | 2010-01-31 | 2011-09-29 | Verint Systems Ltd. | Systems and methods for web decoding |
US8959025B2 (en) | 2010-04-28 | 2015-02-17 | Verint Systems Ltd. | System and method for automatic identification of speech coding scheme |
US8364147B2 (en) | 2010-04-28 | 2013-01-29 | Verint Americas, Inc. | System and method for determining commonly used communication terminals and for identifying noisy entities in large-scale link analysis |
US8509733B2 (en) | 2010-04-28 | 2013-08-13 | Verint Americas, Inc. | System and method for determining commonly used communication terminals and for identifying noisy entities in large-scale link analysis |
US9197523B2 (en) | 2010-06-08 | 2015-11-24 | Verint Systems Ltd. | Systems and methods for extracting media from network traffic having unknown protocols |
US8681640B2 (en) | 2010-06-08 | 2014-03-25 | Verint Systems, Ltd. | Systems and methods for extracting media from network traffic having unknown protocols |
US10547523B2 (en) | 2010-06-08 | 2020-01-28 | Verint Systems Ltd. | Systems and methods for extracting media from network traffic having unknown protocols |
US9203712B2 (en) | 2010-10-31 | 2015-12-01 | Verint Systems Ltd. | System and method for IP target traffic analysis |
US8665728B2 (en) | 2010-10-31 | 2014-03-04 | Verint Systems, Ltd. | System and method for IP target traffic analysis |
US10862869B2 (en) | 2011-01-27 | 2020-12-08 | Verint Systems Ltd. | System and method for decoding traffic over proxy servers |
US9929920B2 (en) | 2011-01-27 | 2018-03-27 | Verint Systems Ltd. | System and method for efficient classification and processing of network traffic |
US9264446B2 (en) | 2011-01-27 | 2016-02-16 | Verint Systems Ltd. | System and method for efficient classification and processing of network traffic |
US8767551B2 (en) | 2011-01-27 | 2014-07-01 | Verint Systems, Ltd. | System and method for flow table management |
US9692730B2 (en) | 2011-01-27 | 2017-06-27 | Verint Systems Ltd. | System and method for decoding traffic over proxy servers |
US10454790B2 (en) | 2011-01-27 | 2019-10-22 | Verint Systems Ltd | System and method for efficient classification and processing of network traffic |
US20120215507A1 (en) * | 2011-02-22 | 2012-08-23 | Utah State University | Systems and methods for automated assessment within a virtual environment |
US8959329B2 (en) | 2011-04-14 | 2015-02-17 | Verint Sytems, Ltd.. | System and method for selective inspection of encrypted traffic |
US8990238B2 (en) | 2011-04-27 | 2015-03-24 | Verint Systems Ltd. | System and method for keyword spotting using multiple character encoding schemes |
US11196820B2 (en) | 2011-07-31 | 2021-12-07 | Verint Systems Ltd. | System and method for main page identification in web decoding |
US10547691B2 (en) | 2011-07-31 | 2020-01-28 | Verint Systems Ltd. | System and method for main page identification in web decoding |
US9253261B2 (en) | 2011-07-31 | 2016-02-02 | Verint Systems Ltd. | System and method for main page identification in web decoding |
US9223848B2 (en) | 2011-10-31 | 2015-12-29 | Verint Systems Ltd. | System and method of combined database system |
US9740915B2 (en) | 2011-10-31 | 2017-08-22 | Verint Systems Ltd. | System and method for link analysis based on image processing |
US9060029B2 (en) | 2011-10-31 | 2015-06-16 | Verint Systems Ltd. | System and method for target profiling using social network analysis |
US9767279B2 (en) | 2012-04-23 | 2017-09-19 | Verint Systems Ltd. | Systems and methods for combined physical and cyber data security |
US11316878B2 (en) | 2012-04-30 | 2022-04-26 | Cognyte Technologies Israel Ltd. | System and method for malware detection |
US10061922B2 (en) | 2012-04-30 | 2018-08-28 | Verint Systems Ltd. | System and method for malware detection |
US9497167B2 (en) | 2012-07-29 | 2016-11-15 | Verint Systems Ltd. | System and method for automatic provisioning of multi-stage rule-based traffic filtering |
US9491069B2 (en) | 2012-07-29 | 2016-11-08 | Verint Systems Ltd. | System and method of high volume rule engine |
US10298622B2 (en) | 2012-07-29 | 2019-05-21 | Verint Systems Ltd. | System and method for passive decoding of social network activity using replica database |
US10079933B2 (en) | 2012-10-21 | 2018-09-18 | Verint Systems Ltd. | System and method for user-privacy-aware communication monitoring and analysis |
US9363667B2 (en) | 2012-10-21 | 2016-06-07 | Verint Systems Ltd. | System and method for user-privacy-aware communication monitoring and analysis |
US9386028B2 (en) | 2012-10-23 | 2016-07-05 | Verint Systems Ltd. | System and method for malware detection using multidimensional feature clustering |
US9646245B2 (en) | 2012-10-29 | 2017-05-09 | Verint Systems Ltd. | System and method for identifying contacts of a target user in a social network |
US10866998B2 (en) | 2012-10-29 | 2020-12-15 | Verint Systems Ltd. | System and method for identifying contacts of a target user in a social network |
CN110277160A (en) * | 2013-01-11 | 2019-09-24 | 卓尔医学产品公司 | Code checks the system and decision support method of medical events |
US10976908B2 (en) | 2013-01-11 | 2021-04-13 | Zoll Medical Corporation | EMS decision support interface, event history, and related tools |
WO2014110280A2 (en) | 2013-01-11 | 2014-07-17 | Zoll Medical Corporation | Ems decision support interface, event history, and related tools |
US11816322B2 (en) | 2013-01-11 | 2023-11-14 | Zoll Medical Corporation | EMS decision support interface, event history, and related tools |
EP2943926A4 (en) * | 2013-01-11 | 2018-05-23 | Zoll Medical Corporation | Ems decision support interface, event history, and related tools |
US10198427B2 (en) | 2013-01-29 | 2019-02-05 | Verint Systems Ltd. | System and method for keyword spotting using representative dictionary |
US9639520B2 (en) | 2013-01-29 | 2017-05-02 | Verint Systems Ltd. | System and method for keyword spotting using representative dictionary |
US9690873B2 (en) | 2013-01-31 | 2017-06-27 | Verint Systems Ltd. | System and method for bit-map based keyword spotting in communication traffic |
US9589073B2 (en) | 2013-04-28 | 2017-03-07 | Verint Systems Ltd. | Systems and methods for keyword spotting using adaptive management of multiple pattern matching algorithms |
US9923913B2 (en) | 2013-06-04 | 2018-03-20 | Verint Systems Ltd. | System and method for malware detection learning |
US11038907B2 (en) | 2013-06-04 | 2021-06-15 | Verint Systems Ltd. | System and method for malware detection learning |
US9871715B2 (en) | 2013-07-04 | 2018-01-16 | Verint Systems Ltd. | System and method for automated generation of web decoding templates |
US11038789B2 (en) | 2013-07-04 | 2021-06-15 | Verint Systems Ltd. | System and method for automated generation of web decoding templates |
US9628580B2 (en) | 2013-10-30 | 2017-04-18 | Verint Systems Ltd. | System and method for conditional analysis of network traffic |
US10084876B2 (en) | 2013-10-30 | 2018-09-25 | Verint Systems Ltd. | System and method for conditional analysis of network traffic |
US9641444B2 (en) | 2014-01-30 | 2017-05-02 | Verint Systems Ltd. | System and method for extracting user identifiers over encrypted communication traffic |
US9785701B2 (en) | 2014-01-30 | 2017-10-10 | Verint Systems Ltd. | Systems and methods for keyword spotting using alternating search algorithms |
US10719540B2 (en) | 2014-01-30 | 2020-07-21 | Verint Systems Ltd. | Systems and methods for keyword spotting using alternating search algorithms |
US11138617B2 (en) | 2014-04-28 | 2021-10-05 | Verint Systems Ltd. | System and method for demographic profiling of mobile terminal users based on network-centric estimation of installed mobile applications and their usage patterns |
US11463360B2 (en) | 2014-07-24 | 2022-10-04 | Cognyte Technologies Israel Ltd. | System and method for range matching |
US10630588B2 (en) | 2014-07-24 | 2020-04-21 | Verint Systems Ltd. | System and method for range matching |
US11432139B2 (en) | 2015-01-28 | 2022-08-30 | Cognyte Technologies Israel Ltd. | System and method for combined network-side and off-air monitoring of wireless networks |
US10560842B2 (en) | 2015-01-28 | 2020-02-11 | Verint Systems Ltd. | System and method for combined network-side and off-air monitoring of wireless networks |
US10142426B2 (en) | 2015-03-29 | 2018-11-27 | Verint Systems Ltd. | System and method for identifying communication session participants based on traffic patterns |
US10623503B2 (en) | 2015-03-29 | 2020-04-14 | Verint Systems Ltd. | System and method for identifying communication session participants based on traffic patterns |
US10614107B2 (en) | 2015-10-22 | 2020-04-07 | Verint Systems Ltd. | System and method for keyword searching using both static and dynamic dictionaries |
US11093534B2 (en) | 2015-10-22 | 2021-08-17 | Verint Systems Ltd. | System and method for keyword searching using both static and dynamic dictionaries |
US11386135B2 (en) | 2015-10-22 | 2022-07-12 | Cognyte Technologies Israel Ltd. | System and method for maintaining a dynamic dictionary |
US10546008B2 (en) | 2015-10-22 | 2020-01-28 | Verint Systems Ltd. | System and method for maintaining a dynamic dictionary |
US11381977B2 (en) | 2016-04-25 | 2022-07-05 | Cognyte Technologies Israel Ltd. | System and method for decrypting communication exchanged on a wireless local area network |
US10491609B2 (en) | 2016-10-10 | 2019-11-26 | Verint Systems Ltd. | System and method for generating data sets for learning to identify user actions |
US11303652B2 (en) | 2016-10-10 | 2022-04-12 | Cognyte Technologies Israel Ltd | System and method for generating data sets for learning to identify user actions |
US10944763B2 (en) | 2016-10-10 | 2021-03-09 | Verint Systems, Ltd. | System and method for generating data sets for learning to identify user actions |
US11575625B2 (en) | 2017-04-30 | 2023-02-07 | Cognyte Technologies Israel Ltd. | System and method for identifying relationships between users of computer applications |
US11095736B2 (en) | 2017-04-30 | 2021-08-17 | Verint Systems Ltd. | System and method for tracking users of computer applications |
US10972558B2 (en) | 2017-04-30 | 2021-04-06 | Verint Systems Ltd. | System and method for tracking users of computer applications |
US11336738B2 (en) | 2017-04-30 | 2022-05-17 | Cognyte Technologies Israel Ltd. | System and method for tracking users of computer applications |
US11336609B2 (en) | 2018-01-01 | 2022-05-17 | Cognyte Technologies Israel Ltd. | System and method for identifying pairs of related application users |
US10958613B2 (en) | 2018-01-01 | 2021-03-23 | Verint Systems Ltd. | System and method for identifying pairs of related application users |
US11403559B2 (en) | 2018-08-05 | 2022-08-02 | Cognyte Technologies Israel Ltd. | System and method for using a user-action log to learn to classify encrypted traffic |
US11925439B2 (en) | 2018-10-23 | 2024-03-12 | Zoll Medical Corporation | Data playback interface for a medical device |
US10999295B2 (en) | 2019-03-20 | 2021-05-04 | Verint Systems Ltd. | System and method for de-anonymizing actions and messages on networks |
US11444956B2 (en) | 2019-03-20 | 2022-09-13 | Cognyte Technologies Israel Ltd. | System and method for de-anonymizing actions and messages on networks |
US12073928B2 (en) | 2019-03-22 | 2024-08-27 | Zoll Medical Corporation | Handling of age transmitted data in medical device system |
US11399016B2 (en) | 2019-11-03 | 2022-07-26 | Cognyte Technologies Israel Ltd. | System and method for identifying exchanges of encrypted communication traffic |
Also Published As
Publication number | Publication date |
---|---|
US20080145830A1 (en) | 2008-06-19 |
US8113844B2 (en) | 2012-02-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8113844B2 (en) | Method, system, and computer-readable recording medium for synchronous multi-media recording and playback with end user control of time, data, and event visualization for playback control over a network | |
US20080145829A1 (en) | Synchronous multi-media recording and playback with end user control of time, data, and event visualization for playback control over a network | |
US20120206566A1 (en) | Methods and systems for relating to the capture of multimedia content of observed persons performing a task for evaluation | |
Roshier et al. | Veterinary students' usage and perception of video teaching resources | |
CN103325400B (en) | A kind of surgical operation images recorded broadcast teaching system and method | |
US20130212507A1 (en) | Methods and systems for aligning items of evidence to an evaluation framework | |
US20130212521A1 (en) | Methods and systems for use with an evaluation workflow for an evidence-based evaluation | |
Weinger et al. | Video capture of clinical care to enhance patient safety | |
US20190172493A1 (en) | Generating video-notes from videos using machine learning | |
Dickson et al. | Student reactions to classroom lecture capture | |
WO2010144920A1 (en) | System for sequential juxtaposition of separately recorded scenes | |
Gorissen et al. | Usage reporting on recorded lectures using educational data mining | |
KR20060035729A (en) | Methods and systems for presenting and recording class sessions in a virtual classroom | |
US20070166689A1 (en) | Checklist builder and reporting for skills assessment tool | |
Chang | Constructing a streaming video-based learning forum for collaborative learning | |
US20190026006A1 (en) | System and method for presenting video and associated documents and for tracking viewing thereof | |
Viel et al. | Multimedia multi-device educational presentations preserved as interactive multi-video objects | |
Viel et al. | Interaction with a Problem Solving Multi Video Lecture: Observing Students from Distance and Traditional Learning Courses. | |
Elliot et al. | Digital video technology and production 101: lights, camera, action | |
US20180374376A1 (en) | Methods and systems of facilitating training based on media | |
Boronat et al. | PAMTEL-RT: web-based multimedia platform for tele-assistance of pediatric health emergencies in real time in training centers | |
KR101419655B1 (en) | The system to evaluate training with bookmark and the method of it | |
O’Donoghue et al. | The role of live video capture production in the development of student communication skills | |
Mackenzie et al. | Video analysis: an approach for use in healthcare | |
Shi et al. | 12 Tips for Creating High Impact Clinical Encounter Videos-with Technical Pointers |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: EXPRESSLY ABANDONED -- DURING EXAMINATION |