
WO2017214400A1 - Networked apparatus for real-time visual integration of digital video with telemetry data feeds - Google Patents


Info

Publication number
WO2017214400A1
Authority
WO
WIPO (PCT)
Prior art keywords
video
telemetry data
integrated
wireless device
data
Application number
PCT/US2017/036558
Other languages
French (fr)
Inventor
Gamaliel AGUILAR-GAMEZ
Manu YARESHIMI
Original Assignee
9104 Studios Llc
Application filed by 9104 Studios Llc
Publication of WO2017214400A1


Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 - Details of television systems
    • H04N 5/44 - Receiver circuitry for the reception of television signals according to analogue transmission standards
    • H04N 5/445 - Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/4302 - Content synchronisation processes, e.g. decoder synchronisation
    • H04N 21/4307 - Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen
    • H04N 21/43072 - Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen of multiple content streams on the same device
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 16/00 - Information retrieval; Database structures therefor; File system structures therefor
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/20 - Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N 21/27 - Server based end-user applications
    • H04N 21/274 - Storing end-user multimedia data in response to end-user request, e.g. network recorder
    • H04N 21/2743 - Video hosting of uploaded data from client
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 - Structure of client; Structure of client peripherals
    • H04N 21/422 - Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N 21/42201 - Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]; biosensors, e.g. heat sensor for presence detection, EEG sensors or any limb activity sensors worn by the user
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 - Structure of client; Structure of client peripherals
    • H04N 21/422 - Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N 21/42202 - Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]; environmental sensors, e.g. for detecting temperature, luminosity, pressure, earthquakes
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/41 - Structure of client; Structure of client peripherals
    • H04N 21/422 - Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N 21/4223 - Cameras
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 - Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 - Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/437 - Interfacing the upstream path of the transmission network, e.g. for transmitting client requests to a VOD server
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 - Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/80 - Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N 21/81 - Monomedia components thereof
    • H04N 21/8126 - Monomedia components thereof involving additional data, e.g. news, sports, stocks, weather forecasts
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 11/00 - Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C 11/02 - Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00 - Details of television systems
    • H04N 5/04 - Synchronising

Definitions

  • TITLE: NETWORKED APPARATUS FOR REAL-TIME VISUAL INTEGRATION OF DIGITAL VIDEO WITH TELEMETRY DATA FEEDS
  • The illustrative embodiments relate to digital video management. More specifically, but not exclusively, the illustrative embodiments relate to a system, method, network, and apparatus for integrating video and telemetry data captured from multiple sources, devices, or sensors into a single stream.
  • Enhanced wireless communications have also made communication of digital video content more standard. For example, video captured by drones, vehicles, sport or helmet cameras, or body cameras is becoming very common. More users are also capturing associated data, such as weather, temperature, vehicle performance data, and biometric data. Mechanical and biometric sensors are being successfully integrated in clothing, wearables, personal electronic devices, and vehicles. As users capture more and longer video and more associated sensor data, real-time video management may become more tedious and overwhelming or even impossible. This is especially true for those with limited resources.
  • One embodiment provides a system, device, apparatus, network, and method for combining video with telemetry data.
  • The video is received at a wireless device from a camera associated with a user.
  • Telemetry data associated with the video is received at the wireless device.
  • The telemetry data is time stamped as received.
  • The video is overlaid with the telemetry data, utilizing the wireless device, to generate integrated video.
  • The integrated video is communicated from the wireless device to one or more users.
  • A wireless device includes a processor for executing a set of instructions and a memory storing the set of instructions. The set of instructions is executed by the processor to implement the method described above.
  • Another embodiment provides a wireless device integrating video and telemetry data.
  • The wireless device includes a processor executing a set of instructions.
  • The wireless device further includes a transceiver in communication with a video source and one or more data sources.
  • The wireless device further includes a memory storing the set of instructions.
  • The set of instructions is executed to receive the video from the video source, receive telemetry data associated with the video from the one or more data sources, time stamp the telemetry data as received, overlay the video with the telemetry data to generate integrated video, and communicate the integrated video to one or more users, as sketched below.
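  • As a rough illustration only (the patent publishes no source code), the receive, time-stamp, overlay, and communicate loop above might be sketched in Python as follows; `video_source`, `telemetry_sources`, `sink`, and the `overlay` helper are hypothetical placeholders for device- and renderer-specific pieces.

```python
# Illustrative sketch of the integration loop, not the patented implementation.
import time
from dataclasses import dataclass, field

@dataclass
class TelemetrySample:
    name: str        # e.g., "heart_rate" or "speed" (illustrative names)
    value: float
    timestamp: float = field(default_factory=time.time)  # stamped on receipt

def integrate(video_source, telemetry_sources, sink):
    """Overlay the most recent telemetry onto each incoming video frame."""
    latest = {}  # most recent sample per telemetry stream
    for frame in video_source:                  # frames from the camera
        for source in telemetry_sources:
            sample = source.poll()              # hypothetical non-blocking read
            if sample is not None:
                sample.timestamp = time.time()  # time stamp as received
                latest[sample.name] = sample
        sink.send(overlay(frame, latest))       # communicate integrated video

def overlay(frame, samples):
    # Placeholder: draw `samples` onto `frame` (see the rendering sketch
    # later in this document) and return the integrated frame.
    return frame
```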
  • Another embodiment provides a system for receiving a video stream and telemetry data streams.
  • The sampling rates of the video stream and the telemetry data streams are determined.
  • A session start time or user-defined start time and date is utilized to perform an initial synchronization of the video stream with the telemetry data streams. Additional synchronization is performed to match a layer of telemetry data with each video frame.
  • The associated telemetry data is overlaid on each video frame to generate an integrated video stream.
  • The integrated video stream is communicated to one or more selected devices.
  • The telemetry data streams are received from a number of sensors with a number of different sampling rates.
  • The video stream represents a number of different video streams received from a number of different video sources utilizing a number of different formats and video frame rates.
  • The telemetry data streams and the video streams are synchronized for immediate, real-time, or delayed playback.
  • A user selection of the telemetry data to be overlaid on the video stream is received.
  • The video stream and telemetry data are stored separately so that the video stream can be overlaid with the telemetry data stream frame-by-frame when the integrated video stream is requested.
  • The integrated video stream is uploaded to share with one or more users or devices.
  • The integrated video is uploaded to a cloud platform accessible by a plurality of users.
  • The telemetry data stream is saved into an XML file; artifacts are drafted on a layer; data values are retrieved from the XML file for presentation; and the telemetry data stream is synchronized with the video utilizing time stamps and sample rates for the telemetry data.
  • the wireless device is a smart phone, tablet, or laptop.
  • the telemetry data stream is captured by one or more sensors associated with a user or a vehicle.
  • the one or more sensors may include biometric sensors associated with the user.
  • the sensors may include vehicle sensors.
  • the sensors may include environmental sensors.
  • the telemetry data stream is rendered as part of the integrated video stream in response to user preferences.
  • FIG. 1 is a pictorial representation of a communications environment 100 in accordance with an illustrative embodiment
  • FIG. 2 is a block diagram of a device integrating video and telemetry data in accordance with an illustrative embodiment
  • FIG. 3 is a flowchart of a process for overlaying video data with telemetry data in accordance with an illustrative embodiment
  • FIG. 4 is a flowchart of a process for processing integrated video in accordance with an illustrative embodiment
  • FIG. 5 is a flowchart of a process for sharing integrated video in accordance with an illustrative embodiment
  • FIGs. 6-12 are pictorial representations of integrated videos in accordance with an illustrative embodiment.
  • FIG. 13 is a pictorial representation of a computing device in accordance with an illustrative embodiment.
  • Video streams may be processed to synchronize real-time high-resolution raw format video with any number of digital data feeds processed as telemetry data.
  • the video streams or content may be received from any number of cameras or video capture devices.
  • the digital data feeds may correspond to any number of sensors (e.g., user, vehicle, environment, etc.), wireless devices, interfaces, input/output devices, components, systems, equipment, or data sources.
  • the illustrative embodiments may be utilized to automatically identify and synchronize telemetry data from multiple sources to be rendered as digital graphics in a digital video overlay. This synchronization of video and telemetry data may be performed in real-time, near real-time (slight processing and latency delays), or for subsequent playback or review.
  • The illustrative embodiments may be utilized for any number of applications and activities.
  • The video may be captured and then received from any number of devices, such as cameras or video capture devices produced by Nikon, Canon, Olympus, Ricoh, Leica, Panasonic, Sony, Apple, Samsung, GoPro, TomTom, Olfi, and others.
  • the telemetry data may be received from any number of data sources including, but not limited to, cameras, smartphones, smart watches, smart clothing, global positioning devices, heart rate monitors, speedometers, gauges (e.g., fuel, tachometer, temperature, pressure, speedometer, electronic, etc.), and so forth.
  • the video and the telemetry data is integrated together and formatted to generate integrated video.
  • the integrated video may be generated and communicated to any number of users or third parties utilizing a smartphone, tablet, laptop, desktop personal computer, gaming console, web page, applications (e.g., wireless app, desktop program, etc.), augmented reality system, virtual reality system (e.g., HoloLens, Google Glass, Oculus Rift, etc.), or other communications, computing, or input/output device or interface.
  • the illustrative embodiments may be utilized with any form of media distribution channels including social media applications, websites, and distribution systems.
  • third parties may also capture video or other data that may be utilized in the illustrative embodiments.
  • FIG. 1 is a pictorial representation of a communications environment 100 in accordance with an illustrative embodiment.
  • the communications environment 100 represents one or more locations, conditions, or scenarios in which the illustrative embodiments may be utilized and implemented.
  • A user 108 may be driving (or a passenger in) a vehicle 110, which may include a laptop 112.
  • the laptop 112 may communicate with any number of video and data capture devices 113, such as a camera 114, a thermometer 116, a timer 118, a heart rate monitor 120, and a global positioning system 122.
  • the vehicle 110 or the laptop 112 may communicate with a network 124 utilizing a wireless signal 126.
  • a user 128 may be riding a bicycle 130.
  • The bicycle 130 or user 128 may be carrying a wireless device 132, which may communicate with sensors 134, 136, 138.
  • The wireless device 132 may execute an application 140 for integrating video and telemetry data from the sensors 134, 136, 138.
  • The wireless device 132 may communicate with the network 124 through the wireless signal 142.
  • the communications environment 100 may also include a cloud network 150.
  • the cloud network 150 may include servers 152, databases 154, and a cloud platform 156.
  • the cloud network 150 may also communicate with third party resources 158.
  • The cloud platform 156 may include any number of components or modules, including the logic engine 160.
  • the communications environment 100 may be configured to stream, store, or record the integrated video (i.e., video and telemetry data) from any number of users simultaneously.
  • the video and telemetry data may be captured directly by one or more smart devices, such as laptops, cell phones, tablets, or so forth. The data may then be communicated directly to other devices or through one or more networks, such as the network 124.
  • the user 108 may represent a racecar driver and the vehicle 110 may represent a racecar.
  • The vehicle 110 may also represent any number of other motorized or non-motorized vehicles or transportation devices, such as bicycles, boats, motorcycles, airplanes, bobsleds, skis, kayaks, all-terrain vehicles, snowmobiles, trucks, or so forth.
  • The data capture devices 113 represent any number of physical, optical, or other sensors and capture components.
  • The data capture devices 113 may capture data relating to the user 108, the vehicle 110, or the environment of the user 108 and the vehicle 110.
  • The data capture devices 113 communicate with the laptop 112.
  • The data capture devices 113 may be connected utilizing any number of physical connectors, wires, cables, buses, or so forth, as well as any number of wireless communications protocols, standards, components, or systems.
  • The laptop 112 may communicate with the camera 114 utilizing a USB connector.
  • The laptop 112 may communicate with the camera 114 utilizing a Wi-Fi connection.
  • The laptop 112 may communicate with the thermometer 116, the timer 118, and the heart rate monitor 120 utilizing wireless communications, such as Bluetooth.
  • The global positioning system 122 may represent an application or components of the laptop 112 or an external device.
  • Any of the data capture devices 113 may represent integrated systems or subsystems of the vehicle 110, the laptop 112, or externally positioned devices.
  • the data capture devices 113 may include the camera 114.
  • the camera 114 may represent any number of video, image, or optical capture devices.
  • The camera 114 may be externally connected to the vehicle 110 utilizing any number of mounting mechanisms or components.
  • The camera 114 may include a camera housing containing a camera lens contained within a front surface of the camera body, various indicators on the surface of the camera housing (e.g., light emitting diodes, touch screens, etc.), input mechanisms (e.g., buttons, switches, toggles, scroll wheels, levers, etc.), and electronics (e.g., processor, light sensors, memories, power electronics, metadata sensors, etc.) internal to the camera housing for capturing images via the camera lens and performing other functions.
  • The camera 114 may automatically or manually zoom, focus, rotate, pivot, or move to capture the best video or images possible in the applicable environment and conditions.
  • The camera 114 may be controlled by the user 108 or any number of remotely located users (e.g., through signals sent to the camera 114 through the network 124).
  • the camera 114 may be configured to capture video utilizing any number of visible or non-visible spectra.
  • The camera 114 may be an integrated video system of the vehicle 110.
  • The camera 114 may also represent any number of cameras or a camera array (e.g., forward facing, rearward facing, side cameras, roof camera, undercarriage camera, etc.).
  • The video image may transition between any number of video streams or capture devices.
  • The camera 114 may represent a smart phone, tablet, or personal video device (e.g., GoPro, etc.).
  • the telemetry data read by the data capture devices 113 may be overlaid onto any number of video streams.
  • the telemetry data may be overlaid onto a single video stream at a time.
  • the telemetry data or different variants of the telemetry data may be simultaneously overlaid onto distinct video streams that may be communicated to one or more receiving or observing parties or devices.
  • The camera 114 may capture video utilizing any number of file formats.
  • the telemetry data may be integrated with the various types of video formats (e.g., Audio Video Interleave (AVI), flash video format (FLV), Advanced Systems Format (ASF), Windows Media Video (WMV), Apple QuickTime Movie (MOV), Moving Pictures Expert Group 4 (MP4), Advanced Video Coding High Definition (AVCHD), etc.), allowing any number, type, or combination of cameras, sensors, and capture devices to be utilized.
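  • As one hedged illustration of handling these heterogeneous formats through a single code path, the PyAV library (an assumed choice; the patent does not prescribe a decoder) wraps FFmpeg, so AVI, FLV, ASF, WMV, MOV, MP4, and AVCHD containers can all be decoded with the same call.

```python
# Sketch of format-agnostic video ingest; `av` is the PyAV package.
import av  # pip install av

def frames_with_times(path):
    """Yield (presentation_time_seconds, RGB array) for each video frame,
    regardless of the container format the camera produced."""
    with av.open(path) as container:
        for frame in container.decode(video=0):
            # frame.time is the presentation timestamp in seconds
            yield float(frame.time), frame.to_ndarray(format="rgb24")
```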
  • the data capture devices 113 may include one or more biometric sensors, including any number of wearable, implantable, or other sensory devices.
  • The biometric sensors may measure heart rate, heart rhythm, blood pressure, temperature, blood oxygenation, neural activity, blood or excretion content (e.g., adrenaline, blood, sugars, etc.), voice characteristics, and other applicable data. All or portions of the data measured by the data capture devices 113 may be integrated into a video stream.
  • The data capture devices 113 may provide telemetry data that may be utilized by any number of outside parties for observation, performance enhancement, security, or other purposes.
  • The laptop 112 is shown as communicating with the network 124 utilizing the wireless signal 126.
  • the laptop 112 may also communicate with any number of devices directly or through other networks.
  • The integrated video stream generated by the laptop 112 utilizing the camera 114 and other data capture devices 113 may be streamed to the cloud network 150 for storage in the servers 152.
  • the integrated video stream may also be streamed to any number of other devices, systems, or components.
  • The video and telemetry data are instantaneously viewable via live video broadcast or once synchronized.
  • the integrated video may be encrypted, password protected, or otherwise secured to prevent unwanted interception and viewing.
  • Integrated or external transceivers, cards, devices, or other components may be utilized by the laptop 112 to communicate with the network 124.
  • The laptop 112 or any number of other computing or communications devices may generate and render the real-time video synchronized with the telemetry data.
  • the laptop 112 may synchronize the information and data received from the data capture devices 113 to ensure that the applicable information, data, and streams are synchronized and formatted for accuracy and effectiveness.
  • The laptop 112 may utilize specialized software, hardware, or logic to take 1) the telemetry data (e.g., from the set of sensors represented by the data capture devices 113), where each sensor may have a distinct sampling rate, and 2) video (e.g., from any number of video sources using different video standards and frame rates), and synchronize the data and video at the individual frame level for immediate, live, or delayed streaming or playback.
  • the data and video may be saved discretely for synchronization at any time. In other embodiments, the data and video may be saved in a single file or stream.
  • the integrated video generated by the laptop 112 may be archived by the laptop as well as devices, such as the servers 152.
  • the archived session may include the telemetry data and the video separated with information and logic for overlaying the telemetry data on the video for playback.
  • the archived session may then be accessed as part of a virtual reality session. For example, another user may virtually ride-along with the user utilizing the archived session.
  • a user may have the archived session displayed as augmented reality so that the user may be able to race against herself from the archived race session.
  • The archived session may be displayed utilizing a heads-up-display showing the user the difference between the two performances both visually and with the associated data (e.g., time differentials, lap times, lap position, pulse rate, etc.).
  • the user in a real-time situation may be motivated to improve upon the archived session.
  • the archived session may also be utilized as part of a video game, simulation, or training session.
  • the integrated video may provide the option to perform a real-time race with a virtual or remote version of a runner who performs at an elite level for performing virtual races, benchmarking, training, or so forth.
  • the user 128 may be riding a bicycle 130 with sensors 134, 136, 138.
  • the wireless device 132 may compile the telemetry data from the sensors 134, 136, 138 with video captured by the wireless device 132.
  • the wireless device 132 may be mounted to the frame or handlebars of the bicycle 130 to capture forward facing video as the user 128 rides the bicycle.
  • the user 128 may be wearing a helmet with a camera mounted to the helmet.
  • the sensors 134, 136, 138 may capture data, such as heart rate of the user 128, speed of the bicycle 130 (e.g., speedometer), location (e.g., GPS), cadence, blood pressure, altitude, and other user 128, bicycle 130, or environmental data.
  • one of the sensors 134, 136, 138 may capture multiple data points or data sources as a single source or multiple sources that may be overlaid onto the video captured by the wireless device 132.
  • the wireless device 132 may also receive digital input from sources, such as external servers, web pages, local memory, web services, and so forth.
  • the application 140 may be executed by the wireless device 132 to overlay the video with the telemetry data.
  • the application 140 may be opened or activated to generate an integrated video stream or file.
  • the application 140 may receive the video and telemetry data internally or from one or more sources, time stamp the telemetry data as received, overlay the video with the telemetry data, and communicate the integrated video stream to one or more users/devices.
  • the application 140 may utilize a graphical user interface to display applicable information and receive user preferences and settings for controlling the generation and communication of the integrated video stream.
  • the application may utilize any number of fields, soft buttons, icons, text and graphical display elements, audio and sound alerts, and so forth.
  • the user 128 may utilize the graphical user interface of the application 140 to specify the position, color, size, font, refresh rate, display settings, or other information applicable to how the telemetry data is overlaid on the video to generate the integrated video.
  • the application 140 may be utilized by the user 128 or other parties to control how the telemetry data captured by the sensors 134, 136, 138 is formatted and displayed as part of the integrated video stream.
  • the user 128 may specify what telemetry data of the available telemetry data is shown, the position of the telemetry data, the color of the different telemetry data, associated labels displayed, charts, text type, size, language, graphical components, units of measurement (e.g., English, metric, etc.), refresh rate, resolution, image quality and size, and so forth.
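  • A minimal sketch of how such display preferences might be modeled follows; the field names and defaults are illustrative assumptions, not a schema taken from the patent.

```python
# Hypothetical container for the user-selectable overlay settings above.
from dataclasses import dataclass

@dataclass
class OverlayPreferences:
    fields: tuple = ("speed", "heart_rate", "lap_time")  # which telemetry to show
    position: str = "bottom-left"   # where the data block is drawn
    color: str = "#FFFFFF"          # text color
    font: str = "DejaVuSans.ttf"    # text type
    font_size: int = 24             # text size
    units: str = "metric"           # "metric" or "english"
    refresh_rate_hz: float = 4.0    # how often displayed values update
    show_labels: bool = True        # display labels next to values
```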
  • the user 128 may specify information and specifics for the video or telemetry data.
  • the viewing user may utilize an application to set the preferences, settings, and format for the integrated video stream that they view, save, or edit.
  • The user preferences may be utilized to repeatedly overlay the telemetry data onto the video content so that the integrated video stream is formatted consistently.
  • the integrated video stream generated by the wireless device 132 may be streamed or communicated to any number of users.
  • the users may also utilize a version of the application 140 to receive and view the integrated video stream.
  • the cloud network 150 may receive the integrated video streams from the laptop 112 and wireless device 132 for storage or further distribution.
  • Users may access the cloud network 150 utilizing a portal and a secure identifier to view the integrated video stream. For example, a password, PIN, session identification, keyword, or so forth may be required to access the integrated video stream.
  • the mobile application 140 may be utilized to stream the integrated video content from the cloud network 150 or directly from the device streaming or generating the integrated video content.
  • the user may be required to pay to have access to the integrated video stream.
  • the cloud network 150 may be configured to receive integrated video streams or generate integrated video streams.
  • the user 128 may carry a camera 160, a smart watch 162, and sensors 164, 166.
  • the camera 160 may include an application and processor for rendering the telemetry data received from the smart watch 162 and sensors 164, 166.
  • the camera 160 or other connected device may stream the video as well as the telemetry data from the smart watch 162 and sensors 164, 166 (the smart watch 162 and sensors 164, 166 may also communicate with the cloud network 150).
  • the cloud platform 156 may utilize the logic engine 160 to render the video and telemetry data from the user 128 for storage or communication.
  • the servers 152 may include a video server that receives and stores video captured from any number of users. The video stored on the servers 152 may be requested for review, playback, editing, or on-demand review at any time with authorized credentials (e.g., username, password, granted access rights, etc.).
  • The servers 152 (or the laptop 112 or wireless device 132) may generate video with multiple versions or rendered content such that the versions are intended for distinct user groups. For example, some lay-users may not appreciate or want to see information about the gear the vehicle is in, engine RPMs, or so forth, whereas other racers or coaches may be very interested in seeing such telemetry data. As a result, post processing may be performed on the telemetry data or integrated video as it is received.
  • the integrated video may be generated by a train and the associated conductors.
  • the train may have a number of different video cameras feeding video to a wireless computing device.
  • the conductors may wear biometric sensors that measure biometrics, such as heart rate, blood pressure, voice characteristics, heart activity, neural activity, or so forth.
  • the wireless computing device collects the telemetry data and video data through physical connections or wireless feeds to compile and generate the integrated video.
  • a wireless computing device may be utilized to capture surveillance video for prisoners in a detention facility.
  • the wireless computing device may also receive biometric data from wearable, implantables, or external biometric sensors.
  • the biometric sensors may sense the pulse rate, temperature, blood pressure, and/or voice characteristics of the prisoner. Additional sensors may measure the ambient conditions, such as temperature, humidity, and noise level.
  • the integrated video may be utilized to monitor, predict, de-escalate, and prevent escalation of events within the detention facility in real-time.
  • the wireless computing device may perform facial, graphic, audio, or pattern recognition to detect people, sounds, actions, activities, or so forth.
  • the integrated video generated by the wireless computing device may be processed to generate any number of alerts and visual indicators which may also be combined as part of the integrated video.
  • A wireless computing/communications device may be utilized to capture video for law enforcement, security, emergency services, or military applications.
  • cameras may be integrated in the uniform, helmet/hat, body armor, packs, or other items worn or carried by the potential users.
  • Biometric sensors may be similarly integrated or externally monitoring the user, environment, third parties, events, actions, conditions, and so forth.
  • the wireless device may represent a drone aircraft equipped to capture video as well as other telemetry data.
  • the drone may be utilized to perform security/monitoring (e.g., animal counting, crop monitoring, measure snow levels, border surveillance, personal protection, etc.), spray pesticides, deliver products, perform military operations, race, and/or perform other tasks, functions, and activities.
  • The telemetry data may include drone data, such as flight path, drone position, orientation, altitude, battery levels, and speed, as well as ambient data, such as time of day, temperature, weather conditions, and so forth.
  • the integrated video may provide a single broadcast feed monitorable by a number of users to watch the drone, monitor and maximize the flight path, regulate product usage, or help operators make adjustments to flight path or objectives.
  • the illustrative embodiments may also be utilized within the medical field to generate real-time or archived integrated video.
  • Video captured about a person as well as telemetry data may be utilized for any number of treatment, surgery, and therapeutic applications.
  • A surgeon performing a remote surgery may be able to see the video with integrated patient and environmental information, such as patient temperature, heart rate, blood pressure, respiration rate, operating room temperature, surgical instrument status, and so forth.
  • the real-time telemetry data may improve surgeries and medical decisions to better care for patients and potentially save lives.
  • A vehicle, such as a semi-truck, taxi, Uber, Lyft, or other vehicle-for-hire, or its professional driver, may connect to a wireless device to receive integrated video.
  • the vehicle 110 may represent a commercial vehicle driven by a commercial driver.
  • the integrated video generated by the laptop 112 may be sent to the cloud network 150 through at least the network 124 for at least temporary storage in the server 152.
  • The integrated video may be retrieved from the servers 152 to determine liability, environmental conditions, user actions, or so forth.
  • A trucking company may pay a monthly service to 1) archive integrated video associated with their trucks/drivers in the cloud network 150, and 2) monitor their drivers in real-time when driving/working through the cloud platform 156 utilizing a web browser, application, or other interface.
  • any incidents may be automatically sent to one or more designated parties or devices as an alert or indicator.
  • the integrated video stream may be automatically edited to show the moments before, during, and after the incident.
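  • The automatic edit reduces to choosing a clip window around the incident time; the sketch below assumes example pre/post margins, which the patent does not specify.

```python
# Illustrative clip-window computation for the incident auto-edit (seconds).
def incident_window(incident_time, pre=30.0, post=60.0, session_start=0.0):
    """Return (start, end) bounds covering the moments before, during,
    and after a detected incident."""
    start = max(session_start, incident_time - pre)
    end = incident_time + post
    return start, end

# e.g., an incident 100 s into the session yields the clip from 70 s to 160 s
assert incident_window(100.0) == (70.0, 160.0)
```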
  • the telemetry data may show the biometric data of the user, truck, and environmental information associated with the truck/user.
  • The user's heart rate may be determined through contacts integrated with a steering wheel, and one or more internal cameras may monitor the user's eyes/pupils to determine the user's status and attention level to the road/surroundings.
  • the video feed may include views of the back, front, and/or interior of the truck to show the environment, driving conditions, nearby drivers, and/or the driver of the truck.
  • the integrated video may show one or more of the video feeds simultaneously.
  • the cloud network 150 and cloud platform 156 may be utilized to provide software-as-a-service (SaaS).
  • applications downloaded and executed by one or more of the devices of the communications environment 100 may be utilized to generate and manage integrated video from a number of users, organizations, companies, groups, or so forth.
  • a one-time, annual, monthly, or other service fee may be utilized to authorize the user, the application, and/or associated video services.
  • One or more service providers may operate the cloud network 150. The service providers may service hundreds or thousands of clients with integrated video services.
  • the service provider or individual clients may make the integrated video stored in the cloud network 150 available to any number of users as a paid subscription or for training, monitoring, entertainment, education, or any number of other purposes.
  • an archived session of a car race session for Team A may be purchased by Team B (if publicly available) for the purpose of training drivers or engineers while improving benchmarked results.
  • Archived sessions may be leveraged to answer "what if" scenarios, hypotheticals, performance tests, and other situations in a virtual environment. As a result, time, money, and effort may be saved when evaluating and improving performance metrics.
  • The use of virtual reality systems and technology may be enhanced utilizing available physical data and resources from the video and telemetry data, which may be viewed in real-time and archived for subsequent viewing or more detailed analysis.
  • the integrated video may be utilized to teach and develop military strategies and enhance personal, sports, scientific, medical, and business models, methods, systems, devices, and processes.
  • the integrated video may be displayed to one or more team members within a helmet, safety eyewear, heads-up-display or other visual system.
  • the telemetry data and video may be utilized to predict significant events, such as weapons discharges, utilization of weapons, injuries, or so forth.
  • The integrated video may be useful for training, among other purposes.
  • team members of a military unit may see the video, biometrics, ambient conditions, and other factors for other team members as automatically selected, displayed simultaneously, or as rotated through.
  • a user may select which team member's integrated video stream to view at any given time.
  • The integrated video may also be utilized in real-time scenarios, such as battlefield operations.
  • Additional applications are possible where data from any number of sources, such as the third-party resources 158 (e.g., medical databases, facial recognition databases, tracking systems, artificial intelligence for graphics, etc.), may be combined into the integrated video.
  • The illustrative embodiments may also be applied to the scientific, medical, and other technical fields for performing arthroscopic, endoscopic, internal, microscopic, or other procedures and observations.
  • the telemetry data may be integrated with the video into a single easily consumable data feed to provide a better understanding of the multiple variables, conditions, and factors that may affect the procedure, observations, processes, or other activities performed by medical professionals, scientists, professionals, or other users. For example, patient pulse rate, blood pressure, respiration rate, blood oxygenation, blood chemical levels, external temperature, bed position, medical professionals on duty, room number, primary doctor, instrument status (e.g., RPMs, battery, position, etc.) or so forth may be displayed with internal or external video captured.
  • Communications within the communications environment 100 may occur utilizing any number of established or developing network or connection types, standards, and protocols.
  • communications architectures including cloud networks, mesh networks, powerline networks, client-server, network rings, peer-to-peer, n-tier, application server, or other distributed or network system architectures may be utilized.
  • the illustrative embodiments may be utilized to benchmark and create virtual reality or augmented reality for the playback of exact and precise performance data within real-time environments, such as the communications environment 100.
  • Virtual reality or augmented reality may be utilized with electronic glass, smart glasses, heads-up-displays, virtual reality headsets (e.g., HoloLens, Google Glass, Oculus Rift, etc.), or other display or output systems.
  • FIG. 2 is a block diagram of a device integrating video and telemetry data in accordance with an illustrative embodiment.
  • FIG. 2 illustrates a capture system 200.
  • the capture system 200 may include a wireless device 202 that receives a video stream 204 and telemetry data 206.
  • the wireless device 202 may then generate an integrated video stream 208 including both the video stream 204 and the telemetry data 206.
  • the wireless device 202 may communicate directly or indirectly with any number of networks or devices.
  • the wireless device 202 is one embodiment of any number of computing or communications devices.
  • The wireless device 202 may represent any number of cell phones, PDAs, tablets, radios, MP3 players, GPS devices, electronic glass, glasses, accessories, helmets, goggles, heads-up-displays, computers, personal computers, or other electronic devices.
  • the wireless device 202 may perform any number of processes or steps, including, but not limited to, recording a session, recording an event, starting and stopping recording and telemetry data 206 integration, timestamping the telemetry data 206, synchronizing data, uploading and downloading the integrated stream 208, and recording the integrated stream 208.
  • The wireless device may time stamp the telemetry data 206 to an associated or correlated frame level of the video stream 204 based on the sampling rates of the video and the individual telemetry data streams, as sketched below.
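  • For a constant-frame-rate stream, correlating a telemetry time stamp to a frame reduces to simple arithmetic; this sketch assumes a fixed frame rate and a shared session start time (variable-frame-rate video would need per-frame presentation times instead).

```python
# Illustrative mapping from a telemetry time stamp to the correlated frame.
def frame_index(sample_time, session_start, fps):
    """Index of the frame whose display interval contains the time stamp."""
    return max(0, int((sample_time - session_start) * fps))

# e.g., a sample stamped 2.51 s into a 30 fps session lands on frame 75
assert frame_index(2.51, 0.0, 30.0) == 75
```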
  • The wireless device 202 may include at least a processor 210, a communications interface 212, a memory 214, and an integration application 216.
  • The wireless device 202 may communicate through one or more communications networks.
  • The wireless device 202 may include any number of computing and telecommunications systems, devices, or components that are not specifically described or called out, such as transceivers, transmitters, receivers, busses, circuits, displays, logic, user interfaces, cards, ports, and so forth.
  • the processor 210 is circuitry or logic enabled to control execution of a set of instructions.
  • the processor 210 may be one or more microprocessors, digital signal processors, application-specific integrated circuits (ASIC), field programmable gate array (FPGA), central processing units, or other devices suitable for controlling an electronic device including one or more hardware and software elements, executing software, instructions, programs, and applications, converting and processing signals and information, and performing other related tasks.
  • the processor 210 may be a single chip or integrated with other computing or communications elements.
  • the processor 210 may also represent other customized logic for implementing the processes, signal processing, commands, instructions, and functions herein described.
  • the memory 214 is a hardware component, device, or recording media configured to store data for subsequent retrieval or access at a later time.
  • the memory 214 may be static or dynamic memory.
  • the memory 214 may include a hard disk, random access memory (RAM), magnetic RAM (MRAM), cache, removable media drive, mass storage, or configuration suitable as storage for data, instructions, and information.
  • the memory 214 and processor 210 may be integrated.
  • the memory 214 may use any type of volatile or non-volatile storage techniques and mediums.
  • The communications interface 212 includes the components for receiving user input and sending and receiving signals.
  • The communications interface 212 may include one or more transmitters, receivers, or transceivers.
  • the transceiver is the circuitry configured to send and receive signals.
  • the transceiver may communicate with one or more devices simultaneously utilizing any number of standards or protocols, such as Bluetooth, Wi-Fi, cellular signals, or other radio frequency signals.
  • the communications interface 212 may include hybrid, intelligent, or multiple transceivers for simultaneous communications with a number of distinct devices.
  • the signals may be analog or digital.
  • the signals may be communicated directly or through a communications network.
  • the communications interface 212 may also include a user interface for receiving user input.
  • the user interface may also include a display for displaying information to a user.
  • the wireless device 202 may include any number of soft keys or hard keys.
  • Hard keys are dedicated buttons or interface elements hard-coded for a single, unique, and consistent purpose. Examples of hard keys include the directional indicators of the described embodiments. Hard keys may also include a dedicated keyboard, track ball, mouse, and other buttons.
  • Soft keys are programmable buttons or interface components. Soft keys may be positioned or located anywhere on a display device, such as a touch screen. Each of the soft keys may perform different functions in response to default or user-defined settings.
  • Soft keys may display symbols, icons, text, or other indicators that identify the soft keys.
  • the soft keys may represent the integration application 216 utilized by the wireless device 202.
  • the integration application 216 is one or more applications, programs, modules, or set of instructions configured to create, compile, overlay, render, compress, and archive the telemetry data 206, the video stream 204, and the integrated stream 208.
  • the integration application 216 may represent digital logic or hardware that generates the integrated video content.
  • the integration application 216 may be a field programmable gate array (FPGA) or application specific integrated circuit (ASIC).
  • the integration application 216 may also communicate with other applications, devices or components to receive: mapping information, user biometrics (e.g., heart rate, calories, temperature, blood pressure, respiration rate, etc.), fitness tracking information, vehicle information (e.g., throttle position, revolutions per minute, brake pressure, tire pressure, oil pressure, oil temperature, water temperature, voltage, steering angle, gear, fuel pressure, air/fuel ratio, vacuum/boost), global positioning information (e.g., latitude, longitude, altitude, etc.), speed, acceleration, direction, orientation, session start/end times, lap time, lap number, distance travelled, barometric pressure, humidity, flight path, and so forth.
  • the integration application 216 may display the selected telemetry data 206 as well as statistically or user determined information related to the telemetry data, such as minimums, maximums, averages, and so forth.
  • the integrated stream 208 appears as a single stream, but may represent synchronized layers of telemetry data 206 overlaid on the selected video stream 204.
  • the integrated stream 208 may include one of a number of different video streams.
  • The incoming video stream 204 may be assigned identifiers for the frames for accurately layering the telemetry data 206 on the video stream 204 to generate the integrated stream 208.
  • the wireless device 202 may also utilize a map application.
  • the map application may be configured to format location and map information for display by the communications interface 212.
  • The map application may be a geographic information or mapping program.
  • the map application may also display location and map information associated with the wireless device 202 or associated users/devices.
  • the map application may represent programs, such as Waze, Google Maps, Google Earth, Yahoo maps, MapQuest, Garmin, Microsoft Streets and Trips, and other similar mapping, electronic atlas, and cartography programs.
  • The map application may also represent an application that utilizes public, open-source, or proprietary databases compiled and maintained by third parties.
  • The integration application 216 may be integrated with one or more of a mapping application, fitness/sport tracking application, personal health application, or so forth.
  • The integration application 216 and mapping application may also be integrated in an operating system, kernel, module, or other instructions executed by the processor 210.
  • the wireless device 202 is a stand-alone device for integrating the video stream 204 and the telemetry data 206 to generate the integrated stream 208.
  • a transceiver may send and receive information to one or more other devices and sensors.
  • The transceiver may be designated to utilize any number of default or user-specified frequencies. For example, the transceiver may communicate with five different sensors communicating the telemetry data 206 through Bluetooth, a physically connected camera may communicate the video stream 204, and the wireless device 202 may communicate the integrated stream 208 to one or more observers utilizing a cellular, Wi-Fi, or satellite connection.
  • the wireless device 202 may be integrated in other components, devices, equipment, or systems of vehicles, household devices, computing devices, or other communications devices.
  • the video stream 204 may be captured by any number of other devices, such as cell phones, security cameras, body cameras, and so forth.
  • the telemetry data 206 may be overlaid on the video stream 204 from the different devices or third parties.
  • the integrated stream 208 may represent a crowdsourcing effort. For example, identifying information may be utilized to associate video with a particular event, activity, or location so that an administrator, editor, manager, or other user may select from all available and authentic video stream sources to produce the best results possible.
  • FIG. 3 is a flowchart of a process for overlaying video data with telemetry data in accordance with an illustrative embodiment.
  • the processes of FIG. 3 and 4 may be implemented by any number of computing or communications devices, such as laptops, smart phones, gaming devices, cameras, vehicle systems, tablets, or other digital capture device referred to as a wireless device for purposes of simplicity.
  • the processes of FIG. 3 and 4 may be implemented by a wireless device communicating with a camera or video capture device and with any number of sensors or data capturing devices that capture telemetry data.
  • the illustrative embodiments do not require proprietary file formats.
  • the wireless device may accept digital video and telemetry data from any device.
  • the illustrative embodiments may be implemented utilizing devices users already have access to thereby enhancing utilization and adoption rates of the methods and processes herein described.
  • the process may begin by initiating a video recording session (step 302).
  • the video recording or streaming may be initiated automatically (e.g., the vehicle being started, systems being activated, available communications connections, etc.) or in response to user input (e.g., selection of an application, powering on a camera, selection of a record/capture button, etc.).
  • the video recording session is automatically initiated in response to capturing video as well as telemetry data from multiple sources for integration with the video captured.
  • the wireless device receives video and telemetry data (step 304).
  • the video may correspond to any number of activities or events.
  • The video may correspond to sports (e.g., running, biking, etc.), vehicle events (e.g., car race, travel from place-to-place, etc.), extreme sports (e.g., sky diving, snowmobiling, etc.), or so forth.
  • sample rates for the telemetry data may vary. As a result, the most recent measurement may be utilized until a next measurement is received as part of the telemetry data.
  • the illustrative embodiments allow different sampling rates and display rates to be utilized seamlessly.
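  • The "use the most recent measurement" behavior can be sketched as a last-value-hold lookup over each sensor's time-stamped samples; this is an illustration using only the Python standard library, not code from the patent.

```python
# Last-value-hold resampling of a telemetry stream onto arbitrary query times.
import bisect

def sample_at(times, values, t):
    """Return the most recent value at or before time t, or None if t
    precedes the first sample. `times` must be sorted ascending."""
    i = bisect.bisect_right(times, t) - 1
    return values[i] if i >= 0 else None

heart_rate_t = [0.0, 1.0, 2.0]   # a 1 Hz sensor
heart_rate_v = [71, 73, 75]
assert sample_at(heart_rate_t, heart_rate_v, 1.4) == 73  # held until t = 2.0
```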
  • the wireless device time stamps the telemetry data as received (step 306) and the software object takes into account the sampling rate of the telemetry data received and the frame rate of the video.
  • the telemetry data is time stamped to ensure that it is properly synchronized with the video being received using an algorithm that neutralizes the differences in sampling and frame rates.
  • Improperly synchronized data may present observers, coaches, family members, friends, or other remote parties false ideas or assumptions about the video and data being viewed.
  • a universal clock may be utilized to time stamp the beginning and end of the session that is applied to each of the telemetry and video data feeds. If the individual data feeds include a clock, the wireless device (or applicable system) reconciles the differences with the universal clock. Reconciliation may occur once per session, as data is received, or as often as needed to effectively reconcile any determined differences, deviations, variations, errors, or so forth.
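  • Reconciling a device clock against the universal session clock amounts to estimating and applying an offset; the once-per-session estimate below is an assumption for illustration (the text notes reconciliation may also recur as data arrives).

```python
# Illustrative clock reconciliation between a sensor and the universal clock.
def clock_offset(universal_now, device_now):
    """Seconds to add to device time stamps to express them in universal time."""
    return universal_now - device_now

def to_universal(device_timestamp, offset):
    return device_timestamp + offset

# e.g., a sensor clock running 2.5 s behind the universal clock:
offset = clock_offset(universal_now=1000.0, device_now=997.5)
assert to_universal(997.5, offset) == 1000.0
```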
  • The video data may have a time stamp for the video frames taken by a clock of the recording device, but that time stamp may not be reliable or accurate.
  • the universal clock may be utilized to ensure synchronicity and accuracy across the communicating devices.
  • the wireless device overlays the video with the telemetry data to generate the integrated video stream (step 308).
  • the telemetry data is combined with the video in a format that is aesthetically pleasing.
  • the telemetry data is rendered as part of the video to generate the integrated video stream.
  • Any number of data graphics applications may be utilized to integrate the video and telemetry data in a desired format.
  • the user may have previously established how the telemetry data is integrated into the video in real-time or near real-time.
  • The video stream may be rendered with the telemetry data.
  • the integrated video stream may also be compressed for transmission to any number of users or devices, directly or through one or more networks.
  • the wireless device may capture the telemetry data from any number of data feeds to produce a fully rendered video feed with the telemetry data.
  • the wireless device may utilize dedicated hardware or software as needed.
  • An algorithm may be utilized to process the individual video frames so that each frame receives the right overlay of data.
  • The video is re-encoded frame-by-frame: the telemetry data overlay is drawn on each video frame as a still image, and the video frames are stitched back together in sequence to create the video with integrated telemetry, as in the rendering sketch below.
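  • A minimal rendering sketch follows, assuming the Pillow (PIL) imaging library; the patent names no graphics library, and the layout here (a text block in the lower-left corner) is an arbitrary example.

```python
# Illustrative frame-by-frame overlay drawing: each frame is treated as a
# still image, the telemetry layer is drawn onto a copy, and the frames are
# then re-encoded in sequence by the caller.
from PIL import Image, ImageDraw  # pip install Pillow

def draw_overlay(frame: Image.Image, samples: dict) -> Image.Image:
    """Draw name/value telemetry pairs as text in the lower-left corner."""
    canvas = frame.copy()
    draw = ImageDraw.Draw(canvas)
    y = canvas.height - 20 * (len(samples) + 1)
    for name, value in samples.items():
        draw.text((10, y), f"{name}: {value}", fill="white")
        y += 20
    return canvas
```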
  • the formatting of the telemetry data and video data may be set by default or customized as needed as any number of templates or preset options.
  • the wireless device communicates the integrated video stream (step 310).
  • the integrated video stream may be broadcast to specified users or as a stream receivable by any number of parties.
  • the integrated video stream may be communicated as a feed with the telemetry data displayed within the graphics of the integrated video stream and with the associated metadata for recording or review by any number of end-user devices.
  • the process of FIG. 3 may be utilized by NASCAR to display integrated video content for multiple users simultaneously.
  • a NASCAR fan may be able to switch between the numerous drivers to see the video from the applicable car as well as telemetry data associated with the user and car.
  • hashtags, hyperlinks, tags, or other identifiers may be utilized to switch between streams.
  • FIG. 4 is a flowchart of a process for processing integrated video in accordance with an illustrative embodiment.
  • the process of FIG. 4 may correspond to one or more steps of FIG. 3, such as step 308 of FIG. 3.
  • the process may begin by gathering telemetry data (step 402).
  • the telemetry data may be gathered as it is received by the wireless device.
  • the wireless device may receive the telemetry data utilizing wireless communications, such as Bluetooth, Wi-Fi, or other radio frequency signals, or a direct physical connection (e.g., USB, Firewire, etc.).
  • the wireless device saves the telemetry data into an XML file and data structure (step 404). Any number of data files, markup languages, or structures may be utilized to save the telemetry data.
  • the data may include the time/time stamp associated with each piece of telemetry data.
  • the wireless device drafts artifacts on a layer (step 406).
  • the layer may represent a location for the telemetry data.
  • the wireless device retrieves data values from the XML file for presentation (step 408).
  • the data values are the telemetry data stored in the XML file that may be inserted into the layer for presentation.
  • the wireless device synchronizes the presented telemetry data with the video utilizing a timestamp and sample rates for each data signal (step 410); a sketch of steps 404-410 follows.
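A minimal sketch of steps 404-410 follows, assuming a flat XML layout of time-stamped samples and a nearest-time-stamp lookup per video frame; the element names and helper signatures are assumptions for illustration, not a format prescribed by the disclosure:

```python
import xml.etree.ElementTree as ET

def save_telemetry(samples, path):
    """Step 404: persist time-stamped telemetry samples as XML."""
    root = ET.Element("telemetry")
    for t, name, value in samples:              # e.g., (0.5, "speed", 142.0)
        ET.SubElement(root, "sample", time=str(t), name=name, value=str(value))
    ET.ElementTree(root).write(path)

def value_for_frame(path, signal, frame_index, fps):
    """Steps 408-410: retrieve the value of one signal for a video frame.

    The frame's wall-clock time is frame_index / fps; the sample whose
    time stamp lies closest to that instant is drawn on the layer.
    """
    frame_time = frame_index / fps
    samples = [e for e in ET.parse(path).getroot() if e.get("name") == signal]
    nearest = min(samples, key=lambda e: abs(float(e.get("time")) - frame_time))
    return nearest.get("value")

save_telemetry([(0.0, "speed", 140.0), (0.5, "speed", 142.0)], "session.xml")
print(value_for_frame("session.xml", "speed", frame_index=12, fps=30))  # 142.0
```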
  • FIG. 5 is a flowchart of a process for sharing integrated video in accordance with an illustrative embodiment.
  • the process may begin by streaming integrated video for immediate playback (step 502).
  • the integrated video may include video and telemetry data associated with the video.
  • the wireless device uploads the integrated video (step 504).
  • the integrated video may be automatically uploaded or manually uploaded by a user.
  • one or more applications may upload the integrated video in response to the telemetry data being fully overlaid, combined, or integrated with the video in a format that may be communicated to one or more users and associated devices.
  • the integrated video may be uploaded to one or more servers, networks, systems, applications, databases, or memories that are part of or in communication with the wireless device.
  • the wireless device shares the integrated video online (step 506); an illustrative upload sketch follows below.
  • the integrated video may be shared utilizing any number of websites, communities, cloud networks or platforms, or file sharing services.
  • the integrated video may be uploaded to Facebook, Twitter, YouTube, Twitch, Vimeo, LiveLeak, Ustream, or any number of streaming or video sharing sites, devices, or platforms.
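As a sketch only, an upload might be an HTTP multipart POST to a sharing service. The endpoint URL and bearer-token scheme below are hypothetical stand-ins; real platforms such as YouTube or Vimeo each expose their own upload APIs:

```python
import requests  # assumes the third-party `requests` package is installed

def upload_integrated_video(path, endpoint, token):
    """Upload a fully rendered integrated video for online sharing."""
    with open(path, "rb") as f:
        resp = requests.post(
            endpoint,  # hypothetical, e.g. "https://example.com/api/upload"
            headers={"Authorization": f"Bearer {token}"},
            files={"video": ("integrated.mp4", f, "video/mp4")},
        )
    resp.raise_for_status()
    return resp.json().get("url")  # shareable link returned by the service
```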
  • FIGs. 6-12 are pictorial representations of integrated videos in accordance with an illustrative embodiment.
  • FIG. 6 illustrates integrated video stream 602, FIG. 7 illustrates integrated video stream 702, FIG. 8 illustrates integrated video stream 802, FIG. 9 illustrates integrated video stream 902, FIG. 10 illustrates integrated video stream 1002, FIG. 11 illustrates integrated video stream 1102, and FIG. 12 illustrates integrated video stream 1202.
  • the integrated video streams 600 may illustrate any number of video streams communicating in real-time (or near real-time) or video that may be stored, captured, or queued for subsequent communication.
  • the integrated video streams 600 may be captured and overlaid with telemetry data in real-time or near real-time. All or portions of the available telemetry data may be presented at any given time.
  • the integrated video stream 602 of FIG. 6 shows an example of a video stream that is ready to be played by a user utilizing any number of media players, software applications, or connections.
  • the integrated video stream 602 may show video from a race car that is captured utilizing a camera.
  • the integrated video stream 602 may include telemetry data 604, such as speed, current time, average lap time, and most recent lap time.
  • the telemetry data may be varied or alternated to ensure that any users watching are not overloaded with information and that the video image is not overtaken with the telemetry data.
  • the user may position the telemetry data within the video.
  • the integrated video stream 702 of FIG. 7 may illustrate a sky diver jumping from a plane or balloon utilizing a wing suit, helmet, oxygen, and other protective gear.
  • the integrated video stream may display telemetry data and information, such as speed, heart rate (measured in beats per minute), and altitude.
  • the telemetry data may be captured by one or more sensors, biometric sensors, or devices that communicate with a computing or communications device, such as a cell phone, to generate the integrated video stream 702.
  • the integrated video stream 802 of FIG. 8 may illustrate the view of a user driving a vehicle for work, pleasure, or daily activities.
  • the integrated video stream 802 may illustrate a speed, date, time, trip time, trip identifier, driver name, map, path or directions, applicable intersection, and other applicable information.
  • the integrated video stream 802 may capture traffic incidents or wrecks to protect the user from unwanted liability or for record keeping, safety, or insurance purposes.
  • the integrated video stream 902 of FIG. 9 may illustrate a user snowboarding.
  • the user may be wearing snow clothing and a costume.
  • the user may utilize a selfie-stick or other extended camera system with a miniature camera (e.g., GoPro camera system) to record video of snowboarding.
  • the telemetry data integrated with the integrated video stream 902 may include speed, temperature, altitude, path, mapping data, and other applicable information.
  • the integrated video stream 902 may be utilized for commercial or personal pursuits.
  • the integrated video stream 902 may be captured for a television show, movie, or Internet series about snowboarding.
  • the integrated video stream 1002 of FIG. 10 may illustrate a track meet and the 100 meter hurdles race.
  • the integrated video stream 1002 may show pulse rate for one or more of the racers, wind speed and direction, temperature at the event, and speed of one or more of the racers (e.g., numerically and graphically).
  • the integrated video stream 1102 of FIG. 11 may illustrate video of a commercial transportation service, such as Lyft, Uber, or other taxi or transportation services.
  • the telemetry data may include speed, name of the company/service, date, pickup time, expected drop-off time, driver name/identifier, passenger, and location/map.
  • the integrated video stream 1102 may also be streamed or recorded for verification, safety, insurance, or other purposes (e.g., company policy, personal verification, etc.).
  • the user may be a professional driver that documents his trips utilizing video and the associated telemetry data that make up the integrated video stream 1102.
  • the integrated video stream 1202 of FIG. 12 may illustrate video and telemetry data from a drone aircraft.
  • the integrated video stream 1202 may show telemetry data including speed, temperature, altitude, height, distance travelled, battery status, range, and other applicable information.
  • the illustrative embodiments may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module," or "system."
  • embodiments of the inventive subject matter may take the form of a computer program product embodied in any tangible medium of expression having computer usable program code embodied in the medium.
  • the described embodiments may be provided as a computer program product, or software, that may include a machine-readable medium having stored thereon instructions, which may be used to program a computing system (or other electronic device(s)) to perform a process according to embodiments, whether presently described or not, since every conceivable variation is not enumerated herein.
  • a machine-readable medium includes any mechanism for storing or transmitting information in a form (e.g., software, processing application) readable by a machine (e.g., a computer).
  • the machine-readable medium may include, but is not limited to, magnetic storage medium (e.g., floppy diskette); optical storage medium (e.g., CD-ROM); magneto-optical storage medium; read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; or other types of medium suitable for storing electronic instructions.
  • embodiments may be embodied in an electrical, optical, acoustical or other form of propagated signal (e.g., carrier waves, infrared signals, digital signals, etc.), or wireline, wireless, or other communications medium.
  • Computer program code for carrying out operations of the embodiments may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages.
  • the program code may execute entirely on a user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server.
  • the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN), a personal area network (PAN), or a wide area network (WAN), or the connection may be made to an external computer (e.g., through the Internet using an Internet Service Provider).
  • FIG. 13 depicts a computing system 1300 in accordance with an illustrative embodiment.
  • the computing system 1300 may represent an electronic computing or communications device, such as the laptop 112 of FIG. 1.
  • the computing system 1300 may be utilized to receive video and telemetry data for creating and managing integrated video.
  • the computing system 1300 includes a processor unit 1301 (possibly including multiple processors, multiple cores, multiple nodes, and/or implementing multithreading, etc.).
  • the computing system includes memory 1307.
  • the memory 1307 may be system memory (e.g., one or more of cache, SRAM, DRAM, zero capacitor RAM, Twin Transistor RAM, eDRAM, EDO RAM, DDR RAM, EEPROM, NRAM, RRAM, SONOS, PRAM, etc.) or any one or more of the above already described possible realizations of machine-readable media.
  • the computing system also includes a bus 1303 (e.g., PCI, ISA, PCI-Express, HyperTransport®, InfiniBand®, NuBus, etc.), a network interface 1305 (e.g., an ATM interface, an Ethernet interface, a Frame Relay interface, SONET interface, wireless interface, etc.), and a storage device(s) 1309 (e.g., optical storage, magnetic storage, etc.).
  • the system memory 1307 embodies functionality to implement embodiments described above.
  • the system memory 1307 may include one or more functionalities that facilitate retrieval of the audio information associated with an identifier. Code may be implemented in any of the other devices of the computing system 1300. Any one of these functionalities may be partially (or entirely) implemented in hardware and/or on the processor unit 1301. For example, the functionality may be implemented with an application specific integrated circuit, in logic implemented in the processor unit 1301, in a co-processor on a peripheral device or card, etc. Further, realizations may include fewer or additional components not illustrated in FIG. 13 (e.g., video cards, audio cards, additional network interfaces, peripheral devices, etc.).
  • the processor unit 1301, the storage device(s) 1309, and the network interface 1305 are coupled to the bus 1303. Although illustrated as being coupled to the bus 1303, the memory 1307 may be coupled to the processor unit 1301.

Abstract

A system, apparatus, and method for combining video with telemetry data. The video is received from a camera associated with a user at a wireless device. Telemetry data associated with the video is received at the wireless device. The telemetry data is time stamped as received. The video is overlaid with the telemetry data to generate integrated video utilizing the wireless device. The integrated video is communicated from the wireless device to one or more users.

Description

PCT APPLICATION
TITLE: NETWORKED APPARATUS FOR REAL-TIME VISUAL
INTEGRATION OF DIGITAL VIDEO WITH TELEMETRY DATA
FEEDS
PRIORITY STATEMENT
This application claims priority to U.S. Provisional Patent Application 62/347,271 filed on June 8, 2016, and entitled "NETWORKED APPARATUS FOR REAL-TIME VISUAL INTEGRATION OF DIGITAL VIDEO WITH TELEMETRY DATA FEEDS", which is hereby incorporated by reference in its entirety.
BACKGROUND
I. Field of the Disclosure
The illustrative embodiments relate to digital video management. More specifically, but not exclusively, the illustrative embodiments relate to a system, method, network, and apparatus for integrating video and telemetry data captured from multiple sources, devices, or sensors into a single stream.
II. Description of the Art
Personal and commercial use of digital video is increasing exponentially. This growth is fostered by enhanced hardware, such as cameras/optics, memories, and logic. Enhanced wireless communications have also made communication of digital video content more standard. For example, video captured by drones, vehicles, sport or helmet cameras, or body cameras is becoming very common. More users are also capturing associated data, such as weather, temperature, vehicle performance data, and biometric data. Mechanical and biometric sensors are being successfully integrated in clothing, wearables, personal electronic devices, and vehicles. As users capture more and longer video and more associated sensor data, real-time video management may become more tedious and overwhelming or even impossible. This is especially true for those with limited resources.
SUMMARY OF THE DISCLOSURE
One embodiment provides a system, device, apparatus, network, and method for combining video with telemetry data. The video is received from a camera associated with a user at a wireless device. Telemetry data associated with the video is received at the wireless device. The telemetry data is time stamped as received. The video is overlaid with the telemetry data to generate integrated video utilizing the wireless device. The integrated video is communicated from the wireless device to one or more users. In another embodiment, a wireless device includes a processor for executing a set of instructions and a memory storing the set of instructions. The set of instructions are executed by the processor to implement the method described above.
Another embodiment provides a wireless device integrating video and telemetry data.
The wireless device includes a processor executing a set of instructions. The wireless device further includes a transceiver in communication with a video source and one or more data sources. The wireless device further includes a memory storing the set of instructions. The set of instructions are executed to receive the video from the video source, receive telemetry data associated with the video from the one or more data sources, time stamp the telemetry data as received, overlay the video with the telemetry data to generate integrated video, and communicate the integrated video to one or more users.
Another embodiment provides a system for receiving a video stream and telemetry data streams. The sampling rate of the video stream and the telemetry data streams are determined. A session start time or user defined start time and date is utilized to perform an initial synchronization of the video stream with the telemetry data streams. Additional synchronization is performed to match a layer of telemetry data with each video frame. The associated telemetry data is overlaid on each video frame to generate an integrated video stream. The integrated video stream is communicated to one or more selected devices.
In one embodiment, the telemetry data streams are received from a number of sensors with a number of different sampling rates.
In one embodiment, the video stream represents a number of different video streams received from a number of different video sources utilizing a number of different formats and video frame rates. The telemetry data streams and the video data streams are synchronized for immediate, real-time or delayed playback.
In one embodiment, a user selection of the telemetry data to be overlaid on the video stream is received. In one embodiment, the video stream and telemetry data are stored separately to overlay the video stream with the telemetry data stream frame-by-frame when the integrated video stream is requested.
In one embodiment, the integrated video stream is uploaded to share with one or more users or devices. The integrated video is uploaded to a cloud platform accessible by a plurality of users.
In one embodiment, the telemetry data stream is saved into an XML file, artifacts are drafted on a layer, data values are retrieved from the XML file for presentation, and the telemetry data stream is synchronized with the video utilizing time stamps and sample rates for the telemetry data.
In one embodiment, the wireless device is a smart phone, tablet, or laptop.
In one embodiment, the telemetry data stream is captured by one or more sensors associated with a user or a vehicle. The one or more sensors may include biometric sensors associated with the user. The sensors may include vehicle sensors. The sensors may include environmental sensors.
In one embodiment, the telemetry data stream is rendered as part of the integrated video stream in response to user preferences.
BRIEF DESCRIPTION OF THE DRAWINGS
Illustrated embodiments of the present invention are described in detail below with reference to the attached drawing figures, which are incorporated by reference herein, and where:
FIG. 1 is a pictorial representation of a communications environment 100 in accordance with an illustrative embodiment;
FIG. 2 is a block diagram of a device integrating video and telemetry data in accordance with an illustrative embodiment;
FIG. 3 is a flowchart of a process for overlaying video data with telemetry data in accordance with an illustrative embodiment;
FIG. 4 is a flowchart of a process for processing integrated video in accordance with an illustrative embodiment;
FIG. 5 is a flowchart of a process for sharing integrated video in accordance with an illustrative embodiment;
FIGs. 6-12 are pictorial representations of integrated videos in accordance with an illustrative embodiment; and
FIG. 13 is a pictorial representation of a computing device in accordance with an illustrative embodiment.
DETAILED DESCRIPTION OF THE DISCLOSURE
The illustrative embodiments provide a system, method, network, and devices for automated real-time video processing and integration with telemetry data. Video streams may be processed to synchronize real-time high-resolution raw format video with any number of digital data feeds processed as telemetry data. The video streams or content may be received from any number of cameras or video capture devices. Likewise, the digital data feeds may correspond to any number of sensors (e.g., user, vehicle, environment, etc.), wireless devices, interfaces, input/output devices, components, systems, equipment, or data sources. The illustrative embodiments may be utilized to automatically identify and synchronize telemetry data from multiple sources to be rendered as digital graphics in a digital video overlay. This synchronization of video and telemetry data may be performed in real-time, near real-time (slight processing and latency delays), or for subsequent playback or review. The illustrative embodiments may be utilized for any number
of industrial, commercial, military or consumer applications. The processes herein described may be performed by any number of electronic devices including wired or wireless devices.
In one embodiment, the video may be captured and then received from any number of devices, such as cameras or video capture devices produced by Nikon, Canon, Olympus, Ricoh, Leica, Panasonic, Sony, Apple, Samsung, GoPro, TomTom, Olfi, Garmin, Veho, iSaw, Drift, Kaiser, Vikeepro, Lightdow, Canany, ANART, ion, Legazone, AudioSnax, APEMAN, DBPOWER, California Sugar, iGearPro, GooKit, HCcolo, ODRVM, Axess, Monoprice, ABLEGRID, and others. The telemetry data may be received from any number of data sources including, but not limited to, cameras, smartphones, smart watches, smart clothing, global positioning devices, heart rate monitors, speedometers, gauges (e.g., fuel, tachometer, temperature, pressure, speedometer, electronic, etc.), and so forth. The video and the telemetry data are integrated together and formatted to generate integrated video.
Similarly, the integrated video (including the video formatted to include the telemetry data) may be generated and communicated to any number of users or third parties utilizing a smartphone, tablet, laptop, desktop personal computer, gaming console, web page, applications (e.g., wireless app, desktop program, etc.), augmented reality system, virtual reality system (e.g., HoloLens, Google Glass, Oculus Rift, etc.), or other communications, computing, or input/output device or interface.
The illustrative embodiments may be utilized with any form of media distribution channels including social media applications, websites, and distribution systems. In addition, third parties may also capture video or other data that may be utilized in the illustrative embodiments.
FIG. 1 is a pictorial representation of a communications environment 100 in accordance with an illustrative embodiment. In one embodiment, the communications environment 100 represents one or more locations, conditions, or scenarios in which the illustrative embodiments may be utilized and implemented. A few examples of
the numerous potential electronic devices that may be utilized to perform the illustrative embodiments are shown.
In one embodiment, a user 108 may be driving (or a passenger in) a vehicle
110 which may include a laptop 112. The laptop 112 may communicate with any number of video and data capture devices 113, such as a camera 114, a thermometer 116, a timer 118, a heart rate monitor 120, and a global positioning system 122. The vehicle 110 or the laptop 112 may communicate with a network 124 utilizing a wireless signal 126.
In another embodiment, a user 128 may be riding a bicycle 130. The bicycle 130 or user 128 may be carrying a wireless device 132 which may communicate with sensors 134, 136, 138. The wireless device 132 may execute an application 140 for integrating video and telemetry data from the sensors 134, 136, 138. The wireless device 132 may communicate with the network 124 through the wireless signal 142.
The communications environment 100 may also include a cloud network 150. The cloud network 150 may include servers 152, databases 154, and a cloud platform 156. The cloud network 150 may also communicate with third party resources 158. The cloud platform 156 may include any number of components or modules including the logic engine 160.
In one embodiment, the communications environment 100 may be configured to stream, store, or record the integrated video (i.e., video and telemetry data) from any number of users simultaneously. In other embodiments, the video and telemetry data may be captured directly by one or more smart devices, such as laptops, cell phones, tablets, or so forth. The data may then be communicated directly to other devices or through one or more networks, such as the network 124.
In one embodiment, the user 108 may represent a racecar driver and the vehicle 110 may represent a racecar. The vehicle 110 may also represent any number of other motorized or non-motorized vehicles or transportation devices, such as bicycles, boats, motorcycles, airplanes, bobsleds, skis, kayaks, all-terrain vehicles, snowmobiles, trucks, or so forth.
The data capture devices 113 represent any number of physical, optical, chemical, mechanical, electrical, and other sensors or recording devices. The data capture devices 113 may capture data relating to the user 108, the vehicle 110, or the environment of the user 108 and the vehicle 110. In one embodiment, the data capture devices 113 communicate with the laptop 112. The data capture devices 113 may be connected utilizing any number of physical connectors, wires, cables, buses, or so forth, as well as any number of wireless communications protocols, standards, components, or systems. For example, the laptop 112 may communicate with the camera 114 utilizing a USB connector. In another example, the laptop 112 may communicate with the camera 114 utilizing a Wi-Fi connection. The laptop 112 may communicate with the thermometer 116, the timer 118, and the heart rate monitor 120 utilizing wireless communications, such as Bluetooth. The global positioning system 122 may represent an application or components of the laptop 112 or an external device or system. In some embodiments, any of the data capture devices 113 may represent integrated systems or subsystems of the vehicle 110, the laptop 112, or externally positioned devices.
The data capture devices 113 may include the camera 114. The camera 114 may represent any number of video, image, or optical capture devices. The camera 114 may be externally connected to the vehicle 110 utilizing any number of mounting systems. For example, the camera 114 may include a camera housing containing a camera lens contained within a front surface of the camera body, various indicators on the surface of the camera housing (e.g., light emitting diodes, touch screens, etc.), input mechanisms (e.g., buttons, switches, toggles, scroll wheels, levers, etc.), and electronics (e.g., processor, light sensors, memories, power electronics, metadata sensors, etc.) internal to the camera housing for capturing images via the camera lens and performing other functions. For example, the camera 114 may automatically or manually zoom, focus, rotate, pivot, or move to capture the best video or images possible in the applicable environment and conditions. The camera 114 may be controlled by the user 108 or any number of remotely located users (e.g., through signals sent to the camera 114 through the network 124). The camera 114 may be configured to capture video utilizing any number of visible or non-visible spectra.
In another embodiment, the camera 114 may be an integrated video system of the vehicle 110. The camera 114 may also represent any number of cameras or a camera array (e.g., forward facing, rearward facing, side cameras, roof camera, undercarriage camera, etc.). The video image may transition between any number of video streams or capture devices. In one example, the camera 114 may represent a smart phone, tablet, or personal video device (e.g., GoPro, etc.). As a result, the telemetry data read by the data capture devices 113 may be overlaid onto any number of video streams. In one embodiment, the telemetry data may be overlaid onto a single video stream at a time. However, the telemetry data or different variants of the telemetry data may be simultaneously overlaid onto distinct video streams that may be communicated to one or more receiving or observing parties or devices. The camera 114 may capture video utilizing any number of file formats. The telemetry data may be integrated with the various types of video formats (e.g., Audio Video Interleave (AVI), flash video format (FLV), Advanced Systems Format (ASF), Windows Media Video (WMV), Apple QuickTime Movie (MOV), Moving Pictures Expert Group 4 (MP4), Advanced Video Coding High Definition (AVCHD), etc.), allowing any number, type, or combination of cameras, sensors, and capture devices to be utilized.
In one embodiment, the data capture devices 113 may include one or more biometric sensors, including any number of wearable, implantable, or other sensory devices. The biometric sensors may measure heart rate, heart rhythm, blood pressure, temperature, blood oxygenation, neural activity, blood or excretion content (e.g., adrenaline, blood, sugars, etc.), voice characteristics, and other applicable data. All or portions of the data measured by the data capture devices 113 may be integrated into a video stream.
The data capture devices 113 may provide telemetry data that may be utilized by any number of outside parties for observation, performance enhancement, security, military, training, or entertainment. Although the laptop 112 is shown as communicating with the network 124, the laptop 112 may also communicate with any number of devices directly or through other networks. In one embodiment, the integrated video stream generated by the laptop 112 utilizing the camera 114 and other data capture devices 113 may be streamed to the cloud network 150 for storage in the servers 152. The integrated video stream may also be streamed to any number of other devices, systems, or components. The video and telemetry data are instantaneously viewable via live video broadcast or as synchronized. The integrated video may be encrypted, password protected, or otherwise secured to prevent unwanted interception and viewing. Integrated or external transceivers, cards, devices, or other components may be utilized by the laptop 112 to communicate with the network 124.
The laptop 112 or any number of other computing or communications devices may generate and render the real-time data synchronized with the telemetry data. The laptop 112 may synchronize the information and data received from the data capture devices 113 to ensure that the applicable information, data, and streams are synchronized and formatted for accuracy and effectiveness. The laptop 112 may utilize specialized software, hardware, or logic to take 1) the telemetry data (e.g., from the set of sensors represented by the data capture devices 113), where each sensor may have a distinct sampling rate, and 2) video (e.g., from any number of video sources using different video standards and frame rates), and synchronize the data and video at the individual frame level for immediate, live, or delayed streaming or playback. In one embodiment, the data and video may be saved discretely for synchronization at any time. In other embodiments, the data and video may be saved in a single file or stream. A sketch of this frame-level alignment follows.
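As a minimal sketch of that frame-level alignment, assuming elapsed session time is the shared reference across feeds (the function name and rates are illustrative, not prescribed by the disclosure):

```python
def sample_index(frame_index, fps, sample_rate):
    """Map a video frame to the nearest sample of one sensor stream.

    Each sensor may use a distinct sampling rate; the shared quantity
    is elapsed session time, so frame_index / fps is converted into a
    sample index at that sensor's own rate.
    """
    return round((frame_index / fps) * sample_rate)

# Example: 30 fps video, a 4 Hz heart-rate feed, and a 100 Hz speed feed.
frame = 90                                            # 3.0 s into the session
print(sample_index(frame, fps=30, sample_rate=4))     # heart-rate sample 12
print(sample_index(frame, fps=30, sample_rate=100))   # speed sample 300
```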
In one embodiment, the integrated video generated by the laptop 112 may be archived by the laptop as well as devices, such as the servers 152. The archived session may include the telemetry data and the video separated with information and logic for overlaying the telemetry data on the video for playback. The archived session may then be accessed as part of a virtual reality session. For example, another user may virtually ride-along with the user utilizing the archived session. In another example, a user may have the archived session displayed as augmented reality so that the user may be able to race against herself from the archived race session. For example, the archived session may be displayed utilizing a heads-up-display showing the user the difference between the two performances both visually and with the associated data (e.g., time differentials, lap times, lap position, pulse rate, etc.). As a result, the user in a real-time situation may be motivated to improve upon the archived session. The archived session may also be utilized as part of a video game, simulation, or training session. In another example, the integrated video may provide the option to perform a real-time race with a virtual or remote version of a runner who performs at an elite level for performing virtual races, benchmarking, training, or so forth.
In another embodiment, the user 128 may be riding a bicycle 130 with sensors 134, 136, 138. The wireless device 132 may compile the telemetry data from the sensors 134, 136, 138 with video captured by the wireless device 132. In one embodiment, the wireless device 132 may be mounted to the frame or handlebars of the bicycle 130 to capture forward facing video as the user 128 rides the bicycle. In another embodiment, the user 128 may be wearing a helmet with a camera mounted to the helmet. The sensors 134, 136, 138 may capture data, such as heart rate of the user 128, speed of the bicycle 130 (e.g., speedometer), location (e.g., GPS), cadence, blood pressure, altitude, and other user 128, bicycle 130, or environmental data. In one embodiment, one of the sensors 134, 136, 138 may capture multiple data points or data sources as a single source or multiple sources that may be overlaid onto the video captured by the wireless device 132. The wireless device 132 may also receive digital input from sources, such as external servers, web pages, local memory, web services, and so forth.
In one embodiment, the application 140 may be executed by the wireless device 132 to overlay the video with the telemetry data. The application 140 may be opened or activated to generate an integrated video stream or file. The application 140 may receive the video and telemetry data internally or from one or more sources, time stamp the telemetry data as received, overlay the video with the telemetry data, and communicate the integrated video stream to one or more users/devices. The application 140 may utilize a graphical user interface to display applicable information and receive user preferences and settings for controlling the generation and communication of the integrated video stream. The application may utilize any number of fields, soft buttons, icons, text and graphical display elements, audio and sound alerts, and so forth. For example, the user 128 may utilize the graphical user interface of the application 140 to specify the position, color, size, font, refresh rate, display settings, or other information applicable to how the telemetry data is overlaid on the video to generate the integrated video.
In one embodiment, the application 140 may be utilized by the user 128 or other parties to control how the telemetry data captured by the sensors 134, 136, 138 is formatted and displayed as part of the integrated video stream. For example, the user 128 may specify what telemetry data of the available telemetry data is shown, the position of the telemetry data, the color of the different telemetry data, associated labels displayed, charts, text type, size, language, graphical components, units of measurement (e.g., English, metric, etc.), refresh rate, resolution, image quality and size, and so forth. The user 128 may specify information and specifics for the video or telemetry data. In another embodiment, the viewing user may utilize an application to set the preferences, settings, and format for the integrated video stream that they view, save, or edit. For example, the user preferences may be utilized to repeatedly overlay the telemetry data onto the video content so that
customization or selections are not required each time that the user 128 utilizes the wireless device 132 with one or more of the sensors 134, 136, 138.
The integrated video stream generated by the wireless device 132 may be streamed or communicated to any number of users. The users may also utilize a version of the application 140 to receive and view the integrated video stream. For example, the cloud network 150 may receive the integrated video streams from the laptop 112 and wireless device 132 for storage or further distribution. In one embodiment, users may access the cloud network 150 utilizing a portal and a secure identifier to view the integrated video stream. For example, a password, pin number, session identification, keyword, or so forth may be required to access the integrated video stream. In another example, the mobile application 140 may be utilized to stream the integrated video content from the cloud network 150 or directly from the device streaming or generating the integrated video content. In some embodiments, the user may be required to pay to have access to the integrated video stream.
The cloud network 150 may be configured to receive integrated video streams or generate integrated video streams. For example, the user 128 may carry a camera 160, a smart watch 162, and sensors 164, 166. In this embodiment, the camera 160 may include an application and processor for rendering the telemetry data received from the smart watch 162 and sensors 164, 166. In another embodiment, the camera 160 or other connected device may stream the video as well as the telemetry data from the smart watch 162 and sensors 164, 166 (the smart watch 162 and sensors 164, 166 may also communicate with the cloud network 150). The cloud platform 156 may utilize the logic engine 160 to render the video and telemetry data from the user 128 for storage or communication.
In one embodiment, the servers 152 may include a video server that receives and stores video captured from any number of users. The video stored on the servers 152 may be requested for review, playback, editing, or on-demand review at any time with authorized credentials (e.g., username, password, granted access rights, etc.). In one embodiment, the servers 152 (or the laptop 112 or wireless device 132) may generate video with multiple versions or rendered content such that the versions are intended for distinct user groups. For example, some lay-users may not appreciate or want to see information about the gear the vehicle is in, engine RPMs, or so forth, whereas other racers or coaches may be very interested in seeing such telemetry data. As a result, post processing may be performed on the telemetry data or integrated video as it is received.
In other embodiments, the integrated video may be generated by a train and the associated conductors. The train may have a number of different video cameras feeding video to a wireless computing device. The conductors may wear biometric sensors that measure biometrics, such as heart rate, blood pressure, voice characteristics, heart activity, neural activity, or so forth. The wireless computing device collects the telemetry data and video data through physical connections or wireless feeds to compile and generate the integrated video. These applications may be implemented for enhanced safety, security, counterterrorism, training, or entertainment.
In another embodiment, a wireless computing device may be utilized to capture surveillance video for prisoners in a detention facility. The wireless computing device may also receive biometric data from wearables, implantables, or external biometric sensors. For example, the biometric sensors may sense the pulse rate, temperature, blood pressure, and/or voice characteristics of the prisoner. Additional sensors may measure the ambient conditions, such as temperature, humidity, and noise level. The integrated video may be utilized to monitor, predict, de-escalate, and prevent escalation of events within the detention facility in real-time. In one embodiment, the wireless computing device may perform facial, graphic, audio, or pattern recognition to detect people, sounds, actions, activities, or so forth. The integrated video generated by the wireless computing device may be processed to generate any number of alerts and visual indicators which may also be combined as part of the integrated video.
In another embodiment, a wireless computing/communications device may be utilized to capture video for law enforcement, security, emergency services, or military
personnel. For example, cameras may be integrated in the uniform, helmet/hat, body armor, packs, or other items worn or carried by the potential users. Biometric sensors may be similarly integrated or externally monitoring the user, environment, third parties, events, actions, conditions, and so forth.
In another embodiment, the wireless device may represent a drone aircraft equipped to capture video as well as other telemetry data. For example, the drone may be utilized to perform security/monitoring (e.g., animal counting, crop monitoring, measure snow levels, border surveillance, personal protection, etc.), spray pesticides, deliver products, perform military operations, race, and/or perform other tasks, functions, and activities. The telemetry data may include drone data, such as flight path, drone position, orientation, altitude, battery levels, speed and ambient data, such as time of day, temperature, weather conditions, and so forth. The integrated video may provide a single broadcast feed monitorable by a number of users to watch the drone, monitor and maximize the flight path, regulate product usage, or help operators make adjustments to flight path or objectives.
The illustrative embodiments may also be utilized within the medical field to generate real-time or archived integrated video. Video captured about a person as well as telemetry data may be utilized for any number of treatment, surgery, and therapeutic applications. For example, a surgeon performing a remote surgery may be able to see the video with integrated patient and environmental information, such as patient temperature, heart rate, blood pressure, respiration rate, operating room temperature, surgical instrument status, and so forth. The real-time telemetry data may improve surgeries and medical decisions to better care for patients and potentially save lives.
The illustrative embodiments may also be utilized for security and incident review within the transportation industry. A driver of a semi-truck, taxi, Uber, Lyft, vehicle-for-hire, or other professional vehicle may connect to a wireless device to receive integrated video. For example, the vehicle 110 may represent a commercial vehicle driven by a commercial driver. The integrated video generated by the laptop 112 may be sent to the cloud network 150 through at least the network 124 for at least temporary storage in the servers 152. In the event of an incident or accident, the integrated video may be retrieved from the servers 152 to determine liability, environmental conditions, user status, performance of duty, conduct, and so forth. In one example, a trucking company may pay a monthly service to 1) archive integrated video associated with their trucks/drivers in the cloud network 150, and 2) monitor their drivers in real-time when driving/working through the cloud platform 156 utilizing a web browser, application, or other interface. In one embodiment, any incidents may be automatically sent to one or more designated parties or devices as an alert or indicator. The integrated video stream may be automatically edited to show the moments before, during, and after the incident. As noted, the telemetry data may show the biometric data of the user, truck, and environmental information associated with the truck/user. For example, the user's heart rate may be determined through contacts integrated with a steering wheel and one or more internal cameras may monitor the user's eyes/pupils to determine the user's status and attention level to the road/surroundings. The video feed may include views of the back, front, and/or interior of the truck to show the environment, driving conditions, nearby drivers, and/or the driver of the truck. The integrated video may show one or more of the video feeds simultaneously.
In one embodiment, the cloud network 150 and cloud platform 156 may be utilized to provide software-as-a-service (SaaS). For example, applications downloaded and executed by one or more of the devices of the communications environment 100 may be utilized to generate and manage integrated video from a number of users, organizations, companies, groups, or so forth. A one-time, annual, monthly, or other service fee may be utilized to authorize the user, the application, and/or associated video services. One or more service providers may operate the cloud network 150. The service providers may service hundreds or thousands of clients with integrated video services.
In one embodiment, the service provider or individual clients may make the integrated video stored in the cloud network 150 available to any number of users as a paid subscription or for training, monitoring, entertainment, education, or any number of other purposes. For example, an archived car race session for Team A may be purchased by Team B (if publicly available) for the purpose of training drivers or engineers while improving benchmarked results. Archived sessions may be leveraged to answer "what if" scenarios, hypotheticals, performance tests, and other situations in a virtual environment. As a result, time, money, and effort may be saved when evaluating and improving performance metrics. The use of virtual reality systems and technology embodiments may be enhanced utilizing available physical data and resources from the video and telemetry data that may be viewed in real-time and archived for subsequent viewing or more detailed analysis. The integrated video may be utilized to teach and develop military strategies and enhance personal, sports, scientific, medical, and business models, methods, systems, devices, and processes.
The application of the illustrative embodiments to military, police, and emergency response personnel, systems, and processes may be particularly valuable. For example, the integrated video may be displayed to one or more team members within a helmet, safety eyewear, heads-up-display or other visual system. In one embodiment, the telemetry data and video may be utilized to predict significant events, such as weapons discharges, utilization of weapons, injuries, or so forth. The integrated video may be useful for training,
simulations, and real-time tactical situations. For example, team members of a military unit may see the video, biometrics, ambient conditions, and other factors for other team members as automatically selected, displayed simultaneously, or as rotated through. A user may select which team member's integrated video stream to view at any given time. In one
embodiment, real-time scenarios, such as battlefield operations, may be monitored, reviewed, or even reenacted/recreated utilizing the archived data stream. Additional applications are possible where data from any number of sources, such as the third-party resources 158 (e.g., medical databases, facial recognition databases, tracking systems, artificial intelligence for graphics, etc.), may be combined into the integrated video.
The illustrative embodiments may also be applied to the scientific, medical, and other technical fields for performing arthroscopic, endoscopic, internal, microscopic,
remote, telescopic, robotic, and/or intravenous procedures. The telemetry data may be integrated with the video into a single easily consumable data feed to provide a better understanding of the multiple variables, conditions, and factors that may affect the procedure, observations, processes, or other activities performed by medical professionals, scientists, professionals, or other users. For example, patient pulse rate, blood pressure, respiration rate, blood oxygenation, blood chemical levels, external temperature, bed position, medical professionals on duty, room number, primary doctor, instrument status (e.g., RPMs, battery, position, etc.) or so forth may be displayed with internal or external video captured.
Communications within the communications environment 100 may occur utilizing any number of established or developing network or connection types, standards, and protocols. For example, communications architectures including cloud networks, mesh networks, powerline networks, client-server, network rings, peer-to-peer, n-tier, application server, or other distributed or network system architectures may be utilized.
The illustrative embodiments may be utilized to benchmark and create virtual reality or augmented reality for the playback of exact and precise performance data within real-time environments, such as the communications environment 100. Virtual reality or augmented reality may be utilized with electronic glass, smart glasses, heads-up-displays, virtual reality headsets (e.g., HoloLens, Google Glass, Oculus Rift, etc.) or other display or
communications systems.
FIG. 2 is a block diagram of a device integrating video and telemetry data in accordance with an illustrative embodiment. FIG. 2 illustrates a capture system 200. In one embodiment, the capture system 200 may include a wireless device 202 that receives a video stream 204 and telemetry data 206. The wireless device 202 may then generate an integrated video stream 208 including both the video stream 204 and the telemetry data 206. The wireless device 202 may communicate directly or indirectly with any number of networks or devices. The wireless device 202 is one embodiment of any number of computing or communications devices. For example, the wireless device 202 may represent any number of cell phones, PDAs, tablets, radios, mp3 players, GPS devices, electronic glass, glasses, accessories, helmets, goggles, heads-up-displays, computers, personal computers, or other electronic devices.
As shown, the wireless device 202 may perform any number of processes or steps, including, but not limited to, recording a session, recording an event, starting and stopping recording and telemetry data 206 integration, timestamping the telemetry data 206, synchronizing data, uploading and downloading the integrated stream 208, and recording the integrated stream 208. In one embodiment, the wireless device may timestamp the telemetry data 206 to an associated or correlated frame level of the video stream 204 based on the sampling rate of the video and individual telemetry data streams.
In one embodiment, the wireless device 202 may include at least a processor 210, a communications interface 212, a memory 214, and an integration application
216. The wireless device 202 may communicate through one or more communications networks. Although not shown, the wireless device 202 may include any number of computing and telecommunications systems, devices, or components that are not specifically described or called out, which may include transceivers, transmitters,
receivers, busses, circuits, displays, logic, user interfaces, cards, ports,
adapters, antennas, motherboards, circuits, chipsets, connectors, traces, wires, or
other components.
The processor 210 is circuitry or logic enabled to control execution of a set of instructions. The processor 210 may be one or more microprocessors, digital signal processors, application-specific integrated circuits (ASIC), field programmable gate array (FPGA), central processing units, or other devices suitable for controlling an electronic device including one or more hardware and software elements, executing software, instructions, programs, and applications, converting and processing signals and information, and performing other related tasks. The processor 210 may be a single chip or integrated with other computing or communications elements. The processor 210 may also represent other customized logic for implementing the processes, signal processing, commands, instructions, and functions herein described.
The memory 214 is a hardware component, device, or recording media configured to store data for subsequent retrieval or access at a later time. The memory 214 may be static or dynamic memory. The memory 214 may include a hard disk, random access memory (RAM), magnetic RAM (MRAM), cache, removable media drive, mass storage, or configuration suitable as storage for data, instructions, and information. In one embodiment, the memory 214 and processor 210 may be integrated. The memory 214 may use any type of volatile or non-volatile storage techniques and mediums.
The communications interface 212 includes the components for receiving user input and sending and receiving signals. The communications interface 212 may include one or more transmitters, receivers, or transceivers. The transceiver is the circuitry configured to send and receive signals. For example, the transceiver may communicate with one or more devices simultaneously utilizing any number of standards or protocols, such as Bluetooth, Wi-Fi, cellular signals, or other radio frequency signals. The communications interface 212 may include hybrid, intelligent, or multiple transceivers for simultaneous communications with a number of distinct devices. As described, the signals may be analog or digital. In addition, the signals may be communicated directly or through a communications network.
The communications interface 212 may also include a user interface for receiving user input. The user interface may also include a display for displaying information to a user. For example, the wireless device 202 may include any number of soft keys or hard keys. Hard keys are dedicated buttons or interface elements hard-coded for a single, unique, and consistent purpose. Examples of hard keys include the directional indicators of the described embodiments. Hard keys may also include a dedicated keyboard, track ball, mouse, and other buttons. Soft keys are programmable buttons or interface components. Soft keys may be positioned or located anywhere on a display device, such as a touch screen. Each of the soft keys may perform different functions in response to the default or user-defined
programming. In one embodiment, soft keys may display symbols, icons, text, or other indicators that identify the soft key. For example, the soft keys may represent the integration application 216 utilized by the wireless device 202.
In one embodiment, the integration application 216 is one or more applications, programs, modules, or set of instructions configured to create, compile, overlay, render, compress, and archive the telemetry data 206, the video stream 204, and the integrated stream 208. In another embodiment, the integration application 216 may represent digital logic or hardware that generates the integrated video content. For example, the integration application 216 may be a field programmable gate array (FPGA) or application specific integrated circuit (ASIC). The integration application 216 may also communicate with other applications, devices or components to receive: mapping information, user biometrics (e.g., heart rate, calories, temperature, blood pressure, respiration rate, etc.), fitness tracking information, vehicle information (e.g., throttle position, revolutions per minute, brake pressure, tire pressure, oil pressure, oil temperature, water temperature, voltage, steering angle, gear, fuel pressure, air/fuel ratio, vacuum/boost), global positioning information (e.g., latitude, longitude, altitude, etc.), speed, acceleration, direction, orientation, session start/end times, lap time, lap number, distance travelled, barometric pressure, humidity, flight path, and so forth. The integration application 216 may display the selected telemetry data 206 as well as statistically or user determined information related to the telemetry data, such as minimums, maximums, averages, and so forth.
The integrated stream 208 appears as a single stream, but may represent synchronized layers of telemetry data 206 overlaid on the selected video stream 204. The integrated stream 208 may include one of a number of different video streams. The incoming video stream 204 may be assigned identifiers for the frames for accurately layering the telemetry data 206 on the video stream 204 to generate the integrated stream 208.
In one example, the wireless device 202 may also utilize a map application. The map application may be configured to format location and map information for display by the communications interface 212. The map application may be a geographic
information system. The map application may also display location and map information associated with the wireless device 202 or associated users/devices. For example, the map application may represent programs, such as Waze, Google Maps, Google Earth, Yahoo maps, MapQuest, Garmin, Microsoft Streets and Trips, and other similar mapping, electronic atlas, and cartography programs. The map application may also represent an application that utilizes public, open-source, or proprietary databases compiled and maintained
by OpenStreetMap, Navteq, Google, ArcGIS, Cadcorp GeognoSIS, ESRI, ArcIMS Server, Google Earth, Google FusionTables, MapServer, Mapnik, and other similar providers, applications, and organizations.
In one embodiment, the integration application 216 may be integrated with one or more of a mapping application, fitness/sport tracking application, personal health
application, vehicle monitoring application, or other applications that may communicate data (e.g., in real-time, as discrete messages, etc.). The integration application 216 and
mapping application may also be integrated in an operating system, kernel, module, or other instructions executed by the processor 210. In one embodiment, the wireless device 202 is a stand-alone device for integrating the video stream 204 and the telemetry data 206 to generate the integrated stream 208. For example, a transceiver may send and receive information to one or more other devices and sensors. The transceiver may be designated to utilize any number of default or user-specified frequencies. For example, the transceiver may communicate with five different sensors communicating the telemetry data 206 through Bluetooth, a physically connected camera may communicate the video stream 204, and the wireless device 202 may communicate the integrated stream 208 to one or more observers utilizing a cellular, Wi-Fi, or satellite connection. In one embodiment, the wireless device 202 may be integrated in other components, devices, equipment, or systems of vehicles, household devices, computing devices, or other communications devices.
In other embodiments, the video stream 204 may be captured by any number of other devices, such as cell phones, security cameras, body cameras, and so forth. The telemetry data 206 may be overlaid on the video stream 204 from the different devices or third parties. As a result, the integrated stream 208 may represent a crowdsourcing effort. For example, identifying information may be utilized to associate video with a particular event, activity, or location so that an administrator, editor, manager, or other user may select from all available and authentic video stream sources to produce the best results possible.
FIG. 3 is a flowchart of a process for overlaying video data with telemetry data in accordance with an illustrative embodiment. The processes of FIGs. 3 and 4 may be implemented by any number of computing or communications devices, such as laptops, smart phones, gaming devices, cameras, vehicle systems, tablets, or other digital capture devices, referred to as a wireless device for purposes of simplicity. The processes of FIGs. 3 and 4 may be implemented by a wireless device communicating with a camera or video capture device and with any number of sensors or data capturing devices that capture telemetry data. In one embodiment, the illustrative embodiments do not require proprietary file formats. The wireless device may accept digital video and telemetry data from any device. As a result, the illustrative embodiments may be implemented utilizing devices users already have access to, thereby enhancing utilization and adoption rates of the methods and processes herein described.
The process may begin by initiating a video recording session (step 302). The video recording or streaming may be initiated automatically (e.g., the vehicle being started, systems being activated, available communications connections, etc.) or in response to user input (e.g., selection of an application, powering on a camera, selection of a record/capture button, etc.). In one embodiment, the video recording session is automatically initiated in response to capturing video as well as telemetry data from multiple sources for integration with the video captured.
Next, the wireless device receives video and telemetry data (step 304). The video may correspond to any number of activities or events. For example, the video may correspond to sports (e.g., running, biking, etc.), vehicle events (e.g., car race, travel from place-to-place, etc.), extreme sports (e.g., sky diving, snowmobiling,
etc.), commercial activities (e.g., sales pitches, demonstrations, etc.), or any number of other activities. The sample rates for the telemetry data may vary. As a result, the most recent measurement may be utilized until a next measurement is received as part of the telemetry data. The illustrative embodiments allow different sampling rates and display rates to be utilized seamlessly.
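This hold-the-most-recent-measurement behavior could be implemented as a simple lookup over timestamp-sorted samples, as in the following sketch; the function name and argument shapes are assumptions.

```python
import bisect

# Given samples of one signal sorted by timestamp, return the most recent
# value at or before the frame's timestamp; names here are assumptions.
def value_at(timestamps: list[float], values: list[float], frame_time: float):
    i = bisect.bisect_right(timestamps, frame_time)
    return values[i - 1] if i > 0 else None  # no reading before the first sample
```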
Next, the wireless device time stamps the telemetry data as received (step 306), taking into account the sampling rate of the telemetry data received and the frame rate of the video. The telemetry data is time stamped to ensure that it is properly synchronized with the video being received, using an algorithm that neutralizes the differences in sampling and frame rates. Some of the telemetry data may
include a sequencing code that may also be utilized as a timestamp. Showing proper telemetry data is important to the validity and effectiveness of the illustrative
embodiments. Improperly synchronized data may give observers, coaches, family members, friends, or other remote parties false impressions about the video and data being viewed.
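A minimal sketch of time stamping on receipt (step 306) follows; the sample layout is an assumption, and a monotonic clock is used so the stamps are unaffected by wall-clock adjustments.

```python
import time

# Stamp each sample the moment it arrives; the dict layout is assumed.
def stamp_on_receipt(raw_sample: dict) -> dict:
    stamped = dict(raw_sample)
    stamped["timestamp"] = time.monotonic()
    # A sensor-supplied sequencing code, when present, can double as a timestamp.
    stamped.setdefault("sequence", None)
    return stamped
```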
In one embodiment, a universal clock may be utilized to time stamp the beginning and end of the session, and that time stamp is applied to each of the telemetry and video data feeds. If the individual data feeds include a clock, the wireless device (or applicable system) reconciles the differences with the universal clock. Reconciliation may occur once per session, as data is received, or as often as needed to effectively reconcile any determined differences, deviations, variations, errors, or so forth. In many instances, the video data may have a time stamp for the video frames taken by a clock of the recording device, but that clock may not be reliable or accurate. As a result, the universal clock may be utilized to ensure synchronicity and accuracy across the communicating devices.

Next, the wireless device overlays the video with the telemetry data to generate the integrated video stream (step 308). During step 308, the telemetry data is combined with the video in a format that is aesthetically pleasing. The telemetry data is rendered as part of the video to generate the integrated video stream. Any number of data graphics applications may be utilized to integrate the video and telemetry data in a desired format. In one embodiment, the user may have previously established how the telemetry data is integrated into the video in real-time or near real-time. During step 308, the video stream may be rendered with the telemetry data. The integrated video stream may also be compressed for transmission to any number of users or devices, directly or through one or more networks. For example, the wireless device may capture the telemetry data from any number of data feeds to produce a fully rendered video feed with the telemetry data.
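Returning to the universal-clock reconciliation described above, a minimal sketch is a per-session constant-offset correction; clock drift within a session and the choice of universal clock source are left open by the disclosure and ignored here.

```python
# Estimate a constant offset between a feed's local clock and the universal
# clock once per session, then shift every incoming stamp by that offset.
def estimate_offset(feed_clock_now: float, universal_clock_now: float) -> float:
    return universal_clock_now - feed_clock_now

def to_universal(feed_timestamp: float, offset: float) -> float:
    return feed_timestamp + offset
```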
In one embodiment, the wireless device may utilize dedicated hardware or software as needed. For example, an algorithm may be utilized to ensure that each individual video frame receives the right overlay of data. To perform the overlaying, the video is re-encoded frame-by-frame. In one embodiment, the telemetry data overlay is drawn on each video frame as a still image, and the video frames are stitched back together in sequence to create the video with integrated telemetry. The formatting of the telemetry data and video data may be set by default or customized as needed utilizing any number of templates or preset options.
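A minimal sketch of this frame-by-frame re-encode, using OpenCV as an assumed toolkit (the disclosure names no library), reads each frame, draws the overlay as a still image on it, and stitches the frames back together in sequence; `lookup_telemetry` is a hypothetical helper returning the display text for a given frame time.

```python
import cv2  # OpenCV, assumed available; not named by the disclosure

def render_integrated_video(src_path, dst_path, lookup_telemetry):
    cap = cv2.VideoCapture(src_path)
    fps = cap.get(cv2.CAP_PROP_FPS)
    w = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
    h = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
    out = cv2.VideoWriter(dst_path, cv2.VideoWriter_fourcc(*"mp4v"), fps, (w, h))
    frame_index = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # Draw the telemetry overlay on this frame as a still image.
        text = lookup_telemetry(frame_index / fps)  # e.g., "Speed: 142 mph"
        cv2.putText(frame, text, (20, h - 20), cv2.FONT_HERSHEY_SIMPLEX,
                    1.0, (255, 255, 255), 2, cv2.LINE_AA)
        out.write(frame)  # stitch the frames back together in sequence
        frame_index += 1
    cap.release()
    out.release()
```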
Next, the wireless device communicates the integrated video stream (step 310). The integrated video stream may be broadcast to specified users or as a stream receivable by any number of parties. The integrated video stream may be communicated as a feed with the telemetry data displayed within the graphics of the integrated video stream and with the associated metadata for recording or review by any number of end-user devices. In one embodiment, the process of FIG. 3 may be utilized by NASCAR to display integrated video content for multiple users simultaneously. As a result, a NASCAR fan may be able to switch between the numerous drivers to see the video from the applicable car as well as telemetry data associated with the user and car. In one embodiment, hashtags, hyperlinks, tags, or other identifiers may be utilized to switch between streams.
Next, the wireless device terminates the video recording session (step 312). The process may be terminated automatically or manually in response to the video ending or other input. During the process of FIG. 3, the wireless device may also queue, store, or archive the integrated video for live streaming or subsequent viewing.

FIG. 4 is a flowchart of a process for processing integrated video in accordance with an illustrative embodiment. The process of FIG. 4 may correspond to one or more steps of FIG. 3, such as step 308 of FIG. 3. In one embodiment, the process may begin by gathering telemetry data (step 402). The telemetry data may be gathered as it is received by the wireless device. For example, the wireless device may receive the telemetry data utilizing wireless communications, such as Bluetooth, Wi-Fi, or other radio frequency signals, or a direct physical connection (e.g., USB, FireWire, etc.).
Next, the wireless device saves the telemetry data into an XML file and data structure (step 404). Any number of data files, markup languages, or structures may be utilized to save the telemetry data. The data may include the associated time/time stamp associated with each piece of telemetry data.
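A hedged sketch of step 404 follows; the element and attribute names are assumptions, since the disclosure does not fix an XML schema.

```python
import xml.etree.ElementTree as ET

# Persist time-stamped telemetry samples to an XML file (step 404); each
# sample is assumed to be a dict with "name", "value", and "timestamp" keys.
def save_telemetry_xml(samples: list[dict], path: str) -> None:
    root = ET.Element("telemetry")
    for s in samples:
        ET.SubElement(root, "sample",
                      name=str(s["name"]),
                      value=str(s["value"]),
                      timestamp=str(s["timestamp"]))
    ET.ElementTree(root).write(path, encoding="utf-8", xml_declaration=True)
```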
Next, the wireless device drafts artifacts on a layer (step 406). The layer may represent a location for the telemetry data.
Next, the wireless device retrieves data values from the XML file for presentation (step 408). In one embodiment, the data values are the telemetry data stored in the XML file that may be inserted into the layer for presentation.
Next, the wireless device synchronizes the presented telemetry data with the video utilizing a timestamp and sample rates for each data signal (step 410).
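Steps 408 and 410 might be sketched as follows, under the same assumed schema as the save sketch above; synchronization reuses the hold-last-sample rule shown earlier to tolerate differing sample rates.

```python
import bisect
import xml.etree.ElementTree as ET

# Read one signal's timestamped values back from the XML file (step 408).
def load_signal(path: str, signal_name: str):
    root = ET.parse(path).getroot()
    pairs = sorted((float(e.get("timestamp")), float(e.get("value")))
                   for e in root.findall("sample") if e.get("name") == signal_name)
    return [t for t, _ in pairs], [v for _, v in pairs]

# Synchronize with the video (step 410) by holding the latest sample at or
# before each frame time.
def synced_value(times: list[float], values: list[float], frame_time: float):
    i = bisect.bisect_right(times, frame_time)
    return values[i - 1] if i > 0 else None
```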
FIG. 5 is a flowchart of a process for sharing integrated video in accordance with an illustrative embodiment. In one embodiment, the process may begin by streaming integrated video for immediate playback (step 502). In one embodiment, the integrated video may include video and telemetry data associated with the video.
Next, the wireless device uploads the integrated video (step 504). The integrated video may be automatically uploaded or manually uploaded by a user. In one embodiment, one or more applications may upload the integrated video in response to the telemetry data being fully overlaid, combined, or integrated with the video in a format that may be communicated to one or more users and associated devices. The integrated video may be uploaded to one or more servers, networks, systems, applications, databases, or memories that are part of or in communication with the wireless device.
Next, the wireless device shares the integrated video online (step 506). The integrated video may be shared utilizing any number of websites, communities, cloud networks or platforms, or file sharing services. For example, the integrated video may be uploaded to Facebook, Twitter, YouTube, Twitch, Vimeo, LiveLeak, Ustream, or any number of streaming or video sharing sites, devices, or platforms.
FIGs. 6-12 are pictorial representations of integrated videos in accordance with an illustrative embodiment. FIG. 6 illustrates integrated video stream 602, FIG. 7 illustrates integrated video stream 702, FIG. 8 illustrates integrated video stream 802, FIG. 9 illustrates integrated video stream 902, FIG. 10 illustrates integrated video stream
1002, FIG. 11 illustrates integrated video stream 1102, and FIG. 12 illustrates integrated video stream 1202. Altogether the video streams of FIGs. 6-12 may be referred to as the integrated video streams 600. The integrated video streams 600 may illustrate any number of video streams communicating in real-time (or near real-time) or video that may be stored, captured, or queued for subsequent communication. The integrated video streams 600 may be captured and overlaid with telemetry data in real-time or near real-time. All or portions of the available telemetry data may be presented at any given time.
The integrated video stream 602 of FIG. 6 shows an example of a video stream that is ready to be played by a user utilizing any number of media players, software applications, or connections. For example, the integrated video stream 602 may show video from a race car that is captured utilizing a camera. The integrated video stream 602 may include telemetry data 604, such as speed, current time, average lap time, most recent lap
time, minimum lap time, maximum lap time, minimum speed, maximum speed, time elapsed, name of course, lap number, user identifier, and track position/map location. The telemetry data may be varied or alternated to ensure that any users watching are not overloaded with information and that the video image is not overtaken with the telemetry data. In one embodiment, the user may position the telemetry data within the video.
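One way to vary or alternate the displayed telemetry, sketched under assumed field groupings and a five-second rotation interval (neither is specified by the disclosure):

```python
# Rotate through groups of telemetry fields on a fixed interval so the video
# image is not overtaken by data; groups and interval are assumptions.
FIELD_GROUPS = [
    ["speed", "lap_number", "lap_time"],
    ["min_speed", "max_speed", "avg_lap_time"],
    ["track_position", "time_elapsed"],
]

def visible_fields(frame_time_s: float, rotate_every_s: float = 5.0) -> list[str]:
    group = int(frame_time_s // rotate_every_s) % len(FIELD_GROUPS)
    return FIELD_GROUPS[group]
```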
In one embodiment, the integrated video stream 702 of FIG. 7 may illustrate a sky diver jumping from a plane or balloon utilizing a wing suit, helmet, oxygen, and other protective gear. The integrated video stream may display telemetry data and information, such as speed, heart rate (measured in beats per minute), and altitude. The telemetry data may be captured by one or more sensors, biometric sensors, or devices that communicate with a computing or communications device, such as a cell phone, to generate the integrated video stream 702.
In one embodiment, the integrated video stream 802 of FIG. 8 may illustrate the view of a user driving a vehicle for work, pleasure, or daily activities. The integrated video stream 802 may illustrate a speed, date, time, trip time, trip identifier, driver name, map, path or directions, applicable intersection, and other applicable information. For example, the integrated video stream 802 may capture traffic incidents or wrecks to protect the user from unwanted liability or for record keeping, safety, or insurance purposes.
In one embodiment, the integrated video stream 902 of FIG. 9 may illustrate a user snowboarding. In one embodiment, the user may be wearing snow clothing and a costume. The user may utilize a selfie-stick or other extended camera system with a miniature camera (e.g., GoPro camera system) to record video of snowboarding. The telemetry data integrated with the integrated video stream 902 may include speed, temperature, altitude, path, mapping data, and other applicable information. The integrated video stream 902 may be utilized for commercial or personal pursuits. In one embodiment, the integrated video stream 902 may be captured for a television show, movie, or Internet series about snowboarding.
In one embodiment, the integrated video stream 1002 of FIG. 10 may illustrate a track meet and the 100 meter hurdles race. The integrated video stream 1002 may show pulse rate for one or more of the racers, wind speed and direction, temperature at the event, and speed of one or more of the racers (e.g., numerically and graphically).
In one embodiment, the integrated video stream 1102 of FIG. 11 may illustrate video of a commercial transportation service, such as Lyft, Uber, or other taxi or transportation services. The telemetry data may include speed, name of the company/service, date, pickup time, expected drop-off time, driver name/identifier, passenger, and location/map. The integrated video stream 1102 may also be streamed or recorded for verification, safety, insurance, or other purposes (e.g., company policy, personal verification, etc.). For example, the user may be a professional driver that documents his trips utilizing video and the associated telemetry data that make up the integrated video stream 1102.
In one embodiment, the integrated video stream 1202 of FIG. 12 may illustrate video and telemetry data from a drone aircraft. In addition to the flight video content, the integrated video stream 1202 may show telemetry data including speed, temperature, altitude, height, distance travelled, battery status, range, and other applicable information.
The illustrative embodiments may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, embodiments of the inventive subject matter may take the form of a computer program product embodied in any tangible medium of expression having computer usable program code embodied in the medium. The described embodiments may be provided as a computer program product, or software, that may include a machine-readable medium having stored thereon instructions, which may be used to program a computing system (or other electronic device(s)) to perform a process according to embodiments, whether presently described or not, since every conceivable variation is not enumerated herein. A machine-readable medium includes any mechanism for storing or transmitting information in a form (e.g., software, processing application) readable by a machine (e.g., a computer). The machine-readable medium may include, but is not limited to, magnetic storage medium (e.g., floppy diskette); optical storage medium (e.g., CD-ROM); magneto-optical storage medium; read only memory (ROM); random access memory (RAM); erasable programmable memory (e.g., EPROM and EEPROM); flash memory; or other types of medium suitable for storing electronic instructions. In addition, embodiments may be embodied in an electrical, optical, acoustical or other form of propagated signal (e.g., carrier waves, infrared signals, digital signals, etc.), or wireline, wireless, or other communications medium.
Computer program code for carrying out operations of the embodiments may be written in any combination of one or more programming languages, including an object-oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on a user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN), a personal area network (PAN), or a wide area network (WAN), or the connection may be made to an external computer (e.g., through the Internet using an Internet Service Provider).
FIG. 13 depicts a computing system 1300 in accordance with an illustrative embodiment. For example, the computing system 1300 may represent an electronic computing or communications device, such as the laptop 112 of FIG. 1. The computing system 1300 may be utilized to receive video and telemetry data for creating and managing integrated video. The computing system 1300 includes a processor unit 1301 (possibly including multiple processors, multiple cores, multiple nodes, and/or implementing multithreading, etc.). The computing system includes memory 1307. The memory 1307 may be system memory (e.g., one or more of cache, SRAM, DRAM, zero capacitor RAM, Twin Transistor RAM, eDRAM, EDO RAM, DDR RAM, EEPROM, NRAM, RRAM, SONOS, PRAM, etc.) or any one or more of the above already described possible realizations of machine-readable media. The computing system also includes a bus 1303 (e.g., PCI, ISA, PCI-Express, HyperTransport®, InfiniBand®, NuBus, etc.), a network interface 1305 (e.g., an ATM interface, an Ethernet interface, a Frame Relay interface, SONET interface, wireless interface, etc.), and storage device(s) 1309 (e.g., optical storage, magnetic storage, etc.).
The system memory 1307 embodies functionality to implement embodiments described above. The system memory 1307 may include one or more functionalities that facilitate retrieval of the video and telemetry information associated with an identifier. Code may be implemented in any of the other devices of the computing system 1300. Any one of these functionalities may be partially (or entirely) implemented in hardware and/or on the processor unit 1301. For example, the functionality may be implemented with an application specific integrated circuit, in logic implemented in the processor unit 1301, in a co-processor on a peripheral device or card, etc. Further, realizations may include fewer or additional components not illustrated in FIG. 13 (e.g., video cards, audio cards, additional network interfaces, peripheral devices, etc.). The processor unit 1301, the storage device(s) 1309, and the network interface 1305 are coupled to the bus 1303. Although illustrated as being coupled to the bus 1303, the memory 1307 may be coupled to the processor unit 1301.
The illustrative embodiments are not to be limited to the embodiments described herein. In particular, the illustrative embodiments contemplate numerous variations in the type of ways in which embodiments may be applied. The foregoing description has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the disclosure to the precise forms disclosed. It is contemplated that other alternatives or exemplary aspects are considered included in the disclosure. The description is merely examples of embodiments, processes or methods of the invention. It is understood that any other modifications, substitutions, and/or additions may be made, which are within the intended spirit and scope of the disclosure. From the foregoing, it may be seen that the disclosure accomplishes at least all of the intended objectives. The illustrative embodiments are meant to be combined, integrated, separated, or so forth regardless of imposed restrictions, impositions, or so forth. The previous detailed description is of a small number of embodiments for implementing the invention and is not intended to be limiting in scope. The following claims set forth a number of the embodiments of the invention disclosed with greater particularity.

Claims

What is claimed is:
1. A method for combining video with telemetry data, comprising:
receiving the video from a camera associated with a user at a wireless device;
receiving telemetry data associated with the video at the wireless device;
time stamping the telemetry data as received;
overlaying the video with the telemetry data to generate integrated video utilizing the wireless device; and
communicating the integrated video from the wireless device to one or more users.
2. The method of claim 1, further comprising:
receiving a user selection of the telemetry data to be overlaid on the video.
3. The method of claim 1, further comprising:
storing the video and the telemetry data separately to overlay the video with the telemetry data as needed for communicating the integrated video.
4. The method of claim 1, further comprising:
uploading the integrated video to share with one or more users.
5. The method of claim 4, wherein the integrated video is uploaded to a cloud platform accessible by a plurality of users.
6. The method of claim 1, further comprising:
saving the telemetry data into an XML file;
drafting artifacts on a layer;
retrieving data values from the XML file for presentation; and
synchronizing the telemetry data with the video utilizing the timestamps and sample rates for the telemetry data.
7. The method of claim 1, wherein the wireless device is a smart phone, tablet, or laptop.
8. The method of claim 1, wherein the telemetry data is captured by one or more sensors.
9. The method of claim 8, wherein the one or more sensors include biometric sensors associated with the user.
10. The method of claim 8, wherein the one or more sensors include sensors associated with a vehicle or an environment of the user.
11. The method of claim 1, wherein the telemetry data is rendered as part of the integrated video in response to user preferences.
12. A wireless device for integrating video and telemetry data, comprising:
a processor executing a set of instructions;
a transceiver in communication with a video source and one or more data sources;
a memory storing the set of instructions, wherein the instructions are executed to:
receive the video from the video source;
receive telemetry data associated with the video from the one or more data sources;
time stamp the telemetry data as received;
overlay the video with the telemetry data to generate integrated video, wherein the integrated video is communicated to one or more devices; and
communicate the integrated video to one or more users.
13. The wireless device of claim 12, wherein the one or more data sources are sensors for a user, vehicle, or environment, and wherein the video source is a camera or a plurality of cameras.
14. The wireless device of claim 12, wherein the one or more users are designated by a user controlling the wireless device.
15. The wireless device of claim 12, wherein the formatting of the telemetry data within the integrated video is specified by an authorized user, and wherein the integrated video is communicated through one or more social networks.
16. The wireless device of claim 12, wherein the video and the telemetry data are saved separately in the memory to overlay the video with the telemetry data as needed for communicating the integrated video.
17. The wireless device of claim 12, wherein the set of instructions are further executed to:
save the telemetry data into an XML file;
draft artifacts on a layer;
retrieve data values from the XML file for presentation; and
synchronize the telemetry data with the video utilizing the timestamps and sample rates for the telemetry data.
18. The wireless device of claim 12, wherein the set of instructions are further executed to:
save the video and the telemetry data separately to overlay the video with the telemetry data as requested for communicating the integrated video.
19. The wireless device of claim 12, wherein the video source is mounted to a vehicle.
20. A wired device for integrating video and telemetry data, comprising:
a processor executing a set of instructions;
a transceiver in communication with a video source and one or more data sources;
a memory storing the set of instructions, wherein the instructions are executed to:
receive the video from the video source;
receive telemetry data associated with the video from the one or more data sources;
time stamp the telemetry data as received;
overlay the video with the telemetry data to generate integrated video, wherein the integrated video is communicated to one or more devices; and
communicate the integrated video to one or more users.