EP3175372A1 - System and method for providing and interacting with coordinated presentations - Google Patents
System and method for providing and interacting with coordinated presentationsInfo
- Publication number
- EP3175372A1 (application EP15828271.5A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- computing device
- content
- data stream
- data
- coordinated presentation
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F17/00—Digital computing or data processing equipment or methods, specially adapted for specific functions
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/02—Editing, e.g. varying the order of information signals recorded on, or reproduced from, record carriers
- G11B27/031—Electronic editing of digitised analogue information signals, e.g. audio or video signals
-
- G—PHYSICS
- G11—INFORMATION STORAGE
- G11B—INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
- G11B27/00—Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
- G11B27/10—Indexing; Addressing; Timing or synchronising; Measuring tape travel
- G11B27/11—Indexing; Addressing; Timing or synchronising; Measuring tape travel by using information not detectable on the record carrier
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/2224—Studio circuitry; Studio devices; Studio equipment related to virtual studio applications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/265—Mixing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/272—Means for inserting a foreground image in a background image, i.e. inlay, outlay
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/222—Studio circuitry; Studio devices; Studio equipment
- H04N5/262—Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
- H04N5/272—Means for inserting a foreground image in a background image, i.e. inlay, outlay
- H04N5/275—Generation of keying signals
Definitions
- This application claims priority to U.S. Patent Application Serial Number 62/031,114, filed on July 30, 2014, and is a continuation-in-part of U.S. Patent Application Serial Number 14/316,536, filed June 26, 2014, the entire contents of each of which are incorporated by reference herein as if expressly set forth in their respective entireties.
- The present application relates, generally, to content presentation and, more particularly, to a system and method for providing and interacting with coordinated presentations.
- The present application addresses this need with a system and method in which the broadcaster, who may be an individual using a portable computer device, provides viewers with the ability to launch supplemental content that the broadcaster has curated to chosen topics and information sources. As such, a more personal and deeper experience can be had by utilizing the present invention.
- A computing device includes a processor and a memory, the computing device being configured by code stored in the memory and executed by the processor.
- Curated content is selected by a user for inclusion in the coordinated presentation, and a plurality of images are captured for inclusion in the coordinated presentation.
- A first data stream associated with the curated content and a second data stream associated with the captured images are received, and a first arrangement of at least some content associated with the first and second data streams is output.
- Telemetry information associated with at least one of a gyroscope and an accelerometer is processed, and at least some of the curated content is modified in accordance with at least some of the telemetry information and/or at least one of the plurality of images comprised in the second data stream.
- Respective content associated with the two data streams is integrated to generate the coordinated presentation.
- The coordinated presentation is capable of transmission to and receipt by one or more remote devices, and is configured to enable interaction with at least a portion of the curated content at each of the remote devices.
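The integration of the two data streams described above can be sketched in miniature as follows. This is an illustrative pairing of captured frames with whichever curated item is active at each frame's timestamp; the `Frame` structure and function names are assumptions for illustration, not the patent's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    timestamp: float   # seconds since recording started
    image: str         # stand-in for captured pixel data

def integrate_streams(curated, captured):
    """Pair each captured frame with the curated item active at its
    timestamp, producing the merged track of the coordinated presentation.
    `curated` is a list of (start_time, item) pairs sorted by start time."""
    merged = []
    for frame in captured:
        active = None
        for start, item in curated:
            if start <= frame.timestamp:
                active = item          # most recent item at or before frame
        merged.append((frame, active))
    return merged
```

In a real system the "items" would be vApps, web views, or background media rather than strings, and the pairing would drive the video mixer described below.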
- Fig. 1 is a diagram illustrating an example hardware arrangement that operates for providing the systems and methods disclosed herein;
- Fig. 2A is a block diagram that illustrates functional elements of a computing device in accordance with an embodiment;
- Fig. 2B is a block diagram representing a plurality of modules that provide functionality shown and described herein;
- Fig. 3 illustrates an example coordinated presentation that illustrates the effect of green screen functionality provided in accordance with one or more implementations of the present application;
- Figs. 4A and 4B illustrate two video frames of an example coordinated presentation;
- Fig. 5 is a simple block diagram that represents a rig that is configured with a camera and computing device;
- Fig. 6 is a block diagram illustrating features of the present application and configured to process two data streams into one single stream;
- Fig. 7 illustrates an example representation, which includes a viewing area that includes a foreground element and two background elements, and a virtual background that extends past the viewing area;
- Fig. 8 illustrates an example representation of authoring a coordinated presentation in accordance with the present application;
- Fig. 9 illustrates another example representation of authoring a coordinated presentation in accordance with the present application; and
- Fig. 10 is a flow diagram showing a routine that illustrates a broad aspect of a method for authoring a coordinated presentation.
- The present application provides systems and methods for authoring and playing video that can be layered with interactive content, including content that is available over one or more data communication networks, such as the Internet.
- Devices running, for example, iOS, ANDROID, WINDOWS MOBILE, BLACKBERRY, MAC OS, WINDOWS or other operating systems are configured to provide functionality, such as an authoring tool and interface for developing distributable coordinated presentations, including videos that include customizable and interactive features for use by one or more end-users that receive the presentations.
- The software applications provide a viewing/interactive tool, referred to herein, generally, as a "consuming" interface, for end-users who receive videos that are authored in accordance with the present application.
- Users may interact with videos as a function of touch and gestures, as well as other suitable interfaces, such as a mouse, trackball, keyboard or other input.
- Some functionality available to end-users is defined by an author.
- A video mixer module can be provided that comprises instructions executing so as to configure a processor to integrate a plurality of images captured by a camera together with a portion of the curated content via a user selection from a touch-screen interface, and thereby to generate a coordinated presentation that is capable of transmission to and receipt by one or more remote devices. The coordinated presentation is configured to enable interaction with the portion of the curated content at each of the remote devices, such that results of a respective interaction at a particular remote device are viewable at the particular remote device but are not viewable at (a) others of the remote devices and (b) the display.
- Coordinated presentations may be configured with interactive options, which may include images, video content, website content, or computer programs (collectively referred to herein, generally, as "vApps").
- An authoring tool can be integrated with a player tool, and the tools enable one or more vApps to be embedded in video or, in one or more implementations, a code is embedded in the video for enhanced functionality.
- A play/pause button can be provided that enables a user to play or pause playback of a coordinated presentation.
- A timeline function can be provided that identifies a current time location within a coordinated presentation, as well as to navigate therein.
- vApp icons can be provided that represent vApps that are included with the current coordinated presentation at respective time locations therein.
- When a user selects a vApp icon, the coordinated presentation jumps to the corresponding time location, and the user can interact with the respective vApp.
- Information can be time-coded in video, and selectable user interactive elements for navigation/time can be provided.
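The time-coded navigation described above might be modeled along the following lines — a hypothetical sketch, not the patent's implementation, in which tapping a vApp icon seeks playback to that vApp's time location:

```python
import bisect

class Timeline:
    """Time-coded vApp index: tapping an icon seeks playback to the
    vApp's time location. Names and structure are illustrative."""

    def __init__(self, vapps):
        # vapps: iterable of (time_seconds, vapp_name), in any order
        self.entries = sorted(vapps)
        self.position = 0.0

    def tap(self, vapp_name):
        """Jump playback to the named vApp's time location."""
        for t, name in self.entries:
            if name == vapp_name:
                self.position = t
                return t
        raise KeyError(vapp_name)

    def active_vapp(self, t):
        """vApp most recently introduced at or before time t, if any."""
        i = bisect.bisect_right(self.entries, (t, chr(0x10FFFF)))
        return self.entries[i - 1][1] if i else None
```

A consuming interface would call `tap` from the icon's touch handler and use `active_vapp` while scrubbing the timeline.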
- The present application includes and improves functionality for chroma key compositing, often referred to as use of a "green screen" and/or "blue screen."
- A computing device configured with an authoring tool and interface for developing distributable coordinated presentations manipulates background content provided in a coordinated presentation as a function of the movement and angle of the camera(s) used during recording of the coordinated presentation.
- One or more foreground elements can be the basis of such background manipulation.
- A module executing on a device configured with an authoring tool detects an angle of view, such as a function of camera position and/or angle of view of one or more foreground elements, and manipulates the appearance of the composited background content to eliminate an otherwise static appearance of the background content.
- The background content can be, for example, composited in place of a background provided in a respective color range (e.g., green).
- Referring to Fig. 1, a diagram is provided of an example hardware arrangement that operates for providing the systems and methods disclosed herein, and designated generally as system 100.
- System 100 can include one or more data processing apparatuses 102 that are at least communicatively coupled to one or more user computing devices 104 across communication network 106.
- Data processing apparatuses 102 and user computing devices 104 can include, for example, mobile computing devices such as tablet computing devices, smartphones, personal digital assistants or the like, as well as laptop computers and/or desktop computers. Further, one computing device may be configured as a data processing apparatus 102 and a user computing device 104, depending upon the operations being executed at a particular time.
- An audio/visual capture device 105 is depicted in Fig. 1.
- The audio/visual capture device 105 can be configured with one or more cameras (e.g., front-facing and rear-facing cameras), a microphone, a microprocessor, and communications module(s), and is coupled to data processing apparatus 102.
- The audio/visual capture device 105 can be configured to interface with one or more data processing apparatuses 102 for producing high-quality audio/video content.
- Data processing apparatus 102 can be configured to access one or more databases for the present application, including image files, video content, documents, audio/video recordings, metadata and other information.
- Data processing apparatus 102 can be configured to access Internet websites and other online content.
- Data processing apparatus 102 can access any required databases via communication network 106 or any other communication network to which it has access.
- Data processing apparatus 102 can communicate with devices including those that comprise databases, using any known communication method, including Ethernet, direct serial, parallel, universal serial bus (“USB”) interface, and/or via a local or wide area network.
- User computing devices 104 communicate with data processing apparatuses 102 using data connections 108, which are respectively coupled to communication network 106.
- Communication network 106 can be any communication network, but is typically the Internet or some other global computer network.
- Data connections 108 can be any known arrangement for accessing communication network 106, such as the public Internet, private Internet (e.g., a VPN), dedicated Internet connection, dial-up serial line interface protocol/point-to-point protocol (SLIP/PPP), integrated services digital network (ISDN), digital subscriber line (DSL), or asynchronous transfer mode (ATM).
- User computing devices 104 preferably have the ability to send and receive data across communication network 106, and are equipped with web browsers, software applications, or other software and/or hardware tools, to provide received data on audio/visual devices incorporated therewith.
- User computing devices 104 may be personal computers such as Intel Pentium-class and Intel Core-class computers or Apple Macintosh computers, tablets, or smartphones, but are not limited to such devices.
- Other computing devices which can communicate over a global computer network such as palmtop computers, personal digital assistants (PDAs) and mass-marketed Internet access devices such as WebTV can be used.
- The hardware arrangement of the present invention is not limited to devices that are physically wired to communication network 106; wireless communication can be provided between wireless devices and data processing apparatuses 102.
- The present application provides improved processing techniques to prevent packet loss, to improve handling of interruptions in communications, and to address other issues associated with wireless technology.
- System 100 preferably includes software that provides functionality described in greater detail herein, and preferably resides on one or more data processing apparatuses 102 and/or user computing devices 104.
- One of the functions performed by data processing apparatus 102 is that of operating as a web server and/or a web site host.
- Data processing apparatuses 102 typically communicate with communication network 106 across a permanent, i.e., un-switched, data connection 108. Permanent connectivity ensures that access to data processing apparatuses 102 is always available.
- FIG. 2A illustrates, in block diagram form, an exemplary data processing apparatus 102 and/or user computing device 104 that can provide functionality in accordance with interactive conferencing, as described herein.
- Data processing apparatus 102 and/or user computing device 104 may include one or more microprocessors 205 and connected system components (e.g., multiple connected chips) or the data processing apparatus 102 and/or user computing device 104 may be a system on a chip.
- the data processing apparatus 102 and/or user computing device 104 includes memory 210 which is coupled to the microprocessor(s) 205.
- the memory 210 may be used for storing data, metadata, and programs for execution by the microprocessor(s) 205.
- the memory 210 may include one or more of volatile and non-volatile memories, such as Random Access Memory (“RAM”), Read Only Memory (“ROM”), Flash, Phase Change Memory (“PCM”), or other type of memory.
- the data processing apparatus 102 and/or user computing device 104 also includes an audio input/output subsystem 215 which may include a microphone and/or a speaker for, for example, playing back music, providing telephone or voice/video chat functionality through the speaker and microphone, etc.
- a display controller and display device 220 provides a visual user interface for the user; this user interface may include a graphical user interface which, for example, is similar to that shown on a Macintosh computer when running Mac OS operating system software or an iPad, iPhone, or similar device when running mobile computing device operating system software.
- the data processing apparatus 102 and/or user computing device 104 also includes one or more wireless transceivers 230, such as an IEEE 802.11 transceiver, an infrared transceiver, a Bluetooth transceiver, a wireless cellular telephony transceiver (e.g., 1G, 2G, 3G, 4G), or another wireless protocol to connect the data processing system 100 with another device, external component, or a network.
- Gyroscope/Accelerometer 235 can be provided.
- The data processing apparatus 102 and/or user computing device 104 may be a personal computer, tablet-style device, such as an iPad, a personal digital assistant (PDA), a cellular telephone with PDA-like functionality, such as an iPhone, a Wi-Fi based telephone, a handheld computer which includes a cellular telephone, a media player, such as an iPod, an entertainment system, such as an iPod touch, or devices which combine aspects or functions of these devices, such as a media player combined with a PDA and a cellular telephone in one device.
- the data processing apparatus 102 and/or user computing device 104 may be a network computer or an embedded processing apparatus within another device or consumer electronic product.
- the data processing apparatus 102 and/or user computing device 104 also includes one or more input or output (“I/O") devices and interfaces 225 which are provided to allow a user to provide input to, receive output from, and otherwise transfer data to and from the system.
- I/O devices may include a mouse, keypad or a keyboard, a touch panel or a multi-touch input panel, camera, network interface, modem, other known I/O devices or a combination of such I/O devices.
- the touch input panel may be a single touch input panel which is activated with a stylus or a finger or a multi-touch input panel which is activated by one finger or a stylus or multiple fingers, and the panel is capable of distinguishing between one or two or three or more touches and is capable of providing inputs derived from those touches to the data processing apparatus 102 and/or user computing device 104.
- The I/O devices and interfaces 225 may include a connector for a dock or a connector for a USB interface, FireWire, etc. to connect the system 100 with another device, external component, or a network.
- The I/O devices and interfaces can include a gyroscope and/or accelerometer 227, which can be configured to detect 3-axis angular acceleration around the X, Y and Z axes, enabling precise calculation, for example, of yaw, pitch, and roll.
- The gyroscope and/or accelerometer 227 can be configured as a sensor that detects acceleration, shake, vibration shock, or fall of a device 102/104, for example, by detecting linear acceleration along one of three axes (X, Y and Z).
- The gyroscope can work in conjunction with the accelerometer to provide detailed and precise information about the device's axial movement in space.
- The 3 axes of the gyroscope combined with the 3 axes of the accelerometer enable the device to recognize approximately how far, fast, and in which direction it has moved, to generate telemetry information associated therewith that is processed to generate coordinated presentations, such as shown and described herein.
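One simple way to turn raw gyroscope samples into the yaw/pitch/roll telemetry described above is dead-reckoning integration of the angular rates. The sketch below assumes fixed-interval sampling and is deliberately simplified; real sensor fusion would also use the accelerometer to correct gyroscope drift:

```python
def integrate_gyro(samples, dt):
    """Dead-reckon device orientation from gyroscope samples.
    `samples` is a sequence of (x_rate, y_rate, z_rate) angular rates in
    degrees/second, taken every `dt` seconds; returns the cumulative
    (pitch, roll, yaw) in degrees after each sample."""
    pitch = roll = yaw = 0.0
    telemetry = []
    for x, y, z in samples:
        pitch += x * dt
        roll += y * dt
        yaw += z * dt
        telemetry.append((pitch, roll, yaw))
    return telemetry
```

The resulting per-sample orientations are the kind of telemetry the tracking module described below could log frame by frame.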
- hardwired circuitry may be used in combination with the software instructions to implement the present embodiments.
- the techniques are not limited to any specific combination of hardware circuitry and software, or to any particular source for the instructions executed by the data processing apparatus 102 and/or user computing device 104.
- Fig. 2B is a block diagram representing a plurality of modules 250 that provide functionality shown and described herein. It is to be appreciated that several of the logical operations described herein can be implemented (1) as a sequence of computer implemented acts or program modules running on the various devices of the system 100 and/or (2) as interconnected machine logic circuits or circuit modules within the system 100. One or more particular implementations can be configured as a function of specifications of a particular device (e.g., size, energy consumption, performance, etc.). Accordingly, the logical operations described herein are referred to variously as operations, steps, structural devices, acts, or modules.
- the various operations, steps, structural devices, acts and modules can be implemented in software, in firmware, in special purpose digital logic, and any combination thereof. It should also be appreciated that more or fewer operations can be performed than shown in the figures and described herein. These operations can also be performed in a different order than those described herein.
- the modules 250 can be provided via a single computing device, such as data processing apparatus 102, or can be distributed via a plurality of computing devices, such as computing devices 102 and/or 104. Moreover, one module can be provided via a single computing device (102 or 104), or can be distributed across a plurality of computing devices.
- A tracking module 252 is provided that includes functionality for detecting acceleration and movement in space, and/or for providing and/or processing information in association with such movement.
- The tracking module 252 can transform data collected as a result of movement of the computing device 104, as well as other "passive" information, such as timestamp information, into new information.
- The timestamp can function as a key for coordinating telemetry information and image/sound content.
- Information, such as the timestamp or information transformed by tracking module 252, can be embedded into the stream for use in generating and/or viewing coordinated presentations.
- Chroma key module 254 includes functionality for users to select one or more pixels and define one or more colors, degrees or a range of color or other input that is used as a basis for defining portions of a frame to be removed or replaced by background content (e.g., green screen functionality).
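The keying step this module performs can be sketched as follows — a pure-Python illustration of replacing pixels within a tolerance of the selected key color with the corresponding background pixels. Frames here are lists of rows of (r, g, b) tuples; a production implementation would run per-pixel on the GPU:

```python
def chroma_key(frame, background, key_rgb, tolerance):
    """Replace every pixel within `tolerance` (per channel) of the key
    color with the corresponding background pixel. The key color would
    be sampled via the color dropper control described in the text."""
    def close(p, k):
        return all(abs(a - b) <= tolerance for a, b in zip(p, k))
    return [
        [bg if close(px, key_rgb) else px
         for px, bg in zip(frow, brow)]
        for frow, brow in zip(frame, background)
    ]
```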
- Background adjustment module 256 is configured to modify the appearance of background content, such as to move the background content or change the angle of view, in accordance with telemetry information received and processed as a function of tracking module 252.
- These modules can include tools (e.g., class files, packages or other program resources) that enable software developers to interface with one or more hardware and software systems and to develop and/or access custom video software applications, such as shown and described herein.
- 360 degree module 258 is configured to display up to a 360° view of content displayed on computing device 102/104, including in response to information received and processed as a function of tracking module 252.
- The content includes a "stitched" image file having a 360 degree field of view horizontally, and further includes a 180 degree field of view vertically.
- The present application can include interactive capability for providing the content in a 360-degree view.
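Mapping device heading onto a window of such a stitched panorama can be sketched as below; the function name and the simple linear yaw-to-column mapping are illustrative assumptions, with the key point being the wrap-around at the panorama seam:

```python
def visible_columns(panorama_width, yaw_degrees, view_width):
    """Map device yaw onto the horizontal window of a 360-degree
    stitched panorama, wrapping at the seam. Returns the panorama
    column indices filling a viewport `view_width` columns wide."""
    # column at the centre of the current view
    center = int((yaw_degrees % 360.0) / 360.0 * panorama_width)
    half = view_width // 2
    return [(center - half + i) % panorama_width for i in range(view_width)]
```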
- Audio/Visual input module 260 includes functionality to interface with audio/visual devices, such as configured to interface with data processing apparatus 102 and/or user computing device 104.
- Audio/Visual input module 260 interfaces with a camera and microphone communicatively coupled to data processing apparatus 102 for use in authoring videos that can be layered with interactive content, such as shown and described herein.
- Such hardware can be used to capture images and sound for use in generating content that is composited with other graphic and/or sonic data.
- The present application can include an effects module 262 that enables one or more processing effects on audio and/or video.
- Among effects, including those shown and described herein, green screen and white screen functionality can be provided to enable virtual placement of a presenter of video content in respective environments.
- Other example effects processing associated with effects module 262 is shown and described below.
- Recommender module 264 can be included that is configured to provide tools that interface with one or more hardware and software systems and to support content sharing and a new form of social networking, as a function of accessible video files that can be layered with various interactive content, and that can be authored, distributed and played in accordance with the teachings herein.
- Advertisement module 266 can be included that is usable to interface with one or more hardware and software systems to provide advertisements in particular contexts and at particular times.
- Fig. 3 illustrates an example interface for authoring a coordinated presentation and that shows the effect of green screen functionality provided in accordance with one or more implementations of the present application.
- Green screen portion 302 is shown in connection with an image file of a chain-link fence behind a foreground element, e.g., the people who are being photographed, and thereafter combined in the coordinated presentation.
- A "presenter" refers, generally, to a person, organization and/or computing device associated therewith that makes and/or distributes coordinated presentations.
- The present application can operate to provide functionality for substantially real-time compositing, which can include capturing of whatever is being displayed in the video frame, including from the camera or other content source, as well as capturing meta-data that is around or otherwise associated with one or more elements of the content, which is usable to report to a player device at a later point in time.
- Meta-data can include, for example, XML data.
- The present application provides a plurality of templates (also referred to generally as "Themes") for quickly authoring a coordinated presentation (e.g., a "TouchCast") that is configured with one or more features associated with a particular style or subject matter.
- Options are available for authoring a new coordinated presentation, which may be in a default configuration and absent of any particular theme, as well as for authoring a new coordinated presentation using or based on a particular theme (e.g., a newscast).
- Data processing apparatus 102 and/or user computing device 104 can provide one or more options for "Green Screen" functionality in the authoring tool for authoring a coordinated presentation.
- A graphical user interface associated with the authoring tool is configured with a color dropper control that, when selected by the user, enables the user to define a chroma key value, such as from one or more pixels, which represents one or more suitable portions of the background of the coordinated presentation to be replaced by other content, such as an image or video, as well as HTML-based or other content.
- Sensitivity and smoothing slider controls can be provided, which operate to adjust the relative smoothness and impact of green-screen keying.
- Such controls can reduce the impact of light reflecting from a chroma (e.g., green) screen onto a subject, which can impact realism, and better integrate the subject into the background content.
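One common way such reflected-light correction is realised is spill suppression: clamping the green channel so it never exceeds the larger of red and blue, which removes the green cast a chroma screen reflects onto the subject. This is a sketch of one classic technique, not necessarily the controls' actual implementation:

```python
def suppress_green_spill(pixel):
    """Clamp the green channel of an (r, g, b) pixel to max(r, b),
    removing green spill reflected from a chroma screen onto the
    subject while leaving non-spill pixels unchanged."""
    r, g, b = pixel
    return (r, min(g, max(r, b)), b)
```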
- A sound effects control can be included that enables soundboard functionality, which can be selected during recording of a coordinated presentation. For example, a car horn, cymbal crash or virtually any other sound effect can be added to a coordinated presentation.
- Figs. 4A and 4B illustrate two video frames 400 and 401 of an example coordinated presentation, and represent changes in the foreground (photographed) element 402, which, in Figs. 4A and 4B, is represented as a human stick figure in solid lines.
- In the example shown in Figs. 4A and 4B, the foreground element 402 appears to have moved from one portion of the frame to another portion. The change could have occurred for various reasons, such as a result of camera and/or computing device movement, which can be tracked as a function of tracking module 252, and which generates new information as a result of such tracking.
- The virtual background elements 404 are represented as trees in broken lines.
- In the example shown in Fig. 4A, the background areas 404 are static (e.g., not moving) and, accordingly, the frame 400 does not appear realistic, because the movement of the foreground area 402 (or the movement of the angle of view of foreground area 402), such as due to camera panning, is not represented in the background areas 404.
- The frame 401 shown in Fig. 4B appears more realistic due to appropriate repositioning of the background elements 404 in view of the movement of the foreground area 402, through use of the new information from the tracking module.
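The background repositioning illustrated by Figs. 4A and 4B amounts to a parallax-style shift: the virtual background moves opposite to the camera pan, and by less than the foreground when it is meant to look farther away. A minimal sketch, assuming a linear model and illustrative parameter names:

```python
def background_offset(camera_pan_degrees, depth_factor, pixels_per_degree):
    """Horizontal shift (in pixels) to apply to the virtual background
    for a given camera pan. `depth_factor` in (0, 1]: smaller values
    mean the background is 'farther away' and so moves less; the sign
    is negated so the background counter-moves relative to the pan."""
    return -camera_pan_degrees * pixels_per_degree * depth_factor
```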
- information is processed by tracking module 252, including telemetry information associated with an accelerometer and/or gyroscope and associated with image frames in video. This can be processed to affect a selection of a respective chroma key, to select content associated with a chroma key, and/or as well as for the relative placement of a subject in the frame, such as a person being photographed during production of a coordinated presentation.
- the placement of the subject in the frame, as well as the appearance of background content that is superimposed can be modified and adjusted in response to output from the tracking module 252, as well as in response to internal and/or external parameters, such as GPS-based information, gesture-based user input, timestamp information, and image visual effects.
- Information associated with the telemetry of the camera and/or computing device 104 is processed to determine the position and/or location of a subject in the foreground, and used to control the orientation and angle of view of the background content substantially in real-time as the coordinated presentation is being viewed. Accordingly, the positioning of the elements being displayed in the coordinated composite image is optimized.
- the processed information is used as input to one or more algorithms that configure a processor to alter the appearance of the background content, such as by providing movement of the background content relative to movement of one or more subjects in the foreground and/or movement of an angle of view of one or more subjects in the foreground.
- information associated with one or more active user events and/or inactive effects can also be processed to provide input that contributes to a determination of suitable movement and/or adjustment of the background content.
- gyroscope and/or accelerometer information captured in accordance with tracking module 252 is stored in one or more databases, such as in a data log file.
- the gyroscope and/or accelerometer information can be captured frame by frame, and can represent the position, orientation and/or heading associated with content in each frame.
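As a sketch of such frame-by-frame capture, a data log might carry one record per frame (the class and field names below are illustrative assumptions, not taken from the application, and JSON lines stand in for whatever log format an implementation uses):

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class FrameTelemetry:
    """Gyroscope/accelerometer reading associated with one video frame."""
    frame: int            # frame index within the capture
    yaw: float            # heading, degrees
    pitch: float          # orientation, degrees
    roll: float
    accel: tuple          # (ax, ay, az), m/s^2

def to_records(samples):
    """One record per frame, so each frame's position, orientation and
    heading can be recovered later during compositing."""
    return [asdict(s) for s in samples]

def write_log(samples, path):
    """Persist the per-frame records as a JSON-lines data log file."""
    with open(path, "w") as f:
        for rec in to_records(samples):
            f.write(json.dumps(rec) + "\n")
```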
- the background adjustment module processes information representing, for example, that pivoting occurs (e.g., by the camera 105) and results in corresponding pivot and/or other adjustment of the appearance of the virtual background.
- a single computing device 102/104 such as a mobile computing device configured as a tablet computer or smart phone, can be configured to provide the features shown and described herein.
- a rig 500 is provided that is configured with a camera 105 and computing device 102/104.
- Camera 105 may be a digital single lens reflex camera or other camera that is configured with an image sensor of higher quality or capacity than included with computing device 102/104, and usable to provide high-quality image content.
- the standalone camera 105 and the camera that is configured and/or integrated with computing device 102/104 can both be used to capture photographic content
- Telemetry information from computing device 102/104 is processed to control the appearance of background content that has been added to the coordinated presentation, such as to affect the appearance of the angle, distance, viewpoint, or the like of the background content.
- the computing device 102/104 can begin recording at a time earlier than the camera 105, which can result in more telemetry information than photographic content being captured from the stand-alone camera 105.
- using postproduction methods, such as those provided in video editing software, the photographic content captured by camera 105 can be synchronized with the telemetry information captured and processed in connection with computing device 102/104, to adjust virtual background content in the coordinated presentation accordingly.
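The offset between the two recordings might be reconciled along these lines (the function and the nearest-sample alignment strategy are assumptions for illustration, not the application's prescribed method):

```python
def align_telemetry(telemetry, video_start, fps):
    """Trim telemetry logged before camera 105 began recording, then map
    each video frame time to the nearest remaining telemetry sample.

    telemetry   -- list of (timestamp_seconds, reading), sorted by time
    video_start -- timestamp at which the standalone camera started
    fps         -- camera frame rate
    Returns one reading per frame over the overlapping interval.
    """
    # keep only samples recorded at or after the camera's first frame
    overlap = [(t, r) for (t, r) in telemetry if t >= video_start]
    if not overlap:
        return []
    n_frames = int((overlap[-1][0] - video_start) * fps) + 1
    aligned, i = [], 0
    for f in range(n_frames):
        frame_time = video_start + f / fps
        # advance while the next sample is at least as close to this frame
        while (i + 1 < len(overlap)
               and abs(overlap[i + 1][0] - frame_time) <= abs(overlap[i][0] - frame_time)):
            i += 1
        aligned.append(overlap[i][1])
    return aligned
```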
- a computing device configured in accordance with the present application processes two data streams into one single stream, substantially as illustrated in the block diagrams shown in Fig. 6.
- One of the two streams can be captured by a camera, such as configured or integrated with a computing device (e.g., a smart phone camera), and the other of the two streams can be generated by a computing device or originate from a data file, such as an image file (e.g., TIFF, JPG, PNG, or BMP format).
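A minimal sketch of merging the two data streams into one composited frame (the green key colour, tolerance value, and NumPy representation are illustrative assumptions):

```python
import numpy as np

def composite_frame(camera_rgb, background_rgb, key=(0, 255, 0), tol=60):
    """Replace camera pixels near the chroma-key colour with the
    corresponding pixels from the background stream's frame."""
    cam = camera_rgb.astype(np.int16)
    # per-pixel distance to the key colour
    dist = np.linalg.norm(cam - np.array(key, dtype=np.int16), axis=-1)
    mask = dist < tol                      # True where background shows through
    out = camera_rgb.copy()
    out[mask] = background_rgb[mask]
    return out
```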
- a corresponding and opposite adjustment is made to one or more background elements 404 represented in the second stream, such as to move a background element 404 five pixels down and three pixels to the left.
- This can generate a smooth and realistic appearance of the coordinated presentation, as a function of the background adjustment module 256.
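The equal-and-opposite adjustment described above can be sketched as follows (screen coordinates assumed, with x growing right and y growing down; the function name is illustrative):

```python
def counter_adjust(element_pos, foreground_delta):
    """Move a background element opposite to the foreground's measured
    frame-to-frame motion, so the scene appears anchored in space."""
    (ex, ey), (dx, dy) = element_pos, foreground_delta
    return (ex - dx, ey - dy)
```

For example, if the foreground subject shifts three pixels right and five pixels up between frames, a background element at (100, 100) moves three pixels to the left and five pixels down, matching the adjustment described above.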
- a mapping module 262 is provided that identifies a set of objects (e.g., foreground elements 402 and background elements 404), and manipulates the appearance of objects and/or set of objects.
- the present application supports complex manipulations, such as via background adjustment module 256, including to account for pitch, yaw, and roll, and to modify the appearance of one or more background elements 404, such as to change the perspective, the skew, lighting and shadows or the like, such as in the context of one or more respective foreground elements. For example, adjustments can be made in connection with brightness, contrast and/or saturation.
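One way such a pitch/yaw/roll-driven perspective change could be computed (the rotation order and the pinhole projection below are conventional choices for illustration, not specified by the application):

```python
import numpy as np

def rotation_matrix(yaw, pitch, roll):
    """3x3 rotation built from gyroscope yaw/pitch/roll in radians
    (yaw about y, then pitch about x, then roll about z -- one common
    convention; a real device would follow its platform's ordering)."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])
    return Rz @ Rx @ Ry

def reproject_corner(corner, yaw, pitch, roll, focal=1.0):
    """Rotate a background corner (camera-space 3-vector) and project it
    back onto the image plane, yielding its new perspective position."""
    x, y, z = rotation_matrix(yaw, pitch, roll) @ np.asarray(corner, float)
    return (focal * x / z, focal * y / z)
```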
- an alternative output stream 608 can be generated, which corresponds to the alternative implementation described below with reference to Fig. 7 and in connection with a 360° file, such as an image file, a 3-D file or some other content capable of supporting a 360° display and that is configurable as a virtual background.
- video can be provided that supports an effect of a 360° environment, including a 3-D environment, that the author of a coordinated presentation can explore ( e.g., walk through, display to a viewer, or the like).
- the present application supports video and 3-D files as possible background environments.
- the second alternative data stream 608 can be manipulable by the viewer during playback of the coordinated presentation. For example, as the viewer of the coordinated presentation moves his or her computing device 104, telemetry information is accessed by computing device 104 and the 360° virtual background image can pan accordingly.
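The viewer-driven panning might map device heading to a horizontal offset into the panorama roughly as follows (the function names and wrap-at-the-seam behaviour are illustrative assumptions):

```python
def pan_offset(device_yaw_deg, pano_width_px):
    """Map the viewer device's heading (degrees) to a horizontal pixel
    offset into a 360-degree panoramic background."""
    return int((device_yaw_deg % 360.0) / 360.0 * pano_width_px) % pano_width_px

def visible_columns(device_yaw_deg, pano_width_px, view_width_px):
    """Panorama columns filling the viewport, wrapping past the seam."""
    start = pan_offset(device_yaw_deg, pano_width_px)
    return [(start + i) % pano_width_px for i in range(view_width_px)]
```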
- the present application supports a virtual background provided in a coordinated presentation that can include content that extends beyond the viewing area of computing device 102/104.
- An example representation is shown in Fig. 7, which includes viewing area 702 that includes a foreground element 402 and two background elements 404.
- the background comprises a 360° panoramic image and the presenter may wish to enable the viewer to see all 360° of the image within the context of one or more foreground elements 402.
- a presenter may provide a coordinated presentation associated with an outdoor landscape, holding the computing device 104 that is integrated with camera 105 in front of her face and turning 360°.
- the virtual background elements 404 in the coordinated presentation appear to display the 360° view of the outdoor landscape behind the presenter, as telemetry information from the computing device 104 and camera 105 is processed by computing device 102 and/or 104.
- such virtual 360° panning can be provided in the coordinated presentation and can respond to movement associated with the presenter.
- such virtual 360° panning can be provided in the coordinated presentation in response to movement by the viewer of the coordinated presentation, for example on a tablet computing device 104.
- the 360° virtual background may comprise a series of individual image frames that are "stitched" in context as information associated with the presenter's device telemetry is referenced, such as by adjustment module 256 and during generation of the coordinated presentation.
- the information associated with the presenter's computing device telemetry is usable to adjust virtual background elements 404, substantially as described herein, which maintains a high degree of realism due to the relative adjustments of the background elements 404 to the foreground elements 402.
- background elements 404 adjustment relative to the foreground element(s) 402 may not occur due to a lack of telemetry information being available to mapping module 262 and/or background adjustment module 256.
- the viewer's computing device 104 can be configured to adjust the virtual background 360° image, such as shown and described herein, as a function of substantially real-time compiling of a coordinated presentation and elements 402, 404 thereof.
- a viewer of a coordinated presentation can manipulate his or her computing device 104, telemetry information associated with a gyroscope and/or accelerometer configured with the device 104, for example, can be processed to modify the appearance of one or more virtual background elements 404, and/or the entire virtual background itself.
- the present application supports integrating content comprising HTML, such as a body of an Internet web page, as background in connection with green screen functionality.
- an author can select a respective Internet web page to replace content.
- Fig. 8 illustrates an example implementation in which an interactive web page 802 is provided as background content during authoring of a coordinated presentation.
- a user overlay option referred to generally herein as "silhouette"
- the author 804 can be virtually placed within the background content (e.g., web page 802) itself.
- Positioning of the author 804 within the background content can be affected using standard selection and movement gestures, such as swiping, pinching or the like.
- the author 804 can be switched as the background element. Accordingly, a vApp (e.g., web page 802) effectively becomes the background content, while maintaining its interactivity, for example, for author and viewer.
- FIG. 9 illustrates another example implementation of an authoring process in accordance with the present application, in which an interactive webpage 802 is provided as background content with the author 804 integrated therewith.
- a social network content feed from a website 802 is displayed as background content and integrated with author 804.
- a flow diagram is described showing a routine S100 that illustrates a broad aspect of a method for authoring a coordinated presentation 304 in accordance with at least one implementation disclosed herein.
- several of the logical operations described herein are implemented (1) as a sequence of computer implemented acts or program modules running on computing device 102/104 and/or (2) as interconnected machine logic circuits or circuit modules within computing device 102/104.
- the implementation is a matter of choice, dependent for example on the requirements of the device (e.g., size, mobility, energy consumption, performance, etc.). Accordingly, the logical operations described herein are referred to variously as operations, steps, structural devices, acts, or modules.
- in step S102 the process starts, for example, as an app launching on a tablet computing device.
- options in the form of graphical screen controls are provided to a user for authoring a coordinated presentation 304, for example, in the form of camera controls, which are selectable and that cause one or more modules to execute instructions associated with the respective control to capture a first data stream that includes content from a camera configured or integrated with the user's tablet computing device.
- options are provided for the user to select background image content, such as content that is stored locally on the user's computing device 102 and/or 104, or content that is available via a data communication network.
- One or more modules execute instructions associated with a respective selected control to capture a second data stream, which includes the background image content.
- telemetry information associated with the user's tablet computing device (e.g., from a gyroscope and/or accelerometer)
- the information is stored in a data log file, such as in XML or other suitable format.
- the stored telemetry information is processed and at least a portion of the virtual background image is modified to correspond with the processed telemetry information and in association with a respective image frame in accordance with the content from the camera.
- a composited video frame is generated that includes at least some of the content from the camera and at least some of the modified virtual background.
- the video frame is integrated with a plurality of other video frames and output in a select format as a coordinated presentation, which is distributed in step S116; thereafter the process ends.
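The overall routine can be sketched as a per-frame loop (the function and its callback parameters are illustrative abstractions of the steps above, not code from the application):

```python
def author_presentation(camera_frames, background, telemetry, adjust, composite):
    """Pair each camera frame (first data stream) with the selected
    background (second data stream), modify the background using that
    frame's telemetry, composite the two, and integrate the resulting
    frames into a single output presentation."""
    presentation = []
    for frame, sample in zip(camera_frames, telemetry):
        adjusted = adjust(background, sample)            # modify virtual background
        presentation.append(composite(frame, adjusted))  # composited video frame
    return presentation                                  # integrated frames for output
```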
- the present application provides significant flexibility and creativity in connection with creating and viewing coordinated presentations. Although many of the examples shown and described herein regard distribution of coordinated presentations to a plurality of users, the invention is not so limited. Although illustrated embodiments of the present invention have been shown and described, it should be understood that various changes, substitutions, and alterations can be made by one of ordinary skill in the art without departing from the scope of the present invention.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Mathematical Physics (AREA)
- Software Systems (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Television Signal Processing For Recording (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201462031114P | 2014-07-30 | 2014-07-30 | |
PCT/US2015/042916 WO2016019146A1 (en) | 2014-07-30 | 2015-07-30 | System and method for providing and interacting with coordinated presentations |
Publications (2)
Publication Number | Publication Date |
---|---|
EP3175372A1 true EP3175372A1 (en) | 2017-06-07 |
EP3175372A4 EP3175372A4 (en) | 2018-03-14 |
Family
ID=55218317
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP15828271.5A Withdrawn EP3175372A4 (en) | 2014-07-30 | 2015-07-30 | System and method for providing and interacting with coordinated presentations |
Country Status (2)
Country | Link |
---|---|
EP (1) | EP3175372A4 (en) |
WO (1) | WO2016019146A1 (en) |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6020931A (en) * | 1996-04-25 | 2000-02-01 | George S. Sheng | Video composition and position system and media signal communication system |
GB201208088D0 (en) * | 2012-05-09 | 2012-06-20 | Ncam Sollutions Ltd | Ncam |
GB2502986B (en) * | 2012-06-12 | 2014-05-14 | Supponor Oy | Apparatus and method for image content replacement |
US20140002580A1 (en) * | 2012-06-29 | 2014-01-02 | Monkeymedia, Inc. | Portable proprioceptive peripatetic polylinear video player |
US20140136999A1 (en) * | 2012-11-14 | 2014-05-15 | Rounds Entertainment Ltd. | Multi-User Interactive Virtual Environment System and Method |
-
2015
- 2015-07-30 WO PCT/US2015/042916 patent/WO2016019146A1/en active Application Filing
- 2015-07-30 EP EP15828271.5A patent/EP3175372A4/en not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---|
WO2016019146A1 (en) | 2016-02-04 |
EP3175372A4 (en) | 2018-03-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9852764B2 (en) | System and method for providing and interacting with coordinated presentations | |
US20210306595A1 (en) | System and Method for Interactive Video Conferencing | |
US10033967B2 (en) | System and method for interactive video conferencing | |
US10380800B2 (en) | System and method for linking and interacting between augmented reality and virtual reality environments | |
US9363448B2 (en) | System and method for providing and interacting with coordinated presentations | |
US11310463B2 (en) | System and method for providing and interacting with coordinated presentations | |
CN109407821B (en) | Collaborative interaction with virtual reality video | |
US11457176B2 (en) | System and method for providing and interacting with coordinated presentations | |
US20230120437A1 (en) | Systems for generating dynamic panoramic video content | |
WO2017035368A1 (en) | System and method for interactive video conferencing | |
US9666231B2 (en) | System and method for providing and interacting with coordinated presentations | |
WO2019056001A1 (en) | System and method for interactive video conferencing | |
JP2019512177A (en) | Device and related method | |
US11405587B1 (en) | System and method for interactive video conferencing | |
JP6224465B2 (en) | Video distribution system, video distribution method, and video distribution program | |
US10084849B1 (en) | System and method for providing and interacting with coordinated presentations | |
US12028643B2 (en) | Dynamic virtual background for video conference | |
CN116149469A (en) | AR (augmented reality) -based user behavior recording method and AR | |
WO2016019146A1 (en) | System and method for providing and interacting with coordinated presentations | |
US11659138B1 (en) | System and method for interactive video conferencing | |
WO2020033896A1 (en) | System and method for providing and interacting with coordinated presentations | |
EP3669539A1 (en) | System and method for providing and interacting with coordinated presentations | |
EP3014467A2 (en) | System and method for providing and interacting with coordinated presentations |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
17P | Request for examination filed |
Effective date: 20170223 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
A4 | Supplementary search report drawn up and despatched |
Effective date: 20180213 |
|
RIC1 | Information provided on ipc code assigned before grant |
Ipc: H04N 5/222 20060101ALI20180207BHEP Ipc: G06F 17/00 20060101AFI20180207BHEP Ipc: H04N 5/272 20060101ALI20180207BHEP Ipc: H04N 5/275 20060101ALI20180207BHEP |
|
17Q | First examination report despatched |
Effective date: 20200316 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: EXAMINATION IS IN PROGRESS |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20200929 |