
WO2017143289A1 - System and method for presenting and viewing a spherical video segment - Google Patents

System and method for presenting and viewing a spherical video segment

Info

Publication number
WO2017143289A1
Authority
WO
WIPO (PCT)
Prior art keywords
video segment
spherical video
display
view
interest
Prior art date
Application number
PCT/US2017/018508
Other languages
English (en)
Inventor
Joven Matias
David Newman
Ha Phan
Timothy Bucklin
Original Assignee
Gopro, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US15/046,344 external-priority patent/US9973746B2/en
Priority claimed from US15/050,275 external-priority patent/US9743060B1/en
Priority claimed from US15/050,297 external-priority patent/US9602795B1/en
Application filed by Gopro, Inc. filed Critical Gopro, Inc.
Priority to EP17753994.7A priority Critical patent/EP3417609A4/fr
Priority to CN201780012107.XA priority patent/CN108702451A/zh
Publication of WO2017143289A1 publication Critical patent/WO2017143289A1/fr


Classifications

    • G - PHYSICS
    • G11 - INFORMATION STORAGE
    • G11B - INFORMATION STORAGE BASED ON RELATIVE MOVEMENT BETWEEN RECORD CARRIER AND TRANSDUCER
    • G11B27/00 - Editing; Indexing; Addressing; Timing or synchronising; Monitoring; Measuring tape travel
    • G11B27/10 - Indexing; Addressing; Timing or synchronising; Measuring tape travel
    • G11B27/34 - Indicating arrangements
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/63 - Control of cameras or camera modules by using electronic viewfinders
    • H04N23/633 - Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H04N23/634 - Warning indications
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/76 - Television signal recording
    • H04N5/765 - Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77 - Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00 - Details of television systems
    • H04N5/76 - Television signal recording
    • H04N5/765 - Interface circuits between an apparatus for recording and another apparatus
    • H04N5/77 - Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera
    • H04N5/772 - Interface circuits between an apparatus for recording and another apparatus between a recording apparatus and a television camera, the recording apparatus and the television camera being placed in the same enclosure
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 - Control of cameras or camera modules
    • H04N23/698 - Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture

Definitions

  • the systems and methods described herein generally relate to presenting and viewing a spherical video segment.
  • Captured video segments may include content captured in multiple views with multiple perspectives. If users are able to choose their view point within the captured video segments during playback, they may miss content available in other views and/or areas of the captured video segments. Missing content may cause the user to have a less than satisfactory experience.
  • a spherical video segment may include digital content presented in a three dimensional view including a 360-degree horizontal field of view and/or a 180-degree vertical field of view.
  • the system and/or method described herein may be configured to obtain the spherical video segment from a repository of video segments.
  • the system and/or method may determine an orientation of a two dimensional display based upon output signals generated via a sensor. The output signals may convey information related to the orientation of the display.
  • a display field of view within the spherical video segment may be determined. The display field of view may be presented on the display based on the orientation of the display.
  • the display field of view may follow a portion of the 360-degree horizontal field of view (e.g., 90-degrees of the 360-degrees, such that the other 270-degrees of the horizontal field of view may not be within the display field of view) and a portion of the 180-degree vertical field of view (e.g., 45-degrees of the 180- degrees, such that the other 135-degrees of the vertical field of view may not be within the display field of view) based upon how the display is moved and/or oriented (e.g., if the display is angled up by a certain number of degrees, the display field of view may be determined to be angled up by the same number of degrees).
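  The orientation-to-field-of-view mapping described above can be sketched in Python. This is an illustrative sketch, not the patent's implementation; the function names, the 90-degree horizontal and 45-degree vertical defaults, and the yaw/pitch convention are all assumptions chosen to match the example angles in the text:

  ```python
  def wrap_degrees(angle):
      """Normalize an angle to the [0, 360) range of the horizontal span."""
      return angle % 360.0

  def display_window(yaw_deg, pitch_deg, h_fov=90.0, v_fov=45.0):
      """Return the (horizontal, vertical) angular extents of the display
      field of view, centered on the display's orientation.

      yaw_deg:   horizontal orientation within the 360-degree span
      pitch_deg: vertical orientation within the 180-degree span (-90..90)
      h_fov, v_fov: visible degrees (illustrative defaults from the example)
      """
      h_min = wrap_degrees(yaw_deg - h_fov / 2)
      h_max = wrap_degrees(yaw_deg + h_fov / 2)
      # Pitch is clamped rather than wrapped: the vertical span ends at the poles.
      v_min = max(-90.0, pitch_deg - v_fov / 2)
      v_max = min(90.0, pitch_deg + v_fov / 2)
      return (h_min, h_max), (v_min, v_max)
  ```

  With the defaults above, a display facing yaw 0 and pitch 0 sees 90 of the 360 horizontal degrees and 45 of the 180 vertical degrees; the remaining 270 and 135 degrees fall outside the window, as in the text's example.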
  • the display field of view of the spherical video segment may be presented on the display for a user to view different angles and/or perspectives at any given point in time of the spherical video segment by moving the display in different directions and/or angles.
  • the three dimensional view may include depth information associated with the spherical video segment.
  • Zoom controls may be provided which may allow for the increase and/or decrease of visible degrees during playback within the display field of view of the spherical video segment.
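  The zoom behavior described in the bullet above (increasing or decreasing the visible degrees during playback) might look like the following sketch. The function name, clamping bounds, and zoom-factor convention are assumptions for illustration, not details from the patent:

  ```python
  def apply_zoom(h_fov, v_fov, zoom_factor, min_fov=10.0, max_h=360.0, max_v=180.0):
      """Zooming in (factor > 1) shrinks the visible degrees; zooming out
      (factor < 1) enlarges them, clamped to the spherical segment's full
      360-degree horizontal and 180-degree vertical extents."""
      h = min(max_h, max(min_fov, h_fov / zoom_factor))
      v = min(max_v, max(min_fov, v_fov / zoom_factor))
      return h, v
  ```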
  • the display field of view may be captured and/or recorded as a two dimensional video segment.
  • User controls may be provided via the display and/or an application associated with presentation of the spherical video segment on the display that may allow the user viewing and/or consuming the spherical video segment to record the presented display field of view.
  • the user may choose to record the content of the display field of view shown on the two dimensional display as the user moves the display in different directions and/or orients the display at different angles. This may be referred to as a virtual camera, as the user may make as many recordings of the two dimensional display as the user wishes.
  • the user controls may allow the user to start, pause, stop, and/or allow for other controls to control capture of the display field of view as the two dimensional video segment (e.g., record in slow motion, etc.). In this manner, the user may capture highlights and/or events of interest within the spherical video segment in a single two dimensional video segment.
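  The start/pause/stop capture controls above can be sketched as a small recorder that accumulates display-field-of-view frames only while recording. This is a hedged sketch of the "virtual camera" idea; the class and method names are illustrative, not the patent's:

  ```python
  class VirtualCameraRecorder:
      """Records the display field of view as a two dimensional video
      segment under user control (illustrative sketch)."""

      def __init__(self):
          self.frames = []       # captured 2D frames (the display field of view)
          self.recording = False

      def start(self):
          self.recording = True

      def pause(self):
          self.recording = False

      def stop(self):
          """Finish the recording and return the captured 2D segment."""
          self.recording = False
          segment = list(self.frames)
          self.frames = []
          return segment

      def on_frame(self, frame):
          """Called once per rendered frame of the display field of view;
          frames are kept only while the user is recording."""
          if self.recording:
              self.frames.append(frame)
  ```

  Pausing and resuming yields a single two dimensional segment that stitches together only the intervals the user chose to record, which is how the highlights described above could end up in one segment.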
  • the two dimensional video segment may be stored as a video segment separate from the spherical video segment.
  • the two dimensional video segment may be shared with other users and/or consumers to view, such that the user and/or other consumers of the spherical video segment and/or the two dimensional video segment may not have to search for the highlights and/or events of interest on their own within the spherical video segment.
  • one or more events of interest may be tagged and/or presented within the spherical video segment.
  • Events of interest may include one or more of a highlight or a climax of the spherical video segment, something occurring within the spherical video segment apart from a focal (e.g., action) point of the spherical video segment, and/or other event of interest within the spherical video segment.
  • the system and/or method may obtain tag information associated with the event of interest within the spherical video segment. The information may identify a point in time in the spherical video segment and a viewing angle within the spherical video segment at which the event of interest is viewable in the spherical video segment.
  • the system and/or method may determine, proximate to the point in time, whether the viewing angle of the event of interest is located within the display field of view. If the event of interest is outside the display field of view (e.g., with reference to the example above, if the event of interest is located within the 270-degrees of the horizontal field of view not within the display field of view and/or the 135- degrees of the vertical field of view not within the display field of view), alert information may be generated indicating that the event of interest for the spherical video segment may be located outside the field of view.
  • a notification may be generated and presented within the display field of view to alert the user which direction to move and/or orient the display at which time during the spherical video segment in order to view the event of interest.
  • the notification may be graphical (e.g., arrows, etc.), audible (e.g., spoken directions), and/or in any other form.
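  The directional notification above (e.g., an arrow telling the user which way to turn) implies computing the shortest rotation from the display's current heading to the event's viewing angle. A minimal sketch, with illustrative names and a degrees-based convention not taken from the patent:

  ```python
  def turn_direction(display_yaw, event_yaw):
      """Return which way the user should turn the display ('left' or
      'right') and by how many degrees to bring the event of interest
      into view. Angles are in degrees within the 360-degree span."""
      # Signed shortest angular difference in (-180, 180].
      delta = (event_yaw - display_yaw + 180.0) % 360.0 - 180.0
      return ("right" if delta >= 0 else "left"), abs(delta)
  ```

  The returned direction could drive a graphical arrow, and the magnitude could scale its size or an audible cue, matching the notification forms listed above.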
  • Users may tag and/or mark the spherical video segment with their own events of interest to be stored and shared for future reference of the spherical video segment, either by the users themselves or for other consumers to view.
  • a system configured to present and view a spherical video segment may include one or more client computing platform(s).
  • the server(s) and the client computing platform(s) may communicate in a client/server configuration, and/or via another configuration.
  • the client computing platform(s) may include one or more processors configured by machine-readable instructions to execute computer program components.
  • the computer program components may include a spherical segment component, an authentication component, an orientation component, a field of view component, a presentation component, a capture component, a notification component, and/or other components.
  • the spherical segment component may be configured to obtain the spherical video segment.
  • the spherical video segment may be included within a repository of video segments.
  • a repository of images and/or video segments may be available via the system.
  • the repository of images and/or video segments may be stored within an electronic storage, one or more server(s), external resources, a cloud, and/or any other storage location. Individual images and/or video segments of the repository of images and/or video segments may be stored in different locations.
  • the repository of images and/or video segments may be associated with different users.
  • the video segments may include a compilation of videos, video segments, video clips, and/or still images.
  • the spherical video segment may include tag information associated with an event of interest within the spherical video segment.
  • the tag information may identify a point in time within the spherical video segment and a viewing angle within the spherical video segment at which the event of interest is viewable in the spherical video segment.
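  The tag information described above, a point in time plus a viewing angle at which the event of interest is viewable, might be represented by a small record type. Field names and types here are assumptions for illustration:

  ```python
  from dataclasses import dataclass

  @dataclass
  class TagInfo:
      """Illustrative container for one event-of-interest tag."""
      time_s: float        # point in time within the spherical video segment
      yaw_deg: float       # horizontal viewing angle of the event of interest
      pitch_deg: float     # vertical viewing angle of the event of interest
      label: str = ""      # optional description of the event of interest
  ```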
  • the event of interest may include any occurrence within the spherical video segment such as a notable moment and/or interesting point in time within the spherical video segment duration.
  • the event of interest may be a moment that a user wishes to share with other viewers and/or consumers of the spherical video segment (e.g., the user and/or other users may be interested in the moment and/or event of interest).
  • the event of interest may include one or more of a highlight or a climax of the spherical video segment, something occurring within the spherical video segment apart from a focal (e.g., action) point of the spherical video segment, and/or other event of interest within the spherical video segment.
  • the authentication component may be configured to authenticate a user associated with client computing platform(s) accessing the repository of images and/or video segments via the system.
  • the authentication component may manage accounts associated with users and/or consumers of the system.
  • the user accounts may include user information associated with users and/or consumers of the user accounts.
  • User information may include information stored by client computing platform(s), server(s), and/or other storage locations.
  • the orientation component may be configured to determine the orientation of a display based on output signals of a sensor of client computing platform(s). For example, if one of client computing platform(s) is a mobile device (e.g., a smartphone), the orientation component may be configured to determine the orientation of the display based on the output signals of the client computing device when the user moves the smartphone in one or more directions (e.g., up, down, left, right, etc.).
  • the orientation of the display may be indicated by a number of degrees in one or more directions.
  • the orientation may be defined in one or more of a Cartesian coordinate system, a cylindrical and/or polar coordinate system, a spherical and/or polar coordinate system.
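  Converting between the coordinate systems mentioned above is routine; for example, a direction vector in a Cartesian sensor frame maps to spherical yaw/pitch angles. This sketch assumes a particular axis convention (x forward, y left, z up) that the patent does not specify:

  ```python
  import math

  def cartesian_to_spherical(x, y, z):
      """Convert a direction vector in a Cartesian frame to spherical
      coordinates: yaw in [0, 360) and pitch in [-90, 90], in degrees."""
      yaw = math.degrees(math.atan2(y, x)) % 360.0
      pitch = math.degrees(math.atan2(z, math.hypot(x, y)))
      return yaw, pitch
  ```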
  • the field of view component may be configured to determine a display field of view within the spherical video segment to be presented on the display based on the orientation of the display.
  • the field of view component may be configured to determine the field of view within the spherical video segment to be presented on the display based on the orientation of the display including determining a viewing angle in the spherical video segment that corresponds to the orientation of the display.
  • the field of view component may be configured to determine the field of view within the spherical video segment to be presented on the display based on the orientation of the display including identifying the display field of view within the spherical video segment that is present at the viewing angle.
  • the display field of view may include a horizontal field of view (e.g., left and/or right), a vertical field of view (e.g., up and/or down), and/or other field of views.
  • the field of view component may be configured to, proximate to the point in time, determine whether the viewing angle of the event of interest is located within the display field of view. Proximate to the point in time at which the event of interest is viewable may include one or more of a predefined time period prior to the point in time at which the event of interest is viewable, a predefined time period after the point in time at which the event of interest is viewable, and/or the point in time at which the event of interest is viewable. As discussed above, the display field of view may be determined on a recurring or ongoing basis. The display field of view may indicate a current field of view determined based upon the orientation of the display associated with client computing platform(s) during presentation of the spherical video segment.
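  Combining the "proximate to the point in time" window with the angular containment check gives the alert condition described here. A hedged sketch; the lead/lag window lengths, the horizontal-only check, and the function name are illustrative assumptions (a vertical check would be analogous):

  ```python
  def should_alert(now_s, tag_time_s, display_yaw, event_yaw,
                   h_fov=90.0, lead_s=3.0, lag_s=1.0):
      """True when alert information should be generated: the current
      playback time is within a predefined window around the tagged point
      in time, and the event's viewing angle falls outside the horizontal
      display field of view."""
      proximate = tag_time_s - lead_s <= now_s <= tag_time_s + lag_s
      # Shortest angular offset between display heading and event angle.
      offset = abs((event_yaw - display_yaw + 180.0) % 360.0 - 180.0)
      outside = offset > h_fov / 2
      return proximate and outside
  ```

  Checking a window *before* the tagged time is what lets the notification warn the user that they are about to miss the event, rather than only that they have missed it.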
  • the display field of view may change one or more times over the course of presentation of the spherical video segment via the display.
  • the display field of view may include the viewing angle visible to the user via the display at a current point in time, as numerous fields of view that are not visible to the user may be available at the current point in time (e.g., 180-degrees to the left or right of the display field of view may not be visible to the user via the display field of view unless the display is moved to the left or right).
  • the field of view component may automatically alter the display field of view to display the event of interest, and/or the user may alter the display field of view based upon the movements of the client computing platform(s).
  • the capture component may be configured to capture the display field of view as a two dimensional video segment. Capturing the display field of view as the two dimensional video segment may include recording the display field of view as the two dimensional video segment such that as the user is using the display associated with client computing platform(s) as a viewfinder for visual content included within the spherical video segment, the capture component may record and/or capture the user's actions and/or movements (e.g., the display field of view) throughout the spherical video segment via client computing platform(s).
  • the capture component may record the display field of view (e.g., the visual content being displayed on the display associated with client computing platform(s)), including any movements of the display field of view including when the user may move the display associated with client computing platform(s) to the left, right, up, and/or down, resulting in movement of the display field of view within the spherical video segment.
  • the notification component may be configured to, responsive to a determination proximate to the point in time that the viewing angle is outside the display field of view, generate alert information indicating the event of interest for the spherical video segment is located outside the display field of view.
  • the alert information may indicate the location of the event of interest within the spherical video segment indicated by the tag information.
  • the alert information may include the viewing angle associated with the tag information, the point in time associated with the tag information, and/or other information related to the event of interest indicated by and/or associated with the tag information.
  • the notification component may be configured to effectuate presentation of a notification based upon the alert information.
  • the notification may include one or more of a graphical notification, an audible notification, a sensory notification, and/or other types of notifications.
  • the notification may include an alert message presented within the display field of view of the spherical video segment.
  • the notification, for example, may include an alert sound audible to the user.
  • An example sensory notification may include a vibration and/or light notification.
  • the notification may indicate to the user that the user may be missing, has missed, and/or may be about to miss the event of interest of the spherical video segment that may be occurring outside the display field of view.
  • FIG. 1 illustrates a system configured for presenting and viewing a spherical video segment, in accordance with one or more implementations.
  • FIG. 2 illustrates an example spherical video segment being presented via a display, in accordance with one or more implementations.
  • FIG. 3 illustrates an example spherical video segment being presented via a display with user controls, in accordance with one or more implementations.
  • FIG. 4 illustrates an example display field of view of a spherical video segment including a notification, in accordance with one or more implementations.
  • FIG. 5 illustrates a method configured for viewing a spherical video segment, in accordance with one or more implementations.
  • FIG. 6 illustrates a method configured for presenting an event of interest within a spherical video segment, in accordance with one or more implementations.
  • FIG. 1 illustrates an example system 100 that is configured for presenting and viewing a spherical video segment.
  • system 100 may include one or more client computing platform(s) 102, one or more server(s) 104, electronic storage 122, one or more physical processor(s) 124 configured to execute machine-readable instructions 105, one or more computer program components, and/or other components.
  • One or more physical processor(s) 124 may be configured to execute machine-readable instructions. Executing machine-readable instructions 105 may cause the one or more physical processor(s) 124 to effectuate presentation of the spherical video segment.
  • Machine-readable instructions 105 may include one or more computer program components such as spherical segment component 106, authentication component 108, orientation component 110, field of view component 112, presentation component 114, capture component 116, notification component 118, and/or other components.
  • client computing platform(s) 102 may be configured to provide remote hosting of the features and/or functions of machine-readable instructions 105 to one or more server(s) 104 that may be remotely located from client computing platform(s) 102.
  • one or more features and/or functions of client computing platform(s) 102 may be attributed as local features and/or functions of one or more server(s) 104.
  • individual ones of server(s) 104 may include machine-readable instructions (not shown in FIG. 1) comprising the same or similar components as machine-readable instructions 105 of client computing platform(s) 102.
  • Server(s) 104 may be configured to locally execute the one or more components that may be the same or similar to the machine-readable instructions 105.
  • One or more features and/or functions of machine-readable instructions 105 of client computing platform(s) 102 may be provided, at least in part, as an application program that may be executed at a given server 104.
  • Client computing platform(s) 102 may include one or more of a cellular telephone, a smartphone, a digital camera, a laptop, a tablet computer, a desktop computer, a television set-top box, smart TV, a gaming console, and/or other computing platforms.
  • Client computing platform(s) 102, server(s) 104, and/or external resources 120 may be operatively linked via one or more electronic communication links.
  • electronic communication links may be established, at least in part, via a network such as the Internet and/or other networks. It will be appreciated that this is not intended to be limiting and that the scope of this disclosure includes implementations in which client computing platform(s) 102, server(s) 104, and/or external resources 120 may be operatively linked via some other communication media.
  • Spherical segment component 106 may be configured to obtain the spherical video segment.
  • the spherical video segment may be included within a repository of video segments.
  • a repository of images and/or video segments may be available via system 100.
  • the repository of images and/or video segments may be stored within electronic storage 122, one or more server(s) 104, external resources 120, a cloud, and/or any other storage location. Individual images and/or video segments of the repository of images and/or video segments may be stored in different locations.
  • the repository of images and/or video segments may be associated with different users.
  • the video segments may include a compilation of videos, video segments, video clips, and/or still images.
  • While the present disclosure may be directed to previously captured spherical video and/or spherical video segments captured by one or more image capturing devices, one or more other implementations of system 100, client computing platform(s) 102, and/or server(s) 104 may be configured for other types of media items.
  • Other types of media items may include one or more of audio files (e.g., music, podcasts, audio books, and/or other audio files), multimedia presentations, photos, slideshows, and/or other media files.
  • the spherical video segment may be received from one or more storage locations associated with client computing platform(s) 102, server(s) 104, and/or other storage locations where spherical video segments may be stored.
  • Authentication component 108 may be configured to authenticate a user associated with client computing platform(s) 102 accessing the repository of images and/or video segments via system 100. Authentication component 108 may manage accounts associated with users and/or consumers of system 100. The user accounts may include user information associated with users and/or consumers of the user accounts. User information may include information stored by client computing platform(s) 102, server(s) 104, and/or other storage locations.
  • User information may include one or more of information identifying users and/or consumers (e.g., a username or handle, a number, an identifier, and/or other identifying information), security login information (e.g., a login code or password, a user ID, and/or other information necessary for the user to access server(s) 104), system usage information, external usage information (e.g., usage of one or more applications external to system 100 including one or more of online activities such as in social networks and/or other external applications), subscription information, a computing platform identification associated with the user and/or consumer, a phone number associated with the user and/or consumer, privacy settings information, and/or other information related to users and/or consumers.
  • Authentication component 108 may be configured to obtain user information via one or more client computing platform(s) 102 (e.g., user input via a user interface, etc.). If a user and/or consumer does not have a preexisting user account associated with system 100, a user and/or consumer may register to receive services provided by system 100 via a website, web-based application, mobile application, and/or user application. Authentication component 108 may be configured to create a user ID and/or other identifying information for a user and/or consumer when the user and/or consumer registers. The user ID and/or other identifying information may be associated with one or more client computing platform(s) 102 used by the user and/or consumer. Authentication component 108 may be configured to store such association with the user account of the user and/or consumer. A user and/or consumer may associate one or more accounts associated with social network services, messaging services, and the like with an account provided by system 100.
  • spherical segment component 106 may effectuate transmission of the spherical video segment to client computing platform(s) 102 associated with the user and/or consumer.
  • the spherical video segment may be transmitted to client computing platform(s) 102 on which users may consume the spherical video segment.
  • Spherical segment component 106 may be configured to host the spherical video segment over a network. For example, the spherical video segment may be hosted over the internet such that users may access the spherical video segment via the internet.
  • Hosting the spherical video segment over the internet may include uploading and/or storing the spherical video segment on one or more server(s) 104 wherein the servers process requests and/or deliver the spherical video segment to client computing platform(s) 102. This may include serving separate digital files for the spherical video segment, streaming the spherical video segment, and/or other delivery mechanisms.
  • Spherical segment component 106 may be configured to host the spherical video segment over a network by communicating information (e.g., via streaming digital content data, and/or other visual information) from server(s) 104 to client computing platform(s) 102 for presentation on one or more displays associated with client computing platform(s) 102.
  • the digital content transmitted to a given client computing platform 102 may correspond to the spherical video segment being presented for consumption by user at the given client computing platform 102.
  • Client computing platform(s) 102 may include one or more display devices configured to display the spherical video segment.
  • Client computing platform(s) 102 may include a two dimensional display configured to present two dimensional images. For example, the spherical video segment may be presented via the two dimensional display in a two dimensional format, rather than a three dimensional format such that the user associated with client computing platform(s) 102 may view the spherical video segment in the two dimensional format.
  • Client computing platform(s) 102 may include a sensor configured to generate output signals conveying information related to an orientation of the display.
  • the orientation of the display may refer to a relative position of the display.
  • the orientation of the display may be indicated by a number of degrees in one or more directions.
  • the orientation may be defined in one or more of a Cartesian coordinate system, a cylindrical and/or polar coordinate system, a spherical and/or polar coordinate system.
  • a spherical video segment may include a captured three dimensional video segment that may be viewed, tagged, edited, and/or distributed.
  • the spherical video segment may include a playback of a live captured video (e.g., captured via one or more cameras).
  • the spherical video segment may include multiple views such that at least a portion of the spherical video segment is outside a display field of view of a user viewing the spherical video segment at a given point in time via the display associated with client computing platform(s) 102.
  • the spherical video segment may include one or more of a 360-degree horizontal field of view, a 180-degree vertical field of view, and/or other views.
  • the display associated with client computing platform(s) 102 may display only a portion of the spherical video segment (e.g., the display may only display a portion of the 360-degree horizontal field of view and/or a portion of the 180-degree vertical field of view). As such, other portions of the spherical video segment that are not displayed via the display associated with client computing platform(s) 102 at a given point in time may still be available to view.
  • Orientation component 108 may be configured to determine the orientation of the display based on the output signals of the sensor of client computing platform(s) 102. For example, if one of client computing platform(s) 102 is a mobile device (e.g., a smartphone), orientation component 108 may be configured to determine the orientation of the display based on the output signals of the sensor when the user moves the smartphone in one or more directions (e.g., up, down, left, right, etc.). As discussed above, the orientation of the display may be indicated by a number of degrees in one or more directions. For example, the orientation may be defined in one or more of a Cartesian coordinate system, a cylindrical and/or polar coordinate system, and/or a spherical and/or polar coordinate system.
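The orientation determination described above can be sketched as follows. This is an illustrative, hypothetical helper (not from the disclosure), and it assumes the sensor reports yaw and pitch in degrees:

```python
def orientation_from_sensor(yaw_deg: float, pitch_deg: float) -> dict:
    """Normalize raw sensor angles to a spherical viewing orientation.

    Yaw wraps to [0, 360) degrees; pitch is clamped to [-90, 90] degrees
    so the vertical angle stays within a 180-degree vertical field.
    Function and field names are assumptions for this sketch.
    """
    yaw = yaw_deg % 360.0
    pitch = max(-90.0, min(90.0, pitch_deg))
    return {"yaw": yaw, "pitch": pitch}
```

In practice the raw angles would come from the device's motion sensors; the normalization shown here simply keeps the orientation within the angular extents of a spherical video segment.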
  • Field of view component 112 may be configured to determine a display field of view within the spherical video segment to be presented on the display based on the orientation of the display.
  • Field of view component 112 may be configured to determine the field of view within the spherical video segment to be presented on the display based on the orientation of the display including determining a viewing angle in the spherical video segment that corresponds to the orientation of the display.
  • Field of view component 112 may be configured to determine the field of view within the spherical video segment to be presented on the display based on the orientation of the display including identifying the display field of view within the spherical video segment that is present at the viewing angle.
  • the display field of view may include a horizontal field of view (e.g., left and/or right), a vertical field of view (e.g., up and/or down), and/or other field of views.
  • Field of view component 112 may be configured to determine the display field of view within the spherical video segment based on how the user is holding and/or moving the display associated with client computing platform(s) 102. For example, the user may move, turn, and/or orient the display of client computing platform(s) 102 to change his or her field of view based on an angle and/or direction of the orientation of the display associated with client computing platform(s) 102. The orientation of the display may reflect the viewing angle in the spherical video segment.
  • the display field of view within the spherical video segment to be presented on the display may be a 90-degree vertical display field of view.
  • the viewing angle in the spherical video segment that corresponds to the orientation of the display may then be the 90-degree vertical viewing angle.
  • the display field of view within the spherical video segment to be presented on the display may be a 75-degree vertical angle display field of view (e.g., the display field of view may be tilted down towards the surface level) and 100-degrees to the right from the initial point.
  • the viewing angle in the spherical video segment that corresponds to the orientation of the display may then be the 75-degree vertical viewing angle, turned 100-degrees to the right.
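As a minimal sketch of the mapping from a viewing angle to a display field of view, assuming illustrative field-of-view extents (the function name and default values are assumptions, not part of the disclosure):

```python
def display_field_of_view(center_yaw: float, center_pitch: float,
                          h_fov: float = 90.0, v_fov: float = 60.0) -> dict:
    """Return the angular bounds visible on the display for a viewing angle.

    Horizontal bounds wrap modulo 360; vertical bounds are clamped to the
    180-degree vertical extent of the spherical video segment.
    """
    half_h, half_v = h_fov / 2.0, v_fov / 2.0
    return {
        "yaw_min": (center_yaw - half_h) % 360.0,
        "yaw_max": (center_yaw + half_h) % 360.0,
        "pitch_min": max(-90.0, center_pitch - half_v),
        "pitch_max": min(90.0, center_pitch + half_v),
    }
```

For example, a display turned 100 degrees to the right and tilted down toward the surface would yield bounds centered on that viewing angle, with only that window of the spherical video segment presented.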
  • Presentation component 114 may be configured to effectuate presentation of the display field of view of the spherical video segment on the display.
  • the user associated with client computing platform(s) 102 may view the spherical video segment on the display of client computing platform(s) 102.
  • the determination of the orientation of the display and the determination of the display field of view within the spherical video segment for presentation on the display may occur recursively during playback of the spherical video segment. In this manner, whichever direction client computing platform(s) 102 moves and/or is oriented, presentation component 114 may be configured to effectuate presentation of the display field of view of the spherical video segment on the display in real-time.
  • the user associated with client computing platform(s) 102 moves the display associated with client computing platform(s) 102 to the right by 30-degrees (e.g., 30-degrees horizontally) relative to a starting position and/or starting orientation of the display associated with client computing platform(s) 102
  • the spherical video segment may be displayed on the display associated with client computing platform(s) 102 to present the spherical video segment at 30-degrees to the right (e.g., horizontally) relative to an initial viewing angle in the spherical video segment. This may occur recursively such that the user may move the display associated with client computing platform(s) 102 in any direction to view different angles and/or perspectives of the spherical video segment at any given point in time.
  • Recursive determinations of the orientation of the display and the display field of view within the spherical video segment for presentation on the display may facilitate use of the display by the user as a viewfinder for visual content included within the spherical video segment.
  • the viewfinder for the spherical video segment may represent the ability for the user to move the display associated with client computing platform(s) 102 on which the spherical video segment is displayed in order to view different angles and/or perspectives at any given point in time during the spherical video segment.
  • the 360-degree horizontal field of view and the 180-degree vertical field of view may be available for the user to view via the display associated with client computing platform(s) 102 by moving, rotating, and/or otherwise orienting the display associated with client computing platform(s) 102 in various directions and/or angles to view different moments and/or events within the spherical video segment.
  • an event may be taking place 180-degrees to the right or left (e.g., horizontally) of the current position at that moment in time (e.g., 25 seconds into the spherical video segment).
  • the user may move the display associated with client computing platform(s) 102 to explore other areas of the spherical video segment at the 25-second mark of the spherical video segment.
  • the user may pause the spherical video segment at a particular point in time (e.g., 2 seconds into the spherical video segment) via one or more user controls, as will be discussed in further detail below, in order to move the display associated with client computing platform(s) 102 in various directions to explore the visual content captured within the spherical video segment in part and/or its entirety at the particular point in time (e.g., 2 seconds into the spherical video segment).
  • Visual content and/or events included within the spherical video segment may include capture of particular objects (e.g., people, animals, landmarks, etc.), actions (e.g., a sporting event, riding a bicycle, etc.), landscapes (e.g., scenery, etc.), and/or other visual content and/or events that may be captured.
  • the user may facilitate use of the display as the viewfinder for visual content and/or events included within the spherical video segment.
  • FIG. 2 depicts an example spherical video segment 200 being presented on display 202 of client computing platform 102, in accordance with one or more implementations. While client computing platform 102 is shown as a smartphone in FIG. 2, this is not meant to be a limitation of this disclosure, as client computing platform 102 may include any client computing platform 102 discussed above.
  • FIG. 2 may represent an illustration of spherical video segment 200 being presented via display 202 associated with client computing platform 102 at an individual point in time 204 within spherical video segment 200 (e.g., 30 seconds into spherical video segment 200).
  • Spherical video segment 200 may include a three dimensional spherical video segment presented via two dimensional display 202 (e.g., the user may be able to view and/or consume a portion of the three dimensional spherical video segment). While a portion of spherical video segment 200 is displayed via display 202 as the display field of view, other portions of the 360-degree horizontal field of view and/or 180-degree vertical field of view of spherical video segment 200 are not displayed via display 202. In order to view the other portions of spherical video segment 200, display 202 may be oriented in various angles and/or moved in various directions, as depicted by arrows 206, in order to change the display field of view presented via display 202.
  • a skier may be depicted within spherical video segment 200 at the individual point in time 204; if display 202 is moved to the right or left by 180-degrees, the user may be able to view what is occurring 180-degrees from the skier within spherical video segment 200. While arrows 206 are depicted in four directions, this is for exemplary purposes only and is not meant to be a limitation of this disclosure, as display 202 may be oriented in any angle and/or moved in any direction.
  • capture component 116 may be configured to capture the display field of view as a two dimensional video segment. Capturing the display field of view as the two dimensional video segment may include recording the display field of view as the two dimensional video segment such that as the user is using the display associated with client computing platform(s) 102 as the viewfinder for visual content included within the spherical video segment, capture component 116 may record and/or capture the user's actions and/or movements (e.g., the display field of view, as discussed above) throughout the spherical video segment via client computing platform(s) 102.
  • capture component 116 may record the display field of view (e.g., the visual content being displayed on the display associated with client computing platform(s) 102), including any movements of the display field of view including when the user may move the display associated with client computing platform(s) 102 to the left, right, up, and/or down, resulting in movement of the display field of view within the spherical video segment.
  • Capture component 116 may be configured to capture a single field of view over a span of time. For example, even if the display associated with client computing platform(s) 102 is moved and/or oriented in different directions, capture component 116 may record and/or capture only a single field of view. If the initial vertical field of view is a 90-degree vertical field of view and the initial horizontal field of view is 0-degrees, then capture component 116 may be configured to capture that field of view for the entire length of the spherical video segment and/or a particular length of time. User controls, as will be discussed in further detail below, may be used to configure recording and/or editing of the two dimensional video segment and/or post-capture of the two dimensional video segment.
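A sketch of the single-field capture described above, assuming spherical frames arrive as (timestamp, current_view) pairs; the function name and frame representation are illustrative, not from the disclosure:

```python
def capture_two_dimensional_segment(frames, fixed_view=None):
    """Record one display field of view per spherical frame.

    `frames` is a sequence of (timestamp, current_view) pairs. When
    `fixed_view` is given, the same field of view is captured for the
    whole span of time regardless of how the display is oriented;
    otherwise the recording follows the display field of view.
    """
    segment = []
    for timestamp, current_view in frames:
        view = fixed_view if fixed_view is not None else current_view
        segment.append((timestamp, view))
    return segment
```

Passing `fixed_view` models the single-field capture, while omitting it models the viewfinder-following capture in which the recording tracks the user's movements of the display.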
  • the spherical video segment may be a three dimensional video segment, while the display associated with client computing platform(s) 102 may be a two dimensional display.
  • capture component 116 may be configured to capture and/or record the display field of view and create a separate video segment as the two dimensional video segment other than the spherical video segment.
  • Capture component 116 may be configured to store the two dimensional video segment.
  • the two dimensional video segment may be stored via electronic storage 122, a cloud, and/or other storage device associated with system 100 and/or server(s) 104.
  • the two dimensional video segment may be included within the repository of video segments discussed above. In this manner, the two dimensional video segment may be available for playback to view by other users and/or consumers.
  • System 100 may effectuate presentation of the two dimensional video segment on the display associated with client computing platform(s) 102 for playback.
  • System 100 may effectuate transmission of the two dimensional video segment to one or more client computing platform(s) 102 associated with the user and/or other viewers and/or consumers.
  • a user may capture moments and/or events of interest within the spherical video segment as the two dimensional video segment and share the two dimensional video segment with other viewers and/or consumers.
  • the two dimensional video segment may include the moments and/or events of interest that the user preferred, making the spherical video segment easier and more efficient to revisit, such that the user and/or other viewers may not be required to search through the entirety of the spherical video segment (e.g., the three dimensional video segment) to view such moments and/or events of interest at a particular point in time.
  • the display may include use by the user as the viewfinder for visual content included within the spherical video segment.
  • the display and/or the viewfinder may include one or more user controls configured to be actuated by the user to control capture of the display field of view as the two dimensional video segment.
  • the one or more user controls may include one or more of a record control, a stop control, a play control, a pause control, controls to fast forward and/or rewind through the spherical video segment and/or the two dimensional video segment, a save control, and/or other user controls that may be used to control capture of the two dimensional video segment (e.g., a control to record the two dimensional video segment in slow motion, etc.).
  • the one or more user controls may be included within the display, within the viewfinder (e.g., associated with display of the spherical video segment), on client computing platform(s) 102, and/or may be included within and/or on other devices and/or applications capable of displaying the spherical video segment and/or recording the two dimensional video segment.
  • the one or more user controls may include one or more buttons associated with client computing platform(s) 102 and/or an application capable of displaying the spherical video segment, levers, switches, and/or any other actuator that may be actuated by the user to control capture of the two dimensional video segment.
  • FIG. 3 depicts an example spherical video segment 200 being presented on display 202 of client computing platform 102 with user controls 302, 304, 306, in accordance with one or more implementations. While user control 302 depicts a record button, user control 304 depicts a stop button, and user control 306 depicts a pause button, these are for exemplary purposes only and are not meant to be a limitation of this disclosure, as other user controls may be provided.
  • FIG. 3 illustrates an exemplary interface that may allow a user to capture and/or record the display field of view presented via display 202 as a two dimensional video segment. As shown in FIG. 3, arrows 206 may depict directions and/or angles in which display 202 may be oriented such that the user may view other portions of spherical video segment 200 via display 202.
  • the user may actuate button 302 to begin capturing and/or recording the display field of view presented via display 202 as the two dimensional video segment while the user moves and/or orients client computing platform 102 in various directions and/or angles (e.g., causing the display field of view to follow similar directions as the orientation of display 202).
  • the user may follow the skier depicted in spherical video segment 200 as the skier continues skiing down a slope.
  • the user may move display 202 in the direction the skier is skiing in order to keep the skier in the middle of the display field of view presented via display 202.
  • the user may actuate button 306 to pause the recording and/or actuate button 304 to stop the recording.
  • the spherical video segment may include tag information associated with an event of interest within the spherical video segment.
  • the tag information may identify a point in time within the spherical video segment and a viewing angle within the spherical video segment at which the event of interest is viewable in the spherical video segment.
  • the event of interest may include any occurrence within the spherical video segment such as a notable moment and/or interesting point in time within the spherical video segment duration.
  • the event of interest may be a moment that a user wishes to share with other viewers and/or consumers of the spherical video segment (e.g., the user and/or other users may be interested in the moment and/or event of interest).
  • the event of interest may include one or more of a highlight or a climax of the spherical video segment, something occurring within the spherical video segment apart from a focal (e.g., action) point of the spherical video segment, and/or other event of interest within the spherical video segment.
  • an event of interest may include presentation of: a snowboarder landing an impressive jump, a noteworthy rally in a beach volleyball game, a memorable moment within a wildlife video capture, a fan's sign at a sporting event, a crowd reaction within a surfing video, a beautiful nature scene away from the subject wildlife, and/or any other events of interest captured within the spherical video segment.
  • the tag information may include one or more of a tag, a comment including text, an emoji, and/or an image, and/or other information associated with the tag information and/or the event of interest.
  • users may submit tag information associated with the event of interest via a force touch and/or other inputs.
  • the tag information may be inputted on and/or via one or more user interfaces via the display associated with client computing platform(s) 102. For example, a user may tap and/or touch the display associated with client computing platform(s) 102 while viewing the spherical video segment to mark and/or tag the point in time and the viewing angle (e.g., location) of the event of interest within the spherical video segment.
  • System 100 may receive the tag information from client computing platform(s) 102.
  • the tag information may identify the point in time in the spherical video segment at which the event of interest is viewable in the spherical video segment.
  • System 100 may receive the point in time from the user such that the user may select and/or choose where in the spherical video segment the event of interest occurs, begins, and/or is about to begin. For example, the user may select the point in time within the spherical video segment such that a timestamp may be associated with the event of interest.
  • the point in time may include a single point in time, a range of time, multiple points in time, and/or other point in time indications.
  • the tag information may identify a viewing angle within the spherical video segment at which the event of interest is viewable in the spherical video segment.
  • the viewing angle may reference a location within the spherical video segment in which the event of interest is viewable.
  • System 100 may receive the viewing angle from the user such that the user may select and/or choose where in the spherical video segment the event of interest occurs, begins, and/or is about to begin.
  • the user may select the viewing angle within the spherical video segment such that a location within the spherical video segment may be associated with the event of interest.
  • the viewing angle may include a number of degrees in a horizontal field of view and/or a number of degrees in a vertical field of view.
  • the viewing angle may include one or more of a range of viewing angles, an individual view point based upon the viewing angle, a view point and its vicinity, and/or other viewing angles associated with the event of interest.
  • the viewing angle may be relative to the display field of view
  • the viewing angle may include one or more dimensions identifying where within the spherical video segment the event of interest is located, and/or what field(s) of view will be able to observe the portion of the spherical video segment at which the event of interest is viewable.
  • the viewing angle may include one or more of a horizontal field of view (e.g., left to right), a vertical field of view (e.g., height, up, and/or down), and/or other location information.
  • the viewing angle and/or information associated with the viewing angle may be defined by one or more of a Cartesian coordinate system, a cylindrical and/or polar coordinate system, a spherical and/or polar coordinate system, other coordinate system, and/or other data.
  • the tag information including the point in time and the viewing angle at which the event of interest is viewable may be stored within electronic storage 122 and/or may be stored within an electronic storage associated with server(s) 104.
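The tag information described above might be represented as a simple record; the class and field names here are assumptions for illustration only, not from the disclosure:

```python
from dataclasses import dataclass

@dataclass
class TagInformation:
    """Illustrative record for an event of interest within a spherical
    video segment: a point in time plus the viewing angle at which the
    event is viewable."""
    time_seconds: float   # point in time within the segment duration
    yaw_degrees: float    # horizontal viewing angle, 0 to 360 degrees
    pitch_degrees: float  # vertical viewing angle, -90 to 90 degrees
    comment: str = ""     # optional text, emoji, and/or image reference
```

Such a record could be stored in electronic storage 122 and/or on server(s) 104 and transmitted over the network to client computing platform(s) 102 alongside the spherical video segment.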
  • Spherical segment component 106 may be configured to obtain the tag information associated with the event of interest. If stored on one or more server(s) 104, the tag information may be transmitted over the network to one or more client computing platform(s) 102 such that it may be received by spherical segment component 106.
  • field of view component 112 may be configured to, proximate to the point in time, determine whether the viewing angle of the event of interest is located within the display field of view. Proximate to the point in time at which the event of interest is viewable may include one or more of a predefined time period prior to the point in time at which the event of interest is viewable, a predefined time period after the point in time at which the event of interest is viewable, and/or the point in time at which the event of interest is viewable. As discussed above, the display field of view may be determined on a recurring or ongoing basis.
  • the display field of view may indicate a current field of view determined based upon the orientation of the display associated with client computing platform(s) 102 during presentation of the spherical video segment.
  • the display field of view may change one or more times over the course of presentation of the spherical video segment via the display.
  • the display field of view may include the viewing angle visible to the user via the display at a current point in time, as numerous fields of view that are not visible to the user may be available at the current point in time (e.g., 180-degrees to the left or right of the display field of view may not be visible to the user via the display field of view unless the display is moved to the left or right).
  • Field of view component 112 may automatically alter the display field of view to display the event of interest, and/or the user may alter the display field of view by moving and/or orienting client computing platform(s) 102.
  • the display field of view may include one or more visible ranges of viewing angles within the spherical video segment for a window in time within the spherical video segment.
  • the window of time within the spherical video segment may include and/or correspond to the proximate point in time associated with the point in time.
  • the window of time may correspond to the point in time and/or range of time relevant to the event of interest.
  • the window of time within the spherical video segment may include a separate or partially separate window of time, a period of time associated with the tag information, the segment duration, and/or another point in time, portion of time, and/or range of time within the spherical video segment.
  • the window of time may include a portion of the duration of the spherical video segment that begins prior to the period of time, such that the viewing angle may indicate the display field of view for the user before the beginning of the period of time for the event of interest indicated by the tag information.
  • the window of time may be determined by field of view component 112, and/or selected and/or otherwise indicated by a user.
  • Field of view component 112 may be configured to determine whether the field of view associated with the tag information is located within and/or outside the display field of view proximate to the point in time.
  • within the display field of view may include completely and/or partially within the display field of view.
  • Outside the display field of view may include completely and/or partially outside, and/or off-centered within the display field of view.
  • the determination of whether the viewing angle associated with the tag information is located within and/or outside the display field of view may be determined in an ongoing and/or reoccurring manner for the duration of the spherical video segment.
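The within/outside determination can be sketched as an angular comparison with horizontal wraparound. This simplified, hypothetical example checks only the horizontal field of view; a fuller version would also compare pitch against the vertical field of view:

```python
def angular_offset(target_yaw: float, display_yaw: float) -> float:
    """Signed smallest yaw difference in degrees, in (-180, 180]."""
    diff = (target_yaw - display_yaw) % 360.0
    return diff - 360.0 if diff > 180.0 else diff

def is_within_display(target_yaw: float, display_yaw: float,
                      h_fov: float = 90.0) -> bool:
    """True when the tagged viewing angle falls inside the display
    field of view (horizontal check only, for brevity)."""
    return abs(angular_offset(target_yaw, display_yaw)) <= h_fov / 2.0
```

The modulo arithmetic handles the 360-degree wraparound, so a tag at 350 degrees is correctly treated as 20 degrees to the left of a display centered at 10 degrees, not 340 degrees away.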
  • Notification component 118 may be configured to, responsive to a determination proximate to the point in time that the viewing angle is outside the display field of view, generate alert information indicating the event of interest for the spherical video segment is located outside the display field of view.
  • the alert information may indicate the location of the event of interest within the spherical video segment indicated by the tag information.
  • the alert information may include the viewing angle associated with the tag information, the point in time associated with the tag information, and/or other information related to the event of interest indicated by and/or associated with the tag information.
  • notification component 118 may be configured to effectuate transmission of the alert information over the network to one or more client computing platform(s) 102 associated with the display on which the spherical video segment may be presented.
  • Notification component 118 may be configured to effectuate presentation of a notification based upon the alert information.
  • the notification may include one or more of a graphical notification, an audible notification, a sensory notification, and/or other types of notifications.
  • the notification may include an alert message presented within the display field of view of the spherical video segment.
  • the notification, for example, may include an alert sound audible to the user.
  • An example sensory notification may include a vibration and/or light notification.
  • the notification may indicate to the user that the user may be missing, has missed, and/or may be about to miss the event of interest of the spherical video segment that may be occurring outside the display field of view.
  • a visible notification may be presented at and/or near the center of the display field of view, in the periphery of the display field of view, and/or at other locations within the display field of view of spherical video segment.
  • An audible notification may include a notification sound played by a speaker, within one or more earphones, within one or both sides of a headset, and/or via other audio output.
  • a sensory notification may be delivered via client computing platform(s) 102, one or more display devices associated with client computing platform(s) 102, one or more control (e.g., user interfacing) devices associated with client computing platform(s) 102, and/or other devices.
  • the notification may include a direction to orient the display to include the viewing angle within the display field of view.
  • the notification may indicate where within the spherical video segment the event of interest is taking place and/or is about to take place.
  • the notification may include a direction to orient the display to include the viewing angle associated with the tag information within the display field of view.
  • the notification may include a number of degrees to angle and/or move the display such that the event of interest may be displayed within the display field of view. For example, proximate to the point in time associated with the tag information, a graphical notification may include "Look to your left to see something cool!" The graphical notification may include an arrow pointing in the direction to move the display such that the event of interest may be displayed within the display field of view.
  • An audible notification may include a spoken message of "Don't miss what is happening to your right" and/or a sound may be played in the right ear of the user to indicate to the user that the user should look to the right.
  • the sound may be played in the left ear of the user to indicate to the user that the user should look to the left.
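A minimal sketch of how the left/right direction and the number of degrees might be derived from the yaw difference (the function name and the degree convention are assumptions):

```python
def direction_hint(viewing_yaw, display_yaw):
    """Return a (side, degrees) hint telling the user which way to turn
    the display so the tagged viewing angle enters the field of view."""
    diff = (viewing_yaw - display_yaw + 180.0) % 360.0 - 180.0
    side = "right" if diff > 0 else "left"
    return side, abs(diff)
```

The same hint could drive both a graphical arrow and the choice of ear for an audible cue.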
  • the notification may include the point in time at which the event of interest is viewable.
  • the notification may indicate the point in time within the spherical video segment the event of interest is taking place, is about to take place, and/or did take place.
  • the graphical notification may include "Look to your left to see something cool at 2 minutes and 8 seconds!"
  • the graphical notification may include a timer that counts down until the point in time.
  • the audible notification, for example, may include a spoken message of "Don't miss what is happening to your right in 10 seconds."
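The countdown-style messages above might be assembled as follows (a sketch; the wording and formatting are assumptions):

```python
def countdown_message(point_in_time, current_time, side):
    """Build a countdown notification such as
    "Don't miss what is happening to your right in 10 seconds."
    """
    remaining = max(0.0, point_in_time - current_time)
    return (f"Don't miss what is happening to your {side} "
            f"in {remaining:.0f} seconds.")
```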
  • Notification component 118 may be configured to effectuate presentation of the notification after the point in time has passed.
  • System 100 may provide user controls, as discussed above, to allow the user associated with client computing platform(s) 102 to fast forward, rewind, view in slow motion, and/or other user controls to control display and/or manipulation of the spherical video segment.
  • FIG. 4 depicts an example display field of view 400 of spherical video segment 200 including notification 402 at point in time 404, in accordance with one or more implementations.
  • Responsive to a determination proximate to point in time 404 (e.g., point in time 404 being 60 seconds into spherical video segment 200) that a viewing angle associated with tag information associated with an event of interest is located outside display field of view 400, notification 402 may be generated and presented within display field of view 400.
  • Notification 402 may indicate that an event of interest within spherical video segment 200 is about to take place and/or is taking place outside field of view 400.
  • Notification 402 may indicate a direction toward which the user should orient display 202 to observe the portion of spherical video segment 200 that may include the event of interest. For example, perhaps the skier from FIGS. 2 and 3 performs a jump at 67 seconds into spherical video segment 200. As the skier is no longer depicted within display field of view 400, notification 402 may indicate which direction to orient display 202 to view the skier and/or the event of interest (e.g., the ski jump) within display field of view 400 again. While notification 402 is depicted in FIG. 4 as an arrow, this is for exemplary purposes only and is not meant to be a limitation of this disclosure, as other notifications may be provided.
  • system 100 may receive, via client computing platform(s) 102 associated with the display, the tag information associated with the event of interest such that the user may be reminded of the event of interest the next time the user views the spherical video segment, and/or so that other viewers and/or consumers do not miss the event of interest while viewing the spherical video segment. While a consumer other than the user views the spherical video segment, that consumer may wish to mark and/or tag, in a similar manner as discussed above, a second event of interest within the spherical video segment for his or her reference and/or for other viewers and/or consumers.
  • the tag information associated with the second event of interest may identify a second point in time in the spherical video segment and a second viewing angle within the spherical video segment at which the second event of interest may be viewable within the spherical video segment.
  • the tag information associated with the second event of interest may be stored within electronic storage 122 and/or other storage device.
  • system 100 may effectuate transmission of the tag information associated with the second event of interest to a second client computing platform associated with a second consumer, such that the second consumer may view the spherical video segment via a display associated with the second client computing platform.
  • Notifications may be presented for the event of interest and/or the second event of interest as the second consumer views the spherical video segment.
  • the spherical video segment may include any number of events of interest via the tag information, which may be received by the same user and/or other users. Any number of notifications for individual events of interest may be presented for future viewers of the spherical video segment.
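With multiple tags, the pending notifications could be selected during playback roughly as below; the tag dictionary keys, the lead window, and the field-of-view width are illustrative assumptions:

```python
# example tag information: a point in time and a viewing angle per event
tags = [{"point_in_time": 67.0, "viewing_angle": 200.0},
        {"point_in_time": 120.0, "viewing_angle": 10.0}]

def pending_notifications(tags, current_time, display_yaw,
                          fov_width=90.0, lead=5.0):
    """Return tags whose point in time falls within `lead` seconds of
    the current time and whose viewing angle is outside the display
    field of view, i.e., the tags that warrant a notification."""
    due = []
    for tag in tags:
        diff = (tag["viewing_angle"] - display_yaw + 180.0) % 360.0 - 180.0
        upcoming = 0.0 <= tag["point_in_time"] - current_time <= lead
        if upcoming and abs(diff) > fov_width / 2.0:
            due.append(tag)
    return due
```

At 63 seconds with the display facing yaw 0, only the first tag is both imminent and out of view, so only it would produce a notification.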
  • the tag information may be used for editing the spherical video segment.
  • editing the spherical video segment may include one or more of: selecting and/or compiling a two dimensional video segment of the spherical video segment; augmenting (e.g., removing content, etc.), enhancing (e.g., color correcting, etc.), and/or otherwise refining one or more portions and/or sections (e.g., frames, and/or other portions and/or sections) of the spherical video segment; and/or otherwise editing one or more portions and/or sections (e.g., frames, and/or other portions and/or sections) of the spherical video segment.
  • the tag information for the spherical video segment may indicate one or more portions and/or sections (e.g., frames, and/or other portions and/or sections) of the spherical video segment that should be edited.
  • the tag information may be used to select portions and/or sections from the spherical video segment to edit.
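Selecting portions of the spherical video segment to edit from the tag information might look like this sketch (the padding value is an assumption):

```python
def clips_from_tags(tag_times, segment_duration, pad=3.0):
    """Derive (start, end) clip ranges around tagged points in time,
    clamped to the segment duration, for use in editing."""
    return [(max(0.0, t - pad), min(segment_duration, t + pad))
            for t in sorted(tag_times)]
```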
  • client computing platforms 102, server(s) 104, and/or external resources 120 may be operatively linked via one or more electronic communication links.
  • electronic communication links may be established, at least in part, via a network 130 such as the Internet and/or other networks. It will be appreciated that this is not intended to be limiting and that the scope of this disclosure includes implementations in which client computing platform(s) 102, server(s) 104, and/or external resources 120 may be operatively linked via some other communication media.
  • the external resources 120 may include sources of information, hosts and/or providers of virtual spaces outside of system 100, external entities participating with system 100, external entities for digital content and/or digital content platforms, and/or other resources. In some implementations, some or all of the functionality attributed herein to external resources 120 may be provided by resources included in system 100.
  • the client computing platform(s) 102 may include electronic storage 122, one or more processor(s) 124, and/or other components.
  • the client computing platform(s) 102 may include communication lines or ports to enable the exchange of information with a network and/or other servers and/or computing platforms. Illustration of client computing platform(s) 102 in FIG. 1 is not intended to be limiting.
  • the client computing platform(s) 102 may include a plurality of hardware, software, and/or firmware components operating together to provide the functionality attributed herein to client computing platform(s) 102.
  • client computing platform(s) 102 may be implemented by a cloud of computing platforms operating together as client computing platform(s) 102.
  • One or more server(s) 104 may include electronic storage, one or more processors, and/or computer readable instructions that, when executed by the one or more processors, implement components 106-118.
  • Electronic storage 122 may comprise electronic storage media that electronically stores information.
  • the electronic storage media of electronic storage 122 may include one or both of system storage that is provided integrally (i.e., substantially non-removable) with client computing platform(s) 102 and/or removable storage that is removably connectable to client computing platform(s) 102 via, for example, a port or a drive.
  • a port may include a USB port, a firewire port, and/or other port.
  • a drive may include a disk drive and/or other drive.
  • Electronic storage 122 may include one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge-based storage media (e.g., EEPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media.
  • the electronic storage 122 may include one or more virtual storage resources (e.g., cloud storage, a virtual private network, and/or other virtual storage resources).
  • Electronic storage 122 may store software algorithms, information determined by processor(s) 124, information received from client computing platform(s) 102, information received from server(s) 104, and/or other information that enables client computing platform(s) 102 to function as described herein.
  • Processor(s) 124 are configured to provide information processing capabilities in client computing platform(s) 102.
  • processor(s) 124 may include one or more of a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, a state machine, and/or other mechanisms for electronically processing information.
  • Although processor(s) 124 are shown in FIG. 1 as a single entity, this is for illustrative purposes only.
  • processor(s) 124 may include one or more processing units. These processing units may be physically located within the same device, or processor(s) 124 may represent processing functionality of a plurality of devices operating in coordination.
  • the processor 124 may be configured to execute components 106-118.
  • Processor 124 may be configured to execute components 106, 108, 110, 111, 112, 114, 116, and/or 118 by software; hardware; firmware; some combination of software, hardware, and/or firmware; and/or other mechanisms for configuring processing capabilities on processor(s) 124.
  • components 106-118 are illustrated in FIG. 1 as being located and/or co-located within a particular component of system 100, in implementations in which physical processor(s) 124 include multiple processing units, one or more of components 106-118 may be located remotely from the other components.
  • components 106-118 may provide more or less functionality than is described.
  • one or more of components 106-118 may be eliminated, and some or all of its functionality may be incorporated, shared, integrated into, and/or otherwise provided by other ones of components 106-118.
  • physical processor(s) 124 may be configured to execute one or more additional components that may perform some or all of the functionality attributed herein to one of components 106-118.
  • One or more of the components of system 100 may be configured to present and/or provide a user interface to provide an interface between system 100 and a user (e.g. a controlling entity, and/or other users using a graphical user interface) through which the user can provide information to and receive information from system 100.
  • This enables data, results, and/or instructions (e.g., determinations, selections, and/or other indications) and any other communicable items, collectively referred to as "information," to be communicated between the user and system 100.
  • An example of information that may be conveyed by a user and/or controlling entity is a selected time indication, a selected location indication, a user comment and/or comment information, and/or other information.
  • Examples of interface devices suitable for inclusion in a user interface include one or more of those associated with a computing platform, a keypad, buttons, switches, a keyboard, knobs, levers, a display screen, a touch screen, a mouse, speakers, a microphone, an indicator light, an audible alarm, and/or a printer.
  • Information may be provided to a user by the user interface in the form of a graphical user interface.
  • the user interface may be integrated with a removable storage interface provided by electronic storage 122.
  • information is loaded into system 100 from removable storage (e.g., a smart card, a flash drive, a removable disk, etc.) that enables the user(s) to customize system 100.
  • Other exemplary input devices and techniques adapted for use with system 100 as the user interface include, but are not limited to, an RS-232 port, an RF link, an IR link, and a modem (telephone, cable, Ethernet, Internet, or other). In short, any technique for communicating information with system 100 is contemplated as the user interface.
  • FIGS. 5 and 6 illustrate exemplary methods 500 and 600 for presenting and viewing a spherical video segment, in accordance with one or more implementations.
  • the operations of methods 500 and/or 600 presented below are intended to be illustrative and non-limiting examples. In certain implementations, methods 500 and/or 600 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of methods 500 and/or 600 are illustrated in FIGS. 5 and/or 6 and described below is not intended to be limiting.
  • methods 500 and/or 600 may be implemented in one or more processing devices (e.g., a digital processor, an analog processor, a digital circuit designed to process information, an analog circuit designed to process information, and/or other mechanisms for electronically processing information).
  • the one or more processing devices may include one or more devices executing some or all of the operations of methods 500 and/or 600 in response to instructions stored electronically on an electronic storage medium.
  • the one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 500.
  • a spherical video segment may be obtained.
  • operation 502 may be performed by a spherical segment component that is the same as or similar to spherical segment component 106 (shown in FIG. 1 and described herein).
  • an orientation of a two dimensional display may be determined based upon output signals of a sensor.
  • the two dimensional display may be configured to present two dimensional images.
  • the sensor may be configured to generate output signals conveying information related to the orientation of the display.
  • operation 504 may be performed by an orientation component that is the same as or similar to orientation component 108 (shown in FIG. 1 and described herein).
  • a display field of view within the spherical video segment to be presented on the display may be determined based upon the orientation of the display.
  • operation 506 may be performed by a field of view component that is the same as or similar to field of view component 112 (shown in FIG. 1 and described herein).
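One plausible way to determine the display field of view from the display's orientation, assuming the orientation reduces to yaw/pitch in degrees and fixed angular extents (none of which the disclosure mandates):

```python
def display_fov(yaw, pitch, h_fov=90.0, v_fov=60.0):
    """Angular bounds of the display field of view on the sphere,
    centered on the display's yaw/pitch orientation (degrees).
    Yaw wraps modulo 360; pitch is clamped at the poles."""
    return {
        "yaw_min": (yaw - h_fov / 2.0) % 360.0,
        "yaw_max": (yaw + h_fov / 2.0) % 360.0,
        "pitch_min": max(-90.0, pitch - v_fov / 2.0),
        "pitch_max": min(90.0, pitch + v_fov / 2.0),
    }
```

A display facing straight ahead (yaw 0, pitch 0) spans yaw 315-45 across the seam and pitch -30 to 30.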
  • At operation 508, the display field of view of the spherical video segment may be presented on the display.
  • operation 508 may be performed by a presentation component that is the same as or similar to presentation component 114 (shown in FIG. 1 and described herein).
  • the display field of view may be captured as a two dimensional video segment.
  • operation 510 may be performed by a capture component that is the same as or similar to capture component 116 (shown in FIG. 1 and described herein).
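Capturing the display field of view as a two dimensional video segment could, for an equirectangularly stored spherical frame, amount to cropping a per-frame pixel window. The mapping below is a sketch that assumes yaw 0-360 maps to x and pitch +90 to -90 maps to y, and it ignores the case where the window crosses the 0/360 seam:

```python
def fov_pixel_window(yaw, pitch, frame_w, frame_h, h_fov=90.0, v_fov=60.0):
    """Map the display field of view to a pixel window (x0, x1, y0, y1)
    in an equirectangular frame of size frame_w x frame_h."""
    x0 = int(((yaw - h_fov / 2.0) % 360.0) / 360.0 * frame_w)
    x1 = int(((yaw + h_fov / 2.0) % 360.0) / 360.0 * frame_w)
    y0 = int((90.0 - min(90.0, pitch + v_fov / 2.0)) / 180.0 * frame_h)
    y1 = int((90.0 - max(-90.0, pitch - v_fov / 2.0)) / 180.0 * frame_h)
    return x0, x1, y0, y1
```

Cropping this window from each frame and encoding the crops in sequence would yield the two dimensional video segment.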
  • a spherical video segment may be obtained.
  • the spherical video segment may include tag information associated with an event of interest in the spherical video segment.
  • the tag information may identify a point in time in the spherical video segment and a viewing angle in the spherical video segment at which the event of interest is viewable in the spherical video segment.
  • operation 602 may be performed by a spherical segment component that is the same as or similar to spherical segment component 106 (shown in FIG. 1 and described herein).
  • an orientation of a two dimensional display may be determined based upon output signals of a sensor.
  • the two dimensional display may be configured to present two dimensional images.
  • the sensor may be configured to generate output signals conveying information related to the orientation of the display.
  • operation 604 may be performed by an orientation component that is the same as or similar to orientation component 108 (shown in FIG. 1 and described herein).
  • a display field of view within the spherical video segment to be presented on the display may be determined based upon the orientation of the display.
  • operation 606 may be performed by a field of view component that is the same as or similar to field of view component 112 (shown in FIG. 1 and described herein).
  • At operation 608, the display field of view of the spherical video segment may be presented on the display.
  • operation 608 may be performed by a presentation component that is the same as or similar to presentation component 114 (shown in FIG. 1 and described herein).
  • At operation 610, whether the viewing angle is located within the display field of view proximate to the point in time may be determined.
  • operation 610 may be performed by a field of view component that is the same as or similar to field of view component 112 (shown in FIG. 1 and described herein).
  • alert information may be generated.
  • the alert information may indicate the event of interest for the spherical video segment is located outside the display field of view.
  • operation 612 may be performed by a notification component that is the same as or similar to notification component 118 (shown in FIG. 1 and described herein).
  • a notification may be presented based upon the alert information.
  • the notification may include the alert information and may be presented within the display field of view.
  • operation 614 may be performed by a notification component that is the same as or similar to notification component 118 (shown in FIG. 1 and described herein).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Systems and methods for presenting and viewing a spherical video segment are disclosed. The spherical video segment, including tag information associated with an event of interest, may be obtained. The tag information may identify a point in time and a viewing angle at which the event of interest is viewable within the spherical video segment. An orientation of a two dimensional display may be determined based upon output signals of a sensor. A display field of view within the spherical video segment may be determined and presented on the display based upon the orientation of the display. The display field of view may be captured as a two dimensional video segment. If the viewing angle of the event of interest is outside the display field of view proximate to the point in time, a notification may be presented within the display field of view.
PCT/US2017/018508 2016-02-17 2017-02-17 System and method for presenting and viewing a spherical video segment WO2017143289A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP17753994.7A EP3417609A4 (fr) 2016-02-17 2017-02-17 System and method for presenting and viewing a spherical video segment
CN201780012107.XA CN108702451A (zh) 2016-02-17 2017-02-17 用于呈现和观看球形视频片段的系统和方法

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US15/046,344 2016-02-17
US15/046,344 US9973746B2 (en) 2016-02-17 2016-02-17 System and method for presenting and viewing a spherical video segment
US15/050,275 2016-02-22
US15/050,297 2016-02-22
US15/050,275 US9743060B1 (en) 2016-02-22 2016-02-22 System and method for presenting and viewing a spherical video segment
US15/050,297 US9602795B1 (en) 2016-02-22 2016-02-22 System and method for presenting and viewing a spherical video segment

Publications (1)

Publication Number Publication Date
WO2017143289A1 true WO2017143289A1 (fr) 2017-08-24

Family

ID=59625526

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2017/018508 WO2017143289A1 (fr) 2016-02-17 2017-02-17 System and method for presenting and viewing a spherical video segment

Country Status (3)

Country Link
EP (1) EP3417609A4 (fr)
CN (1) CN108702451A (fr)
WO (1) WO2017143289A1 (fr)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10482653B1 (en) 2018-05-22 2019-11-19 At&T Intellectual Property I, L.P. System for active-focus prediction in 360 video
US10721510B2 (en) 2018-05-17 2020-07-21 At&T Intellectual Property I, L.P. Directing user focus in 360 video consumption
US10827225B2 (en) 2018-06-01 2020-11-03 AT&T Intellectual Property I, L.P. Navigation for 360-degree video streaming

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040075738A1 (en) * 1999-05-12 2004-04-22 Sean Burke Spherical surveillance system architecture
US20050062869A1 (en) * 1999-04-08 2005-03-24 Zimmermann Steven Dwain Immersive video presentations
US20100299630A1 (en) * 2009-05-22 2010-11-25 Immersive Media Company Hybrid media viewing application including a region of interest within a wide field of view
US20120206565A1 (en) * 2011-02-10 2012-08-16 Jason Villmer Omni-directional camera and related viewing software
US20120242798A1 (en) * 2011-01-10 2012-09-27 Terrence Edward Mcardle System and method for sharing virtual and augmented reality scenes between users and viewers
US8890954B2 (en) * 2010-09-13 2014-11-18 Contour, Llc Portable digital video camera configured for remote image acquisition control and viewing
US9473758B1 (en) * 2015-12-06 2016-10-18 Sliver VR Technologies, Inc. Methods and systems for game video recording and virtual reality replay

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8964008B2 (en) * 2011-06-17 2015-02-24 Microsoft Technology Licensing, Llc Volumetric video presentation
JP6075066B2 (ja) * 2012-12-28 2017-02-08 株式会社リコー 画像管理システム、画像管理方法、及びプログラム
US10977864B2 (en) * 2014-02-21 2021-04-13 Dropbox, Inc. Techniques for capturing and displaying partial motion in virtual or augmented reality scenes

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050062869A1 (en) * 1999-04-08 2005-03-24 Zimmermann Steven Dwain Immersive video presentations
US20040075738A1 (en) * 1999-05-12 2004-04-22 Sean Burke Spherical surveillance system architecture
US20100299630A1 (en) * 2009-05-22 2010-11-25 Immersive Media Company Hybrid media viewing application including a region of interest within a wide field of view
US8890954B2 (en) * 2010-09-13 2014-11-18 Contour, Llc Portable digital video camera configured for remote image acquisition control and viewing
US8896694B2 (en) * 2010-09-13 2014-11-25 Contour, Llc Portable digital video camera configured for remote image acquisition control and viewing
US20120242798A1 (en) * 2011-01-10 2012-09-27 Terrence Edward Mcardle System and method for sharing virtual and augmented reality scenes between users and viewers
US20120206565A1 (en) * 2011-02-10 2012-08-16 Jason Villmer Omni-directional camera and related viewing software
US9473758B1 (en) * 2015-12-06 2016-10-18 Sliver VR Technologies, Inc. Methods and systems for game video recording and virtual reality replay

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3417609A4 *

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10721510B2 (en) 2018-05-17 2020-07-21 At&T Intellectual Property I, L.P. Directing user focus in 360 video consumption
US11218758B2 (en) 2018-05-17 2022-01-04 At&T Intellectual Property I, L.P. Directing user focus in 360 video consumption
US10482653B1 (en) 2018-05-22 2019-11-19 At&T Intellectual Property I, L.P. System for active-focus prediction in 360 video
US10783701B2 (en) 2018-05-22 2020-09-22 At&T Intellectual Property I, L.P. System for active-focus prediction in 360 video
US11100697B2 (en) 2018-05-22 2021-08-24 At&T Intellectual Property I, L.P. System for active-focus prediction in 360 video
US11651546B2 (en) 2018-05-22 2023-05-16 At&T Intellectual Property I, L.P. System for active-focus prediction in 360 video
US10827225B2 (en) 2018-06-01 2020-11-03 AT&T Intellectual Property I, L.P. Navigation for 360-degree video streaming
US11197066B2 (en) 2018-06-01 2021-12-07 At&T Intellectual Property I, L.P. Navigation for 360-degree video streaming

Also Published As

Publication number Publication date
CN108702451A (zh) 2018-10-23
EP3417609A1 (fr) 2018-12-26
EP3417609A4 (fr) 2019-07-17

Similar Documents

Publication Publication Date Title
US11546566B2 (en) System and method for presenting and viewing a spherical video segment
US9743060B1 (en) System and method for presenting and viewing a spherical video segment
US9973746B2 (en) System and method for presenting and viewing a spherical video segment
US10750088B2 (en) System and method for identifying comment clusters for panoramic content segments
US10843088B2 (en) Sharing recorded gameplay
US11482192B2 (en) Automated object selection and placement for augmented reality
US10430558B2 (en) Methods and systems for controlling access to virtual reality media content
US8867886B2 (en) Surround video playback
US20160227115A1 (en) System for digital media capture
US20150172238A1 (en) Sharing content on devices with reduced user actions
US20130129304A1 (en) Variable 3-d surround video playback with virtual panning and smooth transition
KR20150100795A (ko) 그룹 이벤트에서의 영상 캡처, 처리 및 전달
EP3272127B1 (fr) Système d'interaction sociale à base de vidéo
WO2017143289A1 (fr) System and method for presenting and viewing a spherical video segment
US11057681B2 (en) Systems and methods for providing access to still images derived from a video
JP2024543235A (ja) コンピュータネットワーク上で配信される複数のコンテンツストリームを特徴とするメディア資産の迅速なコンテンツ切り替えを提供するためのシステム及び方法
CA3237494A1 (fr) Systemes et procedes pour fournir une commutation de contenu rapide dans des contenus multimedias comprenant de multiples flux de contenu qui sont delivres sur des reseaux informatiques
US20180302691A1 (en) Providing smart tags

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17753994

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2017753994

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 2017753994

Country of ref document: EP

Effective date: 20180917