CN112637612B - Live broadcast platform and interactive video processing method thereof - Google Patents
- Publication number: CN112637612B (application number CN201910907638.6A)
- Authority
- CN
- China
- Prior art keywords
- playing
- client
- interactive video
- path
- node
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/21—Server components or server architectures
- H04N21/218—Source of audio or video content, e.g. local disk arrays
- H04N21/2187—Live feed
- H04N21/25—Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
- H04N21/262—Content or additional data distribution scheduling, e.g. sending additional data at off-peak times, updating software modules, calculating the carousel transmission frequency, delaying a video stream transmission, generating play-lists
- H04N21/26208—Content or additional data distribution scheduling, the scheduling operation being performed under constraints
- H04N21/26241—Content or additional data distribution scheduling, the scheduling operation being performed under constraints involving the time of distribution, e.g. the best time of the day for inserting an advertisement or airing a children program
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/4302—Content synchronisation processes, e.g. decoder synchronisation
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/47—End-user applications
- H04N21/472—End-user interface for requesting content, additional data or services; End-user interface for interacting with content, e.g. for content reservation or setting reminders, for requesting event notification, for manipulating displayed content
- H04N21/47217—End-user interface for controlling playback functions for recorded or on-demand content, e.g. using progress bars, mode or play-point indicators or bookmarks
- H04N21/478—Supplemental services, e.g. displaying phone caller identification, shopping application
- H04N21/4788—Supplemental services communicating with other users, e.g. chatting
Abstract
The embodiments of this specification provide a live broadcast platform and an interactive video processing method for it. Playing nodes are edited on the live broadcast platform, and the paths between the playing nodes are filled with video files to generate an interactive video. In addition, because a playing time offset is edited for each playing node, the live broadcast platform can control the playing progress of each client, based on those offsets, when the interactive video is played, so that all clients discuss the video content on a relatively uniform timeline, which improves the interactivity among users in a live broadcast room while they watch the interactive video.
Description
Technical Field
This specification relates to the technical field of computer software, and in particular to a live broadcast platform and an interactive video processing method for it.
Background
An interactive video is a new type of video that integrates an interactive experience into a linear video through various technical means. Fig. 1 is a schematic diagram of an interactive video in a practical application scenario. When the interactive video reaches a certain point, several branch options can be presented on the playing interface for the user to choose from; when watching the interactive video on a live broadcast platform, the user can independently select different branches to watch different plot directions.
Disclosure of Invention
Based on the above, the present specification provides a live broadcast platform and an interactive video processing method thereof.
According to a first aspect of embodiments of the present specification, there is provided a live platform, including:
the node editing module and the video editing module;
the node editing module is used for receiving a node editing instruction, creating playing nodes according to the node editing instruction, and setting a playing time offset for each playing node, the playing nodes comprising at least one bifurcation node and at least two child nodes of that bifurcation node;
the video editing module is used for receiving video files and associating each video file with a corresponding playing sub-path to generate an interactive video, where a path between adjacent playing nodes is a playing sub-path, each playing sub-path of the same bifurcation node corresponds to a scenario branch in the interactive video, and the playing time offset is used to control the playing progress of each client that plays the interactive video in the live broadcast room, so that the difference in playing progress between any two clients is smaller than a preset value.
According to a second aspect of embodiments of the present specification, there is provided an interactive video processing method, based on the live broadcast platform of any embodiment, the method including:
receiving a node editing instruction, creating playing nodes according to the node editing instruction, and setting a playing time offset for each playing node, the playing nodes comprising at least one bifurcation node and at least two child nodes of that bifurcation node;
receiving video files, and associating each video file with a corresponding playing sub-path to generate an interactive video, where a path between adjacent playing nodes is a playing sub-path, each playing sub-path of the same bifurcation node corresponds to a scenario branch in the interactive video, and the playing time offset is used to control the playing progress of each client that plays the interactive video in the live broadcast room, so that the difference in playing progress between any two clients is smaller than a preset value.
By applying the scheme of the embodiments of this specification, playing nodes are edited on the live broadcast platform, and the paths between the playing nodes are filled with video files to generate the interactive video.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the specification.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present specification and together with the description, serve to explain the principles of the specification.
Fig. 1 is a schematic diagram of an interactive video in a practical application scenario.
Fig. 2 is a schematic diagram of a live platform of one embodiment of the present description.
Fig. 3 is a schematic diagram of a play node according to an embodiment of the present specification.
Fig. 4(a) is a schematic diagram of an interactive video editing interface according to an embodiment of the present specification.
Fig. 4(b) is a schematic diagram of an interactive video editing interface according to another embodiment of the present specification.
Fig. 5 is a schematic diagram of interactive video playing times and playing sessions according to an embodiment of the present specification.
Fig. 6 is a schematic diagram of a bullet screen displayed on a client according to an embodiment of the present disclosure.
Fig. 7 is a schematic diagram of a bullet screen displayed on a client according to another embodiment of the present disclosure.
Fig. 8 is a schematic diagram of a live platform of another embodiment of the present description.
Fig. 9 is a partial functional architecture diagram of a live platform of one embodiment of the present description.
FIG. 10 is a schematic diagram of an interactive video creation process, one embodiment of the present description.
Fig. 11 is a schematic diagram of an interactive video playback process according to an embodiment of the present specification.
Fig. 12 is an overall functional architecture diagram of a live platform of an embodiment of the present specification.
Fig. 13 is a flowchart of an interactive video processing method according to an embodiment of the present specification.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present specification. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the specification, as detailed in the appended claims.
The terminology used in the description herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the description. As used in this specification and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any and all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used herein to describe various kinds of information, the information should not be limited by these terms. These terms are only used to distinguish one kind of information from another. For example, without departing from the scope of this specification, first information may also be referred to as second information, and similarly, second information may also be referred to as first information. Depending on the context, the word "if" as used herein may be interpreted as "upon", "when", or "in response to determining".
Fig. 2 is a schematic diagram of a live broadcast platform according to an embodiment of the present specification. The live platform may include:
a node editing module 201 and a video editing module 202;
the node editing module 201 is configured to receive a node editing instruction, create playing nodes according to the node editing instruction, and set a playing time offset for each playing node, the playing nodes comprising at least one bifurcation node and at least two child nodes of that bifurcation node;
the video editing module 202 is configured to receive video files and associate each video file with a corresponding playing sub-path to generate an interactive video, where a path between adjacent playing nodes is a playing sub-path, each playing sub-path of the same bifurcation node corresponds to a scenario branch in the interactive video, and the playing time offset is used to control the playing progress of each client that plays the interactive video in the live broadcast room, so that the difference in playing progress between any two clients is smaller than a preset value.
The terms used in this example are defined as follows:
playback node (node for short): nodes arranged and associated according to a certain organization structure can be predefined during interactive video production, and an interactive video at least needs 3 nodes, namely a bifurcation node and at least two child nodes corresponding to the bifurcation node (namely, the bifurcation node is a father node of the two child nodes), and can also comprise other nodes, such as aggregation nodes and common nodes. The node including a plurality of child nodes is a bifurcation node, the node including a plurality of father nodes is an aggregation node, and the node including at most only one father node and at most only one child node is a non-bifurcation node. According to the positions of the nodes, the nodes can be further divided into a starting node, a middle node and an ending node, wherein the starting node is a root node, and the starting node has no father node; the end node is a leaf node, and the end node has no child node; nodes having both child nodes and parent nodes are intermediate nodes.
A schematic diagram of the playing nodes of one embodiment is shown in fig. 3. The graph contains 8 playing nodes: S, R, A, B, E1, E2, E3 and E4. Node S is the start node; node R is the child node of node S; nodes A and B are child nodes of node R; nodes E1 and E2 are child nodes of node A; and nodes E3 and E4 are child nodes of node B. Node S is a non-bifurcation node; nodes R, A and B are bifurcation nodes; and nodes E1, E2, E3 and E4 are all end nodes.
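The node graph of fig. 3 can be sketched with a minimal tree structure. The class and property names below (`PlayNode`, `is_fork`, `is_end`) are illustrative assumptions, not identifiers from the patent:

```python
from dataclasses import dataclass, field

@dataclass
class PlayNode:
    code: str
    children: list["PlayNode"] = field(default_factory=list)

    @property
    def is_fork(self) -> bool:
        # a node with more than one child is a bifurcation node
        return len(self.children) > 1

    @property
    def is_end(self) -> bool:
        # a leaf node (no children) is an end node
        return not self.children

# Rebuild the graph of fig. 3: S -> R -> {A -> {E1, E2}, B -> {E3, E4}}
e1, e2, e3, e4 = (PlayNode(c) for c in ("E1", "E2", "E3", "E4"))
a = PlayNode("A", [e1, e2])
b = PlayNode("B", [e3, e4])
r = PlayNode("R", [a, b])
s = PlayNode("S", [r])
```

With this representation, classifying a node as a bifurcation, common, or end node is a simple structural check rather than a separate bookkeeping field.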
Adjacent playing nodes: a playing node and its parent node are adjacent playing nodes; likewise, a playing node and its child node are adjacent playing nodes.
Playing path: a path connecting two playing nodes. For example, the path connecting node S and node R is the playing path SR from node S to node R, and the path connecting node S and node A is the playing path SR → RA from node S to node A. The playing path between two adjacent playing nodes is also called a playing sub-path; for example, in fig. 3, SR is the playing sub-path between playing node S and playing node R, and RA is the playing sub-path between playing node R and playing node A. A bifurcation node has multiple subordinate playing paths, while a common node or an aggregation node has one. The playing path between a node and one of its child nodes is said to be subordinate to that node, and a playing path subordinate to a child node of a node is also subordinate to that node.
The interactive video comprises a plurality of video files: each playing sub-path corresponds to one video file of the interactive video, the video files of all the playing sub-paths together form the video content of the interactive video, and the playing nodes and the playing paths between them form the playing logic of the interactive video. The playing sub-paths subordinate to the same bifurcation node are parallel, corresponding to several parallel branch scenarios in the interactive video, and the user can choose one of them to independently select the scenario direction of the interactive video. For example, with the playing nodes shown in fig. 3, at node R the user may select the option corresponding to RA so that the interactive video plays the branch scenario corresponding to RA; at node A, the user may select the option corresponding to AE1 so that the interactive video plays the branch scenario corresponding to AE1.
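The playing logic formed by the nodes and sub-paths can be enumerated mechanically. The sketch below (function name and graph encoding are illustrative assumptions) lists every root-to-leaf playing path of the fig. 3 graph as a sequence of sub-path codes:

```python
def play_paths(node: str, children: dict) -> list:
    """Enumerate all root-to-leaf playing paths as lists of sub-path codes.

    `children` maps a node code to the codes of its child nodes (a
    hypothetical encoding of the fig. 3 node graph)."""
    kids = children.get(node, [])
    if not kids:          # end node: one path, with no further sub-paths
        return [[]]
    paths = []
    for child in kids:
        for rest in play_paths(child, children):
            # the sub-path between `node` and `child`, e.g. "SR" or "RA"
            paths.append([node + child] + rest)
    return paths

# fig. 3 graph: S -> R, R -> {A, B}, A -> {E1, E2}, B -> {E3, E4}
graph = {"S": ["R"], "R": ["A", "B"], "A": ["E1", "E2"], "B": ["E3", "E4"]}
paths = play_paths("S", graph)
# e.g. ["SR", "RA", "AE1"] corresponds to the playing path SR -> RA -> AE1
```

For the fig. 3 graph this yields the four playing paths the description later lists explicitly.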
History selection path: the playing path the user has selected so far while watching the interactive video. For example, at node R, if the user selects the option corresponding to RA, the user's history selection path is SR → RA. If the user has not yet selected a playing path, the history selection path is empty.
Current playing path: the playing sub-path being played at the current moment. For example, when the playing path of the interactive video is SR → RA and the video file corresponding to RA is currently being played, RA is the current playing path.
Default playing path: each bifurcation node corresponds to a default playing path. If the user does not select a playing path within a preset time period, the system automatically selects a preset playing path, namely the default playing path, for the user, until the user selects a playing path again. For example, in fig. 3, the default playing path of node R may be set to RA, and that of node B may be set to BE3. A path between two adjacent playing nodes on the default playing path is a default playing sub-path.
Target playing path: the next playing sub-path to be played. It can be selected by the user; if the user does not select one within the preset time, the default playing path is used as the target playing path. For example, at node R, if the user selects RB, then RB is the target playing path; if the user does not select one, the default playing sub-path RA of node R is used as the target playing path.
Playing time offset: the time offset of the current moment relative to the initial playing moment of the interactive video. For example, if the interactive video starts playing at 20:00:00 and the current moment is 20:10:00, the playing time offset is 10 minutes.
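The playing time offset is a simple clock difference. A minimal sketch, using the 20:00:00 / 20:10:00 example from the definition (the function name and the HH:MM:SS string format are assumptions; same-day times only):

```python
from datetime import datetime

def play_time_offset(start: str, now: str) -> float:
    """Offset in minutes of the current moment relative to the initial
    playing moment of the interactive video (times as "HH:MM:SS" strings,
    assumed to fall on the same day)."""
    fmt = "%H:%M:%S"
    delta = datetime.strptime(now, fmt) - datetime.strptime(start, fmt)
    return delta.total_seconds() / 60

# the example from the text: started at 20:00:00, now 20:10:00 -> 10 minutes
offset = play_time_offset("20:00:00", "20:10:00")
```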
The live broadcast platform of this embodiment can directly or indirectly receive instructions sent by operation and maintenance personnel or directors, so as to implement the processes of creating, editing, publishing and playing interactive videos.
The node editing module 201 may receive a node editing instruction sent by operation and maintenance personnel or a director. The node editing instruction may carry the playing time offset of a playing node, that is, the time difference between the moment the interactive video progresses to that playing node and the initial playing moment of the interactive video. For example, if the initial playing moment of the interactive video is 20:00:00 and the moment of progressing to the first playing node is 20:10:00, the playing time offset of the first playing node is 10 minutes. When the interactive video is played to a playing node, playback continues along one of that node's child nodes.
Further, the node editing instruction may also carry the node type of the playing node, including a bifurcation type, an aggregation type (also called a convergence type), and a common type. A bifurcation node is a playing node with multiple child nodes; an aggregation node is a playing node with multiple parent nodes; and a common node has at most one parent node and at most one child node, including nodes with only a parent node and no child nodes (end nodes) and nodes with only child nodes and no parent node (start nodes). For bifurcation-type nodes, the node editing instruction can also carry the number of branches of the playing node; each branch corresponds to one child node, and the branches of the same bifurcation node correspond in parallel to the scenario branches in the interactive video.
The playing nodes corresponding to an interactive video comprise at least one bifurcation node and the at least two branches of that bifurcation node; that is, the interactive video comprises at least 3 nodes (one bifurcation node and at least two child nodes). In practical applications, the interactive video may also include other types of nodes, for example aggregation nodes and/or common nodes; the playing nodes shown in fig. 3 include the common nodes S, E1, E2, E3 and E4 as well as the bifurcation nodes R, A and B.
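A node editing instruction carrying an offset, a type, and a branch count could be handled as sketched below. The field names (`code`, `type`, `play_time_offset`, `branch_count`) are illustrative assumptions, not the patent's actual instruction format:

```python
def create_play_node(instruction: dict) -> dict:
    """Minimal sketch of handling a node editing instruction.

    Field names are hypothetical; the patent only says the instruction
    carries the playing time offset, the node type, and (for bifurcation
    nodes) the number of branches."""
    node_type = instruction.get("type", "normal")
    node = {
        "code": instruction["code"],
        "type": node_type,
        # offset in minutes from the initial playing moment (script timeline)
        "offset_min": instruction["play_time_offset"],
    }
    if node_type == "fork":
        # each branch of a bifurcation node corresponds to one child node
        node["branches"] = instruction.get("branch_count", 2)
    return node

node_r = create_play_node({"code": "R", "type": "fork",
                           "play_time_offset": 10, "branch_count": 2})
```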
After the playing nodes are defined, the video editing module 202 may receive video files uploaded by operation and maintenance personnel or directors, and associate each video file with its corresponding playing sub-path. The video editing module 202 may obtain the URL (Uniform Resource Locator) of a video file entered by the operation and maintenance personnel or director, and obtain the corresponding video file by accessing the URL.
Fig. 4(a) and 4(b) are schematic diagrams of an interactive video editing interface for an interactive video containing only two playing paths, RS → SA → AM → MT and RS → SB → BM → MT. Fig. 4(a) is a schematic diagram of a node editing interface containing a node creation component; a playing node can be added by sending an instruction to the node creation component (for example, by clicking the component with the mouse). The created playing nodes may be displayed in order in a playing node list, which includes the sequence number, type, node code and playing time offset of each playing node. Each playing node can correspond to a node modification component and a node deletion component: the type, node code and playing time offset of the corresponding playing node can be modified by sending an instruction to the node modification component, and the corresponding playing node can be deleted by sending an instruction to the node deletion component.
Fig. 4(b) shows a schematic diagram of a video upload interface of one embodiment. For the nodes created during node editing, the path between adjacent nodes is a playing path, and a video file can be uploaded for each playing path. For example, a video file may be uploaded for the path between node S and node A in fig. 4(a), and the code of that video file may be written as 2_3, where 2 is the node sequence number of the head node on the path (i.e., node S) and 3 is the node sequence number of the tail node on the path (i.e., node A). The head node is the first node on a path, and the tail node is the last node on a path. For a bifurcation node, the code of the video file corresponding to each branch path may further include the number of that branch path; for example, 3_5_1 denotes the video file corresponding to the 1st branch path between the playing node with sequence number 3 and the playing node with sequence number 5. Each bifurcation node can also correspond to a branch-adding component for adding branch paths to the bifurcation node.
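The coding scheme for uploaded video files (2_3 for a plain path, 3_5_1 for the 1st branch path) reduces to a small formatting rule. A sketch, with a hypothetical function name:

```python
from typing import Optional

def video_file_code(head_seq: int, tail_seq: int,
                    branch: Optional[int] = None) -> str:
    """Code for the video file on the path from the node with sequence
    number `head_seq` to the node with sequence number `tail_seq`.
    For a bifurcation node's branch path, the branch number is appended,
    matching the "2_3" and "3_5_1" examples in the text."""
    code = f"{head_seq}_{tail_seq}"
    if branch is not None:
        code += f"_{branch}"
    return code
```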
In this embodiment, the organizational structure and playing paths of the video files in the interactive video are determined by defining the playing nodes. With the playing nodes shown in fig. 3, the interactive video has 4 playing paths: SR → RA → AE1, SR → RA → AE2, SR → RB → BE3 and SR → RB → BE4. In addition, setting the playing time offset of each playing node forms a virtual timeline (i.e., a script timeline). The significance of the script timeline is that it fixes the time offset between the playing moment of each video frame in the interactive video and the initial playing moment of the interactive video; this offset depends only on the video files in the interactive video and the playing path taken, not on whether the client's network conditions are good. That is, once the interactive video has been edited, which scenario is played at any given second of the interactive video is fixed.
If, due to network congestion or other reasons, the difference between the actual playing moment at a client and the playing time offset of the interactive video on the script timeline becomes large, the client's playing progress can be controlled by chasing frames or by adjusting the time reserved for the client to select a playing path, so as to reduce the client's playback delay. In this way, the playing progress of the interactive video is relatively uniform across clients, the clients can conveniently discuss the scenarios on a relatively uniform timeline, and the interactivity of the live broadcast is improved.
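The progress-control decision can be sketched as comparing a client's actual offset against the script timeline. The function names and the 2-second threshold are assumptions for illustration; the patent only requires that the difference stay below some preset value:

```python
def playback_drift(actual_offset_s: float, script_offset_s: float) -> float:
    """How far (in seconds) a client lags behind the script timeline."""
    return script_offset_s - actual_offset_s

def should_chase_frames(drift_s: float, threshold_s: float = 2.0) -> bool:
    # when the client lags the script timeline by more than the preset
    # value, chase frames (or shorten the reserved selection time) to
    # pull its playing progress back in line with the other clients
    return drift_s > threshold_s
```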
In one embodiment, the live broadcast platform further comprises a video management module 203, which is configured to receive the playing time information and playing session information of each interactive video and associate them with the corresponding interactive video. A schematic diagram of interactive video playing times and sessions of one embodiment of this specification is shown in fig. 5. In the figure, interactive video 1 is played at 13:00 and at 17:00, each session lasting 90 minutes, and the 13:00 session is the premiere (i.e., the first showing). Interactive video 2 is played at 15:00 and lasts 100 minutes. By scheduling several sessions of the same interactive video at different times, a user who arrives late and misses part of the plot can still watch it from the beginning in another session, which improves the user experience. For example, if a user enters the live broadcast room at 13:10, the user misses the first 10 minutes of the premiere of interactive video 1, but can enter the live broadcast room before 17:00 to re-watch the first 10 minutes of interactive video 1.
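The session schedule of fig. 5 amounts to a small program-guide table. A sketch using the figure's values (the dictionary field names are illustrative assumptions, not the module's data model):

```python
playlist = [
    # the program guide of fig. 5: the same interactive video can be
    # scheduled in several sessions so latecomers can rewatch from the start
    {"video": "interactive video 1", "start": "13:00",
     "duration_min": 90, "premiere": True},
    {"video": "interactive video 1", "start": "17:00",
     "duration_min": 90, "premiere": False},
    {"video": "interactive video 2", "start": "15:00",
     "duration_min": 100, "premiere": True},
]

def sessions_for(video: str, playlist: list) -> list:
    """All scheduled start times for one interactive video."""
    return [entry["start"] for entry in playlist if entry["video"] == video]
```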
In one embodiment, the live broadcast platform further includes a console 204, which is configured to respond to interactive video editing instructions received while the interactive video is playing, so as to insert a playing node into the interactive video, delete a playing node from the interactive video, or insert an advertisement into the interactive video.
In this embodiment, part of the control over the interactive video may be opened to operation and maintenance personnel or directors, who manually send instructions to edit the interactive video. They can send an interactive video editing instruction to the live broadcast platform, which may be a playing node insertion instruction, a playing node deletion instruction, an advertisement insertion instruction, or another type of instruction. After receiving the interactive video editing instruction, the live broadcast platform can accordingly insert a playing node, delete a playing node, or insert an advertisement into the interactive video on the fly.
In one embodiment, the console 204 is further configured to count the number of instruction operations of the client viewing the interactive video; and/or counting the number of clients on each play sub-path. In this embodiment, the console 204 may count the number of instruction operations executed by each client, where the instruction operations include an instruction operation for selecting a playback path, an instruction operation for enabling or disabling the barrage isolation function, an instruction operation for enabling or disabling the barrage anti-penetration function, and the like. The console 204 may also count the number of clients on each playing sub-path, that is, count how many clients are playing the video files corresponding to the path SR, how many clients are playing the video files corresponding to the path RA, and so on.
In one embodiment, the live platform further comprises an instruction controller 205; the instruction controller 205 is configured to obtain server time, and send an interactive video playing start instruction to the client through a long connection pre-established with the client if the server time reaches the playing time, where the interactive video playing start instruction is used for the client to access a video address of the interactive video, so as to start playing the video file.
The live platform may maintain a video playlist (i.e., a program guide) for recording the playing time information and the playing field information of each interactive video, and load the instruction controller 205. For a client already in the live broadcast room before the interactive video starts playing, the client may establish a long connection with the live broadcast platform when entering the live broadcast room, listen for messages on the long connection, and wait for a unified playing command. The instruction controller 205 runs in a loop: when it determines that the current server time has reached the playing time, it sends a playing command to each client on time, where the playing command includes the address (e.g., an m3u8 address) of the video file to be played, and the client starts playing the corresponding video file after accessing the address.
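As a non-limiting illustration, the scheduled dispatch performed by the instruction controller can be sketched as below. All names, the playlist structure, and the injected clock are hypothetical conveniences for illustration; a real implementation would push commands over actual long connections (e.g., WebSockets) rather than callbacks.

```python
import time


class InstructionController:
    """Minimal sketch of the instruction controller (205): when the server
    time reaches an item's scheduled play time, a play command carrying the
    video address (e.g., an m3u8 address) is pushed to every long-connected
    client. Structures are illustrative, not from the patent text."""

    def __init__(self, playlist, clock=time.time):
        # Program guide: items with a scheduled play_time (server timestamp).
        self.playlist = sorted(playlist, key=lambda item: item["play_time"])
        self.clock = clock        # injectable server clock, for testing
        self.connections = []     # stand-ins for long connections

    def register(self, send_fn):
        """A client entering the room registers its long connection."""
        self.connections.append(send_fn)

    def tick(self):
        """One iteration of the controller's loop; returns commands sent."""
        now = self.clock()
        sent = []
        while self.playlist and self.playlist[0]["play_time"] <= now:
            item = self.playlist.pop(0)
            command = {"type": "play", "url": item["url"]}
            for send in self.connections:
                send(command)     # broadcast the unified playing command
            sent.append(command)
        return sent
```

In this sketch, calling `tick()` before the scheduled time sends nothing; once the injected clock passes the play time, every registered connection receives the same command, which is what keeps clients starting in unison.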
In one embodiment, the instruction controller is further configured to respond to a short connection establishment request sent by the client to establish a short connection with the client, and receive a play request instruction sent by the client through the short connection, where the play request instruction is used for the client to request to start playing the interactive video.
For a client (a late-arriving user) that enters the live broadcast room after the interactive video has started playing, the client can send a short connection establishment request to the live broadcast platform after entering the room; the instruction controller 205 can then establish a short connection with the client and receive, through the short connection, a play request instruction sent by the client, so as to start playing the interactive video. In addition, if the client experiences a network disconnection or a similar situation during interactive video playback, it can actively send a play request instruction to the live broadcast platform through the short connection, so as to continue playing the interactive video.
In one embodiment, the live platform further comprises: a bullet screen processing module 206; the bullet screen processing module 206 is configured to obtain a bullet screen sent by the client, obtain a current playing path for playing the interactive video when the client sends the bullet screen, and forward the bullet screen to other clients on the current playing path.
In this embodiment, the bullet screen processing module 206 may forward the bullet screen sent by one client to each client with the same current playing path. For example, the current playing paths of the client 1 and the client 2 are both RA, and the client 1 sends the bullet screen "today is good weather", so that the bullet screen processing module 206 may forward the bullet screen "today is good weather" to the client 2 for display.
Further, if the current playing paths of the first client and the second client are different, the bullet screen processing module 206 may isolate the bullet screen from the second client when receiving the bullet screen sent by the first client, that is, the bullet screen processing module 206 performs filtering and shielding on the bullet screen sent by the first client, so as to prevent the bullet screen sent by the first client from being forwarded to the second client.
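The path-based forwarding and shielding described above can be sketched as follows. The mapping from client to current playing path is a hypothetical structure chosen for illustration:

```python
def forward_targets(sender_id, current_paths):
    """Sketch of the bullet screen processing module (206): a barrage sent by
    one client is forwarded only to the other clients whose current playing
    path equals the sender's; clients on different paths are shielded.
    `current_paths` maps client id -> current playing path (illustrative)."""
    sender_path = current_paths[sender_id]
    return sorted(cid for cid, path in current_paths.items()
                  if cid != sender_id and path == sender_path)
```

For the example in the text, with clients 1 and 2 both on path RA and client 3 on RB, a barrage from client 1 is delivered only to client 2.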
Fig. 6 and 7 are schematic diagrams of barrages displayed on the client according to an embodiment. Suppose that user 1 on the play path RA sends the barrage "today is really good weather", user 2 on the play path RB sends the barrages "great family!" and "fast rendezvous", and user 3 on the play path RB sends the barrage "who is still watching?". The barrages displayed on the play interface of user 1 shield the barrages sent by users 2 and 3, as shown in fig. 6; the barrages displayed on the play interfaces of users 2 and 3 shield the barrage sent by user 1, as shown in fig. 7.
According to the bullet screen isolation method, the bullet screens sent by the clients under different playing paths are shielded, so that a user can only see the bullet screens sent by the clients under the same playing path, the number of the bullet screens is reduced, and the video playing is prevented from being interfered due to too many bullet screens; on the other hand, as the clients watching different plot branches of the interactive video in different playing paths are isolated by the barrage, the clients watching the same plot branches in the same playing path can conveniently communicate through the barrage, the interactivity in the interactive video playing process is improved, and the user experience is improved.
In one embodiment, the bullet screen processing module 206 is further configured to: acquire a second time offset of the moment at which the second client sends the barrage relative to the initial playing moment of the interactive video, and shield the barrage from any first client that is currently playing the interactive video and whose first time offset is smaller than the second time offset.
When the second client sends the barrage, the second client associates the sending moment with the barrage and then sends the barrage to the server. After receiving the barrage, the server can parse the time associated with the barrage to obtain the second time offset. The second time offset represents the point (in seconds) to which the interactive video has been played at the moment the second client sends the barrage.
In addition, the server may further obtain a first time offset of the interactive video currently played by the first client in the same live broadcast room, where the first time offset represents the point (in seconds) to which the first client has currently played the interactive video.
If the second time offset is larger than the first time offset, the playing progress of the interactive video on the second client is ahead of that on the first client. For example, if the first time offset is 00:20:00 and the second time offset is 00:30:00, the first client has played only to 00:20:00 while the second client has already played to 00:30:00. That is, the second client is 10 minutes ahead of the playing progress of the first client.
In the above case, since the second client has already played past the content currently playing on the first client, a barrage sent by the second client is likely to spoil the plot for the user of the first client. To prevent this, the barrage sent by the second client may be isolated at the first client, that is, the server intercepts the barrage sent by the second client and prevents it from being displayed on the first client.
Suppose that user 1 sends the barrage "today is really good weather" through client 1, user 2 sends the barrages "great family!" and "fast rendezvous" through client 2, and user 3 sends the barrage "who is still watching?", and that user 1 has currently played to 00:10:00, user 2 to 00:20:00, and user 3 to 00:30:00. The barrages displayed on the play interface of user 1 isolate the barrages sent by users 2 and 3, as shown in fig. 6; the barrages displayed on the play interface of user 2 only isolate the barrage sent by user 3, as shown in fig. 7.
According to the scheme of this embodiment, by obtaining the second time offset of the moment when the second client sends the barrage relative to the initial playing moment of the interactive video, and by obtaining the first time offset of the interactive video currently played by the first client, the barrage sent by the second client is isolated at any first client whose first time offset is smaller than the second time offset. This prevents a barrage sent by a second client with faster playing progress from spoiling the plot for the user of a first client with slower playing progress, and improves the experience of watching the interactive video.
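The spoiler-prevention rule above can be sketched as a simple filter: a barrage tagged with the sender's offset is delivered only to viewers whose own offset has already reached that point (viewers who are behind are shielded, consistent with the described isolation). The per-client offset mapping is an illustrative structure:

```python
def barrage_recipients(sender_offset, viewer_offsets):
    """Sketch of spoiler prevention: a barrage sent at `sender_offset`
    seconds into the interactive video is shown only to clients whose own
    playback offset is at least that far along; clients with a smaller
    offset have the barrage intercepted at the server. Offsets are seconds
    from the initial playing moment (illustrative representation)."""
    return sorted(cid for cid, offset in viewer_offsets.items()
                  if offset >= sender_offset)
```

For the example in the text (users 1, 2, 3 at 00:10:00, 00:20:00, 00:30:00), a barrage sent by user 3 reaches only user 3's own interface, while a barrage sent by user 1 reaches everyone.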
In one embodiment, the live platform further comprises a play module 207; the playing module 207 is configured to receive a playing path selection instruction of the client, determine a target playing sub-path of the interactive video according to the playing path selection instruction, and send a URL of a video file associated with the target playing sub-path to the client.
Assuming that the playing node R comprises a playing sub-path RA and a playing sub-path RB, and the target playing sub-path selected by the playing path selection instruction is RB, the URL of the video file associated with RB is sent to the client.
Further, if no playing path selection instruction is received from the client within a preset time period, the URL of the video file associated with the target default playing sub-path is sent to the client. For example, if the playing node R comprises a playing sub-path RA and a playing sub-path RB, where RA is the target default playing sub-path of node R on the default playing path, and no playing path selection instruction sent by the client is received within the preset time period, the URL of the video file associated with RA is sent to the client.
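The selection-with-default-fallback behavior can be sketched as below; the node dictionary and URL values are hypothetical stand-ins for the platform's actual data:

```python
def resolve_sub_path(node, selection=None):
    """Sketch of the playing module (207) resolving a bifurcation node: if a
    valid selection arrived within the window, its sub-path's video URL is
    returned; on timeout (or an unknown selection) the node's default
    sub-path is used. `node` structure is illustrative."""
    sub_paths = node["sub_paths"]          # e.g. {"RA": url, "RB": url}
    chosen = selection if selection in sub_paths else node["default"]
    return chosen, sub_paths[chosen]
```

So for node R with default RA, an explicit choice of RB yields RB's URL, while a timeout (no selection) falls back to RA's URL.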
In one embodiment, the playing module 207 is further configured to: send the current script time of the interactive video to the client, where the current script time is used by the client for comparison with the actual playing time of the interactive video currently being played, and for performing frame chasing when the difference between the current script time and the actual playing time is greater than a preset time threshold; the current script time is the time offset of the playing time of the current video frame in the interactive video relative to the initial playing time of the interactive video.
The script time is the time on the script time axis. As mentioned above, the script time axis determines the time offset between the playing time of each video frame in the interactive video and the initial playing time of the interactive video, and the script time is equivalent to the playing time of the client in an ideal state, that is, with no network delay and no stutter during playing. In practice, however, due to network latency and stutter, the actual playing time is generally later than the script time, and the actual playing times of different clients generally differ.
For example, according to the set script time, the interactive video should start playing at 18:00:00, progress to the bifurcation node S at 18:01:00, progress to the bifurcation node B at 18:04:00 via the playing path SB, and progress to the bifurcation node A at 18:06:00 via the playing path SA.
The network condition of client 1 is average, with occasional stutter: the client resumes playing at 18:00:08 after an 8-second stutter, receives the bifurcation message of playing node S sent by the live platform at 18:01:00, and displays the selection component for choosing the subsequent playing path of node S on its playing interface at 18:01:08. It then receives the bifurcation message of playing node B at 18:04:00 and displays the corresponding selection component at 18:04:09, and receives the bifurcation message of playing node A at 18:06:00 and displays the corresponding selection component at 18:06:10.
The network condition of client 2 is very poor and requires frequent buffering: the client resumes playing at 18:00:20 after a 20-second stutter (assuming the preset time threshold is 10 seconds) and receives the bifurcation message of playing node S sent by the live platform at 18:01:00. To reduce the playing delay, frame chasing can be performed after the bifurcation message of playing node S is received, so that the selection component for choosing the subsequent playing path of node S is already displayed on the client's playing interface at 18:01:10 (at this point only 10 seconds of playing delay remain thanks to frame chasing). The client then receives the bifurcation message of playing node B at 18:04:00 and displays the corresponding selection component at 18:04:05, and receives the bifurcation message of playing node A at 18:06:00 and displays the corresponding selection component at 18:06:07.
Frame chasing may be active frame chasing performed by the client after receiving a node message. Specifically, the client may obtain the current script time t1 and its own current playing time t2; if the difference between the two is greater than a certain threshold (i.e., the tolerance duration), the client player immediately continues playing from the time point t1. If the lag is less than the tolerance duration, no frame chasing is performed.
Frame chasing may also be passive: for example, the live broadcast platform sends a heartbeat packet to the client at regular intervals (e.g., every 3 seconds), and the client checks whether the tolerance duration has been exceeded each time it receives a heartbeat packet; if so, frame chasing is performed, otherwise it is not.
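Both the active and the passive variant reduce to the same check, sketched below; the tolerance value and return convention are illustrative choices, not part of the claimed method:

```python
def chase_target(script_time, play_time, tolerance=10.0):
    """Sketch of the frame-chasing check: compare the current script time t1
    (ideal offset from the start of the interactive video) with the client's
    actual playing time t2. If the client lags by more than the tolerance
    duration, return the seek target (t1) so the player jumps there;
    otherwise return None and keep playing normally."""
    lag = script_time - play_time
    return script_time if lag > tolerance else None
```

A client 20 seconds behind with a 10-second tolerance would seek straight to the script time; one only 5 seconds behind would not.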
In one embodiment, the playing module 207 is further configured to: acquire a bullet screen white list of a first client, wherein the bullet screen white list is used for storing a list of preset playing paths, and the preset playing paths satisfy: barrages sent by a second client on a preset playing path are allowed to be displayed on the first client; when a barrage sent by a second client is received, judge whether the current playing path of the second client is in the bullet screen white list; and if not, isolate the barrage sent by the second client at the first client.
Further, if the second current playing path is in the bullet screen white list, the bullet screen sent by the second client is displayed at the first client.
In this embodiment, the first client may further display barrages sent by a second client whose current playing path partially differs from its own; the current playing paths (i.e., the preset playing paths) of second clients whose barrages are allowed to be displayed on the first client are stored in the bullet screen white list in advance. A preset playing path may be a playing sub-path belonging to the same bifurcation node as the current playing path of the first client, or a playing sub-path under another bifurcation node. The preset playing paths may be customized by the user, or default preset playing paths may be adopted. A default preset playing path may be a playing sub-path subordinate to the same bifurcation node as the current playing path of the first client.
The first client can start the bullet screen white list function, and after the bullet screen white list function is started, if the first client does not edit the preset playing path in the bullet screen white list, the default preset playing path is adopted; and if the first client edits the preset playing path in the bullet screen white list, adopting the edited preset playing path.
The scheme of this embodiment enables the user to autonomously select the bullet screen source for viewing, for example, when the user wants to view comments of other users on different playing sub-paths corresponding to the same cross node, each playing sub-path corresponding to the same cross node may be added to the bullet screen white list. The bullet screen isolation mode is more flexible, and the user experience is further improved.
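The whitelist check described above can be sketched as below; representing the whitelist as a set of path names is an illustrative choice:

```python
def whitelist_allows(own_path, whitelist, sender_path):
    """Sketch of the bullet screen white list: a barrage from a second
    client is displayed on the first client if the sender's current playing
    path is on the first client's whitelist of preset playing paths.
    Barrages from clients on the first client's own path are always shown
    (the basic same-path rule). All names are illustrative."""
    return sender_path == own_path or sender_path in whitelist
```

For example, a user on RA who adds the sibling sub-path RB to the whitelist sees barrages from both RA and RB, but still not from unrelated paths.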
In one embodiment, the playing module 207 is further configured to: store the bullet screen scope of each playing node, and when the current playing path of the first client is within the bullet screen scope, shield on the first client the barrages sent by any second client whose current playing path is not within the bullet screen scope.
In this embodiment, each node corresponds to a bullet screen scope, and the bullet screen scope is used to store a list of a group of play paths, where the play paths in the list satisfy: and the bullet screens sent by the clients on the playing path can be displayed on other clients on the playing path.
Taking the playing node shown in fig. 3 as an example, assuming that the bullet screen scope of the playing node R is RA, the current playing paths of the client 1 and the client 2 are both RA, and the current playing path of the client 3 is RB, then, because the current playing paths of the client 1 and the client 2 are both in the bullet screen scope, the bullet screens sent by the client 1 and the client 2 can be displayed on the client 1, and similarly, the bullet screens sent by the client 1 and the client 2 can also be displayed on the client 2. Since the client 3 is outside the bullet screen scope, only the bullet screen transmitted by the client itself is displayed on the client 3, and the bullet screens transmitted by the clients 1 and 2 are not displayed.
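The per-node scope rule can be sketched as a symmetric membership test; a client's own barrages remain visible to itself regardless of scope, as in the example with client 3. The set representation is illustrative:

```python
def scope_allows(scope_paths, viewer_path, sender_path, same_client=False):
    """Sketch of a playing node's bullet screen scope: a barrage propagates
    from sender to viewer only when both of their current playing paths are
    inside the node's scope list; a client always sees its own barrages.
    `scope_paths` is an illustrative set of playing-path names."""
    if same_client:
        return True
    return viewer_path in scope_paths and sender_path in scope_paths
```

With scope {"RA"}, clients 1 and 2 (both on RA) see each other's barrages, while client 3 on RB neither receives theirs nor shares its own beyond itself.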
In one embodiment, the playing module 207 is further configured to: update the playing path of the client. Specifically, for the default playing path, the default playing path may be updated according to the currently played default playing sub-path. Assuming that the original default playing path is SR → RA, when the scenario advances to the default playing sub-path AE1 following RA, that is, the video file corresponding to RA has finished playing and the video file corresponding to AE1 is currently playing, the default playing path may be updated to SR → RA → AE1. In this embodiment, by updating the default playing path, the currently played path is continuously recorded while the user does not select a playing path, so that the corresponding content of the interactive video can be played directly from the point after the recorded path when the user next enters the live broadcast room.
In one embodiment, a target playing path selected by the client may also be obtained, and after the video file corresponding to the target playing path starts playing, the playing path of the client is updated according to the target playing path. Assuming that the original history selection path is SR → RB, when the scenario progresses to node B and the user selects the playing sub-path BE4, after the video file corresponding to BE4 starts to be played, the playing path is updated to SR → RB → BE4. In this embodiment, by updating the playing path, the currently played path is continuously recorded while the user watches the interactive video, so that the subsequent content of the interactive video can be played on the basis of the recorded path when the user next enters the live broadcast room.
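Whether the new sub-path comes from the default path or from a user selection, the bookkeeping is the same append operation, sketched below with an illustrative list-of-names representation:

```python
def update_play_path(history, played_sub_path):
    """Sketch of play-path bookkeeping (module 207): once the video file for
    a sub-path has begun playing, append that sub-path to the client's
    recorded path, so playback can resume from after the recorded path on
    the next visit. Duplicate updates for the same sub-path are ignored."""
    if history and history[-1] == played_sub_path:
        return history            # already recorded, nothing to do
    return history + [played_sub_path]
```

This reproduces both examples in the text: SR → RA becomes SR → RA → AE1, and SR → RB becomes SR → RB → BE4.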
In one embodiment, the playing module 207 is further configured to: and after the playing path is updated, updating the bullet screen displayed on the client. Since the bullet screen scope of each play node may be different, after the play path is updated, the bullet screen displayed on the client needs to be updated at the same time. Specifically, a target barrage scope of a latest playing node on an updated playing path may be obtained, and when a current playing path of a first client is within the target barrage scope, a barrage sent by a second client whose current playing path is within the target barrage scope is displayed on the first client, and a barrage sent by a second client whose current playing path is not within the target barrage scope is shielded.
Fig. 8 is an overall architecture diagram of a live platform according to another embodiment of the present specification. Fig. 9 is a partial functional architecture diagram of the live platform shown in fig. 8. As shown in fig. 9, an operations specialist or director can perform video editing and node editing on the live platform, and can then upload the edited interactive video to the live platform. If there are multiple interactive videos, they can be managed together, and the playing time and playing field of each video can be managed through program list management. In addition, the number of instruction operations of clients watching the interactive video can be counted, and/or the number of clients on each playing sub-path can be counted.
Fig. 10 is a schematic diagram of an interactive video creation process of the live platform shown in fig. 8. As shown in the figure, in the process of creating an interactive video, playing nodes are created according to a received node creation instruction, and the time offset of each playing node is determined; together, the time offsets of the playing nodes constitute the script axis. Then, video files are uploaded for each playing path to fill the plot time between the playing nodes.
Fig. 11 is a schematic diagram of an interactive video playback process according to an embodiment of the present specification. After entering the live broadcast room, the client can establish a long connection with the live broadcast platform, listen for a playing command on the long connection, and wait for interactive video playback. If a playing command is received, the interactive video of the current field is played. After a new instruction (e.g., a fork instruction) is received, the current playing time (video_time) at which the client is playing the interactive video is acquired, along with the current script time (script_time). If the current playing time is greater than or equal to the current script time, the type of the instruction is judged. For a fork instruction, a selection UI pops up on the client so that the user can select a playing path, or the default option is automatically selected when the selection times out; for a merge instruction, the next video is played directly; for an end instruction, playback is ended.
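The client-side dispatch just described can be sketched as below; the instruction types follow the text, while the action names returned are hypothetical labels for illustration:

```python
def handle_instruction(instruction, video_time, script_time):
    """Sketch of the client playback loop in fig. 11: an instruction is
    acted on only once the current playing time has reached the current
    script time; fork instructions pop up the path-selection UI, merge
    instructions advance to the next video, end instructions stop playback.
    Return values are illustrative action names."""
    if video_time < script_time:
        return "wait"                 # node not yet reached in playback
    kind = instruction["type"]
    if kind == "fork":
        return "show_selection_ui"    # user picks a path, default on timeout
    if kind == "merge":
        return "play_next_video"
    if kind == "end":
        return "stop"
    raise ValueError(f"unknown instruction type: {kind}")
```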
Fig. 12 is an overall functional architecture diagram of a live platform of an embodiment of the present specification. In addition to the functions in the foregoing embodiments, the live broadcast platform of this embodiment may also be compatible with the functions of an existing live broadcast platform, for example, sending gifts in a live broadcast room, sending host barrages, playing advertisements in the live broadcast room, and paying with virtual currency. In addition, animation effects can be shown when playback progresses to a bifurcation node or an aggregation node, for example, a thinking countdown effect when the user selects a playing path, or a montage transition effect when switching to play video files on different playing sub-paths.
Fig. 13 is a flowchart of an interactive video processing method according to an embodiment of the present specification, where the method is based on the live broadcast platform according to any embodiment, and the method includes:
step S1301: receiving a node editing instruction, creating playing nodes according to the node editing instruction and setting playing time offsets corresponding to the playing nodes; the playing node comprises at least one bifurcation node and at least two sub-nodes of the bifurcation node;
step S1302: receiving video files, and respectively associating each video file with a corresponding playing sub-path to generate an interactive video; the path between adjacent playing nodes is a playing sub-path, each playing sub-path of the same bifurcation node corresponds to a scenario branch in the interactive video, and the playing time offset is used for controlling the playing progress of each client playing the interactive video in the live broadcast room, so that the playing time difference of each client playing the interactive video is smaller than a preset value.
Other embodiments of the above method are detailed in other embodiments of the live broadcast platform, and are not described herein again.
The various technical features in the above embodiments can be arbitrarily combined, so long as there is no conflict or contradiction between the combinations of the features, but the combination is limited by the space and is not described one by one, and therefore, any combination of the various technical features in the above embodiments also falls within the scope disclosed in the present specification.
Embodiments of the present description may take the form of a computer program product embodied on one or more storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having program code embodied therein. Computer-usable storage media include permanent and non-permanent, removable and non-removable media, and information storage may be implemented by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of the storage medium of the computer include, but are not limited to: phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technologies, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium, may be used to store information that may be accessed by a computing device.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
The above description is only exemplary of the present disclosure and should not be taken as limiting the disclosure, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present disclosure should be included in the scope of the present disclosure.
Claims (26)
1. A live platform, comprising:
the node editing module and the video editing module;
the node editing module is used for receiving a node editing instruction, creating playing nodes according to the node editing instruction and setting playing time offsets corresponding to the playing nodes; the playing node comprises at least one bifurcation node and at least two sub-nodes of the bifurcation node;
the video editing module is used for receiving video files and respectively associating each video file with a corresponding playing sub-path so as to generate an interactive video; the path between adjacent playing nodes is a playing sub-path, each playing sub-path of the same bifurcation node corresponds to a scenario branch in the interactive video, and the playing time offset is used for controlling the playing progress of each client playing the interactive video in a live broadcast room, so that the playing time difference of each client playing the interactive video is smaller than a preset value;
the live broadcast platform further comprises: a video management module; the video management module is used for receiving the playing time information and the playing field information of each interactive video and respectively associating the playing time information and the playing field information with the corresponding interactive video; the play time information includes information indicating a start play time of the interactive video.
2. The live platform of claim 1, further comprising:
a control console;
and the control console is used for responding to the received interactive video editing instruction in the playing process of the interactive video so as to insert playing nodes in the interactive video, delete the playing nodes in the interactive video and insert advertisements in the interactive video.
3. The live platform of claim 2, wherein the console is further configured to:
counting the instruction operation times of the client watching the interactive video; and/or
counting the number of clients on each playing sub-path.
4. The live platform of claim 1, further comprising:
an instruction controller;
the instruction controller is used for acquiring server time, and if the server time reaches the playing time, sending an interactive video playing starting instruction to the client through a long connection pre-established with the client, wherein the interactive video playing starting instruction is used for the client to access a video address of the interactive video so as to start playing the video file.
5. The live platform of claim 1, further comprising: an instruction controller; the instruction controller is configured to:
responding to a short connection establishment request sent by the client to establish short connection with the client, and receiving a playing request instruction sent by the client through the short connection, wherein the playing request instruction is used for the client to request to start playing the interactive video.
6. The live platform of claim 1, further comprising:
a bullet screen processing module;
the bullet screen processing module is used for acquiring the bullet screen sent by the client, acquiring the current playing path of the interactive video when the client sends the bullet screen, and forwarding the bullet screen to other clients on the current playing path.
7. The live platform of claim 6, wherein the barrage processing module is further configured to:
acquiring a second time offset of the moment at which a second client sends the barrage relative to the start playing time of the interactive video, and forwarding the barrage to a first client that is currently playing the interactive video and whose first time offset is smaller than the second time offset.
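The offset filter of claim 7 can be sketched as a comparison over per-client playback offsets; the client ids and offset table below are illustrative.

```python
def barrage_recipients(sender_offset, client_offsets):
    """Select clients eligible to receive a barrage under the claim-7
    filter: those currently playing whose time offset from the video's
    start is smaller than the sender's offset.

    `client_offsets` maps client id -> current time offset in seconds.
    """
    return sorted(c for c, off in client_offsets.items() if off < sender_offset)

offsets = {"u1": 30.0, "u2": 95.0, "u3": 60.0}
# Barrage sent at offset 90.0: only clients behind that point receive it.
print(barrage_recipients(90.0, offsets))
```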
8. The live platform of claim 1, further comprising:
a playing module;
the playing module is used for receiving a playing path selection instruction of the client, determining a target playing sub-path of the interactive video according to the playing path selection instruction, and sending the URL of the video file associated with the target playing sub-path to the client.
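The selection-to-URL resolution in claim 8 is essentially a table lookup from the play-path selection instruction to the sub-path and its associated video file. The selection keys, sub-path labels, and URLs below are hypothetical.

```python
def resolve_subpath(selection, subpath_table):
    """Map a play-path selection instruction to the target playing
    sub-path and the URL of its associated video file.

    `subpath_table` maps selection id -> (sub-path label, video URL);
    both the table and its contents are illustrative.
    """
    return subpath_table[selection]

subpath_table = {
    "choice_a": ("fork1->ending_a", "https://cdn.example/ending_a.mp4"),
    "choice_b": ("fork1->ending_b", "https://cdn.example/ending_b.mp4"),
}
sub_path, url = resolve_subpath("choice_b", subpath_table)
print(sub_path, url)
```

The server would then record the client as being on `sub_path` (feeding the per-sub-path statistics of claim 3) and return `url` to the client.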
9. The live platform of claim 1, further comprising: a playing module; the playing module is used for:
sending the current script time of the interactive video to the client, wherein the current script time is used by the client for comparison with the actual playing time of the interactive video currently being played, and the client chases frames with the server when the difference between the current script time and the actual playing time is greater than a preset time threshold; wherein the current script time is the time offset of the playing time of the current video frame in the interactive video relative to the start playing time of the interactive video.
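The drift check of claim 9 is a simple threshold comparison on the client side; the function and argument names here are illustrative.

```python
def needs_frame_chase(script_time, actual_time, threshold):
    """True when the client's actual playing time has drifted from the
    server's script time by more than the preset threshold, meaning the
    client should chase frames to resynchronize.

    All times are offsets in seconds from the video's start.
    """
    return abs(script_time - actual_time) > threshold

print(needs_frame_chase(120.0, 116.5, 3.0))  # 3.5 s drift exceeds 3 s
print(needs_frame_chase(120.0, 118.5, 3.0))  # 1.5 s drift is tolerated
```

Using an absolute difference covers both a lagging client (buffering) and one running ahead (clock skew); the claim only states a difference threshold, so this symmetry is an assumption.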
10. The live platform of claim 1, further comprising: a playing module; the playing module is used for:
acquiring a barrage whitelist of a first client, wherein the barrage whitelist stores a list of preset playing paths; the preset playing paths satisfy the condition that a barrage sent by a second client on a preset playing path is allowed to be displayed on the first client;
when a barrage sent by the second client is received, judging whether the current playing path of the second client is in the barrage whitelist; and
if not, isolating the barrage sent by the second client from the first client.
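The whitelist check of claim 10 can be sketched as a set-membership test; the path labels are illustrative.

```python
def isolate_barrage(viewer_whitelist, sender_path):
    """True when a barrage should be isolated (hidden) on the viewer's
    client, i.e. the sender's current playing path is not among the
    viewer's whitelisted preset playing paths.
    """
    return sender_path not in viewer_whitelist

whitelist = {"fork1->ending_a"}  # paths whose barrages this viewer may see
print(isolate_barrage(whitelist, "fork1->ending_b"))  # different branch
print(isolate_barrage(whitelist, "fork1->ending_a"))  # same branch
```

A per-viewer whitelist makes the isolation policy configurable per client, whereas the barrage scope of claim 11 attaches the policy to the playing node itself.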
11. The live platform of claim 1, further comprising: a playing module; the playing module is used for:
storing a barrage scope for each playing node, and, when the current playing path of a first client is within the barrage scope, shielding on the first client the barrages sent by any second client whose current playing path is not within the barrage scope.
12. The live platform of claim 1, further comprising: a playing module; the playing module is used for:
updating the playing path of the client.
13. The live platform of claim 12, wherein the playback module is further configured to:
after the playing path is updated, updating the barrages displayed on the client.
14. An interactive video processing method based on the live platform of any one of claims 1 to 13, the method comprising:
receiving a node editing instruction, creating playing nodes according to the node editing instruction, and setting a playing time offset for each playing node; the playing nodes comprise at least one bifurcation node and at least two sub-nodes of the bifurcation node;
receiving video files, and associating each video file with a corresponding playing sub-path to generate an interactive video; wherein a path between adjacent playing nodes is a playing sub-path, each playing sub-path of a same bifurcation node corresponds to one scenario branch of the interactive video, and the playing time offsets are used for controlling the playing progress of each client playing the interactive video in a live broadcast room, so that the difference between the playing times of the clients playing the interactive video is smaller than a preset value;
the method further comprises the following steps:
receiving the playing time information and the playing field information of each interactive video, and respectively associating the playing time information and the playing field information with the corresponding interactive video; the play time information includes information indicating a start play time of the interactive video.
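The node structure described in claim 14 can be sketched as a small tree in which a bifurcation node carries a time offset and at least two children, and each (fork, child) edge is one playing sub-path that a video file is associated with. The node names and offsets below are illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class PlayNode:
    name: str
    time_offset: float  # seconds from the interactive video's start time
    children: list = field(default_factory=list)

def add_branch(fork, name, time_offset):
    """Attach a sub-node to a bifurcation node; the (fork, child) edge
    is one playing sub-path, i.e. one scenario branch of the video."""
    child = PlayNode(name, time_offset)
    fork.children.append(child)
    return child

# A bifurcation node at 60 s with two scenario branches.
fork = PlayNode("fork1", 60.0)
add_branch(fork, "ending_a", 120.0)
add_branch(fork, "ending_b", 150.0)
print([c.name for c in fork.children])
```

The per-node time offsets are what lets the platform keep every client in the live broadcast room within the preset playing-time difference, regardless of which branch each client chose.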
15. The method of claim 14, further comprising:
in the playing process of the interactive video, responding to a received interactive video editing instruction so as to insert playing nodes into the interactive video, delete playing nodes from the interactive video, and insert advertisements into the interactive video.
16. The method of claim 15, further comprising:
counting the number of instruction operations performed by clients watching the interactive video; and/or
counting the number of clients on each playing sub-path.
17. The method of claim 14, further comprising:
acquiring the server time, and if the server time reaches the playing time, sending an interactive video play start instruction to the client through a long connection pre-established with the client, wherein the play start instruction causes the client to access the video address of the interactive video so as to start playing the video file.
18. The method of claim 14, further comprising:
establishing a short connection with the client in response to a short connection establishment request sent by the client, and receiving, through the short connection, a play request instruction by which the client requests to start playing the interactive video.
19. The method of claim 14, further comprising:
acquiring a barrage sent by a client, acquiring the current playing path on which the client is playing the interactive video when it sends the barrage, and forwarding the barrage to the other clients on the current playing path.
20. The method of claim 19, further comprising:
acquiring a second time offset of the moment at which a second client sends the barrage relative to the start playing time of the interactive video, and forwarding the barrage to a first client that is currently playing the interactive video and whose first time offset is smaller than the second time offset.
21. The method of claim 14, further comprising:
receiving a playing path selection instruction of the client, determining a target playing sub-path of the interactive video according to the playing path selection instruction, and sending the URL of the video file associated with the target playing sub-path to the client.
22. The method of claim 14, further comprising:
sending the current script time of the interactive video to the client, wherein the current script time is used by the client for comparison with the actual playing time of the interactive video currently being played, and the client chases frames with the server when the difference between the current script time and the actual playing time is greater than a preset time threshold; wherein the current script time is the time offset of the playing time of the current video frame in the interactive video relative to the start playing time of the interactive video.
23. The method of claim 14, further comprising:
acquiring a barrage whitelist of a first client, wherein the barrage whitelist stores a list of preset playing paths; the preset playing paths satisfy the condition that a barrage sent by a second client on a preset playing path is allowed to be displayed on the first client;
when a barrage sent by the second client is received, judging whether the current playing path of the second client is in the barrage whitelist; and
if not, isolating the barrage sent by the second client from the first client.
24. The method of claim 14, further comprising:
storing a barrage scope for each playing node, and, when the current playing path of a first client is within the barrage scope, shielding on the first client the barrages sent by any second client whose current playing path is not within the barrage scope.
25. The method of claim 14, further comprising:
updating the playing path of the client.
26. The method of claim 25, further comprising:
after the playing path is updated, updating the barrages displayed on the client.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910907638.6A CN112637612B (en) | 2019-09-24 | 2019-09-24 | Live broadcast platform and interactive video processing method thereof |
US17/762,282 US20220417619A1 (en) | 2019-09-24 | 2020-09-22 | Processing and playing control over interactive video |
PCT/CN2020/116677 WO2021057693A1 (en) | 2019-09-24 | 2020-09-22 | Processing of and playing control over interactive video |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112637612A CN112637612A (en) | 2021-04-09 |
CN112637612B true CN112637612B (en) | 2021-11-23 |
Family
ID=75282865
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201910907638.6A Active CN112637612B (en) | 2019-09-24 | 2019-09-24 | Live broadcast platform and interactive video processing method thereof |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112637612B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115209175B (en) * | 2022-07-18 | 2023-10-24 | 深圳蓝色鲨鱼科技有限公司 | Voice transmission method and system |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105898394A (en) * | 2016-05-25 | 2016-08-24 | 腾讯科技(深圳)有限公司 | Multimedia playing method and related device |
CN106028065A (en) * | 2016-06-22 | 2016-10-12 | 东方有线网络有限公司 | Method for realizing interactive television watching effect enhancing system based on metadata control |
CN106385594A (en) * | 2016-09-18 | 2017-02-08 | 深圳市青柠互动科技开发有限公司 | Method for optimizing video live broadcast services |
CN108401179A (en) * | 2018-04-02 | 2018-08-14 | 广州荔支网络技术有限公司 | A kind of animation playing method, device and mobile terminal based on virtual objects |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090138906A1 (en) * | 2007-08-24 | 2009-05-28 | Eide Kurt S | Enhanced interactive video system and method |
AU2015330646A1 (en) * | 2014-10-10 | 2017-06-01 | Livebarn Inc. | System and method for optical player tracking in sports venues |
- 2019-09-24: Application CN201910907638.6A filed; granted as patent CN112637612B (status: active)
Non-Patent Citations (3)
Title |
---|
[Interactive Video] Feature Launched! A Step-by-step Guide to Submitting; Bilibili Creation Center; 《https://m.bilibili.com/video/BV1n4411F7tm》; 2019-07-08; pp. 1-5 *
A Preliminary Attempt at Interactive Video and an Exploration of Its Future Prospects; Ge Haojun; 《New Media Research》; 2019-09-03 (No. 15); pp. 59-60 *
Also Published As
Publication number | Publication date |
---|---|
CN112637612A (en) | 2021-04-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8532464B2 (en) | Methods and systems for use in controlling playback of content in relation to recorded content | |
US8839290B2 (en) | Methods and systems for generating a personalized version of a media content program for a user | |
CN107105351B (en) | Regenerating unit | |
CN107920258B (en) | Data processing method and device | |
KR102355752B1 (en) | Device and method for playing an interactive audiovisual movie | |
CN112637612B (en) | Live broadcast platform and interactive video processing method thereof | |
WO2021057693A1 (en) | Processing of and playing control over interactive video | |
CN112637690B (en) | Interactive video production method and device, and server | |
US20230291943A1 (en) | Systems and methods for providing media content for continous watching | |
EP2989633B1 (en) | Method for the reproduction of a film | |
WO2020185616A1 (en) | Systems and methods for providing media content for continuous watching | |
US20120170907A1 (en) | System and method for streaming content to blu-ray devices | |
US20230336809A1 (en) | Audio transitions when streaming audiovisual media titles | |
CN112637691B (en) | Bullet screen isolation method, device and system | |
CN112637611B (en) | Interactive video playing method, device and system | |
US20140037267A1 (en) | Methods and apparatuses for reproducing and recording discless application and information storage medium for recording the discless application | |
CN112637689B (en) | Bullet screen processing method, device and system | |
CN112637657A (en) | Interactive video playing control method, device and system | |
US20200288196A1 (en) | Systems and methods for providing media content for continuous watching | |
CN104967883A (en) | Playlist editing method and device | |
EA042304B1 (en) | DEVICE AND METHOD FOR REPLAYING INTERACTIVE AUDIOVISUAL FILM |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||