
CN111669605A - Method and device for synchronizing multimedia data and associated interactive data thereof - Google Patents

Method and device for synchronizing multimedia data and associated interactive data thereof

Info

Publication number
CN111669605A
CN111669605A
Authority
CN
China
Prior art keywords
multimedia data
data
time
encoding
interactive
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201910168672.6A
Other languages
Chinese (zh)
Other versions
CN111669605B (en)
Inventor
刘亚运
余学亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN201910168672.6A priority Critical patent/CN111669605B/en
Publication of CN111669605A publication Critical patent/CN111669605A/en
Application granted granted Critical
Publication of CN111669605B publication Critical patent/CN111669605B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/80Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
    • H04N21/85Assembly of content; Generation of multimedia applications
    • H04N21/854Content authoring
    • H04N21/8547Content authoring involving timestamps for synchronizing content
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/21Server components or server architectures
    • H04N21/218Source of audio or video content, e.g. local disk arrays
    • H04N21/2187Live feed
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/2343Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/25Management operations performed by the server for facilitating the content distribution or administrating data related to end-users or client devices, e.g. end-user or client device authentication, learning user preferences for recommending movies
    • H04N21/262Content or additional data distribution scheduling, e.g. sending additional data at off-peak times, updating software modules, calculating the carousel transmission frequency, delaying a video stream transmission, generating play-lists
    • H04N21/26208Content or additional data distribution scheduling, e.g. sending additional data at off-peak times, updating software modules, calculating the carousel transmission frequency, delaying a video stream transmission, generating play-lists the scheduling operation being performed under constraints
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4305Synchronising client clock from received content stream, e.g. locking decoder clock with encoder clock, extraction of the PCR packets
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/4302Content synchronisation processes, e.g. decoder synchronisation
    • H04N21/4307Synchronising the rendering of multiple content streams or additional data on devices, e.g. synchronisation of audio on a mobile phone with the video output on the TV screen

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Databases & Information Systems (AREA)
  • Computer Security & Cryptography (AREA)
  • Two-Way Televisions, Distribution Of Moving Picture Or The Like (AREA)

Abstract

The invention discloses a method for synchronizing multimedia data and associated interactive data thereof, comprising: receiving, from an operations management platform, an issuing request for interactive data; receiving the interactive data and an encoding configuration for the multimedia data from the operations management platform; calculating, based on the encoding configuration, a time offset between the timestamp of the encoded multimedia data and the insertion time point of the interactive data; and inserting the interactive data into the multimedia data in an access layer of a multimedia data processing background based on the time offset. A synchronization apparatus and a computer-readable storage medium are also disclosed.

Description

Method and device for synchronizing multimedia data and associated interactive data thereof
Technical Field
The present invention relates to a method for synchronizing multimedia data, and more particularly, to a method, apparatus and computer-readable storage medium for synchronizing multimedia data with interactive data associated therewith.
Background
In scenarios such as large live broadcast activities that require real-time interaction with users on the audience side, the multimedia data must be highly stable, and the interactive data must keep strong consistency with the live multimedia data in absolute time. Different live broadcast devices (such as a collector and a mobile phone in a studio) have different encoding capabilities, and interactive data initiated from the anchor end must pass through capture and encoding, the live streaming background, and a Content Delivery Network (CDN) before finally reaching the viewer end. Synchronous display of the interactive data (such as information display, advertisement display, and instruction issuing) with the live stream is therefore difficult: the interactive data is often displayed at the user end earlier than the corresponding live time point, so that the interaction time point does not match the live picture, which degrades the user's overall interactive experience.
Disclosure of Invention
It would be advantageous to provide a solution that can alleviate, reduce or eliminate the above-mentioned problems.
According to an aspect of the present invention, there is provided a method for synchronizing multimedia data and associated interactive data thereof, the method comprising: receiving, from an operations management platform, an issuing request for interactive data; receiving the interactive data and an encoding configuration for the multimedia data from the operations management platform; calculating, based on the encoding configuration, a time offset between the timestamp of the encoded multimedia data and the insertion time point of the interactive data; and inserting the interactive data into the multimedia data in an access layer of a multimedia data processing background based on the time offset.
According to one embodiment, calculating a time offset between the encoded time stamp of the multimedia data and the insertion time point of the interactive data based on the encoding configuration comprises: the time offset is calculated by a difference between an encoding time of the multimedia data and a delay time of receiving the interactive data.
According to one embodiment, calculating a time offset between the encoded timestamp of the multimedia data and the insertion time point of the interactive data based on the encoding configuration further comprises: the encoding configuration is synchronized and stored in the operations management platform in advance.
According to one embodiment, the method further comprises setting an encoding time of the multimedia data based on the encoding configuration parameter.
According to one embodiment, setting the encoding time of the multimedia data based on the encoding configuration parameter further comprises: an encoding time of the multimedia data is set based on a weighted sum of at least two encoding configuration parameters.
According to one embodiment, the encoding configuration parameters in the encoding configuration include a code rate, a frame rate, and a group of pictures.
According to one embodiment, inserting the interactive data into the multimedia data in the access layer of the multimedia data processing background based on the time offset further comprises: when the timestamp of the most recently received frame is greater than or equal to the time offset, inserting the interactive data into the multimedia data in an access layer of a live streaming background starting from that timestamp.
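The condition in this embodiment, inserting the interactive data once the newest frame's timestamp reaches the computed offset, can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function and field names are assumptions.

```python
def bind_interactive_data(frames, signals):
    """Bind each cached interactive-data item to the first frame whose
    timestamp reaches that item's computed time offset.

    frames  -- iterable of (pts_ms, payload), pts strictly increasing
    signals -- list of (offset_ms, interactive_payload), cached on receipt
    """
    pending = sorted(signals)          # earliest offset first
    bound = []
    for pts, _payload in frames:
        # insert every cached item whose offset the stream has now reached
        while pending and pts >= pending[0][0]:
            _offset, data = pending.pop(0)
            bound.append((pts, data))  # data rides with this frame onward
    return bound
```

With frames arriving at 40 ms intervals, an item whose offset is 100 ms would be bound to the first frame at or past that point (pts 120), rather than to the frame current when the signaling arrived.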
According to another aspect of the present invention, there is provided an apparatus for synchronizing multimedia data and interactive data associated therewith, the apparatus comprising: the receiving module receives an issuing request aiming at the interactive data from the operation management platform, and receives the interactive data from the operation management platform and the coding configuration for the multimedia data; the computing module is used for computing the time offset between the time stamp of the encoded multimedia data and the insertion time point of the interactive data based on the encoding configuration; and the inserting module inserts the interactive data into the multimedia data in an access layer of the multimedia data processing background based on the time offset.
According to one embodiment, the calculating module calculates a time offset between the encoded time stamp of the multimedia data and the insertion time point of the interactive data based on the encoding configuration includes: the calculation module calculates a time offset by a difference between an encoding time of the multimedia data and a delay time of receiving the interactive data.
According to an embodiment, the calculating module calculating the time offset between the encoded timestamp of the multimedia data and the insertion time point of the interactive data based on the encoding configuration further comprises: the calculating module synchronizes and stores the encoding configuration in the operations management platform in advance.
According to an embodiment, the calculating module calculates the time offset by a difference between an encoding time of the multimedia data and a delay time of receiving the interactive data further includes: the calculation module sets the encoding time of the multimedia data based on the encoding configuration parameters.
According to one embodiment, the calculating module sets the encoding time of the multimedia data based on the encoding configuration parameter further comprises: the calculation module sets an encoding time of the multimedia data based on a weighted sum of the at least two encoding configuration parameters.
According to one embodiment, the encoding configuration parameters in the encoding configuration include a code rate, a frame rate, and a group of pictures.
According to one embodiment, the inserting module inserting the interactive data into the multimedia data in an access layer of a multimedia data processing background based on the time offset further comprises: the inserting module judges whether a newly received frame of the multimedia data is a frame intended to be transmitted synchronously with the interactive data according to whether the timestamp of the newly received frame is greater than or equal to the time offset, and, when it is, inserts the interactive data into the multimedia data in an access layer of a live streaming background starting from that timestamp.
According to yet another aspect of the present invention, a computer readable storage medium is provided, comprising computer program instructions for synchronization of multimedia data with interactive data in a multimedia data push, which when executed by a processor, causes the processor to perform the method as described above.
These and other aspects of the invention will be apparent from and elucidated with reference to the embodiment(s) described hereinafter.
Drawings
Further details, features and advantages of the invention are disclosed in the following description of exemplary embodiments with reference to the accompanying drawings, in which:
FIG. 1 shows a schematic diagram illustration of an application scenario according to an embodiment of the present invention;
fig. 2 shows a schematic illustration of a live real-time interaction architecture according to the prior art;
fig. 3 shows a synchronization timing diagram of live streaming and interactive signaling based on the live real-time interactive architecture of fig. 2;
fig. 4 shows a schematic illustration of a live real-time interaction architecture according to an embodiment of the invention;
FIG. 5 shows in more detail a synchronization timing diagram of live streams and interactive data based on the live real-time interactive architecture of FIG. 4;
FIG. 6 shows a schematic flow chart of a method of synchronizing interactive data associated with multimedia data according to an embodiment of the invention;
FIG. 7 shows a schematic block diagram of a synchronization control apparatus according to an embodiment of the present invention; and
fig. 8 generally illustrates an example system that includes an example computing device that represents one or more systems and/or devices that may implement the various techniques described herein.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
Before explaining the embodiments of the present invention in detail, some terms related to the embodiments of the present invention will be explained first.
Live streaming: the method refers to the multimedia data which is collected and coded by the terminal of the anchor user in real time when the anchor user initiates the live broadcasting. The terminal of the anchor user is also referred to as the live end in the embodiments of the present invention. Live streaming background: mainly comprises an access layer and a trans-encapsulation layer. Live broadcasting and stream pushing: the live broadcast push terminal adopts a mobile network or a WIFI network to push the live broadcast stream to a streaming connection node of a live broadcast stream background. Mobile networks include, but are not limited to, 4G, 3G, and the like. May be referred to herein as multimedia streams.
Access layer: receives the live stream and completes the receive-and-forward function for the live stream.
Live interactive signaling: the service-side interactive data that needs to be synthesized into the live stream data during the live broadcast.
Trans-encapsulation layer: performs trans-encapsulation processing on the live stream to convert it from the original format to a specified new format.
CDN: the method can redirect the request initiated by the user to the service node nearest to the user according to the network flow and the comprehensive information such as the connection, the load condition, the distance to the user, the response time and the like of each service node. The CND is used to enable a user to obtain desired content nearby, thereby solving the problem of network congestion and improving the response speed of the user request. FLV (FLASH VIDEO, streaming media format) data block: refers to a data stream in a generic multimedia data encapsulation format. Each FLV data block includes at least two parts, which are divided into a header (head) and a body (body), the header includes related index information of the body part, such as a presentation Time stamp (pts) including the data block, and the body includes multimedia data, such as multimedia data.
Chunk data: the general names of the live stream data blocks are flv chunk and rtmp chunk, for example.
M3U8: an M3U file in the UTF-8 (Unicode Transformation Format) encoding. An M3U file records an index as a plain-text file; when the file is opened, the network address of the corresponding multimedia file is found from the index for online playing.
The invention provides a method for synchronizing interactive data with a live multimedia stream so that the interactive data and the live stream are displayed synchronously. Stream pushing can be performed with mobile-phone push, OBS (Open Broadcaster Software) push, or other self-developed software, without customized push software, so the method generalizes well. The method can also be flexibly extended to meet the interaction requirements between the user and the live end.
Fig. 1 shows a schematic illustration of an application scenario 100 according to an embodiment of the present invention. The method is applied to live broadcast activities; large live broadcast activities place high demands on interaction between the live side and the audience side. In such a scenario, it is desirable to keep the interactive data aligned with the live multimedia data in absolute time. In the live quiz scenario 100 shown in fig. 1, for example, it is desirable to display the interactive data (e.g., the question interface in the figure) synchronously while the host announces the question in the live multimedia stream. The interactive data can also be displayed topic information, displayed advertisements, issued instructions, and the like.
Fig. 2 shows a schematic illustration of a prior art live real-time interaction architecture 200. Live-end stream pushing generally requires installing application software with a live function in order to capture the live multimedia stream. In the prior art, this software is encoding-capture software 201 customized to the live environment (e.g., indoor, outdoor) and the live device (e.g., collector, cell phone). During encoded capture at the push end, the pre-written interactive configuration 202 is read as required, added into the real-time encoded data stream, and pushed to the live streaming background 203. The access layer 204 in the live streaming background 203 receives the live multimedia stream from the live end and then passes it to the trans-encapsulation layer 205 in the live streaming background for trans-encapsulation. The trans-encapsulation layer 205 can convert the original live multimedia stream into live multimedia streams of various resolutions and formats. The CDN 206 pulls the stored live stream of the corresponding format and resolution from the live streaming background and pushes it to the player 207. Stream pushing at the push end, trans-encapsulation in the live streaming background, and back-to-source at the CDN must all conform to a private protocol customized to the live environment and device, so the coupling is high; most live protocols would require major modification. Furthermore, if the configuration information is not read successfully, the stability of the entire live link may be affected.
Fig. 3 illustrates a synchronization timing diagram of live streams and interactive data based on the live real-time interaction architecture 200 of fig. 2. For different encoding devices (e.g., different hardware encoders, large live devices, cell phones, etc.), the frame rate and the group of pictures (GOP) affect the output time of the encoded stream data, and the interactive data is loaded during encoding at the push end. Because encoded capture incurs a certain encoding delay T(src_encode) before the multimedia stream is output, the interactive data is usually bound to stream data from before the interaction point, i.e., to encoded data of an earlier time, and pushed out with it, so the interactive data at the playing end does not match the live multimedia data. As shown in fig. 3, each data block includes at least two parts, a header (head) and a body (body). The header contains index information about the body, such as the presentation time stamp (pts) of the data block. pts refers to the time point of each block of encoded data; when a live broadcast starts to produce encoded data, pts is counted from 0 and is strictly increasing. At the push end shown in fig. 3, at encoding capture time t1 a piece of interactive data is read, which corresponds to the (n+1)-th FRAME (FRAME n+1) at the encoded capture end. The actual encoded output of FRAME n+1 corresponds to the chunk data pts(N+1) in fig. 3. If the interactive data is added at the push end as in the prior art, then because of the delay of the live multimedia stream the interactive data will be bound to the earlier chunk data pts(M), causing the interactive data to appear earlier than the corresponding live picture.
Fig. 4 shows a schematic illustration of a live real-time interaction architecture 400 according to an embodiment of the invention. In the live streaming background, stream pushing in FLV format is taken as an example (those skilled in the art will understand that other formats such as RTMP may also be used). At the push-initiating end, the encoding configuration of the encoding collector 401 is synchronized to the operations management platform 402 in advance. After the live broadcast starts, live source data is collected in real time, and the live stream is pushed to the access layer server 403 in the live streaming background through the FLV protocol. The live stream access layer 403 caches the live stream for downstream services to pull back to the source. The trans-encapsulation layer 404 converts data in the FLV format into data in the TS (Transport Stream) format in real time by pulling the FLV live stream from the access layer. The CDN 405 fetches the source TS data in real time by pulling the M3U8 file converted by the trans-encapsulation layer, and provides it to the player 406 on the user side for playing.
When interactive data needs to be distributed during the live broadcast, the interactive data to be distributed and the configuration parameter information of the live broadcast's encoding end are selected through the operations management platform 402 and sent to the live interactive platform 407 in real time. By introducing the services of the operations management platform 402 and the live interactive platform 407, the issuing of interactive data is extracted from the encoding end, reducing the coupling of the services. When live real-time interactive data needs to be added, the operations management platform 402, in cooperation with the interaction of the anchor end, sends the interactive data to the live interactive platform service in real time. The live interactive platform service calculates in real time, from information such as the live encoding capability, bit rate, and frame rate, the delay from capture to output at the live encoding end, and pushes the interactive data together with the delay time (mainly the encoding time) to the live stream access layer 403. When the trans-encapsulation layer 404 receives live interactive data, it writes the interactive data into the M3U8 file in real time and, on back-to-source, sends the interactive data to the CDN 405 for caching. On the viewer side, the player requests the M3U8 file from the CDN 405 node in real time, pulls the current live TS stream, and parses the private interactive tag data in the M3U8 file, so as to display the live stream and the interactive data in real time.
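The write-through of interactive data into the playlist can be sketched as inserting a private tag before the target segment line. A minimal sketch under stated assumptions: the tag name #EXT-X-INTERACT and the function name are hypothetical, since the patent only speaks of "private interactive tag data" in the M3U8 file.

```python
def inject_interactive_tag(m3u8_text: str, segment_uri: str, payload: str) -> str:
    """Insert a private interactive tag (hypothetical name #EXT-X-INTERACT)
    immediately before the target TS segment line of an M3U8 playlist."""
    out = []
    for line in m3u8_text.splitlines():
        if line.strip() == segment_uri:
            out.append(f"#EXT-X-INTERACT:{payload}")
        out.append(line)
    return "\n".join(out) + "\n"
```

A player that pulls the rewritten playlist can then parse the private tag alongside the segment it precedes and render both at the same moment.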
The operations management platform 402 mainly provides three functions: (1) synchronizing configuration information such as the frame rate, bit rate, and GOP of different encoding collectors 401 in real time to the live interactive platform 407; (2) entering static interactive data and the interactive data interfaces that the encoding configuration needs to acquire in real time; (3) letting the operations side issue interactive data in real time at a live interaction point through the operations management platform. As those skilled in the art will understand, the operations management platform may also be another control management device, such as a manager or administrator device.
The live interactive platform 407 mainly receives issuing requests from the operations management platform in real time and calculates, in real time, the delay from encoding to output according to the pre-configured parameter information of the encoded capture end. It encapsulates the different interactive data into a unified data stream carrying the delay-time parameter and sends the data stream to the access layer of the live streaming background in real time. Judging the alignment of signaling and stream: when the trans-encapsulation layer pulls the source live stream back from the access layer, the access layer judges whether the current interactive data needs to be inserted into the back-to-source data of the live stream by calculating the time point at which the interactive data was received and the delay interval. By controlling the binding time of the live stream data and the signaling, strong consistency of the interactive data and the live stream throughout the broadcast can be ensured.
Fig. 5 shows in more detail a synchronization timing diagram of live streams and interactive data based on the live real-time interactive architecture of fig. 4. Because the push end sets the live encoding parameter information before the live broadcast is initiated, and this information does not change during a normal live broadcast once set, the relevant encoding parameters are synchronized to the operations management platform 402 side. During the broadcast, when live interactive data needs to be issued, the delay time d1 from capture to output can be calculated in real time from the encoding parameters. On the operations side, as shown in fig. 5, at time t1 the operations end issues a piece of live interactive data, which corresponds to the FRAME (n+1) block of data at the capture end. In the encoding output layer and the access layer, the data block corresponding to time point t1 is the m-th block, with m < n+1. If the interactive data were bound into the live stream data at this moment, it would not be aligned with the displayed stream picture (as shown by the interactive data in the dotted line in fig. 5). The time difference from encoded capture to the output of the live stream is the delay time T(src_encode). When the live streaming background receives the interactive data, it caches the interactive data according to T(src_encode) and the time, and records the time point of receipt. Judging the alignment of signaling and stream: it is judged in real time whether the difference between the pts of the most recently received data block and the time point at which the signaling was received is greater than or equal to T(src_encode).
Assume the time meeting this condition is t2; that is, after t2, when the trans-encapsulation service pulls back to the source, the live access layer binds the interactive data to the corresponding live stream chunk. In the trans-encapsulation layer, the interactive data and the live stream data in the received chunk data are corrected, finally ensuring the consistency of the interactive display at the playing end.
In order to solve the problem of strong consistency of time points of a stream pushing end and a signaling end, the invention provides a delay calculation method for calculating the time from coding to output. In the overall delay calculation, the total delay time from the source side to the streaming server is assumed to be T(src_total)The delay time of the source end collecting the scene picture and outputting uncompressed data is T(src_capture)The delay time of coding, compressing and outputting FLV live broadcast data after the source end collects the data is T(src_encode)The source end sends FLV live stream data to the stream receiving server through the network for a delay time T(src_send)Then, there are: t is(src_total)=T(src_capture)+T(src_encode)+ T(src_send)
Generally, hardware devices capture the field picture data essentially in real time, so this delay is very low and can be neglected; with T(src_capture) = 0, the formula simplifies to: T(src_total) = T(src_encode) + T(src_send).
The transmission delay T(src_send) is, in a fixed network environment, objectively stable within a fixed numerical range and does not fluctuate much; it can be treated as a constant and accounts for a very small fraction of the overall T(src_total): T(src_send) is typically 10 to 100 milliseconds while T(src_total) is 1 to 5 seconds, so T(src_total) depends mainly on T(src_encode). T(src_encode) varies greatly with the encoding mode (mainly video coding, H.264 or HEVC) and its complexity and parameters. In general, T(src_encode) is strongly correlated with the parameters that affect encoding performance and encoding delay, such as the encoding frame rate, the encoding GOP length, the number of encoded I/P/B frames, the encoding prediction mode, the search mode, and the lookahead count in scene-switching mode. T(src_encode) can therefore be written as a mathematical function of the video coding parameters: T(src_encode) = F(encode_param).
Here encode_param includes the indexes described above, such as the encoding frame rate and the GOP length, and the function returns the resulting delay value for the given parameters. The F(encode_param) function can adopt different function models according to the precision requirement; in general, a linear weighted-coefficient fit satisfies the requirement well: F(encode_param) = k1 × X1 + k2 × X2 + ... + kn × Xn (kn is a weighting coefficient, and Xn is each encoding-delay-affecting parameter). The delay time from sending the field interactive data to the server receiving the signaling is T(signal); in the real state T(src_total) > T(signal), with T(src_total) generally 1 to 3 seconds and T(signal) generally about 10 to 100 milliseconds. For the interactive data to be completely synchronized with the corresponding picture and voice in the stream, the following must hold: T(src_total) - T(signal) = 0; therefore the time of the faster party, here the interactive data party, must be adjusted so that the arrival delay of the signaling exactly matches the stream delay. Assume the new signaling delay time is T'(signal), with T'(signal) = T(src_total) = T(signal) + T(offset), so the superimposed signaling delay offset can be calculated as T(offset) = T(src_total) - T(signal), i.e., the difference between times t1 and t2 in Fig. 5.
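The delay model above can be sketched in a few lines of code. This is a minimal illustration only, not the patented implementation: the parameter names, coefficient values, and sample figures are hypothetical assumptions, while the formulas themselves follow the description.

```python
# Sketch of the linear weighted fit F(encode_param) = k1*X1 + ... + kn*Xn
# for the encoding delay T(src_encode), and of the signaling delay offset
# T(offset) = T(src_total) - T(signal). Coefficient values and parameter
# names below are illustrative assumptions, not figures from the patent.

def src_encode_delay_ms(encode_param, weights):
    """F(encode_param): weighted sum over the encoding-delay parameters."""
    return sum(weights[name] * value for name, value in encode_param.items())

def signaling_offset_ms(encode_param, weights, t_send_ms, t_signal_ms):
    # T(src_capture) is negligible, so T(src_total) = T(src_encode) + T(src_send).
    t_src_total_ms = src_encode_delay_ms(encode_param, weights) + t_send_ms
    # T(offset) = T(src_total) - T(signal): how long to delay the signaling.
    return t_src_total_ms - t_signal_ms

# Hypothetical parameters X1..Xn and fitted coefficients k1..kn.
params = {"frame_rate": 25, "gop_length": 50, "b_frames": 2, "lookahead": 40}
weights = {"frame_rate": 4.0, "gop_length": 30.0, "b_frames": 50.0, "lookahead": 5.0}

offset_ms = signaling_offset_ms(params, weights, t_send_ms=50, t_signal_ms=30)
```

In a real deployment the coefficients kn would be fitted offline against measured encoder latency rather than chosen by hand as here.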
Fig. 6 shows a schematic flow chart of a method 600 of synchronizing interactive data associated with multimedia data according to an embodiment of the invention. In step 602, the live interactive platform receives an issuing request for interactive data from the operation management station. The interactive data may be, for example, titles, options, etc. in a live quiz. In step 604, the live interactive platform receives the interactive data and the encoding configuration for the multimedia data from the operation management station. The encoding configuration of the multimedia data may be parameters that affect encoding performance and encoding delay, such as the encoding frame rate, the encoding group-of-pictures (GOP) length, the number of encoded I/P/B frames, the encoding prediction mode, the search mode, the lookahead count in scene-switching mode, and the like. In step 606, the live interactive platform calculates a time offset between the encoded timestamp of the multimedia data and the insertion time point of the interactive data based on the encoding configuration; that is, a time offset between the timestamp of the live multimedia data delayed by encoding and the insertion time point of the interactive data. In step 608, the live interactive platform inserts the interactive data into the multimedia data in the access layer of the multimedia data processing background based on the time offset.
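The four steps of method 600 can be condensed into a sketch. The platform class, its helper names, and the coefficient values are illustrative stand-ins assumed for this example, not taken from the patent text.

```python
# Condensed sketch of steps 602-608 of method 600. All names and numbers
# here are hypothetical; only the step structure mirrors the description.

class LiveInteractivePlatform:
    def __init__(self):
        self.scheduled = []

    def compute_offset_ms(self, encoding_config):
        # Step 606: offset derived from the encoding configuration via a
        # linear fit (hypothetical coefficients, in the spirit of F(encode_param)).
        return 40 * encoding_config["frame_rate"] + 20 * encoding_config["gop_length"]

    def schedule_insert(self, interactive_data, offset_ms):
        # Step 608: defer insertion until the stream pts advances by offset_ms.
        self.scheduled.append((offset_ms, interactive_data))

def handle_issue_request(platform, request):
    interactive = request["interactive_data"]     # steps 602/604: e.g. quiz titles
    encoding_config = request["encoding_config"]  # frame rate, GOP length, ...
    offset_ms = platform.compute_offset_ms(encoding_config)
    platform.schedule_insert(interactive, offset_ms)
    return offset_ms
```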
In one embodiment, the step 606 of calculating the time offset between the encoded time stamp of the multimedia data and the insertion time point of the interactive data based on the encoding configuration comprises: the time offset is calculated by a difference between an encoding time of the multimedia data and a delay time of receiving the interactive data.
In one embodiment, the step 606 of calculating the time offset between the encoded time stamp of the multimedia data and the insertion time point of the interactive data based on the encoding configuration further comprises: the coding configuration is synchronized and stored in the operations management station in advance.
In one embodiment, the encoding time of the multimedia data is set based on the encoding configuration parameter.
In one embodiment, setting the encoding time of the multimedia data based on the encoding configuration parameter further comprises: an encoding time of the multimedia data is set based on a weighted sum of at least two encoding configuration parameters.
In one embodiment, the encoding configuration parameters in the encoding configuration include a code rate, a frame rate, a group of pictures, and the like.
In one embodiment, the step 608 of inserting the interactive data into the multimedia data in the access layer of the multimedia data processing background based on the time offset further comprises: when the timestamp of the most recently received frame is greater than or equal to the time offset, inserting the interactive data into the multimedia data in the access layer of the live streaming background starting from that timestamp.
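The access-layer condition in this embodiment can be sketched as follows: an interactive item is cached together with the pts at which it was received, and is bound to the first frame whose pts has advanced by at least the time offset. The class and field names are illustrative assumptions.

```python
# Sketch of the access-layer binding rule: cache interactive data on
# receipt, then bind it once the stream pts has advanced by the offset.
# Names and the 1900 ms offset below are hypothetical examples.
from collections import deque

class AccessLayer:
    def __init__(self, offset_ms):
        self.offset_ms = offset_ms
        self.pending = deque()  # cached (pts_at_receipt_ms, payload) pairs

    def receive_interactive(self, current_pts_ms, payload):
        # Cache the interactive data and record the reception time point.
        self.pending.append((current_pts_ms, payload))

    def on_frame(self, frame_pts_ms):
        """Return the interactive payloads to bind to the frame at frame_pts_ms."""
        bound = []
        while self.pending and frame_pts_ms - self.pending[0][0] >= self.offset_ms:
            bound.append(self.pending.popleft()[1])
        return bound

layer = AccessLayer(offset_ms=1900)
layer.receive_interactive(current_pts_ms=10_000, payload={"question": "Q1"})
layer.on_frame(11_000)   # pts difference 1000 ms < offset: nothing bound yet
layer.on_frame(11_900)   # pts difference 1900 ms >= offset: Q1 is bound here
```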
Fig. 7 shows a schematic block diagram of a synchronization control apparatus 700 according to an embodiment of the present invention. The synchronization control apparatus 700 includes a receiving module 701, a calculating module 702, and an inserting module 703. Wherein the receiving module 701 is configured to receive an issuing request for interactive data from an operation management station, and receive the interactive data and an encoding configuration for multimedia data from the operation management station. The calculation module 702 is configured to calculate a time offset between the encoded time stamp of the multimedia data and the insertion time point of the interactive data based on the encoding configuration. The inserting module 703 is configured to insert the interactive data into the multimedia data in an access stratum of a multimedia data processing background based on the time offset.
In one embodiment, the calculating module 702 calculating the time offset between the encoded time stamp of the multimedia data and the insertion time point of the interactive data based on the encoding configuration comprises: the calculation module calculates a time offset by a difference between an encoding time of the multimedia data and a delay time of receiving the interactive data.
In one embodiment, the calculating module 702 calculates the time offset between the encoded time stamp of the multimedia data and the insertion time point of the interactive data based on the encoding configuration further includes: the calculation module 702 synchronizes and stores the coding configuration in the operations management station in advance.
In one embodiment, the calculating module 702 calculating the time offset by the difference between the encoding time of the multimedia data and the delay time of receiving the interactive data further comprises: the calculation module 702 sets the encoding time of the multimedia data based on the encoding configuration parameters.
In one embodiment, the calculating module 702 setting the encoding time of the multimedia data based on the encoding configuration parameter further comprises: the calculation module 702 sets the encoding time of the multimedia data based on a weighted sum of at least two encoding configuration parameters.
In one embodiment, the encoding configuration parameters in the encoding configuration include a code rate, a frame rate, and a group of pictures.
In one embodiment, the inserting module 703 inserting the interactive data into the multimedia data in the access layer of the multimedia data processing background based on the time offset further comprises: the inserting module 703 determines whether the most recently received frame of the multimedia data is the frame with which the interactive data is intended to be synchronously transmitted by checking whether the timestamp of that frame is greater than or equal to the time offset, and inserts the interactive data into the multimedia data in the access layer of the live streaming background once that condition is met.
Fig. 8 generally illustrates an example system 800 that includes an example computing device 810 that represents one or more systems and/or devices that may implement the various techniques described herein. Computing device 810 may be, for example, a device associated with a client (e.g., a client device), a system-on-chip, a server of a service provider, and/or any other suitable computing device or computing system. The synchronization control device 700 described above with respect to fig. 7 may take the form of a computing device 810. Alternatively, the synchronization control apparatus 700 may be implemented as a computer program in the form of a synchronization control application 816. More specifically, the synchronization control apparatus 700 may be implemented as an integral part of the video player or as a plug-in that can be downloaded and installed separately from the video player.
The example computing device 810 as illustrated includes a processing system 811, one or more computer-readable storage media 812, and one or more I/O interfaces 813 communicatively coupled to each other. Although not shown, computing device 810 may also include a system bus or other data and command transfer system that couples the various components to one another. A system bus can include any one or combination of different bus structures, such as a memory bus or memory controller, a peripheral bus, a universal serial bus, and/or a processor or local bus that utilizes any of a variety of bus architectures. Various other examples are also contemplated, such as control and data lines.
The processing system 811 represents functionality to perform one or more operations using hardware. Thus, the processing system 811 is illustrated as including hardware elements 814 that may be configured as processors, functional blocks, and so forth. This may include implementation in hardware as an application specific integrated circuit or other logic device formed using one or more semiconductors. The hardware elements 814 are not limited by the materials from which they are formed or the processing mechanisms employed therein. For example, a processor may be comprised of semiconductor(s) and/or transistors (e.g., electronic Integrated Circuits (ICs)). In such a context, processor-executable instructions may be electronically-executable instructions.
The computer-readable storage medium 812 is illustrated as including memory/storage 815. Memory/storage 815 represents memory/storage capacity associated with one or more computer-readable storage media. The memory/storage 815 may include volatile media (such as Random Access Memory (RAM)) and/or nonvolatile media (such as Read Only Memory (ROM), flash memory, optical disks, magnetic disks, and so forth). The memory/storage 815 may include fixed media (e.g., RAM, ROM, a fixed hard drive, etc.) as well as removable media (e.g., flash memory, a removable hard drive, an optical disk, and so forth). Computer-readable storage media 812 may be configured in various other ways as further described below.
One or more I/O interfaces 813 represent functionality that allows a user to enter commands and information to computing device 810, and optionally also allows information to be presented to the user and/or other components or devices using various input/output devices. Examples of input devices include a keyboard, a cursor control device (e.g., a mouse), a microphone (e.g., for voice input), a scanner, touch functionality (e.g., capacitive or other sensors configured to detect physical touch), a camera (e.g., motion that may not involve touch may be detected as gestures using visible or invisible wavelengths such as infrared frequencies), and so forth. Examples of output devices include a display device (e.g., a monitor or projector), speakers, a printer, a network card, a haptic response device, and so forth. Accordingly, the computing device 810 may be configured in various ways to support user interaction, as described further below.
Computing device 810 may also include a synchronization control application 816. The synchronization control application 816 may be, for example, a software instance of the synchronization control device 700 of fig. 7, and in combination with other elements in the computing device 810 implement the techniques described herein.
Various techniques may be described herein in the general context of software, hardware elements, or program modules. Generally, these modules include routines, programs, objects, elements, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The terms "module," "functionality," and "component" as used herein generally represent software, firmware, hardware, or a combination thereof. The features of the techniques described herein are platform-independent, meaning that the techniques may be implemented on a variety of computing platforms having a variety of processors.
An implementation of the described modules and techniques may be stored on or transmitted across some form of computer readable storage media. Computer readable storage media may include a variety of media that can be accessed by computing device 810. By way of example, and not limitation, computer-readable storage media may include "computer-readable storage media" and "computer-readable signal media".
"computer-readable storage medium" refers to a medium and/or device, and/or a tangible storage apparatus, capable of persistently storing information, as opposed to mere signal transmission, carrier wave, or signal per se. Accordingly, computer-readable storage media refers to non-signal bearing media. Computer-readable storage media include hardware such as volatile and nonvolatile, removable and non-removable media and/or storage devices implemented in a method or technology suitable for storage of information such as computer-readable instructions, data structures, program modules, logic elements/circuits or other data. Examples of computer readable storage media may include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, Digital Versatile Disks (DVD) or other optical storage, hard disks, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or other storage devices, tangible media, or an article of manufacture suitable for storing the desired information and accessible by a computer.
"computer-readable signal medium" refers to a signal-bearing medium configured to transmit instructions to hardware of computing device 810, such as via a network. Signal media typically embodies computer readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave, data signal or other transport mechanism. Signal media also includes any information delivery media. The term "modulated data signal" means a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of example, and not limitation, communication media includes wired media such as a wired network or direct-wired connection, and wireless media such as acoustic, RF, infrared and other wireless media.
As previously described, the hardware element 814 and the computer-readable storage medium 812 represent instructions, modules, programmable device logic, and/or fixed device logic implemented in hardware form that may be used in some embodiments to implement at least some aspects of the techniques described herein. The hardware elements may include integrated circuits or systems-on-chips, Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), Complex Programmable Logic Devices (CPLDs), and other implementations in silicon or components of other hardware devices. In this context, a hardware element may serve as a processing device that performs program tasks defined by instructions, modules, and/or logic embodied by the hardware element, as well as a hardware device for storing instructions for execution, such as the computer-readable storage medium described previously.
Combinations of the foregoing may also be used to implement the various techniques and modules described herein. Thus, software, hardware, or program modules and other program modules may be implemented as one or more instructions and/or logic embodied on some form of computer-readable storage medium and/or by one or more hardware elements 814. Computing device 810 may be configured to implement particular instructions and/or functions corresponding to software and/or hardware modules. Thus, implementing modules as modules executable by computing device 810 as software may be implemented at least partially in hardware, for example, using computer-readable storage media of a processing system and/or hardware elements 814. The instructions and/or functions may be executable/operable by one or more articles of manufacture (e.g., one or more computing devices 810 and/or processing systems 811) to implement the techniques, modules, and examples described herein.
In various implementations, computing device 810 may assume a variety of different configurations. For example, computing device 810 may be implemented as a computer-like device including a personal computer, desktop computer, multi-screen computer, laptop computer, netbook, and so forth. The computing device 810 may also be implemented as a mobile device-like device including mobile devices such as mobile telephones, portable music players, portable gaming devices, tablet computers, multi-screen computers, and the like. Computing device 810 may also be implemented as a television-like device that includes devices with or connected to a generally larger screen in a casual viewing environment. These devices include televisions, set-top boxes, game consoles, and the like.
The techniques described herein may be supported by these various configurations of computing device 810 and are not limited to specific examples of the techniques described herein. Computing device 810 may also interact through a wide variety of communication technologies "clouds" 820.
Cloud 820 includes and/or is representative of a platform 822 for resources 824. The platform 822 abstracts underlying functionality of hardware (e.g., servers) and software resources of the cloud 820. Resources 824 may include applications and/or data that can be used while computer processing is executed on servers remote from computing device 810. Resources 824 may also include services provided over the internet and/or over a subscriber network such as a cellular or Wi-Fi network. The platform 822 may abstract resources and functions to connect the computing device 810 with other computing devices. The platform 822 may also serve to abstract the scaling of resources, so that the level of scale provided corresponds to the demand encountered for the resources 824 implemented via the platform 822.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are disclosed as example forms of implementing the claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed subject matter, from a study of the drawings, the disclosure, and the appended claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.

Claims (15)

1. A method of synchronizing multimedia data with associated interactive data thereof, the method comprising:
receiving an issuing request aiming at the interactive data from an operation management platform;
receiving the interactive data and an encoding configuration for the multimedia data from an operations management station;
calculating a time offset between the time stamp of the encoded multimedia data and an insertion time point of the interactive data based on the encoding configuration; and
inserting the interactive data into the multimedia data in an access stratum of a multimedia data processing background based on the time offset.
2. The method of claim 1, wherein calculating a time offset between the encoded timestamp of the multimedia data and an insertion time point of the interactive data based on the encoding configuration comprises:
the time offset is calculated by a difference between an encoding time of the multimedia data and a delay time of receiving the interactive data.
3. The method of claim 1, wherein calculating a time offset between the encoded timestamp of the multimedia data and the insertion time point of the interactive data based on the encoding configuration further comprises:
the coding configuration is synchronized in advance and stored in the operations management station.
4. The method of claim 2, further comprising:
and setting the encoding time of the multimedia data based on the encoding configuration parameters.
5. The method of claim 4, wherein setting the encoding time of the multimedia data based on the encoding configuration parameter further comprises:
setting an encoding time of the multimedia data based on a weighted sum of at least two encoding configuration parameters.
6. The method of any preceding claim, wherein the encoding configuration parameters in the encoding configuration include code rate, frame rate and group of pictures.
7. The method of claim 1, wherein inserting the interactive data into multimedia data in an access stratum of a multimedia data processing background based on the time offset further comprises:
and judging whether the most recently received frame of the multimedia data is the frame with which the interactive data is intended to be synchronously transmitted, by judging whether the timestamp of the most recently received frame is greater than or equal to the time offset, and inserting the interactive data into the multimedia data in an access layer of a live streaming background when the timestamp of the most recently received frame is greater than or equal to the time offset.
8. A device for synchronizing multimedia data with interactive data associated therewith, the device comprising:
the receiving module is used for receiving an issuing request aiming at the interactive data from an operation management platform, and receiving the interactive data and the coding configuration for the multimedia data from the operation management platform;
the computing module is used for computing the time offset between the time stamp of the encoded multimedia data and the insertion time point of the interactive data based on the encoding configuration; and
and the inserting module is used for inserting the interactive data into the multimedia data in an access layer of a multimedia data processing background based on the time offset.
9. The device of claim 8, wherein the calculation module calculates a time offset between the encoded timestamp of the multimedia data and an insertion time point of the interaction data based on the encoding configuration comprises:
the calculation module calculates the time offset by a difference between an encoding time of the multimedia data and a delay time of receiving the interactive data.
10. The device of claim 8, wherein the calculation module calculates a time offset between the encoded timestamp of the multimedia data and the insertion time point of the interactive data based on the encoding configuration further comprises:
the computing module synchronizes and stores the coding configuration in the operation management platform in advance.
11. The apparatus of claim 9, wherein the calculating module calculates the time offset by a difference between an encoding time of the multimedia data and a delay time of receiving interactive data further comprises:
the calculation module sets the encoding time of the multimedia data based on the encoding configuration parameters.
12. The device of claim 11, wherein the computing module sets an encoding time for the multimedia data based on an encoding configuration parameter further comprises:
the calculation module sets an encoding time of the multimedia data based on a weighted sum of at least two encoding configuration parameters.
13. The apparatus according to any of claims 8-11, wherein the encoding configuration parameters in the encoding configuration include a code rate, a frame rate, and a group of pictures.
14. The device of claim 8, wherein the insertion module inserts the interactive data into the multimedia data in an access stratum of a multimedia data processing background based on the time offset further comprises:
the insertion module judges whether the latest received frame of the multimedia data is a frame which the interactive data intends to be synchronously transmitted with through whether the timestamp of the latest received frame is larger than or equal to the time offset, and inserts the interactive data into the multimedia data in an access layer of a live streaming background at the time when the latest received timestamp starts when the timestamp of the latest received frame is larger than or equal to the time offset.
15. A computer readable storage medium comprising computer program instructions for synchronization of multimedia data with interactive data in a multimedia data push, which when executed by a processor causes the processor to perform the method according to any of claims 1-7.
CN201910168672.6A 2019-03-06 2019-03-06 Method and device for synchronizing multimedia data and associated interactive data thereof Active CN111669605B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910168672.6A CN111669605B (en) 2019-03-06 2019-03-06 Method and device for synchronizing multimedia data and associated interactive data thereof

Publications (2)

Publication Number Publication Date
CN111669605A true CN111669605A (en) 2020-09-15
CN111669605B CN111669605B (en) 2021-10-26

Family

ID=72381374

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910168672.6A Active CN111669605B (en) 2019-03-06 2019-03-06 Method and device for synchronizing multimedia data and associated interactive data thereof

Country Status (1)

Country Link
CN (1) CN111669605B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113473163A (en) * 2021-05-24 2021-10-01 康键信息技术(深圳)有限公司 Data transmission method, device, equipment and storage medium in network live broadcast process
CN114461423A (en) * 2022-02-08 2022-05-10 腾讯科技(深圳)有限公司 Multimedia stream processing method, device, storage medium and program product
CN115243096A (en) * 2022-07-27 2022-10-25 北京字跳网络技术有限公司 Live broadcast room display method and device, electronic equipment and storage medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040179554A1 (en) * 2003-03-12 2004-09-16 Hsi-Kang Tsao Method and system of implementing real-time video-audio interaction by data synchronization
WO2015040239A1 (en) * 2013-09-23 2015-03-26 Pajouh Darius Vahdat Synchronization of events and audio or video content during recording and playback of multimedia content items
CN105100954A (en) * 2014-05-07 2015-11-25 朱达欣 Interactive response system and method based on Internet communication and streaming media live broadcast
CN107743252A (en) * 2017-11-01 2018-02-27 创盛视联数码科技(北京)有限公司 A kind of method for reducing live delay
CN108600785A (en) * 2018-05-10 2018-09-28 闪玩有限公司 The synchronous method and computer readable storage medium of video streaming sub-routine
CN108900875A (en) * 2018-07-27 2018-11-27 北京感动无限科技有限公司 Dispositions method and device for broadcasting content application
CN109118854A (en) * 2017-06-22 2019-01-01 格局商学教育科技(深圳)有限公司 A kind of panorama immersion living broadcast interactive teaching system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WANG, XUEJUN: "Research on Video-Interactive Synchronous Classroom Teaching Mode", China Educational Technology *


Also Published As

Publication number Publication date
CN111669605B (en) 2021-10-26

Similar Documents

Publication Publication Date Title
US11627351B2 (en) Synchronizing playback of segmented video content across multiple video playback devices
US10798440B2 (en) Methods and systems for synchronizing data streams across multiple client devices
US10911512B2 (en) Personalized content streams using aligned encoded content segments
US10187668B2 (en) Method, system and server for live streaming audio-video file
WO2019205886A1 (en) Method and apparatus for pushing subtitle data, subtitle display method and apparatus, device and medium
US11201903B1 (en) Time synchronization between live video streaming and live metadata
US9560421B2 (en) Broadcast and broadband hybrid service with MMT and DASH
CN100515079C (en) An implementation method for picture-in-picture in IPTV
US10887646B2 (en) Live streaming with multiple remote commentators
KR20170074866A (en) Receiving device, transmitting device, and data processing method
CN111669605B (en) Method and device for synchronizing multimedia data and associated interactive data thereof
CN113661692B (en) Method, apparatus and non-volatile computer-readable storage medium for receiving media data
CN113141522B (en) Resource transmission method, device, computer equipment and storage medium
US8935432B1 (en) Clock locking for live media streaming
CN101909046A (en) Multimedia transcoding server and multimedia transcoding system
CN107690093B (en) Video playing method and device
US10652625B1 (en) Synchronization of multiple encoders for streaming content
CN108696762A (en) A synchronized playback method, device and system
CN111416994B (en) Method and device for synchronously presenting video stream and tracking information and electronic equipment
van Deventer et al. Media synchronisation for television services through HbbTV
CN111800649A (en) Method and device for storing video and method and device for generating video
US10812558B1 (en) Controller to synchronize encoding of streaming content
US10652292B1 (en) Synchronization of multiple encoders for streaming content
US11856242B1 (en) Synchronization of content during live video stream
CN117319692A (en) Time shift playing method and device, computing equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40028939

Country of ref document: HK

GR01 Patent grant