WO2013048484A1 - Quality of experience enhancements over wireless networks - Google Patents
- Publication number
- WO2013048484A1 (PCT/US2011/054406)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- parameters
- network
- multimedia
- adaptive streaming
- wireless
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W28/00—Network traffic management; Network resource management
- H04W28/16—Central resource management; Negotiation of resources or communication parameters, e.g. negotiating bandwidth or QoS [Quality of Service]
- H04W28/24—Negotiating SLA [Service Level Agreement]; Negotiating QoS [Quality of Service]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/20—Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
- H04N21/23—Processing of content or additional data; Elementary server operations; Server middleware
- H04N21/234—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
- H04N21/2343—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements
- H04N21/23439—Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements for generating different versions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/1066—Session management
- H04L65/1083—In-session procedures
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/60—Network streaming of media packets
- H04L65/61—Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio
- H04L65/612—Network streaming of media packets for supporting one-way streaming services, e.g. Internet radio for unicast
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/60—Network streaming of media packets
- H04L65/75—Media network packet handling
- H04L65/762—Media network packet handling at the source
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L65/00—Network arrangements, protocols or services for supporting real-time applications in data packet communication
- H04L65/80—Responding to QoS
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/02—Protocols based on web technology, e.g. hypertext transfer protocol [HTTP]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/4104—Peripherals receiving signals from specially adapted client devices
- H04N21/4122—Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/414—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance
- H04N21/4143—Specialised client platforms, e.g. receiver in car or embedded in a mobile appliance embedded in a Personal Computer [PC]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/436—Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
- H04N21/4363—Adapting the video stream to a specific local network, e.g. a Bluetooth® network
- H04N21/43637—Adapting the video stream to a specific local network, e.g. a Bluetooth® network involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/60—Network structure or processes for video distribution between server and client or between remote clients; Control signalling between clients, server and network components; Transmission of management data between server and client, e.g. sending from server to client commands for recording incoming content stream; Communication details between server and client
- H04N21/63—Control signaling related to video distribution between client, server and network components; Network processes for video distribution between server and clients or between remote clients, e.g. transmitting basic layer and enhancement layers over different transmission paths, setting up a peer-to-peer communication via Internet between remote STB's; Communication protocols; Addressing
- H04N21/643—Communication protocols
- H04N21/6437—Real-time Transport Protocol [RTP]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/83—Generation or processing of protective or descriptive data associated with content; Content structuring
- H04N21/845—Structuring of content, e.g. decomposing content into time segments
- H04N21/8456—Structuring of content, e.g. decomposing content into time segments by decomposing the content in the time domain, e.g. in time segments
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/18—Information format or content conversion, e.g. adaptation by the network of the transmitted or received information for the purpose of wireless delivery to users or terminals
Definitions
- Embodiments pertain to wireless communications. Some embodiments relate to the use of wireless multimedia communications, and Quality of Experience (QoE) techniques implemented within wireless networks and services.
- Some approaches to user QoE optimization for multimedia services implement resource management strategies at the lower layers of network communications (e.g., the PHY, MAC, network, and transport layers) by considering the specific characteristics of the applications. In many cases, however, the PHY/MAC/NET layers in existing networks remain agnostic of dynamically varying application-layer requirements and characteristics, and only aim to optimize link quality subject to certain target Quality of Service (QoS) requirements.
- Implemented QoS classes and associated service attributes generally do not accommodate QoE-related metrics for application-level multimedia processing, nor are multimedia streams generally prioritized or adapted in a content-aware fashion to optimize QoE. Further, networks typically do not pass any content-specific information regarding the multimedia processing at the codec to the wireless network, or otherwise enable cross-layer coordination capabilities.
- FIG. 1 illustrates an adaptive streaming client architecture used in accordance with example embodiments
- FIG. 2 illustrates a QoS-aware network architecture for implementing adaptive streaming according to an example embodiment
- FIG. 3 illustrates a cross-layer adaptive streaming client adaptation configuration used in accordance with example embodiments
- FIG. 4 illustrates a video streaming optimization configuration provided to multiple video receivers according to one example embodiment
- FIG. 5 illustrates a method for performing a QoE optimization procedure with video streaming according to one example embodiment
- FIG. 6 illustrates a messaging and channel access configuration for streaming video from multiple sources according to one example embodiment
- FIG. 7 illustrates a method for optimizing adaptive streaming network communications at a plurality of network levels according to one example embodiment.
- Several of the embodiments described herein provide techniques for QoE-driven cross-layer optimization of network communications, such as in wireless networks enabling the distribution of multimedia content.
- Some example embodiments include the configuration and use of a cross-layer optimized (and QoE-driven) client adaptation architecture to configure network communication parameters. These communication parameters may include various data, video, radio, network, and transport level parameters for implementing QoE with multimedia streaming services, such as Real Time Streaming Protocol (RTSP)-based or Dynamic Adaptive Streaming over HTTP (DASH)/HTTP-based adaptive streaming services.
- QoE differs in various respects from QoS, and therefore is not fully addressed by existing QoS techniques implemented within network communication architectures.
- QoS generally provides mechanisms to ensure that data is communicated between two points (and prioritized, as appropriate) to meet network performance targets for metrics such as packet loss, bit rate, jitter, and latency.
- QoE, in contrast, generally implements mechanisms relating to the quality of the data itself being transferred.
- QoE may relate to quality of audio or video being played back to a user, which may be unsatisfactory even if the QoS for delivery of the corresponding audio or video data is satisfactory.
- The target QoS parameters for the core network and radio access network may be derived independently of multimedia-specific application layer parameters for multimedia streaming services, receiver device/display capabilities, or physical link conditions.
- QoE-driven cross-layer optimization for multimedia communications may be provided through various resource management strategies at lower networking model layers (e.g., the PHY, MAC, network, and transport layers) by considering the specific characteristics of video and multimedia applications.
- QoE optimization may also be implemented by adapting video compression and streaming algorithms after taking into account the mechanisms provided by the lower layers for error control and resource allocation.
- Two of the capabilities enabled by cross-layer optimizations include:
- PHY/MAC/NET layer-aware content adaptation at the codec level using adaptation parameters such as bit rate, resolution, frame rate, and the like, to enable a streaming service to adapt its content characteristics to varying network conditions (e.g., changing resource availability, or the time-varying nature of the wireless channel).
- Various content adaptation strategies are performed to ensure the highest possible QoE while maintaining interruption-free playback of the multimedia. This capability is known as "adaptive streaming" (a brief rate-selection sketch follows this list).
- Knowledge of the characteristics of the video stream can allow for performing distortion-aware channel access prioritization at the PHY/MAC/NET layer to enhance video quality.
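- A minimal sketch of such a content adaptation decision is shown below; the bitrate ladder, safety margin, and buffer threshold are illustrative assumptions rather than values specified by this disclosure, and a real client would combine many more inputs (display capabilities, codec constraints, and the like).

```python
# Minimal sketch of client-side adaptive streaming rate selection.
# The bitrate ladder, safety margin, and buffer threshold below are
# illustrative assumptions, not values taken from this disclosure.

REPRESENTATIONS_KBPS = [400, 800, 1500, 3000, 6000]  # available encodings

def select_representation(throughput_kbps, buffer_sec,
                          safety=0.8, min_buffer_sec=5.0):
    """Pick the highest bitrate that fits the measured throughput.

    If the playback buffer is running low, drop to the lowest bitrate
    to keep playback interruption-free.
    """
    if buffer_sec < min_buffer_sec:
        return REPRESENTATIONS_KBPS[0]
    budget = throughput_kbps * safety
    fitting = [r for r in REPRESENTATIONS_KBPS if r <= budget]
    return fitting[-1] if fitting else REPRESENTATIONS_KBPS[0]

# Example: 2.2 Mbps measured throughput, 12 seconds of buffered media.
print(select_representation(2200, 12.0))  # -> 1500
```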
- PHY/MAC/NET layers in most networks only attempt to optimize link quality subject to QoS requirements, using parameters such as throughput, latency/jitter, packet error/loss rate, and so forth. Also, due to layer independence and separation, and the limitations of QoS as previously described, existing QoS classes and associated service attributes do not accommodate QoE-related metrics for application-level multimedia processing, nor do they prioritize the multimedia streams in a content-aware fashion.
- Existing network configurations generally do not pass content-specific information (e.g., rate-distortion characteristics of the video stream, associated video quality metrics, and the like) regarding the multimedia processing at the codec (application) level to wireless networks.
- New cross-layer coordination capabilities and signaling mechanisms may be used to enable exchanging application-level information for QoE-aware radio and wireless multimedia network adaptation, and for resource management for one or more service flows.
- The various embodiments described herein disclose techniques and configurations that provide adaptive services for wireless networks to enable such content-awareness and enhanced QoE. Both conversational and streaming services may be enhanced using the techniques described herein.
- The techniques described herein are applicable for unicast, multicast, and broadcast multimedia delivery methods.
- The proposed techniques are also applicable in heterogeneous environments that require delivery of multimedia content such as video over multiple air interfaces.
- Wireless Display (WiDi) technology relies on local peer-to-peer (P2P) wireless connectivity over a Wireless Local Area Network (WLAN) or Wireless Personal Area Network (WPAN)-based air interface (e.g., Wi-Fi P2P, Wi-Fi Alliance Wi-Fi Display, WiDi Direct, myWi-Fi, 60 GHz technology, and the like) to transfer data between multimedia devices, such as a computer and a television.
- FIG. 1 depicts an example configuration of an adaptive streaming network architecture configured to deliver multimedia content via a packet-switched streaming service (PSS) from a PSS server 102 to a PSS client 112.
- The following illustrates multimedia content transmission via a 3GPP Long Term Evolution (LTE) or Long Term Evolution Advanced (LTE-A) network configuration, although any of a number of wireless network standards and protocols may be similarly configured for use in a streaming network architecture.
- The multimedia content data is delivered from the PSS server 102 and the public network 104 (e.g., the Internet) to a core network 106, and transmitted from the core network 106 through an access network 108.
- The core network 106 and access network 108 exist within the LTE IP network 120, e.g., an internal IP network maintained by a telecommunications provider.
- The access network 108 provides network connectivity between the core network 106 and the wirelessly transmitting access point/base station/eNodeB 110 within an LTE wireless network 122, e.g., a wireless network provided by a telecommunications provider.
- The multimedia content data is transmitted from the access network 108 to the base station/eNodeB 110, broadcasted via a wireless communication (e.g., a cellular data transmission) from the base station/eNodeB 110 via the LTE wireless network 122, and received at a mobile station/user equipment (UE), such as the receiving computing device 114 operating the PSS client 112.
- The PSS client 112 may use a WiFi P2P network 124 to further transmit the streaming multimedia content to another device, such as the user's television 116.
- This final transmission to the television 116 via the WiFi P2P network 124 may involve the use of a wireless multimedia connection standard such as WiDi, and a WiDi application 118 operating on the receiving computing device 114 and the television 116.
- WiDi may be used not only to communicate multimedia content to output devices such as televisions, but also to communicate multimedia content from input devices such as video cameras.
- Another example use case of WiDi includes a video conferencing application (e.g., Skype) running over cellular-enabled client devices (e.g., user equipment (UE)), corresponding to conversational and streaming video services (e.g., services provided over an IP multimedia subsystem (IMS)). In such a case, video may also be signaled over a WiFi P2P connection from the UE to the WiDi adapter (in addition to the cellular network).
- A wireless multimedia delivery network is not limited to use of a cellular network, but may involve a variety of other wireless standards and configurations, including but not limited to a Wireless Wide Area Network (WWAN), a WLAN, a WPAN, an unmanaged WiFi network, or a TV broadcast network (e.g., DVB).
- A series of adaptive streaming services are provided to enable QoE via wireless multimedia networking configurations.
- The adaptive streaming services enhancing QoE via cross-layer optimization may include one or more of:
- -An end-to-end QoS architecture for adaptive multimedia streaming in which the target QoS parameters for the core network and/or radio access network may be derived from multimedia-specific and application-layer parameters, determined from values such as those provided by the session description protocol (SDP) for RTSP-based adaptive streaming or the media presentation description (MPD) metadata for HTTP-based adaptive streaming, as well as from receiver device/display capabilities and physical link conditions;
- -A CSMA/CA-based multimedia QoS and traffic prioritization framework in which access categories and associated system parameters (e.g., Arbitration Inter-Frame Space Number (AIFSN), Contention Window (CW), and Transmit Opportunity (TXOP) parameters) for HCF Controlled Channel Access (HCCA) or Enhanced Distributed Channel Access (EDCA) may be determined based on QoE-optimized mapping functions derived from the multimedia-specific application-layer parameters in the SDP or MPD for RTSP/HTTP-based adaptive streaming services, receiver device/display capabilities, and physical link conditions; and
- -Client device configurations to manage the streaming session, modify session parameters (e.g., derive new RTSP/SDP session parameters), adapt video parameters (e.g., bitrate, resolution, frame rate, etc.), prioritize traffic, allocate resources, and optimize bandwidth/QoS for its local connections (e.g., WiDi links) based on multimedia information gathered from session-level signaling (e.g., SDP or MPD signaling) over the other video delivery networks (e.g., 3GPP, Wi-Fi, or digital video broadcasting networks) using Session Initiation Protocol (SIP), RTSP, or HTTP protocols, including codec information, quality requirements, and rate-distortion characteristics.
- The present disclosure provides techniques to optimize channel access among concurrent wireless multimedia network applications for delivering the best possible QoE of multimedia content.
- This may provide enhanced operations applicable to: 1) Multiple concurrent multimedia network adaptive streaming applications (e.g., via a WiDi connection) carrying different multimedia content or displayed on different screens; 2) Wireless webcam and video conferencing over a multimedia wireless network; and 3) Internet video streaming over a multimedia wireless network, such as with use of adaptive HTTP streaming services, to one or more displays.
- Streaming protocols may be used in conjunction with the presently disclosed cross-layer optimization techniques. These streaming protocols include:
- Example technologies using RTSP-based streaming include Microsoft Windows Media, Apple QuickTime, Adobe Flash, and Real Networks Helix. Some implementations of WiDi also use RTSP-based streaming.
- The HTTP protocol is a stateless protocol: when a client requests data, the server responds by sending the data, and then the transaction is terminated.
- Each HTTP request is handled as a completely standalone, one-time transaction.
- HTTP-based progressive download methods may also be used for media delivery from standard Web servers. In HTTP-based progressive download, supported clients can seek positions in a media file by performing byte range requests to the Web server.
- Some of the disadvantages of HTTP-based progressive download include that (i) bandwidth may be wasted if the user decides to stop watching the content after the progressive download has started (e.g., by switching to other content), (ii) the download is not bitrate adaptive, and (iii) the download does not support live media services.
- DASH addresses some of the weaknesses of Real-time Transport Protocol (RTP)/RTSP-based streaming and HTTP-based progressive downloads.
- DASH provides the ability to move control of a "streaming session" entirely to the client and therefore moves the adaptive streaming intelligence from the server to the client.
- The client may open one or more TCP connections to one or several standard HTTP servers or caches, retrieve the MPD metadata file providing information on the structure and different versions of the media content stored in the server (including different bitrates, frame rates, resolutions, codec types, etc.), and request smaller segments of the selected version of the media file with individual HTTP messages (to imitate streaming via short downloads).
- DASH provides the client with the ability to automatically choose an initial content rate to match the initial available bandwidth without requiring a negotiation with the streaming server.
- DASH further provides the ability to dynamically switch between different bitrate representations of the media content as the available bandwidth changes.
- DASH allows faster adaptation to changing network and wireless link conditions, user preferences, and device capabilities (e.g., display resolution, CPU, memory resources, etc.).
- Such dynamic adaptation may enable an improved user quality of experience (QoE), with shorter startup delays, fewer re-buffering events, and the like.
- Example DASH technologies include Microsoft Internet Information Services (IIS) Smooth Streaming, Apple HTTP Live Streaming, and Adobe HTTP Dynamic Streaming.
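- The client-driven operation described above may be illustrated with the following hypothetical sketch of a DASH-style retrieval loop: it reads representation bitrates from a stubbed MPD structure, picks a version for each segment based on the measured download rate, and issues one request per segment. The MPD layout, URLs, segment count, and the fetch() stub are invented for illustration and do not reflect any particular server or library API.

```python
# Hypothetical sketch of a DASH-style client loop. The MPD structure, URL
# template, segment count, and fetch() stub are illustrative assumptions.
import time

mpd = {  # stand-in for parsed MPD metadata
    "segment_count": 3,
    "representations": [  # ordered from lowest to highest bitrate
        {"id": "low",  "bandwidth_bps": 400_000,
         "template": "http://example.invalid/low/seg-{n}.m4s"},
        {"id": "high", "bandwidth_bps": 3_000_000,
         "template": "http://example.invalid/high/seg-{n}.m4s"},
    ],
}

def fetch(url):
    """Placeholder for an HTTP GET; returns (payload_bytes, elapsed_sec)."""
    time.sleep(0.01)          # stand-in for network delay
    return 250_000, 0.5       # pretend 250 kB arrived in 0.5 s

throughput_bps = 1_000_000    # initial guess of available bandwidth
for n in range(1, mpd["segment_count"] + 1):
    # choose the best representation that fits the current estimate
    fitting = [r for r in mpd["representations"]
               if r["bandwidth_bps"] <= throughput_bps]
    rep = fitting[-1] if fitting else mpd["representations"][0]
    size, elapsed = fetch(rep["template"].format(n=n))
    throughput_bps = 8 * size / elapsed      # update the bandwidth estimate
    print(f"segment {n}: {rep['id']} ({throughput_bps / 1e6:.1f} Mbps est.)")
```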
- FIG. 2 provides an illustration of a QoS-aware network architecture, configured to access multimedia content provided by a multimedia server 102 from a public network 104 (e.g., the Internet) via a series of network interfaces.
- Communications provided for the multimedia content within the non-wireless IP network 204 include the transfer of data from the public network 104 via the core network 106, and the transfer of data from the core network 106 via the access network 108.
- Communications provided for the multimedia content within the wireless network 206 include the transfer of data from the access network 108 to the access point/base station/eNodeB 110, and the transfer of data from the access point/base station/eNodeB 110 wirelessly to the receiving computing device 114 (a mobile station/user equipment).
- QoS parameters 202 for the non-wireless IP network 204 and the wireless network 206 may be derived based on multimedia-specific application-layer parameters. These derived QoS parameters 202 may then be provided to the various components and interfaces within the non-wireless IP network 204 and the wireless network 206, including the core network 106, the access network 108, and the wireless network interface operated by the access point/base station/eNodeB 110.
- Each interface defines a set of QoS classes or access categories (ACs) (e.g., the best effort (AC_BE), background (AC_BK), voice (AC_VO), and video (AC_VI) access categories of the WiFi Multimedia (WMM) standard, as part of enhanced distributed channel access (EDCA), which extends the distributed coordination function (DCF)), and specifies associated service attributes in terms of various performance requirements such as throughput, latency/jitter, packet error-loss rate, and the like (e.g., via TSPECs, etc.).
- The QoS classes/ACs enable the differentiation of the service flows between client applications and various services.
- Each service flow is mapped to a specific QoS class and receives a common QoS treatment. This allows service flows to be prioritized accordingly, when resources are distributed between different service flows through scheduling functions.
- QoS definitions that may be used in the IP network 204, specifically in the core network 106 and access network 108, include Integrated Services (IntServ) and Differentiated Services (DiffServ).
- IntServ follows the flow-based and signaled QoS model, where the end-hosts signal QoS needs to the network, while DiffServ works on the provisioned-QoS model, where network elements are set up to service multiple classes of traffic with varying QoS requirements.
- DiffServ uses the 6-bit Differentiated Services Code Point (DSCP) field in the header of IP packets for packet classification purposes.
- The IntServ model relies on the Resource Reservation Protocol (RSVP) to explicitly signal and reserve the desired QoS for each flow in the network, described by the FlowSpecs.
- A convergence sub-layer is defined to interface higher-layer protocol data units and perform classification and mapping functions. For example, in the case of DiffServ, each end-to-end IP packet entering the system is identified with a dedicated air interface AC for the radio access network, by mapping its DSCP over the core network from DiffServ to a particular QoS class for the radio access network.
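- As a simple illustration of such classification and mapping, the sketch below maps DiffServ code points to WMM access categories; the particular DSCP groupings (e.g., EF treated as voice, AF4x as video) follow a common convention and are an assumption, not a mapping mandated by this disclosure.

```python
# Illustrative DiffServ-to-WMM mapping sketch; the DSCP groupings below
# follow a common convention and are an assumption, not a normative mapping.

AC_VO, AC_VI, AC_BE, AC_BK = "AC_VO", "AC_VI", "AC_BE", "AC_BK"

def dscp_to_access_category(dscp):
    """Map a 6-bit DSCP value from the IP header to a WMM access category."""
    if dscp == 46:                # EF (expedited forwarding) -> voice
        return AC_VO
    if dscp in (34, 36, 38):      # AF41/AF42/AF43 -> video
        return AC_VI
    if dscp in (8, 10, 12, 14):   # CS1/AF1x -> background
        return AC_BK
    return AC_BE                  # default / best effort

print(dscp_to_access_category(34))   # -> AC_VI
```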
- The configuration illustrated in FIG. 2 therefore enables an end-to-end QoS architecture for adaptive streaming in which target QoS parameters for the core network and/or radio access network (including local P2P connections) are derived. These values may be derived based on multimedia-specific application-layer parameters, such as from SDP values for RTSP-based adaptive streaming or MPD values for HTTP-based adaptive streaming, as well as based on the receiver device/display capabilities and physical link conditions.
- The IP and wireless network devices may be configured to (i) have the ability to parse the SDP or MPD values in order to extract multimedia-specific application layer information for a given streaming session, (ii) exchange information on the receiver device/display capabilities and/or physical link conditions, and (iii) derive target video adaptation parameters and QoS parameters for the core network and radio access network.
- IP and wireless network devices may be configured to perform mapping from the multimedia-specific application-layer information contained in the SDP or MPD values (or from any similar metadata format carrying multimedia information), or receiver device/display capabilities, or physical link conditions to the following (a combined mapping sketch follows this list):
- 1) The set of video adaptation parameters (e.g., bitrate, resolution, frame rate, etc.) and QoS parameters for the core network (e.g., DiffServ/DSCP parameters, IntServ/FlowSpecs parameters, etc.);
- 2) The set of video adaptation parameters (e.g., bitrate, resolution, frame rate, etc.) and QoS parameters for the radio access network (e.g., QoS class or access category (AC) parameters, TSPECs, etc.); and
- 3) The set of video adaptation parameters (e.g., bitrate, resolution, frame rate, etc.) and QoS parameters for the local P2P network among client devices (e.g., a WiFi P2P network as in the WiDi use case), such as QoS class or access category (AC) parameters, TSPECs, and the like.
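- The following sketch combines the mappings listed above into a single hypothetical decision function; all field names, thresholds, and the DSCP/AC choices are assumptions made for illustration, not parameters specified by this disclosure.

```python
# Hypothetical sketch of mapping SDP/MPD-derived multimedia parameters,
# display capabilities, and link conditions to video adaptation and QoS
# parameters. Field names, thresholds, and markings are assumptions.

def derive_parameters(media, display, link):
    # Cap the video bitrate and resolution at what the link and display support.
    bitrate = min(media["max_bitrate_kbps"], int(link["capacity_kbps"] * 0.8))
    bitrate = max(bitrate, media["min_bitrate_kbps"])
    width = min(media["resolution"][0], display["resolution"][0])
    height = min(media["resolution"][1], display["resolution"][1])

    # Conversational video gets a stricter marking than stored streaming.
    if media["app_type"] == "video_conferencing":
        dscp, access_category = 34, "AC_VI"   # AF41-style marking (assumed)
    else:
        dscp, access_category = 26, "AC_VI"   # AF31-style marking (assumed)

    return {
        "video": {"bitrate_kbps": bitrate, "resolution": (width, height),
                  "frame_rate": min(media["frame_rate"], display["max_fps"])},
        "core_network_qos": {"dscp": dscp},
        "radio_access_qos": {"access_category": access_category},
    }

params = derive_parameters(
    media={"app_type": "streaming", "min_bitrate_kbps": 500,
           "max_bitrate_kbps": 6000, "resolution": (1920, 1080),
           "frame_rate": 30},
    display={"resolution": (1280, 720), "max_fps": 60},
    link={"capacity_kbps": 4000},
)
print(params["video"])   # bitrate and resolution capped to link/display limits
```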
- The network devices may signal the SDP or MPD values (or any such metadata carrying multimedia information) as well as the receiver device/display capabilities to other network devices in order to share adaptive streaming related session information with the appropriate entities in the network.
- The decisions on the video adaptation parameters and QoS parameters (QoS class or access category (AC) parameters, etc.) for all clients sharing the resources/spectrum can be made jointly in order to find the QoE-optimizing traffic prioritization among the clients in a coordinated fashion.
- The set of multimedia-specific application-layer parameters provided by the SDP, MPD, or any other similar metadata format can include one or more of the following multimedia parameters:
- - Type of multimedia application, e.g., video conferencing, real-time video streaming, video downloading/uploading, stored or internet-streamed video, DVD or Blu-ray video playback, etc.
- Multimedia bitrate, resolution, and frame rate information including a maximum bitrate above which the perceived quality improvement is negligible, and a minimum bitrate to achieve the lowest acceptable quality.
- Multimedia codec information, e.g., codec type such as AMR, MPEG-4, H.264 AVC/SVC, etc., possibly also describing profiles and levels.
- Multimedia quality metrics specified at different bitrates, frame rates and resolutions such as reference, reduced-reference or non-reference metrics, e.g., video quality metrics (VQM), structural similarity metrics (SSIM), perceptual evaluation of video quality metrics (PEVQ), video mean opinion scores (MOS), and other subjective quality metrics.
- Group of pictures (GOP) structure and frame type information (e.g., I-frame, P-frame, B-frame, etc.)
- Layer type for scalable video coding (e.g., base layer, enhancement layer, etc.)
- As an enhancement to support QoS, EDCA differentiates packets using different priorities and maps them to specific ACs that are buffered in separate queues at a station.
- Each AC i within a station has its own EDCA parameters and contends for channel access independently of the others.
- The AC-specific EDCA parameters include the arbitration inter-frame space (AIFS), the contention window (CW), and the transmit opportunity (TXOP) limit.
- The channel access probability differentiation is provided by using: a) different AIFSs instead of the constant distributed IFS (DIFS) used in DCF, and b) different values for the minimum/maximum CWs to be used for the backoff time extraction.
- AIFSN Prioritization: If there is a packet ready for transmission in the MAC queue of an AC, the EDCA function will sense the channel to be idle for a complete AIFS before it can start the transmission or backoff countdown.
- The AIFS of AC i may be determined as follows:
- AIFS_i = SIFS + AIFSN_i * T_slot
- where AIFSN_i is the AC-specific AIFS number corresponding to AC i, SIFS is the length of the short inter-frame space, and T_slot is the duration of a time slot.
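- The AIFS relation above can be computed directly, as in the sketch below; the SIFS and slot durations correspond to typical 802.11 OFDM values and the per-AC AIFSN values follow common EDCA defaults, all used here only as illustrative assumptions.

```python
# AIFS_i = SIFS + AIFSN_i * T_slot, computed for each access category.
# SIFS/slot durations and AIFSN defaults are typical 802.11 OFDM values,
# used here only as illustrative assumptions.

SIFS_US = 16     # short inter-frame space, microseconds
T_SLOT_US = 9    # slot duration, microseconds

AIFSN = {"AC_VO": 2, "AC_VI": 2, "AC_BE": 3, "AC_BK": 7}

def aifs_us(access_category):
    return SIFS_US + AIFSN[access_category] * T_SLOT_US

for ac in AIFSN:
    print(f"{ac}: AIFS = {aifs_us(ac)} us")
# Smaller AIFSN (higher priority) -> shorter wait before contention.
```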
- CW Prioritization: If the channel is idle when the first packet arrives at the AC i queue, the packet can be directly transmitted as soon as the channel is sensed to be idle for AIFS_i. Otherwise, a backoff procedure is completed following the completion of the AIFS, before the transmission of this packet.
- A uniformly distributed random integer, namely a backoff value, is selected from the range [0, W_i].
- The backoff counter is decremented at the slot boundary if the previous time slot is idle. Should the channel be sensed busy at any time slot during AIFS or backoff, the backoff procedure is suspended at the current backoff value. The backoff resumes as soon as the channel is sensed to be idle for AIFS again. When the backoff counter reaches zero, the packet is transmitted in the following slot.
- The value of W_i depends on the number of retransmissions the current packet has experienced.
- The initial value of W_i is set to CWmin_i. If the transmitter cannot receive an Acknowledgment (ACK) packet from the receiver in a timeout interval, the transmission is labeled as unsuccessful and the packet is scheduled for retransmission. At each unsuccessful transmission, the value of W_i is doubled until CWmax_i is reached. The value of W_i is reset to CWmin_i if the transmission is successful; if the packet retransmission limit is reached, the packet is dropped.
- The ACs with higher priority are assigned a smaller AIFSN value. Therefore, the ACs with higher priority can either transmit or decrement their backoff counters while ACs with lower priority are still waiting in AIFS. This results in ACs with higher priority enjoying relatively faster progress through backoff slots. Moreover, the ACs with higher priority may select backoff values from a comparably smaller CW range. This approach prioritizes the access because a smaller CW value means a smaller backoff delay before the transmission.
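- The contention window behavior described above (a backoff value drawn uniformly from [0, W_i], doubling of W_i toward CWmax_i after an unsuccessful transmission, and reset to CWmin_i on success or drop) can be captured in a short sketch; the CWmin/CWmax values and retry limit below are illustrative assumptions.

```python
# Sketch of per-AC contention window management as described above.
# CWmin/CWmax values and the retry limit are illustrative assumptions.
import random

class AccessCategoryBackoff:
    def __init__(self, cw_min, cw_max, retry_limit=7):
        self.cw_min, self.cw_max, self.retry_limit = cw_min, cw_max, retry_limit
        self.w = cw_min            # current contention window W_i
        self.retries = 0

    def draw_backoff(self):
        # uniformly distributed backoff value selected from [0, W_i]
        return random.randint(0, self.w)

    def on_transmission_result(self, ack_received):
        if ack_received:
            self.w, self.retries = self.cw_min, 0      # success: reset window
        else:
            self.retries += 1
            if self.retries >= self.retry_limit:
                self.w, self.retries = self.cw_min, 0  # drop packet, reset
            else:
                self.w = min(2 * self.w + 1, self.cw_max)  # double the window

# Higher-priority video traffic gets a smaller CW range than best effort.
video = AccessCategoryBackoff(cw_min=15, cw_max=31)
best_effort = AccessCategoryBackoff(cw_min=31, cw_max=1023)
print(video.draw_backoff(), best_effort.draw_backoff())
```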
- TXOP Prioritization: A TXOP is a bounded time interval during which a station can send as many frames as possible, as long as the duration of the transmissions does not extend beyond the maximum duration of the TXOP.
- Each AC i may carry out multiple frame exchange sequences as long as the total access duration does not exceed MaxTXOP_i.
- The transmissions are separated by SIFS. Multiple frame transmissions in a TXOP can reduce the overhead due to contention.
- A TXOP limit of zero corresponds to only one frame exchange per access.
- The ACs with higher priority may use a nonzero TXOP to increase their channel access time, with TXOP durations ranked according to the AC priority (i.e., the highest priority AC may have the largest TXOP).
- The previously described CSMA/CA-based multimedia QoS and traffic prioritization framework determines the access categories and associated system parameters (e.g., AIFSN, CW, and TXOP parameters) for EDCA or HCCA. These values may be implemented based on QoE-optimized mapping functions derived from the multimedia-specific application-layer parameters in the SDP or MPD values (or any other similar metadata format), receiver device/display capabilities, or physical link conditions for RTSP/HTTP-based adaptive streaming services.
- The network devices may signal the SDP or MPD information as well as the receiver device/display capabilities to other network devices in order to share adaptive streaming related session information with the appropriate entities in the network.
- The decisions on the QoS parameters (QoS class or access category (AC) parameters, etc.) for all clients sharing the resources/spectrum can be made jointly in order to find the QoE-optimizing traffic prioritization among the clients in a coordinated fashion.
- A cross-layer optimized platform adaptation architecture is defined for adaptive streaming, in which video, transport, and radio components in the platform cooperate and exchange information towards identifying platform configurations needed to optimize user QoE.
- An example client adaptation architecture 302, illustrated against a series of associated Open Systems Interconnection (OSI) communication layers and protocols 300, is depicted in FIG. 3. As illustrated, a cross-layer adaptation manager 304 extending across each of the OSI communication layers is operable with each of the following system components (a structural sketch follows this list):
- Radio Adaptation and QoS Engine 320: Determines radio-level adaptation and QoS parameters;
- Network Adaptation and QoS Engine 318: Determines network-level adaptation and QoS parameters;
- RTSP/HTTP Access Client 316: Handles transport-level operations for the streaming session;
- Adaptive Streaming Control Engine 312: Parses the SDP or MPD parameters and determines streaming parameters for adaptive streaming (e.g., DASH segment duration, sequence and timing of HTTP requests, etc.);
- Media Adaptation Engine 314: Determines codec-level adaptation parameters; and
- QoE Monitor 310: Dynamically measures QoE.
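- One way to picture this architecture is as a coordinator object that queries each engine and combines their outputs, as in the structural sketch below; the interfaces and the way decisions are combined are assumptions made for illustration, not the architecture mandated by this disclosure.

```python
# Structural sketch of the FIG. 3 client adaptation architecture: a
# cross-layer adaptation manager coordinating the per-layer engines.
# Interfaces and the combination logic are illustrative assumptions.

class CrossLayerAdaptationManager:
    def __init__(self, qoe_monitor, media_engine, streaming_engine,
                 network_engine, radio_engine):
        self.qoe_monitor = qoe_monitor            # dynamically measures QoE
        self.media_engine = media_engine          # codec-level adaptation
        self.streaming_engine = streaming_engine  # parses SDP/MPD, DASH control
        self.network_engine = network_engine      # network-level adaptation/QoS
        self.radio_engine = radio_engine          # radio-level adaptation/QoS

    def adapt(self):
        qoe = self.qoe_monitor()                        # e.g., a current MOS
        streaming = self.streaming_engine(qoe)          # segment/request plan
        return {
            "video": self.media_engine(qoe),            # bitrate, resolution...
            "streaming": streaming,
            "network": self.network_engine(streaming),  # e.g., DSCP marking
            "radio": self.radio_engine(streaming),      # e.g., AC selection
        }

# Tiny stand-in engines, for illustration only.
manager = CrossLayerAdaptationManager(
    qoe_monitor=lambda: 3.8,
    media_engine=lambda qoe: {"bitrate_kbps": 3000 if qoe > 3.5 else 1500},
    streaming_engine=lambda qoe: {"segment_duration_sec": 2},
    network_engine=lambda s: {"dscp": 34},
    radio_engine=lambda s: {"access_category": "AC_VI"},
)
print(manager.adapt())
```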
- DASH client platform configurations may be jointly optimized at the video, transport and radio levels via cross-layer cooperation of the cross-layer adaptation manager 304, and associated system components, in connection with the following parameters at each appropriate layer:
- -Application (Video) layer: Bitrate, frame rate, resolution, and the decisions of the client to drive the requested content representations from the DASH server;
- -Transport layer: QoE feedback based on the real-time transport control protocol (RTCP), sequence and timing of HTTP requests, number of parallel TCP connections, DASH segment durations, and so forth; and
- -Radio layer: Modulation and coding scheme (MCS) and target QoS parameters for the core network and radio access network.
- The adaptive streaming client platform can dynamically track various parameters and use the parameter values as inputs for decisions towards jointly adapting the streaming client configurations via cross-layer cooperation. The tracked parameters may include, for example, QoE metrics such as video quality metrics (VQM), structural similarity metrics (SSIM), perceptual evaluation of video quality metrics (PEVQ), and video mean opinion scores (MOS).
- Adaptive streaming over WiDi may be performed using the RTSP protocol.
- A cross-layer coordinated QoS framework may be adapted to optimize channel access among concurrent WiDi applications for delivering the best possible multimedia QoE, allowing effective performance of adaptive streaming with QoE in a multi-access environment. This may help ensure that the WiDi links share the medium in a "content-aware" and "display-aware" fashion with the appropriate prioritizations among the streams during channel access.
- This embodiment may enable content-aware and display-aware selection of video adaptation parameters (bitrate, resolution, frame rate, content characteristics, etc.), DCA, and target QoS parameters for different WiDi connections in order to share resources efficiently and realize the best possible video quality levels over all WiDi applications.
- This embodiment may also factor in the type of content being broadcasted. For example, a low-definition action movie or sports presentation may require more data than slower-moving content in order to maintain acceptable quality.
- The presented CSMA/CA-based multimedia QoS and traffic prioritization framework is applicable such that access categories and associated system parameters (e.g., AIFSN, CW, and TXOP parameters) for EDCA or HCCA are determined based on QoE-optimized mapping functions derived from the multimedia-specific application-layer parameters in the SDP, receiver device/display capabilities, physical link conditions, and the like.
- WiDi devices may utilize RTSP/SDP-based signaling mechanisms to exchange multimedia-specific application-layer parameters and receiver device/display capability information over the radio links. These parameters and capability information may be applied to enable coordinated QoE optimization, application-aware network adaptation, QoS support, and resource management for adaptive streaming services transmitting over a WiDi connection. Therefore, a client device running a WiDi application may manage the streaming session, modify session parameters (e.g., derive new RTSP/SDP session parameters), adapt video parameters (e.g., bitrate, resolution, frame rate, etc.), prioritize traffic, allocate resources, and optimize bandwidth/QoS for its local connections (e.g., WiDi links) based on multimedia information gathered from session-level signaling (e.g., SDP or MPD signaling) over the other video delivery networks (e.g., 3GPP, WiFi, or DVB networks) using the Session Initiation Protocol (SIP), RTSP, or HTTP protocols.
- A WiDi client platform architecture may also perform RTSP-based adaptive streaming based on the proposed QoE-aware cross-layer cooperation framework in order to jointly optimize platform parameters for video/network/radio adaptation and QoS support.
- WiDi devices may signal the RTSP/SDP or MPD information as well as the receiver device/display capabilities to other WiDi devices in order to share adaptive streaming related session information with the appropriate entities in the network.
- The decisions on the QoS parameters (QoS class or access category (AC) parameters, etc.) for all clients sharing the resources/spectrum can be made jointly in order to find the QoE-optimizing traffic prioritization among the clients in a coordinated fashion.
- FIG. 4 provides an illustration of an example network configuration 400 for transmitting streaming multimedia content with a DASH adaptive streaming protocol in accordance with an example embodiment.
- FIG. 4 specifically depicts an example use case of over-the-top Internet video streaming using DASH and adaptive HTTP techniques, e.g., Apple HTTP Live Streaming.
- Multimedia content is communicated from a DASH server 402 to a WiFi AP 404, and then via a WiFi network to a computer 406 running a DASH client and a WiDi communications application.
- The computer 406 utilizes the WiDi communications application to transmit multimedia content via a WiDi WiFi P2P network to two devices, a large-sized screen television 410 and a medium-sized receiver screen 414 (e.g., a computing device or another display device with a smaller screen than television 410).
- The DASH client first fetches the MPD from the DASH server(s) and learns about multimedia characteristics for content to be streamed over the WiDi links (e.g., these parameters may include the minimum bitrate for acceptable video quality and the maximum bitrate above which perceived video quality improvement is negligible). This is followed by the DASH client using RTSP/SDP signaling to gather capability information from each of the displays. The DASH client then estimates link qualities to each of the displays based on the physical channel conditions (e.g., by tracking packet error/loss statistics).
- The network configuration 400 is specifically configured to provide the multimedia content to the large screen television 410 at a high priority 408 (e.g., AC Priority Level 1).
- The network configuration 400 is further configured to provide the multimedia content to the medium-sized receiver screen 414 at a low priority 412 (e.g., AC Priority Level 2).
- The DASH client may next determine the QoE-optimizing adaptive streaming configuration, including video adaptation parameters and QoS parameters. Finally, the DASH client streams content to the displays based on the QoE-optimizing, content-aware, and display-aware adaptive streaming configuration with the selected video adaptation parameters and QoS parameters.
- FIG. 5 provides an illustration of a method for performing a QoE optimization procedure for DASH-based video streaming according to one embodiment (a sketch of this procedure follows the operations below).
- Multimedia characteristics for the streaming content may be determined (operation 510), for example, by the DASH client obtaining the MPD from the DASH server(s) and analyzing multimedia characteristics for content to be streamed over the WiDi links.
- Capability information for the various displays may be determined (operation 520), for example, by the DASH client using RTSP/SDP signaling to gather capability information from each of the displays.
- Network link conditions to the displays or display clients may be determined (operation 530), for example, by the DASH client estimating link qualities to each of the displays based on the physical channel conditions (such as by tracking packet error/loss statistics).
- An adaptive streaming configuration may be determined (operation 540), for example, by the DASH client calculating relevant QoE-optimizing adaptive streaming configuration parameters, including video adaptation parameters and QoS parameters, using the previously described MPD information, display capabilities, and physical link conditions.
- Content may be streamed based on the determined adaptive streaming configuration (operation 550), for example, by the DASH client streaming content to the displays based on the QoE-optimizing adaptive streaming configuration parameters, including the previously described video adaptation parameters and QoS parameters.
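- The operations of FIG. 5 may be strung together as in the following sketch; every helper shown (MPD retrieval, capability query, link estimation, configuration derivation) is a hypothetical stub standing in for the behavior described above rather than an API defined by this disclosure.

```python
# Sketch of the FIG. 5 QoE optimization procedure for DASH-based streaming.
# Every helper below is a hypothetical stub for the behavior described in
# the text (MPD retrieval, RTSP/SDP capability exchange, link estimation).

def fetch_mpd(url):                      # operation 510: multimedia characteristics
    return {"min_kbps": 1000, "max_kbps": 8000}

def query_display_capabilities(display): # operation 520: via RTSP/SDP signaling
    return {"tv": (1920, 1080), "tablet": (1280, 720)}[display]

def estimate_link_quality(display):      # operation 530: e.g., packet loss stats
    return {"tv": 0.01, "tablet": 0.05}[display]

def derive_configuration(media, caps, loss):  # operation 540
    # Give the better link (lower loss) the higher bitrate and AC priority.
    ordered = sorted(loss, key=loss.get)
    return {d: {"bitrate_kbps": media["max_kbps"] if i == 0 else media["min_kbps"],
                "resolution": caps[d],
                "ac_priority": i + 1}
            for i, d in enumerate(ordered)}

displays = ["tv", "tablet"]
media = fetch_mpd("http://example.invalid/content.mpd")
caps = {d: query_display_capabilities(d) for d in displays}
loss = {d: estimate_link_quality(d) for d in displays}
config = derive_configuration(media, caps, loss)   # operation 550 would then
print(config)                                       # stream per this config
```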
- FIG. 6 illustrates another example embodiment of cross-layer optimized adaptive streaming techniques, with a network architecture supporting a two-user WiFi network where each user is concurrently running WiDi applications to stream multimedia content.
- WiDi Link 1 is configured to transmit real-time video from computing system 602 to large screen television 608 with a high priority level 604.
- RTSP/SDP messaging 606 is configured to exchange application-layer parameters between the source (computing system 602) and sink (television 608).
- WiDi Link 2 is also configured to transmit real-time video from computing system 612 to medium-sized screen receiver device 618, with a low priority level 614; likewise, WiDi Link 2 exchanges similar RTSP/SDP messaging 616 between the computing system 612 and the receiver device 618.
- Each WiDi configuration determines the application layer parameters associated with display capabilities, parses the locally stored SDP for each video content to gather multimedia-specific information, estimates the physical link conditions, and uses these criteria to determine the QoE-optimizing video adaptation parameters, QoS access category, and associated EDCA/HCCA parameters corresponding to its multimedia streams.
- For example, suppose the user of WiDi Link 1 would like to stream a fast-moving, high-quality video stream to a large screen TV 608 with a minimum bitrate equivalent to 4 Mbps of channel capacity in order to meet the target QoE for the video stream, while the user of WiDi Link 2 would like to stream a slow-moving, lower-quality video stream to a medium-sized screen receiver device 618 with a minimum bitrate equivalent to 2 Mbps of channel capacity in order to meet the target QoE for the video stream.
- WiDi Link 1 has a more stringent QoE and bitrate requirement compared to WiDi Link 2.
- WiDi Link 1 is assigned to a higher priority level 604 while WiDi Link 2 is assigned to a lower priority level 614, allowing WiDi Link 1 to utilize more of the channel capacity resources and thereby meet its higher bitrate requirement.
- Through such content-aware channel access, both users are able to meet their QoE requirements and enjoy a satisfying video streaming experience.
- In contrast, if the two links were to share the channel equally, each WiDi link could only realize an average throughput of 3 Mbps. While this throughput would allow WiDi Link 2 to meet its target QoE requirement, WiDi Link 1 would not be able to meet its target QoE requirement.
- A content-aware channel access solution in accordance with one embodiment provides a CWmin ratio of 2:1 in the QoS-prioritized CSMA/CA-based WiFi access. This is illustrated in the channel access 610 for the WiFi network used to transmit the data via the WiDi links, configured to allow up to two times higher throughput for WiDi Link 1 in comparison with WiDi Link 2 (i.e., WiDi Link 1 gains access to 2/3 of the channel bandwidth, while WiDi Link 2 gains access to 1/3 of the channel bandwidth).
- This results in WiDi Link 1 realizing 4 Mbps of WiDi throughput and WiDi Link 2 realizing 2 Mbps of WiDi throughput, such that both users are able to meet their target QoE requirements.
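- The two-link example amounts to simple arithmetic, reproduced in the short check below; the 6 Mbps total follows from the 3 Mbps-per-link figure quoted for equal sharing, and the weight assignment is an illustrative assumption.

```python
# Reproduces the FIG. 6 two-link arithmetic: equal sharing vs. a 2:1
# content-aware prioritization. The 6 Mbps total follows from the
# "3 Mbps each under equal sharing" figure in the example above.

TOTAL_MBPS = 6.0
requirements = {"WiDi Link 1": 4.0, "WiDi Link 2": 2.0}   # minimum bitrates

def share(weights):
    total_weight = sum(weights.values())
    return {link: TOTAL_MBPS * w / total_weight for link, w in weights.items()}

for name, weights in [("equal sharing", {"WiDi Link 1": 1, "WiDi Link 2": 1}),
                      ("2:1 prioritization", {"WiDi Link 1": 2, "WiDi Link 2": 1})]:
    alloc = share(weights)
    ok = all(alloc[link] >= req for link, req in requirements.items())
    print(name, {link: round(v, 1) for link, v in alloc.items()},
          "meets all QoE targets" if ok else "Link 1 falls short")
```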
- FIG. 7 provides a summarized illustration of a method used to implement adaptive streaming optimization, with use of various of the previously described techniques according to one example embodiment.
- The adaptive streaming optimization occurs in connection with streaming multimedia content to one or more displays with one or more adaptive streaming connections (operation 710).
- The calculations performed with the adaptive streaming optimization may include: receiving display parameters via the connection(s) (operation 720); determining the stream requirements for the connection(s) (operation 730); determining the network link condition(s) for the connection(s) (operation 740); and determining the target QoS parameters for the connection(s) (operation 750).
- The target QoS parameters are then implemented for the connection(s) (operation 760).
- The result of the implemented QoS parameters for the connection(s) may then be verified (operation 770), with further adjustments and implementations to the QoS parameters provided in subsequent activities.
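- The FIG. 7 flow can be read as a derive-apply-verify loop over the connection(s), as in the schematic sketch below; the data structure, helper, and acceptance criterion are assumptions made for illustration.

```python
# Schematic sketch of the FIG. 7 flow: gather inputs per connection
# (operations 720-740), derive and apply target QoS parameters (750-760),
# and verify the result (770). Structure and criteria are assumptions.

def optimize_adaptive_streaming(connections):
    for conn in connections:
        required = conn["stream_requirement_kbps"]        # operations 720-730
        link = conn["measured_link_kbps"]                  # operation 740
        conn["target_kbps"] = min(required, link)          # operation 750
        apply_qos(conn)                                     # operation 760
        conn["qoe_ok"] = conn["target_kbps"] >= required    # operation 770
    # Connections failing verification would be re-adjusted in a later pass.

def apply_qos(conn):
    pass  # placeholder for signaling the target QoS parameters to the network

conns = [{"display_params": "1080p", "stream_requirement_kbps": 4000,
          "measured_link_kbps": 4500},
         {"display_params": "720p", "stream_requirement_kbps": 2000,
          "measured_link_kbps": 1800}]
optimize_adaptive_streaming(conns)
print([(c["target_kbps"], c["qoe_ok"]) for c in conns])
```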
- Embodiments may be implemented in one or a combination of hardware, firmware, and software. Embodiments may also be implemented as instructions stored on a computer-readable storage device, which may be read and executed by at least one processor to perform the operations described herein.
- A computer-readable storage device may include any non-transitory mechanism for storing information in a form readable by a machine (e.g., a computer).
- A computer-readable storage device may include read-only memory (ROM), random-access memory (RAM), magnetic disk storage media, optical storage media, flash-memory devices, and other storage devices and media.
- Communication devices such as a base station or UE may include one or more processors and may be configured with instructions stored on a computer-readable storage device.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Quality & Reliability (AREA)
- Business, Economics & Management (AREA)
- General Business, Economics & Management (AREA)
- General Engineering & Computer Science (AREA)
- Mobile Radio Communication Systems (AREA)
Abstract
Systems and methods for providing content-aware adaptation of multimedia communications in wireless networks to ensure Quality of Experience (QoE) of the content transmitted by the multimedia communications are generally disclosed herein. One example embodiment includes adaptive streaming optimization techniques, such as the exchanging of application-layer parameters used to establish network connectivity settings and implement an appropriate QoE for applications communicating within the wireless network. Example embodiments may also determine and implement Quality of Service (QoS) parameters for the wireless network and other connected networks based on the application-layer parameters. Such application-layer parameters may include receiver display capabilities and multimedia-specific parameters. These techniques may be used in connection with, for example, the transmission of real-time multimedia content, such as multimedia content communicated wirelessly from a computing device using a Wireless Display (WiDi) transmission standard.
Description
QUALITY OF EXPERIENCE ENHANCEMENTS
OVER WIRELESS NETWORKS
TECHNICAL FIELD
[0001] Embodiments pertain to wireless communications. Some embodiments relate to the use of wireless multimedia communications, and Quality of Experience (QoE) techniques implemented within wireless networks and services.
BACKGROUND
[0002] In wireless multimedia communications, various generic cross-layer design methodologies are used to optimize user QoE and increase the service capacity for the network communications. Quality degradation is generally caused by factors such as high distortion levels, limited bandwidth, excessive delay, power constraints, and computational complexity limitations.
[0003] Some user QoE optimization for multimedia services implement resource management strategies at the lower layers of network communications (e.g., the PHY, MAC, network, and transport layers) by considering the specific characteristics of the applications. In many cases, however, the PHY/MAC/NET layers in existing networks remain agnostic of dynamically varying application- layer requirements and characteristics, and only aim to optimize link quality subject to certain target Quality of Service (QoS) requirements.
[0004] Implemented QoS classes and associated service attributes generally do not accommodate QoE-related metrics for application-level multimedia processing, nor are multimedia streams generally prioritized or adapted in a content-aware fashion to optimize QoE. Further, networks typically do not pass any content-specific information regarding the multimedia processing at the codec to the wireless network, or otherwise enable cross-layer coordination capabilities.
[0005] There are general needs for improved methods of optimizing QoE for multimedia content service flows. There are also general needs for improved
methods of establishing and operating adaptive streaming services over wireless networks.
BRIEF DESCRIPTION OF THE DRAWINGS
[0006] FIG. 1 illustrates an adaptive streaming client architecture used in accordance with example embodiments;
[0007] FIG. 2 illustrates a QoS-aware network architecture for implementing adaptive streaming according to an example embodiment;
[0008] FIG. 3 illustrates a cross-layer adaptive streaming client adaptation configuration used in accordance with example embodiments;
[0009] FIG. 4 illustrates a video streaming optimization configuration provided to multiple video receivers according to one example embodiment;
[0010] FIG. 5 illustrates a method for performing a QoE optimization procedure with video streaming according to one example embodiment;
[0011] FIG. 6 illustrates a messaging and channel access configuration for streaming video from multiple sources according to one example embodiment; and
[0012] FIG. 7 illustrates a method for optimizing adaptive streaming network communications at a plurality of network levels according to one example embodiment.
DETAILED DESCRIPTION
[0013] The following description and the drawings sufficiently illustrate specific embodiments to enable those skilled in the art to practice them. Other embodiments may incorporate structural, logical, electrical, process, and other changes. Portions and features of some embodiments may be included in, or substituted for, those of other embodiments. Embodiments set forth in the claims encompass all available equivalents of those claims.
[0014] Several of the embodiments described herein provide techniques for QoE-driven cross-layer optimization of network communications, such as in wireless networks enabling the distribution of multimedia content. In particular, some example embodiments include the configuration and use of a cross-layer optimized (and QoE-driven) client adaptation architecture to configure network
communication parameters. These communication parameters may include various data, video, radio, network, and transport level parameters for implementing QoE with multimedia streaming services, such as Real Time Streaming Protocol (RTSP)-based or Dynamic Adaptive Streaming over HTTP (DASH)/HTTP-based adaptive streaming services.
[0015] QoE differs in various respects from QoS, and therefore is not fully addressed by existing QoS techniques implemented within network communication architectures. QoS generally provides mechanisms to ensure that data is communicated between two points (and prioritized, as appropriate) to provide network performance that meets targets for packet loss, bit rate, jitter, and latency. QoE, in contrast, generally implements mechanisms relating to the quality of the data itself being transferred. Thus, in multimedia settings, QoE may relate to the quality of audio or video being played back to a user, which may be unsatisfactory even if the QoS for delivery of the corresponding audio or video data is satisfactory.
[0016] Implementing QoS standards within network configurations cannot ensure a satisfactory QoE, because QoS does not factor application requirements or otherwise operate in an application-aware manner (especially for delivery of multimedia content in wireless network settings). For example, in some QoS architectures supporting adaptive streaming services, the target QoS parameters for the core network and radio access network may be derived independently of multimedia- specific application layer parameters for multimedia streaming services, receiver device/display capabilities, or physical link conditions.
[0017] In accordance with the embodiments described herein, QoE- driven cross-layer optimization for multimedia communications may be provided through various resource management strategies at lower networking model layers (e.g., the PHY, MAC, network, and transport layers) by considering the specific characteristics of video and multimedia applications. QoE optimization may also be implemented by adapting video compression and streaming algorithms after taking into account the mechanisms provided by the lower layers for error control and resource allocation. For example, in
connection with streaming multimedia communications, two of the capabilities enabled by cross-layer optimizations include:
[0018] (1) PHY/MAC/NET layer-aware content adaptation at the codec level, using adaptation parameters such as bit rate, resolution, frame rate, and the like, to enable a streaming service to adapt its content characteristics to varying network conditions (e.g., changing resource availability, or the time-varying nature of the wireless channel). Various content adaptation strategies are performed to ensure the highest possible QoE while maintaining interruption-free playback of the multimedia. This capability is known as "adaptive streaming".
[0019] (2) Application-aware PHY/MAC/NET layer adaptation at the radio and network levels in order to perform PHY/MAC/NET layer functions such as link adaptation and resource allocation. This may be used to exploit knowledge of various application-layer attributes associated with the video content and service. For example, knowledge of the rate-distortion
characteristics of the video stream can allow for performing distortion- aware channel access prioritization at the PHY/MAC/NET layer to enhance video quality.
[0020] Both of these optimizations in combination are not fully realized with existing network techniques. This occurs in part because the
PHY/MAC/NET layers in most networks only attempt to optimize link quality subject to QoS requirements, with use of parameters such as throughput, latency/jitter, packet error/loss rate, and so forth. Also, due to layer independence and separation, and the limitations of QoS as previously described, existing QoS classes and associated service attributes do not accommodate QoE-related metrics for application-level multimedia processing, nor do they prioritize the multimedia streams in a content-aware fashion.
[0021] Further, existing network configurations generally do not pass content-specific information (e.g., rate-distortion characteristics of the video stream, associated video quality metrics, and the like) regarding the multimedia processing at the codec (application) level to wireless networks. In this context, new cross-layer coordination capabilities and signaling mechanisms may be used to enable exchanging application-level information for QoE-aware radio and
wireless multimedia network adaptation, and for resource management for one or more service flows.
[0022] The various embodiments described herein disclose techniques and configurations that provide adaptive services for wireless networks to enable such content-awareness and enhanced QoE. Both conversational and streaming services may be enhanced using the techniques described herein. In addition, the techniques described herein are applicable for unicast, multicast and broadcast multimedia delivery methods. Moreover, the proposed techniques are also applicable in heterogeneous environments that require delivery of multimedia content such as video over multiple air interfaces.
[0023] Aspects of the present disclosure provide techniques relevant to the transmission and receipt of wireless networking communications, and specifically wireless communication systems and protocols adapted for multimedia content communications. An example multimedia networking configuration that may be enhanced in connection with the presently described techniques is known under the names of "WirelessHD", "wireless display", or simply "WiDi", with one example WiDi implementation marketed as "INTEL® Wireless Display". WiDi relies on local peer-to-peer (P2P) wireless connectivity over a Wireless Local Area Network (WLAN) or Wireless Personal Area Network (WPAN)-based air interface (e.g. , Wi-Fi P2P, Wi-Fi Alliance Wi-Fi Display, WiDi Direct, myWi-Fi, 60 GHz technology, and the like) to transfer data between multimedia devices, such as a computer and a television.
[0024] For example, FIG. 1 depicts an example configuration of an adaptive streaming network architecture configured to deliver multimedia content via a packet-switched streaming service (PSS) from a PSS server 102 to a PSS client 112. The following illustrates multimedia content transmission via a 3GPP Long Term Evolution (LTE) or Long Term Evolution Advanced (LTE-A) network configuration, although any of a number of wireless network standards and protocols may be similarly configured for use in a streaming network architecture.
[0025] The multimedia content data is delivered from the PSS server 102 and the public network 104 (e.g., the Internet) to a core network 106, and transmitted from the core network 106 through an access network 108. The core
network 106 and access network 108 exist within the LTE IP network 120, e.g., an internal IP network maintained by a telecommunications provider. The access network 108 provides network connectivity between the core network 106 and the wirelessly transmitting access point/base station/eNodeB 110 within a LTE wireless network 122, e.g., a wireless network provided by a
telecommunications provider.
[0026] Thus, the multimedia content data is transmitted from the access network 108 to the base station/eNodeB 110, broadcasted via a wireless communication (e.g., a cellular data transmission) from the base station/eNodeB 110 via the LTE wireless network 122, and received at a mobile station
(MS)/user equipment (UE) at the receiving computing device 114 for processing by the PSS client 112.
[0027] The PSS client 112 in turn may use a WiFi P2P network 124 to further transmit the streaming multimedia content onto another device, such as the user's television 116. This final transmission to the television 116 via the WiFi P2P network 124 may involve the use of a wireless multimedia connection standard such as WiDi, and a WiDi application 118 operating on the receiving computing device 114 and the television 116. Ultimately, the transmitted multimedia content (e.g., streaming video) may be displayed at the final receiving device, e.g., television 116.
[0028] WiDi may not only be used to communicate multimedia content with output devices such as televisions, but may also be used to communicate multimedia content with input devices such as video cameras. Another example use case of WiDi includes a video conferencing application over cellular-enabled client devices (e.g., user equipment (UE)) corresponding to conversational and streaming video services. A video conferencing application (e.g., Skype) over an IP multimedia subsystem (IMS) may be integrated to have the UE use the wireless webcam feature of WiDi. Consequently, video may also be signaled over a WiFi P2P connection from the UE to the WiDi adapter (in addition to the cellular network).
[0029] Although the presently disclosed techniques and configurations provide a number of examples related to WiDi, WirelessHD, and similar wireless multimedia networking configurations, this disclosure is applicable to a
larger number of heterogeneous UE connectivity scenarios. Moreover, a wireless multimedia delivery network is not limited to use of a cellular network, but may involve a variety of other wireless standards and configurations, including but not limited to a Wireless Wide Area Network (WWAN), a WLAN or WPAN network, an unmanaged WiFi network, or a TV broadcast network (e.g., DVB).
[0030] In example embodiments, a series of adaptive streaming services are provided to enable QoE via wireless multimedia networking configurations. The adaptive streaming services enhancing QoE via cross-layer optimization may include one or more of:
[0031] -An end-to-end QoS architecture for adaptive multimedia streaming in which the target QoS parameters for the core network and/or radio access network may be derived from multimedia-specific and application-layer parameters, determined from values such as provided from the session description protocol (SDP) for RTSP-based adaptive streaming or media presentation description (MPD) metadata for HTTP-based adaptive streaming, as well as from receiver device/display capabilities and physical link conditions;
[0032] -A Carrier sense multiple access with collision avoidance
(CSMA/CA)-based multimedia QoS and traffic prioritization framework in which access categories and associated system parameters (e.g., Arbitration Inter-Frame Space Number (AIFSN), Contention Window (CW) and Transmit Opportunity (TXOP) parameters) for HCF Controlled Channel Access (HCCA) or Enhanced Distributed Channel Access (EDCA) may be determined based on QoE-optimized mapping functions derived from the multimedia-specific application-layer parameters in the SDP or MPD for RTSP/HTTP-based adaptive streaming services, receiver device/display capabilities, and physical link conditions;
[0033] -Client device configurations to manage the streaming session, modify session parameters (e.g., derive new RTSP/SDP session parameters), adapt video parameters (e.g., bitrate, resolution, frame rate, etc.), prioritize traffic, allocate resources and optimize bandwidth/QoS for its local connections (e.g., WiDi links) based on multimedia information gathered from session- level signaling (e.g., SDP or MPD signaling) over the other video delivery networks
(e.g., 3GPP, Wi-Fi, or digital video broadcasting networks) using Session Initiation Protocol (SIP), RTSP, or HTTP protocols, including codec information, quality requirements, and rate-distortion characteristics;
[0034] -A client signaling mechanism to exchange multimedia-specific application-layer parameters with the SDP or MPD values (or any other metadata carrying multimedia-specific parameters), and/or receiver
device/display capabilities in a radio access network towards enabling coordinated QoE optimization, application-aware network adaptation, and QoS support and resource management for adaptive streaming services; and
[0035] -A QoE-aware cross-layer cooperation framework for the adaptive streaming client platform architecture in order to jointly optimize platform parameters for RTSP or HTTP-based streaming, video/network/radio adaptation, and QoS support.
[0036] In this context, the present disclosure provides techniques to optimize channel access among concurrent wireless multimedia network applications for delivering the best possible QoE of multimedia content. This may provide enhanced operations applicable to: 1) Multiple concurrent multimedia network adaptive streaming applications (e.g., via a WiDi connection) carrying different multimedia content or displayed on different screens; 2) Wireless webcam and video conferencing over a multimedia wireless network; and 3) Internet video streaming over a multimedia wireless network, such as with use of adaptive HTTP streaming services, to one or more displays.
[0037] In order to effectively perform adaptive streaming in a multi-access environment, it is important to ensure that the wireless multimedia network links share the medium in a "content-aware" fashion with appropriate video adaptation and QoS prioritizations among the streams during channel access. The presently described techniques and system configurations enable content-aware selection of video adaptation parameters and WiFi-based Dynamic Channel Allocation (DCA) QoS parameters for different WiDi connections in order to share resources efficiently and realize the best possible video quality levels over all active wireless multimedia network applications.
[0038] Applicability to State-Tracking, Stateless, and Adaptive
Streaming Services
[0039] The following streaming protocols may be used in conjunction with the presently disclosed cross-layer optimization techniques. These streaming protocols include:
[0040] State-Tracking Protocols. Traditional streaming services (e.g., conducted with RTSP) generally use a state-tracking protocol, where once a client connects to the streaming server, the server keeps track of the client's state until the client disconnects again. Typically, frequent communication between the client and the server occurs for purposes such as session provisioning and negotiation of media parameters. Once a session between the client and the server has been established, the server sends the media as a continuous stream of packets over either UDP or TCP transport. The application-layer information on the multimedia-specific parameters is typically communicated by SDP.
Example technologies using RTSP-based streaming include Microsoft Windows Media, Apple QuickTime, Adobe Flash, and Real Networks Helix. Some implementations of WiDi also use RTSP-based streaming.
[0041] Stateless Protocols. Another option for adaptive streaming is via a stateless protocol, such as the HTTP protocol. With use of the HTTP protocol, as a client requests data, the server responds by sending the data and then the transaction is terminated. Each HTTP request is handled as a completely standalone one-time transaction. HTTP-based progressive download methods may also be used for media delivery from standard Web servers. In HTTP-based progressive download, supported clients can seek positions in a media file by performing byte range requests to the Web server. Some of the disadvantages of HTTP-based progressive download include that (i) bandwidth may be wasted if the user decides to stop watching the content after progressive download has started (e.g., switching to another content), (ii) the download is not bitrate adaptive, and (iii) the download does not support live media services.
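As a minimal illustration of the byte-range seeking described for HTTP-based progressive download, the following Python sketch requests an arbitrary byte range of a media file from a standard Web server. The media URL and the chosen range are hypothetical, and the sketch is illustrative only; it is not part of the described embodiments.

```python
import urllib.request

# Hypothetical URL of a media file hosted on a standard Web server.
MEDIA_URL = "http://example.com/media/video.mp4"

def fetch_byte_range(url, start, end):
    """Request bytes [start, end] of a media file, as used for seeking
    in HTTP-based progressive download."""
    request = urllib.request.Request(url, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(request) as response:
        # A server that supports range requests answers with 206 Partial Content;
        # a server that ignores the Range header answers 200 with the full file.
        return response.status, response.read()

# Example: fetch the first 64 KiB, e.g., to read the file header before seeking.
status, header_bytes = fetch_byte_range(MEDIA_URL, 0, 64 * 1024 - 1)
```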
[0042] Adaptive Streaming. Dynamic adaptive streaming over HTTP
(DASH) addresses some of the weaknesses of Real-time Transport Protocol (RTP)/RTSP-based streaming and HTTP-based progressive downloads. DASH provides the ability to move control of a "streaming session" entirely to the client and therefore moves the adaptive streaming intelligence from the server to the client. The client may open one or several or many TCP connections to one
or several standard HTTP servers or caches, retrieve the MPD metadata file providing information on the structure and different versions of the media content stored in the server (including different bitrates, frame rates, resolutions, codec types, etc.) and request smaller segments of the selected version of the media file with individual HTTP messages (to imitate streaming via short downloads).
[0043] DASH provides the client the ability to automatically choose an initial content rate to match initial available bandwidth without requiring a negotiation with the streaming server. DASH further provides the ability to dynamically switch between different bitrate representations of the media content as the available bandwidth changes. Hence, DASH allows faster adaptation to changing network and wireless link conditions, user preferences, and device capabilities (e.g., display resolution, CPU, memory resources, etc.). Such dynamic adaptation may enable an improved user quality of experience (QoE), with shorter startup delays, fewer re-buffering events, and the like.
Example DASH technologies include Microsoft Internet Information Services (IIS) Smooth Streaming, Apple HTTP Live Streaming, and Adobe HTTP Dynamic Streaming.
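A minimal sketch of the client-driven DASH behavior described above follows: the client retrieves the MPD, inspects the bandwidth attribute of each Representation element, and selects the highest-bitrate version that fits an externally supplied bandwidth estimate. The manifest URL is hypothetical, and the element and attribute handling reflects a simplified reading of the DASH MPD schema rather than a complete implementation.

```python
import urllib.request
import xml.etree.ElementTree as ET

MPD_URL = "http://example.com/content/manifest.mpd"  # hypothetical manifest location
DASH_NS = {"mpd": "urn:mpeg:dash:schema:mpd:2011"}

def select_representation(mpd_url, available_bandwidth_bps):
    """Fetch the MPD and pick the highest-bandwidth representation that
    fits within the currently estimated available bandwidth."""
    with urllib.request.urlopen(mpd_url) as response:
        root = ET.fromstring(response.read())
    representations = root.findall(".//mpd:Representation", DASH_NS)
    candidates = [
        rep for rep in representations
        if int(rep.get("bandwidth", "0")) <= available_bandwidth_bps
    ]
    if not candidates:
        # Fall back to whatever is advertised if nothing fits the estimate.
        candidates = representations
    best = max(candidates, key=lambda rep: int(rep.get("bandwidth", "0")))
    return best.get("id"), int(best.get("bandwidth", "0"))

# Example: assume a 3 Mbps bandwidth estimate supplied by the transport/radio layers.
rep_id, rep_bandwidth = select_representation(MPD_URL, 3_000_000)
```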
[0044] End-to-End QoS Architecture for Adaptive Streaming
[0045] Providing a sufficiently detailed end-to-end QoS implementation that optimizes adaptive streaming requires the consideration of the interfaces provided by the wireless network, access network, and core network used to transfer data. FIG. 2 provides an illustration of a QoS-aware network architecture, configured to access multimedia content provided by a multimedia server 102 from a public network 104 (e.g., the internet) via each of these network interfaces.
[0046] Communications provided for the multimedia content within the non- wireless IP network 204 (e.g., a LTE IP network) include transfer of data from the public network 104 via the core network 106, and the transfer of data from the core network 106 via the access network 108. Communications provided for the multimedia content within the wireless network 206 (e.g., a LTE wireless network) include the transfer of data from the access network 108 to the access point/base station/eNodeB 110, and the transfer of data from the
access point/base station/eNodeB 110 wirelessly to the receiving computing device 114 (a mobile station/user equipment).
[0047] As illustrated, QoS parameters 202 for the non-wireless IP network 204 and the wireless network 206 may be derived based on multimedia- specific application-layer parameters. These derived QoS parameters 202 may then be provided to the various components and interfaces within the non- wireless IP network 204 and the wireless network 206, including the core network 106, the access network 108, and the wireless network interface operated by the access point/base station/eNodeB 110.
[0048] Regarding the specific wireless network interface (e.g., IEEE 802.11e, WiFi Multimedia (WMM), etc.) used in the wireless network 206, each interface defines a set of QoS classes or access categories (ACs) (e.g., best effort (AC_BE), background (AC_BK), voice (AC_VO), and video (AC_VI) access categories for the WiFi Multimedia (WMM) standard as part of enhanced distributed channel access (EDCA)) and specifies associated service attributes in terms of various performance requirements such as throughput, latency/jitter, packet error-loss rate, and the like (e.g., via TSPECs, etc.). The QoS classes / ACs enable the differentiation of the service flows between client applications and various services. In one example embodiment, each service flow is mapped to a specific QoS class and receives a common QoS treatment. This allows service flows to be prioritized accordingly, when resources are distributed between different service flows through scheduling functions.
[0049] Some examples for QoS definitions that may be used in IP network 204, specifically in core network 106 and access network 108, are
Differentiated Services - DiffServ (RFC 2474) and Integrated Services - IntServ (RFC 1633), specified by the Internet Engineering Task Force (IETF). IntServ follows the flow-based and signaled QoS model, where the end-hosts signal QoS needs to the network, while DiffServ works on the provisioned- QoS model, where network elements are set up to service multiple classes of traffic with varying QoS requirements. In particular, DiffServ uses the 6-bit Differentiated Services Code Point (DSCP) field in the header of IP packets for packet classification purposes. The IntServ model relies on the Resource Reservation
Protocol (RSVP) to explicitly signal and reserve the desired QoS for each flow in the network, described by the FlowSpecs. In order to provide multi-layer QoS control and manage end-to-end QoS, a convergence sub-layer is defined to interface higher-layer protocol data units and perform classification and mapping functions. For example, in the case of DiffServ, each end-to-end IP packet entering the system is identified with a dedicated air interface AC for the radio access network, by mapping its DSCP over the core network from DiffServ to a particular QoS class for the radio access network.
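The convergence sub-layer mapping described above can be pictured with a small sketch that classifies IP packets into radio access network access categories based on their DSCP values. The specific DSCP-to-AC assignments shown are common illustrative choices and are assumptions, not values mandated by this disclosure.

```python
# Illustrative (not normative) mapping from DiffServ code points carried in the
# IP header to WMM access categories at the convergence sub-layer.
DSCP_TO_ACCESS_CATEGORY = {
    0x2E: "AC_VO",  # Expedited Forwarding (EF): voice
    0x22: "AC_VI",  # AF41: interactive video / streaming multimedia
    0x00: "AC_BE",  # default / best effort
    0x08: "AC_BK",  # CS1: background bulk transfer
}

def classify_packet(dscp):
    """Map an end-to-end IP packet's DSCP value to a radio access network AC."""
    return DSCP_TO_ACCESS_CATEGORY.get(dscp, "AC_BE")

print(classify_packet(0x22))  # -> AC_VI
```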
[0050] The configuration illustrated in FIG. 2 therefore enables an end- to-end QoS architecture for adaptive streaming in which target QoS parameters for the core network and/or radio access network (including local P2P connections) are derived. These values may be derived based on multimedia- specific application-layer parameters, such as from SDP values for RTSP-based adaptive streaming or MPD values for HTTP-based adaptive streaming, as well as based on the receiver device/display capabilities and physical link conditions.
[0051] In this context, the IP and wireless network devices (STAs, APs, etc., including client devices) may be configured to (i) have the ability to parse the SDP or MPD values in order to extract multimedia-specific application layer information for a given streaming session, (ii) exchange information on the receiver device/display capabilities and/or physical link conditions, and (iii) derive target video adaptation parameters and QoS parameters for the core network and radio access network.
[0052] As a potential implementation of deriving QoS parameters in one example embodiment, IP and wireless network devices may be configured to perform mapping (illustrated in the sketch following the list below) from the multimedia-specific application-layer information contained in the SDP or MPD values (or from any similar metadata format carrying multimedia information), or receiver device/display capabilities, or physical link conditions to:
1) The set of video adaptation parameters (e.g., bitrate, resolution, frame rate, etc.) and QoS parameters for the core network, e.g.,
DiffServ/DSCP parameters, IntServ/FlowSpecs parameters, etc.
2) The set of video adaptation parameters (e.g., bitrate, resolution, frame rate, etc.) and QoS parameters for the radio access network, e.g., QoS class or access category (AC) parameters, TSPECs, etc.
3) The set of video adaptation parameters (e.g., bitrate, resolution, frame rate, etc.) and QoS parameters for the local P2P network among client devices (e.g., a WiFi P2P network as in the WiDi use case), including QoS class or access category (AC) parameters, TSPECs, and the like.
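The following sketch illustrates the mapping referenced above, deriving a set of video adaptation parameters together with core-network and radio-access-network QoS parameters from multimedia-specific parameters, receiver display capabilities, and link conditions. All thresholds, field names, and parameter choices in the sketch are illustrative assumptions rather than values taken from the disclosure.

```python
def derive_streaming_parameters(media_info, display, link):
    """Sketch of deriving video adaptation and target QoS parameters from
    multimedia-specific parameters, display capabilities, and link conditions."""
    # Cap the target bitrate by the link budget and the content's useful maximum,
    # but never fall below the lowest acceptable quality (assumed 80% headroom rule).
    target_bitrate = min(media_info["max_bitrate_bps"],
                         int(0.8 * link["estimated_throughput_bps"]))
    target_bitrate = max(target_bitrate, media_info["min_bitrate_bps"])

    # Never stream above the receiver display's native resolution.
    width = min(media_info["max_width"], display["width"])
    height = min(media_info["max_height"], display["height"])

    video_adaptation = {"bitrate_bps": target_bitrate, "width": width,
                        "height": height, "frame_rate": media_info["frame_rate"]}

    # Core-network QoS: mark real-time video as AF41, otherwise best effort (assumed).
    core_qos = {"dscp": 0x22 if media_info["is_real_time"] else 0x00}
    # Radio access network QoS: place real-time video in the video access category.
    radio_qos = {"access_category": "AC_VI" if media_info["is_real_time"] else "AC_BE"}

    return video_adaptation, core_qos, radio_qos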
[0053] Furthermore, the network devices (including radio access network devices and client devices, e.g., STAs) may signal the SDP or MPD values (or any such metadata carrying multimedia information) as well as the receiver device/display capabilities to other network devices in order to share adaptive streaming related session information with the appropriate entities in the network. In such cases, the decisions on the video adaptation parameters and QoS parameters (QoS class or access category (AC) parameters, etc.) for all clients sharing the resources/spectrum can be made jointly in order to find the QoE-optimizing traffic prioritization among the clients in a coordinated fashion.
[0054] Apart from QoS enhancements, the exchange of multimedia- specific application layer information contained in the SDP or MPD attributes among network devices would be beneficial for other use cases as well, such as QoE-optimal adaptive streaming during session transfer among client devices.
[0055] The set of multimedia-specific application-layer parameters provided by the SDP, MPD, or any other similar metadata format can include one or more of the following multimedia parameters (a sketch of a corresponding parameter record follows the list below):
[0056] - Type of multimedia application, e.g., video conferencing, real-time video streaming, video downloading/uploading, stored or internet-streamed video, DVD or Blu-ray video playback, etc.
[0057] - Type of multimedia, e.g., image, video, audio, voice, etc.
[0058] - Application-level constraints for the multimedia content, e.g., delay, jitter, reliability, quality requirements, etc., and recommended QoS class and parameter information.
[0059] - Multimedia bitrate, resolution, and frame rate information, including a maximum bitrate above which the perceived quality improvement is negligible, and a minimum bitrate to achieve the lowest acceptable quality.
[0060] - Multimedia codec information, e.g., codec type such as AMR,
MPEG4, H.264 AVC/SVC etc., possibly also describing profiles and levels.
[0061] - Multimedia quality metrics specified at different bitrates, frame rates and resolutions, such as reference, reduced-reference or non-reference metrics, e.g., video quality metrics (VQM), structural similarity metrics (SSIM), perceptual evaluation of video quality metrics (PEVQ), video mean opinion scores (MOS), and other subjective quality metrics.
[0062] - Device capability information and display properties, including screen size, resolution, and bit depth.
[0063] - Encoding information, such as the number of group of pictures
(GOP) frames, GOP size, and frame type (e.g., I-frame, P-frame, B-frame, etc.).
[0064] - Quantization parameters for different frames, e.g., varying quantization scales for I-, P-, B-frames, etc.
[0065] - Layer type for scalable video coding (SVC), e.g., base layer, enhancement layer, etc.
[0066] - Application-level Forward Error Correction (FEC), erasure coding, or network coding parameters.
[0067] - Session and RTCP signaling bandwidth information (e.g., bandwidth modifiers used with SDP).
[0068] - Pre-decoder buffer size, initial buffering period, decoder capability information.
[0069] - Streaming method (RTSP, HTTP, etc.).
[0070] - Support for QoE, Adaptation, Extended RTCP reporting, fast content switching, and RTP profiles.
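For illustration, a subset of the parameters listed above could be carried in a simple record such as the following; the field names and types are assumptions chosen for the sketch and do not correspond to any particular SDP or MPD attribute names.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class MultimediaSessionParameters:
    """Illustrative record for a subset of the multimedia-specific
    application-layer parameters listed above."""
    application_type: str          # e.g., "video_conferencing", "internet_streaming"
    media_type: str                # e.g., "video", "audio"
    codec: str                     # e.g., "H.264 AVC"
    min_bitrate_bps: int           # lowest acceptable quality
    max_bitrate_bps: int           # above this, perceived improvement is negligible
    frame_rate: float
    width: int
    height: int
    streaming_method: str          # "RTSP" or "HTTP"
    quality_metrics: dict = field(default_factory=dict)  # e.g., {"SSIM": 0.95}
    display_bit_depth: Optional[int] = None
```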
[0071] CSMA/CA-based Multimedia QoS and Traffic Prioritization
Framework
[0072] The IEEE 802.11e standard provides a QoS extension of the distributed coordination function (DCF) and the point coordination function (PCF) of 802.11 wireless networking standards, through a new coordination function: the hybrid coordination function (HCF). Within the HCF, there are two methods of channel access, similar to those defined in the legacy 802.11 MAC: HCF Controlled Channel Access (HCCA) and Enhanced Distributed Channel Access (EDCA). Both EDCA and HCCA define Traffic Categories (TC). While the forthcoming discussion will address EDCA only, it should be understood that the techniques proposed here are also applicable to HCCA-based QoS delivery, as HCCA also relies on the same TCs for traffic prioritization.
[0073] In one embodiment, an enhancement to support QoS enables
EDCA to differentiate packets using different priorities and maps them to specific ACs that are buffered in separate queues at a station. Each AC i within a station having its own EDCA parameters contends for the channel access independently of the others. Levels of services may be provided through different assignments of the AC-specific EDCA parameters: AIFS, CW, and TXOP limits (AIFS = arbitration inter-frame space, CW = contention window, TXOP = transmit opportunity), allowing for prioritization of channel access among different ACs. The channel access probability differentiation is provided by using: a) different AIFSs instead of the constant distributed IFS (DIFS) used in DCF, and, b) different values for the minimum/maximum CWs to be used for the backoff time extraction.
[0074] AIFSN Prioritization: If there is a packet ready for transmission in the MAC queue of an AC, the EDCA function will sense the channel to be idle for a complete AIFS before it can start the transmission or backoff countdown. The AIFS of AC i may be determined as follows:
AIFS_i = SIFS + AIFSN_i * T_slot
[0075] where AIFSN_i is the AC-specific AIFS number corresponding to AC i, SIFS is the length of the short inter-frame space, and T_slot is the duration of a time slot.
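The AIFS formula above can be evaluated for the four WMM access categories as in the following sketch. The SIFS, slot duration, and AIFSN values used are typical 802.11/WMM defaults and are stated here as assumptions.

```python
SIFS_US = 16   # short inter-frame space in microseconds (typical 802.11a/g/n value, assumed)
SLOT_US = 9    # slot duration in microseconds (assumed)

# Default WMM AIFSN assignments (smaller AIFSN => higher priority), assumed here.
AIFSN = {"AC_VO": 2, "AC_VI": 2, "AC_BE": 3, "AC_BK": 7}

def aifs_us(access_category):
    """AIFS_i = SIFS + AIFSN_i * T_slot, in microseconds."""
    return SIFS_US + AIFSN[access_category] * SLOT_US

for ac in AIFSN:
    print(ac, aifs_us(ac), "us")   # e.g., AC_VO waits 34 us, AC_BK waits 79 us
```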
[0076] CW Prioritization: If the channel is idle when the first packet arrives at the AC i queue, the packet can be directly transmitted as soon as the channel is sensed to be idle for AIFSJ. Otherwise, a backoff procedure is completed following the completion of AIFS before the transmission of this packet. A uniformly distributed random integer, namely a backoff value, is selected from the range [0, W_i]. The backoff counter is decremented at the slot boundary if the previous time slot is idle. Should the channel be sensed busy at any time slot during AIFS or backoff, the backoff procedure is suspended at the
current backoff value. The backoff resumes as soon as the channel is sensed to be idle for AIFS again. When the backoff counter reaches zero, the packet is transmitted in the following slot. The value of W_i depends on the number of retransmissions the current packet experienced. The initial value of W_i is set to CWmin_i. If the transmitter cannot receive an Acknowledgment (ACK) packet from the receiver in a timeout interval, the transmission is labeled as unsuccessful and the packet is scheduled for retransmission. At each unsuccessful transmission, the value of W_i is doubled until CWmax_i is reached. The value of W_i is reset to CWmin_i if the transmission is successful; if the packet retransmission limit is reached, the packet is dropped.
[0077] The ACs with higher priority are assigned a smaller AIFSN value. Therefore, the ACs with higher priority can either transmit or decrement their backoff counters while ACs with lower priority are still waiting in AIFS. This results in ACs with higher priority enjoying a relatively faster progress through backoff slots. Moreover, the ACs with higher priority may select backoff values from a comparably smaller CW range. This approach prioritizes the access because a smaller CW value means a smaller backoff delay before the transmission.
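A minimal sketch of the contention-window behavior described above follows: the window starts at CWmin_i, doubles after each unsuccessful transmission up to CWmax_i, and a backoff value is drawn uniformly from [0, W_i]. The per-AC CWmin/CWmax bounds shown are typical WMM defaults and are assumptions.

```python
import random

# Typical WMM contention-window bounds per access category (assumed defaults).
CW_BOUNDS = {"AC_VO": (3, 7), "AC_VI": (7, 15), "AC_BE": (15, 1023), "AC_BK": (15, 1023)}

def next_backoff(access_category, retry_count):
    """Pick a uniformly random backoff value from [0, W_i], where W_i starts at
    CWmin_i and doubles (bounded by CWmax_i) after each unsuccessful transmission."""
    cw_min, cw_max = CW_BOUNDS[access_category]
    w_i = min(cw_max, (cw_min + 1) * (2 ** retry_count) - 1)
    return random.randint(0, w_i)

# After two failed attempts in AC_BE, the window has grown from 15 to 63 slots.
print(next_backoff("AC_BE", retry_count=2))
```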
[0078] TXOP Prioritization: TXOP is a bounded time interval during which a station can send as many frames as possible as long as the duration of the transmissions does not extend beyond the maximum duration of the TXOP. Upon gaining access to the medium, each AC i may carry out multiple frame exchange sequences as long as the total access duration does not go over MaxTXOP_i. In a TXOP, the transmissions are separated by SIFS. Multiple frame transmissions in a TXOP can reduce the overhead due to contention. A TXOP limit of zero corresponds to only one frame exchange per access. The ACs with higher priority may use a nonzero TXOP to increase their channel access time, with TXOP durations ranked according to the AC priority (i.e., the highest priority AC may have the largest TXOP).
[0079] In one embodiment, the previously described CSMA/CA-based multimedia QoS and traffic prioritization framework determines the access categories and associated system parameters (e.g., AIFSN, CW, and TXOP parameters) for EDCA or HCCA. These values may be implemented based on QoE-optimized mapping functions derived from the multimedia-specific application-layer parameters in the SDP or MPD values (or any other similar metadata format), receiver device/display capabilities, or physical link conditions for RTSP/HTTP-based adaptive streaming services.
[0080] Furthermore, the network devices (including radio access network devices and client devices, e.g., STAs) may signal the SDP or MPD information as well as the receiver device/display capabilities to other network devices in order to share adaptive streaming related session information with the appropriate entities in the network. In such cases, the decisions on the QoS parameters (QoS class or access category (AC) parameters, etc.) for all clients sharing the resources/spectrum can be made jointly in order to find the QoE- optimizing traffic prioritization among the clients in a coordinated fashion.
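One way to picture the joint, QoE-optimizing prioritization described above is the following sketch, which ranks concurrent streams by their minimum bitrate requirements and assigns them progressively less aggressive EDCA parameter sets. The tier values and the ranking rule are illustrative assumptions rather than a prescribed mapping function.

```python
# Illustrative EDCA parameter tiers (AIFSN, CWmin, CWmax); values are assumptions.
EDCA_TIERS = [
    {"aifsn": 2, "cw_min": 7,  "cw_max": 15},    # highest priority
    {"aifsn": 3, "cw_min": 15, "cw_max": 31},
    {"aifsn": 7, "cw_min": 15, "cw_max": 1023},  # lowest priority
]

def assign_edca_parameters(streams):
    """Jointly assign EDCA parameter sets to concurrent streams so that streams
    with more stringent bitrate/QoE requirements receive higher channel-access
    priority. `streams` is a list of dicts with 'name' and 'min_bitrate_bps'."""
    ranked = sorted(streams, key=lambda s: s["min_bitrate_bps"], reverse=True)
    assignment = {}
    for rank, stream in enumerate(ranked):
        tier = EDCA_TIERS[min(rank, len(EDCA_TIERS) - 1)]
        assignment[stream["name"]] = tier
    return assignment

print(assign_edca_parameters([
    {"name": "stream A", "min_bitrate_bps": 4_000_000},
    {"name": "stream B", "min_bitrate_bps": 2_000_000},
]))
```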
[0081] QoE-Optimizing Platform Adaptation Architecture for Adaptive Streaming
[0082] In another example embodiment, a cross-layer optimized platform adaptation architecture is defined for adaptive streaming, in which video, transport and radio components in the platform cooperate and exchange information towards identifying platform configurations needed to optimize user QoE.
[0083] An example client adaptation architecture 302 illustrated against a series of associated Open Systems Interconnection (OSI) communication layers and protocols 300 is depicted in FIG. 3. As illustrated, a cross-layer adaptation manager 304 extending across each of the OSI communication layers is operable with each of the following system components:
[0084] Radio Adaptation and QoS engine 320: Determines radio-level adaptation and QoS parameters;
[0085] Network Adaptation and QoS engine 318: Determines network- level adaptation and QoS parameters;
[0086] RTSP/HTTP Access Client 316: Handles transport-level
RTSP/RTP/UDP/IP or HTTP/TCP/IP operations, and establishes and manages the RTSP/HTTP transport connections;
[0087] Adaptive Streaming Control Engine 312: Parses the SDP or MPD parameters and determines streaming parameters for adaptive streaming (e.g., DASH segment duration, sequence and timing of HTTP requests, etc.);
[0088] Media Adaptation Engine 314: Determines codec-level adaptation parameters; and
[0089] QoE monitor 310: Dynamically measures QoE.
[0090] For example, DASH client platform configurations may be jointly optimized at the video, transport and radio levels via cross-layer cooperation of the cross-layer adaptation manager 304, and associated system components, in connection with the following parameters at each appropriate layer:
[0091] -Application (Video) layer: Bitrate, frame rate, resolution, and the client's decisions on which content representations to request from the DASH server;
[0092] -Transport layer: QoE feedback based on the real-time transport control protocol (RTCP), sequence and timing of HTTP requests, number of parallel TCP connections, DASH segment durations, and so forth;
[0093] - Network, and Link and Physical (Radio) layers: Modulation and coding scheme (MCS), target QoS parameters for the core network and radio access network.
[0094] Further, in one embodiment, the adaptive streaming client platform can dynamically track the following parameters, and use parameter values as inputs for decisions towards jointly adapting the streaming client configurations via cross-layer cooperation (an illustrative adaptation loop is sketched after this list):
[0095] - Measured QoE parameters, e.g., video quality metrics (VQM), structural similarity metrics (SSIM), perceptual evaluation of video quality metrics (PEVQ), video mean opinion scores (MOS), and other subjective quality metrics;
[0096] - Measured video rate-distortion characteristics;
[0097] - User preferences at the application layer;
[0098] - Multimedia-related information retrieved from SDP or MPD parameters;
[0099] - Information received from the network on current QoS availability and network congestion states;
[00100] - Measured dynamic QoS parameters (e.g., throughput, latency, reliability, etc.);
[00101] - Measured dynamic channel/network conditions at the radio and transport levels; and
[00102] - Power/latency budgets and CPU/buffer/memory requirements at the platform architecture level.
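The adaptation loop sketched below combines a few of the tracked inputs listed above (measured QoE, measured throughput, and network congestion state) into a joint adjustment of the requested bitrate and transport parameters. The decision thresholds and returned parameter names are illustrative assumptions only.

```python
def adapt_client_configuration(measured_qoe_mos, measured_throughput_bps, congested,
                               current_bitrate_bps, min_bitrate_bps, max_bitrate_bps):
    """One iteration of a cross-layer adaptation decision combining measured QoE,
    transport/radio measurements, and network congestion state (illustrative logic)."""
    new_bitrate = current_bitrate_bps
    if congested or measured_qoe_mos < 3.5:
        # Back off the requested representation when the network is congested or QoE drops.
        new_bitrate = max(min_bitrate_bps, int(current_bitrate_bps * 0.75))
    elif measured_throughput_bps > 1.25 * current_bitrate_bps:
        # Headroom available: step the requested representation up toward the useful maximum.
        new_bitrate = min(max_bitrate_bps, int(current_bitrate_bps * 1.25))
    return {"requested_bitrate_bps": new_bitrate,
            "parallel_tcp_connections": 1 if congested else 2,
            "segment_duration_s": 10 if congested else 4}

print(adapt_client_configuration(3.0, 2_500_000, True, 4_000_000, 1_000_000, 8_000_000))
```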
[00103] Wireless Multimedia Network Applications
[00104] As previously described, the present network enhancement techniques are applicable to Wireless Multimedia Network protocols such as WiDi, which may implement a special case of the proposed end-to-end QoS architecture. Due to its simple point-to-point communication setting, WiDi allows for full control over both transmit and receive ends of the link, allowing for highly-optimized multimedia adaptation.
[00105] Adaptive streaming over WiDi may be performed using the RTSP protocol. In this context, a cross-layer coordinated QoS framework may be adapted to optimize channel access among concurrent WiDi applications for delivering the best possible multimedia QoE, allowing effective performance of adaptive streaming with QoE in a multi-access environment. This may help ensure that the WiDi links share the medium in a "content-aware" and "display aware" fashion with the appropriate prioritizations among the streams during channel access.
[00106] More specifically, this embodiment may enable content-aware and display-aware selection of video adaptation parameters (bitrate, resolution, frame rate, content characteristics, etc.), DCA, and target QoS parameters for different WiDi connections in order to share resources efficiently and realize the best possible video quality levels over all WiDi applications. In addition, this embodiment may also factor the type of content being broadcast. For example, a low-definition action movie or sports presentation may require more data
transmission and throughput than a higher-definition low-motion movie.
[00107] In one embodiment, the presented CSMA/CA-based multimedia QoS and traffic prioritization framework is applicable such that access categories and associated system parameters (e.g., AIFSN, CW and TXOP parameters) for
EDCA or HCCA are determined based on QoE-optimized mapping functions derived from the multimedia-specific application-layer parameters in the SDP, receiver device/display capabilities and physical link conditions, and the like.
[00108] For example, WiDi devices may utilize RTSP/SDP-based signaling mechanisms to exchange multimedia- specific application-layer parameters and receiver device/display capability information over the radio links. These parameters and capability information may be applied to enable coordinated QoE optimization, application-aware network adaptation, QoS support, and resource management for adaptive streaming services transmitting over a WiDi connection. Therefore, a client device running a WiDi application may manage the streaming session, modify session parameters (e.g., derive new RTSP/SDP session parameters), adapt video parameters (e.g., bitrate, resolution, frame rate, etc.), prioritize traffic, allocate resources and optimize
bandwidth/QoS for its local connections (e.g., WiDi links) to the displays based on multimedia information gathered from session- level signaling (e.g., SDP or MPD signaling) over the other video delivery networks (e.g., 3GPP, WiFi or DVB networks) using SIP, RTSP or HTTP protocols, including codec information, quality requirements, and rate-distortion characteristics.
[00109] A WiDi client platform architecture may also perform RTSP- based adaptive streaming based on the proposed QoE-aware cross-layer cooperation framework in order to jointly optimize platform parameters for video/network/radio adaptation and QoS support. For example, WiDi devices may signal the RTSP/SDP or MPD information as well as the receiver device/display capabilities to other WiDi devices in order to share adaptive streaming related session information with the appropriate entities in the network. In such cases, the decisions on the QoS parameters (QoS class or access category (AC) parameters, etc.) for all clients sharing the
resources/spectrum can be made jointly in order to find the QoE-optimizing traffic prioritization among the clients in a coordinated fashion.
[00110] FIG. 4 provides an illustration of an example network configuration 400 for transmitting streaming multimedia content with a DASH adaptive streaming protocol in accordance with an example embodiment. FIG. 4 specifically depicts an example use case of over-the-top Internet video streaming
using DASH and adaptive HTTP techniques, e.g., Apple HTTP Live Streaming. As illustrated, multimedia content is communicated from a DASH server 402 to a WiFi AP 404, and then via a WiFi network to a computer 406 running a DASH Client and a WiDi communications application. The computer 406 utilizes the WiDi communications application to transmit multimedia content via a WiDi WiFi P2P network to two devices, a large-sized screen television 410 and a medium-sized receiver screen 414 (e.g., a computing device or another display device with a smaller screen than television 410).
[00111] In this particular scenario, two different video streams are to be received by the DASH client from one or more DASH servers to be projected onto two different displays with varying characteristics. Accordingly, the DASH Client first fetches the MPD from the DASH server(s) and learns about multimedia characteristics for content to be streamed over the WiDi links (e.g., these parameters may include minimum bitrate for acceptable video quality and maximum bitrate above which perceived video quality improvement is negligible). This is followed by the DASH client using RTSP/SDP signaling to gather capability information from each of the displays. The DASH client then estimates link qualities to each of the displays based on the physical channel conditions (e.g. by tracking packet error/loss statistics).
[00112] The network configuration 400 is specifically configured to provide the multimedia content to the large screen television 410 at a high priority 408 (e.g., AC Priority Level 1). The network configuration 400 is further configured to provide the multimedia content to the medium- sized receiver screen 414 at a low priority 412 (e.g., AC Priority Level 2).
[00113] Based on the MPD information, display capabilities, and physical link conditions gathered from the first three steps, the DASH client may next determine the QoE-optimizing adaptive streaming configuration including video adaptation parameters and QoS parameters. Finally, the DASH client streams content to the displays based on the QoE-optimizing content-aware and display- aware adaptive streaming configuration based on the selected video adaptation parameters and QoS parameters.
[00114] FIG. 5 provides an illustration of a method for performing a QoE-optimization procedure with DASH-based video streaming according to one embodiment.
First, multimedia characteristics for the streaming content may be determined (operation 510), for example, by the DASH client obtaining the MPD from the DASH server(s) and analyzing multimedia characteristics for content to be streamed over the WiDi links. Capability information for the various displays may be determined (operation 520), for example, by the DASH client using RTSP/SDP signaling to gather capability information from each of the displays. Network link conditions to the displays or display clients may be determined (operation 530), for example, by the DASH client estimating link qualities to each of the displays based on the physical channel conditions (such as by tracking packet error/loss statistics).
[00115] Based on the previously determined multimedia characteristics, display capability information, and network link conditions, an adaptive streaming configuration may be determined (operation 540). For example, the DASH client may calculate relevant QoE-optimizing adaptive streaming configuration parameters including video adaptation parameters and QoS parameters using the previously described MPD information, display capabilities, and physical link conditions.
[00116] Finally, content may be streamed based on the determined adaptive streaming configuration (operation 550). For example, the DASH client may stream content to the displays based on the QoE-optimizing adaptive streaming configuration parameters including the previously described video adaptation parameters and QoS parameters.
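The five operations of FIG. 5 can be summarized in the following self-contained sketch, in which each helper function is a stub standing in for the corresponding operation; the stubbed values, function names, and display identifiers are hypothetical.

```python
def fetch_and_parse_mpd():
    # Operation 510: obtain the MPD and extract multimedia characteristics (stubbed values).
    return {"min_bitrate_bps": 2_000_000, "max_bitrate_bps": 8_000_000}

def query_display_capabilities(display):
    # Operation 520: gather capability information, e.g., via RTSP/SDP signaling (stubbed).
    return {"width": 1920, "height": 1080} if display == "large_tv" else {"width": 1280, "height": 720}

def estimate_link_quality(display):
    # Operation 530: estimate link quality, e.g., from packet error/loss statistics (stubbed).
    return {"estimated_throughput_bps": 6_000_000 if display == "large_tv" else 3_000_000}

def derive_adaptive_streaming_config(media, capabilities, links):
    # Operation 540: choose per-display video adaptation and QoS parameters.
    return {d: {"bitrate_bps": min(media["max_bitrate_bps"],
                                   links[d]["estimated_throughput_bps"]),
                "resolution": (capabilities[d]["width"], capabilities[d]["height"])}
            for d in capabilities}

def run_streaming_session(displays):
    media = fetch_and_parse_mpd()
    capabilities = {d: query_display_capabilities(d) for d in displays}
    links = {d: estimate_link_quality(d) for d in displays}
    config = derive_adaptive_streaming_config(media, capabilities, links)
    # Operation 550: the client would now stream to each display using config[d].
    return config

print(run_streaming_session(["large_tv", "medium_screen"]))
```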
[00117] FIG. 6 illustrates another example embodiment of cross-layer optimized adaptive streaming techniques, with a network architecture supporting a two-user WiFi network where each user is concurrently running WiDi applications to stream multimedia content. As depicted, WiDi Link 1 is configured to transmit real-time video from computing system 602 to large screen television 608 with a high priority level 604. Within WiDi Link 1, RTSP/SDP messaging 606 is configured to exchange application-layer parameters between the source (computing system 602) and sink (television
608). As further depicted, WiDi Link 2 is also configured to transmit real-time video from computing system 612 to medium-sized screen receiver device 618 with a low priority level 614; and likewise, WiDi Link 2 exchanges similar
RTSP / SDP messaging 616 between the computing system 612 and the receiver device 618.
[00118] Using the RTSP / SDP-based signaling 606, 614 between each of the source-sink device pairs, each WiDi configuration determines the application layer parameters associated with display capabilities, parses the locally stored SDP for each video content to gather multimedia-specific information, estimates the physical link conditions, and uses these criteria to determine the QoE optimizing video adaptation parameters/QoS access category/and associated EDCA / HCCA parameters corresponding to their multimedia streams.
[00119] For example, suppose the user of WiDi Link 1 would like to stream a fast-moving, high-quality video stream to a large screen TV 608 with a minimum bitrate equivalent to 4 Mbps of channel capacity in order to meet the target QoE for the video stream, while the user of WiDi Link 2 would like to stream a slow-moving, lower-quality video stream to a medium-sized screen receiver device 618 with a minimum bitrate equivalent to 2 Mbps of channel capacity in order to meet the target QoE for the video stream. Hence, WiDi Link 1 has a more stringent QoE and bitrate requirement compared to WiDi Link 2. Consequently, WiDi Link 1 is assigned to a higher priority level 604 while WiDi Link 2 is assigned to a lower priority level 614, allowing WiDi Link 1 to utilize more of the channel capacity resources and thereby meet its higher bitrate requirement. Through such content-aware channel access, both users are able to meet their QoE requirements and enjoy a satisfying video streaming experience.
[00120] Suppose in the illustrated example of FIG. 6 that the channel capacity provided to both users is approximately 6 Mbps. Therefore, in a legacy DCF framework in which both users would have equal priority for channel access, each WiDi link could only realize an average throughput of 3 Mbps. While this throughput would allow WiDi Link 2 to meet its target QoE requirement, WiDi Link 1 would not be able to meet its target QoE requirement.
[00121] As a remedy, a content-aware channel access solution in accordance with one embodiment provides a CWmin ratio of 2:1 in the QoS-prioritized CSMA/CA-based WiFi access. This is illustrated in the channel access 610 for the WiFi network used to transmit the data via the WiDi links, configured to allow up to two times higher throughput for WiDi Link 1 in
comparison with WiDi Link 2 (i.e., WiDi Link 1 gains access to 2/3 of the channel bandwidth, while WiDi Link 2 gains access to 1/3 of the channel bandwidth). This results in WiDi Link 1 realizing 4 Mbps of WiDi throughput and WiDi Link 2 realizing 2 Mbps of WiDi throughput. Hence, both users are able to meet their target QoE requirements.
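The arithmetic of this example can be checked with a short sketch, under the simplifying assumption that each link's long-run throughput share is inversely proportional to its CWmin; the absolute CWmin values used below are assumptions chosen only to realize the 2:1 ratio.

```python
# Simplifying assumption: long-run throughput share is inversely proportional to CWmin.
CHANNEL_CAPACITY_BPS = 6_000_000
cw_min = {"WiDi Link 1": 15, "WiDi Link 2": 30}  # 2:1 CWmin ratio favoring Link 1 (assumed values)

weights = {link: 1.0 / cw for link, cw in cw_min.items()}
total = sum(weights.values())
throughput = {link: CHANNEL_CAPACITY_BPS * w / total for link, w in weights.items()}

# Link 1 receives ~4 Mbps (2/3 of the channel) and Link 2 ~2 Mbps (1/3), matching the example.
print(throughput)
```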
[00122] FIG. 7 provides a summarized illustration of a method used to implement adaptive streaming optimization, with use of various of the previously described techniques according to one example embodiment. As illustrated, the adaptive streaming optimization occurs in connection with streaming multimedia content to one or more displays with one or more adaptive streaming connections (operation 710).
[00123] The calculations performed with the adaptive streaming optimization may include: receiving display parameters via the connection(s) (operation 720); determining the stream requirements for the connection(s) (operation 730); determining the network link condition(s) for the connection(s) (operation 740); and determining the target QoS parameters for the connection(s) (operation 750). The target QoS parameters are then implemented for the connection(s) (operation 760). The result of the implemented QoS parameters for the connection(s) may then be verified (operation 770), with further adjustments and implementations to the QoS parameters provided in subsequent activities.
[00124] Although the previously described techniques and configurations were provided with reference to specific implementations of wireless multimedia networks such as WiDi, these techniques and configurations may also be applicable to a variety of WLANs, WWANs, and wireless communication standards implementing communication quality-enhancing techniques. Further, the previously described techniques and configurations may be applied within any number of multimedia streaming applications and protocols over wireless networks to enhance user quality of experience through the proposed cross-layer optimized and QoS-enabled adaptive streaming techniques.
[00125] Embodiments may be implemented in one or a combination of hardware, firmware, and software. Embodiments may also be implemented as instructions stored on a computer-readable storage device, which may be read
and executed by at least one processor to perform the operations described herein. A computer-readable storage device may include any non-transitory mechanism for storing information in a form readable by a machine (e.g., a computer). For example, a computer-readable storage device may include read- only memory (ROM), random-access memory (RAM), magnetic disk storage media, optical storage media, flash-memory devices, and other storage devices and media. In some embodiments, communication devices such as a base station or UE may include one or more processors and may be configured with instructions stored on a computer-readable storage device.
[00126] The Abstract is provided to comply with 37 C.F.R. Section
1.72(b) requiring an abstract that will allow the reader to ascertain the nature and gist of the technical disclosure. It is submitted with the understanding that it will not be used to limit or interpret the scope or meaning of the claims. The following claims are hereby incorporated into the detailed description, with each claim standing on its own as a separate embodiment.
Claims
1. A method performed by a transmitting device for adaptive streaming optimization within a wireless local area network, comprising:
determining target quality of service (QoS) parameters for an adaptive streaming communication within the wireless local area network from multimedia- specific parameters, receiver display capabilities of one or more receiving devices, and link conditions; and
transmitting the adaptive streaming communication in the wireless local area network in accordance with the target QoS parameters;
wherein the target QoS parameters are implemented using cross-layer adaptation among a plurality of network levels in the wireless local area network to provide quality of experience (QoE) for multimedia content to be delivered by the adaptive streaming communication.
2. The method of claim 1, wherein the wireless local area network operates using a carrier sense multiple access/collision avoidance (CSMA/CA) protocol, and wherein implementing the target QoS parameters within the network includes providing QoS and traffic prioritization in connection with CSMA/CA-based operations of the wireless local area network.
3. The method of claim 2, wherein access categories and associated system parameters for Enhanced Distributed Channel Access (EDCA) or Hybrid Coordination Function Controlled Channel Access (HCCA) used in the wireless local area network are determined based on the multimedia-specific parameters, the receiver display capabilities, and the link conditions.
4. The method of claim 1, wherein the wireless local area network is operably coupled to a core network providing the multimedia content, the method further comprising implementing the target QoS parameters within the core network.
5. The method of claim 1, wherein the adaptive streaming
communication over the wireless local area network is conducted using Real Time Streaming Protocol (RTSP) streaming, and wherein the multimedia- specific parameters are communicated to the transmitting device within session description protocol (SDP) parameters.
6. The method of claim 1, wherein the adaptive streaming
communication over the wireless local area network is conducted using Real Time Streaming Protocol (RTSP) streaming, and wherein the multimedia- specific parameters are communicated to the transmitting device in media presentation description (MPD) metadata used for Hypertext Transport Protocol (HTTP)-based adaptive streaming.
7. The method of claim 1, further comprising:
receiving client signaling in connection with the adaptive streaming communication, the client signaling indicating at least one of the multimedia-specific parameters, the receiver display capabilities, and the link conditions.
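A hypothetical example of the client signaling recited in claim 7: the receiving device reports multimedia-specific parameters, its display capabilities, and measured link conditions back to the transmitting device. The use of JSON and every field name here are assumptions chosen for illustration; the claim does not fix a message format or transport.

```python
# Hypothetical client report; field names and units are invented.
import json

client_report = {
    "media": {"supported_codecs": ["avc", "hevc"], "max_bitrate_kbps": 4000},
    "display": {"width": 1920, "height": 1080, "max_frame_rate": 60},
    "link": {"throughput_kbps": 8200, "loss_rate": 0.002, "rtt_ms": 18},
}

payload = json.dumps(client_report)  # body of the signaling message
print(payload)
```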
8. The method of claim 1, further comprising:
adapting the target QoS parameters, bandwidth allocations, transport parameters, and multimedia streaming parameters at application, session, and transport layers of the wireless local area network based on the multimedia-specific parameters, the receiver display capabilities, and the link conditions, to further provide the QoE for the multimedia content delivered by the adaptive streaming communication.
9. The method of claim 1, further comprising:
adapting parameters of the multimedia content to provide the QoE for the multimedia content delivered by the adaptive streaming communication, including modifying at least one of bitrate, resolution, or frame rate of the multimedia content.
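Claim 9 covers adapting the content itself by changing bitrate, resolution, or frame rate. The sketch below uses an invented bitrate ladder to show one plausible way those parameters could be traded off against the available throughput and the receiver's display height; the ladder rungs and bitrate estimates are assumptions, not values from the disclosure.

```python
# Hedged sketch of content-level adaptation over an invented bitrate ladder.
LADDER = [
    # (width, height, fps, approx_kbps) -- illustrative values only
    (1920, 1080, 30, 5000),
    (1280, 720, 30, 2500),
    (1280, 720, 24, 2000),
    (854, 480, 30, 1200),
    (640, 360, 24, 600),
]

def adapt_content(available_kbps: float, max_display_height: int) -> dict:
    """Return the highest ladder rung that fits both the link and the display."""
    for width, height, fps, kbps in LADDER:
        if kbps <= available_kbps and height <= max_display_height:
            return {"width": width, "height": height, "fps": fps, "bitrate_kbps": kbps}
    # Fall back to the lowest rung when nothing fits.
    width, height, fps, kbps = LADDER[-1]
    return {"width": width, "height": height, "fps": fps, "bitrate_kbps": kbps}

print(adapt_content(available_kbps=1500, max_display_height=720))
```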
10. The method of claim 1, wherein determining the target QoS parameters includes factoring multimedia-specific parameters, receiver display capabilities, and link conditions from at least one additional adaptive streaming communication occurring over the network.
11. The method of claim 10, wherein the adaptive streaming communication is transmitted to a first receiving device, and wherein the at least one additional adaptive streaming communication is transmitted to at least one additional receiving device.
12. The method of claim 1, wherein the adaptive streaming
communication is provided as a digital transmission of video and audio data signals to a receiving device using a peer-to-peer (P2P) connection over the wireless local area network.
13. A network communications device, comprising:
processing-layer circuitry configured to provide a cross-layer adaptation manager for optimizing adaptive streaming at multiple network layers in connection with wireless network communications performed by the network communications device, the cross-layer adaptation manager configured to: calculate target quality of service (QoS) parameters for the wireless network, from multimedia-specific parameters, receiver display capabilities of one or more receiving devices, and link conditions, wherein the target QoS parameters are configured for implementation in the wireless network to provide quality of experience (QoE) for the multimedia content delivered by the adaptive streaming communications; and
physical-layer circuitry to transmit adaptive streaming communications within the wireless network in accordance with the target QoS parameters.
14. The network communications device of claim 13, the cross-layer adaptation manager providing a Radio Adaptation and QoS engine to determine wireless radio-level adaptation and QoS parameters for the wireless network.
15. The network communications device of claim 13, the cross-layer adaptation manager providing a Network Adaptation and QoS engine to determine network-level adaptation and QoS parameters for the wireless network or a core network operably coupled to the wireless network.
16. The network communications device of claim 13, the cross-layer adaptation manager providing a Real Time Streaming Protocol
(RTSP)/Hypertext Transfer Protocol (HTTP) Access Client to handle transport-level operations, and establish and manage the RTSP/HTTP transport connections for the wireless network.
17. The network communications device of claim 13, wherein obtaining multimedia-specific parameters includes parsing multimedia content information provided in metadata; and
wherein calculating the target QoS parameters includes determining the target QoS parameters, streaming parameters, transport parameters, network parameters, and radio parameters for the adaptive streaming communication based on the parsed multimedia content information.
18. The network communications device of claim 17, wherein the multimedia content information provided in metadata includes one or both of session description protocol (SDP) or media presentation description (MPD) metadata.
19. The network communications device of claim 13, the cross-layer adaptation manager providing a Media Adaptation Engine to determine codec-level adaptation parameters for the adaptive streaming communication.
20. The network communications device of claim 13, the cross-layer adaptation manager providing a QoE monitor to dynamically measure QoE of the adaptive streaming communication at a receiver display, and determine the target QoS parameters, streaming parameters, transport parameters, network parameters, and radio transmission parameters based on the measured QoE of the adaptive streaming communication at the receiver display.
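Claims 13 to 20 name several cooperating engines inside the cross-layer adaptation manager. The skeleton below is a structural sketch only: the class and method names are invented, the engine bodies are stubs, and the claims do not prescribe this or any particular API.

```python
# Structural sketch of a cross-layer adaptation manager; all names are
# invented for illustration and the engines are left as stubs.
class RadioAdaptationAndQoSEngine:
    def radio_parameters(self, link): ...          # claim 14: radio-level adaptation/QoS

class NetworkAdaptationAndQoSEngine:
    def network_parameters(self, target_qos): ...  # claim 15: network-level adaptation/QoS

class RtspHttpAccessClient:
    def open_transport(self, url): ...             # claim 16: transport-level operations

class MediaAdaptationEngine:
    def codec_parameters(self, target_qos, media): ...  # claim 19: codec-level adaptation

class QoEMonitor:
    def measured_qoe(self): ...                    # claim 20: QoE measurement at the display

class CrossLayerAdaptationManager:
    """Coordinates radio, network, transport, and media adaptation decisions."""

    def __init__(self):
        self.radio = RadioAdaptationAndQoSEngine()
        self.network = NetworkAdaptationAndQoSEngine()
        self.access = RtspHttpAccessClient()
        self.media = MediaAdaptationEngine()
        self.qoe = QoEMonitor()

    def adapt(self, target_qos, media_params, link):
        """Collect per-layer parameter decisions for one adaptation round."""
        return {
            "radio": self.radio.radio_parameters(link),
            "network": self.network.network_parameters(target_qos),
            "codec": self.media.codec_parameters(target_qos, media_params),
            "qoe": self.qoe.measured_qoe(),
        }

manager = CrossLayerAdaptationManager()
print(manager.adapt(target_qos=None, media_params=None, link=None))
```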
21. The network communications device of claim 13, wherein one or more peer-to-peer (P2P) connections are established between at least one receiving device and at least one transmitting device via the wireless network in connection with at least one digital transmission of video and audio data signals.
22. A wireless communication device, comprising:
processing-layer circuitry configured to provide a cross-layer adaptation manager for optimizing adaptive streaming across multiple network layers in connection with an adaptive streaming communication transmitted from the wireless communication device, wherein the cross-layer adaptation manager is configured to:
determine target quality of service (QoS) parameters for wireless network implementation based on multimedia-specific parameters, receiver display capabilities of a receiving device, and link conditions, wherein the target QoS parameters provide quality of experience (QoE) for multimedia content delivered by the adaptive streaming communication; and
physical-layer circuitry to transmit the adaptive streaming
communication from the wireless communication device to the receiving device via a wireless local area network, in accordance with the target QoS parameters.
23. The wireless communication device of claim 22, the cross-layer adaptation manager further configured to:
adapt the target QoS parameters, bandwidth allocations, transport parameters, and multimedia streaming parameters at application, session, and transport layers of the wireless local area network based on the multimedia-specific parameters, the receiver display capabilities, and the link conditions.
24. The wireless communication device of claim 22, wherein the adaptive streaming communication occurs via a peer-to-peer (P2P) connection between the receiving device and the wireless communication device,
wherein the wireless communication device receives the multimedia content and content-specific application layer parameters of the multimedia content,
wherein the cross-layer adaptation manager adapts the target QoS parameters, transport parameters, and multimedia streaming parameters of the multimedia content for transmission of the multimedia content via the P2P connection, and
wherein the cross-layer adaptation manager manages session parameters for the adaptive streaming communication over the P2P connection.
25. The wireless communication device of claim 24, wherein the P2P connection facilitates a digital transmission of video and audio data signals from the wireless communication device to the receiving device via the wireless local area network.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201180075247.4A CN103959798B (en) | 2011-09-30 | 2011-09-30 | Quality of experience enhancing on wireless network |
EP11873166.0A EP2761881A4 (en) | 2011-09-30 | 2011-09-30 | Quality of experience enhancements over wireless networks |
US13/993,417 US20140219088A1 (en) | 2011-09-30 | 2011-09-30 | Quality of experience enhancements over wireless networks |
PCT/US2011/054406 WO2013048484A1 (en) | 2011-09-30 | 2011-09-30 | Quality of experience enhancements over wireless networks |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/US2011/054406 WO2013048484A1 (en) | 2011-09-30 | 2011-09-30 | Quality of experience enhancements over wireless networks |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2013048484A1 (en) | 2013-04-04 |
Family
ID=47996215
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2011/054406 WO2013048484A1 (en) | 2011-09-30 | 2011-09-30 | Quality of experience enhancements over wireless networks |
Country Status (4)
Country | Link |
---|---|
US (1) | US20140219088A1 (en) |
EP (1) | EP2761881A4 (en) |
CN (1) | CN103959798B (en) |
WO (1) | WO2013048484A1 (en) |
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2806633A1 (en) * | 2013-05-23 | 2014-11-26 | Alcatel Lucent | Method and apparatus for improved network optimization for providing video from a plurality of sources to a plurality of clients |
EP2806632A1 (en) * | 2013-05-23 | 2014-11-26 | Alcatel Lucent | Method and apparatus for optimizing video quality of experience in end-to-end video applications |
WO2014187491A1 (en) * | 2013-05-23 | 2014-11-27 | Nokia Solutions And Networks Oy | Methods and apparatus for adapting a data rate |
CN104185285A (en) * | 2013-05-28 | 2014-12-03 | 华为技术有限公司 | Media data transmission method, device and system |
WO2015013811A1 (en) * | 2013-08-02 | 2015-02-05 | Blackberry Limited | Wireless transmission of real-time media |
WO2015103627A1 (en) * | 2014-01-06 | 2015-07-09 | Intel Corporation | Client/server signaling commands for dash |
CN105247882A (en) * | 2013-06-06 | 2016-01-13 | 索尼公司 | Content supply device, content supply method, program, and content supply system |
KR20160030517A (en) * | 2013-06-17 | 2016-03-18 | 구글 인코포레이티드 | Method, apparatus and computer-readable medium for media content streaming device setup |
KR20160060056A (en) * | 2013-09-27 | 2016-05-27 | 소니 주식회사 | Content supply device, content supply method, program, terminal device, and content supply system |
US9363333B2 (en) | 2013-11-27 | 2016-06-07 | At&T Intellectual Property I, Lp | Server-side scheduling for media transmissions |
CN105791408A (en) * | 2016-03-29 | 2016-07-20 | 中国科学院信息工程研究所 | P2P network construction method and system |
US9532043B2 (en) | 2013-08-02 | 2016-12-27 | Blackberry Limited | Wireless transmission of real-time media |
EP3113441A1 (en) * | 2015-06-29 | 2017-01-04 | Alcatel-Lucent España | Dynamic quality of experience improvement in media content streaming |
EP3025504A4 (en) * | 2013-07-22 | 2017-05-17 | Intel Corporation | Coordinated content distribution to multiple display receivers |
EP3065411A4 (en) * | 2013-10-28 | 2017-07-19 | Sony Corporation | Content supplying device, content supplying method, program, terminal device, and content supplying program |
CN107535002A (en) * | 2015-01-07 | 2018-01-02 | 奥兰治 | System for transmitting packet according to multiple access protocol |
EP3497935A1 (en) * | 2016-08-09 | 2019-06-19 | V-Nova International Limited | Adaptive video consumption |
EP3641223A4 (en) * | 2017-07-25 | 2020-04-29 | Huawei Technologies Co., Ltd. | Method for acquiring qoe information, and terminal and network device |
EP3672154A1 (en) * | 2018-12-20 | 2020-06-24 | Fundacion Centro De Tecnologias De Interaccion Visual Y Comunicaciones Vicomtech | Optimising transmission of streaming video contents based on qoe metrics |
CN112823527A (en) * | 2018-10-05 | 2021-05-18 | 交互数字Ce专利控股公司 | Method implemented at a device capable of running an adaptive streaming session and corresponding device |
US11070445B2 (en) | 2019-01-25 | 2021-07-20 | Tambora Systems Singapore Pte. Ltd. | System and method for optimization of an over-the-top (OTT) platform |
CN114143300A (en) * | 2021-11-25 | 2022-03-04 | 中国银行股份有限公司 | Transaction request sending method and device |
EP3850860A4 (en) * | 2018-09-12 | 2022-05-04 | Roku, Inc. | Adaptive switching in a whole home entertainment system |
EP3850859A4 (en) * | 2018-09-12 | 2022-05-11 | Roku, Inc. | Dynamically adjusting video to improve synchronization with audio |
Families Citing this family (84)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140229570A1 (en) * | 2011-03-30 | 2014-08-14 | Telefonaktiebolaget L M Ericsson (Publ) | Adaptive rate transmission over a radio interface |
US9407466B2 (en) * | 2011-09-19 | 2016-08-02 | Arris Enterprises, Inc. | Adaptively delivering services to client devices over a plurality of networking technologies in a home network |
WO2013057315A2 (en) | 2011-10-21 | 2013-04-25 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Resource management concept |
US8977704B2 (en) | 2011-12-29 | 2015-03-10 | Nokia Corporation | Method and apparatus for flexible caching of delivered media |
CN103188725B (en) * | 2011-12-29 | 2018-01-30 | 中兴通讯股份有限公司 | A kind of adaptation of cooperation service, shunting transmission and stream switching method and system |
US9401968B2 (en) * | 2012-01-20 | 2016-07-26 | Nokia Techologies Oy | Method and apparatus for enabling pre-fetching of media |
US9203888B2 (en) * | 2012-05-01 | 2015-12-01 | Ericsson Ab | Server-side class-of-service-based bandwidth management in over-the-top video delivery |
WO2014011720A1 (en) * | 2012-07-10 | 2014-01-16 | Vid Scale, Inc. | Quality-driven streaming |
US20140033242A1 (en) * | 2012-07-24 | 2014-01-30 | Srinivasa Rao | Video service assurance systems and methods in wireless networks |
US9591513B2 (en) * | 2012-08-06 | 2017-03-07 | Vid Scale, Inc. | Rate adaptation using network signaling |
US11159804B1 (en) * | 2012-09-13 | 2021-10-26 | Arris Enterprises Llc | QoE feedback based intelligent video transport stream tuning |
CN103795747A (en) * | 2012-10-30 | 2014-05-14 | 中兴通讯股份有限公司 | File transfer method and device through Wi-Fi Direct |
US9967300B2 (en) * | 2012-12-10 | 2018-05-08 | Alcatel Lucent | Method and apparatus for scheduling adaptive bit rate streams |
US9369671B2 (en) * | 2013-02-11 | 2016-06-14 | Polycom, Inc. | Method and system for handling content in videoconferencing |
US9137091B2 (en) * | 2013-02-20 | 2015-09-15 | Novatel Wireless, Inc. | Dynamic quality of service for control of media streams using feedback from the local environment |
US20140281002A1 (en) * | 2013-03-14 | 2014-09-18 | General Instrument Corporation | Devices, systems, and methods for managing and adjusting adaptive streaming traffic |
US10284612B2 (en) * | 2013-04-19 | 2019-05-07 | Futurewei Technologies, Inc. | Media quality information signaling in dynamic adaptive video streaming over hypertext transfer protocol |
CN105144768B (en) | 2013-04-26 | 2019-05-21 | 英特尔Ip公司 | Shared frequency spectrum in frequency spectrum share situation is redistributed |
US9973559B2 (en) * | 2013-05-29 | 2018-05-15 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Systems and methods for presenting content streams to a client device |
US20150046568A1 (en) * | 2013-08-11 | 2015-02-12 | Imvision Software Technologies Ltd. | Method and system for playing multicast over-the-top (ott) content streams |
US9986044B2 (en) * | 2013-10-21 | 2018-05-29 | Huawei Technologies Co., Ltd. | Multi-screen interaction method, devices, and system |
US9455932B2 (en) * | 2014-03-03 | 2016-09-27 | Ericsson Ab | Conflict detection and resolution in an ABR network using client interactivity |
US10142259B2 (en) | 2014-03-03 | 2018-11-27 | Ericsson Ab | Conflict detection and resolution in an ABR network |
US9397947B2 (en) * | 2014-03-11 | 2016-07-19 | International Business Machines Corporation | Quality of experience for communication sessions |
US10264211B2 (en) * | 2014-03-14 | 2019-04-16 | Comcast Cable Communications, Llc | Adaptive resolution in software applications based on dynamic eye tracking |
US9992500B2 (en) * | 2014-03-18 | 2018-06-05 | Intel Corporation | Techniques for evaluating compressed motion video quality |
FR3021489A1 (en) * | 2014-05-22 | 2015-11-27 | Orange | METHOD FOR ADAPTIVE DOWNLOAD OF DIGITAL CONTENT FOR MULTIPLE SCREENS |
US9779307B2 (en) | 2014-07-07 | 2017-10-03 | Google Inc. | Method and system for non-causal zone search in video monitoring |
US10140827B2 (en) | 2014-07-07 | 2018-11-27 | Google Llc | Method and system for processing motion event notifications |
KR102260177B1 (en) | 2014-07-16 | 2021-06-04 | 텐세라 네트워크스 리미티드 | Efficient content delivery over wireless networks using guaranteed prefetching at selected times-of-day |
US11095743B2 (en) | 2014-07-16 | 2021-08-17 | Tensera Networks Ltd. | Optimized content-delivery network (CDN) for the wireless last mile |
US9979796B1 (en) | 2014-07-16 | 2018-05-22 | Tensera Networks Ltd. | Efficient pre-fetching notifications |
US10506027B2 (en) | 2014-08-27 | 2019-12-10 | Tensera Networks Ltd. | Selecting a content delivery network |
US10602388B1 (en) * | 2014-09-03 | 2020-03-24 | Plume Design, Inc. | Application quality of experience metric |
KR102656605B1 (en) * | 2014-11-05 | 2024-04-12 | 삼성전자주식회사 | Method and apparatus to control sharing screen between plural devices and recording medium thereof |
US10079861B1 (en) | 2014-12-08 | 2018-09-18 | Conviva Inc. | Custom traffic tagging on the control plane backend |
US9819972B1 (en) * | 2014-12-10 | 2017-11-14 | Digital Keystone, Inc. | Methods and apparatuses for a distributed live-on-demand (LOD) origin |
US20160182603A1 (en) * | 2014-12-19 | 2016-06-23 | Microsoft Technology Licensing, Llc | Browser Display Casting Techniques |
CN104602113B (en) * | 2014-12-26 | 2018-03-13 | 广东欧珀移动通信有限公司 | A kind of method, apparatus and system realized long distance wireless fidelity and shown |
JP6845808B2 (en) | 2015-02-07 | 2021-03-24 | ジョウ ワン, | Methods and systems for smart adaptive video streaming driven by perceptual quality estimation |
US11622137B2 (en) * | 2015-02-11 | 2023-04-04 | Vid Scale, Inc. | Systems and methods for generalized HTTP headers in dynamic adaptive streaming over HTTP (DASH) |
KR101897959B1 (en) | 2015-02-27 | 2018-09-12 | 쏘닉 아이피, 아이엔씨. | System and method for frame replication and frame extension in live video encoding and streaming |
FR3034608A1 (en) * | 2015-03-31 | 2016-10-07 | Orange | METHOD FOR PRIORIZING MEDIA FLOW IN A COMMUNICATIONS NETWORK |
US20160316392A1 (en) * | 2015-04-27 | 2016-10-27 | Spreadtrum Hong Kong Limited | LTE-WLAN Traffic Offloading Enhancement Using Extended BSS Load Element |
US9723470B1 (en) | 2015-04-30 | 2017-08-01 | Tensera Networks Ltd. | Selective enabling of data services to roaming wireless terminals |
US10149343B2 (en) * | 2015-05-11 | 2018-12-04 | Apple Inc. | Use of baseband triggers to coalesce application data activity |
US10374965B2 (en) | 2015-06-01 | 2019-08-06 | Huawei Technologies Co., Ltd. | Systems and methods for managing network traffic with a network operator |
US10349240B2 (en) | 2015-06-01 | 2019-07-09 | Huawei Technologies Co., Ltd. | Method and apparatus for dynamically controlling customer traffic in a network under demand-based charging |
US10200543B2 (en) | 2015-06-01 | 2019-02-05 | Huawei Technologies Co., Ltd. | Method and apparatus for customer service management for a wireless communication network |
US9361011B1 (en) | 2015-06-14 | 2016-06-07 | Google Inc. | Methods and systems for presenting multiple live video feeds in a user interface |
GB2540450B (en) | 2015-07-08 | 2018-03-28 | Canon Kk | Improved contention mechanism for access to random resource units in an 802.11 channel |
US10154520B1 (en) * | 2015-09-14 | 2018-12-11 | Newracom, Inc. | Methods for random access in a wireless network |
US10334316B2 (en) | 2015-09-18 | 2019-06-25 | At&T Intellectual Property I, L.P. | Determining a quality of experience metric based on uniform resource locator data |
US9942578B1 (en) * | 2015-12-07 | 2018-04-10 | Digital Keystone, Inc. | Methods and apparatuses for a distributed live-on-demand (LOD) origin |
US10314076B2 (en) * | 2016-01-13 | 2019-06-04 | Qualcomm Incorporated | Methods and apparatus for selecting enhanced distributed channel access parameters for different stations |
KR20170087350A (en) * | 2016-01-20 | 2017-07-28 | 삼성전자주식회사 | Electronic device and operating method thereof |
US11240157B1 (en) * | 2016-03-02 | 2022-02-01 | Amazon Technologies, Inc. | Adaptive quality of service marking |
JP7072754B2 (en) | 2016-03-06 | 2022-05-23 | エスシムウェーブ インク. | Methods and systems for automatic quality measurement of streaming video |
KR102598035B1 (en) * | 2016-04-12 | 2023-11-02 | 광동 오포 모바일 텔레커뮤니케이션즈 코포레이션 리미티드 | Method and apparatus for determining codec mode set of service communication |
US10506237B1 (en) * | 2016-05-27 | 2019-12-10 | Google Llc | Methods and devices for dynamic adaptation of encoding bitrate for video streaming |
US20180007428A1 (en) * | 2016-06-29 | 2018-01-04 | Intel Corporation | Wireless display implementation of applications |
US10957171B2 (en) | 2016-07-11 | 2021-03-23 | Google Llc | Methods and systems for providing event alerts |
US10554711B2 (en) * | 2016-09-29 | 2020-02-04 | Cisco Technology, Inc. | Packet placement for scalable video coding schemes |
WO2018084646A1 (en) * | 2016-11-03 | 2018-05-11 | Lg Electronics Inc. | Method and apparatus for transmitting and receiving data in a wireless communication system |
US10834406B2 (en) * | 2016-12-12 | 2020-11-10 | Netflix, Inc. | Device-consistent techniques for predicting absolute perceptual video quality |
EP3379782B1 (en) * | 2017-03-24 | 2019-05-08 | Deutsche Telekom AG | Network entity with network application protocol interface (napi) |
US10390070B2 (en) | 2017-03-31 | 2019-08-20 | Intel Corporation | Methods and apparatus for adaptive video transmission based on channel capacity |
US10484308B2 (en) | 2017-03-31 | 2019-11-19 | At&T Intellectual Property I, L.P. | Apparatus and method of managing resources for video services |
US10819763B2 (en) | 2017-03-31 | 2020-10-27 | At&T Intellectual Property I, L.P. | Apparatus and method of video streaming |
CN107979496B (en) * | 2017-12-07 | 2021-01-05 | 锐捷网络股份有限公司 | Method and server for acquiring user experience quality |
EP3834585A1 (en) * | 2018-08-10 | 2021-06-16 | Telefonaktiebolaget Lm Ericsson (Publ) | User equipment discovery |
US11044185B2 (en) | 2018-12-14 | 2021-06-22 | At&T Intellectual Property I, L.P. | Latency prediction and guidance in wireless communication systems |
US10912105B2 (en) * | 2019-03-28 | 2021-02-02 | Intel Corporation | Apparatus, system and method of wireless video streaming |
US11109394B2 (en) * | 2019-07-30 | 2021-08-31 | Cypress Semiconductor Corporation | Methods, systems and devices for providing differentiated quality of service for wireless communication devices |
CN116261182A (en) * | 2019-09-30 | 2023-06-13 | 华为技术有限公司 | Method and device for negotiating video media |
JP7392374B2 (en) * | 2019-10-08 | 2023-12-06 | ヤマハ株式会社 | Wireless transmitting device, wireless receiving device, wireless system, and wireless transmitting method |
US11159965B2 (en) | 2019-11-08 | 2021-10-26 | Plume Design, Inc. | Quality of experience measurements for control of Wi-Fi networks |
US11233669B2 (en) | 2019-12-17 | 2022-01-25 | Google Llc | Context-dependent in-call video codec switching |
ES2972184T3 (en) * | 2020-06-08 | 2024-06-11 | Fund Privada I2Cat Internet I Innovacio Digital A Catalunya | Computer-implemented procedure for allocating wireless network resources and adaptive video transmission |
JP7552499B2 (en) | 2020-07-14 | 2024-09-18 | 日本電信電話株式会社 | Video quality selection device, video quality selection method and program |
CN113473502B (en) * | 2020-08-27 | 2024-02-02 | 几维通信技术(深圳)有限公司 | Terminal equipment, processing system and optimization method for automatic network optimization |
WO2023004580A1 (en) * | 2021-07-27 | 2023-02-02 | Zte Corporation | A method for quality of experience awareness transmission |
US20230053216A1 (en) * | 2021-08-13 | 2023-02-16 | Qualcomm Incorporated | Techniques for dynamic resolutions |
US11540092B1 (en) * | 2021-10-12 | 2022-12-27 | Verizon Patent And Licensing Inc. | Systems and methods for analyzing and optimizing conference experiences |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7599307B2 (en) * | 2003-08-21 | 2009-10-06 | Vidiator Enterprises Inc. | Quality of experience (QoE) method and apparatus for wireless communication networks |
US8238287B1 (en) * | 2004-10-06 | 2012-08-07 | Marvell International Ltd. | Method and apparatus for providing quality of service (QoS) in a wireless local area network |
CN102075769B (en) * | 2011-01-10 | 2012-11-07 | 苏州博联科技有限公司 | Method for optimizing video QoS of video wireless transmission monitoring system |
2011
- 2011-09-30 EP EP11873166.0A patent/EP2761881A4/en not_active Withdrawn
- 2011-09-30 WO PCT/US2011/054406 patent/WO2013048484A1/en active Application Filing
- 2011-09-30 US US13/993,417 patent/US20140219088A1/en not_active Abandoned
- 2011-09-30 CN CN201180075247.4A patent/CN103959798B/en not_active Expired - Fee Related
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20070195789A1 (en) * | 2006-02-23 | 2007-08-23 | Freescale Semiconductor Inc | Managing packets for transmission in a communication system |
US20090300687A1 (en) * | 2008-05-28 | 2009-12-03 | Broadcom Corporation | Edge device establishing and adjusting wireless link parameters in accordance with qos-desired video data rate |
US20100082834A1 (en) * | 2008-10-01 | 2010-04-01 | Samsung Electronics Co., Ltd. | Apparatus and method for transmitting and receiving data in a wireless communication network |
Non-Patent Citations (2)
Title |
---|
HUUSKO ET AL.: "Cross-layer architecture for scalable video transmission in wireless network", SIGNAL PROCESSING IMAGE COMMUNICATION, vol. 22, no. 3, 1 March 2007 (2007-03-01), pages 317 - 330, XP022015416, DOI: 10.1016/j.image.2006.12.011 |
See also references of EP2761881A4 |
Cited By (54)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2806633A1 (en) * | 2013-05-23 | 2014-11-26 | Alcatel Lucent | Method and apparatus for improved network optimization for providing video from a plurality of sources to a plurality of clients |
EP2806632A1 (en) * | 2013-05-23 | 2014-11-26 | Alcatel Lucent | Method and apparatus for optimizing video quality of experience in end-to-end video applications |
WO2014187491A1 (en) * | 2013-05-23 | 2014-11-27 | Nokia Solutions And Networks Oy | Methods and apparatus for adapting a data rate |
WO2014187789A1 (en) * | 2013-05-23 | 2014-11-27 | Alcatel Lucent | Method and apparatus for improved network optimization for providing video from a plurality of sources to a plurality of clients |
US10070197B2 (en) | 2013-05-23 | 2018-09-04 | Alcatel Lucent | Method and apparatus for improved network optimization for providing video from a plurality of sources to a plurality of clients |
CN104185285A (en) * | 2013-05-28 | 2014-12-03 | 华为技术有限公司 | Media data transmission method, device and system |
EP2945327A4 (en) * | 2013-05-28 | 2016-04-06 | Huawei Tech Co Ltd | Media data transmission method, apparatus and system |
EP3007458A4 (en) * | 2013-06-06 | 2016-12-07 | Sony Corp | Content supply device, content supply method, program, and content supply system |
CN105247882A (en) * | 2013-06-06 | 2016-01-13 | 索尼公司 | Content supply device, content supply method, program, and content supply system |
US10212208B2 (en) | 2013-06-06 | 2019-02-19 | Saturn Licensing Llc | Content supply device, content supply method, program, and content supply system |
KR20160030517A (en) * | 2013-06-17 | 2016-03-18 | 구글 인코포레이티드 | Method, apparatus and computer-readable medium for media content streaming device setup |
US10965483B2 (en) | 2013-06-17 | 2021-03-30 | Google Llc | Methods, systems, and media for media content streaming device setup |
KR102182041B1 (en) * | 2013-06-17 | 2020-11-23 | 구글 엘엘씨 | Method, apparatus and computer-readable medium for media content streaming device setup |
US11750413B2 (en) | 2013-06-17 | 2023-09-05 | Google Llc | Methods, systems, and media for media content streaming device setup |
US12119956B2 (en) | 2013-06-17 | 2024-10-15 | Google Llc | Methods, systems, and media for media content streaming device setup |
US10051027B2 (en) | 2013-07-22 | 2018-08-14 | Intel Corporation | Coordinated content distribution to multiple display receivers |
EP3025504A4 (en) * | 2013-07-22 | 2017-05-17 | Intel Corporation | Coordinated content distribution to multiple display receivers |
WO2015013811A1 (en) * | 2013-08-02 | 2015-02-05 | Blackberry Limited | Wireless transmission of real-time media |
US10368064B2 (en) | 2013-08-02 | 2019-07-30 | Blackberry Limited | Wireless transmission of real-time media |
US20170104988A1 (en) | 2013-08-02 | 2017-04-13 | Blackberry Limited | Wireless transmission of real-time media |
US9532043B2 (en) | 2013-08-02 | 2016-12-27 | Blackberry Limited | Wireless transmission of real-time media |
US10091267B2 (en) | 2013-08-02 | 2018-10-02 | Blackberry Limited | Wireless transmission of real-time media |
US9525714B2 (en) | 2013-08-02 | 2016-12-20 | Blackberry Limited | Wireless transmission of real-time media |
EP3051830A1 (en) * | 2013-09-27 | 2016-08-03 | Sony Corporation | Content supply device, content supply method, program, terminal device, and content supply system |
US10623463B2 (en) | 2013-09-27 | 2020-04-14 | Saturn Licensing Llc | Content supplying apparatus, content supplying method, program, terminal device, and content supplying system |
EP3051830A4 (en) * | 2013-09-27 | 2017-04-05 | Sony Corporation | Content supply device, content supply method, program, terminal device, and content supply system |
KR102123208B1 (en) | 2013-09-27 | 2020-06-15 | 소니 주식회사 | Content supply device, content supply method, program, terminal device, and content supply system |
US10305953B2 (en) | 2013-09-27 | 2019-05-28 | Saturn Licensing Llc | Content supplying apparatus, content supplying method, program, terminal device, and content supplying system |
KR20160060056A (en) * | 2013-09-27 | 2016-05-27 | 소니 주식회사 | Content supply device, content supply method, program, terminal device, and content supply system |
KR20200013112A (en) * | 2013-09-27 | 2020-02-05 | 소니 주식회사 | Content supply device, content supply method, program, terminal device, and content supply system |
KR102073871B1 (en) | 2013-09-27 | 2020-02-05 | 소니 주식회사 | Content supply device, content supply method, program, terminal device, and content supply system |
EP3065411A4 (en) * | 2013-10-28 | 2017-07-19 | Sony Corporation | Content supplying device, content supplying method, program, terminal device, and content supplying program |
US10637948B2 (en) | 2013-10-28 | 2020-04-28 | Saturn Licensing Llc | Content supply apparatus, content supply method, program, terminal apparatus, and content supply system |
US9363333B2 (en) | 2013-11-27 | 2016-06-07 | At&T Intellectual Property I, Lp | Server-side scheduling for media transmissions |
US10063656B2 (en) | 2013-11-27 | 2018-08-28 | At&T Intellectual Property I, L.P. | Server-side scheduling for media transmissions |
US10516757B2 (en) | 2013-11-27 | 2019-12-24 | At&T Intellectual Property I, L.P. | Server-side scheduling for media transmissions |
US10476930B2 (en) | 2014-01-06 | 2019-11-12 | Intel IP Corporation | Client/server signaling commands for dash |
WO2015103627A1 (en) * | 2014-01-06 | 2015-07-09 | Intel Corporation | Client/server signaling commands for dash |
CN107535002A (en) * | 2015-01-07 | 2018-01-02 | 奥兰治 | System for transmitting packet according to multiple access protocol |
EP3113441A1 (en) * | 2015-06-29 | 2017-01-04 | Alcatel-Lucent España | Dynamic quality of experience improvement in media content streaming |
CN105791408B (en) * | 2016-03-29 | 2019-04-02 | 中国科学院信息工程研究所 | A kind of construction method and system of P2P network |
CN105791408A (en) * | 2016-03-29 | 2016-07-20 | 中国科学院信息工程研究所 | P2P network construction method and system |
EP3497935A1 (en) * | 2016-08-09 | 2019-06-19 | V-Nova International Limited | Adaptive video consumption |
EP3641223A4 (en) * | 2017-07-25 | 2020-04-29 | Huawei Technologies Co., Ltd. | Method for acquiring qoe information, and terminal and network device |
EP3850860A4 (en) * | 2018-09-12 | 2022-05-04 | Roku, Inc. | Adaptive switching in a whole home entertainment system |
EP3850859A4 (en) * | 2018-09-12 | 2022-05-11 | Roku, Inc. | Dynamically adjusting video to improve synchronization with audio |
US11611788B2 (en) | 2018-09-12 | 2023-03-21 | Roku, Inc. | Adaptive switching in a whole home entertainment system |
CN112823527B (en) * | 2018-10-05 | 2023-06-16 | 交互数字Ce专利控股公司 | Method implemented at a device capable of running an adaptive streaming session and corresponding device |
CN112823527A (en) * | 2018-10-05 | 2021-05-18 | 交互数字Ce专利控股公司 | Method implemented at a device capable of running an adaptive streaming session and corresponding device |
US11936704B2 (en) | 2018-10-05 | 2024-03-19 | Interdigital Madison Patent Holdings, Sas | Method to be implemented at a device able to run one adaptive streaming session, and corresponding device |
EP3672154A1 (en) * | 2018-12-20 | 2020-06-24 | Fundacion Centro De Tecnologias De Interaccion Visual Y Comunicaciones Vicomtech | Optimising transmission of streaming video contents based on qoe metrics |
US11070445B2 (en) | 2019-01-25 | 2021-07-20 | Tambora Systems Singapore Pte. Ltd. | System and method for optimization of an over-the-top (OTT) platform |
CN114143300A (en) * | 2021-11-25 | 2022-03-04 | 中国银行股份有限公司 | Transaction request sending method and device |
CN114143300B (en) * | 2021-11-25 | 2024-04-19 | 中国银行股份有限公司 | Transaction request sending method and device |
Also Published As
Publication number | Publication date |
---|---|
US20140219088A1 (en) | 2014-08-07 |
CN103959798B (en) | 2018-06-08 |
CN103959798A (en) | 2014-07-30 |
EP2761881A1 (en) | 2014-08-06 |
EP2761881A4 (en) | 2015-06-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN103959798B (en) | 2018-06-08 | Quality of experience enhancing on wireless network |
US10511991B2 (en) | Adapting communication parameters to link conditions, traffic types, and/or priorities | |
US10623928B2 (en) | Terminal node, method, storage medium for video data transmission | |
US20210410168A1 (en) | Service data transmission method, network device, and terminal device | |
US10721715B2 (en) | Link-aware streaming adaptation | |
KR101489414B1 (en) | Systems and methods for detection for prioritizing and scheduling packets in a communication network | |
KR102097538B1 (en) | Congestion induced video scaling | |
CN109039495B (en) | QoE-aware radio access network architecture for HTTP-based video streaming | |
US10097946B2 (en) | Systems and methods for cooperative applications in communication systems | |
US20140155043A1 (en) | Application quality management in a communication system | |
US20140153392A1 (en) | Application quality management in a cooperative communication system | |
KR20140147871A (en) | Systems and methods for application-aware admission control in a communication network | |
EP3280208B1 (en) | Cooperative applications in communication systems | |
EP3179812B1 (en) | Cooperative applications in communication systems | |
Ma et al. | Access point centric scheduling for dash streaming in multirate 802.11 wireless network | |
Debnath | A Novel QoS-aware MPEG-4 video delivery algorithm over the lossy IEEE 802.11 WLANs to improve the video quality | |
WO2015085525A1 (en) | Method and device for realizing quality of experience (qoe) | |
Peart et al. | Improved quality of service utilising high priority traffic in HCF in a dynamically changing wireless networks | |
Longhao | Innovative content delivery solutions in the future network heterogeneous environment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 11873166; Country of ref document: EP; Kind code of ref document: A1 |
WWE | Wipo information: entry into national phase | Ref document number: 2011873166; Country of ref document: EP |
NENP | Non-entry into the national phase | Ref country code: DE |
WWE | Wipo information: entry into national phase | Ref document number: 13993417; Country of ref document: US |