WO2012142042A1 - Satellite telecommunications using a computer system adapted for harsh environments
- Publication number: WO2012142042A1 (PCT/US2012/032928)
- Authority: WIPO (PCT)
- Prior art keywords: audio, portable, streams, video, data streams
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04H—BROADCAST COMMUNICATION
- H04H60/00—Arrangements for broadcast applications with a direct linking to broadcast information or broadcast space-time; Broadcast-related systems
- H04H60/02—Arrangements for generating broadcast information; Arrangements for generating broadcast-related information with a direct linking to broadcast information or to broadcast space-time; Arrangements for simultaneous generation of broadcast information and broadcast-related information
- H04H60/04—Studio equipment; Interconnection of studios
- H04H60/05—Mobile studios
Abstract
A satellite communication system is disclosed that allows for increased bandwidth. In one embodiment, at least two satellite antennas are used. The satellite antennas are typically portable and can be carried by a user to remote areas. A portable computer system can be used and coupled to the satellite antennas via wired or wireless connection. The portable computer system can be adapted for news coverage and can, therefore, receive a live audio and video feed. A multiplexer can be used to divide the audio and video data streams into multiple data streams for transmission over the satellite antennas. The audio and video feeds are transmitted with increased bandwidth due to the use of multiple satellite antennas transmitting in parallel.
Description
SATELLITE TELECOMMUNICATIONS USING A COMPUTER SYSTEM ADAPTED FOR HARSH ENVIRONMENTS
Cross Reference to Related Application
This application claims priority to U.S. Provisional Patent Application No.
61/473,784, filed on April 10, 2011, which is incorporated by reference herein in its entirety.
Field

The present disclosure generally relates to telecommunications, and more particularly to a computer system adapted for satellite telecommunications.
Background
In terms of news media, more attention is being shifted to cable, broadcast, Internet, blogs, mobile devices and other vehicles for delivery of rich video content.
Irrespective of medium, the distribution of content related to news and current events is a multi-billion dollar industry. By all measures, the news media is becoming considerably more robust as consumers grow accustomed to, and expect more from, broadband connections, mobile device access, high-resolution images, faster connections, richer data, greater searchability, and other factors. The news media is one of the most rapidly evolving e-commerce industries as a result of advancing mobile technologies that support broadcast-quality image capture over high-speed wireless. Today, media organizations are required to dispatch expensive electronic news gathering crews to breaking news events using expensive camera decks, satellite trucks, and costly labor in order to capture time-critical content.
Such crews are well suited to city streets in U.S. cities. However, reporters often travel to other countries or to remote areas where cellular communication is on a different network or is not available at all. In such cases, reporters can communicate via satellite, but the bandwidth is limited. Additionally, current satellite communication devices are not well adapted to rugged locations.

Summary
The present application describes a satellite communication system that allows for increased bandwidth.
In one embodiment, at least two portable satellite antennas are used. The satellite antennas are typically portable and can be carried by a user to remote areas. A portable computer system can be used and coupled to the satellite antennas via wired or wireless connection. The portable computer system can be adapted for news coverage and can, therefore, receive live audio and video feeds. A multiplexer can be used to divide the audio and video data streams into multiple data streams for transmission over the satellite antennas. In another embodiment, the portable computer system can include a demultiplexer for receiving an audio signal with or without an associated video signal from the two or more satellite antennas. Thus, two-way communication can be implemented to allow news reporters to broadcast and receive questions from remote areas.
In another embodiment, the portable computer system can be designed for rugged conditions. For example, the portable computer system can be housed in a waterproof case. Additionally, a flash drive can be used instead of a hard drive to reduce or eliminate mechanical motion in the computer system.
In another embodiment, duplicate audio packets can be transmitted if it is determined that there is adequate bandwidth. In news reporting, the audio feed is more important than the video feed. With redundant packets sent, if one of the packets is corrupted, the other packet can be used.
The foregoing and other objects, features, and advantages of the invention will become more apparent from the following detailed description, which proceeds with reference to the accompanying figures.
Brief Description of the Drawings

FIG. 1 is a system diagram showing multiple satellite antennas used to transmit an audio/video signal from a client to a server computer.
FIG. 2 is an example client computer of FIG. 1.
FIG. 3 is an example server computer of FIG. 1.
FIG. 4 is a flowchart of an embodiment for transmitting data streams over multiple satellite antennas in parallel.
FIG. 5 is a flowchart of a method for configuring antennas and the server computer from the client computer.
FIG. 6 is a flowchart of a method for transmitting and receiving dual data streams over multiple satellite antennas.
FIG. 7 is a flowchart of a method for reconstructing the audio and video streams.
FIG. 8 is a flowchart of a method for transmitting redundant audio streams.
FIG. 9 shows an exemplary IP header.
FIG. 10 shows an exemplary RTP header.
FIG. 11 shows an exemplary client-side software model.
FIG. 12 shows an exemplary server-side software model.
Detailed Description of Exemplary Embodiments
FIG. 1 is a system diagram of a client computer 110 communicating with a server computer 112 via a satellite 114. Two satellite antennas 116, 118 can be used for parallel communication to the satellite 114. Although two satellite antennas are shown, additional antennas can be used. Parallel communication from the client computer over the antennas 116, 118 increases bandwidth and provides a more stable and reliable communication path, as further described below. The client computer can be housed in a water-proof case to allow transportation to remote locations where harsh environmental conditions can exist. The client computer 110 can be adapted for receiving audio/video signals from a portable news camera 120 that can capture a live news broadcast from a news person 122. The news person 122 can also receive audio 126 in return so as to establish two-way communication via satellite with another news person 130 in a newsroom. As further described below, the audio/video signals can be split into two data streams for parallel communications over the antennas 116, 118.

The antennas 116, 118 can be any desired antennas for communicating with a satellite. Example antennas are broadband global area network (BGAN) antennas. Such antennas are normally used to connect a portable computer (e.g., a laptop) to broadband Internet in remote locations; as long as line of sight to the satellite exists, the terminal can be used anywhere. The BGAN terminal is about the size of a laptop (i.e., it is sized to be handheld) and can be easily carried into remote areas, unlike other satellite Internet services, which require bulky and heavy satellite dishes. The satellite 114 can receive the parallel communications from the antennas 116, 118 and transmit the same to an Internet server 140, which then transmits the parallel communications over the Internet 142 to the destination server 112. The Internet server 140 is typically a server controlled by a company that owns the satellite 114. It is understood that the Internet server 140 can establish two-way communication with the satellite 114 via a fixed antenna system (not shown). As further described below, the server 112 receives the parallel signals, reconstructs the transmitted audio and video streams, and synchronizes them for display on a television monitor 150. A microphone 152 can receive audio signals from the news person 130 and transmit the audio data stream to the server 112. The server can split the data stream into parallel streams for transmission over the Internet 142, through the satellite 114, for receipt by the portable antennas 116, 118. The parallel audio signals can then be reconstructed in the client computer 110 and transmitted 126 to the news person 122.

The system provides independent multi-path, multi-medium data flows by multiplexing and de-multiplexing streams, implementing a virtual bonded data path. Back-channel support is provided to allow for feedback loops such as those utilized by live news feeds. The field apparatus is packaged in a small (e.g., 12.5" x 10.1" x 6") and lightweight (e.g., < 10 lbs.) form factor for ease of portability and remote use. The field apparatus is ruggedized to provide IP67 ingress protection, as well as high shock and vibration protection, to allow operation in the most extreme weather and environmental conditions. The field apparatus is designed to operate at high temperatures (e.g., 158 degrees Fahrenheit) and runs for longer than 6.5 hours on a single battery charge. The field apparatus uses global satellite communications as the primary transmission medium, thereby generally providing broader coverage than cellular, Wi-Fi, and other local-area or land-based technologies. Alternatively, the field apparatus may employ other land-based or wireless communication known in the art, including cellular and Wi-Fi.
FIG. 2 is an exemplary hardware diagram of the client computer 110. The water-tight case 202 is shown generically and is generally constructed to withstand an impact without damage. The client computer 110 can include a battery 206, a processor 208, a flash memory 210, a touch screen 212, a touch screen controller 214, a multiplexer 216, a demultiplexer 218, and I/O ports 220, 222. Other components can be used. The components are generally coupled together, although not all connections are shown for purposes of clarity. The battery 206 allows the computer 110 to operate autonomously in remote locations and is used to power the other components. An example battery 206 can include two Energizer Energi To Go XP 18000 (18000 mAh @ 5 V power capacity) batteries connected in series. When fully charged, the battery pack can provide over 6.5 hours of operation time. Other batteries can be used. The processor 208 can be any type of desired controller, as is well understood in the art. Generally, the processor 208 receives instructions from the flash memory 210 and executes the instructions to perform the transmission and reception of data streams. The flash memory 210 can be used instead of a hard drive to increase durability. Processing of data streams can be performed by the processor to decide how to split the data streams for transmission over the antennas 116, 118 through control of the multiplexer 216. Furthermore, processing of parallel received data streams can be used to reconstruct the data streams for transmission 126 through control of the demultiplexer 218. The touch screen 212 allows the user to input commands to configure the antennas 116, 118 and to configure the server 112 remotely. When configuring the server 112, configuration data can be sent as a parallel transmission over the antennas 116, 118 and can be used to configure one or more of the following transmission parameters: bandwidth, frame size, and/or frame rate. The touch screen controller 214 interprets user input to the touch screen 212 and sends the input signals to the processor 208 for further action or to the USB interface. The multiplexer 216 is used to combine the audio/video data streams and create two data streams A/V1 and A/V2 for transmission over I/O port 222. The client computer 110 can be directly coupled to the antennas 116, 118 or wirelessly connected, as is well understood in the art. The demultiplexer 218 can be used to reconstruct the audio and/or video signals received from the antennas 116, 118. The reconstructed data streams can be transmitted through I/O port 220 for receipt by the user 122.
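As a rough check (an estimate offered by way of illustration, not a figure stated in the application), the example battery pack is consistent with the stated runtime:

    2 batteries x 18,000 mAh x 5 V = 2 x 90 Wh = 180 Wh of stored energy
    180 Wh / 6.5 h ≈ 28 W average system draw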
FIG. 3 shows further details of an exemplary server 112. The server 112 can include an I/O port 302, a processor 304, a multiplexer 306, a demultiplexer 308, an I/O port 310, and memory 312 for storing configuration data. The server 112 can include other components, as is well understood in the art. The processor 304 can control the demultiplexer 308 and receive the parallel data streams from the I/O port 302 for further processing. The server-side processing can use configuration data previously received from the client computer 110 and stored in memory 312. Configuration data can include various transmission parameters, such as bandwidth, frame size, frame rate, etc. Audio and/or video signals can be received through I/O port 310, processed, and transmitted via the multiplexer 306 as parallel data streams to the Internet for subsequent satellite transmission to the client computer.

FIG. 4 is a flowchart of a method for communicating via satellite. In process block 410, audio and video data streams are received. Typically, such data streams are received from a live user transmission, such as a news broadcaster performing a live story. In process block 420, the data streams are split into multiplexed data streams. Splitting the data streams can take a variety of forms and can change based on the desired implementation. In any event, the multiplexer 216 accepts the audio and video data streams and splits the streams into independent streams to be sent over the multiple antennas. The multiplexer can accept configuration requests from the user interface, such as setting the burst rate parameters for transmitting stream parts. The streams can be split into an audio-only stream and a video-only stream. Alternatively, the audio and video can be combined into each stream to better even out the packet size transmitted over each antenna in parallel. In process block 430, the multiplexed data streams are transmitted to the portable satellite antennas for parallel transmission. In the event that there are more than two antennas, the multiplexer can be designed to generate additional data streams to match the number of antennas.

FIG. 5 is a flowchart of a method for configuring the portable antennas and the server computer. In process block 510, user input data is received from the user interface. The user interface has touch screen commands that guide a user through commands for configuration. In process block 520, the portable satellite antennas are configured based on the received input. The configuration depends on the antennas used. In process block 530, the server is configured remotely by sending the user interface commands in parallel over the antennas 116, 118 to the server via satellite. The server can reconstruct the commands by receiving the parallel data streams, demultiplexing the streams, and placing the packets in proper order.
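To make the configuration path concrete, the following is a minimal sketch of how a client-to-server configuration message carrying the parameters named above (bandwidth, frame size, frame rate, and burst rate) might be laid out. The structure, field names, and serialization helper are assumptions; the application does not specify a wire format.

    /* Hypothetical client-to-server configuration message. Field names and
     * sizes are illustrative only. */
    #include <stdint.h>
    #include <arpa/inet.h>   /* htonl / htons for network byte order */

    struct av_config_msg {
        uint32_t bandwidth_kbps;   /* target transmit bandwidth */
        uint16_t frame_width;      /* video frame size */
        uint16_t frame_height;
        uint16_t frame_rate_fps;   /* video frame rate */
        uint16_t burst_packets;    /* packets per burst in burst-rate mode (see FIG. 4) */
    };

    /* Convert the message to network byte order before it is split across the
     * two satellite links like any other data. */
    static void config_to_wire(const struct av_config_msg *in, struct av_config_msg *out)
    {
        out->bandwidth_kbps = htonl(in->bandwidth_kbps);
        out->frame_width    = htons(in->frame_width);
        out->frame_height   = htons(in->frame_height);
        out->frame_rate_fps = htons(in->frame_rate_fps);
        out->burst_packets  = htons(in->burst_packets);
    }

Such a message would be transmitted in parallel over the antennas and reassembled at the server before the parameters are applied.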
FIG. 6 is a flowchart of a method for transmitting and receiving data streams via satellite over the multiple antennas. In process block 610, an audio/video data stream is received from a camera and microphone. In process block 620, the data streams are split and transmitted in parallel. Each data stream can include packets of audio and video data. In process block 630, while transmitting the audio/video data streams, multiple data streams are received in parallel. The multiple streams can include just audio or audio and video. In process block 640, the audio/video streams are re-joined to construct individual audio and/or video streams so that the user 122 can listen and/or watch the news person 130 positioned at the server location. Thus, the system allows for two-way communication over the portable satellite antennas.
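A minimal sketch of the simultaneous transmit/receive flow of FIG. 6 follows, assuming the two satellite terminals are reachable as ordinary sockets; the loop structure and the helpers send_next_av_packet() and handle_ifb_audio_packet() are hypothetical.

    /* Two-way flow sketch (assumed): outbound A/V packets are written to the
     * two satellite terminals while inbound return-audio packets are read
     * from the same sockets. */
    #include <stddef.h>
    #include <stdint.h>
    #include <sys/types.h>
    #include <sys/select.h>
    #include <sys/socket.h>

    extern void handle_ifb_audio_packet(const uint8_t *p, size_t n);  /* hypothetical */
    extern void send_next_av_packet(int term_fd[2]);                  /* hypothetical: pops from transmit queues */

    void duplex_loop(int term_fd[2])
    {
        uint8_t buf[2048];
        for (;;) {
            fd_set rset;
            FD_ZERO(&rset);
            FD_SET(term_fd[0], &rset);
            FD_SET(term_fd[1], &rset);
            int maxfd = term_fd[0] > term_fd[1] ? term_fd[0] : term_fd[1];
            struct timeval tv = {0, 10000};          /* poll every 10 ms */

            if (select(maxfd + 1, &rset, NULL, NULL, &tv) > 0) {
                for (int i = 0; i < 2; i++) {
                    if (FD_ISSET(term_fd[i], &rset)) {
                        ssize_t n = recv(term_fd[i], buf, sizeof buf, 0);
                        if (n > 0)
                            handle_ifb_audio_packet(buf, (size_t)n);
                    }
                }
            }
            send_next_av_packet(term_fd);            /* outbound A/V continues in parallel */
        }
    }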
FIG. 7 is a flowchart of a method for reordering and synchronizing packet data. In process block 710, on the server side, the multiplexed signals are received from the Internet. In process block 720, the signals are demultiplexed into the audio and video streams. Once the signals are demultiplexed, sequence numbers can be obtained from the packet headers. The sequence numbers can be used to place the packets in the correct order, which corresponds to the originating audio/video signals received from the client computer. In process block 740, synchronization information is obtained from the IP header in order to synchronize the audio and video.

FIG. 8 is a flowchart of a method for generating duplicate audio streams. In process block 810, the bandwidth is monitored. Typically, such monitoring is done on the server side based on the packet receipt rate. Alternatively, monitoring can be done on the client side based on received packets. The bandwidth information can then be passed back to the client 110. In process block 820, if the available bandwidth exceeds a predetermined threshold, the client can submit duplicate audio streams to the multiplexer 216. Redundant audio streams provide a higher probability that one of the audio data streams is received at the server without corruption. In process block 830, the video and duplicate audio streams are transmitted in parallel.
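The duplicate-audio decision of FIG. 8 can be reduced to a simple threshold test. The sketch below is an assumed implementation; the structure names and the threshold value are illustrative only.

    /* Decision sketch: if measured spare bandwidth exceeds a threshold, the
     * audio stream is queued to both transmit paths instead of one. */
    #include <stdbool.h>
    #include <stdint.h>

    #define AUDIO_DUP_THRESHOLD_KBPS 128   /* illustrative threshold, not from the application */

    struct link_stats {
        uint32_t measured_kbps;    /* e.g., fed back from the server's packet receipt rate */
        uint32_t committed_kbps;   /* bandwidth already used by the A/V streams */
    };

    static bool should_duplicate_audio(const struct link_stats *s)
    {
        uint32_t spare = (s->measured_kbps > s->committed_kbps)
                       ? s->measured_kbps - s->committed_kbps : 0;
        return spare >= AUDIO_DUP_THRESHOLD_KBPS;
    }

    /* Usage sketch: when enabled, each audio packet is handed to both transmit
     * queues, while video continues to alternate between queues as before. */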
FIG. 9 shows an example IP header that can be used for sequencing the audio and video streams and for synchronizing the streams. The IP header includes a 32-bit sequence number. The sequence number can be used for ordering the audio stream and the video stream at the server. However, each stream can have its own independent sequence, so to synchronize the two, the options field shown at 910 can be used. The synchronization information stored can include a packet number of the audio stream and a packet number of the video stream so that the two corresponding packets can be aligned. Once synchronized and ordered, the audio and video streams are sent over I/O port 310 to be broadcast on the monitor 150. FIG. 10 shows an example RTP header. RTP provides end-to-end network transport functions suitable for applications transmitting real-time data, such as audio or video data.
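For reference, the RTP fixed header layout (per RFC 3550) and one possible layout for the synchronization keys carried in the IP options field are sketched below. The RTP structure is standard; the option structure is an assumption, since the application does not define the exact encoding of the audio and video packet numbers.

    /* RTP fixed header (RFC 3550); the sequence number field is what the
     * de-multiplexer uses to restore packet order. */
    #include <stdint.h>

    struct rtp_header {
        uint8_t  vpxcc;        /* version (2 bits), padding, extension, CSRC count */
        uint8_t  m_pt;         /* marker bit and payload type */
        uint16_t sequence;     /* per-stream sequence number (wraps at 0xffff) */
        uint32_t timestamp;    /* media timestamp */
        uint32_t ssrc;         /* synchronization source identifier */
    };

    /* Hypothetical layout for the audio/video synchronization key stored in
     * the IP options field: a packet number for the audio stream and for the
     * video stream so corresponding packets can be aligned. The option kind
     * value and exact packing are assumptions. */
    struct av_sync_option {
        uint8_t  kind;         /* an unused/experimental IP option number */
        uint8_t  length;       /* total option length in bytes */
        uint16_t audio_seq;    /* packet number in the audio stream */
        uint16_t video_seq;    /* packet number in the video stream */
    };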
FIGs. 11 and 12 show exemplary top-level software and data flow diagrams for the client and server. First, for FIG. 11, the shared data region provides a common system resource that is shared between all tasks. This data region is used to store global system operating parameters, system status information, system run-time data and statistics, and system configuration parameters. The shared data region is also used to provide a mechanism for inter-task communication and signaling. Access to the shared data region is managed through an Application Programming Interface (API) library that provides controlled access to the region's functions and data.

The A/V encoder encodes data in a well-known manner. The encoded audio and video streams are transmitted to the multiplexer task via the local loopback address at predetermined ports for the audio stream and for the video stream. The multiplexer process accepts the audio and video streams from the A/V encoder, splits the audio and video streams, and re-transmits the stream parts to the server de-multiplexer via the satellite terminals. In single terminal mode, streams are transmitted to a single I/O interface. In dual terminal mode, stream packets are tagged and multiplexed between the two I/O interfaces to take advantage of non-coupled available bandwidth. The multiplexer process accepts configuration requests from the user interface manager process. Configuration requests can be used to set transmission and stream processing parameters. Multiplexer status is maintained in the shared data segment.

The system monitor and control process is used to monitor system temperature and to control the A/V encoder. System temperature is accessed via the bus of the computer system. A/V encoder control is accomplished via an interface provided by the A/V encoder application. The system monitor and control process accepts control requests from the user interface manager process for controlling the A/V encoder process. System monitor and control status is maintained in the shared data segment.

The configuration manager process is used to manage system configuration settings. System configuration data includes IP address-to-receiver mappings and transmit operation modes. System configuration data is stored to, and retrieved from, the solid-state compact flash drive of the computer system. The configuration manager process accepts configuration requests from the user interface manager process. Requests include editing of configuration parameters and storage/retrieval of configuration data. Operational configuration parameters are maintained in the shared data segment.

The terminal interface manager process is used to manage the configuration and operation of the satellite terminals. The terminal interface manager uses the AT command specification for the satellite terminals to perform initial configuration and setup of the terminals, perform control and monitoring of terminal initialization, perform control and monitoring of terminal broadcast operation, perform monitoring of terminal status, and restore terminal configuration. Initial terminal configuration includes gathering a restore point of the terminal configuration, enabling the wireless option of the terminal, and modifying the IP access parameters of the terminal. Control and monitoring of the terminal includes monitoring terminal pointing status, terminal ready status, terminal GPS status, terminal battery status, terminal temperature status, terminal connection status, and terminal transmit status. The terminal interface manager accepts terminal requests from the user interface manager process. Requests include connection commands and setup/restore commands. Terminal operation and status data is maintained in the shared data segment.

The user interface manager process provides operational screens and menus for display on the touch screen, manages user input requests, and displays system operational parameters and status on the touch screen. The user interface manager process issues requests to the multiplexer process, the system monitor and control process, the configuration manager process, and the terminal interface manager process. The user interface manager accesses system operational parameters and status in the shared data segment. A variety of satellite antennas can be used, such as the Hughes 9201 BGAN terminal.

The de-multiplexer process accepts the single or dual IFB audio stream(s) from the server. The de-multiplexer provides a single IFB audio stream to the audio decoder process. The audio decoder accepts a single audio stream from the de-multiplexer task and decodes the stream for analog output. The decoded audio stream is provided to the audio line-out of the computer system. This is provided for IFB audio support.

The server interface manager provides a control and status interface between the client and server systems. This interface is used to generate and process server interface commands (such as bandwidth test requests/results, broadcast start/stop, progress status reports, etc.), and to maintain status and progress of the server-side system processes and functions. The server interface manager, along with the caster interface manager on the server-side system, provides a tightly-coupled, closed-loop control and monitoring capability for the client and server systems. The bandwidth test function executes the bandwidth test with the corresponding function on the server-side system. The control and status of the test are managed via the server interface manager.

The watchdog task is used to monitor the health and execution of the client software system. A watchdog timer can be maintained for each system task, and health/status information is monitored via the shared data region. Expired watchdog timers indicate task-level failures. The watchdog task provides a system level of fail-safe and error recovery capabilities required to support reliable operation.

The updater provides the capability for field updates of the client system via the external USB interface. The updater task executes prior to any other task in the system and scans for software system updates on removable media. Updates for the updater task are processed first, followed by any other software system updates. The updater task also provides a controlled startup environment for the client system. Software system tasks are prioritized and sequenced for startup and execution.
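A minimal sketch of the per-task watchdog described above follows, assuming each task periodically writes a heartbeat into the shared data region; the structure, task count, and timeout are illustrative assumptions.

    /* Per-task watchdog sketch. Each task calls watchdog_kick() periodically;
     * the watchdog task calls watchdog_scan() and flags any task whose timer
     * has expired. */
    #include <stdbool.h>
    #include <time.h>

    #define NUM_TASKS          8
    #define WATCHDOG_TIMEOUT_S 5

    struct task_health {
        time_t last_heartbeat;   /* stored in the shared data region */
        bool   failed;
    };

    static struct task_health g_tasks[NUM_TASKS];   /* stands in for the shared data segment */

    void watchdog_kick(int task_id)
    {
        g_tasks[task_id].last_heartbeat = time(NULL);
    }

    void watchdog_scan(void)
    {
        time_t now = time(NULL);
        for (int i = 0; i < NUM_TASKS; i++) {
            if (now - g_tasks[i].last_heartbeat > WATCHDOG_TIMEOUT_S)
                g_tasks[i].failed = true;   /* task-level failure; trigger recovery */
        }
    }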
FIG. 12 shows the server-side top-level software and data flow. The shared data region provides a common system resource that is shared between all tasks. This data region is used to store global system operating parameters, system status information, system run-time data and statistics, and system configuration parameters. The shared data region is also used to provide a mechanism for inter-task communication and signaling. Access to the shared data region is managed through an Application Programming Interface (API) library that provides controlled access to the region's functions and data.

The audio encoder is similar to the client-side encoder already described. The encoded IFB audio stream is transmitted to the multiplexer task via the local loopback address at a predetermined port. The multiplexer process accepts the audio stream from the audio encoder, duplicates the stream if so configured, and re-transmits the stream parts to the client-side de-multiplexer task. The multiplexer process accepts configuration requests from the user interface manager process. Configuration requests can be used to set redundancy parameters used by the multiplexer process for transmitting the audio stream. Multiplexer status is maintained in the shared data segment.

The system monitor and control process is used to control the A/V encoder. A/V encoder control is accomplished via the COM/OLE interface provided by the encoder application. The system monitor and control process accepts control requests from the client interface manager process for controlling the encoder process. System monitor and control status is maintained in the shared data segment.

The configuration manager process is used to manage system configuration settings. System configuration data includes the system IP address setting. System configuration data is stored to, and retrieved from, the hard drive of the computer system. The configuration manager process accepts configuration requests from the user interface manager process. Requests include editing of configuration parameters and storage/retrieval of configuration data. Operational configuration parameters are maintained in the shared data segment.

The de-multiplexer process accepts the A/V streams from the client-side multiplexer. The streams are "re-bonded" to form complete single streams. Error correction is applied if necessary. The de-multiplexer provides complete A/V streams to the A/V decoder process. The A/V decoder process accepts the audio and video streams from the de-multiplexer process and generates a composite audio/video signal for output.

The interface manager provides a control and status interface between the server and client systems. This interface is used to process and respond to interface commands (such as bandwidth test requests/results, broadcast start/stop, progress status reports, etc.), and to maintain status and progress of the server-side system processes and functions. The interface manager, together with the server interface manager on the client-side system, provides a tightly-coupled, closed-loop control and monitoring capability for the server and client systems. The bandwidth test function executes the bandwidth test with the corresponding function on the client-side system. The control and status of the test are managed via the client-side interface manager.

The updater provides the capability for field updates of the server system via the external USB interface. The updater task executes prior to any other task in the system and scans for software system updates on removable media. Updates for the updater task are processed first, followed by any other software system updates. The updater task also provides a controlled startup environment for the server system. Software system tasks are prioritized and sequenced for startup and execution.
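Both the client-side and server-side models rely on a shared data region accessed through an API library for inter-task communication. The sketch below assumes POSIX shared memory guarded by a process-shared mutex; the region name, fields, and functions are hypothetical.

    /* Shared-data-region API sketch (assumed design; the application does not
     * give an implementation). */
    #include <fcntl.h>
    #include <pthread.h>
    #include <sys/mman.h>
    #include <unistd.h>

    struct shared_data {
        pthread_mutex_t lock;         /* must be initialized once elsewhere with the
                                       * PTHREAD_PROCESS_SHARED attribute (omitted here) */
        unsigned        link_kbps[2]; /* example run-time statistic per terminal */
        int             mux_status;   /* example status word maintained by the multiplexer */
    };

    /* Map (or create) the shared region; each task calls this once at startup. */
    struct shared_data *shared_data_open(void)
    {
        int fd = shm_open("/av_shared_data", O_CREAT | O_RDWR, 0600);  /* name is hypothetical */
        if (fd < 0)
            return NULL;
        (void)ftruncate(fd, sizeof(struct shared_data));
        void *p = mmap(NULL, sizeof(struct shared_data),
                       PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
        close(fd);
        return (p == MAP_FAILED) ? NULL : (struct shared_data *)p;
    }

    /* Controlled update of a status field on behalf of a task. */
    void shared_data_set_mux_status(struct shared_data *sd, int status)
    {
        pthread_mutex_lock(&sd->lock);
        sd->mux_status = status;
        pthread_mutex_unlock(&sd->lock);
    }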
In a particular implementation, the multiplexer can split the audio and video streams received from the A/V encoder process. Splitting is accomplished by alternating one or more TCP/IP data packets between transmit queues connected to each of the BGAN terminals. In one embodiment, the multiplexer alternates each packet as it is received. The user interface manager process can issue requests to alter this scheme by selecting various burst rates. In burst-rate mode, the multiplexer queues the specified number of packets in sequence to one transmit queue before alternating to the opposite transmit queue, thus creating a "burst" of packets through each queue. In either mode, the audio stream and video stream are initialized to begin transmission on opposite transmit queues. Audio/video synchronization keys are stored in the options field of the IP header, utilizing unused and experimental option values. The split streams are transmitted using the Real-Time Transport Protocol (RTP).
Table 1.1
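The queue-selection behavior just described can be sketched as follows. The queue structure, packet representation, and console output are placeholders for illustration and are not the actual implementation.

/* Sketch of alternating/burst packet scheduling across two transmit queues
 * (one per BGAN terminal).  The queue type and enqueue step are placeholders. */
#include <stdio.h>

typedef struct { int terminal_id; int queued; } tx_queue_t;   /* placeholder queue */

static void enqueue_packet(tx_queue_t *q, int packet_index)
{
    q->queued++;
    printf("packet %d -> terminal %d\n", packet_index, q->terminal_id);
}

/* burst_rate = 1 alternates on every packet; larger values queue a "burst"
 * of packets before switching to the opposite transmit queue. */
static void schedule_packets(tx_queue_t queues[2], int num_packets, int burst_rate)
{
    int active = 0;     /* audio and video streams would start on opposite queues */
    int in_burst = 0;

    for (int i = 0; i < num_packets; i++) {
        enqueue_packet(&queues[active], i);
        if (++in_burst >= burst_rate) {
            active = 1 - active;
            in_burst = 0;
        }
    }
}

int main(void)
{
    tx_queue_t queues[2] = { { 0, 0 }, { 1, 0 } };
    schedule_packets(queues, 8, 3);   /* burst of 3 packets per queue */
    return 0;
}

With burst_rate set to 1 the scheduler reduces to simple per-packet alternation, corresponding to the default behavior described above.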
The de-multiplexer reconstructs the split audio and video streams transmitted by the multiplexer. Reconstruction is accomplished by using the sequence number field in the RTP header to ensure that the order of the packets is restored for each data stream. Additionally, synchronization between the audio and video streams is achieved by using keying information stored by the multiplexer in the IP header options field. The reconstruction process uses a ring buffer and a user-specified buffer delay time to store and order the incoming stream data packets. Packets for a reconstruction sequence not received within the buffer delay window are considered lost. A polarity algorithm is used to manage sequences that incur rollover of the RTP header sequence number. When rollover occurs, packets are assigned polarity by comparing the sequence number of the packet to the largest allowed sequence number (0xffff). The following code demonstrates the calculation of polarity:
IPid_WindowStartElement =
    ntohs(p_WindowStartElement->ipPacket.rtpHeader.PayloadCounter);
dif_1 = 0xffff - IPid_WindowStartElement;
dif_2 = IPid_WindowStartElement;
y = (dif_1 < dif_2 ? dif_1 : dif_2);          // y = distance from IP header id wrap
polarity_2 = (dif_1 < dif_2 ? LEFT : RIGHT);  // polarity indicates which side of the wrap this element is on
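As an illustration of how the computed polarity might then be used, the following sketch orders two packets whose sequence numbers straddle the rollover. The comparison rule and helper names are assumptions consistent with the snippet above, not the actual reconstruction code.

/* Illustrative use of polarity to order packets across the 0xffff rollover.
 * LEFT/RIGHT labels follow the snippet above; everything else is assumed. */
#include <stdio.h>

enum { LEFT = 0, RIGHT = 1 };

static int polarity_of(unsigned short seq)
{
    unsigned short dif_1 = 0xffff - seq;   /* distance up to the wrap point */
    unsigned short dif_2 = seq;            /* distance past the wrap point  */
    return (dif_1 < dif_2) ? LEFT : RIGHT;
}

/* Returns <0 if a should be ordered before b.  Within the reordering window,
 * LEFT (pre-wrap, near 0xffff) packets precede RIGHT (post-wrap, near 0) packets. */
static int compare_seq_with_rollover(unsigned short seq_a, unsigned short seq_b)
{
    int pol_a = polarity_of(seq_a);
    int pol_b = polarity_of(seq_b);

    if (pol_a == pol_b)
        return (int)seq_a - (int)seq_b;    /* same side of the wrap: numeric order */
    return (pol_a == LEFT) ? -1 : 1;       /* pre-wrap packet comes first          */
}

int main(void)
{
    printf("%d\n", compare_seq_with_rollover(0xfffe, 0x0001));  /* negative: 0xfffe first */
    return 0;
}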
Reconstructed streams are transmitted to the A/V decoder process.
In view of the many possible embodiments to which the principles of the disclosed invention may be applied, it should be recognized that the illustrated embodiments are only preferred examples of the invention and should not be taken as limiting the scope of the invention. Rather, the scope of the invention is defined by the following claims. We therefore claim as our invention all that comes within the scope of these claims.
Claims
1. A system for communicating via satellite, comprising:
at least two portable satellite antennas; and
a portable computer system coupled to the antennas;
the portable computer system including at least the following:
an input port for receiving an audio data stream and a video data stream;
a multiplexer for dividing the audio data stream and video data stream into multiple data streams for transmission over the portable satellite antennas;
an output port for transmitting the multiple data streams over the portable satellite antennas.
2. The system of claim 1, wherein the portable computer system further includes a demultiplexer for receiving at least an audio signal from the portable satellite antennas to allow for two-way communication over the portable satellite antennas.
3. The system of claim 1, wherein the portable computer system is housed in a waterproof case.
4. The system of claim 1, wherein the portable computer system monitors bandwidth of transmission over the portable satellite antennas, and, if bandwidth allows, transmits duplicate packets associated with at least the audio data stream.
5. The system of claim 1, wherein the portable satellite antennas and portable computer system are sized for carrying by a single user.
6. The system of claim 1, further including a server computer for receiving the multiple data streams from the Internet and demultiplexing the data streams into the audio data stream and the video data stream.
7. The system of claim 1, further including a touch screen display associated with the portable computer system for allowing configuration of the portable satellite antennas.
8. The system of claim 1, further including storing synchronization keys in a packet header for synchronizing the audio and video data streams.
9. A method of transmission via satellite, comprising:
receiving an audio data stream and a video data stream from a live user transmission;
splitting the data streams into at least first and second multiplexed data streams; and
transmitting the multiplexed data streams to at least two portable satellite antennas for parallel transmission of the at least first and second multiplexed data streams to a satellite.
10. The method of claim 9, further including monitoring bandwidth of the transmission to the satellite and, if the bandwidth is adequate, duplicating the audio data stream so that the first and second multiplexed data streams include duplicated audio packets.
11. The method of claim 9, further including storing audio/video synchronization data in an Internet Protocol header for synchronizing the audio and video.
12. The method of claim 9, wherein the splitting of the data streams is performed by a computer in a water-tight box.
13. The method of claim 9, further including receiving at least an audio stream from the portable satellite antennas while transmitting the multiplexed data streams.
14. The method of claim 9, further including configuring the portable satellite antennas from a user interface.
15. The method of claim 9, further including sending server-side configuration data over the parallel transmission and configuring one or more of the following transmission parameters: bandwidth, frame size, and/or frame rate.
16. The method of claim 9, further including receiving the multiplexed data streams over the Internet and demultiplexing the data streams.
17. The method of claim 9, further including using synchronization information to reconstruct the audio and video data streams.
18. A system for communicating via satellite, comprising:
at least two portable satellite antennas;
a portable, waterproof computer system coupled to the antennas;
a video camera and a microphone for capturing video and audio streams and for sending the video and audio streams to the waterproof computer system;
the computer system for generating at least two streams of combined video and audio and transmitting the streams in parallel over the portable satellite antennas;
a server computer for receiving the streams from an Internet connection and reconstructing the video and audio streams.
19. The system of claim 18, wherein the server computer is configurable from the portable, waterproof computer system or via a local HTTP interface.
20. The system of claim 18, wherein the server computer can communicate bandwidth information back to the portable, waterproof computer system.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201161473784P | 2011-04-10 | 2011-04-10 | |
US61/473,784 | 2011-04-10 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012142042A1 (en) | 2012-10-18 |
Family
ID=47009657
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2012/032928 WO2012142042A1 (en) | 2011-04-10 | 2012-04-10 | Satellite telecommunications using a computer system adapted for harsh environments |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2012142042A1 (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5542104A (en) * | 1993-05-27 | 1996-07-30 | Nec Corporation | Portable satellite communication equipment with foldable flat antennae common to both transmission and reception |
US5915020A (en) * | 1995-11-21 | 1999-06-22 | Hughes Electronics Corporation | Portable satellite earth station |
US20100265129A1 (en) * | 2004-09-14 | 2010-10-21 | St Electronics (Satcom & Sensor Systems) Pte Ltd | Portable satellite terminal |
US20090034656A1 (en) * | 2007-06-29 | 2009-02-05 | Lg Electronics Inc. | Broadcast receiving system and method for processing broadcast signals |
US20110067082A1 (en) * | 2009-08-17 | 2011-03-17 | Weigel Broadcasting Co. | System and method for remote live audio-visual production |
Cited By (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP3136742A1 (en) * | 2015-08-28 | 2017-03-01 | Imagination Technologies Limited | Bandwidth management |
US10587523B2 (en) | 2015-08-28 | 2020-03-10 | Imagination Technologies Limited | Bandwidth Management |
US10965603B2 (en) | 2015-08-28 | 2021-03-30 | Imagination Technologies Limited | Bandwidth management |
US11489781B2 (en) | 2015-08-28 | 2022-11-01 | Imagination Technologies Limited | Bandwidth management |
US9973816B2 (en) | 2015-11-18 | 2018-05-15 | At&T Intellectual Property I, L.P. | Media content distribution |
US10149011B2 (en) | 2015-11-18 | 2018-12-04 | At&T Intellectual Property I, L.P. | Media content distribution |
US10667011B2 (en) | 2015-11-18 | 2020-05-26 | At&T Intellectual Property I, L.P. | Media content distribution |
US10945038B2 (en) | 2015-11-18 | 2021-03-09 | At&T Intellectual Property I, L.P. | Media content distribution |
US10341225B2 (en) | 2016-12-30 | 2019-07-02 | Hughes Network Systems, Llc | Bonding of satellite terminals |
CN106876878A (en) * | 2017-02-21 | 2017-06-20 | 协同通信技术有限公司 | Communication system based on satellite antenna and satellite antenna |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9635252B2 (en) | Live panoramic image capture and distribution | |
US6185737B1 (en) | Method and apparatus for providing multi media network interface | |
US8831780B2 (en) | System and method for creating virtual presence | |
EP2605484A1 (en) | A transceiver unit and a method for generating bonded streams of data | |
KR20200134677A (en) | Ship remote control system using real time ship monitoring | |
US20220147042A1 (en) | Near Real-Time Data and Video Streaming System for a Vehicle, Robot or Drone | |
US11438638B2 (en) | Systems and methods for extraterrestrial streaming | |
WO2012142042A1 (en) | Satellite telecommunications using a computer system adapted for harsh environments | |
CN106331613A (en) | Communication method and system based on unmanned aerial vehicle | |
CN105208335B (en) | The aerial high definition multidimensional of high power zoom unmanned plane investigates Transmission system in real time | |
KR20100027246A (en) | Apparatus, systems and methods to synchronize communication of content to a presentation device and a mobile device | |
KR20180123847A (en) | Image Processing Device and Image Processing Method Performing Slice-based Compression | |
KR101712542B1 (en) | Data Backup QoS System on Failover and Method thereof | |
RU2689187C2 (en) | Decoding and synthesizing frames for partial video data | |
EP3138294A1 (en) | Content message for video conferencing | |
EP1245114B1 (en) | Method and system for video monitoring | |
US10666351B2 (en) | Methods and systems for live video broadcasting from a remote location based on an overlay of audio | |
CN105208336A (en) | High-power zoom unmanned aerial vehicle aerial high-definition multi-dimension real-time investigation transmitting method | |
CA3227920A1 (en) | Integrated meo-leo satellite communication system | |
CN102595111A (en) | Transmission method, device and system for multi-view coding stream | |
KR20180078931A (en) | A drone and a drone system which has a switching transmission capability of video streaming from dual cameras | |
WO2022091213A1 (en) | Video image communication system, reception device, transmission device, method, program, and recording medium with program recorded therein | |
KR102515413B1 (en) | Real-time video transmission system for remote ship maintenance | |
KR102145285B1 (en) | Transmitting and receiving apparatus for providing a real-time image and a high-quality image | |
CN115529298B (en) | System, method and device for transmitting dense video and audio |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12771218 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 12771218 Country of ref document: EP Kind code of ref document: A1 |