WO2011131230A1 - System and method to display a user interface in a three-dimensional display - Google Patents
- Publication number
- WO2011131230A1 (PCT/EP2010/055201)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- user interface
- blending
- background
- interface background
- distance
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
- H04N5/445—Receiver circuitry for the reception of television signals according to analogue transmission standards for displaying additional information
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/156—Mixing image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/172—Processing image signals comprising non-image signal components, e.g. headers or format information
- H04N13/183—On-screen display [OSD] information, e.g. subtitles or menus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/361—Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/80—Generation or processing of content or additional data by content creator independently of the distribution process; Content per se
- H04N21/81—Monomedia components thereof
- H04N21/816—Monomedia components thereof involving special video data, e.g. 3D video
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/47—End-user applications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
- H04N5/50—Tuning indicators; Automatic tuning control
Definitions
- Embodiments relate generally to displaying user interfaces or menus on a three-dimensional display. More particularly, embodiments relate to displaying a user interface or menu on a 3DTV or other device configured to display three- dimensional video or images.
- 3DTVs employ three-dimensional presentation methods to project a television program into a realistic three- dimensional field.
- 3DTVs can display video or images where objects can be shown to appear to project out of the screen and/or behind the screen.
- the basic concept underlying 3DTV is the stereoscopic nature of the human visual system. That is, when two shifted views are shown separately to a person's left and right eyes, the human visual system can perceive depth based on the displacement between the views.
- user interfaces (UIs)
- menus having user-selectable options are commonly displayed in modern televisions. These menus provide users the ability to select a variety of features to affect the viewing experience. For example, user interfaces often provide menus to allow a user to select television programs to view, options to view and/or save television programs, and options to control how television programs are displayed.
- the user interface can also include other images or video superimposed on a video or image, including, for example, scrolling text and picture-in-picture.
- Such user interfaces are placed on top of the video or image content.
- the user interface can be displayed with semi-transparency to allow a user to continue to view the video underlying the user interface.
- Such user interfaces can be implemented using, for example, microprocessors such as the PNX85500 available from Trident Microsystems of Santa Clara, California.
- the present invention is a system for presenting a user interface in a three-dimensional display.
- the system includes a device having a display to display a three-dimensional video or image and a processor.
- the processor can be located in a set-top box or in the device itself.
- the processor is configured to cause a user interface to be displayed on the display.
- the present invention is a method for presenting a user interface on a display that displays three- dimensional video or images.
- the method includes generating a user interface having a background defined by an original background border, displaying the user interface on the display, defining an extended portion of the background with an extended border; and performing graphical processing in the extended portion of the user interface background.
- the present invention is a device to process a three-dimensional video or image.
- the device includes an input for receiving an image for display as a three-dimensional video or image and a processor configured to generate a graphical overlay having a foreground portion and a background portion.
- the graphical overlay is to be displayed in combination with the three-dimensional image, wherein the processor generates an extended background portion and performs graphical processing in the extended background portion.
- the graphical overlay can be a user interface.
- the device can be, for example, a system-on-a- chip located on a television.
- the nature of the graphical processing depends on whether the user interface background is opaque, semi-transparent, or fully transparent.
- if the user interface background is opaque, the extended portion of the user interface background is blended with the image or video underlying the extended portion.
- the blending is performed in accordance with a varying alpha value.
- the alpha value varies as a function of distance from an original border of the user interface background.
- the alpha value decreases as distance from an original border of the user interface background increases.
- the alpha value decreases non-linearly as distance from an original border of the user interface background increases.
- An exemplary non-linear function is an exponential function.
- the alpha value decreases linearly as distance from an original border of the user interface background increases.
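The alpha falloff described in the embodiments above can be sketched as a pair of simple functions. This is an illustrative sketch, not the patented implementation; the parameter names and the exponential rate `k` are assumptions.

```python
import math

def alpha_linear(distance, extension):
    # Alpha falls linearly from 1.0 (opaque) at the original border
    # (distance 0) to 0.0 (transparent) at the extended border
    # (distance == extension).
    return max(0.0, 1.0 - distance / extension)

def alpha_exponential(distance, extension, k=5.0):
    # Alpha falls off exponentially with distance; k is an illustrative
    # rate constant, so alpha only approaches (never exactly reaches) 0.
    return math.exp(-k * distance / extension)
```

Either function maps a pixel's distance from the original border to a blending weight for the extended portion of the background.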
- when the user interface background is semi-transparent, content (including text and two- or three-dimensional images or icons) associated with the user interface is presented to appear as if projected out of the screen.
- the extended portion of the user interface background is blended with the video or image underlying the extended portion.
- the blending is performed in accordance with a varying alpha value.
- the alpha value varies as a function of distance from an original border of the user interface background.
- the alpha value decreases non-linearly as distance from an original border of the user interface background increases.
- An exemplary non-linear function is an exponential function.
- the alpha value decreases linearly as distance from an original border of the user interface background increases.
- the video or image underlying the user interface background within a region defined by an original border of the user interface background is blurred.
- content can include text, two- or three-dimensional images, and icons.
- the video or image underlying the extended portion of the user interface background is blurred.
- a blur radius used in blurring the video or image underlying the extended portion decreases as distance from an original border of the user interface background increases.
- the blur radius used in blurring the video or image underlying the extended portion decreases linearly as distance from an original border of the user interface background increases.
- the blur radius decreases non-linearly as distance from an original border of the user interface background increases.
- An exemplary non-linear function is an exponential function.
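As a sketch of the decreasing blur radius just described, assuming the linear variant; the 80-to-40-pixel defaults follow the typical range given later in the description, and the parameter names are illustrative:

```python
def blur_radius(distance, extension, r_max=80.0, r_min=40.0):
    # Radius shrinks linearly from r_max at the original border
    # (distance 0) to r_min at the extended border (distance == extension).
    t = min(1.0, max(0.0, distance / extension))
    return r_max - t * (r_max - r_min)
```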
- Figure 1 is a schematic diagram of a simple television entertainment system according to an embodiment of the present invention
- Figure 2 illustrates a left presentation and a right presentation of a three-dimensional presentation having an object.
- Figure 3 illustrates a left presentation and a right presentation of a three-dimensional presentation having an object and an exemplary user interface.
- Figure 4 illustrates a left presentation and a right presentation according to an embodiment in which the user interface has an opaque background.
- Figure 5 illustrates a left presentation and a right presentation according to an embodiment in which the user interface has a semi-transparent background.
- Figure 6 illustrates a left presentation and a right presentation according to an embodiment in which the user interface has a fully transparent background.
- Figure 7 is a flow chart of a method for presenting a user interface (UI) to a user on a display that can be configured to display a three-dimensional presentation.
- Figure 1 is a schematic diagram of a television entertainment system 102 according to an embodiment of the present invention.
- television entertainment system 102 includes a television 104 having a display 105.
- a set top box 106 accepts a television signal from a source through a connector 108.
- the source can feed a television signal through connector 108 from any television signal source, including for example, a satellite television service provider or a cable television service provider.
- Set top box 106 receives the television signal through connector 108 from the television service provider
- television 104 can provide three-dimensional presentations of television video and images. The three-dimensional presentation is shown to a user on display 105.
- Display 105 can be any display that can provide a three-dimensional view to a user.
- set top box 106 includes a processor 107.
- Processor 107 can be any processor that can be configured to perform the processing described herein.
- An exemplary such processor is the PNX85500 available from Trident Microsystems of Santa Clara, California.
- One function of processor 107 is to cause a user interface, including without limitation a menu of user-selectable options or a subtitle, to be displayed on display 105.
- embodiments of the invention remove unsettling effects caused when conventional user interface techniques are applied in a three-dimensional video or image presentation.
- set-top box 106 is not included, and processor 107 is located in a device such as television 104 or any other device to provide the user interface.
- the device is a system-on-a-chip (SOC).
- SOC includes an input to acquire at least a three-dimensional image or video for display and a processor configured to generate a graphical overlay having a foreground portion and a background portion.
- processor 107 generates an extended background portion.
- Processor 107 performs graphical processing such as described herein in the extended background portion.
- An exemplary such graphical overlay is the user interface described herein.
- Display 105 can be a display other than on a television set, and devices other than television sets can be used in embodiments of the present invention.
- display 105 can be a display used on a device such as a portable video player, a personal digital assistant, a tablet computer such as an Apple iPad, or a telephone such as an Apple iPhone, RIM Blackberry, or other telephone configured to display three-dimensional images or video, a screen on a camera configured to display three- dimensional images, or a screen on any other device that can present three-dimensional images or video.
- processor 107 is located in the device itself to provide the user interface.
- Figure 2 illustrates a left presentation 202a and a right presentation 202b of a three-dimensional presentation having an object 204.
- Display 206 can be any display that can display three-dimensional presentations, such as display 105 described above with respect to Figure 1.
- the three-dimensional effect is provided by the position shift of an object, such as object 204, or a pixel in the left and right presentations 202a and 202b on display 206.
- Figure 3 illustrates a left presentation 302a and a right presentation 302b of a three-dimensional presentation having an object 304.
- Display 306 can be any display that can display three-dimensional presentations, such as display 105 described above with respect to Figure 1.
- Figure 3 also illustrates an exemplary user interface 308.
- User interface 308 has an original border 309.
- the dotted line showing border 309 is shown in Figure 3 for clarity to show the original border of user interface 308. However, in an embodiment, the dotted line is not presented to the user on display 306.
- user interface 308 is a menu of selectable channels, Ch1 to Ch8. It should be understood that user interface 308 can be any user interface in the context of an embodiment of the present invention including a menu, a subtitle, or any other user interface. Conventionally, user interface 308 is presented to the user at screen depth level.
- original border 309 of user interface 308 is extended by an amount in the horizontal and vertical directions as shown by extended border 310 to define an extended portion 312 of the background of user interface 308. In an embodiment, this amount is 5% of the screen height in the vertical direction and 5% of the screen width in the horizontal direction. The amount to extend original border 309 of user interface 308 can differ from 5% depending on implementation. In addition, the amount of extension need not be the same in the vertical and horizontal directions. In an embodiment, the amount of extension is preset by the set top box manufacturer. In another embodiment, the amount of extension is user-programmable, using, for example, a set top box configuration mode. In embodiments, graphical processing as described below is performed on extended portion 312 of the presentation falling between extended border 310 and original border 309 of user interface 308 to overcome the unsettling effects of three-dimensional user interface presentation.
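Computing the extended border from an original background rectangle can be sketched as follows. The rectangle representation and the function name are assumptions for illustration; the 5% fractions are the example amounts above, and `fx`/`fy` stand in for the preset or user-programmable extension amounts.

```python
def extend_border(rect, screen_w, screen_h, fx=0.05, fy=0.05):
    # rect is (x, y, width, height) of the original UI background border.
    # Each side moves outward by fx * screen width horizontally and
    # fy * screen height vertically (5% of each in the example above);
    # the two fractions need not be equal.
    x, y, w, h = rect
    dx, dy = fx * screen_w, fy * screen_h
    return (x - dx, y - dy, w + 2 * dx, h + 2 * dy)
```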
- Figure 4 illustrates a left presentation 402a and a right presentation 402b to overcome the unsettling effect of three- dimensional user interface presentation according to an embodiment of the present invention where the user interface has an opaque (non-transparent) background.
- Display 406 can be any display that can display three-dimensional presentations to a user, such as display 105 described above with respect to Figure 1.
- left presentation 402a and right presentation 402b include an object 404 and a user interface 408.
- user interface 408 can be any user interface including a menu, a subtitle, or any other user interface.
- User interface 408 has an original border 409. The dotted line showing original border 409 is shown in Figure 4 for clarity to show the original border of user interface 408. However, in an embodiment, the dotted line is not presented to the user on display 406.
- user interface 408 has an opaque (non-transparent) background.
- an extended border 410 is defined.
- the dotted line showing extended border 410 is shown in Figure 4 for clarity to show the extended border of user interface 408. However, in an embodiment, the dotted line is not presented to the user on display 406.
- an extended portion 412 of the background between extended border 410 and original border 409 of the background of user interface 408 is blended with the underlying content video or image with a decreasing alpha.
- alpha represents the amount of blending. Alpha ranges from 1 (opaque) to 0 (transparent).
- the value of alpha decreases from 1 (opaque) to 0 (fully transparent) as a function of the distance from the original border of the user interface to the extended border.
- alpha is determined as a non-linear function of pixel distance from the original border.
- An exemplary such non-linear function is an exponential function.
- alpha is determined as a linear function of pixel distance from the original border. Other functions can be used to determine alpha as would be apparent to those skilled in the art.
- alpha blending underlying video or image with extended portion 412 has the following properties.
- the variable alpha blending with the content at the border removes hard depth transitions between the user interface 408 plane and the content video plane.
- the variable alpha blending with extended portion 412 dampens (attenuates) the depth (disparity) of the content in a smooth manner toward the depth (disparity) of user interface 408.
- the text in user interface 408 having a non-transparent background can be read both with and without glasses.
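The per-pixel operation underlying these properties can be sketched as a standard alpha blend. This is an illustrative sketch; the channel ordering and 0-255 value range are assumptions, not taken from the patent.

```python
def blend_pixel(ui_rgb, video_rgb, alpha):
    # alpha = 1.0 keeps the opaque UI background colour; alpha = 0.0
    # shows only the underlying content video or image; values in
    # between mix the two, as in extended portion 412.
    return tuple(alpha * u + (1.0 - alpha) * v
                 for u, v in zip(ui_rgb, video_rgb))
```

Applying this with an alpha that decreases across the extended portion produces the smooth depth transition described above.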
- Figure 5 illustrates a left presentation 502a and a right presentation 502b to overcome the unsettling effect of three- dimensional user interface presentation according to an embodiment of the present invention where the user interface has a semi-transparent background.
- Left presentation 502a and right presentation 502b are presented to a user on a display 506.
- Display 506 can be any display that can be configured to show three-dimensional presentations to a user, such as display 105 described above with respect to Figure 1.
- left presentation 502a and right presentation 502b include an object 504 and a user interface 508.
- user interface 508 can be any user interface including a menu, a subtitle, or any other user interface.
- User interface 508 has an original border 509.
- the dotted line showing original border 509 is shown in Figure 5 for clarity to show the original border of user interface 508. However, in an embodiment, the dotted line is not presented to the user on display 506.
- user interface 508 has a semi-transparent background.
- the text of user interface 508 is position-shifted so as to appear to be projected out of the screen.
- any two- or three-dimensional icons or images in user interface 508 are position-shifted so as to appear to be projected out of the screen. In this manner, intersection of the text of user interface 508 with the video or image is avoided.
- an extended border 510 is defined.
- the dotted line showing extended border 510 is shown in Figure 5 for clarity to show the extended border of user interface 508. However, in an embodiment, the dotted line is not presented to the user on display 506.
- an extended portion 512 of the background between extended border 510 and original border 509 of the background of user interface 508 is blended with the underlying content video or image with a decreasing alpha, where alpha represents the amount of blending.
- Alpha ranges from the value of alpha used to provide the semi-transparency of the user interface background (semi-transparent) to 0 (fully transparent).
- extended portion 512 can be smaller than portion 412 shown in Figure 4.
- the value of alpha decreases from the semi-transparency alpha value to 0 (fully transparent) as a function of the distance from the original border of the user interface to the extended border.
- alpha is determined as a non-linear function of pixel distance from the original border.
- An exemplary such non-linear function is an exponential function.
- alpha is determined as a linear function of pixel distance from the original border. Other functions can be used to determine alpha as would be apparent to those skilled in the art.
- alpha blending underlying video or image with extended portion 512 has the following properties. The variable alpha blending with the content at the border removes hard depth transitions between the user interface 508 plane and the content video plane.
- variable alpha blending with portion 512 also dampens (attenuates) the depth (disparity) of the content in a smooth manner toward the depth (disparity) of user interface 508.
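For the semi-transparent case the same falloff applies, but starting from the background's own alpha rather than 1. A sketch, assuming the linear variant; the starting value of 0.5 for `a_semi` is illustrative:

```python
def alpha_semi(distance, extension, a_semi=0.5):
    # Alpha falls linearly from the semi-transparency value a_semi at
    # the original border (distance 0) to 0.0 at the extended border.
    # Because the starting alpha is already below 1, the extension
    # needed can be smaller than in the opaque case (portion 512 vs 412).
    return max(0.0, a_semi * (1.0 - distance / extension))
```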
- Figure 6 illustrates a left presentation 602a and a right presentation 602b to overcome the unsettling effect of three- dimensional user interface presentation according to an embodiment of the present invention where the user interface has a fully transparent background.
- Left presentation 602a and right presentation 602b are presented to a user on a display 606.
- Display 606 can be any display that can display three-dimensional presentations to a user, such as display 105 described above with respect to Figure 1.
- left presentation 602a and right presentation 602b include an object 604 and a user interface 608.
- user interface 608 can be any user interface including a menu, a subtitle, or any other user interface.
- User interface 608 has an original border 609. The dotted line showing original border 609 is shown in Figure 6 for clarity to show the original border of user interface 608. However, in an embodiment, the dotted line is not presented to the user on display 606.
- user interface 608 has a fully transparent background.
- the text of user interface 608 is position-shifted so as to appear to be projected out of the screen.
- any two- or three-dimensional icons or images in user interface 608 are position-shifted so as to appear to be projected out of the screen. In this manner, intersection of the text of user interface 608 with the video or image is avoided.
- an extended border 610 can be created.
- the dotted line showing extended border 610 is shown in Figure 6 for clarity to show the extended border of user interface 608. However, in an embodiment, the dotted line is not presented to the user on display 606.
- an extended portion 612 of the background between extended border 610 and original border 609 of the background of user interface 608 is blurred.
- the video or image underlying the original background of user interface 608 is blurred.
- the video or image underlying the original background of user interface 608 is blurred using a maximum available blur radius. Other values of blur radius can be used depending on implementation.
- the video or image underlying extended portion 612 is blurred.
- the blur radius decreases with increasing distance from original border 609 of user interface 608 to the extended border 610.
- blur radius begins with a maximum available blur radius and decreases with increasing distance from original border 609 of user interface 608 to the extended border 610.
- Typical values for blur radius range from 80 to 40 pixels. Other blur radius ranges can be used depending on implementation.
- the blurring continues to the extent of the extension of user interface 608 with the blur radius decreasing with increasing distance from original border 609.
- blur radius is determined as a non-linear function of pixel distance from the original border.
- An exemplary such nonlinear function is an exponential function.
- blur radius is determined as a linear function of pixel distance from the original border. Other functions can be used to determine blur radius as would be apparent to those skilled in the art. In addition, other ranges of blur radius can be used. Blurring the video or image behind the user interface as described above makes the user interface text and any two- or three-dimensional images or icons easier to read.
- Varying the blur radius in extended portion 612 removes hard depth transitions between the text of the user interface plane and the content video or image plane.
- extended portion 612 can be smaller than portion 412 shown in Figure 4.
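The varying blur over extended portion 612 can be sketched as a one-dimensional box blur whose radius shrinks toward the extended border. This is an illustrative sketch: small radii are used for brevity (the text cites 80 to 40 pixels as a typical range), and a real implementation would filter in two dimensions.

```python
def variable_blur(strip, r_max=8, r_min=4):
    # strip holds luminance samples running from the original border
    # (index 0, largest radius) to the extended border (last index,
    # smallest radius); the radius decreases linearly in between.
    n = len(strip)
    out = []
    for i in range(n):
        t = i / (n - 1) if n > 1 else 0.0
        r = round(r_max - t * (r_max - r_min))
        lo, hi = max(0, i - r), min(n, i + r + 1)
        out.append(sum(strip[lo:hi]) / (hi - lo))  # box average
    return out
```

A flat region passes through unchanged, while a hard edge in the underlying content is smoothed, which is what removes the hard depth transition at the border.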
- Figure 7 is a flow chart of a method for presenting a user interface (UI) to a user on a display that can be configured to display a three-dimensional presentation.
- the display can be any screen to display three-dimensional images or video such as display 105 described above with respect to Figure 1.
- a user interface, such as a menu of user-selectable options, a subtitle, or any other user interface, is generated.
- a user interface is displayed on the display.
- an extended border is created around the user interface background to define an extended portion of the user interface background.
- a determination is made whether the user interface background is opaque, semi-transparent, or transparent.
- If in step 706 it is determined that the user interface background is opaque, operation of the method continues in step 708.
- In step 708, the user interface background in the extended portion is blended with the underlying video or image being displayed on the display.
- the blending is performed using an alpha value that indicates an amount of blending transparency.
- alpha ranges from 1 (opaque) to 0 (fully transparent).
- alpha for blending is determined as a function of distance from the original border of the user interface background. For example, in an embodiment, alpha is decreased as the distance from the original border of the user interface increases.
- in an embodiment, alpha decreases from 1 to 0 as a function of the distance from the original border of the user interface to the extended border.
- alpha is determined as a non-linear function of pixel distance from the original border.
- An exemplary such non-linear function is an exponential function.
- alpha is determined as a linear function of pixel distance from the original border. Other functions can be used to determine alpha as would be apparent to those skilled in the art.
- If in step 706 it is determined that the user interface background is semi-transparent, operation of the method continues in step 710.
- In step 710, the text of the user interface is presented to the user to appear as projecting out of the display. Further, in step 710, any two- or three-dimensional images or icons of the user interface are presented to the user to appear as projecting out of the display.
- In step 712, the user interface background in the extended portion is blended with the underlying video or image being displayed on the display.
- blending is performed using an alpha value that indicates an amount of blending transparency.
- alpha ranges from a value of alpha corresponding to the semi-transparent user interface background (semi-transparent) to 0 (fully transparent).
- alpha for blending is determined as a function of distance from the original border of the user interface background. For example, in an embodiment, alpha is decreased as the distance from the original border of the user interface increases. In an embodiment, for example, alpha decreases from the semi-transparency alpha value to 0 as a function of the distance from the original border of the user interface to the extended border. In an embodiment, alpha is determined as a linear function of pixel distance from the original border. Other functions can be used to determine alpha as would be apparent to those skilled in the art.
- In step 714, the text of the user interface is presented to the user to appear as projecting out of the display. Further, in step 714, any two- or three-dimensional images or icons of the user interface are presented to the user to appear as projecting out of the display. Operation of the method then continues in step 716 where the video or image underlying the user interface is blurred. For example, in an embodiment, the video or image underlying the original background of the user interface is blurred using a maximum available blur radius. Other values of blur radius can be used depending on implementation.
- the video or image in the extended portion is blurred using a blur radius determined as a function of distance from the original border of the user interface background.
- the blur radius is decreased as a distance from the original border of the user interface increases.
- blur radius begins with a maximum available blur radius and decreases with increasing distance from original border of the user interface.
- blur radius is decreased from 80 pixels to 40 pixels.
- blur radius is determined as a non-linear function of pixel distance from the original border. An exemplary such non-linear function is an exponential function.
- blur radius is determined as a linear function of pixel distance from the original border. Other functions can be used to determine blur radius as would be apparent to those skilled in the art.
- processor 107 can be located in a set top box 106 of Figure 1 or in a device displaying the three-dimensional image or video.
- Processor 107 can be any processor that can be configured with software programmed to execute the operations described herein, for example, with respect to Figure 7.
- An exemplary such processor is the PNX85500 available from Trident Microsystems of Santa Clara, California.
- determination step 706 is not required.
- processor 107 is preconfigured to implement a user interface having either an opaque, a semi-transparent, or a fully transparent background.
- if processor 107 implements a user interface having only an opaque background, only steps 702, 704, and 708 of Figure 7 are required. If processor 107 implements a user interface having only a semi-transparent background, only steps 702, 704, 710, and 712 are required. If processor 107 implements a user interface having only a fully transparent background, only steps 702, 704, 714, and 716 are required.
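The branching just described can be sketched as a small dispatch over the background type. The step numbers come from Figure 7; the string keys and function name are illustrative assumptions.

```python
def steps_for(background):
    # Steps 702 and 704 (generating the interface and its extended
    # border) run in every case; the remaining steps depend on the
    # background type determined in step 706.
    common = [702, 704]
    branch = {
        "opaque": [708],                 # blend extended portion
        "semi-transparent": [710, 712],  # project content, then blend
        "transparent": [714, 716],       # project content, then blur
    }
    return common + branch[background]
```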
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
- Controls And Circuits For Display Device (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201080066549.0A CN103039078B (en) | 2010-04-20 | 2010-04-20 | The system and method for user interface is shown in three dimensional display |
EP10716808A EP2561676A1 (en) | 2010-04-20 | 2010-04-20 | System and method to display a user interface in a three-dimensional display |
KR1020127027288A KR20130062907A (en) | 2010-04-20 | 2010-04-20 | System and method to display a user interface in a three-dimensional display |
JP2013505338A JP2013530413A (en) | 2010-04-20 | 2010-04-20 | System and method for displaying a user interface on a three-dimensional display |
PCT/EP2010/055201 WO2011131230A1 (en) | 2010-04-20 | 2010-04-20 | System and method to display a user interface in a three-dimensional display |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/EP2010/055201 WO2011131230A1 (en) | 2010-04-20 | 2010-04-20 | System and method to display a user interface in a three-dimensional display |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2011131230A1 true WO2011131230A1 (en) | 2011-10-27 |
Family
ID=43037600
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2010/055201 WO2011131230A1 (en) | 2010-04-20 | 2010-04-20 | System and method to display a user interface in a three-dimensional display |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP2561676A1 (en) |
JP (1) | JP2013530413A (en) |
KR (1) | KR20130062907A (en) |
CN (1) | CN103039078B (en) |
WO (1) | WO2011131230A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016507936A (en) * | 2012-12-24 | 2016-03-10 | Thomson Licensing | Apparatus and method for displaying stereoscopic image |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2472878A1 (en) * | 2010-12-31 | 2012-07-04 | Advanced Digital Broadcast S.A. | Method and apparatus for combining images of a graphic user interface with a stereoscopic video |
CN109729417B (en) * | 2019-03-28 | 2019-09-10 | 深圳市酷开网络科技有限公司 | A kind of video-see play handling method, smart television and storage medium |
GB2602027B (en) * | 2020-12-15 | 2024-08-21 | Samsung Electronics Co Ltd | Display apparatus |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1170942A1 (en) * | 2000-01-24 | 2002-01-09 | Matsushita Electric Industrial Co., Ltd. | Image synthesizing device, recorded medium, and program |
US6577350B1 (en) * | 1998-12-21 | 2003-06-10 | Sony Corporation | Method and apparatus for displaying an electronic program guide |
US20040220791A1 (en) * | 2000-01-03 | 2004-11-04 | Interactual Technologies, Inc. A California Corpor | Personalization services for entities from multiple sources |
EP1739980A1 (en) * | 2005-06-30 | 2007-01-03 | Samsung SDI Co., Ltd. | Stereoscopic image display device |
US20070058034A1 (en) * | 2005-09-12 | 2007-03-15 | Shunichi Numazaki | Stereoscopic image display device, stereoscopic display program, and stereoscopic display method |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2001285749A (en) * | 2000-01-24 | 2001-10-12 | Matsushita Electric Ind Co Ltd | Image synthesizer, recording medium and program |
WO2004107763A1 (en) * | 2003-05-28 | 2004-12-09 | Sanyo Electric Co., Ltd. | 3-dimensional video display device and program |
US8279241B2 (en) * | 2008-09-09 | 2012-10-02 | Microsoft Corporation | Zooming graphical user interface |
MX2011002553A (en) * | 2008-09-18 | 2011-04-04 | Panasonic Corp | Stereoscopic video reproduction device and stereoscopic video display device. |
KR20110018261A (en) * | 2009-08-17 | 2011-02-23 | 삼성전자주식회사 | Method and apparatus for processing text subtitle data |
2010
- 2010-04-20 WO PCT/EP2010/055201 patent/WO2011131230A1/en active Application Filing
- 2010-04-20 CN CN201080066549.0A patent/CN103039078B/en not_active Expired - Fee Related
- 2010-04-20 KR KR1020127027288A patent/KR20130062907A/en not_active Application Discontinuation
- 2010-04-20 JP JP2013505338A patent/JP2013530413A/en active Pending
- 2010-04-20 EP EP10716808A patent/EP2561676A1/en not_active Withdrawn
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6577350B1 (en) * | 1998-12-21 | 2003-06-10 | Sony Corporation | Method and apparatus for displaying an electronic program guide |
US20040220791A1 (en) * | 2000-01-03 | 2004-11-04 | Interactual Technologies, Inc. A California Corpor | Personalization services for entities from multiple sources |
EP1170942A1 (en) * | 2000-01-24 | 2002-01-09 | Matsushita Electric Industrial Co., Ltd. | Image synthesizing device, recorded medium, and program |
EP1739980A1 (en) * | 2005-06-30 | 2007-01-03 | Samsung SDI Co., Ltd. | Stereoscopic image display device |
US20070058034A1 (en) * | 2005-09-12 | 2007-03-15 | Shunichi Numazaki | Stereoscopic image display device, stereoscopic display program, and stereoscopic display method |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2016507936A (en) * | 2012-12-24 | 2016-03-10 | Thomson Licensing | Apparatus and method for displaying stereoscopic image |
US10091495B2 (en) | 2012-12-24 | 2018-10-02 | Thomson Licensing | Apparatus and method for displaying stereoscopic images |
Also Published As
Publication number | Publication date |
---|---|
KR20130062907A (en) | 2013-06-13 |
JP2013530413A (en) | 2013-07-25 |
CN103039078B (en) | 2015-09-23 |
CN103039078A (en) | 2013-04-10 |
EP2561676A1 (en) | 2013-02-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8605136B2 (en) | 2D to 3D user interface content data conversion | |
US8930838B2 (en) | Display apparatus and display method thereof | |
US9021399B2 (en) | Stereoscopic image reproduction device and method for providing 3D user interface | |
US20100091012A1 (en) | 3 menu display | |
EP2337366A1 (en) | Image display device and method for operating the same | |
EP2381692A2 (en) | Image display apparatus and method for controlling the same | |
RU2598989C2 (en) | Three-dimensional image display apparatus and display method thereof | |
JP2011216937A (en) | Stereoscopic image display device | |
EP2561676A1 (en) | System and method to display a user interface in a three-dimensional display | |
US9118893B2 (en) | Display control apparatus, display control method, and program | |
CN102300114B (en) | The display packing of stereoscopic display device and stereoscopic display device | |
EP2418568A1 (en) | Apparatus and method for reproducing stereoscopic images, providing a user interface appropriate for a 3d image signal | |
KR20140000193A (en) | Stereoscopic menu control | |
KR20130070592A (en) | Data transmission system | |
US20130021454A1 (en) | 3d display apparatus and content displaying method thereof | |
US9547933B2 (en) | Display apparatus and display method thereof | |
KR101713786B1 (en) | Display apparatus and method for providing graphic user interface applied to the same | |
EP2426933A2 (en) | Display control device, display control method, and program | |
EP3389267B1 (en) | Display apparatus and method | |
US9071833B1 (en) | Two-dimensional supplementary information in a three-dimensional image | |
JP2014225736A (en) | Image processor | |
KR101878808B1 (en) | Image display apparatus and method for operating the same | |
KR101880479B1 (en) | Image display apparatus, and method for operating the same | |
EP3389266A1 (en) | Viewing apparatus and method | |
US20140192150A1 (en) | Image processing device, method for controlling image processing device, control program, and computer-readable recording medium which records the control program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080066549.0 Country of ref document: CN |
|
WWE | Wipo information: entry into national phase |
Ref document number: 8647/CHENP/2011 Country of ref document: IN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10716808 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2013505338 Country of ref document: JP Kind code of ref document: A Ref document number: 20127027288 Country of ref document: KR Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010716808 Country of ref document: EP |