WO2012127836A1 - Generation device, display device, playback device, glasses - Google Patents
Generation device, display device, playback device, glasses
- Publication number
- WO2012127836A1 (application PCT/JP2012/001852, JP2012001852W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- cancellation
- glasses
- shutter
- display device
- Prior art date
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
- G09G3/003—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/172—Processing image signals image signals comprising non-image signal components, e.g. headers or format information
- H04N13/183—On-screen display [OSD] information, e.g. subtitles or menus
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/324—Colour aspects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/349—Multi-view displays for displaying three or more geometrical viewpoints without viewer tracking
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/356—Image reproducers having separate monoscopic and stereoscopic modes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/361—Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/341—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N2013/40—Privacy aspects, i.e. devices showing different images to different viewers, the images not being viewpoints of the same scene
- H04N2013/403—Privacy aspects, i.e. devices showing different images to different viewers, the images not being viewpoints of the same scene the images being monoscopic
Definitions
- The synchronization technique between the display device and the glasses switches whether or not the user is shown an image by synchronizing the image to be displayed on the display device with the shutter state of the glasses.
- multi-view support and multi-user support can be realized.
- Multi-view support means individually realizing display of the individual views constituting a stereoscopic view and display of the view constituting a planar view; specifically, it corresponds to displaying the left view and displaying the right view.
- Multi-user support is to individually provide images to be viewed by each of a plurality of users.
- Display switching refers to switching contents to be displayed in each display period obtained by dividing one frame into four or six.
- The stereoscopic video displayed on a multi-view compatible display device is intended to be viewed with the glasses worn.
- To a user not wearing the glasses, the screen content displayed on the television is unpleasant to look at because it appears blurred from side to side. Therefore, the conventional multi-view compatible display device cannot be said to give sufficient consideration to users who do not wear glasses.
- An object of the present invention is to provide a generation device capable of generating an image that does not give an unpleasant impression to a user who does not wear glasses.
- A generation device that can solve this problem is a generation device that generates an image to be viewed by a user wearing glasses, comprising acquisition means for acquiring a normal image and generation means for generating a cancellation image for the acquired normal image,
- the glasses are worn by the user when viewing one of a plurality of images displayed in a time-division manner in the frame period of the video signal,
- the normal image and the cancellation image are images to be subjected to the time-division display,
- the luminance of each pixel constituting the cancellation image is set to a value larger than a difference value obtained by subtracting the luminance of each pixel in the normal image from the maximum value in the numerical value range of luminance.
- The above problem is solved by devising the display method of the display device and the control method of the shutter glasses: by switching at high speed between an image and an image that cancels it, what is visible is made to differ depending on whether or not the glasses are worn.
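- As a rough numerical illustration of the luminance condition above (a minimal sketch assuming an 8-bit luminance range; the sample values are illustrative and not taken from the patent):

```python
# Toy check of the claimed relation between normal-image and cancellation-image luminance.
MAX_Y = 255

def exceeds_complement(y_normal: int, y_cancel: int) -> bool:
    """True if the cancellation luminance is larger than MAX_Y - y_normal."""
    return y_cancel > MAX_Y - y_normal

print(exceeds_complement(200, 60))   # True: 60 > 255 - 200 = 55
print(exceeds_complement(200, 40))   # False: at or below the plain complement
```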
- A home theater system including a recording medium, a playback device, a display device, and shutter-type glasses is shown. An example of viewing the left-eye image and the right-eye image through the active shutter glasses 103 is shown. A set of a left-eye image and a cancellation image, and the superimposed image produced by their time-division display, are shown.
- The internal configuration of the display device according to the first embodiment is shown. The synchronization with the shutter-type glasses by the synchronization signal in pattern 1 and the time-division display by the time-division processing unit are shown. The synchronization with the shutter-type glasses by the synchronization signal in pattern 2 and the time-division display by the time-division processing unit are shown.
- the synchronization with the shutter-type glasses by the synchronization signal in pattern 3 and the time-division display by the time-division processing unit are shown.
- the synchronization with the shutter-type glasses by the synchronization signal in the pattern 4 and the time-division display by the time-division processing unit are shown.
- the synchronization with the shutter-type glasses by the synchronization signal in the pattern 5 and the time-division display by the time-division processing unit are shown.
- The calculation principle for obtaining the inverted values of Y, Cr, and Cb is shown. The theoretical change in brightness on the screen and the actual change, for data in the numerical range 0-255, are shown.
- The internal structure of the playback apparatus to which the improvement specific to the third embodiment is added, and the internal structure of the display apparatus, are shown. A flowchart showing the initialization procedure of the display device (display) is shown. The internal structure of the playback apparatus according to the sixth embodiment is shown.
- The internal structure of the display apparatus and the shutter-type glasses according to the sixth embodiment is shown.
- An example in which the normal image of image A and the cancellation image of image A are displayed in this order according to a code sequence is shown. A figure supplementing the processing content of the components of the sixth embodiment from the aspect of a use case is shown. A figure showing the concept of error countermeasures by bit-width expansion is shown.
- The invention of the generation device and the display device provided with the above problem-solving means can be implemented as a television device, and the invention of the shutter-type glasses can be implemented as shutter-type glasses for viewing stereoscopic images on that television device.
- the invention of the reproducing apparatus can be implemented as a player device for reproducing the package medium, and the invention of the integrated circuit can be implemented as a system LSI incorporated in these devices.
- the invention of the program can be implemented as an executable program recorded in a computer-readable recording medium and installed in these devices.
- The present embodiment provides a multi-view generation device and a multi-user generation device that do not cause discomfort even to a user who looks at the screen of the display device without wearing glasses.
- When viewed by a user who is not wearing glasses, the display content is two or more viewpoint image data superimposed on each other, and this overlapped image becomes an obstacle; it is not appropriate simply to display a message prompting the user to put on glasses.
- An image in which two or more viewpoint image data overlap is annoying, so even if such a device is installed for a storefront demonstration, its appeal to users is limited. The present embodiment solves this problem.
- FIG. 1 shows a home theater system including a playback device, a display device, and shutter-type glasses.
- the home theater system includes a playback device 100, an optical disc 101, a remote controller 102, active shutter glasses 103, and a display device 200, and is used by a user.
- the playback device 100 is connected to the display device 200 and plays back the content recorded on the optical disc 101.
- the optical disc 101 supplies, for example, a movie work to the home theater system.
- The remote controller 102 is a device that accepts operations on a hierarchical GUI from the user. To accept such operations, the remote controller 102 is provided with a menu key for calling up the menus constituting the GUI, arrow keys for moving the focus among the GUI components constituting a menu, a determination key for performing a confirmation operation on a GUI component constituting a menu, a return key for returning from a hierarchical menu to a higher level, and numeric keys.
- the active shutter glasses 103 close one of the right eye shutter and left eye shutter and open the other in each of a plurality of display periods obtained by dividing the frame. In this way, a stereoscopic video is constructed.
- While the left-eye image is displayed, the right-eye shutter is set to the closed state.
- While the right-eye image is displayed, the left-eye shutter is set to the closed state.
- the shutter-type glasses have a wireless communication function, and can transmit the remaining amount of the built-in battery to the display device in response to a request from the display device 200.
- The display device 200 displays the stereoscopic video of the movie work. When displaying a stereoscopic video, two or more viewpoint image data constituting the stereoscopic video (the left-eye image and the right-eye image in the figure) are displayed in each of a plurality of display periods obtained by dividing a frame.
- FIG. 2 shows an example of viewing the left eye image and the right eye image through the active shutter glasses 103.
- the line of sight vw1 indicates the incidence of an image when the right eye is shielded by the active shutter glasses 103.
- a line of sight vw2 indicates the incidence of an image when the left eye is shielded by the active shutter glasses 103.
- the left eye image is viewed by this vw1.
- the right eye image is viewed by vw2.
- the user views the right eye image and the left eye image alternately, and a stereoscopic image is reproduced.
- FIG. 2 it can be seen that a stereoscopic video image appears at a place where the two lines of sight intersect.
- The screen content of the display device 200 assumes that the active shutter glasses 103 are worn, and is therefore not intended to be viewed without the shutter glasses. Taking viewing without shutter-type glasses into consideration, the first embodiment therefore provides a cancellation image for each of the left-eye image and the right-eye image and reproduces the cancellation images at the same ratio as the right-eye and left-eye images, thereby preventing the left-eye image or the right-eye image from being seen when viewing without the shutter-type glasses.
- FIG. 3 shows a set of a left-eye image and a cancellation image, and a superimposed image based on these time-division displays.
- The normal image of the left-eye image is drawn on the left side of the + symbol.
- The content of the normal image is not seen when viewing without shutter-type glasses.
- By providing time-division display of a normal image and a cancellation image, the effect that only a screen without shading is seen and the normal image cannot be recognized is exploited: a person wearing shutter-type glasses can see the normal image, while people without shutter glasses are prevented from recognizing it.
- The shutter function of the shutter-type glasses makes only the normal image visible and the cancellation image invisible to the person wearing them, ensuring that only the person wearing the shutter-type glasses sees the normal image.
- FIG. 4 shows an internal configuration of the display device according to the first embodiment.
- The display device includes an inter-device interface 1, a left-eye frame memory 2a, a right-eye frame memory 2b, memory controllers 3a and 3b, cancellation image generation units 4a and 4b, a time-division processing unit 5, a display circuit 6, a configuration register 7, a display pattern generation unit 8, and a synchronization signal transmission unit 9.
- Since this apparatus assumes processing of a left-eye image and a right-eye image, some of the components in this internal configuration exist in pairs that have the same configuration but are used either for the left eye or for the right eye. Such components, which share a configuration but differ in use, are distinguished from other components by combining the same numeral with the letters a and b. Because these paired components have the same configuration, only their common processing will be described.
- The number of systems of frame memories and cancellation image generation units is set to "2". This is the minimum configuration required to support two views (left eye and right eye) and two users (user A wearing shutter glasses A and user B wearing shutter glasses B).
- components of the display device will be described.
- The inter-device interface 1 transfers decoded video and audio through, for example, a multimedia cable compliant with the HDMI standard, a composite cable, or a component cable.
- HDMI can add various property information in addition to video.
- the left-eye frame memory 2a stores the left-eye image data transferred through the inter-device interface for each frame.
- the right-eye frame memory 2b stores the right-eye image data transferred through the inter-device interface for each frame.
- the memory controllers 3a and 3b generate a read destination address for the frame memory and instruct the frame memory 2 to read data from the read destination address.
- The cancellation image creation units 4a and 4b generate a cancellation image by converting the pixel value of each pixel in the normal image using a predetermined function, and output the cancellation image to the display circuit 6.
- The time-division processing unit 5 performs readout in each of the plurality of display periods obtained by dividing one frame period, so that one of the left-eye normal image, the left-eye cancellation image, the right-eye normal image, and the right-eye cancellation image is selectively output to the display circuit 6.
- The display circuit 6 includes a display panel in which a plurality of light-emitting elements, such as organic EL elements, liquid crystal elements, or plasma elements, are arranged in a matrix, drive circuits attached to the four sides of the display panel, and an element control circuit, and it drives the light-emitting elements according to the pixels constituting the image data stored in the frame memories 2a and 2b.
- the configuration register 7 is a non-volatile memory that stores the screen size, screen mode, manufacturer name, and model name.
- the display pattern generation unit 8 generates an intra-frame switching pattern that is a display pattern for realizing each of the multi-view mode and the multi-user mode.
- the intra-frame switching pattern determines whether a normal image or a cancellation image is displayed in each of a plurality of display periods obtained by dividing one frame.
- In the multi-view mode, the types of normal image include a left-eye image L, a right-eye image R, and a planar-view dedicated image 2D.
- In the multi-user mode, the normal images include an image A for user A and an image B for user B. If the number of divisions is four, four display periods are obtained within the frame.
- Each of the four display periods is set as a display period 1, a display period 2, a display period 3, and a display period 4, and either a normal image or a cancellation image can be assigned to each display period.
- The total number of normal images allocated within one frame and the total number of cancellation images allocated within one frame must be the same.
- Since the normal image and the cancellation image are to be displayed in the display periods obtained by dividing one frame period, it must be determined, before the display period assigned to a combination of view and user arrives, which image is the normal image to be displayed and which is the cancellation image.
- the synchronization signal transmission unit 9 creates and transmits a synchronization signal according to the intra-frame switching pattern.
- the transmitted synchronization signal defines how to set the states of the left eye shutter and the right eye shutter in the shutter glasses of each user in each display period within one frame.
- In the multi-view mode, there is basically one user, and the shutter state of the shutter-type glasses worn by that user is switched between the left eye and the right eye.
- In the multi-user mode, there are a plurality of users, but the open/closed setting is common to the left eye and the right eye. That is, the synchronization signal transmitted in the multi-user mode switches the states of the left-eye shutter and the right-eye shutter of the shutter glasses worn by each user on a per-user basis.
- the synchronization signal transmission unit 9 transmits the synchronization signal with a shutter-type spectacle identifier attached thereto.
- This shutter-type spectacle identifier specifies the shutter-type spectacles to which the synchronization signal is to be applied.
- The control unit of the shutter-type glasses worn by each of a plurality of users acquires only the synchronization signal carrying its own identifier and ignores the others, so that each user can watch a different video. This completes the description of the internal configuration of the display device.
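- As a rough illustration only, an intra-frame switching pattern and the per-period synchronization signal could be represented as follows; the structure and names are hypothetical, and the four-period schedule shown corresponds to pattern 1 described below (multi-user mode, shutters common to both eyes):

```python
from dataclasses import dataclass
from typing import Dict, Tuple

OPEN, CLOSED = True, False

@dataclass
class PeriodEntry:
    image: str                                # image shown in this display period
    shutters: Dict[str, Tuple[bool, bool]]    # per-glasses (left, right) shutter states

# Pattern 1: one frame divided into four display periods (multi-user mode).
pattern1 = [
    PeriodEntry("normal A", {"glasses_A": (OPEN, OPEN),     "glasses_B": (CLOSED, CLOSED)}),
    PeriodEntry("cancel A", {"glasses_A": (CLOSED, CLOSED), "glasses_B": (CLOSED, CLOSED)}),
    PeriodEntry("normal B", {"glasses_A": (CLOSED, CLOSED), "glasses_B": (OPEN, OPEN)}),
    PeriodEntry("cancel B", {"glasses_A": (CLOSED, CLOSED), "glasses_B": (CLOSED, CLOSED)}),
]

def sync_signal(period_index: int):
    """Shutter states that the synchronization signal would carry for one display period."""
    return pattern1[period_index].shutters

print(sync_signal(0))   # glasses A open, glasses B closed while normal image A is shown
```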
- FIG. 5 shows synchronization with the shutter-type glasses by the synchronization signal in pattern 1 and time-division display by the time-division processing unit.
- In FIG. 5A, it can be seen that the normal image A, the cancellation image A, the normal image B, and the cancellation image B are sequentially provided for time-division display.
- To a viewer without glasses, each normal image and its cancellation image appear to be displayed at the same time, so that the images A and B are all erased by being superimposed.
- FIG. 5B shows a synchronization signal transmitted by the synchronization signal transmission unit.
- For the shutter glasses A, the shutters are open during the first quarter-frame display period and closed during the remaining display periods. As a result, the user wearing the shutter glasses A sees only the image A.
- FIG. 5C shows the synchronization control for the shutter-type glasses B.
- In FIG. 5C, it can be seen that the shutters are open only during the third quarter-frame period and closed otherwise. As a result, only the image B can be seen.
- Those who do not wear shutter-type glasses see all the images, so the normal images A and B and the cancellation images A and B are superimposed by the time-division display, and such viewers perceive only a screen with no shading and recognize no image.
- The shutter glasses A open only when the normal image A is displayed and close otherwise. A person wearing the shutter glasses A sees only the normal image A and none of the other images, so the cancellation image A is not seen and the normal image A can be recognized.
- The shutter glasses B are open only when the normal image B is displayed and closed otherwise. A person wearing the shutter glasses B sees only the normal image B and none of the other images, so the cancellation image B is not seen and the normal image B can be recognized.
- Since the synchronization signal transmission unit 9 needs the user wearing the shutter glasses A to view the image A, it must create a synchronization signal in which, for the display period of the normal image A, the left eye and right eye of the shutter glasses A are open and the left eye and right eye of the shutter glasses B are closed.
- Conversely, for the display period of the normal image B, the synchronization signal transmission unit 9 needs to create a pattern in which the left eye and right eye of the shutter glasses A are closed and the left eye and right eye of the shutter glasses B are open. A synchronization signal indicating such a pattern is transmitted before the start of each display period arrives, and shutter switching is performed according to this pattern.
- Pattern 2 performs brightness adjustment.
- In FIG. 6A, it can be seen that the normal image A, the cancellation image A, the normal image A, and the cancellation image A are sequentially provided for time-division display. The normal image and the cancellation image appear to be displayed at the same time, so that the entire image is erased.
- (B) shows the synchronization signal transmitted by the synchronization signal transmission unit. The shutters are open during the first quarter-frame display period and closed during the remaining display periods. As a result, the user wearing the shutter glasses A sees only a low-brightness image.
- (C) shows the synchronization control for the shutter-type glasses B.
- In (C), it can be seen that the shutters are closed only during the second quarter-frame period and open otherwise. As a result, the image can be seen with high brightness.
- Pattern 3 is an example in which the image A can be seen when shutter glasses are not worn and the image B can be seen when shutter glasses are worn.
- FIG. 7 shows pattern 3, in which the image A is seen without the shutter-type glasses and the image B is seen when the shutter-type glasses are put on.
- The sequence normal image A → normal image B → cancellation image B is repeated and displayed at high speed.
- In FIG. 7A, it can be seen that a normal image A, a normal image B, a cancellation image B, a normal image A, a normal image B, and a cancellation image B are sequentially subjected to time-division display in the 1/6-frame periods obtained by dividing one frame into six.
- FIG. 7B shows viewing in a state where the shutter-type glasses are not worn.
- FIG. 7C shows a synchronization signal transmitted to the shutter-type glasses.
- The shutters are open during the second and fifth 1/6-frame display periods and closed during the remaining display periods. As a result, the user wearing the shutter-type glasses B sees only the image B.
- Without the glasses, the normal image B and the cancellation image B overlap and cannot be recognized, so only the normal image A can be recognized.
- With the shutter-type glasses worn, the shutter opens at the timing of the normal image B, so the normal image B can be viewed.
- Pattern 4 allows a user wearing shutter-type glasses to view a stereoscopic image, and shows a user not wearing shutter-type glasses one of the left-eye and right-eye images constituting the stereoscopic image.
- FIG. 8 shows the time-division display of the stereoscopic processing in pattern 4. A left-eye image L, a right-eye image R, and a right-eye cancellation image R are sequentially subjected to time-division display in the 1/6-frame periods obtained by dividing one frame into six.
- (B) shows viewing without wearing shutter-type glasses. The right-eye image R and the right-eye cancellation image are displayed at the same time, so that the right-eye image R is completely erased and only the left-eye image L is visible.
- (C) shows that the left-eye shutter is open during the first 1/6-frame display period, the right-eye shutter is open during the second 1/6-frame display period, the left-eye shutter is open during the fourth 1/6-frame display period, the right-eye shutter is closed during the fifth 1/6-frame display period, and the shutters are closed during all the remaining 1/6-frame display periods.
- In this way, the user wearing the shutter-type glasses can view the left-eye image L and the right-eye image R, and therefore sees a stereoscopic image.
- Pattern 5 is a pattern that shows a stereoscopic image to a user wearing shutter-type glasses and lets a user not wearing shutter-type glasses view a 2D image that is different from the left-eye and right-eye images constituting the stereoscopic image.
- A 2D image, a left-eye normal image L, a left-eye cancellation image, a right-eye normal image R, a right-eye cancellation image, and a 2D image are sequentially subjected to time-division display in the 1/6-frame periods obtained by dividing one frame into six.
- (B) shows viewing without shutter-type glasses. The left-eye normal image and the left-eye cancellation image, and the right-eye normal image and the right-eye cancellation image, appear to be displayed at the same time, so the left-eye and right-eye images are erased and only the 2D image is visible.
- (C) shows the synchronization signal transmitted to the shutter-type glasses.
- The left-eye shutter is open during the second 1/6-frame display period, the right-eye shutter is open during the fourth 1/6-frame display period, and the shutters are closed during all the remaining 1/6-frame display periods.
- the left-eye image and the right-eye image are alternately displayed, so that the user wearing the shutter-type glasses sees the stereoscopic video.
- The left-eye shutter opens when the left-eye normal image is displayed, and the right-eye shutter opens when the right-eye normal image is displayed, so the wearer can see a 3D image.
- The cancellation image creation units 4a and 4b are the components that form the core of the apparatus and play a particularly important role in this embodiment. Their internal configuration will now be described in more detail.
- the cancellation image creation units 4a and 4b are generation devices that generate a cancellation image and have an internal configuration as shown in FIG.
- FIG. 10 is a diagram illustrating an internal configuration of the cancellation image creation units 4a and 4b. As shown in the figure, the cancel image creation units 4a and 4b are composed of conversion storages 11a and 11b, computing units 12a and 12b, and delay circuits 13a and 13b.
- Since this apparatus assumes processing of a left-eye image and a right-eye image, some of the components in this internal configuration exist in pairs that have the same configuration but are used either for the left eye or for the right eye. Such components, which share a configuration but differ in use, are distinguished from other components by combining the same numeral with the letters a and b. Because these paired components have the same configuration, only their common processing will be described.
- The conversion formula storages 11a and 11b store a plurality of conversion formulas. These conversion formulas are associated with combinations of the display device size and the screen mode, and any one of them can be retrieved according to the combination of the current screen mode and screen size.
- One model of display device comes in various sizes, such as 50 inches, 42 inches, and 37 inches, and a unique conversion formula is associated with each of these screen sizes. Even for a given screen size, images can be displayed in various modes, such as a high-contrast mode, a smooth mode, and a movie mode. Therefore, the conversion formula storages 11a and 11b hold mathematical expression codes and correction parameters that specify conversion formulas having different orders and coefficients in accordance with each screen mode.
- The computing units 12a and 12b convert the luminance Y, red color difference Cr, and blue color difference Cb constituting the normal image into the pixel values of the cancellation image.
- the red color difference Cr and the blue color difference Cb are converted into inverted values.
- the luminance Y is converted into a pixel value of the cancellation image using the conversion formula (g (Y)) and correction parameters.
- In the conversion g(Y), the luminance Y(x, y) of the pixel located at an arbitrary coordinate (x, y) on the screen is converted by the conversion formula g_size,mode corresponding to the screen size of the display device 200 and the current screen mode of the display device 200.
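- A minimal sketch of this per-pixel conversion, assuming normalized values (Y in [0, 1], Cb and Cr in [-0.5, 0.5]); the overshoot function shown is a hypothetical placeholder, since the patent leaves g_size,mode device-dependent:

```python
# Per-pixel conversion performed by the computing units 12a and 12b (sketch only).
def cancellation_pixel(y, cb, cr, g):
    y_cancel = g(y)        # luminance converted by the conversion formula g(Y) and its parameters
    cb_cancel = -cb        # blue color difference is simply inverted
    cr_cancel = -cr        # red color difference is simply inverted
    return y_cancel, cb_cancel, cr_cancel

# Hypothetical overshooting g: slightly more than the simple complement 1 - Y, clipped to 1.
g_example = lambda y: min(1.0, 1.1 * (1.0 - y))
print(cancellation_pixel(0.3, 0.1, -0.2, g_example))
```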
- FIG. 11 shows the calculation principle for obtaining the inversion values of Y, Cr, and Cb.
- A shows a conversion matrix for converting R, G, B to Y, Cr, Cb.
- The transformation matrix is a 3 × 3 matrix whose elements are a, b, c, d, e, f, g, h, and i.
- B shows the values of the elements a, b, c, d, e, f, g, h, i of this determinant.
- C shows the inverted values of the R, G, and B elements.
- the inversion values are 1-R, 1-G, and 1-B.
- (D) shows the inversion values of Y, Cr, and Cb and the inversion values of R, G, and B.
- The inverted values of the luminance Y, red color difference Cr, and blue color difference Cb can be obtained by converting the inverted R, G, B values of the pixel using the conversion matrix with a, b, c, d, e, f, g, h, and i as elements.
- (E) shows the relationships among the matrix elements a, b, c, d, e, f, g, h, and i.
- (F) and (G) show the relationships between the inverted values of Y, Cr, and Cb and the values R, G, B, Y, Cr, and Cb.
- the sum of Y and its inverted value is 1, the sum of the color difference Cb and its inverted value is 0, and the sum of the color difference Cr and its inverted value is 0.
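- These relations follow directly from the matrix coefficients; a compact restatement, assuming (as in the standard R,G,B to Y,Cr,Cb matrix, and presumably what (E) expresses) that the Y-row coefficients sum to 1 and the Cr- and Cb-row coefficients each sum to 0:

```latex
\begin{aligned}
Y'   &= a(1-R) + b(1-G) + c(1-B) = (a+b+c) - Y   = 1 - Y,\\
C_r' &= d(1-R) + e(1-G) + f(1-B) = (d+e+f) - C_r = -C_r,\\
C_b' &= g(1-R) + h(1-G) + i(1-B) = (g+h+i) - C_b = -C_b,
\end{aligned}
\qquad \text{given } a+b+c = 1,\ \; d+e+f = g+h+i = 0 .
```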
- As FIG. 11(g) suggests, in order to cancel the normal image by time-division display, it would suffice to obtain the inverted values of the luminance Y, red color difference Cr, and blue color difference Cb constituting the normal image and produce a cancellation image whose pixel values are these inverted values. However, the actual pixel brightness does not change linearly with respect to the luminance data.
- FIG. 12 shows the theoretical change of the brightness on the screen for the data in the numerical range of 0 to 255 and the actual change.
- FIG. 12A is a graph showing the change in brightness, in which the horizontal axis is the range of luminance values in the data, from 0 to 255, and the vertical axis is the expected brightness.
- As shown in the figure, the ideal change is a proportional relationship in which the screen becomes proportionally brighter as the luminance increases.
- the graph in FIG. 12B shows the actual change.
- the change in brightness in this figure shows that the actual screen brightness changes non-linearly with respect to the change in the brightness value in the data from 0 to 255.
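- One common reason for such non-linearity is the display transfer function (gamma); the figures below are only an illustrative assumption (a gamma of about 2.2) and are not values stated in the patent:

```python
# Illustrative only: with a gamma-like response, the simple complement 255 - Y does not
# give a constant combined brightness, which is why the overshoot correction is needed.
GAMMA = 2.2

def perceived(code):
    """Relative light output for an 8-bit code value under the assumed gamma."""
    return (code / 255) ** GAMMA

for y in (0, 64, 128, 192, 255):
    combined = perceived(y) + perceived(255 - y)   # normal + naive complement, time-averaged
    print(y, round(combined, 3))                   # not constant across y
```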
- FIG. 13 shows the setting of a theoretical cancellation image, the content of a realistic cancellation image, and measures to be taken as countermeasures in association with each other.
- FIG. 13A shows expected changes in luminance of the normal image and luminance of the cancellation image with respect to the coordinates.
- The sum of the luminances of the normal image and the cancellation image is set to 255. If the luminance of the normal image is 0, the luminance of the cancellation image is 255; if the luminance of the normal image is 128, the luminance of the cancellation image is 127; if the luminance of the normal image is 255, the luminance of the cancellation image is 0. In this case, the luminance of the normal image increases monotonically with respect to the coordinates and the luminance of the cancellation image decreases monotonically with respect to the coordinates. In this way, the brightness of the superimposed screen is expected to be constant with respect to coordinate changes.
- While FIG. 13(a) shows the theoretical change, FIG. 13(b) shows the actual changes in the luminance of the normal image and of the cancellation image with respect to the coordinates.
- The brightness of the normal image increases along a curve, as shown by the curve cv2, with respect to the coordinates on the screen, and the brightness of the cancellation image likewise decreases along a curve, as shown by the curve cv1.
- the change in the superimposed image due to the time-division display of the normal image and the cancellation image becomes a U-shaped curve as indicated by cv3 in the figure and does not become constant.
- Because the apparent brightness of the superimposed image produced by the time-division display of the normal image and the cancellation image is not constant, the outline of the normal image appears faintly when the normal image and the cancellation image are switched alternately, and the pattern in the normal image can be seen to some extent.
- FIG. 13C shows an ideal form of luminance change of the cancellation image.
- The change in the pixels of the normal image is as shown by the curve cv3.
- A change that is symmetrical to the change of the normal image about the center of the vertical axis is defined as the change of the cancellation image.
- the change of the cancellation image is axisymmetric, the brightness of the superimposed image obtained by time-division display of the normal image and the cancellation image becomes constant.
- FIG. 14 shows a case where, when the luminance of the normal image changes in the range of 0 to 255, the luminance of the cancellation image is changed so as to overshoot the straight line indicating that luminance change.
- The overshooting change is produced by setting, as the luminance of the cancellation image, a value larger than the difference obtained by subtracting the luminance of the normal image from the maximum luminance.
- The curve in FIG. 14 depends heavily on the screen characteristics of the display device. In the display device, the relationship between the signal value and the amount of energy actually given to a dot is adjusted according to the panel characteristics or the mode; even where it is linearly proportional rather than a nonlinear function, it does not necessarily appear linear to the human eye. It is therefore desirable to derive this curve empirically.
- The pattern of the normal image can thus be canceled by using, as the luminance of the cancellation image, a value larger than the difference obtained by subtracting the luminance of the normal image from the maximum luminance.
- The change of the cancellation image may be any change as long as it satisfies the requirement of being larger than the difference obtained by subtracting the luminance of the normal image from the maximum luminance, and can be defined by an n-th order function of the luminance. How the overshooting luminance change of the cancellation image should be defined varies with the screen mode and the screen size of the display device. Therefore, in the present embodiment, a plurality of conversion formulas having different coefficients and orders and representing such n-th order functions are stored in advance, and a combination of screen mode and screen size is assigned to each of them so as to adapt to the current screen mode and screen size.
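- A rough sketch of selecting such an n-th order overshoot formula by screen size and screen mode; the table keys, coefficients, and clipping below are illustrative assumptions only, not values from the patent:

```python
# Hypothetical conversion-formula table keyed by (screen size, screen mode).
CONVERSION_FORMULAS = {
    ("50inch", "movie"):   [0.00, 1.15, -0.10],   # g(Y): polynomial in (max_y - Y)
    ("42inch", "dynamic"): [0.02, 1.20, -0.15],
}

def cancellation_luminance(y, size, mode, max_y=1.0):
    """Cancellation luminance for a normal-image luminance y, per the selected formula."""
    coeffs = CONVERSION_FORMULAS[(size, mode)]
    comp = max_y - y
    g = sum(c * comp ** k for k, c in enumerate(coeffs))   # n-th order function of the complement
    # Keep the result above the plain complement (the overshoot requirement) and in range.
    return min(max_y, max(g, comp))

print(cancellation_luminance(0.25, "50inch", "movie"))
```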
- the display device can be industrially produced by embodying each component in the display device as described above with a hardware integrated element such as an ASIC.
- Alternatively, a general-purpose computer architecture consisting of a CPU, code ROM, and RAM can be adopted: a program in which the processing procedure of each component described above is written in computer code is pre-installed in the code ROM, and the CPU in the hardware-integrated device executes the processing procedure of this program. The processing procedure required for such a software implementation, when a general-purpose computer system architecture is adopted, will now be described.
- FIG. 15 is a main flowchart of the processing procedure of the display device. Steps S1 to S2 form a loop. Step S1 is a determination of whether or not a screen mode has been set, and step S2 is a determination of whether or not a multi-view mode or a multi-user mode has been set. If the mode is set, a setup menu is displayed in step S3, and an operation is accepted in step S4. Thereafter, after setting contents are written in the configuration register in step S5, the process returns to the loop of step S1 to step S2.
- In step S6, a mathematical expression code specifying the conversion formula corresponding to the current screen mode and screen size, together with a correction parameter, is set in the cancellation image generation unit, and the process proceeds to step S7.
- Step S7 waits until the start of an intra-frame display period arrives. When the start time arrives, the image to be displayed is identified from among A, B, L, R, and 2D according to the intra-frame switching pattern in step S8, and in step S9 it is determined whether the image to be displayed is a normal image of A, B, L, R, or 2D.
- If it is a normal image, the normal image of A, B, L, R, or 2D is output to the display unit in step S10, and in step S13 a synchronization signal designating the left-eye shutter state and the right-eye shutter state for each user is transmitted to each user. Thereafter, in step S14, it is determined whether or not the multi-view mode or the multi-user mode has ended; if not, the process returns to step S7. If the image to be displayed is not a normal image of A, B, L, R, or 2D, the luminance of the corresponding normal image is converted in step S11 using the conversion formula that was set, a cancellation image is obtained, and in step S12 the cancellation image is output to the display circuit.
- Then, in step S13, a synchronization signal designating the left-eye shutter state and the right-eye shutter state is transmitted to each user, and in step S14 it is determined whether or not the mode has ended; if not, the process returns to step S7.
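- A minimal sketch of the main loop of FIG. 15 (steps S7 to S14); the function names and the pattern representation are illustrative, not structures taken from the patent:

```python
def display_main_loop(switching_pattern, conversion_formula, get_image, display, send_sync, mode_active):
    """Rough outline of steps S7-S14 of the display-device flowchart (sketch only)."""
    while mode_active():                                         # S14: loop until the mode ends
        entry = switching_pattern.wait_next_period()             # S7: wait for the next display period
        image_id, is_normal = entry.image_id, entry.is_normal    # S8/S9: which image, normal or cancellation?
        if is_normal:
            display(get_image(image_id))                         # S10: output the normal image
        else:
            normal = get_image(image_id)
            display(conversion_formula.to_cancellation(normal))  # S11-S12: convert and output the cancellation image
        send_sync(entry.shutter_states)                          # S13: per-user shutter states for this period
```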
- When the two or more viewpoint image data constituting a stereoscopic video are a set of a left-eye image and a right-eye image, the left-eye normal image, the left-eye cancellation image, the right-eye normal image, and the right-eye cancellation image are displayed in a time-division manner within one frame period, so that the contents of the left-eye and right-eye images are canceled by the cancellation images when viewing without shutter-type glasses.
- The user perceives this as an image with uniform brightness over the entire screen, so even if the generation device is installed in a store as a multi-view compatible display device, it does not make the user feel uncomfortable.
- The video is shown to the user wearing the shutter-type glasses and is concealed from other users. In this way, only a specific user wearing shutter-type glasses can view the stereoscopic video.
- The brightness of the screen can be controlled by setting how many of the plurality of display periods associated with the cancellation image are made shutter-open periods.
- the invention of the generation device described in the present embodiment is a generation device that generates an image to be viewed by a user wearing spectacles.
- It comprises generation means for generating a cancellation image for the acquired normal image. The glasses are worn by the user when viewing one of a plurality of images displayed in a time-division manner in the frame period of the video signal,
- the normal image and the cancellation image are images to be subjected to the time-division display,
- the luminance of each pixel constituting the cancellation image is set to a value larger than a difference value obtained by subtracting the luminance of each pixel in the normal image from the maximum value in the numerical value range of luminance.
- When the normal images to be displayed by the display device in correspondence with multi-view are a combination of a left-eye image and a right-eye image, the left-eye normal image, the left-eye cancellation image, the right-eye normal image, and the right-eye cancellation image are displayed in a time-division manner within one frame period, so that the contents of the left-eye and right-eye normal images are canceled by the cancellation images when viewing without glasses. In this way, the user perceives the display as a video with uniform brightness over the entire screen, so that even when a multi-view display device is installed in a store, the user does not feel uncomfortable. Product manufacturers can thus send new products to market with the prospect of establishing a corporate brand image and acquiring market share, and the invention of the above generation device therefore brings various contributions to domestic industry.
- The video is shown to the user wearing the glasses and can be concealed from other users. In this way, only a specific user wearing glasses can view the stereoscopic video.
- Here, the glasses may be shutter glasses,
- the generation device may be a display device,
- and a transmission unit that transmits to the glasses a synchronization signal defining the open/closed state of the left-eye shutter and of the right-eye shutter of the glasses may be provided.
- The normal images may include an image for a user who wears glasses and an image for a user who does not wear glasses,
- the appearance frequencies of the normal image and the cancellation image may be equal,
- and the synchronization signal transmitted by the transmission unit may set both the left-eye shutter state and the right-eye shutter state to closed during the cancellation image display periods. This provides a viewing style in which a user without glasses sees a 2D image while a user wearing glasses sees a 3D image.
- FIG. 16 shows the internal configuration of the inversion calculators 12a and 12b according to the second embodiment. This figure is based on the internal configuration shown in FIG. 10, and differs from that base configuration in that image composition units 144a and 144b are newly added. The added components are described below.
- the space division display units 14a and 14b realize space division display by switching between a normal image and a cancellation image for each checkerboard and for each line in a part of an area obtained by dividing the display screen.
- a line is a rectangular area composed of pixels in a horizontal row on the screen, and a checkerboard is a small area obtained by dividing the screen into minute rectangles.
- the normal image and the cancellation image can be superimposed by displaying the normal image and the cancellation image for each checkerboard / line. Therefore, when the screen of the display device is viewed without wearing the shutter-type glasses, the brightness of the screen becomes uniform and nothing can be seen.
- By controlling the shutter state of the shutter-type glasses so that, of the normal image and the cancellation image arranged per checkerboard or line, only the normal image is transmitted, the user wearing the shutter-type glasses can watch the normal image.
- The cancellation image generation units 4a and 4b realize partial time-division display by converting only some of the pixels of the normal image based on the conversion formula. In this way, a cancellation image in which only some of the pixels are replaced with cancellation pixels can, together with the normal image, be made the target of time-division display.
- Cancellation of a partial image is realized by making such a partially replaced cancellation image the target of time-division display.
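- A minimal sketch of building such a partially replaced image for space division (checkerboard/line) or partial cancellation; the block size and region choice are illustrative assumptions:

```python
import numpy as np

def partial_cancellation(normal, cancel, mode="checkerboard", block=8, region=None):
    """Return an image in which cancellation pixels replace normal pixels.

    normal, cancel : HxW arrays of luminance values.
    mode           : "checkerboard" or "line" replacement pattern.
    region         : optional (row_start, row_end) limiting replacement,
                     e.g. only the lower half of the screen.
    """
    h, w = normal.shape
    ys, xs = np.indices((h, w))
    if mode == "checkerboard":
        mask = ((ys // block + xs // block) % 2).astype(bool)   # alternate small rectangles
    else:
        mask = (ys % 2).astype(bool)                            # alternate horizontal lines
    if region is not None:
        limit = np.zeros((h, w), dtype=bool)
        limit[region[0]:region[1], :] = True
        mask &= limit                                           # restrict to the chosen area
    return np.where(mask, cancel, normal)

out = partial_cancellation(np.full((4, 4), 0.8), np.full((4, 4), 0.2), block=1)
print(out)
```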
- FIG. 17 shows the visual effect due to partial cancellation.
- In one half of the screen, pixels of 100% and 0% luminance are the target of the partial time-division display, while in the other half, pixels of 50% and 50% luminance are the target.
- Although both halves average to the same luminance, the user feels that the upper half is brighter than the lower half. This is due to visual characteristics and to the correction applied by the display device.
- By alternately displaying 100% and 0% pixels, a display that can withstand viewing can be realized. The same applies to the case of space division.
- The superimposed image obtained by displaying the normal image and the cancellation image in a time-division manner appears brighter than the result of overlapping two images each at 50% luminance.
- the image corresponding to A is a normal image
- the image corresponding to B is a cancellation image
- C is a superimposed image by time-division display.
- (B) shows, in the form of a mathematical expression, the superimposition of the normal image and the cancellation image when the lower half of the screen is the target of high-speed switching. As shown on the right side of this expression, the lower half is partially erased. As shown in FIG. 18B, only a part of the screen can be made unrecognizable by generating the cancellation image only for the lower part of the screen.
- In the present embodiment, since only a part of the normal image is the target of image switching, it is possible, for example in quiz video content, to conceal only the quiz answer from viewers who are not wearing shutter-type glasses.
- The present embodiment relates to an improvement in which the playback device selects the conversion formula used by the display device 200 for creating the cancellation image.
- The playback device identifies the display device to be connected by, for example, model information (model number information), acquires the currently selected screen mode from the display device, and has the cancellation image generated in accordance with the mathematical expression code and the correction parameter that specify a conversion formula suitable for that display device and screen mode.
- FIG. 19 shows the internal configuration of the playback apparatus to which improvements specific to this embodiment are added.
- FIG. 19 is a diagram illustrating the internal configuration of the playback apparatus according to the third embodiment. Since this apparatus assumes processing of a left-eye image and a right-eye image, some of the components in this internal configuration exist in pairs that have the same configuration but are used either for the left eye or for the right eye. Such components, which share a configuration but differ in use, are distinguished from other components by combining the same numeral with the letters a and b. Because these paired components have the same configuration, only their common processing will be described.
- The playback apparatus of FIG. 19 is configured from a disc drive 21, a local storage 22, a demultiplexer 23, a left-eye video decoder 24a, a right-eye video decoder 24b, a left-eye plane memory 25a, a right-eye plane memory 25b, a configuration register 26, a communication control unit 27, and an inter-device interface 28.
- the disc drive 21 is loaded with a disc medium on which content constituting a stereoscopic video is recorded, and executes reading / writing on the recording medium.
- The recording medium may be of various types, such as read-only media, rewritable removable media, and rewritable built-in media.
- the playback device also includes a random access unit.
- the random access unit executes random access from an arbitrary time point on the time axis of the video stream.
- the video stream includes a normal video stream and a multi-view video stream.
- the multi-view video stream is a stereoscopic video stream composed of a base-view video stream and a dependent-view video stream.
- Specifically, when playback from an arbitrary time point on the time axis of the video stream is instructed, the random access unit searches for the source packet number of the access unit corresponding to that time point, using an entry map that is one piece of the scenario data. Such an access unit contains picture data that can be decoded independently, or a set of view components.
- the view component is a component that forms a stereoscopic video, and each of one right-eye video and one left-eye video corresponds to the view component.
- By this search, the source packet number of the source packet storing the access unit delimiter of that access unit is identified, and reading from that source packet number and decoding are executed. In a scene jump, random access is performed by executing this search using time information indicating the branch destination.
- a conversion formula reference table describing a mathematical expression code for specifying a conversion formula and correction parameters is read from an optical disc such as a Blu-ray through the disc drive 21 and used for generating a cancellation image.
- the local storage 22 serves as a receptacle for a conversion formula reference table in which a mathematical expression code for specifying a conversion formula and correction parameters are described, and the stored information in the local storage 22 is always updated with the latest information.
- the demultiplexer 23 performs demultiplexing on the input stream and outputs a plurality of types of packetized elementary streams.
- the elementary streams output in this way include a video stream, a subtitle graphics stream, an interactive graphics stream, and an audio stream, and the video stream is output to the left-eye video decoder 24a and the right-eye video decoder 24b.
- the subtitle graphics stream and the interactive graphics stream are sent to a dedicated graphics decoder (not shown) corresponding thereto, and the audio stream is sent to an audio decoder (not shown).
- the left video decoder 24a decodes left-eye image data which is a view component constituting the base view video stream.
- the right-eye video decoder 24b decodes right-eye image data that is a view component constituting the dependent-view video stream.
- Each of the left-eye video decoder 24a and the right-eye video decoder 24b includes a coded data buffer and a decoded data buffer, preloads the view components constituting the dependent-view video stream into the coded data buffer, and then decodes the IDR-type view component located at the head of the closed GOP of the base-view video stream.
- After decoding the IDR-type view component in this way, the left-eye video decoder 24a and the right-eye video decoder 24b decode the subsequent view components of the base-view video stream, and the view components of the dependent-view video stream that have been compression-encoded based on their correlation with those view components. When uncompressed picture data for a view component is obtained by decoding, the picture data is stored in the decoded data buffer and used as a reference picture.
- the left-eye video decoder 24a and the right-eye video decoder 24b perform motion compensation for the subsequent view component of the base-view video stream and the view component of the dependent-view video stream. If uncompressed picture data is obtained for the subsequent view component of the base-view video stream and the view component of the dependent-view video stream by motion compensation, these are stored in the decoded data buffer and used as reference pictures.
- the above decoding is performed when the decoding start time indicated in the decoding time stamp of each access unit arrives.
- the left eye plane memory 25a stores uncompressed left eye picture data obtained by decoding of the left eye video decoder 24a.
- the right eye plane memory 25b stores uncompressed right eye picture data obtained by decoding by the right eye video decoder 24b.
- the configuration register 26 stores the conversion formula reference table when the conversion formula reference table is read from the disk medium.
- This conversion formula reference table shows a plurality of conversion formulas in association with combinations of model name and screen mode.
- Whereas the conversion formula was previously associated with a combination of screen size and screen mode, the conversion formula reference table of this embodiment associates the conversion formula with a combination of the model name of the display device and the screen mode.
- The conversion formula reference table of this embodiment is defined by the creator of the movie work. Since the creator of the movie work does not grasp the detailed characteristics of the display device as its manufacturer does, the correspondence with the conversion formula is simplified under the assumption that one display device model has one screen size.
- For example, one conversion formula is associated with the combination of model A and screen mode B, and another with the combination of model C and screen mode D.
- The communication control unit 27 acquires from the display device the model name of the connected display device and the currently selected screen mode, selects from the conversion formula reference table the conversion formula that matches them, and sets it in the display device through the inter-device interface 28.
- The inter-device interface 28 transfers decoded video and audio through, for example, a multimedia cable compliant with the HDMI standard, a composite cable, or a component cable.
- HDMI can carry various property information in addition to video.
- When the multimedia cable interface in the inter-device interface 28 is used instead of a network interface, the performance information of the device that executes the display processing is conveyed through the multimedia cable interface.
- By subjecting the left-eye image obtained by decoding the base-view video stream and the right-eye image obtained by decoding the dependent-view video stream to cancellation image generation, the cancellation described above is realized for stereoscopic video.
- The playback device can be industrially produced by embodying each of the components described above with hardware elements.
- The playback device can also be implemented in software. That is, a program describing the processing procedure of each of the components above in computer code is pre-installed in a code ROM, and the device can likewise be industrially produced by having a single processing unit (CPU) in the hardware configuration of the device execute the processing procedure of this program.
- a processing procedure necessary for software implementation of the apparatus will be described with reference to a flowchart.
- FIG. 20 is a flowchart showing a display device (display) initialization procedure.
- In step S21, the conversion formula reference table is read, and in step S22, connection with a display device is attempted. If the connection succeeds, an acquisition request for the model name and screen mode of the connected display device is transmitted in step S23. Step S24 then waits for reception of the model name and screen mode. When they are received, the conversion formula that matches the acquired model name and screen mode is obtained in step S25 from the conversion formulas in the conversion formula reference table in the configuration register, the conversion parameters are transmitted to the display device and set on the display device side (step S26), and it is determined whether or not the setting succeeded (step S27). If it succeeded, the cancellation image generation unit of the display device is caused to generate cancellation images using the conversion formula.
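The flowchart can be condensed into the following sketch. The method names on the `display` object are my own shorthand for steps S22 to S27 and do not appear in the patent; the table is assumed to have already been read in step S21.

```python
# A condensed sketch of the initialization procedure of FIG. 20 (steps S21-S27).
# display.* method names and the table format are assumptions for illustration.

def initialize_display(display, conversion_table):
    if not display.connect():                                # S22: attempt connection
        return False
    model, mode = display.request_model_and_mode()           # S23/S24: request and wait
    formula = conversion_table.get((model, mode))            # S25: pick matching formula
    if formula is None:
        return False
    ok = display.set_conversion_parameters(formula)          # S26/S27: transmit and check
    if ok:
        display.enable_cancellation_image_generation()       # generate using the formula
    return ok
```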
- According to the present embodiment, the playback device that reads image data from the optical disc generates the cancellation images. Because the playback device operates according to the application loaded from the optical disc, cancellation image generation is carried out in accordance with the author's intention, and the quality of the cancellation images can be further improved.
- Since the table recorded on the optical disc is read and a conversion formula described in this table is selected, the one optimal for the display device can be chosen, so the intention of the content creator can be reflected in the cancellation images. Because the intent of the creator, who knows the design and colors of the content well, is reflected in the conversion formula, the cancellation by the cancellation images can be made to look cleaner.
- The invention described in this embodiment adds the following limitations to the invention of the generating device described in the first embodiment: the generating device is a playback device and includes reading means for reading from a recording medium a conversion formula reference table in which a plurality of conversion formulas are associated with combinations of screen size and screen mode,
- and the generating means extracts from the conversion formula reference table a conversion formula corresponding to the combination of the model name of the connected display device and the screen mode, and generates a cancellation image based on the extracted conversion formula.
- When the display device has various modes such as a high-contrast mode and a movie mode, an optimal conversion formula can be selected according to the characteristics of these modes. As a result, the inconvenience that the superposition of the left and right images becomes visible as soon as the mode changes can be avoided. Since the playback device reads the table from the recording medium and generates cancellation images based on a conversion formula described in this table, the luminance of the cancellation image can be obtained by converting the luminance of the normal image using a conversion formula that follows the author's intention. Also, since the table recorded on the recording medium is read and the conversion formula described in this table that best suits the display device is selected, the intention of the content creator can be reflected in the cancellation images. Because the intent of the creator, who knows the design and colors of the content well, is reflected in the conversion formula, the cancellation by the cancellation images can be made to look cleaner.
- FIG. 21 shows the internal structure of the playback apparatus according to the fifth embodiment. This figure is drawn based on the internal configuration of FIG. 19 and is new in that the following components are newly added compared with that base configuration.
- Cancellation image creation units 29a and 29b that create cancellation images for the images stored in the plane memories, and
- time-division processing units 30a and 30b that output to the display device, for time-division display, the normal images stored in the plane memories and the cancellation images created by the cancellation image creation units, are added.
- In this way, since the playback device reads the reference table from the recording medium and generates cancellation images based on a conversion formula described in the conversion formula reference table, the luminance of the cancellation image can be obtained by converting the luminance of the normal image using a conversion formula that follows the author's intention.
- In the first embodiment, time-division display was realized only for the video, but the present embodiment relates to an improvement for realizing space-division display for video with subtitles.
- In the present embodiment, the images to be subjected to time-division display within one frame period include images in which subtitles are combined and images in which cancellation subtitles are combined.
- FIG. 22 shows the internal structure of the playback apparatus according to the fifth embodiment.
- the internal configuration of this figure is drawn based on the internal configuration diagram in the fourth embodiment, and is different from the base configuration in that constituent elements belonging to the caption system are added.
- The added subtitle system includes a subtitle decoder 31 that decodes subtitles, a subtitle plane memory 32 that stores the bitmaps obtained by subtitle decoding, a plane shift unit 33 that obtains a left-eye subtitle and a right-eye subtitle by plane-shifting the bitmap stored in the subtitle plane memory,
- cancellation subtitle creation units 34a and 34b that obtain a left-eye cancellation subtitle and a right-eye cancellation subtitle, which are cancellation images for the left-eye subtitle and the right-eye subtitle obtained by the plane shift,
- time-division processing units 35a and 35b that output in a time-division manner the left-eye subtitle and right-eye subtitle or the left-eye cancellation subtitle and right-eye cancellation subtitle, and
- combining units 36a and 36b that combine the left-eye subtitle and right-eye subtitle, or the left-eye cancellation subtitle and right-eye cancellation subtitle, with the left-eye image and the right-eye image.
- the cancellation subtitle creation units 34a and 34b create cancellation subtitles with the same configuration as the cancellation image creation units 4a and 4b shown in the first embodiment.
- The cancellation subtitles can be created on the same principle as in the cancellation image creation units 4a and 4b shown in the first embodiment because the luminance constituting subtitles has the same visual characteristics as the luminance of images, as described in the first embodiment.
- the caption decoder will be described in detail.
- the subtitle decoder includes a graphics decoder and a text subtitle decoder.
- The graphics decoder includes a "coded data buffer" that stores functional segments read from the graphics stream, a "stream processor" that obtains objects by decoding, an "object buffer" that stores the objects obtained by decoding, a "composition buffer" that stores the screen composition segments that define the screen configuration of the graphics, and a "composition controller" that decodes the screen composition segments stored in the composition buffer and, based on the control items in these screen composition segments, performs screen composition on the plane using the objects stored in the object buffer.
- The text subtitle decoder includes a "subtitle processor" that separates the text code and the control information from the subtitle description data present in the text subtitle stream, a "management information buffer" that stores the text code separated from the subtitle description data, an "object buffer", a "drawing control unit" that controls text subtitle reproduction along the time axis using the control information separated from the subtitle description data, a "font preload buffer" that preloads font data, a "transport stream (TS) buffer" that adjusts the input speed of the TS packets making up the text subtitle stream, and a "subtitle preload buffer" that preloads the text subtitle stream prior to play item playback. This completes the description of the subtitle decoder. Next, details of the display device in the present embodiment will be described.
- FIG. 23 is a diagram showing an internal configuration of the display device according to the fifth embodiment.
- the internal configuration in this figure is drawn based on the internal configuration diagram in the first embodiment, and is different from the base configuration in that an audio processing system is added.
- The added audio processing system includes a 1st audio decoder 41 that decodes a first audio stream, a 2nd audio decoder 42 that decodes a second audio stream, a phase inverter 43 that inverts the phase of the uncompressed audio data output by the 2nd audio decoder,
- an audio output unit 44 that causes the speaker to output the audio produced by the 1st audio decoder and the audio produced by the 2nd audio decoder, a speaker 45, and
- an audio data transmitting unit 46 that transmits the phase-inverted uncompressed audio data to the shutter-type glasses as cancelling sound data capable of cancelling the sound output from the display device, and causes the shutter-type glasses to output the transmitted cancelling sound data.
- the synchronization signal transmission unit 13 transmits a special synchronization signal.
- This special synchronization signal is a signal for executing control to close the shutter of the shutter-type glasses during the display period of the image in which the cancellation subtitle is synthesized. The above is the description of the display device. Next, details of the shutter glasses will be described.
- FIG. 24 shows the internal configuration of the shutter-type glasses.
- The shutter glasses include a synchronization signal receiving unit 51 that receives the synchronization signal transmitted from the display device, a shutter control unit 52 that opens and closes the left-eye shutter and the right-eye shutter according to the received synchronization signal,
- an audio receiving unit 53 that receives the audio data transmitted from the display device, and speakers 54a and 54b that output the received sound.
- FIG. 25 shows time-division display for images with subtitles and images without subtitles.
- In FIG. 25(a), it can be seen that an image with English subtitles, an image with cancellation subtitles, an image with English subtitles, and an image with cancellation subtitles are provided for time-division display in each of the 1/4 frames obtained by dividing one frame into four.
- FIG. 25(b) shows viewing without shutter-type glasses. The images with English subtitles and the images with cancellation subtitles are displayed within the same frame period and superposed, so all subtitles are erased.
- FIG. 25(c) shows the synchronization signal transmitted by the synchronization signal transmission unit. The display periods of the first 1/4 frame and the third 1/4 frame are open, and the remaining display periods are closed. Since the shutter is closed during the periods in which the cancellation subtitles are combined with the image, a user wearing the shutter-type glasses B can view the image with English subtitles.
- FIG. 26 shows time-division display for captioned images and audio for a specific language.
- In FIG. 26(a), it can be seen that, in each of the 1/8 frames obtained by dividing one frame into eight, an image without subtitles, an image with English subtitles, an image without subtitles, and an image with English subtitles are provided for time-division display.
- The lower part of FIG. 26(a) shows the audio output from the display device. As shown there, only Japanese speech is output from the display device.
- FIG. 26(b) shows the synchronization signal for the shutter glasses A. It can be seen that the first 1/4 frame and the third 1/4 frame are open, and the remaining 1/4 frames are closed. Since the shutters of the shutter glasses A are closed during the subtitle display periods, the subtitles are not visible, and a user viewing through the shutter glasses A sees only the video.
- FIG. 26 (c) shows a synchronization signal transmitted by the synchronization signal transmission unit.
- The display periods of the second 1/4 frame and the fourth 1/4 frame are open, and the remaining display periods are closed.
- The lower part of FIG. 26(c) shows the audio data to be transmitted to the shutter-type glasses B.
- Sound in reverse phase with respect to the Japanese speech, together with the English speech, is transmitted to the shutter-type glasses B.
- The reverse-phase sound cancels the sound output from the display device. This is the same principle as a noise canceller.
- As a result, the Japanese speech from the display device is cancelled, and a user viewing through the shutter-type glasses B hears only the English speech.
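The noise-canceller principle mentioned above can be illustrated numerically. The sample values below are invented purely for the sketch; they are not data from the patent.

```python
# Toy illustration of the anti-phase cancellation principle (sample values invented).
# The earphone plays the English speech plus the phase-inverted Japanese speech,
# so adding the speaker output (Japanese speech) leaves only the English speech.

japanese = [0.2, -0.1, 0.4, 0.0, -0.3]   # speaker output from the display device
english  = [0.1,  0.3, -0.2, 0.5, 0.0]   # second audio stream sent to the glasses

anti_phase = [-s for s in japanese]       # output of the phase inverter 43

heard = [j + a + e for j, a, e in zip(japanese, anti_phase, english)]
print(heard == english)                   # True: the Japanese speech cancels out
```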
- With this arrangement, a person who is not wearing shutter-type glasses can see the video but not the subtitles and hears the Japanese audio from the TV speaker, while a person wearing the shutter-type glasses B sees the video with subtitles.
- The sound heard through the earphones attached to the shutter-type glasses is the Japanese sound flowing from the speaker, the anti-phase sound, and the English sound. Therefore, when these are heard together with the sound coming from the TV speaker, the background sound and sound effects are heard as they are, the Japanese sound is cancelled, and the English sound is heard.
- In other words, a person wearing the shutter glasses A can see the video but not the subtitles, and hears Japanese speech from the earphones attached to or paired with the glasses.
- A person wearing the shutter glasses B can see the video with subtitles and hears English speech from the earphones attached to the shutter glasses B, so a viewing style can be realized for each user.
- For example, children can watch the dubbed version with the shutter glasses A, and adults can watch with English audio and Japanese subtitles with the shutter glasses B. Since the subtitle and audio content being played back differs between a user wearing shutter glasses and a user not wearing them, a viewing environment combined with shutter glasses can be constructed. In particular, development into language teaching materials can be expected.
- The invention described in the present embodiment (hereinafter referred to as the present invention) adds the following limitations to the invention of the display device described in the first embodiment.
- The normal images include those in which subtitles are combined and those in which cancellation subtitles are combined,
- the normal images with subtitles combined and the normal images with cancellation subtitles combined appear with the same frequency in one frame period, and
- the synchronization signal transmitted by the transmission means places both the left-eye shutter and the right-eye shutter in the closed state during the display period of the cancellation images.
- The display device may further include audio data transmitting means for transmitting to the glasses cancelling sound data capable of cancelling the sound output from the display device.
- A person wearing the glasses A can see the video but not the subtitles and hears Japanese audio from earphones attached to or paired with the glasses, while a person wearing the glasses B can see the video with subtitles
- and hears English audio from the earphones attached to the glasses B.
- For example, a child can watch the dubbed version with the glasses A, and an adult can watch with English audio and Japanese subtitles with the glasses B. Since the subtitle and audio content being played back differs between a user wearing glasses and a user not wearing them, a viewing environment combined with glasses can be constructed. In particular, development into language teaching materials can be expected.
- FIG. 27 shows the internal structure of the playback apparatus according to the sixth embodiment.
- the internal configuration of this figure is drawn based on the internal configuration diagram in the third embodiment, and is different from the base configuration in that the components of the authentication system are added.
- The authentication system in the playback apparatus includes a general-purpose register 61 that stores the registered shutter-type glasses list read from the recording medium, a shutter-type glasses ID storage unit 62 that stores the ID of the shutter-type glasses associated with the display device, and
- an authentication unit 63 that uses the shutter-type glasses ID and the shutter-type glasses registration list to authenticate whether or not the shutter-type glasses associated with the display device are valid, and notifies the display device when validity is confirmed by the authentication.
- FIG. 28 shows the internal structure of the display device.
- The display device includes a random number sequence generator 65 that generates a random number sequence, which is one kind of code sequence, a signaling signal transmission unit 66 that transmits a signaling signal that causes the shutter glasses to generate a code sequence, and a time-division processing unit 67 that switches between the normal image and the cancellation image in accordance with each codeword of the generated code sequence.
- The shutter-type glasses in the sixth embodiment include a signaling signal receiving unit 71 that receives the signaling signal, a random number sequence generator 72 that generates a code sequence in response to the reception, and
- a shutter control unit 73 that controls the open/closed state of the left-eye shutter and the open/closed state of the right-eye shutter based on the codewords of the generated code sequence.
- The code sequence generated by the random number sequence generator 65 in the display device has the same regularity as the code sequence generated in the shutter-type glasses. If the shutter control unit in the shutter glasses opens and closes the shutters according to the code sequence whose generation is started by the signaling signal, a user wearing the shutter glasses can view, of the normal images and cancellation images displayed on the display device in accordance with the code sequence, the normal images.
- In the present embodiment, the generated code sequence is a PE-modulated bit sequence.
- The PE-modulated bit sequence is a bit sequence obtained by applying PE (Phase Encode) modulation to a bit sequence constituting an M-sequence random number.
- An M-sequence random number is a pseudo-random sequence whose period is the longest bit sequence that can be generated by a given primitive polynomial, and it has the property that long runs of either "0" or "1" are unlikely.
- The phase encode modulation replaces the bit value "0" constituting the M-sequence random number with the 2-bit value "10", and the bit value "1" with "01", so that "0" and "1" each appear half the time. Since the bit value "0" and the bit value "1" of this random bit sequence are assigned to the shutter open state and the shutter closed state, respectively, the probabilities of the closed state and the open state appearing in a frame period are equal.
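A minimal sketch of generating such a code sequence follows. The register width (4 bits), the primitive polynomial x^4 + x + 1, and the seed are illustrative choices of mine, not values given in the patent.

```python
# Sketch: generate one period of an M-sequence with a 4-bit LFSR and apply
# PE (phase-encode) modulation, replacing 0 with "10" and 1 with "01".

def m_sequence(seed=0b1001, length=15):
    """One period (2**4 - 1 = 15 bits) of an M-sequence for x^4 + x + 1."""
    state = seed
    for _ in range(length):
        yield state & 1                          # output the lowest bit
        feedback = (state ^ (state >> 1)) & 1    # bit0 XOR bit1 (x^4 + x + 1)
        state = (state >> 1) | (feedback << 3)   # shift right, insert feedback

def pe_modulate(bits):
    """Replace 0 with '10' and 1 with '01' so 0s and 1s appear equally often."""
    out = []
    for b in bits:
        out.extend([1, 0] if b == 0 else [0, 1])
    return out

codewords = pe_modulate(m_sequence())
# 1 = shutter open, 0 = shutter closed; both states appear with equal probability
print(codewords)
```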
- FIG. 29 shows the normal image of image A and the cancellation image of image A being displayed sequentially in accordance with the code sequence.
- In FIG. 29(a), only image A is used for time-division display.
- The normal image and the cancellation image are displayed within the same frame period so that the entire image is erased.
- As shown in FIG. 29(a), when viewed without shutter-type glasses, the normal image and the cancellation image are superposed, a screen without shading is seen as before, and the normal image cannot be recognized.
- FIG. 29(b) shows viewing with unauthenticated shutter glasses. With unauthenticated shutter glasses, the shutter opening and closing is not synchronized with the output of the normal image and the cancellation image. As shown in FIG. 29(b), when shutter glasses whose opening/closing pattern does not match the display timing of the normal image are worn and the screen of the display device is viewed, both the normal image and the cancellation image are seen; the superposed image becomes a screen without shading, and the correct image cannot be recognized.
- FIG. 29 (c) shows the synchronization control for the shutter-type glasses B.
- the shutter glasses B are assumed to be authenticated shutter glasses that have been authenticated by the playback device.
- In FIG. 29(c), only the first, fifth, sixth, and eighth 1/8-frame periods are open, and the rest are closed.
- Authenticated shutter-type glasses that have been authenticated by the playback device generate a code sequence having the same regularity as that of the code sequence generation unit in the display device, and control the open/closed states of the left-eye shutter and the right-eye shutter according to the codewords of this code sequence. As a result, only image A is visible.
- The screen area in which the time-division display of the normal image and the cancellation image is performed with the same regularity can also be limited to a partial area of the screen.
- FIG. 30(a) shows a use case in which the cancellation image is applied to a partial area of the screen.
- In the area near the center of the screen, a normal image and a cancellation image are alternately displayed.
- As a result, the portion of the video near the center is concealed and appears as a mosaic.
- The mosaic in the figure indicates that the vicinity of the center is hidden.
- The upper side of FIG. 30(b) shows the viewed image when no shutter glasses are worn and when unauthenticated shutter glasses are used. With unauthenticated shutter glasses, the shutters cannot be closed during the display period of the cancellation image, so a video with a mosaic has to be viewed.
- The lower side of FIG. 30(b) shows viewing when authenticated shutter glasses are worn.
- The authenticated glasses receive the synchronization signal and close the shutters during the display period of the cancellation image, so only the normal image is viewed.
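The region-limited cancellation described above can be sketched as masking: in the cancellation sub-frame, pixels inside the protected rectangle carry inverted luminance while pixels outside it simply repeat the normal image. The frame values and rectangle below are invented for the example.

```python
# Sketch of limiting the time-division cancellation to a partial screen area.
# Only the protected rectangle is hidden from viewers whose shutters stay open.

MAX_LUMA = 255

def cancellation_subframe(normal, rect):
    x0, y0, x1, y1 = rect
    out = []
    for y, row in enumerate(normal):
        out.append([MAX_LUMA - v if x0 <= x < x1 and y0 <= y < y1 else v
                    for x, v in enumerate(row)])
    return out

normal = [[ 10,  20,  30,  40],
          [ 50,  60,  70,  80],
          [ 90, 100, 110, 120]]
print(cancellation_subframe(normal, rect=(1, 1, 3, 3)))   # hide the central 2x2 block
```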
- FIG. 30C shows a combination of the display device of this embodiment, a personal computer to which shutter-type glasses are applied, and shutter-type glasses.
- The present invention can also be applied to anti-piracy measures. In other words, since viewing with shutter-type glasses is impossible unless the glasses have an entry in the shutter-type glasses registration list recorded on the disc, recording the shutter-type glasses registration list in a special area that cannot be copied ensures that only the owner of a legitimate disc can view the video. This can strengthen measures against pirated copies.
- In the above configuration, authentication by the playback device is performed, and only shutter glasses that are correctly authenticated generate a code sequence with the same regularity as the display device and open and close the shutters. Only correctly authenticated shutter glasses can synchronize, and synchronization is impossible with shutter glasses that are not correctly authenticated. As a result, only a user wearing legitimate shutter-type glasses can view the video.
- A list of legitimate shutter glasses is recorded on the recording medium, and the signaling signal for generating the code sequence is transmitted to the target shutter glasses only when those glasses are registered in this list. In this case, only a user wearing shutter-type glasses certified by the provider as legitimate can view the video.
- The invention described in the present embodiment (hereinafter referred to as the present invention) adds the following limitations to the invention of the generating device described in the first embodiment.
- The generating device is a display device,
- and the display device includes code sequence generating means for generating a code sequence having a regularity common to the glasses and the display device,
- a display unit for displaying the normal image and the cancellation image in accordance with the generated code sequence,
- and transmission means for causing the glasses, by transmitting a predetermined signaling signal, to start shutter opening/closing control according to the codewords in the code sequence.
- The display device is connected to a playback device that reads and plays back content from a recording medium, and a glasses registration list indicating the glasses permitted to view the content is recorded on the recording medium.
- The transmission means may transmit the predetermined signaling signal to the glasses when the glasses corresponding to the display device have been correctly authenticated by the playback device using the glasses registration list.
- If a list in which legitimate glasses are registered is recorded on the recording medium and the signaling signal for generating the code sequence is transmitted to the target glasses only when those glasses are registered in this list, only a user wearing glasses certified by the provider as legitimate can view the video. Since the content cannot be viewed without wearing glasses registered in the glasses registration list recorded on the optical disc, users are motivated to purchase genuine optical discs and glasses. As a result, measures against pirated copies can be taken.
- Anti-piracy measures can thus be taken from the new perspective of pairing glasses with recording media, which can lead to further development of content production industries such as the movie, publishing, game, and music industries. Such development of the production industries can revitalize domestic industries and enhance their competitiveness.
- The present embodiment realizes a countermeasure against errors by extending the bit width.
- Specifically, by extending the bit widths of the luminance Y, red difference Cr, and blue difference Cb of the normal image, the pixel values of the cancellation image are created for the 12-bit luminance Y, red difference Cr, and blue difference Cb, thereby reducing errors.
- FIG. 31 is a diagram showing the concept of error countermeasures by extending the bit width.
- FIG. 31A is a graph in which the horizontal axis represents the luminance value in the data and the vertical axis represents the luminance value of the inverted image.
- FIG. 31B is a table showing the 4096-level luminance in the normal image in association with the 4096-level luminance in the inverted image, the upper 8 bits of the luminance in the cancellation image, and the lower 4 bits in the cancellation image.
- The luminance of the normal image varies in the range from 0 to 4095.
- The luminance of the inverted image correspondingly varies in the range from 4095 down to 0.
- As shown in the 8-bit column of the table on the right, when only 8 bits are used, different luminance values of the normal image
- map to cancellation-image luminance values that become identical as a result of the loss of the lower digits, so the gradation cannot be maintained.
- For small luminance values of the normal image, the luminance of the inverted image falls within the range from 4080 to 4095.
- In the present embodiment, the cancellation image generation unit converts the 12-bit luminance into the pixel bit values of the cancellation image according to the screen mode and the screen size. In the luminance range from 1 to 15 in the normal image, the luminance of the cancellation image is then represented by the value 255 in the upper 8 bits and by the distinguishing values in the lower 4 bits.
- In this way, the numerical range from 4080 to 4095 can be expressed for the luminance of the cancellation image.
- By setting the luminance data of the normal image to 12 bits and having the cancellation image generation unit convert the 12-bit luminance into the pixel values of the cancellation image, errors at the time of bit inversion are eliminated.
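The numeric effect can be checked with the following sketch (my illustration, not the patent's exact procedure): inverting 12-bit luminance and keeping all 12 bits preserves the gradation that is lost when the cancellation value is truncated to 8 bits.

```python
# Inverting 12-bit luminance: the upper 8 / lower 4 bit split keeps 4080..4095
# distinct, whereas an 8-bit result collapses them all to 255.

MAX_12BIT = 4095

def cancel_luminance_12bit(y):
    """Inverted 12-bit luminance used for the cancellation image."""
    return MAX_12BIT - y

for y in (0, 1, 14, 15):
    c12 = cancel_luminance_12bit(y)
    upper8, lower4 = c12 >> 4, c12 & 0xF   # split into upper 8 and lower 4 bits
    c8 = c12 >> 4                           # 8-bit truncation loses the lower digits
    print(f"normal={y:2d}  cancel12={c12}  upper8={upper8}  lower4={lower4}  cancel8={c8}")
# Normal luminances 0..15 all collapse to cancel8 = 255 with only 8 bits,
# but the 12-bit representation keeps the values 4080..4095 distinct.
```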
- Although the glasses for viewing the normal images and the cancellation images have been described as shutter-type glasses, glasses other than the shutter type may be adopted as long as they allow the user to select and view any one of the plural images displayed in a time-division manner within one frame of the video signal. More specifically, deflection-type glasses may be employed as long as they have an optical structure that prevents only the cancellation image, among the normal image and the cancellation image, from being seen.
- the display device may be implemented as a portable device with a stereoscopic photography function.
- In this case, the portable device includes a photographing unit, and the left-eye image data and right-eye image data obtained by the photographing unit are stored in a photo file and written to a recording medium.
- Meanwhile, the portable terminal extracts the compressed left-eye image data and compressed right-eye image data from the photo file and uses them for reproduction.
- An example of such a stereoscopic photo file is an MPO file.
- An MPO (Multi Picture Object) file is a file that can be shot with the Nintendo 3DS and the Fujifilm FinePix REAL 3D W1 and W3 cameras; it contains the shooting date, the size, the compressed left-eye image, and the compressed right-eye image,
- and includes geographic latitude, longitude, elevation, direction, and slope as geographical information about the shooting location.
- The compressed left-eye image and the compressed right-eye image are data compressed in JPEG format, so the portable terminal obtains the right-eye image and the left-eye image by decompressing the JPEG data.
- By generating cancellation images for the left-eye image and right-eye image obtained in this way, the processing of the first embodiment can be realized on the portable terminal.
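As a rough sketch of how the two compressed images could be pulled out of an MPO file (my illustration, not part of the patent): an MPO file is essentially several JPEG images stored back to back, so scanning for the JPEG start-of-image marker is enough for a sketch, although a proper parser would read the MP Index IFD. The file name is hypothetical.

```python
# Split an MPO file into its constituent JPEG images by scanning for SOI markers.

def split_mpo(path):
    data = open(path, "rb").read()
    marker = b"\xff\xd8\xff"                  # JPEG start-of-image marker
    starts = []
    i = data.find(marker)
    while i != -1:
        starts.append(i)
        i = data.find(marker, i + 1)
    starts.append(len(data))
    return [data[s:e] for s, e in zip(starts, starts[1:])]

# images = split_mpo("stereo_photo.mpo")       # hypothetical file name
# left_eye, right_eye = images[0], images[1]   # typically left eye first, then right
```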
- the service reception unit manages service selection. Specifically, it receives a service change request based on a user instruction from a remote control signal or an instruction from an application, and notifies the reception unit.
- The reception unit receives, from the antenna or cable, a signal at the carrier frequency of the transport stream on which the selected service is distributed, and demodulates it. The demodulated transport stream is then sent to the demultiplexing unit.
- The reception unit includes a tuner unit that performs IQ detection on the received broadcast wave, a demodulation unit that performs QPSK demodulation, VSB demodulation, or QAM demodulation on the broadcast wave subjected to IQ detection, and a transport decoder.
- The display determination unit refers to each of the 3D_system_info_descriptor, 3D_service_info_descriptor, and 3D_combi_info_descriptor notified from the demultiplexing unit to grasp the stream configuration of the transport stream, and notifies the demultiplexing unit of the PIDs of the TS packets to be demultiplexed in the current screen mode.
- When the stereoscopic playback method is the frame-compatible method, the display determination unit refers to 2D_view_flag of 3D_system_info_descriptor and frame_packing_arrangement_type of 3D_service_info_descriptor, and notifies the display processing unit of whether the left-eye image or the right-eye image is used for 2D playback and whether the video stream is in the Side-by-Side format.
- The display determination unit refers to 3D_playback_type of the 3D_system_info_descriptor extracted by the demultiplexing unit and determines the playback method of the received transport stream.
- Next, 2D_independent_flag of the 3D_system_info_descriptor is referred to, and it is determined whether or not the video stream used for 2D playback and the video streams used for 3D playback are shared.
- Then, the stream configuration is identified with reference to 3D_combi_info_descriptor.
- If the stream configuration of the transport stream is 2D/L + R1 + R2,
- the 2D/L + R1 + R2 streams are decoded to obtain a set of left-eye image data and right-eye image data.
- If the stream configuration is 2D/L + R, the 2D/L + R streams are decoded to obtain a set of left-eye image data and right-eye image data.
- The display determination unit likewise identifies the stream configuration with reference to 3D_combi_info_descriptor.
- If the stream configuration of the transport stream is MPEG2 + MVC(Base) + MVC(Dependent),
- the MPEG2 + MVC(Base) + MVC(Dependent) streams are decoded to obtain a set of left-eye image data and right-eye image data.
- If the stream configuration is MPEG2 + AVC + AVC, the MPEG2 + AVC + AVC streams are decoded to obtain a set of left-eye image data and right-eye image data.
- Next, 2D_independent_flag of the 3D_system_info_descriptor is referred to, and it is determined whether or not the video stream used for 2D playback and the video stream used for 3D playback are shared.
- If the value of 2D_independent_flag is 0, a 2D/SBS stream is decoded to obtain a set of left-eye image data and right-eye image data.
- Otherwise, a 2D + SBS stream is decoded to obtain a set of left-eye image data and right-eye image data.
- If frame_packing_arrangement_type indicates the Side-by-Side format, 3D playback is performed by cropping out the left-eye image and the right-eye image that exist on the left and right of the frame.
- If frame_packing_arrangement_type does not indicate the Side-by-Side format, the TopBottom method is identified, and 3D playback is performed by cropping out the left-eye image and the right-eye image that exist at the top and bottom of the frame.
- In this way, left-eye image data and right-eye image data can be obtained.
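The decision logic above can be summarised schematically. The descriptor field names mirror the text; the control flow, dictionary layout, and example values are my own reconstruction, not code from the patent.

```python
# Schematic sketch of how the display determination unit might choose a decode
# path from the descriptor fields named above (values are illustrative only).

def choose_decode_path(sys_info, service_info, combi_info):
    if sys_info["3D_playback_type"] == "frame_compatible":
        stream = "2D/SBS" if sys_info["2D_independent_flag"] == 0 else "2D + SBS"
        if service_info["frame_packing_arrangement_type"] == "side_by_side":
            crop = "crop left/right halves"
        else:
            crop = "crop top/bottom halves (TopBottom)"
        return stream, crop
    # service-compatible case: the stream combination comes from 3D_combi_info_descriptor
    return combi_info["stream_configuration"], "decode both views"

print(choose_decode_path(
    {"3D_playback_type": "frame_compatible", "2D_independent_flag": 0},
    {"frame_packing_arrangement_type": "side_by_side"},
    {"stream_configuration": "MPEG2 + MVC(Base) + MVC(Dependent)"},
))
```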
- With the above configuration, a cancellation image may be created using, as the normal image, an image obtained by demodulation and decoding of a television broadcast wave.
- The mechanical parts, such as the drive unit for the recording medium and the connectors to the outside, may be excluded, and the part corresponding to logic circuits and storage elements,
- that is, the core of the logic circuits, may be implemented as a system LSI.
- A system LSI is obtained by mounting a bare chip on a high-density substrate and packaging it. A module in which a plurality of bare chips are mounted on a high-density substrate and packaged so that the bare chips have the same external structure as a single LSI is called a multi-chip module, and such a module is also included in the system LSI.
- System LSIs are classified into QFP (Quad Flat Package) and PGA (Pin Grid Array) types.
- A QFP is a system LSI with pins attached to the four sides of the package.
- A PGA is a system LSI with many pins attached to the entire bottom surface.
- These pins serve as a power supply, a ground, and an interface with other circuits. Since the pins of the system LSI have such an interface role, the system LSI plays the role of the core of the playback device by having other circuits connected to these pins.
- the program shown in each embodiment can be created as follows. First, a software developer uses a programming language to write a source program that implements each flowchart and functional components. In this description, the software developer describes a source program that embodies each flowchart and functional components using a class structure, a variable, an array variable, and an external function call according to the syntax of the programming language.
- the described source program is given to the compiler as a file.
- the compiler translates these source programs to generate an object program.
- Translation by the compiler consists of processes such as syntax analysis, optimization, resource allocation, and code generation.
- In syntax analysis, lexical analysis, parsing, and semantic analysis of the source program are performed, and the source program is converted into an intermediate program.
- In optimization, operations such as basic block formation, control flow analysis, and data flow analysis are performed on the intermediate program.
- In resource allocation, in order to adapt to the instruction set of the target processor, the variables in the intermediate program are allocated to the registers or memory of the target processor.
- In code generation, each intermediate instruction in the intermediate program is converted into program code to obtain an object program.
- the object program generated here is composed of one or more program codes that cause a computer to execute each step of the flowcharts shown in the embodiments and individual procedures of functional components.
- The program code may take various forms, such as processor native code or JAVA (registered trademark) bytecode.
- When a step can be realized using an external function, the call statement that calls that external function becomes the program code.
- a program code that realizes one step may belong to different object programs.
- each step of the flowchart may be realized by combining arithmetic operation instructions, logical operation instructions, branch instructions, and the like.
- the programmer activates the linker for these.
- the linker allocates these object programs and related library programs to a memory space, and combines them into one to generate a load module.
- the load module generated in this manner is premised on reading by a computer, and causes the computer to execute the processing procedures and the functional component processing procedures shown in each flowchart.
- Such a computer program may be recorded on a non-transitory computer-readable recording medium and provided to the user.
- the display method and display device according to the present invention have high applicability in the video content industry and the consumer equipment industry.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
- Controls And Circuits For Display Device (AREA)
- Television Receiver Circuits (AREA)
Abstract
Description
Acquisition means for acquiring a normal image, and
generation means for generating a cancellation image for the acquired normal image, are provided.
The glasses are worn by a user when viewing any one of a plurality of images displayed in a time-division manner within the frame periods of a video signal.
The normal image and the cancellation image are the images to be subjected to the time-division display.
The luminance of each pixel constituting the cancellation image is set to a value larger than the difference obtained by subtracting the luminance of the corresponding pixel of the normal image from the maximum value of the luminance value range.
These are the characterizing features.
The present embodiment provides a multi-view generating device and a multi-user generating device that do not cause discomfort to a user even when a user not wearing glasses looks at the screen of the display device.
FIG. 2 shows an example of viewing the left-eye image and the right-eye image through the active shutter glasses 103. The line of sight vw1 indicates the incidence of the video when the right eye is shielded by the active shutter glasses 103. The line of sight vw2 indicates the incidence of the video when the left eye is shielded by the active shutter glasses 103. The left-eye image is viewed through vw1, and the right-eye image is viewed through vw2. By wearing the active shutter glasses 103, the user views the right-eye image and the left-eye image alternately, and a stereoscopic image is reproduced. In FIG. 2, it can be seen that the stereoscopic video appears at the location where these two lines of sight intersect.
As shown in FIG. 2 above, the screen content of the display device 200 is premised on viewing with the active shutter glasses 103, and therefore cannot withstand viewing without the shutter glasses. With viewing without shutter glasses in mind, the first embodiment therefore provides a cancellation image for each of the left-eye image and the right-eye image and reproduces these cancellation images at the same ratio as the right-eye image and the left-eye image, so that the left-eye image and the right-eye image cannot be seen when viewed without wearing shutter glasses.
Pattern 1 lets each of a plurality of users view images A and B. FIG. 5 shows the synchronization with the shutter glasses by the synchronization signal and the time-division display by the time-division processing unit in pattern 1. In FIG. 5(a), it can be seen that normal image A, cancellation image A, normal image B, and cancellation image B are sequentially provided for time-division display. These normal images and cancellation images are displayed and, by superposing the normal images and the cancellation images, images A and B are completely erased. FIG. 5(b) shows the synchronization signal transmitted by the synchronization signal transmission unit. In FIG. 5(b), the display period of the first 1/4 frame is open and the remaining display periods are closed. As a result, a user wearing the shutter glasses A sees only image A.
A person not wearing shutter glasses sees all of the images, so the normal images A and B and the cancellation images A and B are superposed by the time-division display, and the person perceives an image with no shading. The shutter glasses A open the shutters only while normal image A is displayed and keep them closed otherwise. A person wearing the shutter glasses A sees only normal image A and none of the other images, so cancellation image A is not seen. Normal image A can therefore be recognized.
Pattern 2 carries out brightness adjustment. In FIG. 6(a), it can be seen that normal image A, cancellation image A, normal image A, and cancellation image A are sequentially provided for time-division display. These normal images and cancellation images are displayed together, so that complete erasure is achieved. FIG. 6(b) shows the synchronization signal transmitted by the synchronization signal transmission unit. The display period of the first 1/4 frame is open, and the remaining display periods are closed. As a result, a user wearing the shutter glasses A sees only a low-brightness image.
Pattern 3 is an example in which image A is seen without shutter glasses, and image B is seen when shutter glasses are worn.
Pattern 4 lets a user wearing shutter glasses view a stereoscopic video, while a user not wearing shutter glasses is shown one of the left-eye image and the right-eye image that constitute the stereoscopic video.
Pattern 5 is a pattern in which a user wearing shutter glasses is shown a stereoscopic image, while a user not wearing shutter glasses views a 2D image that is different from the left-eye image and the right-eye image constituting the stereoscopic image.
The conversion formula storages 11a and 11b store a plurality of conversion formulas. These conversion formulas are associated with combinations of the size of the display device and the screen mode, so that one of the conversion formulas can be retrieved according to the combination of the current screen mode and screen size. Since one model of display device exists in various sizes such as 50 inches, 42 inches, and 37 inches, a unique conversion formula is associated with each of these screen sizes. In addition, even for a given screen size, images can be displayed in various modes such as a high-contrast mode, a smooth mode, and a movie mode. The conversion formula storages 11a and 11b therefore hold formula codes and correction parameters that specify conversion formulas whose order and coefficients differ for each screen mode. When the display device itself holds the conversion formulas, the manufacturer of the display device grasps the characteristics of the display device, so conversion formulas whose order and coefficients differ according to those characteristics are held in non-volatile memory. The conversion formulas may be stored in the conversion formula storages 11a and 11b either as a database of formula codes, each representing one of the conversion formulas, or as a database of the coefficients and orders of the conversion formulas as correction parameters.
The arithmetic units 12a and 12b convert the luminance Y, red difference Cr, and blue difference Cb constituting the normal image into the pixel values of the cancellation image. The red difference Cr and the blue difference Cb are converted into their inverted values. The luminance Y is converted into the pixel value of the cancellation image using a conversion formula (g(Y)) and correction parameters. When the conversion formula corresponding to the screen size of the display device 200 and the current screen mode of the display device 200 is denoted g_size,mode, the luminance Y(x, y) at an arbitrary coordinate on the screen is converted by the conversion formula g_size,mode.
The delay circuits 13a and 13b delay the transfer from the arithmetic units 12a and 12b to the time-division processing unit 5 by a predetermined time.
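The per-pixel conversion performed by the arithmetic units 12a and 12b can be sketched as follows. The table keys, the quadratic-free example formulas, and the 8-bit value range are assumptions of mine for illustration; they are not the patent's actual conversion formulas.

```python
# Sketch: Cr and Cb are bit-inverted, and Y is passed through a conversion
# formula g selected by (screen size, screen mode). Formulas are illustrative.

MAX_8BIT = 255

CONVERSION_FORMULAS = {                      # hypothetical g_size,mode table
    ("50inch", "movie"):         lambda y: MAX_8BIT - y,          # plain inversion
    ("50inch", "high_contrast"): lambda y: min(MAX_8BIT, int(1.05 * (MAX_8BIT - y))),
}

def cancel_pixel(y, cr, cb, size, mode):
    g = CONVERSION_FORMULAS[(size, mode)]
    return g(y), MAX_8BIT - cr, MAX_8BIT - cb   # Y via g(Y), Cr and Cb inverted

print(cancel_pixel(200, 90, 160, "50inch", "high_contrast"))
```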
The invention of the generating device described in the present embodiment is a generating device that generates images to be viewed by a user wearing glasses, and includes
acquisition means for acquiring a normal image, and
generation means for generating a cancellation image for the acquired normal image,
wherein the glasses are worn by a user when viewing any one of a plurality of images displayed in a time-division manner within the frame periods of a video signal,
the normal image and the cancellation image are the images to be subjected to the time-division display, and
the luminance of each pixel constituting the cancellation image is set to a value larger than the difference obtained by subtracting the luminance of the corresponding pixel of the normal image from the maximum value of the luminance value range.
These are the characterizing features.
The device may further include display means for displaying the normal image and the cancellation image in a time-division manner within one frame period, and
transmission means for transmitting to the glasses, when the display of either the normal image or the cancellation image is started, a synchronization signal that specifies the open/closed state of the left-eye shutter and the open/closed state of the right-eye shutter of the glasses.
The synchronization signal transmitted by the transmission means may place both the left-eye shutter and the right-eye shutter in the closed state during the display period of the cancellation image. It also becomes possible to provide a viewing method in which a 2D image is seen without glasses while a 3D image is seen when glasses are worn.
The first embodiment described an example in which time-division display over the entire screen is introduced as the technique for switching between the normal image and the cancellation image, but the present embodiment relates to an improvement that realizes the switching between the normal image and the cancellation image in a part of the screen. The place to which this improvement is added is the inside of the cancellation image creation unit shown in the first embodiment. FIG. 16 shows the internal configuration of the inversion arithmetic units 12a and 12b according to the second embodiment. This figure is drawn based on the internal configuration of FIG. 10 and is new in that image combining units 144a and 144b are newly added compared with that base configuration. The added components are described below.
The space-division display units 14a and 14b realize space-division display by switching between the normal image and the cancellation image for each checkerboard cell or for each line in a part of the areas obtained by dividing the display screen. A line is a rectangular area consisting of one horizontal row of pixels on the screen, and a checkerboard cell is a minute area obtained by dividing the screen into small rectangles. Superposition of the normal image and the cancellation image is also possible by displaying the normal image and the cancellation image for each checkerboard cell or each line. Accordingly, when the screen of the display device is viewed without wearing shutter glasses, the luminance of the screen becomes uniform and nothing can be seen. On the other hand, by controlling the shutter state of the shutter glasses so that only the normal images among the normal images and cancellation images arranged for each checkerboard cell or line are transmitted, a user wearing the shutter glasses can view the normal images.
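A small sketch of the checkerboard-style interleaving follows. The cell size and luminance values are made up for illustration; the point is only that, where neighbouring pixels have similar luminance, the normal and inverted cells average out toward a uniform level for a viewer whose glasses do not filter out the cancellation cells.

```python
# Interleave normal-image and cancellation-image (inverted) luminance cell by cell.

MAX_LUMA = 255

def checkerboard_mix(normal, cell=1):
    """Return a frame where cells alternate between normal and inverted luminance."""
    mixed = []
    for y, row in enumerate(normal):
        out_row = []
        for x, luma in enumerate(row):
            use_cancel = ((x // cell) + (y // cell)) % 2 == 1
            out_row.append(MAX_LUMA - luma if use_cancel else luma)
        mixed.append(out_row)
    return mixed

frame = [[100, 100, 180, 180],
         [100, 100, 180, 180]]
print(checkerboard_mix(frame))
# Adjacent normal/inverted cells average to about MAX_LUMA / 2, so the picture
# content tends to be hidden from the naked eye.
```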
In the first embodiment, the display device 200 selected the conversion formula used for cancellation image creation, but the present embodiment relates to an improvement in which the playback device selects the conversion formula used for cancellation image creation. That is, when the playback device connected to the display device holds the formula codes and correction parameters that specify the conversion formulas, the playback device acquires from the display device identification information of the connected display device, for example model information (model number information), and the screen mode being selected, and generates the cancellation images in accordance with the formula code and correction parameters that specify the conversion formula suited to that display device and screen mode. The internal configuration of the playback device to which the improvement specific to this embodiment is added is as shown in FIG. 19. FIG. 19 is a diagram showing the internal configuration of the playback device according to the third embodiment. Since this device assumes processing of left-eye images and right-eye images, some of the components in this internal configuration are identical in configuration but differ in whether they are used for the left eye or for the right eye. Components of this kind are distinguished from other components by the combination of the same numeral and the alphanumeric characters a and b. Because such components are identical in configuration, only their common processing is described.
The invention of the playback device described in the present embodiment (hereinafter referred to as the present invention) adds the following limitations to the invention of the generating device described in the first embodiment. That is, the generating device is a playback device and includes reading means for reading from a recording medium a conversion formula reference table in which a plurality of conversion formulas are associated with combinations of screen size and screen mode,
and the generating means
extracts from the conversion formula reference table a conversion formula corresponding to the combination of the model name and the screen mode of the connected display device and generates the cancellation image based on the extracted conversion formula. When various modes such as a high-contrast mode and a movie mode exist in the display device, an optimal conversion formula can be selected according to the characteristics of these modes. This makes it possible to avoid the inconvenience that the superposition of the left and right images becomes visible as soon as the mode changes. Since the playback device reads the table from the recording medium and generates the cancellation images based on a conversion formula described in this table, the luminance of the cancellation image can be obtained by converting the luminance of the normal image using a conversion formula that follows the author's intention. Also, since the table recorded on the recording medium is read and the conversion formula best suited to the display device is selected from those described in this table, the intention of the content creator can be reflected in the cancellation images. Because the intent of the creator, who is thoroughly familiar with the design and colors of the content, is reflected in the conversion formula, the cancellation by the cancellation images can be made to look cleaner.
In the third embodiment, the playback device selected the conversion formula to realize the time-division processing, but the present embodiment relates to an improvement that realizes the time-division processing on the playback device side. FIG. 21 shows the internal configuration of the playback device according to the fifth embodiment. This figure is drawn based on the internal configuration of FIG. 19 and is new in that the following components are newly added compared with that base configuration.
In the first embodiment, time-division display was realized only for the video, but the present embodiment relates to an improvement that realizes space-division display for video with subtitles. In the present embodiment, the images to be subjected to time-division display within one frame period include images in which subtitles are combined and images in which cancellation subtitles are combined.
The invention described in the present embodiment (hereinafter referred to as the present invention) adds the following limitations to the invention of the display device described in the first embodiment.
The normal images in which subtitles are combined and the normal images in which cancellation subtitles are combined appear with the same frequency in one frame period, and
the synchronization signal transmitted by the transmission means places both the left-eye shutter and the right-eye shutter in the closed state during the display period of the cancellation images. According to the present invention, when a plurality of viewers watch the same movie on the same screen, for example, a viewing method becomes possible in which one viewer watches the dubbed version without subtitles without wearing glasses, while another viewer wears glasses and reads the Japanese subtitles.
The present embodiment provides a viewing environment in which, when a viewer is not wearing shutter glasses or is wearing shutter glasses whose shutter opening/closing timing does not match the display timing of the images, part or all of the screen is hidden by the cancellation images.
The invention described in the present embodiment (hereinafter referred to as the present invention) adds the following limitations to the invention of the generating device described in the first embodiment.
The display device includes code sequence generating means for generating a code sequence having a regularity common to the glasses and the display device,
a display unit for displaying the normal images and the cancellation images in accordance with the generated code sequence, and
transmission means for causing the glasses, by transmitting a predetermined signaling signal, to start shutter opening/closing control according to the codewords in the code sequence. Authentication by the playback device is performed, and only glasses that are correctly authenticated generate a code sequence with the same regularity as the display device and open and close the shutters, so only legitimate glasses can synchronize and synchronization is impossible with non-legitimate glasses. As a result, only a user viewing through legitimate glasses can view the video. Since the video cannot be viewed without wearing glasses intended to be used as a pair with the display device, users can be led to purchase the glasses in coupling with the display device.
The transmission means
may transmit the predetermined signaling signal to the glasses when the glasses corresponding to the display device have been correctly authenticated by the playback device using the glasses registration list. If a list in which legitimate glasses are registered is recorded on the recording medium and the signaling signal for generating the code sequence is transmitted to the target glasses only when those glasses are registered in this list, only a user wearing glasses certified by the provider as legitimate can view the video. Since the content cannot be viewed without wearing glasses registered in the glasses registration list recorded on the optical disc, users are motivated to purchase a genuine optical disc and glasses. This makes it possible to take measures against pirated copies.
The present embodiment realizes a countermeasure against errors by extending the bit width. Specifically, by extending the bit widths of the luminance Y, red difference Cr, and blue difference Cb of the normal image, errors at the time of creating the pixel values of the cancellation image are reduced for the 12-bit luminance Y, red difference Cr, and blue difference Cb.
The best embodiments known to the applicant at the time of filing of the present application have been described above, but further improvements and modifications can be added with respect to the technical topics shown below. It should be noted that whether to implement the embodiments as shown or to apply these improvements and modifications is entirely optional and left to the discretion of the person implementing them.
The glasses for viewing the normal images and the cancellation images have been described as shutter-type glasses, but glasses other than the shutter type may be adopted as long as they allow the user to select and view any one of the plural images displayed in a time-division manner within one frame of the video signal. Specifically, deflection-type glasses may be adopted as long as they have an optical structure that prevents only the cancellation image, among the normal image and the cancellation image, from being seen.
The display device may be implemented as a portable device with a stereoscopic photograph shooting function. In this case, the portable device includes a photographing unit, and the left-eye image data and right-eye image data obtained by the photographing unit are stored in a photo file and written to a recording medium. Meanwhile, the portable terminal extracts the compressed left-eye image data and compressed right-eye image data from the photo file and uses them for reproduction. An example of such a stereoscopic photo file is an MPO file. An MPO (Multi Picture Object) file is a file that can be shot with the Nintendo 3DS and the Fujifilm FinePix REAL 3D W1 and W3 cameras; it contains the shooting date, the size, the compressed left-eye image, and the compressed right-eye image, and includes geographic latitude, longitude, elevation, direction, and slope as geographical information about the shooting location. The compressed left-eye image and the compressed right-eye image are data compressed in JPEG format, so the portable terminal obtains the right-eye image and the left-eye image by decompressing the JPEG data. By generating cancellation images for the left-eye image and right-eye image obtained in this way, the processing of the first embodiment can be realized on the portable terminal.
The above embodiments disclosed the internal configuration of a pure display device. To configure the display device as a television broadcast receiving device, a service reception unit, a reception unit, a demultiplexing unit, and a display determination unit need to be added to the display device.
Of the hardware configurations of the display device, playback device, and shutter-type glasses shown in each embodiment, the mechanical parts such as the drive unit for the recording medium and the connectors to the outside may be excluded, and the part corresponding to logic circuits and storage elements, that is, the core of the logic circuits, may be implemented as a system LSI. A system LSI is obtained by mounting a bare chip on a high-density substrate and packaging it. A module in which a plurality of bare chips are mounted on a high-density substrate and packaged so that the bare chips have an external structure like a single LSI is called a multi-chip module, and such a module is also included in the system LSI.
The programs shown in each embodiment can be created as follows. First, a software developer uses a programming language to write a source program that implements each flowchart and the functional components. In this description, the software developer describes, in accordance with the syntax of the programming language, a source program embodying each flowchart and the functional components by using class structures, variables, array variables, and calls to external functions.
101 Optical disc
102 Remote control
103 Shutter-type glasses
200 Display device
Claims (9)
- A generating device that generates images to be viewed by a user wearing glasses, the generating device comprising: acquisition means for acquiring a normal image; and generation means for generating a cancellation image for the acquired normal image, wherein the glasses are worn by a user when viewing any one of a plurality of images displayed in a time-division manner within the frame periods of a video signal, the normal image and the cancellation image are the images to be subjected to the time-division display, and the luminance of each pixel constituting the cancellation image is set to a value larger than the difference obtained by subtracting the luminance of the corresponding pixel of the normal image from the maximum value of the luminance value range.
- The generating device according to claim 1, wherein the glasses are shutter-type glasses, the generating device is a display device, and the display device comprises: display means for displaying the normal image and the cancellation image in a time-division manner within one frame period; and transmission means for transmitting to the glasses, when the display of either the normal image or the cancellation image is started, a synchronization signal specifying the open/closed state of the left-eye shutter and the open/closed state of the right-eye shutter of the glasses.
- The display device according to claim 2, wherein the normal images include images intended for a user wearing the glasses and images intended for a non-wearing user who is not wearing glasses, the normal image and the cancellation image appear with equal frequency in one frame period for the images intended for the wearing user, and the synchronization signal transmitted by the transmission means places both the left-eye shutter and the right-eye shutter in the closed state during the display period of the cancellation image.
- The display device according to claim 2, wherein the normal images include images in which subtitles are combined and images in which cancellation subtitles are combined, the normal images in which subtitles are combined and the normal images in which cancellation subtitles are combined appear with the same frequency in one frame period, and the synchronization signal transmitted by the transmission means places both the left-eye shutter and the right-eye shutter in the closed state during the display period of the cancellation image.
- The display device according to claim 4, further comprising audio data transmission means for transmitting to the glasses cancelling sound data capable of cancelling the sound output from the display device.
- The display device according to claim 1, wherein the generating device is a display device, and the display device comprises: code sequence generating means for generating a code sequence having a regularity common to the glasses and the display device; a display unit for displaying the normal image and the cancellation image in accordance with the generated code sequence; and transmission means for causing the glasses, by transmitting a predetermined signaling signal, to start shutter opening/closing control according to the codewords in the code sequence.
- The display device according to claim 6, wherein the display device is connected to a playback device that reads and plays back content from a recording medium, a glasses registration list indicating the glasses permitted to view the content is recorded on the recording medium, and the transmission means transmits the predetermined signaling signal to the glasses when the glasses corresponding to the display device have been correctly authenticated by the playback device using the glasses registration list.
- The playback device according to claim 1, wherein the generating device is a playback device and comprises reading means for reading from a recording medium a conversion formula reference table in which a plurality of conversion formulas are associated with combinations of screen size and screen mode, and the generating means extracts from the conversion formula reference table a conversion formula corresponding to the combination of the model name and the screen mode of the connected display device and generates the cancellation image based on the extracted conversion formula.
- Glasses worn by a user when viewing images displayed by a display device, the glasses comprising selection means for selecting any one of a plurality of images displayed in a time-division manner within the frame periods of a video signal, wherein the images displayed by the display device include a normal image and a cancellation image, the normal image and the cancellation image are the images to be subjected to the time-division display, and the luminance of each pixel constituting the cancellation image is set to a value larger than the difference obtained by subtracting the luminance of the corresponding pixel of the normal image from the maximum value of the luminance value range.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2012800013570A CN102907108A (zh) | 2011-03-18 | 2012-03-16 | 生成装置、显示装置、再现装置、眼镜 |
US13/697,850 US20130057526A1 (en) | 2011-03-18 | 2012-03-16 | Generating device, display device, playback device, glasses |
JP2013505811A JPWO2012127836A1 (ja) | 2011-03-18 | 2012-03-16 | 生成装置、表示装置、再生装置、眼鏡 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2011060212 | 2011-03-18 | ||
JP2011-060212 | 2011-03-18 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2012127836A1 true WO2012127836A1 (ja) | 2012-09-27 |
Family
ID=46879016
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/001852 WO2012127836A1 (ja) | 2011-03-18 | 2012-03-16 | 生成装置、表示装置、再生装置、眼鏡 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20130057526A1 (ja) |
JP (1) | JPWO2012127836A1 (ja) |
CN (1) | CN102907108A (ja) |
WO (1) | WO2012127836A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140210695A1 (en) * | 2013-01-30 | 2014-07-31 | Hewlett-Packard Development Company | Securing information |
JP2015118191A (ja) * | 2013-12-17 | 2015-06-25 | 富士通株式会社 | 情報表示システム、情報表示装置及びメガネ |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR20110115806A (ko) * | 2010-04-16 | 2011-10-24 | 삼성전자주식회사 | 디스플레이 장치 및 3d 안경, 그리고 이를 포함하는 디스플레이 시스템 |
CN202995143U (zh) * | 2011-12-29 | 2013-06-12 | 三星电子株式会社 | 眼镜装置及显示装置 |
KR102091519B1 (ko) * | 2013-11-05 | 2020-03-20 | 엘지전자 주식회사 | 이동 단말기 및 이의 제어방법 |
CN105472366B (zh) * | 2015-12-07 | 2016-11-02 | 京东方科技集团股份有限公司 | 基于心理视觉调制的图像处理方法、装置及显示设备 |
US10142298B2 (en) * | 2016-09-26 | 2018-11-27 | Versa Networks, Inc. | Method and system for protecting data flow between pairs of branch nodes in a software-defined wide-area network |
EP3301940A1 (en) * | 2016-09-30 | 2018-04-04 | Advanced Digital Broadcast S.A. | A method and a system for registering shutter glasses in an image generating device |
US12080030B2 (en) | 2020-04-28 | 2024-09-03 | Shenzhen Sitan Technology Co., Ltd. | Image processing method and device, camera apparatus and storage medium |
CN111526366B (zh) * | 2020-04-28 | 2021-08-06 | 深圳市思坦科技有限公司 | 图像处理方法、装置、摄像设备和存储介质 |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003189208A (ja) * | 2001-12-20 | 2003-07-04 | Toshiba Corp | 表示システム及び表示方法 |
JP2006245680A (ja) * | 2005-02-28 | 2006-09-14 | Victor Co Of Japan Ltd | 映像音響再生方法及び映像音響再生装置 |
WO2008078630A1 (ja) * | 2006-12-26 | 2008-07-03 | Nec Corporation | 表示装置及び表示方法 |
WO2008102883A1 (ja) * | 2007-02-22 | 2008-08-28 | Nec Corporation | 画像処理装置及び方法、プログラム並びに表示装置 |
WO2008146752A1 (ja) * | 2007-05-25 | 2008-12-04 | Nec Corporation | 画像処理装置及びその方法並びにプログラム、及び表示装置 |
WO2008152932A1 (ja) * | 2007-06-13 | 2008-12-18 | Nec Corporation | 画像表示装置、画像表示方法、及びその表示プログラム |
JP2009204948A (ja) * | 2008-02-28 | 2009-09-10 | Toshiba Corp | 画像表示装置及びその方法 |
WO2010071193A1 (ja) * | 2008-12-18 | 2010-06-24 | 日本電気株式会社 | ディスプレイシステム、制御装置、表示方法およびプログラム |
JP2010276721A (ja) * | 2009-05-26 | 2010-12-09 | Sony Corp | 画像表示装置、画像観察用眼鏡、および画像表示制御方法、並びにプログラム |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR100653090B1 (ko) * | 2004-07-13 | 2006-12-06 | 삼성전자주식회사 | 디스플레이 사이즈 조정 장치 및 그 방법 |
EP1779676A1 (en) * | 2004-08-10 | 2007-05-02 | Koninklijke Philips Electronics N.V. | Detection of view mode |
US9524700B2 (en) * | 2009-05-14 | 2016-12-20 | Pure Depth Limited | Method and system for displaying images of various formats on a single display |
US8446462B2 (en) * | 2009-10-15 | 2013-05-21 | At&T Intellectual Property I, L.P. | Method and system for time-multiplexed shared display |
US8624960B2 (en) * | 2010-07-30 | 2014-01-07 | Silicon Image, Inc. | Multi-view display system |
TW201226985A (en) * | 2010-12-31 | 2012-07-01 | Au Optronics Corp | Display system |
-
2012
- 2012-03-16 CN CN2012800013570A patent/CN102907108A/zh active Pending
- 2012-03-16 WO PCT/JP2012/001852 patent/WO2012127836A1/ja active Application Filing
- 2012-03-16 JP JP2013505811A patent/JPWO2012127836A1/ja active Pending
- 2012-03-16 US US13/697,850 patent/US20130057526A1/en not_active Abandoned
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003189208A (ja) * | 2001-12-20 | 2003-07-04 | Toshiba Corp | 表示システム及び表示方法 |
JP2006245680A (ja) * | 2005-02-28 | 2006-09-14 | Victor Co Of Japan Ltd | 映像音響再生方法及び映像音響再生装置 |
WO2008078630A1 (ja) * | 2006-12-26 | 2008-07-03 | Nec Corporation | 表示装置及び表示方法 |
WO2008102883A1 (ja) * | 2007-02-22 | 2008-08-28 | Nec Corporation | 画像処理装置及び方法、プログラム並びに表示装置 |
WO2008146752A1 (ja) * | 2007-05-25 | 2008-12-04 | Nec Corporation | 画像処理装置及びその方法並びにプログラム、及び表示装置 |
WO2008152932A1 (ja) * | 2007-06-13 | 2008-12-18 | Nec Corporation | 画像表示装置、画像表示方法、及びその表示プログラム |
JP2009204948A (ja) * | 2008-02-28 | 2009-09-10 | Toshiba Corp | 画像表示装置及びその方法 |
WO2010071193A1 (ja) * | 2008-12-18 | 2010-06-24 | 日本電気株式会社 | ディスプレイシステム、制御装置、表示方法およびプログラム |
JP2010276721A (ja) * | 2009-05-26 | 2010-12-09 | Sony Corp | 画像表示装置、画像観察用眼鏡、および画像表示制御方法、並びにプログラム |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140210695A1 (en) * | 2013-01-30 | 2014-07-31 | Hewlett-Packard Development Company | Securing information |
JP2015118191A (ja) * | 2013-12-17 | 2015-06-25 | 富士通株式会社 | 情報表示システム、情報表示装置及びメガネ |
Also Published As
Publication number | Publication date |
---|---|
CN102907108A (zh) | 2013-01-30 |
US20130057526A1 (en) | 2013-03-07 |
JPWO2012127836A1 (ja) | 2014-07-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2012127836A1 (ja) | 生成装置、表示装置、再生装置、眼鏡 | |
TWI520566B (zh) | 在三維視訊上疊加三維圖形的方法及裝置 | |
US9628872B2 (en) | Electronic device, stereoscopic image information transmission method of electronic device and stereoscopic information receiving method of electronic device | |
CN102811361B (zh) | 立体图像数据发送、接收和中继方法以及其设备 | |
CN102918847B (zh) | 显示图像的方法和设备 | |
CN102197655B (zh) | 暂停模式中的立体图像再现方法及使用该方法的立体图像再现装置 | |
CN103297794B (zh) | 数据结构、记录介质、播放设备和播放方法以及程序 | |
JP2011525746A (ja) | 三次元ビデオ映像処理方法及びその装置 | |
KR20110113186A (ko) | 비디오 인터페이스를 통해 송신하고 3d 비디오 및 3d 오버레이들을 합성하기 위한 방법 및 시스템 | |
JP5593333B2 (ja) | 映像処理方法及びその装置 | |
CN102378020A (zh) | 接收装置和接收方法 | |
TW201119349A (en) | Data structure, recording medium, playback apparatus and method, and program | |
JP2011228959A (ja) | 3次元映像再生方法、および3次元映像再生装置 | |
WO2012132424A1 (ja) | 立体視映像の奥行きを変更することができる映像処理装置、システム、映像処理方法、映像処理プログラム | |
CN103188454A (zh) | 用于显示的装置和方法 | |
WO2016063617A1 (ja) | 画像生成装置、画像抽出装置、画像生成方法、および画像抽出方法 | |
US9357200B2 (en) | Video processing device and video processing method | |
JP6667981B2 (ja) | 不均衡設定方法及び対応する装置 | |
EP2475181A1 (en) | Stereoscopic vision control device, integrated circuit, stereoscopic vision control method | |
WO2010119814A1 (ja) | データ構造、記録媒体、再生装置および再生方法、並びにプログラム | |
CN102439553A (zh) | 用于再现立体图像、提供适于3d图像信号的用户界面的设备和方法 | |
TW201042643A (en) | Controlling of display parameter settings | |
WO2012063675A1 (ja) | 立体画像データ送信装置、立体画像データ送信方法および立体画像データ受信装置 | |
US20120098944A1 (en) | 3-dimensional image display apparatus and image display method thereof | |
KR20130122349A (ko) | 영상표시장치 및 휴대 단말기의 동작 방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201280001357.0 Country of ref document: CN |
|
ENP | Entry into the national phase |
Ref document number: 2013505811 Country of ref document: JP Kind code of ref document: A |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12761190 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13697850 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 12761190 Country of ref document: EP Kind code of ref document: A1 |