US8094927B2 - Stereoscopic display system with flexible rendering of disparity map according to the stereoscopic fusing capability of the observer
- Publication number
- US8094927B2 (granted from application US10/789,272)
- Authority
- US
- United States
- Prior art keywords
- stereoscopic
- disparity
- user
- customized
- range
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/10—Geometric effects
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/122—Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
Description
- The invention relates generally to the field of displays, and in particular to stereoscopic display systems. More specifically, the invention relates to a stereoscopic display system that provides a customized rendering for groups of similar individuals or for each individual user.
- The normal human visual system provides two separate views of the world through our two eyes. Each eye has a horizontal field of view of about 60 degrees on the nasal side and 90 degrees on the temporal side. A person with two eyes not only has an overall broader field of view, but also has two slightly different images formed at the two retinas, thus forming different viewing perspectives.
- The disparity between the two views of each object is used as a cue by the human brain to derive the relative depth between objects. This derivation is accomplished by comparing the relative horizontal displacement of corresponding objects in the two images.
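For illustration, the geometry of this cue can be computed directly. The following minimal Python sketch (the function name and all values are assumptions, not part of the patent) gives the relative angular disparity between two objects at different depths:

```python
import math

def vergence_angle_deg(ipd_mm: float, distance_mm: float) -> float:
    """Angle (degrees) through which the two eyes converge to fixate
    a point at the given distance; larger for nearer points."""
    return math.degrees(2.0 * math.atan(ipd_mm / (2.0 * distance_mm)))

# Two objects at different depths, viewed with a typical 63 mm IPD.
ipd = 63.0
near, far = 500.0, 1000.0  # mm

# The relative (angular) disparity between the two objects is the
# difference of their vergence demands.
disparity_deg = vergence_angle_deg(ipd, near) - vergence_angle_deg(ipd, far)
print(f"relative disparity: {disparity_deg:.2f} deg")  # ~3.60 deg
```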
- Stereoscopic displays are designed to provide the visual system with the horizontal disparity cue by displaying a different image to each eye.
- Known stereoscopic displays typically display a different image to each of the observer's two eyes by separating them in time, wavelength, or space.
- These systems include using liquid crystal shutters to separate the two images in time, lenticular screens, barrier screens or auto-stereoscopic projection to separate the two images in space, and the use of color filters or polarizers to separate the two images based on optical properties.
- The visual information provided by a stereoscopic display differs in many ways from viewing real world objects. This difference in visual information, or visual cues, often leads to user discomfort or the inability to fuse the two images (i.e., to perceive the two images displayed to the two eyes as one fused image rather than two separate images).
- One of the important differences between the visual information provided by most prior art stereoscopic display systems and the real world environment arises from the fact that the user of these displays must accommodate, or focus, at a single plane in space when viewing a stereoscopic display while the disparity cues that are provided indicate that the objects are at different planes in space. This differs from the real world visual environment where the accommodative and disparity cues provide consistent information. This is a very important difference since the vergence of our eyes and their accommodation distance are reflexively linked, often causing accommodation distance to track convergence distance as the two eyes converge to place an important object onto the fovea of each eye.
- Stereoscopic display systems have the ability to augment the disparity cues, providing larger amounts of disparity information than actually exists in the real world. This can provide a greater sense of depth and greater perceived separation of objects in an image, which is particularly important in detection or identification tasks, where a target (e.g., a cancerous lesion, a surveillance threat, or a potential rescue victim) must be separated from a complex background. Unfortunately, when this disparity becomes too large, users of stereoscopic display systems often experience discomfort.
- The range of convergent and divergent angles that can be fused differs from individual to individual, and differs with the accommodative stimulus that is provided.
- Research in this field has shown that some individuals can fuse a large range of convergent angles and a small range of divergent angles, while other individuals can fuse larger ranges of divergent angles than convergent angles. Because of this individual variability, if a stereoscopic system is designed to provide comfortable stereoscopic images to all users, the range of disparity that can be shown will be extremely limited.
- U.S. Pat. No. 4,647,965 describes a stereoscopic imaging system that has the ability to adjust the horizontal positions of the corresponding objects in the stereoscopic pair to provide quality and viewing comfort for viewing three-dimensional content.
- The system consists of image capture, image digitization, image processing, and digital-to-analog conversion.
- The image-processing algorithm identifies corresponding objects in the stereoscopic pair and shifts the objects in one of the two images to reduce the disparities between the views.
- However, this system does not provide a means for determining an individual's ability, or for clustering an individual into a group of individuals that may have similar visual performance.
- Image display as described in this patent is performed through analog media that cannot be altered for each individual, so the images produced by this system cannot be rendered for users of different abilities. To be successful, this system must therefore either reduce the maximum disparity to the small range that all observers can comfortably fuse, or accept that some percentage of users will experience discomfort while viewing these stereoscopic images.
- U.S. Pat. No. 6,175,379B1 discusses a computer graphics driven stereoscopic display system which also provides a customized rendering for different viewing conditions. Specifically, the system alters the rendering camera separation, focal length, and/or distance from objects in the scene as a function of the viewing conditions that include the display size, viewing distance, and resolution.
- The intent of this patent is to render stereoscopic information that can be comfortably viewed by all users.
- Computer graphics experts decide on the acceptable range of disparities that can be fused by all users when rendering the computer graphics scene. Image content outside the fusional area is clipped from view, or its contrast and/or focus is reduced to make it unusable.
- Again, this system does not provide a means for determining an individual's ability, and the output of this system uses analog media that cannot be altered for each individual user of different ability.
- A commercially available stereo driver, such as the one provided by NVIDIA discussed below, can generate and display a stereoscopic image pair from one computer generated image with embedded depth information about the computer generated graphic objects.
- This driver comes with a number of controls for use in stereo game playing, including stereo separation adjustment to suit individual adaptation to stereo viewing, hot keys for in-game stereo control, stereo test process for determining the best screen mode to use, and custom game configurations that tailor the stereo process to a wide range of games.
- This product provides the technical capability to adjust the range of disparity for a range of stereoscopic scenes.
- However, this system does not provide a method to link the capability of individual users to fuse stereoscopic images to the stereoscopic rendering process.
- Viewing comfort can only be achieved by dynamic adjustment during viewing.
- Because the range of disparity may vary significantly from scene to scene, the dynamic nature of the adjustment can be a challenging task for the users.
- The optimization between viewing comfort and good stereoscopic perception cannot be achieved by a method of dynamic adjustment such as the one provided by NVIDIA, as it does not consider the actual range of disparities present in any given scene.
- The present invention provides a method for customizing scene content, according to a user or a cluster of users, for a given stereoscopic display, including: obtaining customization information about the user; obtaining a scene disparity map for a pair of given stereo images and/or a three-dimensional (3D) computer graphic model; and determining an aim disparity range for the user.
- The method of the present invention also generates a customized disparity map and/or rendering conditions for a three-dimensional (3D) computer graphic model correlating with the user's fusing capability on the given stereoscopic display, and re-renders the stereo images for subsequent display.
- FIG. 1 is a diagram of the system employed in the practice of the present invention.
- FIG. 2 is a flow chart showing the steps of the method of the present invention.
- FIG. 3 shows an example user interface screen for inputting rendering intent.
- FIG. 4 is an example graphical user interface for on-display measurement of user stereo fusing capability.
- FIG. 5 is a flow chart showing the steps of determining the aim disparity range using customization information.
- FIG. 6 is a graphical illustration of the default range of disparity as a function of the viewing distance.
- FIG. 7 is a flow chart showing the steps of a method to calculate the aim disparity range based on optometric data.
- FIG. 8 is an example user interface for inputting optometric data.
- FIG. 9 is an illustration of a plot representing optometric data and the range of disparity that can be fused as a function of accommodation distance.
- FIG. 10 shows the relationship of the spatial arrangement of the cameras, the scene depth range, and the corresponding scene disparity range.
- FIG. 11 shows an illustration of remapping between scene disparity and aim disparity range.
- FIG. 12(a) is a plot showing one method of performing the mapping from input disparity to output disparity that may be employed in the practice of the present invention.
- FIG. 12(b) is a plot showing a second method of performing the mapping from input disparity to output disparity that may be employed in the practice of the present invention.
- FIG. 12(c) is a plot showing a third method of performing the mapping from input disparity to output disparity that may be employed in the practice of the present invention.
- The present invention is directed towards a stereoscopic imaging system in which display attributes and knowledge of the user's binocular visual performance are applied to improve the comfort and quality of the viewing experience.
- This invention is based on research by the authors establishing a link between optometric measurements and stereo fusing capability.
- This system utilizes knowledge of the display attributes and the user's binocular visual performance to determine rendering parameters for the stereoscopic information. These rendering parameters are used to provide a customized presentation of information on the display device for a group or a specific human observer. They may be used to select among various segments of pre-rendered content, or to render left and right eye views to the observer in a way that improves the viewing experience.
- This improvement in viewing experience will often result in increased user comfort, or in an enhanced quality of the viewing experience in terms of increased user enjoyment, engagement, or presence.
- This improvement may also be linked to improved user performance during the completion of a task.
- As shown in FIG. 1, the system of the present invention includes an image source 10 for obtaining stereoscopic image information or computer graphics models and textures, an optional image processor 20 for extracting a disparity map from the image source, a storage device 30 for storing display attributes, a storage device 40 for storing customization information, a rendering processor 50 for generating a customized disparity map and rendering the stereoscopic information based on this map, and a stereoscopic display device 60 for displaying the rendered stereoscopic pair of images.
- This system can optionally have an input device 70 that takes feedback from the user that may be used to assess the user's binocular visual performance, or to set the rendering intent, or to provide means for user interaction with the system during viewing (e.g. indicating the region of interest). It may also optionally have a sensor 80 that monitors the characteristics of the user including the head position, eye gaze direction, and accommodation state of the eye.
- The image source 10 may be any device or combination of devices that are capable of providing stereoscopic image information.
- For example, this image source may include a pair of still or video cameras capable of capturing the stereoscopic image information.
- Alternatively, the image source 10 may be a server that is capable of storing one or more stereoscopic images.
- The image source 10 may also consist of a memory device capable of providing definitions of a computer-generated graphics environment and textures that can be used by the image processor to render a stereoscopic view of a three-dimensional graphical environment.
- The optional image processor 20 may be any processor capable of performing the calculations that are necessary to determine the disparity between a pair of stereoscopic images that have been retrieved from the image source 10.
- This processor may be any application-specific integrated circuit (ASIC), programmable integrated circuit, or general-purpose processor.
- The image processor 20 performs the needed calculation based on information from the storage device for display attributes 30 as well as information from the image source 10.
- The storage device for display attributes 30 may be any storage device that is capable of storing information regarding the characteristics of the display device.
- This storage device may be, for example, an electronically programmable read-only memory (EPROM) that is embedded in the display itself or on a video card. Alternatively, it may be storage that is available on a local or network-based computer disk, such as a hard drive or compact disc.
- This storage device will store information for the display device such as the resolution, size of display, point of optical convergence, and plane of accommodative focus for retrieval and use.
- This storage device will also store viewing environment characteristics, such as the nominal observer viewing distance.
- The storage device for customization information 40 may be any memory device that is capable of storing information regarding the characteristics of the user.
- This storage device may be, for example, an electronically programmable read-only memory (EPROM) that is embedded on a video card.
- Alternatively, it may be a storage device that is available on a local or network-based computer disk, such as a hard drive, compact disc, or solid-state memory.
- This storage device will include a means to obtain the stereo capability of the user using a computer program.
- This storage device will store information for the user such as the capability of convergence/divergence, phoria, capability of accommodation, and range of fusion.
- This storage device will also store the rendering intent, as specified by the user.
- The storage devices 30 and 40 may be two separate devices, or may be the same device.
- The rendering processor 50 may be any processor capable of performing the calculations that are necessary to determine the customized disparity map for a specific user and specific scene content, and to render the graphical or scene information using this disparity map. The calculation is based on inputs from the image processor 20, the storage device for display attributes 30, the storage device for customization information 40, the input device 70, and the sensor 80.
- The rendering processor 50 and the image processor 20 may be two separate devices, or may be the same device.
- The stereoscopic display device 60 may be any display capable of providing a stereoscopic pair of images to a user.
- The stereoscopic display device 60 may be a direct-view device that presents an image at the surface of the display (i.e., has a point of accommodation and convergence at the plane of the display surface), such as a barrier-screen liquid crystal display device, a CRT with liquid crystal shutters and shutter glasses, a polarized projection system with linearly or circularly polarized glasses, a display employing lenticules, a projected auto-stereoscopic display, or any other device capable of presenting a pair of stereographic images to each of the left and right eyes at the surface of the display.
- The stereoscopic display device 60 may also be a virtual image display that displays the image at a virtual location, having adjustable points of accommodation and convergence, such as an auto-stereoscopic projection display device, a binocular helmet-mounted display device, or a retinal laser projection display.
- The optional input device 70 may be a mouse or a keyboard capable of taking input from the user. It may also be a multi-modal device that takes input from the user via eye tracking, voice, or tactile action.
- The optional sensor 80 may be any tracking device that can detect the position of the head or the gaze direction of the eyes. For example, it can be two cameras that sense the head position, and hence are able to determine the distance between the head and the display screen.
- FIG. 2 is a flow chart depicting a method for implementing this invention.
- Customization information is obtained in step 100 .
- Display attributes are obtained in step 110 .
- Stereoscopic imagery or 3-D graphical models and textures are obtained in step 120 .
- A scene disparity map is extracted in step 130.
- In step 135, an aim disparity range is determined based on the customization information and the display attributes.
- A remapping is completed in step 140, where a customized disparity map is generated.
- The stereoscopic scenes are re-rendered in step 150 and then displayed in step 160.
- The customization information obtained in step 100 includes a user profile and/or a rendering intent.
- The user profile defines the stereo fusing capability of an individual user as a function of the accommodation state. It also contains a unique user identifier.
- The rendering intent may include items such as the level of skill and the type of task to be performed.
- FIG. 3 shows options that may be depicted in an example user interface screen and presented to the user or a system administrator to input the rendering intent.
- The rendering intent includes task-related options as shown in FIG. 3, such as "optimize detectability" 210, "provide a fun and comfortable experience" 212, or "maximize depth near the pointer" 214.
- The rendering intent may also include skill-related options, such as "new user" 216 or "experienced user" 218.
- The input device 70 may be used to select and confirm one of these options. Menus of the type shown in FIG. 3 may be employed to determine any of the items used to determine the rendering intent. Once the user selects an option from any of these menus, the selection may be stored in the storage device for customization information 40 and retrieved for future use based on the user identifier.
- FIG. 4 depicts a graphical user interface for a software application that can be used to assess the user's stereo fusing capability on the same display system.
- The input device 70 is assumed here to be a mouse.
- Initially, the slider marker 310 is in the middle of the slider bar 320.
- There is then no horizontal disparity for the test stimulus 330, and the user should see the test stimulus 330 as having the same depth as the background 300.
- The user then starts to move the slider marker 310 to the right. This movement increases the crossed disparity of the test stimulus 330 so that it appears to pop out towards the user relative to the background 300.
- At some disparity, the test stimulus 330 will no longer be fusible; this limit defines the Fusional Range for Crossed Disparity (FRCD).
- The user would then click the 'Done' button 340 to record this range, and the system is reset to its original state, i.e. the slider marker 310 is back in the middle of the slider bar 320 and the test stimulus 330 resides in the same depth plane as the background 300.
- The limit of the Fusional Range for Uncrossed Disparity (FRUD) is recorded similarly, with the slider marker 310 being moved to the left.
- Alternatively, the display may show a test stimulus 330 with a larger disparity than anyone can be expected to fuse, and the slider marker 310 can be used to decrease the disparity until the user is able to fuse the target, at which time they can press the 'Done' button 340.
- The computer may also automatically adjust the disparity of the target, and the user may simply press the 'Done' button 340 whenever they are or are not able to fuse the target.
- The user interface may also display a plurality of test stimuli 330 having a range of disparities, and the user may simply indicate the test stimulus with the largest disparity that they are able to fuse. It is also understood that some display systems may allow some pixels to have a different focal distance than other pixels, and therefore some of the test stimuli 330 used within the user interface may have different focal distances. By showing a number of user interface screens in which focal distance and disparity are changed independently of one another, a profile may be built that indicates the fusible disparity as a function of focal distance for each observer. In yet other embodiments of the user interface, the user may be asked to provide a rating or other number associated with their perception of, or psychophysical response to, the test stimulus 330.
- For example, the user may be shown one or more test stimuli 330 and asked to indicate their comfort when viewing these stimuli by entering a number corresponding to their comfort into a data entry field and then pressing the 'Done' button 340 to indicate that the response has been correctly entered.
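A minimal sketch of such a measurement loop is shown below; the step size, maximum disparity, and the `user_can_fuse` callback standing in for the slider interaction and observer response are all assumptions:

```python
def measure_fusional_limit(user_can_fuse, step=0.1, max_disparity=5.0):
    """Increase disparity from zero until the observer reports losing
    fusion; return the largest disparity that could still be fused.
    user_can_fuse(d) stands in for the observer's response at
    disparity d (e.g., degrees of crossed or uncrossed disparity)."""
    last_fused, n = 0.0, 1
    while n * step <= max_disparity:
        d = n * step
        if not user_can_fuse(d):
            break
        last_fused = d
        n += 1
    return last_fused

# Hypothetical observer who fuses up to ~1.5 deg crossed and ~1.0 deg
# uncrossed disparity (uncrossed handled as a positive magnitude here).
frcd = measure_fusional_limit(lambda d: d <= 1.55)
frud = measure_fusional_limit(lambda d: d <= 1.05)
print(f"FRCD = {frcd:.1f} deg, FRUD = {frud:.1f} deg")
```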
- FIG. 5 shows the steps taken to determine the aim disparity range, as described in step 135 of FIG. 2 .
- The system first obtains a user identifier in step 400 using the input device 70.
- In step 410, the system determines whether the user profile already exists or is new. If the user identifier is linked to an existing user profile, the user profile is retrieved in step 420. If the user identifier is new, the system offers options to obtain the user profile in step 430. There are three options to obtain this user profile: it can be measured on the same display device using a computer program, as illustrated in FIG. 4; it can be obtained offline through optometric assessment, with the data then input into the system; or the user can be classified into a group of users with similar visual performance, as described later in this disclosure.
- In step 440, a rendering intent is obtained from the input of the user or system administrator.
- In step 450, two factors are assigned based on the rendering intent: Cs for skill level and Ct for task type.
- The values for Cs and Ct are commonly between 0 and 1, but they may be larger than 1 for certain applications.
- For new users, Cs is usually set to a value smaller than that for experienced users.
- The task factor Ct depends on the task type.
- In step 460, a value is assigned to the adaptive factor Ca.
- The factor Ca is intended to compensate for the dynamic nature of the viewing experience. For example, the user may experience stereoscopically induced visual fatigue during long viewing periods. This fatigue may be detected by sensor 80, and Ca would then be set to a smaller value to preserve user comfort.
- The detection can take many forms. For example, it can be a recording of the change in blinking rate, or a recording of the change in the pattern of eye gaze direction.
- In step 470, an Aim Range of Crossed Disparity (ARCD) and an Aim Range of Uncrossed Disparity (ARUD) are determined, as shown in Equations 1 and 2.
- The convergence point corresponds to a location in the scene space where the horizontal disparity is zero.
- ARCD = Cs × Ct × Ca × FRCD    (1)
- ARUD = Cs × Ct × Ca × FRUD    (2)
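As a sketch (units in degrees; all factor and range values below are illustrative assumptions, not values from the patent), Equations 1 and 2 can be applied as:

```python
def aim_disparity_range(frcd, frud, cs, ct, ca):
    """Equations 1 and 2: scale the measured fusional ranges by the
    skill (Cs), task (Ct), and adaptive (Ca) factors."""
    arcd = cs * ct * ca * frcd  # aim range of crossed disparity
    arud = cs * ct * ca * frud  # aim range of uncrossed disparity
    return arcd, arud

# Example: new user (Cs=0.5), "fun and comfortable" task (Ct=0.8),
# no fatigue detected (Ca=1.0), measured FRCD=1.5 deg, FRUD=1.0 deg.
arcd, arud = aim_disparity_range(1.5, 1.0, cs=0.5, ct=0.8, ca=1.0)
print(f"ARCD = {arcd:.2f} deg, ARUD = {arud:.2f} deg")  # 0.60 / 0.40
```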
- FIG. 6 is a graphical illustration of how the default fusional ranges of crossed/uncrossed disparity may vary as a function of the viewing distance.
- FIG. 7 shows a method of using optometric data to determine the aim disparity range.
- In step 510, the system obtains optometric data.
- This data may be obtained in many ways, including requiring the user to input this information using the input device 70.
- Other ways of obtaining this information include retrieving it from another networked or removable storage device, where the file is linked to the user identifier.
- The optometric data may include, but are not limited to, the following parameters: interpupillary distance, dissociated phoria, and fusional reserves.
- Dissociated phoria refers to the amount by which the lines of sight of the eyes deviate from a corresponding fusion stimulus during monocular viewing.
- Fusional reserve refers to the amount of vergence required to overcome the disparity introduced by placing base-in (BI) or base-out (BO) prisms in front of a person's eyes.
- Phoria, base-in, and base-out fusional reserves are usually measured at near and distant accommodation planes.
- The optometric data may include any or all of the following parameters:

Parameter | Description
---|---
IPD | Interpupillary distance
d_near | Near accommodation plane for measuring data
d_distant | Distant accommodation plane for measuring data
P_near | Near phoria
P_distant | Distant phoria
BO_near | Near base-out fusional reserve
BI_near | Near base-in fusional reserve
BO_distant | Distant base-out fusional reserve
BI_distant | Distant base-in fusional reserve
- An example of a user interface screen for providing optometric data is shown in FIG. 8.
- This user interface allows the user to enter their interpupillary distance 512 , near accommodation plane for measuring data 514 , the distant accommodation plane for measuring data 516 , their near phoria 518 , their distant phoria 520 , their near base-out fusional reserve 522 , their near base-in fusional reserve 524 , their distant base-out fusional reserve 526 and their distant base-in fusional reserve 528 .
- The user may dismiss this user interface by pressing the 'OK' button 530.
- The optometric data are then stored in the storage device for customization information 40.
- In step 540, the optometric data measured at two accommodation planes are generalized to other accommodation planes. To do this, the optometric data are converted into common units (e.g., degrees of vergence angle, diopters), and linear parameters describing the phoria, BI, and BO fusional reserve lines are calculated:

k_phoria = [arctan(IPD/(2·d_near)) − arctan(IPD/(2·d_distant))] / (P_near − P_distant)    (3)

c_phoria = arctan(IPD/(2·d_near)) − k_phoria × P_near    (4)

k_BO = [arctan(IPD/(2·d_near)) − arctan(IPD/(2·d_distant))] / (BO_near − BO_distant)    (5)

c_BO = arctan(IPD/(2·d_near)) − k_BO × BO_near    (6)

k_BI = [arctan(IPD/(2·d_near)) − arctan(IPD/(2·d_distant))] / (BI_near − BI_distant)    (7)

c_BI = arctan(IPD/(2·d_near)) − k_BI × BI_near    (8)
- The arctan(IPD/(2d)) term in Equations 3-8 represents the vergence angle for the corresponding accommodation plane. More specifically, this parameter represents how much the eyes must converge from parallelism to fixate an object. It will be referred to as the Convergence Demand (CD).
- The Convergence Demand for a given display accommodation distance will be referred to as CD_display.
- In step 545, the optometric data are calculated for the accommodation plane of the display, d_display, which can be obtained from the storage device for display attributes 30. Using the linear parameters from Equations 3-8, the phoria (P_display) and the base-out and base-in fusional reserves (BO_display, BI_display) are evaluated at this plane, and the Phoria Demand (PD_display) and Fusional Reserves (FR_display) for the accommodation plane of the display are then:

PD_display = P_display − CD_display    (12)

FR_display(crossed) = BO_display − CD_display    (13)

FR_display(uncrossed) = CD_display − BI_display    (14)
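A sketch of this computation is given below; it is illustrative only, and all observer values (along with the assumption that phoria and reserves are already expressed in degrees of vergence) are invented for the example:

```python
import math

def convergence_demand_deg(ipd_mm, distance_mm):
    """CD: vergence angle in degrees for fixating at the given distance
    (the arctan term of Equations 3-8)."""
    return math.degrees(math.atan(ipd_mm / (2.0 * distance_mm)))

def line_params(cd_near, cd_far, v_near, v_far):
    """Slope k and intercept c of the line CD = k*v + c relating a
    measurement v to convergence demand (Equations 3-8)."""
    k = (cd_near - cd_far) / (v_near - v_far)
    c = cd_near - k * v_near
    return k, c

def value_at(cd, k, c):
    """Invert the line to obtain the measurement at a plane whose
    convergence demand is cd (used to evaluate Equations 12-14)."""
    return (cd - c) / k

# Hypothetical observer: IPD 63 mm, measured at 400 mm and 6 m,
# with all measurements already converted to degrees of vergence.
ipd = 63.0
cd_near = convergence_demand_deg(ipd, 400.0)
cd_distant = convergence_demand_deg(ipd, 6000.0)
cd_display = convergence_demand_deg(ipd, 800.0)   # display plane at 0.8 m

k_p, c_p = line_params(cd_near, cd_distant, 3.0, 0.5)      # phoria line
k_bo, c_bo = line_params(cd_near, cd_distant, 9.0, 6.0)    # BO reserve line
k_bi, c_bi = line_params(cd_near, cd_distant, -4.0, -6.0)  # BI reserve line

p_display = value_at(cd_display, k_p, c_p)
bo_display = value_at(cd_display, k_bo, c_bo)
bi_display = value_at(cd_display, k_bi, c_bi)

pd_display = p_display - cd_display      # Equation 12
fr_crossed = bo_display - cd_display     # Equation 13
fr_uncrossed = cd_display - bi_display   # Equation 14
```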
- In step 550, a value for the desired Comfort Level (CL) is obtained.
- The comfort level can range from 0 (very uncomfortable) to 100 (very comfortable), and it depends on the viewing task 200, which can be obtained from the storage device for customization information 40. If the task is to "provide a fun and comfortable experience" then the comfort level could be high (e.g., 90); if the task is to "optimize detectability" then the comfort level could be moderate (e.g., 70). The comfort level is converted, via linear parameters k_CL and c_CL, into a value FP used in Equations 16 and 17:

FP = k_CL × CL − c_CL    (15)
- In step 555, a fusion comfort range (CR) is calculated for both the crossed and uncrossed conditions:

CR(crossed) = [w_reserves × FR_display(crossed) + w_phoria × P_display − ln(1/FP − 1)] / b    (16)

CR(uncrossed) = −[w_reserves × FR_display(uncrossed) + w_phoria × P_display − ln(1/FP − 1)] / b    (17)

The aim disparity ranges are then:

ARCD = 2 × CR(crossed)    (18)

ARUD = −2 × CR(uncrossed)    (19)
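Continuing the sketch, Equations 15-19 can be applied as follows. The weights w_reserves and w_phoria, the slope b, and the CL-to-FP parameters k_CL and c_CL are illustrative assumptions; the text here does not give their values:

```python
import math

def aim_ranges_from_optometrics(fr_crossed, fr_uncrossed, p_display,
                                cl, k_cl=0.009, c_cl=-0.05,
                                w_reserves=0.6, w_phoria=0.4, b=1.0):
    """Equations 15-19: map a desired comfort level CL (0-100) and the
    display-plane optometric values to aim disparity ranges."""
    fp = k_cl * cl - c_cl                  # Equation 15; FP must lie in (0, 1)
    logit = math.log(1.0 / fp - 1.0)
    cr_crossed = (w_reserves * fr_crossed +
                  w_phoria * p_display - logit) / b        # Equation 16
    cr_uncrossed = -(w_reserves * fr_uncrossed +
                     w_phoria * p_display - logit) / b     # Equation 17
    arcd = 2.0 * cr_crossed                # Equation 18
    arud = -2.0 * cr_uncrossed             # Equation 19
    return arcd, arud

# "Fun and comfortable" task: CL = 90 -> FP = 0.86 with these parameters.
arcd, arud = aim_ranges_from_optometrics(fr_crossed=2.2, fr_uncrossed=7.0,
                                         p_display=0.8, cl=90)
```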
- FIG. 9 shows a plot representing the display accommodation plane 780 and the optometric data: near base-out fusional reserve 522, near base-in fusional reserve 524, distant base-out fusional reserve 526, distant base-in fusional reserve 528, near phoria 518, distant phoria 520, and the equal convergence-accommodation demand line 770. Note that the shaded area 790 represents the range of disparities that one could reasonably expect a user to be able to fuse.
- The display attributes are obtained in step 110 from the storage device for display attributes 30.
- The attributes include, but are not limited to, the following parameters: resolution, size of display, point of optical convergence, plane of accommodative focus, and display refresh rate.
- The stereo images are obtained in step 120.
- For imagery scenes, these can be pairs of images for left and right eye views.
- For computer graphics scenes, the 3-D model of the scene and the rendering method are obtained.
- In step 130, a disparity map for a pair of images is obtained.
- The distance to each pixel may be recorded at the time of capture. If a stereo image pair has been captured without depth information, there are several existing methods in the prior art to obtain this disparity map, including object matching and phase correlation.
- For computer graphics scenes, the disparity map can be obtained through analytical calculation, given knowledge of the scene's 3-D model and the rendering method.
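If a disparity map must be computed from a captured pair, one concrete possibility (not prescribed by the patent) is block matching; the sketch below uses OpenCV, with hypothetical file names and matcher parameters:

```python
import cv2

# Load a rectified stereo pair as grayscale (hypothetical file names).
left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Simple block matcher; numDisparities must be a multiple of 16.
matcher = cv2.StereoBM_create(numDisparities=64, blockSize=15)

# compute() returns fixed-point disparities scaled by 16.
disparity = matcher.compute(left, right).astype("float32") / 16.0
```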
- FIG. 10 shows the relationship of the spatial arrangement of a pair of stereo cameras that may have been used to capture a real scene or that might have been simulated in a 3D graphical environment, the scene depth range, and the corresponding scene disparity range.
- Two cameras 800 and 810 are placed side by side with a separation distance 870 .
- The converging point of the two cameras is at 820, which is away from the cameras by a distance 880.
- The scene contains an object at the farthest distance 830 and another object at the closest distance 840.
- The convergence plane 820 has zero disparity.
- Between the closest object and the convergence point lies the Scene Range of Crossed Disparity 860 (SRCD).
- Between the convergence point and the farthest object in the scene is the Scene Range of Uncrossed Disparity 850 (SRUD).
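For illustration, the scene disparity ranges of FIG. 10 can be estimated under a simplified shifted-sensor camera model; this model and all parameter values below are assumptions, not the patent's stated geometry:

```python
def screen_disparity(focal_len, separation, conv_dist, obj_dist):
    """Image-plane disparity of a point at obj_dist for two cameras with
    the given separation, converged (zero disparity) at conv_dist.
    Positive values are crossed (in front of the convergence plane)."""
    return focal_len * separation * (1.0 / obj_dist - 1.0 / conv_dist)

f, b = 35.0, 65.0                # focal length (mm), camera separation 870 (mm)
d_conv = 2000.0                  # distance 880 to the convergence point 820 (mm)
d_close, d_far = 1200.0, 6000.0  # closest object 840, farthest object 830 (mm)

srcd = screen_disparity(f, b, d_conv, d_close)  # scene range of crossed disparity 860
srud = -screen_disparity(f, b, d_conv, d_far)   # scene range of uncrossed disparity 850
print(f"SRCD = {srcd:.2f} mm, SRUD = {srud:.2f} mm on the sensor")
```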
- In step 140, a customized disparity map is generated.
- This step will generally consist of redefining the convergence point and remapping the scene disparity range based on the aim disparity range.
- FIG. 11 shows the remapping process.
- On the aim side there are the aim maximum uncrossed disparity 890, the aim zero disparity/aim convergence point 910, the aim maximum crossed disparity 930, the Aim Range of Uncrossed Disparity (ARUD) 900, and the Aim Range of Crossed Disparity (ARCD) 920.
- In the example shown, the overall scene disparity range (SRUD+SRCD) is larger than the aim disparity range (ARUD+ARCD), and the ratio of SRUD to SRCD is not equal to the ratio of ARUD to ARCD.
- The first step is therefore to move the convergence point by an amount x such that the scene ranges have the same proportion as the aim ranges:

(SRUD − x)/(SRCD + x) = ARUD/ARCD    (20)

where x is a signed entity, and a positive value for x means that the convergence point is moved away from the cameras. SRCD, SRUD, ARUD, and ARCD are range values, and always have positive values.
- The second step is to rescale the total range of scene disparity by a factor K so that it matches the aim range of disparity:

K × (SRUD + SRCD) = ARUD + ARCD    (21)
- FIG. 12(a) shows an example of linear mapping, in which the Remapped Scene Disparity (RSD) is computed from the Original Scene Disparity (OSD) as:

RSD = K × (OSD − x)    (22)

where RSD, OSD, and x are all signed entities; positive values represent crossed disparity and negative values uncrossed disparity. In the linear remapping the disparity is compressed equally throughout the disparity range, and there is no distortion of depth in the scene.
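A self-consistent sketch of this two-step linear remapping is given below; the shift has the same magnitude as the x of Equation 20, with its sign chosen so that, under Equation 22's convention (positive = crossed), the scene extremes [−SRUD, +SRCD] map exactly onto [−ARUD, +ARCD]. The numeric values are illustrative:

```python
import numpy as np

def linear_remap(osd, srcd, srud, arcd, arud):
    """Two-step linear remapping (Equations 20-22): shift the
    convergence point, then rescale so that the scene disparity range
    [-SRUD, +SRCD] maps exactly onto the aim range [-ARUD, +ARCD].
    Positive disparities are crossed, negative are uncrossed."""
    k = (arcd + arud) / (srcd + srud)                # Equation 21
    x = (srcd * arud - srud * arcd) / (arcd + arud)  # shift solving Eq. 20
    return k * (osd - x)                             # Equation 22

osd = np.array([-4.0, 0.0, 2.0])  # max uncrossed, old convergence, max crossed
rsd = linear_remap(osd, srcd=2.0, srud=4.0, arcd=1.0, arud=1.5)
print(rsd)  # extremes land exactly on -ARUD and +ARCD: [-1.5, ~0.17, 1.0]
```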
- The remapping between the Remapped Scene Disparity (RSD) and the Original Scene Disparity (OSD) can also take other forms. For example, it can be a nonlinear mapping, as shown in FIG. 12(b), or it can take into account a region of interest (ROI), as shown in FIG. 12(c).
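As a sketch of the region-of-interest variant of FIG. 12(c) (the breakpoints, the choice to give the ROI half of the output range, and all values are assumptions), a piecewise-linear curve can preserve more depth near the ROI:

```python
import numpy as np

def roi_remap(osd, roi_lo, roi_hi, in_lo, in_hi, out_lo, out_hi):
    """Piecewise-linear remapping: disparities inside [roi_lo, roi_hi]
    keep a steeper slope (more depth) while the rest of [in_lo, in_hi]
    is compressed into what remains of [out_lo, out_hi]."""
    xp = [in_lo, roi_lo, roi_hi, in_hi]
    # Give the ROI half of the output range regardless of its input width.
    mid = 0.5 * (out_lo + out_hi)
    span = 0.25 * (out_hi - out_lo)
    fp = [out_lo, mid - span, mid + span, out_hi]
    return np.interp(osd, xp, fp)

osd = np.array([-4.0, -0.5, 0.5, 2.0])
rsd = roi_remap(osd, roi_lo=-0.5, roi_hi=0.5, in_lo=-4.0, in_hi=2.0,
                out_lo=-1.5, out_hi=1.0)
```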
- For computer graphics models, the step of generating a customized disparity map 140 may consist of defining the location, orientation, focal distance, magnification, and depth-of-field characteristics of a pair of virtual cameras that are used to emulate actual cameras as if they were capturing the scene.
- The convergence point can most easily be modified by changing the location, orientation, and focal distance of the cameras to provide a point in space where the two cameras are converged and focused.
- The scene disparity range can then be modified by changing parameters such as the separation of the cameras or the magnification of the scene to increase or decrease disparities.
- The depth of field of the cameras may be adjusted to blur near or distant objects such that the human brain is unable to interpret the disparity information from these objects.
- This disparity remapping process described in steps 130 , 135 , and 140 may apply to a single stereo image pair, an entire computer graphics scene, or an entire video sequence of stereo image pairs.
- In step 150, stereo image pairs will either be rendered, within systems employing computer graphic models, or re-rendered, for systems employing stereoscopic image data.
- The convergence point and the disparity range may be used to establish rendering parameters for computer graphics systems.
- For stereoscopic image data, the scene may be re-rendered to remap the disparities of objects in the left or right eye images to match the aim disparity map.
- Various means for performing this mapping are known that either use information from one image to generate a second image or that use information from multiple image frames to generate a second image frame.
- For example, the remapping methods described in WO2003000122A1 and U.S. Pat. No. 4,647,965 can be used in the present invention.
- Stereo images may then simply be rendered for display.
- The remapping and re-rendering process in steps 135, 140, and 150 can be a dynamic process, dependent on the input from sensor 80.
- When the sensor 80 senses a new head position, the viewing distance has changed. This results in a change in the aim disparity range, as depicted in FIG. 9.
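A toy sketch of such distance-dependent adjustment is shown below; the policy of folding viewing distance into the adaptive factor Ca, the reference distance, and all numbers are invented for illustration:

```python
def aim_range_for_distance(distance_mm, frcd=1.5, frud=1.0, cs=1.0, ct=1.0):
    """Illustrative policy only: shrink the aim range (via the adaptive
    factor Ca) as the user moves closer to the display, where a given
    screen disparity subtends a larger angle."""
    ca = min(1.0, distance_mm / 800.0)  # hypothetical reference distance
    return cs * ct * ca * frcd, cs * ct * ca * frud

# Recompute the aim range as sensor 80 reports new viewing distances.
for d in (400.0, 800.0, 1600.0):  # sensed viewing distances (mm)
    arcd, arud = aim_range_for_distance(d)
    print(f"{d:6.0f} mm -> ARCD {arcd:.2f} deg, ARUD {arud:.2f} deg")
```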
- The final step of the method is to display the images in step 160.
- The images will be displayed using the stereoscopic display device 60.
- Within the present invention, a user is defined both as an individual user of the stereoscopic display and as a cluster of individual users having common perceptual characteristics for stereoscopic viewing.
- The process applied in FIG. 5 can then be used to classify each user into a category for one of the selected groups, by determining which of the pre-selected groups has a visual performance that most closely matches the user's visual performance.
- Alternatively, the user may self-classify using even simpler means of indicating that they belong in a group, such as indicating whether they typically experience eye strain or headaches when using stereo displays.
- The process applied in FIG. 5 can also be simplified to displaying a single scene, or a series of scenes with similar depth information, that have been rendered with different rendering parameters, and then allowing the user to select the image rendering that is most useful or comfortable to view. Once this image is selected, the set of scenes with this same rendering may be presented to the user, and/or all future images may be rendered using similar rendering parameters.
PARTS LIST |
10 | image source |
20 | image processor |
30 | storage device for display attributes |
40 | storage device for customization information |
50 | rendering processor |
60 | stereoscopic display device |
100 | obtain customization information |
110 | obtain display attributes |
120 | obtain images/3D graphics |
130 | obtain scene disparity map |
135 | determine aim disparity |
140 | generate customized disparity map |
150 | re-render scene |
160 | display images |
210 | optimize detectability |
212 | provide a fun and comfortable experience |
214 | maximize depth near the pointer |
216 | new user |
218 | experienced user |
300 | background |
310 | slider marker |
320 | slider bar |
330 | test stimulus |
340 | done |
400 | obtain user identifier |
410 | has user profile |
420 | retrieve user profile |
430 | create user profile |
440 | obtain rendering intent |
450 | assign values to Cs and Ct |
460 | assign values to Ca |
470 | calculate aim disparity range |
510 | obtain optometric parameters for a set of accommodation planes |
540 | generalize optometric parameters for other accommodation planes |
545 | calculate optometric parameters for accommodation planes of display |
550 | obtain values for desired fusion comfort level |
555 | determine aim disparity range |
512 | interpupillary distance |
514 | near testing distance |
516 | distant testing distance |
518 | near phoria |
520 | distant phoria |
522 | near base-out fusional reserve |
524 | near base-in fusional reserve |
526 | distant base-out fusional reserve |
528 | distant base-in fusional reserve |
770 | equal convergence-accommodation demand line |
780 | display accommodation plane |
790 | range of disparities for comfortable fusing |
800 | left camera |
810 | right camera |
820 | convergence point/zero disparity |
830 | farthest object/maximum uncrossed disparity |
840 | closest object/maximum crossed disparity |
850 | scene range of uncrossed disparity (SRUD) |
860 | scene range of crossed disparity (SRCD) |
870 | camera separation distance |
880 | distance between camera and convergence point |
890 | aim maximum uncrossed disparity |
900 | aim range of uncrossed disparity (ARUD) |
910 | aim zero disparity/aim convergence point |
920 | aim range of crossed disparity (ARCD) |
930 | aim maximum crossed disparity |
Claims (15)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/789,272 US8094927B2 (en) | 2004-02-27 | 2004-02-27 | Stereoscopic display system with flexible rendering of disparity map according to the stereoscopic fusing capability of the observer |
Publications (2)
Publication Number | Publication Date |
---|---|
US20050190180A1 US20050190180A1 (en) | 2005-09-01 |
US8094927B2 true US8094927B2 (en) | 2012-01-10 |
Family
ID=34887237
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US10/789,272 Expired - Fee Related US8094927B2 (en) | 2004-02-27 | 2004-02-27 | Stereoscopic display system with flexible rendering of disparity map according to the stereoscopic fusing capability of the observer |
Country Status (1)
Country | Link |
---|---|
US (1) | US8094927B2 (en) |
Families Citing this family (165)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
AU2003224642A1 (en) * | 2002-03-04 | 2003-09-22 | Vigilos, Inc. | Data archival system and method |
US20050275717A1 (en) * | 2004-06-10 | 2005-12-15 | Sarnoff Corporation | Method and apparatus for testing stereo vision methods using stereo imagery data |
KR100619067B1 (en) * | 2005-01-31 | 2006-08-31 | 삼성전자주식회사 | Stereoscopic projection system |
US20060250322A1 (en) * | 2005-05-09 | 2006-11-09 | Optics 1, Inc. | Dynamic vergence and focus control for head-mounted displays |
US20070165942A1 (en) * | 2006-01-18 | 2007-07-19 | Eastman Kodak Company | Method for rectifying stereoscopic display systems |
CN101127202B (en) * | 2006-08-18 | 2011-07-27 | 鸿富锦精密工业(深圳)有限公司 | Display device parameter automatic regulation system and method |
KR100776805B1 (en) * | 2006-09-29 | 2007-11-19 | 한국전자통신연구원 | Efficient image transmission method and apparatus using stereo vision processing for intelligent service robot system |
KR100834577B1 (en) * | 2006-12-07 | 2008-06-02 | 한국전자통신연구원 | Home intelligent service robot and method capable of searching and following moving of target using stereo vision processing |
KR20080076628A (en) * | 2007-02-16 | 2008-08-20 | 삼성전자주식회사 | Image display device for improving three-dimensional effect of stereo-scopic image and method thereof |
JP5138031B2 (en) * | 2007-05-11 | 2013-02-06 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Method, apparatus and system for processing depth related information |
US8253737B1 (en) * | 2007-05-17 | 2012-08-28 | Nvidia Corporation | System, method, and computer program product for generating a disparity map |
DE102007028654A1 (en) * | 2007-06-19 | 2009-01-15 | Institut für Arbeitsphysiologie an der Universität Dortmund | Method for ergonomic representation of spatial depth of objects in virtual environments, involves presenting view of object to observer in form of overlapping images for each eye of observer by stereoscopic projection system |
US8558832B1 (en) * | 2007-06-19 | 2013-10-15 | Nvida Corporation | System, method, and computer program product for generating a plurality of two-dimensional images and depth maps for a scene at a point in time |
US8390674B2 (en) * | 2007-10-10 | 2013-03-05 | Samsung Electronics Co., Ltd. | Method and apparatus for reducing fatigue resulting from viewing three-dimensional image display, and method and apparatus for generating data stream of low visual fatigue three-dimensional image |
US8878836B2 (en) * | 2008-02-29 | 2014-11-04 | Samsung Electronics Co., Ltd. | Method and apparatus for encoding datastream including additional information on multiview image and method and apparatus for decoding datastream by using the same |
CN102106152A (en) | 2008-07-24 | 2011-06-22 | 皇家飞利浦电子股份有限公司 | Versatile 3-D picture format |
US9251621B2 (en) * | 2008-08-14 | 2016-02-02 | Reald Inc. | Point reposition depth mapping |
EP2319016A4 (en) * | 2008-08-14 | 2012-02-01 | Reald Inc | Stereoscopic depth mapping |
WO2010040146A1 (en) * | 2008-10-03 | 2010-04-08 | Real D | Optimal depth mapping |
JP5456791B2 (en) * | 2009-01-26 | 2014-04-02 | トビー・テクノロジー・アーベー | System and method for determining a person's gaze point for a spatial domain image |
US8588515B2 (en) * | 2009-01-28 | 2013-11-19 | Electronics And Telecommunications Research Institute | Method and apparatus for improving quality of depth image |
JP2012518317A (en) * | 2009-02-18 | 2012-08-09 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Transfer of 3D observer metadata |
US8279267B2 (en) * | 2009-03-09 | 2012-10-02 | Mediatek Inc. | Apparatus and method for capturing images of a scene |
EP3073735A1 (en) * | 2009-03-21 | 2016-09-28 | RealD Inc. | Point reposition depth mapping |
US7899321B2 (en) * | 2009-03-23 | 2011-03-01 | James Cameron | Stereo camera with automatic control of interocular distance |
US8406619B2 (en) * | 2009-03-23 | 2013-03-26 | Vincent Pace & James Cameron | Stereo camera with automatic control of interocular distance |
US8823775B2 (en) * | 2009-04-30 | 2014-09-02 | Board Of Regents, The University Of Texas System | Body surface imaging |
JP5409107B2 (en) * | 2009-05-13 | 2014-02-05 | 任天堂株式会社 | Display control program, information processing apparatus, display control method, and information processing system |
JP2011035592A (en) * | 2009-07-31 | 2011-02-17 | Nintendo Co Ltd | Display control program and information processing system |
US9083958B2 (en) * | 2009-08-06 | 2015-07-14 | Qualcomm Incorporated | Transforming video data in accordance with three dimensional input formats |
JP5444955B2 (en) | 2009-08-31 | 2014-03-19 | ソニー株式会社 | Stereoscopic image display system, parallax conversion device, parallax conversion method, and program |
US20110080466A1 (en) * | 2009-10-07 | 2011-04-07 | Spatial View Inc. | Automated processing of aligned and non-aligned images for creating two-view and multi-view stereoscopic 3d images |
JP5405264B2 (en) * | 2009-10-20 | 2014-02-05 | 任天堂株式会社 | Display control program, library program, information processing system, and display control method |
CN102055956B (en) * | 2009-11-02 | 2017-05-10 | 通用汽车环球科技运作公司 | Vehicle-mounted three-dimensional video system and method for monitoring vehicle surrounding environment by using same |
JP4754031B2 (en) * | 2009-11-04 | 2011-08-24 | 任天堂株式会社 | Display control program, information processing system, and program used for stereoscopic display control |
US8711204B2 (en) | 2009-11-11 | 2014-04-29 | Disney Enterprises, Inc. | Stereoscopic editing for video production, post-production and display adaptation |
US9445072B2 (en) | 2009-11-11 | 2016-09-13 | Disney Enterprises, Inc. | Synthesizing views based on image domain warping |
US10095953B2 (en) | 2009-11-11 | 2018-10-09 | Disney Enterprises, Inc. | Depth modification for display applications |
US20120229604A1 (en) * | 2009-11-18 | 2012-09-13 | Boyce Jill Macdonald | Methods And Systems For Three Dimensional Content Delivery With Flexible Disparity Selection |
KR101634388B1 (en) * | 2009-12-07 | 2016-06-28 | 엘지전자 주식회사 | Method for displaying broadcasting data and mobile terminal thereof |
US20120287233A1 (en) * | 2009-12-29 | 2012-11-15 | Haohong Wang | Personalizing 3dtv viewing experience |
EP2355526A3 (en) | 2010-01-14 | 2012-10-31 | Nintendo Co., Ltd. | Computer-readable storage medium having stored therein display control program, display control apparatus, display control system, and display control method |
JP4758520B1 (en) * | 2010-03-05 | 2011-08-31 | シャープ株式会社 | Stereoscopic image display device and operation method of stereoscopic image display device |
US8730301B2 (en) * | 2010-03-12 | 2014-05-20 | Sony Corporation | Service linkage to caption disparity data transport |
US8265477B2 (en) * | 2010-03-31 | 2012-09-11 | James Cameron | Stereo camera with preset modes |
CN102835116B (en) * | 2010-04-01 | 2015-03-25 | 诺基亚公司 | Method and apparatus for selecting a stereoscopic imaging viewpoint pair |
GB2479784B (en) * | 2010-04-23 | 2012-11-07 | Nds Ltd | Image scaling |
CN102893613B (en) * | 2010-04-28 | 2016-06-22 | 富士胶片株式会社 | Stereo-picture regenerating unit and method, stereo photographic device, stereoscopic display device |
US20110273437A1 (en) * | 2010-05-04 | 2011-11-10 | Dynamic Digital Depth Research Pty Ltd | Data Dependent Method of Configuring Stereoscopic Rendering Parameters |
US9693039B2 (en) * | 2010-05-27 | 2017-06-27 | Nintendo Co., Ltd. | Hand-held electronic device |
JP5872185B2 (en) * | 2010-05-27 | 2016-03-01 | 任天堂株式会社 | Portable electronic devices |
JP2011248723A (en) * | 2010-05-28 | 2011-12-08 | Sony Corp | Image processing device, method and program |
US9030536B2 (en) | 2010-06-04 | 2015-05-12 | At&T Intellectual Property I, Lp | Apparatus and method for presenting media content |
JP5556394B2 (en) * | 2010-06-07 | 2014-07-23 | ソニー株式会社 | Stereoscopic image display system, parallax conversion device, parallax conversion method, and program |
JP6106586B2 (en) * | 2010-06-28 | 2017-04-05 | トムソン ライセンシングThomson Licensing | Method and apparatus for customizing 3D effect of 3D content |
US8640182B2 (en) | 2010-06-30 | 2014-01-28 | At&T Intellectual Property I, L.P. | Method for detecting a viewing apparatus |
US8593574B2 (en) | 2010-06-30 | 2013-11-26 | At&T Intellectual Property I, L.P. | Apparatus and method for providing dimensional media content based on detected display capability |
US9787974B2 (en) | 2010-06-30 | 2017-10-10 | At&T Intellectual Property I, L.P. | Method and apparatus for delivering media content |
US8918831B2 (en) | 2010-07-06 | 2014-12-23 | At&T Intellectual Property I, Lp | Method and apparatus for managing a presentation of media content |
US9049426B2 (en) | 2010-07-07 | 2015-06-02 | At&T Intellectual Property I, Lp | Apparatus and method for distributing three dimensional media content |
KR101731343B1 (en) * | 2010-07-14 | 2017-04-28 | 엘지전자 주식회사 | Mobile terminal and method for controlling thereof |
US9232274B2 (en) | 2010-07-20 | 2016-01-05 | At&T Intellectual Property I, L.P. | Apparatus for adapting a presentation of media content to a requesting device |
US9560406B2 (en) | 2010-07-20 | 2017-01-31 | At&T Intellectual Property I, L.P. | Method and apparatus for adapting a presentation of media content |
US9032470B2 (en) | 2010-07-20 | 2015-05-12 | At&T Intellectual Property I, Lp | Apparatus for adapting a presentation of media content according to a position of a viewing apparatus |
US8994716B2 (en) | 2010-08-02 | 2015-03-31 | At&T Intellectual Property I, Lp | Apparatus and method for providing media content |
US8438502B2 (en) | 2010-08-25 | 2013-05-07 | At&T Intellectual Property I, L.P. | Apparatus for controlling three-dimensional images |
CN103392343B (en) * | 2010-09-20 | 2016-03-09 | 联发科技(新加坡)私人有限公司 | Display unit, display packing and Play System |
US8947511B2 (en) * | 2010-10-01 | 2015-02-03 | At&T Intellectual Property I, L.P. | Apparatus and method for presenting three-dimensional media content |
US9035939B2 (en) * | 2010-10-04 | 2015-05-19 | Qualcomm Incorporated | 3D video control system to adjust 3D video rendering based on user preferences |
KR101723235B1 (en) * | 2010-10-04 | 2017-04-04 | 삼성전자주식회사 | Apparatus and method for attenuating three dimensional effect of stereoscopic image |
JP5066244B2 (en) * | 2010-10-29 | 2012-11-07 | 株式会社東芝 | Video playback apparatus and video playback method |
US20120127265A1 (en) * | 2010-11-18 | 2012-05-24 | Yi-Shu Chang | Apparatus and method for stereoscopic effect adjustment on video display |
US20120127155A1 (en) * | 2010-11-23 | 2012-05-24 | Sharp Laboratories Of America, Inc. | 3d comfort and fusion limit empirical model |
US9088835B2 (en) * | 2010-12-17 | 2015-07-21 | Thomson Licensing | Method for adjusting depth or view of three-dimensional streaming video |
JP5010732B2 (en) * | 2010-12-28 | 2012-08-29 | 株式会社東芝 | Stereoscopic image processing apparatus and stereoscopic image processing method |
US9041774B2 (en) * | 2011-01-07 | 2015-05-26 | Sony Computer Entertainment America, LLC | Dynamic adjustment of predetermined three-dimensional video settings based on scene content |
US9183670B2 (en) | 2011-01-07 | 2015-11-10 | Sony Computer Entertainment America, LLC | Multi-sample resolving of re-projection of two-dimensional image |
WO2012094077A1 (en) * | 2011-01-07 | 2012-07-12 | Sony Computer Entertainment America Llc | Multi-sample resolving of re-projection of two-dimensional image |
US20120200670A1 (en) * | 2011-02-04 | 2012-08-09 | Nokia Corporation | Method and apparatus for a disparity limit indicator |
KR20120101881A (en) * | 2011-03-07 | 2012-09-17 | 삼성전자주식회사 | Image display method and apparatus thereof |
JP2012204852A (en) * | 2011-03-23 | 2012-10-22 | Sony Corp | Image processing apparatus and method, and program |
KR101824005B1 (en) * | 2011-04-08 | 2018-01-31 | 엘지전자 주식회사 | Mobile terminal and image depth control method thereof |
EP2710550A2 (en) * | 2011-05-17 | 2014-03-26 | Max-Planck-Gesellschaft zur Förderung der Wissenschaften e.V. | Methods and device for processing digital stereo image content |
EP2525578A1 (en) * | 2011-05-19 | 2012-11-21 | Thomson Licensing | Method for adjusting the disparity of 3D content for an observer |
FR2968108A1 (en) | 2011-06-20 | 2012-06-01 | Thomson Licensing | Method for processing video image to display stereoscopic image on target screen of e.g. TV, involves calculating target disparity information associated with part of image according to disparity budget and original disparity information |
TWI504232B (en) * | 2011-06-22 | 2015-10-11 | Realtek Semiconductor Corp | Apparatus for rendering 3d images |
US9445046B2 (en) | 2011-06-24 | 2016-09-13 | At&T Intellectual Property I, L.P. | Apparatus and method for presenting media content with telepresence |
US9030522B2 (en) | 2011-06-24 | 2015-05-12 | At&T Intellectual Property I, Lp | Apparatus and method for providing media content |
US8947497B2 (en) | 2011-06-24 | 2015-02-03 | At&T Intellectual Property I, Lp | Apparatus and method for managing telepresence sessions |
US9602766B2 (en) | 2011-06-24 | 2017-03-21 | At&T Intellectual Property I, L.P. | Apparatus and method for presenting three dimensional objects with telepresence |
US8587635B2 (en) | 2011-07-15 | 2013-11-19 | At&T Intellectual Property I, L.P. | Apparatus and method for providing media services with telepresence |
KR20130010543A (en) * | 2011-07-19 | 2013-01-29 | 삼성전자주식회사 | Display apparatus and control method thereof |
WO2013018004A1 (en) * | 2011-07-29 | 2013-02-07 | Sony Mobile Communications Ab | Gaze controlled focusing of stereoscopic content |
CA2750287C (en) | 2011-08-29 | 2012-07-03 | Microsoft Corporation | Gaze detection in a see-through, near-eye, mixed reality display |
CN103033936A (en) | 2011-08-30 | 2013-04-10 | 微软公司 | Head mounted display with iris scan profiling |
US9213163B2 (en) | 2011-08-30 | 2015-12-15 | Microsoft Technology Licensing, Llc | Aligning inter-pupillary distance in a near-eye display system |
US9025252B2 (en) | 2011-08-30 | 2015-05-05 | Microsoft Technology Licensing, Llc | Adjustment of a mixed reality display for inter-pupillary distance alignment |
JP5032694B1 (en) * | 2011-08-31 | 2012-09-26 | 株式会社東芝 | Video processing apparatus and video processing method |
US9554114B2 (en) * | 2011-09-15 | 2017-01-24 | Broadcom Corporation | Depth range adjustment for three-dimensional images |
US20130113879A1 (en) * | 2011-11-04 | 2013-05-09 | Comcast Cable Communications, Llc | Multi-Depth Adaptation For Video Content |
JP6215228B2 (en) | 2012-01-04 | 2017-10-18 | Thomson Licensing | 3D image sequence processing |
US9536378B2 (en) | 2012-01-13 | 2017-01-03 | Igt Canada Solutions Ulc | Systems and methods for recommending games to registered players using distributed storage |
US9129489B2 (en) | 2012-01-13 | 2015-09-08 | Gtech Canada Ulc | Remote gaming method where venue's system suggests different games to remote player using a mobile gaming device |
US9011240B2 (en) | 2012-01-13 | 2015-04-21 | Spielo International Canada Ulc | Remote gaming system allowing adjustment of original 3D images for a mobile gaming device |
US9159189B2 (en) | 2012-01-13 | 2015-10-13 | Gtech Canada Ulc | Mobile gaming device carrying out uninterrupted game despite communications link disruption |
US9208641B2 (en) | 2012-01-13 | 2015-12-08 | Igt Canada Solutions Ulc | Remote gaming method allowing temporary inactivation without terminating playing session due to game inactivity |
US9269222B2 (en) | 2012-01-13 | 2016-02-23 | Igt Canada Solutions Ulc | Remote gaming system using separate terminal to set up remote play with a gaming terminal |
US9079098B2 (en) | 2012-01-13 | 2015-07-14 | Gtech Canada Ulc | Automated discovery of gaming preferences |
US9295908B2 (en) | 2012-01-13 | 2016-03-29 | Igt Canada Solutions Ulc | Systems and methods for remote gaming using game recommender |
US9558625B2 (en) | 2012-01-13 | 2017-01-31 | Igt Canada Solutions Ulc | Systems and methods for recommending games to anonymous players using distributed storage |
US9123200B2 (en) | 2012-01-13 | 2015-09-01 | Gtech Canada Ulc | Remote gaming using game recommender system and generic mobile gaming device |
US9280868B2 (en) | 2012-01-13 | 2016-03-08 | Igt Canada Solutions Ulc | Systems and methods for carrying out an uninterrupted game |
WO2013109282A1 (en) * | 2012-01-20 | 2013-07-25 | Empire Technology Development Llc | Mirror array display system |
US20130202191A1 (en) * | 2012-02-02 | 2013-08-08 | Himax Technologies Limited | Multi-view image generating method and apparatus using the same |
JP6024159B2 (en) * | 2012-03-30 | 2016-11-09 | 株式会社ニコン | Information presenting apparatus, information presenting system, server, information presenting method and program |
US9188731B2 (en) | 2012-05-18 | 2015-11-17 | Reald Inc. | Directional backlight |
EP2850488A4 (en) | 2012-05-18 | 2016-03-02 | Reald Inc | Directional backlight |
EP2680593A1 (en) * | 2012-06-26 | 2014-01-01 | Thomson Licensing | Method of adapting 3D content to an observer wearing prescription glasses |
US20140002344A1 (en) * | 2012-06-29 | 2014-01-02 | Mukund Pai | Dynamic display adjustment |
EP2699006A1 (en) * | 2012-08-16 | 2014-02-19 | ESSILOR INTERNATIONAL (Compagnie Générale d'Optique) | Pictures positioning on display elements |
WO2014029428A1 (en) * | 2012-08-22 | 2014-02-27 | Ultra-D Coöperatief U.A. | Three-dimensional display device and method for processing a depth-related signal |
US20140063206A1 (en) * | 2012-08-28 | 2014-03-06 | Himax Technologies Limited | System and method of viewer centric depth adjustment |
US9454879B2 (en) | 2012-09-18 | 2016-09-27 | Igt Canada Solutions Ulc | Enhancements to game components in gaming systems |
US9754442B2 (en) | 2012-09-18 | 2017-09-05 | Igt Canada Solutions Ulc | 3D enhanced gaming machine with foreground and background game surfaces |
US10659763B2 (en) | 2012-10-09 | 2020-05-19 | Cameron Pace Group Llc | Stereo camera system with wide and narrow interocular distance cameras |
KR20140063272A (en) * | 2012-11-16 | 2014-05-27 | 엘지전자 주식회사 | Image display apparatus and method for operating the same |
US9265458B2 (en) | 2012-12-04 | 2016-02-23 | Sync-Think, Inc. | Application of smooth pursuit cognitive testing paradigms to clinical drug development |
CA2861289A1 (en) | 2012-12-28 | 2014-06-28 | Gtech Canada Ulc | Multi-faceted game component and gaming surface in a 3d enhanced gaming machine |
US9319662B2 (en) | 2013-01-16 | 2016-04-19 | Elwha Llc | Systems and methods for differentiating between dominant and weak eyes in 3D display technology |
TWI622811B (en) | 2013-02-22 | 2018-05-01 | 瑞爾D斯帕克有限責任公司 | Directional backlight |
US9380976B2 (en) | 2013-03-11 | 2016-07-05 | Sync-Think, Inc. | Optical neuroinformatics |
DE102013204301A1 (en) * | 2013-03-12 | 2014-09-18 | C.R.S. Iimotion Gmbh | Depth adjustment of stereoscopic images |
US9600068B2 (en) | 2013-03-13 | 2017-03-21 | Sony Interactive Entertainment Inc. | Digital inter-pupillary distance adjustment |
US9571812B2 (en) | 2013-04-12 | 2017-02-14 | Disney Enterprises, Inc. | Signaling warp maps using a high efficiency video coding (HEVC) extension for 3D video coding |
KR20150006957A (en) * | 2013-07-09 | 2015-01-20 | 한국전자통신연구원 | Method and apparatus for transmitting 3D video streaming service |
WO2015057588A1 (en) | 2013-10-14 | 2015-04-23 | Reald Inc. | Light input for directional backlight |
WO2015057625A1 (en) | 2013-10-14 | 2015-04-23 | Reald Inc. | Control of directional display |
EP2908519A1 (en) * | 2014-02-14 | 2015-08-19 | Thomson Licensing | Method for displaying a 3D content on a multi-view display device, corresponding multi-view display device and computer program product |
AU2015240505B2 (en) * | 2014-04-03 | 2019-04-18 | Evolv Technologies, Inc. | Partitioning for radar systems |
US10347073B2 (en) | 2014-05-30 | 2019-07-09 | Igt Canada Solutions Ulc | Systems and methods for three dimensional games in gaming systems |
US9824524B2 (en) | 2014-05-30 | 2017-11-21 | Igt Canada Solutions Ulc | Three dimensional enhancements to game components in gaming systems |
US9699436B2 (en) | 2014-09-16 | 2017-07-04 | Microsoft Technology Licensing, Llc | Display with eye-discomfort reduction |
US9835792B2 (en) | 2014-10-08 | 2017-12-05 | Reald Spark, Llc | Directional backlight |
WO2016105541A1 (en) * | 2014-12-24 | 2016-06-30 | Reald Inc. | Adjustment of perceived roundness in stereoscopic image of a head |
CN105867597B (en) * | 2014-12-31 | 2020-01-10 | 深圳超多维科技有限公司 | 3D interaction method and 3D display equipment |
US9897806B2 (en) | 2015-02-10 | 2018-02-20 | Dreamworks Animation L.L.C. | Generation of three-dimensional imagery to supplement existing content |
US9721385B2 (en) * | 2015-02-10 | 2017-08-01 | Dreamworks Animation Llc | Generation of three-dimensional imagery from a two-dimensional image using a depth map |
RU2596062C1 (en) | 2015-03-20 | 2016-08-27 | Автономная Некоммерческая Образовательная Организация Высшего Профессионального Образования "Сколковский Институт Науки И Технологий" | Method for correction of eye image using machine learning and method of machine learning |
EP3779527A1 (en) | 2015-04-13 | 2021-02-17 | RealD Spark, LLC | Wide angle imaging directional backlights |
CN108463787B (en) | 2016-01-05 | 2021-11-30 | 瑞尔D斯帕克有限责任公司 | Gaze correction of multi-perspective images |
CN114554177A (en) | 2016-05-19 | 2022-05-27 | 瑞尔D斯帕克有限责任公司 | Wide-angle imaging directional backlight source |
EP4124795B1 (en) | 2016-05-23 | 2024-04-10 | RealD Spark, LLC | Wide angle imaging directional backlights |
US10097809B2 (en) * | 2016-11-11 | 2018-10-09 | Rovi Guides, Inc. | Systems and methods for adjusting display settings to reduce eye strain of multiple viewers |
EP3566094B1 (en) | 2017-01-04 | 2023-12-06 | RealD Spark, LLC | Optical stack for imaging directional backlights |
US10408992B2 (en) | 2017-04-03 | 2019-09-10 | Reald Spark, Llc | Segmented imaging directional backlights |
EP4293574A3 (en) | 2017-08-08 | 2024-04-03 | RealD Spark, LLC | Adjusting a digital representation of a head region |
EP3707554B1 (en) | 2017-11-06 | 2023-09-13 | RealD Spark, LLC | Privacy display apparatus |
CN111869205B (en) | 2018-01-19 | 2022-06-10 | Pcms控股公司 | Multiple focal planes with varying positions |
EP3743766A4 (en) | 2018-01-25 | 2021-12-22 | RealD Spark, LLC | Touch screen for privacy display |
US11477434B2 (en) | 2018-03-23 | 2022-10-18 | Pcms Holdings, Inc. | Multifocal plane based method to produce stereoscopic viewpoints in a DIBR system (MFP-DIBR) |
CN118158376A (en) | 2018-07-05 | 2024-06-07 | Pcms控股公司 | Method and system for near-eye focal plane overlay |
US20200168045A1 (en) | 2018-11-28 | 2020-05-28 | Igt | Dynamic game flow modification in electronic wagering games |
DE102019120998A1 (en) * | 2019-08-02 | 2021-02-04 | Psholix Ag | Method and apparatus for adaptive disparity control |
EP4214441A4 (en) | 2020-09-16 | 2024-08-28 | Reald Spark Llc | Vehicle external illumination device |
US11966049B2 (en) | 2022-08-02 | 2024-04-23 | Reald Spark, Llc | Pupil tracking near-eye display |
2004-02-27: US application US 10/789,272 filed; granted as US8094927B2 (status: not active, Expired - Fee Related)
Patent Citations (39)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4578301A (en) | 1983-08-23 | 1986-03-25 | Lambeg Industrial Research Association | Fabric reinforced cement structure |
US4647965A (en) | 1983-11-02 | 1987-03-03 | Imsand Donald J | Picture processing system for three dimensional movies and video systems |
US5114653A (en) | 1985-11-07 | 1992-05-19 | Akzo N.V. | Processes of manufacturing prestressed concrete |
US4894270A (en) | 1986-12-04 | 1990-01-16 | Nicholls Robert L | Fold and bond for constructing cement laminate structural shapes |
US4778718A (en) | 1987-03-26 | 1988-10-18 | University Of Delaware | Fabric-reinforced cementitious sheet-like structures and their production |
US4899498A (en) | 1988-10-03 | 1990-02-13 | Grieb Donald L | Foam cement highway sound barrier |
US5140415A (en) | 1990-03-19 | 1992-08-18 | L'etat Francais Represente Par Le Ministre Des P.T.E. (Centre National D'etudes Des Telecommunications - (Cnet) | Method of adjusting stereoscopic convergence in a stereoscopic imaging system |
US5350554A (en) | 1991-02-01 | 1994-09-27 | Glascrete, Inc. | Method for production of reinforced cementitious panels |
US5726704A (en) | 1993-08-26 | 1998-03-10 | Matsushita Electric Industrial Co., Ltd. | Stereoscopic image pickup and display apparatus |
US6528151B1 (en) | 1994-02-01 | 2003-03-04 | Northwestern University | Extruded fiber reinforced cement matrix composites and method of making same |
US5891374A (en) | 1994-02-01 | 1999-04-06 | Northwestern University | Method of making extruded fiber reinforced cement matrix composites |
US5737012A (en) | 1994-12-01 | 1998-04-07 | Olympus Optical Co., Ltd. | Head mounted image display apparatus and image forming apparatus related thereto |
JPH08166559A (en) | 1994-12-13 | 1996-06-25 | Nippon Telegr & Teleph Corp <Ntt> | Head mounted display device |
US5705233A (en) | 1995-03-24 | 1998-01-06 | Wisconsin Alumni Research Foundation | Fiber-reinforced cementitious composites |
US6005607A (en) | 1995-06-29 | 1999-12-21 | Matsushita Electric Industrial Co., Ltd. | Stereoscopic computer graphics image generating apparatus and stereoscopic TV apparatus |
US6175379B1 (en) * | 1995-06-29 | 2001-01-16 | Matsushita Electric Industrial Co., Ltd. | Stereoscopic CG image generating apparatus and stereoscopic TV apparatus |
US6067192A (en) | 1995-12-28 | 2000-05-23 | Lichtenfield; Louis | Portable viewing apparatus having multiple interchargeable optical modules |
JPH09201472A (en) | 1996-01-24 | 1997-08-05 | Taito Corp | Device for adjusting incident position and convergence point of binocular image in game machine for binocular stereoscopic image |
US5902528A (en) | 1997-06-13 | 1999-05-11 | Spragg; Peter H. | Method of making an article from a lightweight cementitious composition |
US6358575B1 (en) | 1997-06-13 | 2002-03-19 | Peter H. Spragg | Method of making an article from a lightweight cementitious composition and a decorative article made therefrom |
US6069192A (en) | 1998-10-13 | 2000-05-30 | Poly-Med, Inc. | Low fiber-loading composites with hybridized fiber/matrix interface |
US6110588A (en) | 1999-02-05 | 2000-08-29 | 3M Innovative Properties Company | Microfibers and method of making |
US6432347B1 (en) | 1999-02-05 | 2002-08-13 | 3M Innovative Properties Company | Process of making a microfibrillated article |
US20010010116A1 (en) | 1999-03-02 | 2001-08-02 | Lubker John W. | Methods for using a support backer board system for siding |
US6418610B2 (en) | 1999-03-02 | 2002-07-16 | Pactiv Corporation | Methods for using a support backer board system for siding |
US6263574B1 (en) | 1999-03-02 | 2001-07-24 | Tenneco Packaging Inc. | Methods for using a support backer board system for siding |
US6586073B2 (en) | 1999-05-07 | 2003-07-01 | 3M Innovative Properties Company | Films having a microfibrillated surface and method of making |
US20020034610A1 (en) | 1999-05-07 | 2002-03-21 | 3M Innovative Properties Company | Films having a microfibrillated surface and method of making |
US7224357B2 (en) * | 2000-05-03 | 2007-05-29 | University Of Southern California | Three-dimensional modeling based on photographic images |
US20020024516A1 (en) * | 2000-05-03 | 2002-02-28 | Qian Chen | Three-dimensional modeling and based on photographic images |
US6468451B1 (en) | 2000-06-23 | 2002-10-22 | 3M Innovative Properties Company | Method of making a fibrillated article |
US20030176593A1 (en) | 2000-08-09 | 2003-09-18 | Bertrand Bordes | Building material comprising a fibrous or filament reinforcement |
US6420024B1 (en) | 2000-12-21 | 2002-07-16 | 3M Innovative Properties Company | Charged microfibers, microfibrillated articles and use thereof |
WO2003000122A1 (en) | 2001-06-21 | 2003-01-03 | Dynamic Digital Depth Research Pty Ltd | Image processing system |
US20030113018A1 (en) * | 2001-07-18 | 2003-06-19 | Nefian Ara Victor | Dynamic gesture recognition from stereo sequences |
US20030044592A1 (en) | 2001-08-22 | 2003-03-06 | 3M Innovative Properties Company | Toughened cementitious composites |
US20030197779A1 (en) * | 2002-04-23 | 2003-10-23 | Zhengyou Zhang | Video-teleconferencing system with eye-gaze correction |
US20040189720A1 (en) * | 2003-03-25 | 2004-09-30 | Wilson Andrew D. | Architecture for controlling a computer using hand gestures |
US20040193413A1 (en) * | 2003-03-25 | 2004-09-30 | Wilson Andrew D. | Architecture for controlling a computer using hand gestures |
Non-Patent Citations (5)
Title |
---|
"Image Distortions in Stereoscopic Video Systems" by Andrew Woods, Tom Docherty, and Rolf Koch. Proceedings of the SPIE, vol. 1915, Stereoscopic Displays and Applications IV, pp. 1-13, 1993. |
Binocular vision and stereopsis by Ian P. Howard and Brian J. Rogers, 1995. Chapter entitled "The limits of stereoscopic vision", p. 159. |
Dhond et al., "Stereo matching in the presence of narrow occluding objects using dynamic disparity search," IEEE, vol. 17, pp. 719-724. * |
Drivers for Windows "NVIDIA 3D Stereo User's Guide" (for Detonator XP), Revision 2.0. NVIDIA Corporation, Nov. 8, 2001. |
Woods et al., "Image Distortion in Stereoscopic Video System," 1993, SPIE, vol. 1915, pp. 1-13. * |
Cited By (41)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20100066816A1 (en) * | 2008-09-18 | 2010-03-18 | Kane Paul J | Stereoscopic display system with flexible rendering for multiple simultaneous observers |
US8217996B2 (en) * | 2008-09-18 | 2012-07-10 | Eastman Kodak Company | Stereoscopic display system with flexible rendering for multiple simultaneous observers |
US20100177171A1 (en) * | 2009-01-09 | 2010-07-15 | Marcus Michael A | Dual-view stereoscopic display using linear modulator arrays |
US8233035B2 (en) * | 2009-01-09 | 2012-07-31 | Eastman Kodak Company | Dual-view stereoscopic display using linear modulator arrays |
US20110074933A1 (en) * | 2009-09-28 | 2011-03-31 | Sharp Laboratories Of America, Inc. | Reduction of viewer discomfort for stereoscopic images |
US8284235B2 (en) * | 2009-09-28 | 2012-10-09 | Sharp Laboratories Of America, Inc. | Reduction of viewer discomfort for stereoscopic images |
US8675050B2 (en) * | 2010-01-13 | 2014-03-18 | Sony Corporation | Data structure, recording apparatus and method, playback apparatus and method, and program |
US20110169926A1 (en) * | 2010-01-13 | 2011-07-14 | Tatsumi Sakaguchi | Data structure, recording apparatus and method, playback apparatus and method, and program |
US20110304618A1 (en) * | 2010-06-14 | 2011-12-15 | Qualcomm Incorporated | Calculating disparity for three-dimensional images |
US20120062548A1 (en) * | 2010-09-14 | 2012-03-15 | Sharp Laboratories Of America, Inc. | Reducing viewing discomfort |
US20120099195A1 (en) * | 2010-10-21 | 2012-04-26 | Myung-Ryul Choi | Eyewear, three-dimensional image display system employing the same, and method of allowing viewing of image |
US8743112B2 (en) * | 2010-11-05 | 2014-06-03 | Lg Display Co., Ltd. | Image display device and driving method for thereof |
US20120113107A1 (en) * | 2010-11-05 | 2012-05-10 | Jun Woo Jang | Image display device and driving method for thereof |
US20140267001A1 (en) * | 2013-03-12 | 2014-09-18 | Joshua J. Ratcliff | Techniques for automated evaluation of 3d visual content |
WO2014163865A1 (en) * | 2013-03-12 | 2014-10-09 | Intel Corporation | Techniques for automated evaluation of 3d visual content |
US9182817B2 (en) * | 2013-03-12 | 2015-11-10 | Intel Corporation | Techniques for automated evaluation of 3D visual content |
US20160029012A1 (en) * | 2013-04-05 | 2016-01-28 | Koninklijke Philips N.V. | Re-targeting a three-dimensional image signal |
US10409079B2 (en) | 2014-01-06 | 2019-09-10 | Avegant Corp. | Apparatus, system, and method for displaying an image using a plate |
US10303242B2 (en) | 2014-01-06 | 2019-05-28 | Avegant Corp. | Media chair apparatus, system, and method |
TWI566576B (en) * | 2014-06-03 | 2017-01-11 | 宏碁股份有限公司 | Stereoscopic view synthesis method and apparatus using the same |
US9918066B2 (en) | 2014-12-23 | 2018-03-13 | Elbit Systems Ltd. | Methods and systems for producing a magnified 3D image |
CN105872528A (en) * | 2014-12-31 | 2016-08-17 | 深圳创锐思科技有限公司 | 3D display method, device and 3D display device |
CN105872528B (en) * | 2014-12-31 | 2019-01-15 | 深圳超多维科技有限公司 | 3D display method, apparatus and 3D display equipment |
US9823474B2 (en) | 2015-04-02 | 2017-11-21 | Avegant Corp. | System, apparatus, and method for displaying an image with a wider field of view |
US9995857B2 (en) | 2015-04-03 | 2018-06-12 | Avegant Corp. | System, apparatus, and method for displaying an image using focal modulation |
US20170365101A1 (en) * | 2016-06-20 | 2017-12-21 | Magic Leap, Inc. | Augmented reality display system for evaluation and modification of neurological conditions, including visual processing and perception conditions |
US11049326B2 (en) * | 2016-06-20 | 2021-06-29 | Magic Leap, Inc. | Augmented reality display system for evaluation and modification of neurological conditions, including visual processing and perception conditions |
US11734896B2 (en) * | 2016-06-20 | 2023-08-22 | Magic Leap, Inc. | Augmented reality display system for evaluation and modification of neurological conditions, including visual processing and perception conditions |
US20210398357A1 (en) * | 2016-06-20 | 2021-12-23 | Magic Leap, Inc. | Augmented reality display system for evaluation and modification of neurological conditions, including visual processing and perception conditions |
US10332315B2 (en) * | 2016-06-20 | 2019-06-25 | Magic Leap, Inc. | Augmented reality display system for evaluation and modification of neurological conditions, including visual processing and perception conditions |
US20190287309A1 (en) * | 2016-06-20 | 2019-09-19 | Magic Leap, Inc. | Augmented reality display system for evaluation and modification of neurological conditions, including visual processing and perception conditions |
US10719992B2 (en) * | 2016-06-20 | 2020-07-21 | Magic Leap, Inc. | Augmented reality display system for evaluation and modification of neurological conditions, including visual processing and perception conditions |
US10354427B2 (en) * | 2016-10-19 | 2019-07-16 | Samsung Display Co., Ltd. | Method of driving head mounted display and head mounted display performing the same |
US10506219B2 (en) * | 2016-11-29 | 2019-12-10 | Samsung Electronics Co., Ltd. | Method and apparatus for determining interpupillary distance (IPD) |
US10979696B2 (en) * | 2016-11-29 | 2021-04-13 | Samsung Electronics Co., Ltd. | Method and apparatus for determining interpupillary distance (IPD) |
US20180152698A1 (en) * | 2016-11-29 | 2018-05-31 | Samsung Electronics Co., Ltd. | Method and apparatus for determining interpupillary distance (ipd) |
US10341635B2 (en) | 2017-05-17 | 2019-07-02 | National Chiao Tung University | Stereoscopic imaging method and device |
US20210192752A1 (en) * | 2019-12-23 | 2021-06-24 | Texas Instruments Incorporated | Cascaded architecture for disparity and motion prediction with block matching and convolutional neural network (cnn) |
US11694341B2 (en) * | 2019-12-23 | 2023-07-04 | Texas Instruments Incorporated | Cascaded architecture for disparity and motion prediction with block matching and convolutional neural network (CNN) |
US20220207776A1 (en) * | 2020-01-10 | 2022-06-30 | Dalian University Of Technology | Disparity image fusion method for multiband stereo cameras |
US11948333B2 (en) * | 2020-01-10 | 2024-04-02 | Dalian University Of Technology | Disparity image fusion method for multiband stereo cameras |
Also Published As
Publication number | Publication date |
---|---|
US20050190180A1 (en) | 2005-09-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8094927B2 (en) | Stereoscopic display system with flexible rendering of disparity map according to the stereoscopic fusing capability of the observer | |
JP5515301B2 (en) | Image processing apparatus, program, image processing method, recording method, and recording medium | |
US7428001B2 (en) | Materials and methods for simulating focal shifts in viewers using large depth of focus displays | |
US8913790B2 (en) | System and method for analyzing three-dimensional (3D) media content | |
CN109901710B (en) | Media file processing method and device, storage medium and terminal | |
US20060028543A1 (en) | Method and apparatus for controlling convergence distance for observation of 3D image | |
CN103329165B (en) | The pixel depth value of the virtual objects that the user in scaling three-dimensional scenic controls | |
US8692870B2 (en) | Adaptive adjustment of depth cues in a stereo telepresence system | |
CN102884803A (en) | Image processing device, image processing method, program, and integrated circuit | |
JPH11155152A (en) | Method and system for three-dimensional shape information input, and image input device thereof | |
CN108141578A (en) | Camera is presented | |
Hwang et al. | Instability of the perceived world while watching 3D stereoscopic imagery: a likely source of motion sickness symptoms | |
JP6024159B2 (en) | Information presenting apparatus, information presenting system, server, information presenting method and program | |
WO2018010677A1 (en) | Information processing method, wearable electric device, processing apparatus, and system | |
JP5096643B1 (en) | Congestion capability determination device and method | |
CN110433062B (en) | Visual function training system based on dynamic video images | |
Celikcan et al. | Attention-aware disparity control in interactive environments | |
Terzic et al. | Causes of discomfort in stereoscopic content: a review | |
US12081722B2 (en) | Stereo image generation method and electronic apparatus using the same | |
EP3419287A1 (en) | An apparatus and a method for displaying a 3d image | |
US11517195B2 (en) | Panum's area measurement method, apparatus, and wearable display device | |
JP2018191079A (en) | Multifocal visual output method, multifocal visual output apparatus | |
JP5521991B2 (en) | Stereoscopic image display device | |
JP2024015651A (en) | Information processing device, information processing method, and program | |
Gurrieri | Improvements in the visualization of stereoscopic 3D imagery |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: EASTMAN KODAK COMPANY, NEW YORK
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JIN, ELAINE W.;MILLER, MICHAEL E.;ENDRIKHOVSKI, SERGUEI;AND OTHERS;REEL/FRAME:015040/0289;SIGNING DATES FROM 20040225 TO 20040227
Owner name: EASTMAN KODAK COMPANY, NEW YORK
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:JIN, ELAINE W.;MILLER, MICHAEL E.;ENDRIKHOVSKI, SERGUEI;AND OTHERS;SIGNING DATES FROM 20040225 TO 20040227;REEL/FRAME:015040/0289 |
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
AS | Assignment |
Owner name: CITICORP NORTH AMERICA, INC., AS AGENT, NEW YORK
Free format text: SECURITY INTEREST;ASSIGNORS:EASTMAN KODAK COMPANY;PAKON, INC.;REEL/FRAME:028201/0420
Effective date: 20120215 |
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
AS | Assignment |
Owner names: LASER-PACIFIC MEDIA CORPORATION, NEW YORK; PAKON, INC., INDIANA; KODAK PORTUGUESA LIMITED, NEW YORK; EASTMAN KODAK INTERNATIONAL CAPITAL COMPANY, INC.; NPEC INC., NEW YORK; FPC INC., CALIFORNIA; KODAK REALTY, INC., NEW YORK; CREO MANUFACTURING AMERICA LLC, WYOMING; KODAK AVIATION LEASING LLC, NEW YORK; EASTMAN KODAK COMPANY, NEW YORK; KODAK PHILIPPINES, LTD., NEW YORK; KODAK AMERICAS, LTD., NEW YORK; QUALEX INC., NORTH CAROLINA; FAR EAST DEVELOPMENT LTD., NEW YORK; KODAK IMAGING NETWORK, INC., CALIFORNIA; KODAK (NEAR EAST), INC., NEW YORK
Free format text (identical for each owner): PATENT RELEASE;ASSIGNORS:CITICORP NORTH AMERICA, INC.;WILMINGTON TRUST, NATIONAL ASSOCIATION;REEL/FRAME:029913/0001
Effective date: 20130201 |
AS | Assignment |
Owner name: INTELLECTUAL VENTURES FUND 83 LLC, NEVADA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:EASTMAN KODAK COMPANY;REEL/FRAME:030221/0362
Effective date: 20130201 |
FPAY | Fee payment |
Year of fee payment: 4 |
AS | Assignment |
Owner name: MONUMENT PEAK VENTURES, LLC, TEXAS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:INTELLECTUAL VENTURES FUND 83 LLC;REEL/FRAME:041941/0079
Effective date: 20170215 |
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20200110 |
AS | Assignment |
Owner name: MONUMENT PEAK VENTURES, LLC, TEXAS
Free format text: RELEASE BY SECURED PARTY;ASSIGNOR:INTELLECTUAL VENTURES FUND 83 LLC;REEL/FRAME:064599/0304
Effective date: 20230728 |