
CN114092318A - 3D display method and device of 2D image and storage medium - Google Patents

3D display method and device of 2D image and storage medium

Info

Publication number
CN114092318A
CN114092318A
Authority
CN
China
Prior art keywords
image
offset
target
determining
target user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111305761.4A
Other languages
Chinese (zh)
Inventor
汤仕兵
台正
王婷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xiaodou Vision Chongqing Medical Technology Co ltd
Original Assignee
Shenzhen Xiaodou Vision Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Xiaodou Vision Technology Co ltd filed Critical Shenzhen Xiaodou Vision Technology Co ltd
Priority to CN202111305761.4A priority Critical patent/CN114092318A/en
Publication of CN114092318A publication Critical patent/CN114092318A/en
Pending legal-status Critical Current

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 - Geometric image transformations in the plane of the image
    • G06T3/08 - Projecting images onto non-planar surfaces, e.g. geodetic screens
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 - Pattern recognition
    • G06F18/20 - Analysing
    • G06F18/22 - Matching criteria, e.g. proximity measures
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

The present application provides a 3D display method of a 2D image and related devices that can periodically adjust the offset between the left and right pictures of the 3D picture to be generated. The method includes: determining a first image and a second image corresponding to a target image, wherein the target image is a 2D image to be displayed in 3D; determining a reference offset interval corresponding to a target user who views the 2D image after 3D display at a preset distance; periodically adjusting the target offset between the first image and the second image within the reference offset interval; and interleaving the first image and the second image after the target offset is adjusted, so as to display the 2D image in 3D.

Description

3D display method and device of 2D image and storage medium
Technical Field
The present disclosure relates to the field of visual training, and in particular, to a method and an apparatus for 3D display of 2D images, and a storage medium.
Background
Vision training is a form of eye-and-brain training that retrains the relationship between the brain and the eyes. By repeatedly stimulating and exercising the brain's visual-cognitive system, it improves eye movement, focusing, fixation, binocular coordination and visual processing abilities, and supports visual functions such as the treatment of amblyopia.
Binocular fusion is a visual phenomenon: when the two eyes observe the same object, an image of the object is formed on each retina and transmitted through the optic nerve on each side to the same area of the visual cortex, where the two images are fused into a single, complete percept of the object. In general, binocular fusion occurs easily when the two views have similar or related content, brightness or color; otherwise the process tends towards binocular diplopia, alternating suppression or unilateral suppression.
Binocular vision refers to both eyes observing an object at the same time. Although each eye forms its own retinal image, normal binocular vision fuses the two views into a single percept. If the two eyes observe a planar object, the two retinal images fall on corresponding points of the two retinas and are essentially identical; if the two eyes observe a three-dimensional object, the two retinal images are not identical, binocular parallax is formed, and a perception of depth is produced.
However, when a 2D image is converted into a 3D image, because of the viewing angle the user often receives the view with only one eye while the other eye receives no view, so the user cannot see a clear 3D image and the viewing effect is degraded.
Disclosure of Invention
The present application provides a 3D display method and apparatus for a 2D image, and a storage medium, which can periodically adjust the target offset between a first image and a second image within a reference offset interval, ensuring that both eyes of a user receive a view and can therefore observe a clear 3D image.
A first aspect of the present application provides a 3D display method of a 2D image, including:
determining a first image and a second image corresponding to a target image, wherein the target image is a 2D image to be subjected to 3D display;
determining a reference offset interval corresponding to a target user who watches the 2D image after 3D display at a preset distance;
periodically adjusting the target offset between the first image and the second image in the reference offset interval;
and interleaving the first image and the second image after the target offset is adjusted so as to display the 2D image in a 3D mode.
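As a rough orientation only, the four steps above could be organized as in the following Python sketch. The function names, the callable-based structure and the use of numpy are assumptions made for this illustration and are not taken from the patent; equally hypothetical concrete choices for f, shift_pair and interleave are sketched next to the corresponding steps of the embodiment below.

    import numpy as np
    from typing import Callable, Tuple

    def display_2d_as_3d(target: np.ndarray,
                         reference_interval: Tuple[float, float],
                         f: Callable[[float], float],
                         shift_pair: Callable[[np.ndarray, np.ndarray, float],
                                              Tuple[np.ndarray, np.ndarray]],
                         interleave: Callable[[np.ndarray, np.ndarray], np.ndarray],
                         t: float) -> np.ndarray:
        # Step 1: the first and second images start as copies of the 2D target image.
        first, second = target.copy(), target.copy()
        # Step 2: reference_interval = (d_min, d_max) is the per-user interval
        # calibrated for the preset viewing distance (determined beforehand).
        d_min, d_max = reference_interval
        # Step 3: take the current target offset from the periodic function f(t)
        # and keep it inside the reference offset interval.
        delta_d = float(np.clip(f(t), d_min, d_max))
        # Step 4: shift the pair by the target offset and interleave for 3D display.
        first, second = shift_pair(first, second, delta_d)
        return interleave(first, second)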
In one possible design, the determining a reference offset interval corresponding to a target user viewing the 3D display image at a preset distance includes:
determining an initial visual parameter corresponding to the target user;
adjusting the offset between a third image and a fourth image corresponding to the first test 3D image;
recording actual visual parameters corresponding to the target user when the shifted third image and the shifted fourth image are displayed after blanking;
determining a first maximum offset and a first minimum offset between a center of the third image and a center of the fourth image when the actual visual parameters match the initial visual parameters;
and determining the reference offset interval according to the first maximum offset and the first minimum offset.
In one possible design, the determining a reference offset interval corresponding to a target user viewing the 2D image after 3D display at a preset distance includes:
shifting at least one image of a fifth image and a sixth image according to the operation instruction of the target user, wherein the fifth image and the sixth image are images corresponding to a second test 3D image;
if a recording instruction of the target user is received, recording a second maximum offset and a second minimum offset between the center of the fifth image and the center of the sixth image according to the recording instruction;
and determining the reference offset interval according to the second maximum offset and the second minimum offset.
In one possible design, the periodically adjusting the target offset between the first image and the second image at the reference offset interval includes:
periodically adjusting the target offset amount in the reference offset interval by the following formula:
Δd=f(t);
where Δd is the target offset, f(t) is a periodic continuous function, and t is a time variable.
In one possible design, the method further includes:
determining a first offset between the position of the first image after adjustment and the position of the first image before adjustment;
determining a second offset between the position of the second image after adjustment and the position of the second image before adjustment;
determining a sum of the first offset and the second offset as the target offset.
A second aspect of the present application provides a terminal device, including:
the device comprises a first determining unit, a second determining unit and a display unit, wherein the first determining unit is used for determining a first image and a second image corresponding to a target image, and the target image is a 2D image to be subjected to 3D display;
the second determining unit is used for determining a reference offset interval corresponding to a target user who watches the 2D image after 3D display at a preset distance;
an adjusting unit, configured to periodically adjust a target offset amount between the first image and the second image in the reference offset section;
and a generating unit, configured to interleave the first image and the second image after the target offset amount is adjusted, so as to perform 3D display on the 2D image.
In one possible design, the second determining unit is specifically configured to:
determining an initial visual parameter corresponding to the target user;
adjusting the offset between a third image and a fourth image corresponding to the first test 3D image;
recording actual visual parameters corresponding to the target user when the shifted third image and the shifted fourth image are displayed after blanking;
determining a first maximum offset and a first minimum offset between a center of the third image and a center of the fourth image when the actual visual parameters match the initial visual parameters;
and determining the reference offset interval according to the first maximum offset and the first minimum offset.
In one possible design, the second determining unit is further specifically configured to:
shifting at least one image of a fifth image and a sixth image according to the operation instruction of the target user, wherein the fifth image and the sixth image are images corresponding to a second test 3D image;
if a recording instruction of the target user is received, recording a second maximum offset and a second minimum offset between the center of the fifth image and the center of the sixth image according to the recording instruction;
and determining the reference offset interval according to the second maximum offset and the second minimum offset.
In one possible design, the adjusting unit is specifically configured to:
periodically adjusting the target offset amount in the reference offset interval by the following formula:
Δd=f(t);
where Δd is the target offset, f(t) is a periodic continuous function, and t is a time variable.
In one possible design, the first determination unit is further configured to:
determining a first offset between the position of the first image after adjustment and the position of the first image before adjustment;
determining a second offset between the position of the second image after adjustment and the position of the second image before adjustment;
determining a sum of the first offset and the second offset as the target offset.
A third aspect of the present application provides a computer device comprising at least one processor, a memory and a transceiver that are connected to one another, wherein the memory is configured to store program code, and the processor is configured to invoke the program code in the memory to perform the steps of the 3D display method of a 2D image according to the first aspect.
A fourth aspect of the present application provides a computer storage medium comprising instructions which, when run on a computer, cause the computer to perform the steps of the method for 3D display of 2D images according to any of the above aspects.
In summary, it can be seen that, in the embodiments provided by the present application, compared with the related art, the target offset between the first image and the second image can be adjusted periodically so that the adjusted target offset always lies within the reference offset interval. This ensures that both eyes of the user receive a view when watching a 2D image converted into a 3D image, so that a clear 3D image can be observed.
Drawings
Fig. 1 is a schematic flowchart of a 3D display method of a 2D image according to an embodiment of the present disclosure;
FIG. 2 is a schematic diagram of a standard test 3D image provided by an embodiment of the present application;
fig. 3 is a schematic view of a virtual structure of a terminal device according to an embodiment of the present application;
fig. 4 is a schematic diagram of a hardware structure of a terminal device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments.
The terms "first," "second," and the like in the description and in the claims of the present application and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It will be appreciated that the data so used may be interchanged under appropriate circumstances such that the embodiments described herein may be practiced otherwise than as specifically illustrated or described herein. Furthermore, the terms "comprise," "include," and "have," and any variations thereof, are intended to cover non-exclusive inclusions, such that a process, method, system, article, or apparatus that comprises a list of steps or modules is not necessarily limited to those steps or modules expressly listed, but may include other steps or modules not expressly listed or inherent to such process, method, article, or apparatus, the division of modules presented herein is merely a logical division that may be implemented in a practical application in a further manner, such that a plurality of modules may be combined or integrated into another system, or some feature vectors may be omitted, or not implemented, and such that couplings or direct couplings or communicative coupling between each other as shown or discussed may be through some interfaces, indirect couplings or communicative coupling between modules may be electrical or other similar, this application is not intended to be limiting. The modules or sub-modules described as separate components may or may not be physically separated, may or may not be physical modules, or may be distributed in a plurality of circuit modules, and some or all of the modules may be selected according to actual needs to achieve the purpose of the present disclosure.
The 3D display method, the device and the storage medium of the 2D image can be applied to visual training, conventional 3D video processing and personalized 3D display schemes.
Referring to fig. 1, fig. 1 is a schematic flowchart of a 3D display method of a 2D image provided in an embodiment of the present application, and the method includes:
101. and determining a first image and a second image corresponding to the target image.
In this embodiment, the terminal device may first obtain a target image, where the target image is a 2D image to be subjected to 3D display. The target image may be an independent 2D image or a 2D image of each frame in a video stream, which is not specifically limited. After obtaining the target image, the terminal device may determine a first image and a second image corresponding to the target image, that is, copy the 2D image to obtain the first image and the second image.
102. And determining a reference offset section corresponding to a target user who watches the 2D image for 3D display at a preset distance.
In this embodiment, the terminal device may determine a reference offset interval corresponding to a target user who views the image after 3D display at a preset distance (for example, the distance between the eyes of the target user and the display screen of the terminal device is 40 cm; other distances may also be used and are not specifically limited). That is, different users have different reference offset intervals. Two ways of determining the reference offset interval corresponding to the target user are described below.
First, the terminal device may determine the reference offset interval corresponding to a target user who views the 2D image after 3D display at a preset distance through the following steps:
determining an initial visual parameter corresponding to the target user;
adjusting the offset between a third image and a fourth image corresponding to the first test 3D image;
recording actual visual parameters corresponding to the target user when the shifted third image and the shifted fourth image are displayed after blanking;
determining a first maximum offset and a first minimum offset between the center of the third image and the center of the fourth image when the actual visual parameters are matched with the initial visual parameters;
and determining the reference offset interval according to the first maximum offset and the first minimum offset.
In this embodiment, the terminal device may first determine initial visual parameters corresponding to the target user, namely the initial visual parameters of the left eye and of the right eye of the target user in a relaxed state (these may be obtained by an eye tracker or by other means, which is not specifically limited). The initial visual parameter of the left eye is (x_L0, y_L0, r_L0), the initial visual parameter of the right eye is (x_R0, y_R0, r_R0), and the binocular horizontal central disparity is distance_x_eyes = |x_L0 - x_R0|, where x is the horizontal coordinate of the pupil, y is the vertical coordinate of the pupil, and r is the pupil radius. Here, the third image is the image corresponding to the left eye, and the fourth image is the image corresponding to the right eye.
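For illustration, the initial visual parameters and the binocular horizontal central disparity could be represented as in the following sketch; the dataclass and field names are assumptions, and how the parameters are actually measured (for example by an eye tracker) is outside its scope.

    from dataclasses import dataclass

    @dataclass
    class VisualParams:
        x: float  # horizontal pupil coordinate
        y: float  # vertical pupil coordinate
        r: float  # pupil radius

    def horizontal_disparity(left: VisualParams, right: VisualParams) -> float:
        # distance_x_eyes = |x_L0 - x_R0|, the binocular horizontal central disparity.
        return abs(left.x - right.x)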
Then, the offset between the third image and the fourth image is adjusted. Adjusting the offset may mean fixing the third image and shifting the fourth image away from the third image; it may also mean fixing the fourth image and shifting the third image away from the fourth image; or the third image and the fourth image may be shifted away from each other simultaneously. This is not specifically limited. For convenience of description, the following takes fixing the third image and shifting the fourth image away from it as the way to calculate the reference offset interval:
The fourth image is shifted away from the third image, and the shift variation distance_x of the fourth image is recorded. The fourth image and the third image are blanked and then presented again, and the actual visual parameters (x_L1, y_L1, r_L1) and (x_R1, y_R1, r_R1) corresponding to the target user after presentation are recorded. It is then determined whether the initial visual parameters match the actual visual parameters (the matching criterion may be whether the shift variation distance_x is in direct proportion to distance_x_eyes, in which case they match; other criteria may also be used, for example whether the difference between the actual visual parameters and the initial visual parameters is smaller than a preset value, which is not specifically limited). If they match, the offset between the center of the third image and the center of the shifted fourth image is determined as one extreme value of the reference offset interval. If they do not match, the fourth image continues to be shifted away from the third image on the basis of the current shift variation (each shift may be a preset distance), and the above steps are repeated until the actual visual parameters corresponding to the target user match the initial visual parameters when the fourth image and the third image are presented again after blanking; at that point the offset between the center of the fourth image and the center of the third image is one extreme value of the reference offset interval. The above steps are then repeated with the third image fixed and the fourth image shifted towards the third image, to determine the other extreme value of the reference offset interval.
Referring to fig. 2, the manner of determining the reference offset interval is described. Fig. 2 is a schematic diagram of a standard test 3D image provided by an embodiment of the present application, in which the third image is fixed and the fourth image is moved away from the third image to adjust the offset between them: 201 is the third image and 202 is the fourth image. The fourth image 202 is shifted away from the third image 201 (in the direction indicated by arrow 203 in fig. 2) by a preset distance; the offset between the center 202A of the fourth image 202 and the center 201A of the third image 201 is recorded; the fourth image 202 and the third image 201 are blanked and then presented again; the actual visual parameters corresponding to the target user upon re-presentation are recorded; and it is determined whether these actual visual parameters match the initial visual parameters of the target user in the relaxed state. If they match, the offset between the center 202A of the shifted fourth image 202 and the center 201A of the third image 201 is determined as one extreme value of the reference offset interval. If they do not match, the fourth image 202 is shifted again away from the third image 201 on the basis of its current position, the shift distance being the preset distance (that is, each shift may follow the same preset distance, or the distances may differ), and the above steps are repeated until the actual visual parameters corresponding to the target user match the initial visual parameters when the fourth image 202 and the third image 201 are presented again after blanking; at that point the offset between the center 202A of the fourth image 202 and the center 201A of the third image 201 is one extreme value of the reference offset interval.
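The search for one extreme of the reference offset interval can be pictured as a loop over ever larger offsets. The following sketch is a hypothetical rendering of that loop; the step size, the stopping rule and the callbacks (show_pair, measure, matches_relaxed) are assumptions, not details given by the patent.

    def find_interval_extreme(show_pair, measure, matches_relaxed,
                              step: float = 1.0, max_offset: float = 200.0,
                              direction: int = 1) -> float:
        """Shift the fourth image step by step (away from the third image for
        direction=1, towards it for -1), blanking and re-presenting the pair after
        every shift, until the measured visual parameters again match the user's
        relaxed-state parameters; the centre-to-centre offset at that point is
        taken as one extreme of the reference offset interval."""
        offset = 0.0
        while abs(offset) < max_offset:
            offset += direction * step
            show_pair(offset)            # blank, then re-present the pair at this offset
            actual = measure()           # e.g. an eye-tracker reading for the target user
            if matches_relaxed(actual):  # e.g. distance_x proportional to distance_x_eyes
                return offset
        raise RuntimeError("no extreme found within the search range")

Calling this twice, once with direction=1 and once with direction=-1, would yield the two extreme values that bound the reference offset interval.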
Second, the terminal device may determine the reference offset interval corresponding to a target user who views the 2D image after 3D display at a preset distance through the following steps:
shifting at least one of a fifth image and a sixth image according to an operation instruction of a target user, wherein the fifth image and the sixth image are images corresponding to the second test 3D image;
if a recording instruction of the target user is received, recording a second maximum offset and a second minimum offset between the center of the fifth image and the center of the sixth image according to the recording instruction;
and determining the reference offset interval according to the second maximum offset and the second minimum offset.
In this embodiment, the terminal device may display the second test 3D image and send a prompt message, where the prompt message prompts the target user to shift at least one of the fifth image and the sixth image. The target user may then operate on the second test 3D image, and the terminal device receives the operation instruction of the target user and shifts at least one of the fifth image and the sixth image according to that instruction (the shifting may keep the sixth image fixed and shift the fifth image away from the sixth image, keep the fifth image fixed and shift the sixth image away from the fifth image, or shift the fifth image and the sixth image away from each other simultaneously, which is not specifically limited) until the target user can no longer fuse the fifth image and the sixth image into a 3D image in the brain and sends a recording instruction. The terminal device records the offset between the center of the fifth image and the center of the sixth image according to the recording instruction of the target user; this offset is one extreme value of the reference offset interval. The above steps are then repeated with the fifth image fixed and the sixth image shifted towards the fifth image, to determine the other extreme value of the reference offset interval.
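The user-driven variant can be sketched as a small event loop. The event format and every name below are assumptions introduced for this example.

    def record_user_extremes(events) -> tuple:
        # `events` yields ("shift", delta) while the target user adjusts the
        # fifth/sixth image, and ("record",) when the user can no longer fuse the
        # pair into a 3D image; the centre-to-centre offset at each "record" event
        # is one extreme of the reference offset interval.
        offset, extremes = 0.0, []
        for event in events:
            if event[0] == "shift":
                offset += event[1]
            elif event[0] == "record":
                extremes.append(offset)
                if len(extremes) == 2:
                    break
        return min(extremes), max(extremes)

    # Example: shift outwards, record, shift inwards, record
    # record_user_extremes([("shift", 2.0), ("shift", 2.0), ("record",),
    #                       ("shift", -7.0), ("record",)])  # -> (-3.0, 4.0)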
It should be noted that step 101 determines the first image and the second image corresponding to the target image, and step 102 determines the reference offset interval corresponding to the target user who views the 2D image after 3D display at a preset distance; there is no restriction on the execution order of the two steps. Step 101 may be executed first, step 102 may be executed first, or they may be executed simultaneously, which is not specifically limited.
103. And periodically adjusting the target offset between the first image and the second image in a reference offset interval.
In this embodiment, after determining the reference offset interval and the first image and the second image corresponding to the target image, the terminal device may first determine a target offset amount between the first image and the second image, and periodically adjust the target offset amount between the first image and the second image in the reference offset interval.
It can be understood that the terminal device may periodically adjust the target offset amount in the reference offset interval by the following formula:
Δd=f(t);
where Δd is the target offset, f(t) is a periodic continuous function (f(t) may be, for example, a cosine function), and t is a time variable. That is, as the periodic continuous function varies over time, the value of Δd, and hence the direction and frequency of the change of the first image and the second image, vary accordingly.
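As one concrete but purely illustrative choice (the description only notes that f(t) may be a cosine function; the period and the mapping below are assumptions), f(t) can be a cosine whose range is mapped onto the reference offset interval, so the adjusted target offset never leaves [d_min, d_max]:

    import math

    def make_periodic_offset(d_min: float, d_max: float, period_s: float = 4.0):
        # Returns f(t): a periodic, continuous function of time whose values
        # oscillate between d_min and d_max (the 4-second period is an arbitrary
        # assumption for this sketch).
        mid = (d_min + d_max) / 2.0
        half = (d_max - d_min) / 2.0
        return lambda t: mid + half * math.cos(2.0 * math.pi * t / period_s)

With this choice, Δd sweeps smoothly from d_max down to d_min and back once per period, so the apparent depth varies periodically while the offset always stays within the reference offset interval.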
It should be noted that the terminal device may determine the target offset by:
determining a first offset between the position of the adjusted first image and the position of the first image before adjustment;
determining a second offset between the position of the adjusted second image and the position of the second image before adjustment;
the sum of the first offset amount and the second offset amount is determined as a target offset amount.
In this embodiment, when detecting in real time the target offset between the first image and the second image that form the 3D display image, the terminal device may determine a first offset between the position of the first image after adjustment and the position of the first image before adjustment, determine a second offset between the position of the second image after adjustment and the position of the second image before adjustment, and determine the sum of the first offset and the second offset as the target offset. Here, the positions before adjustment are the initial positions of the first image and the second image, and the positions after adjustment are their actual positions when the 3D image is formed after processing.
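A minimal sketch of this bookkeeping, assuming one-dimensional horizontal positions and displacement magnitudes (both simplifications introduced here, not taken from the patent), could be:

    def target_offset(first_before: float, first_after: float,
                      second_before: float, second_after: float) -> float:
        # First offset: how far the first image moved from its initial position;
        # second offset: how far the second image moved; the target offset is
        # their sum.
        first_offset = abs(first_after - first_before)
        second_offset = abs(second_after - second_before)
        return first_offset + second_offset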
104. And interleaving the first image and the second image after the target offset is adjusted so as to display the 2D image in a 3D mode.
In this embodiment, after each adjustment of the target offset, the terminal device may interleave the first image and the second image with the adjusted target offset to generate and display a 3D image.
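The interleaving scheme is not spelled out and depends on the 3D display hardware; as one common possibility (an assumption, not the patented scheme), a row-interleaved frame for a passive 3D panel could be built as follows:

    import numpy as np

    def interleave_rows(first: np.ndarray, second: np.ndarray) -> np.ndarray:
        # Take even rows from the first (left-eye) image and odd rows from the
        # second (right-eye) image to form the displayed frame.
        assert first.shape == second.shape
        out = np.empty_like(first)
        out[0::2] = first[0::2]
        out[1::2] = second[1::2]
        return out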
In summary, it can be seen that, in the embodiments provided by the present application, the terminal device can periodically adjust the target offset between the first image and the second image so that the adjusted target offset always lies within the reference offset interval, ensuring that both eyes of the user receive a view when watching a 2D image converted into a 3D image, so that the user can observe a clear 3D image.
The embodiments of the present application are described above from the perspective of a 3D display method of a 2D image, and the embodiments of the present application are described below from the perspective of a terminal device.
Referring to fig. 3, fig. 3 is a schematic view of a virtual structure of a terminal device according to an embodiment of the present application, where the terminal device 300 includes:
a first determining unit 301, configured to determine a first image and a second image corresponding to a target image, where the target image is a 2D image to be 3D displayed;
a second determining unit 302, configured to determine a reference offset interval corresponding to a target user viewing the 2D image after 3D display at a preset distance;
an adjusting unit 303, configured to periodically adjust a target offset amount between the first image and the second image in the reference offset section;
a generating unit 304, configured to interleave the first image and the second image after the target offset amount is adjusted, so as to perform 3D display on the 2D image.
In one possible design, the second determining unit 302 is specifically configured to:
determining an initial visual parameter corresponding to the target user;
adjusting the offset between a third image and a fourth image corresponding to the first test 3D image;
recording actual visual parameters corresponding to the target user when the shifted third image and the shifted fourth image are displayed after blanking;
determining a first maximum offset and a first minimum offset between a center of the third image and a center of the fourth image when the actual visual parameters match the initial visual parameters;
and determining the reference offset interval according to the first maximum offset and the first minimum offset.
In a possible design, the second determining unit 302 is further specifically configured to:
shifting at least one image of a fifth image and a sixth image according to the operation instruction of the target user, wherein the fifth image and the sixth image are images corresponding to a second test 3D image;
if a recording instruction of the target user is received, recording a second maximum offset and a second minimum offset between the center of the fifth image and the center of the sixth image according to the recording instruction;
and determining the reference offset interval according to the second maximum offset and the second minimum offset.
In one possible design, the adjusting unit 303 is specifically configured to:
periodically adjusting the target offset amount in the reference offset interval by the following formula:
Δd=f(t);
where Δd is the target offset, f(t) is a periodic continuous function, and t is a time variable.
In one possible design, the first determining unit 301 is further configured to:
determining a first offset between the position of the adjusted first image and the position of the first image before adjustment;
determining a second offset between the position of the adjusted second image and the position of the second image before the adjustment;
the sum of the first offset amount and the second offset amount is determined as a target offset amount.
Next, another terminal device provided in the embodiment of the present application is introduced, please refer to fig. 4, where fig. 4 is a schematic diagram of a hardware structure of the terminal device provided in the embodiment of the present application, and the terminal device 400 includes:
a receiver 401, a transmitter 402, a processor 403 and a memory 404 (wherein the number of processors 403 in the terminal device 400 may be one or more, one processor is taken as an example in fig. 4). In some embodiments of the present application, the receiver 401, the transmitter 402, the processor 403 and the memory 404 may be connected by a bus or other means, wherein fig. 4 illustrates the connection by a bus.
Memory 404 may include both read-only memory and random-access memory and provides instructions and data to processor 403. A portion of the memory 404 may also include NVRAM. The memory 404 stores an operating system and operating instructions, executable modules or data structures, or a subset or an expanded set thereof, wherein the operating instructions may include various operating instructions for performing various operations. The operating system may include various system programs for implementing various basic services and for handling hardware-based tasks.
Processor 403 controls the operation of the terminal device, and processor 403 may also be referred to as a CPU. In a specific application, the various components of the terminal device are coupled together by a bus system, wherein the bus system may include a power bus, a control bus, a status signal bus, etc., in addition to a data bus. For clarity of illustration, the various buses are referred to in the figures as a bus system.
The 3D display method of the 2D image disclosed in the embodiments of the present application may be applied to the processor 403 or implemented by the processor 403. The processor 403 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the method shown in fig. 1 may be performed by integrated logic circuits of hardware or by instructions in the form of software in the processor 403. The processor 403 may be a general purpose processor, a DSP, an ASIC, an FPGA or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component, and may implement or perform the various methods, steps and logic blocks disclosed in the embodiments of the present application. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be directly implemented by a hardware decoding processor, or implemented by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EPROM or registers. The storage medium is located in the memory 404, and the processor 403 reads the information in the memory 404 and completes the steps of the method in combination with the hardware.
The embodiments of the present application further provide a computer-readable medium including computer-executable instructions, where the computer-executable instructions enable a server to execute the 3D display method of a 2D image described in the foregoing embodiments; the implementation principle and technical effects are similar and are not described here again.
It should be noted that the above-described embodiments of the apparatus are merely schematic, where the units described as separate parts may or may not be physically separate, and the parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of the present embodiment. In addition, in the drawings of the embodiments of the apparatus provided in the present application, the connection relationship between the modules indicates that there is a communication connection therebetween, and may be implemented as one or more communication buses or signal lines.
Through the above description of the embodiments, those skilled in the art will clearly understand that the present application can be implemented by software plus necessary general-purpose hardware, and certainly can also be implemented by special-purpose hardware including special-purpose integrated circuits, special-purpose CPUs, special-purpose memories, special-purpose components and the like. Generally, functions performed by computer programs can be easily implemented by corresponding hardware, and the specific hardware structures for implementing the same function may vary, such as analog circuits, digital circuits, or dedicated circuits. However, for the present application, the implementation by a software program is more preferable. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a readable storage medium, such as a floppy disk, a USB disk, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disk of a computer, and includes instructions for enabling a computer device (which may be a personal computer, a server, or a network device) to execute the methods described in the embodiments of the present application.
In the above embodiments, the implementation may be wholly or partially realized by software, hardware, firmware, or any combination thereof. When implemented in software, may be implemented in whole or in part in the form of a computer program product.
The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in accordance with the embodiments of the present application occur, in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or other programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server or data center to another website, computer, server or data center via wired means (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless means (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that a computer can access, or a data storage device such as a server or a data center integrating one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., a solid state disk (SSD)), among others.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and such modifications and substitutions do not depart from the essence of the corresponding technical solutions.

Claims (10)

1. A 3D display method of a 2D image, comprising:
determining a first image and a second image corresponding to a target image, wherein the target image is a 2D image to be displayed in 3D;
determining a reference offset interval corresponding to a target user viewing the 2D image after 3D display at a preset distance;
periodically adjusting a target offset between the first image and the second image within the reference offset interval; and
interleaving the first image and the second image after the target offset is adjusted, so as to display the 2D image in 3D.
2. The method according to claim 1, wherein the determining a reference offset interval corresponding to a target user viewing the 3D display image at a preset distance comprises:
determining initial visual parameters corresponding to the target user;
adjusting an offset between a third image and a fourth image corresponding to a first test 3D image;
recording actual visual parameters corresponding to the target user when the shifted third image and fourth image are presented after blanking;
determining a first maximum offset and a first minimum offset between a center of the third image and a center of the fourth image when the actual visual parameters match the initial visual parameters; and
determining the reference offset interval according to the first maximum offset and the first minimum offset.
3. The method according to claim 1, wherein the determining a reference offset interval corresponding to a target user viewing the 2D image after 3D display at a preset distance comprises:
shifting at least one of a fifth image and a sixth image according to an operation instruction of the target user, wherein the fifth image and the sixth image are images corresponding to a second test 3D image;
if a recording instruction of the target user is received, recording a second maximum offset and a second minimum offset between a center of the fifth image and a center of the sixth image according to the recording instruction; and
determining the reference offset interval according to the second maximum offset and the second minimum offset.
4. The method according to any one of claims 1 to 3, wherein the periodically adjusting the target offset between the first image and the second image within the reference offset interval comprises:
periodically adjusting the target offset within the reference offset interval by the following formula:
Δd=f(t);
where Δd is the target offset, f(t) is a periodic continuous function, and t is a time variable.
5. The method according to any one of claims 1 to 3, further comprising:
determining a first offset between the position of the first image after adjustment and the position of the first image before adjustment;
determining a second offset between the position of the second image after adjustment and the position of the second image before adjustment; and
determining a sum of the first offset and the second offset as the target offset.
6. A terminal device, comprising:
a first determining unit, configured to determine a first image and a second image corresponding to a target image, wherein the target image is a 2D image to be displayed in 3D;
a second determining unit, configured to determine a reference offset interval corresponding to a target user viewing the 2D image after 3D display at a preset distance;
an adjusting unit, configured to periodically adjust a target offset between the first image and the second image within the reference offset interval; and
a generating unit, configured to interleave the first image and the second image after the target offset is adjusted, so as to display the 2D image in 3D.
7. The terminal device according to claim 6, wherein the second determining unit is specifically configured to:
determine initial visual parameters corresponding to the target user;
adjust an offset between a third image and a fourth image corresponding to a first test 3D image;
record actual visual parameters corresponding to the target user when the shifted third image and fourth image are presented after blanking;
determine a first maximum offset and a first minimum offset between a center of the third image and a center of the fourth image when the actual visual parameters match the initial visual parameters; and
determine the reference offset interval according to the first maximum offset and the first minimum offset.
8. The terminal device according to claim 6, wherein the second determining unit is further specifically configured to:
shift at least one of a fifth image and a sixth image according to an operation instruction of the target user, wherein the fifth image and the sixth image are images corresponding to a second test 3D image;
if a recording instruction of the target user is received, record a second maximum offset and a second minimum offset between a center of the fifth image and a center of the sixth image according to the recording instruction; and
determine the reference offset interval according to the second maximum offset and the second minimum offset.
9. The terminal device according to any one of claims 6 to 8, wherein the adjusting unit is specifically configured to:
periodically adjust the target offset within the reference offset interval by the following formula:
Δd=f(t);
where Δd is the target offset, f(t) is a periodic continuous function, and t is a time variable.
10. A computer storage medium comprising instructions which, when run on a computer, cause the computer to perform the 3D display method of a 2D image according to any one of claims 1 to 5.
CN202111305761.4A 2021-11-05 2021-11-05 3D display method and device of 2D image and storage medium Pending CN114092318A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111305761.4A CN114092318A (en) 2021-11-05 2021-11-05 3D display method and device of 2D image and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111305761.4A CN114092318A (en) 2021-11-05 2021-11-05 3D display method and device of 2D image and storage medium

Publications (1)

Publication Number Publication Date
CN114092318A true CN114092318A (en) 2022-02-25

Family

ID=80299059

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111305761.4A Pending CN114092318A (en) 2021-11-05 2021-11-05 3D display method and device of 2D image and storage medium

Country Status (1)

Country Link
CN (1) CN114092318A (en)


Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110679147A (en) * 2017-03-22 2020-01-10 奇跃公司 Depth-based foveated rendering for display systems
US20200293828A1 (en) * 2019-03-15 2020-09-17 Nvidia Corporation Techniques to train a neural network using transformations
CN113411564A (en) * 2021-06-21 2021-09-17 纵深视觉科技(南京)有限责任公司 Method, device, medium and system for measuring human eye tracking parameters

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
张冬阳 (ZHANG Dongyang); 张建奇 (ZHANG Jianqi): "Experimental characterization of human-eye contrast threshold characteristics for triangular targets" (面向三角靶标的人眼对比度阈值特性实验表征), Journal of Xidian University (西安电子科技大学学报), vol. 43, no. 06, 31 December 2016 (2016-12-31) *

Similar Documents

Publication Publication Date Title
US10572010B2 (en) Adaptive parallax adjustment method and virtual reality display device
Banks et al. 3D Displays
CN109901710B (en) Media file processing method and device, storage medium and terminal
WO2010084724A1 (en) Image processing device, program, image processing method, recording method, and recording medium
CN108259883B (en) Image processing method, head-mounted display, and readable storage medium
US10885651B2 (en) Information processing method, wearable electronic device, and processing apparatus and system
JP6024159B2 (en) Information presenting apparatus, information presenting system, server, information presenting method and program
CN113940622A (en) Visual fusion intersection angle measuring method, device and storage medium
JP2010171628A (en) Image processing device, program, image processing method, recording method, and recording medium
US10834380B2 (en) Information processing apparatus, information processing method, and storage medium
US11601637B2 (en) Multifocal display devices and methods
CN114092318A (en) 3D display method and device of 2D image and storage medium
Li et al. Visual discomfort in 3DTV: Definitions, causes, measurement, and modeling
CN109922326B (en) Method, device, medium and equipment for determining resolution of naked eye 3D video image
CN113946213A (en) Method and device for determining visual fusion crossing angle and storage medium
CN114092417A (en) Method and device for adjusting 3D display image offset and storage medium
WO2023216619A1 (en) 3d display method and 3d display device
CN113763472A (en) Method and device for determining viewpoint width and storage medium
Kang Wei et al. Three-dimensional scene navigation through anaglyphic panorama visualization
CN114998563A (en) Method and device for generating 3D image, computer equipment and storage medium
KR20160041403A (en) Method for gernerating 3d image content using information on depth by pixels, and apparatus and computer-readable recording medium using the same
US12141350B2 (en) Vergence based gaze matching for mixed-mode immersive telepresence application
Fornalczyk et al. Stereoscopic image perception quality factors
Xing Towards Reliable Stereoscopic 3D Quality Evaluation: Subjective Assessment and Objective Metrics
US20240223740A1 (en) Method and device for adjusting depth of stereoscopic image

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20240105

Address after: Room 5-1, Building 1, No. 12 Shigui Avenue, Jieshi Town, Banan District, Chongqing, 401320

Applicant after: Xiaodou Vision (Chongqing) Medical Technology Co.,Ltd.

Address before: 518101 satellite building 1601, No. 61, Gaoxin South ninth Road, high tech Zone community, Yuehai street, Nanshan District, Shenzhen, Guangdong

Applicant before: Shenzhen Xiaodou Vision Technology Co.,Ltd.