
CN107609946B - Display control method and computing device - Google Patents

Display control method and computing device

Info

Publication number
CN107609946B
Authority
CN
China
Prior art keywords
vertex
model
dimensional
clothes
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201710848204.4A
Other languages
Chinese (zh)
Other versions
CN107609946A
Inventor
潘永路
冯文泰
孙英男
刘勇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Ufashion Technology Co ltd
Original Assignee
Beijing Ufashion Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Ufashion Technology Co ltd filed Critical Beijing Ufashion Technology Co ltd
Priority to CN201710848204.4A priority Critical patent/CN107609946B/en
Publication of CN107609946A publication Critical patent/CN107609946A/en
Application granted granted Critical
Publication of CN107609946B publication Critical patent/CN107609946B/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a display control method adapted to be executed in a computing device, comprising the following steps: acquiring a three-dimensional garment model of a garment to be tried on and a hollow template corresponding to the three-dimensional garment model, wherein the hollow template represents the front-back relationship between the coordinates of each vertex in the three-dimensional garment model and those in the three-dimensional body model of the user; performing transparent display processing on the user's three-dimensional body model according to the hollow template to obtain a hollow-out user body; and overlaying the garment to be tried on onto the hollow-out user body to display the user's try-on effect. The invention also discloses a computing device for executing the method.

Description

Display control method and computing device
Technical Field
The invention relates to the field of three-dimensional data processing, in particular to a display control method and computing equipment.
Background
With the development of AR (Augmented Reality) technology, its applications in fields such as the military, medicine, entertainment, gaming, and network video communication have become increasingly widespread. A typical application is virtual fitting, i.e. replacing the real clothes worn by a user, in an image of the user taken by a camera, with virtual clothes, thereby displaying a three-dimensional effect of the user wearing the virtual clothes (or of the virtual clothes being overlaid on the image of the user).
A common display scheme for virtual fitting is to combine a captured image of the user with an image of the clothes to be tried on: specifically, the clothes image is resized according to body-contour data extracted from the image of the user's body and then composited onto it, so that the virtual clothes are always drawn in front of the body. This approach has two defects. First, only the front of the user is shown, so the user cannot see how the clothes look from the back during virtual fitting. Second, when the front-back relationship between the user's body and the clothes to be tried on is complex, it is difficult to generate a proper composite image: for example, when the user's hand is in front of the torso, the clothes are still drawn in front, so the hand is hidden behind the clothes, which differs from the actual try-on effect the user expects.
Therefore, an effective display control scheme is needed to solve the above problems and improve the user experience.
Disclosure of Invention
To this end, the present invention provides a display control method and computing device in an attempt to solve or at least alleviate at least one of the problems identified above.
According to an aspect of the present invention, there is provided a display control method, adapted to be executed in a computing device, comprising the steps of: acquiring a three-dimensional garment model of a garment to be tried on and a hollow template corresponding to the three-dimensional garment model, wherein the hollow template represents the front-back relationship between the coordinates of each vertex in the three-dimensional garment model and those in the three-dimensional body model of the user; performing transparent display processing on the user's three-dimensional body model according to the hollow template to obtain a hollow-out user body; and overlaying the garment to be tried on onto the hollow-out user body to display the user's try-on effect.
Optionally, in the display control method according to the present invention, before the step of obtaining the three-dimensional garment model and the corresponding hollow template, the method further includes a step of pre-building the three-dimensional garment model: acquiring depth data of clothes to be tried on; and constructing a three-dimensional garment model of the garment to be tried on according to the depth data.
Optionally, in the display control method according to the present invention, the method further includes the step of creating a three-dimensional model of the body of the user: acquiring depth data of a user body; and constructing a three-dimensional model of the user's body from the depth data.
Optionally, in the display control method according to the present invention, the method further includes the step of pre-establishing a hollow template corresponding to each clothes to be tried on: searching a second vertex in the three-dimensional body model matched with each first vertex in the three-dimensional garment model of the clothes to be tried on; judging the front-back relation between the first vertex and the second vertex through the coordinate value of the first vertex and the coordinate value of the matched second vertex; determining a second vertex in the three-dimensional body model of the user covered by the clothes to be tried on according to the front-back relation; and generating the hollow template according to the determined coordinate value of the second vertex.
Optionally, in the display control method according to the present invention, the step of determining the front-back relationship between the first vertex and the second vertex from their coordinate values includes: comparing the Z-axis value of the first vertex coordinate with the Z-axis value of the matched second vertex coordinate: when the Z-axis value of the first vertex coordinate is smaller than the Z-axis value of the second vertex coordinate, the second vertex is drawn in front of the matched first vertex; when the Z-axis value of the first vertex coordinate is not smaller than the Z-axis value of the second vertex coordinate, the first vertex is drawn in front of the matched second vertex.
Optionally, in the display control method according to the present invention, the step of determining, according to the front-back relationship, the second vertices in the three-dimensional model of the user's body that are covered by the clothes to be tried on includes: when a first vertex is drawn in front of its matched second vertex, that second vertex is covered by the clothes to be tried on, and its coordinate value is recorded.
Optionally, in the display control method according to the present invention, the hollow template is in a JSON data format.
Optionally, in the display control method according to the present invention, the step of performing transparent display processing on the three-dimensional model of the user's body according to the hollow template includes: setting the Alpha value of the UV channel of each second vertex recorded in the hollow template to transparent.
According to another aspect of the present invention, there is provided a computing device comprising: one or more processors; a memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs including instructions for performing any of the methods described above.
According to a further aspect of the invention there is provided a computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by a computing device, cause the computing device to perform any of the methods described above.
According to the display control scheme of the present invention, when a user tries on a piece of clothes, the three-dimensional model of the clothes to be tried on is obtained together with its hollow template; the three-dimensional body model is given corresponding transparent display processing according to the hollow template to obtain a hollow-out user body, and the clothes to be tried on are then overlaid on the hollow-out user body to display the user's try-on effect. By displaying the body parts covered by the garment transparently, the penetration problem during virtual fitting is effectively solved.
In addition, the three-dimensional garment model of each garment to be tried on and the corresponding three-dimensional body model are collected in advance, a hollow template is generated according to the front-back relationship of their vertex coordinates, and the template is stored in the three-dimensional garment model. This effectively avoids the stuttering that a large amount of runtime computation would cause; at the same time, most of the body regions covered by the clothes need no computation at all, without affecting the displayed picture.
Drawings
To the accomplishment of the foregoing and related ends, certain illustrative aspects are described herein in connection with the following description and the annexed drawings, which are indicative of various ways in which the principles disclosed herein may be practiced, and all aspects and equivalents thereof are intended to be within the scope of the claimed subject matter. The above and other objects, features and advantages of the present disclosure will become more apparent from the following detailed description read in conjunction with the accompanying drawings. Throughout this disclosure, like reference numerals generally refer to like parts or elements.
FIG. 1 shows a schematic diagram of a computing device 100, according to an embodiment of the invention;
fig. 2 shows a schematic diagram of an exemplary configuration of a virtual fitting system 200;
FIG. 3A shows a normal effect diagram of a user trying on a garment; FIG. 3B shows the effect when penetration is improperly handled while the user tries on the garment;
FIG. 4 illustrates a flow diagram of a display control method 400 according to one embodiment of the invention;
FIG. 5 illustrates a flow diagram of a method 500 for creating a stencil for a three-dimensional model of a garment, in accordance with one embodiment of the present invention; and
FIG. 6A shows a schematic view of a hollowed out user's body, according to one embodiment of the invention; fig. 6B shows a display effect diagram after the clothes corresponding to fig. 6A is tried on.
Detailed Description
Exemplary embodiments of the present disclosure will be described in more detail below with reference to the accompanying drawings. While exemplary embodiments of the present disclosure are shown in the drawings, it should be understood that the present disclosure may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the disclosure to those skilled in the art.
Fig. 1 is a block diagram of an example computing device 100. In a basic configuration 102, computing device 100 typically includes system memory 106 and one or more processors 104. A memory bus 108 may be used for communication between the processor 104 and the system memory 106.
Depending on the desired configuration, the processor 104 may be any type of processor, including but not limited to: a microprocessor (μP), a microcontroller (μC), a Digital Signal Processor (DSP), or any combination thereof. The processor 104 may include one or more levels of cache, such as a level one cache 110 and a level two cache 112, a processor core 114, and registers 116. The example processor core 114 may include an Arithmetic Logic Unit (ALU), a Floating Point Unit (FPU), a digital signal processing core (DSP core), or any combination thereof. An example memory controller 118 may be used with the processor 104, or in some implementations the memory controller 118 may be an internal part of the processor 104.
Depending on the desired configuration, system memory 106 may be any type of memory, including but not limited to: volatile memory (such as RAM), non-volatile memory (such as ROM, flash memory, etc.), or any combination thereof. System memory 106 may include an operating system 120, one or more applications 122, and program data 124. In some embodiments, application 122 may be arranged to operate with program data 124 on an operating system. The program data 124 includes instructions, and in the computing device 100 according to the present invention, the program data 124 contains instructions for a display control method.
Computing device 100 may also include an interface bus 140 that facilitates communication from various interface devices (e.g., output devices 142, peripheral interfaces 144, and communication devices 146) to the basic configuration 102 via the bus/interface controller 130. The example output device 142 includes a graphics processing unit 148 and an audio processing unit 150. They may be configured to facilitate communication with various external devices, such as a display or speakers, via one or more a/V ports 152. Example peripheral interfaces 144 may include a serial interface controller 154 and a parallel interface controller 156, which may be configured to facilitate communication with external devices such as input devices (e.g., keyboard, mouse, pen, voice input device, touch input device) or other peripherals (e.g., printer, scanner, etc.) via one or more I/O ports 158. An example communication device 146 may include a network controller 160, which may be arranged to facilitate communications with one or more other computing devices 162 over a network communication link via one or more communication ports 164.
A network communication link may be one example of a communication medium. Communication media may typically be embodied by computer readable instructions, data structures, or program modules in a modulated data signal, such as a carrier wave or other transport mechanism, and may include any information delivery media. A "modulated data signal" is a signal that has one or more of its characteristics set or changed in such a manner as to encode information in the signal. By way of non-limiting example, communication media may include wired media such as a wired or dedicated-wire network, and various wireless media such as acoustic, Radio Frequency (RF), microwave, Infrared (IR), or other wireless media. The term computer readable media as used herein may include both storage media and communication media.
Computing device 100 may be implemented as a server, such as a file server, a database server, an application server, or a WEB server, or as part of a small-form-factor portable (or mobile) electronic device, such as a cellular telephone, a Personal Digital Assistant (PDA), a personal media player device, a wireless WEB-watch device, a personal headset device, an application-specific device, or a hybrid device that includes any of the above functions. Computing device 100 may also be implemented as a personal computer, including both desktop and notebook computer configurations.
In recent years, a technique called Augmented Reality (AR) for overlaying additional information on the real world and presenting it to a user has received attention. Information to be presented to a user in AR technology can be visualized by using various forms of virtual objects such as text, icons, or animations. One major application of AR technology is to support the activities of users in the real world. Hereinafter, the AR technology is applied to the virtual fitting system.
Through a virtual fitting system using AR technology, the user can experience virtual fitting in real time. According to the virtual fitting system of one embodiment of the present invention, the virtual clothes to be tried on can be displayed on the user's body by acquiring the front-back relationship (also called the overlay relationship) between the clothes to be tried on and the user's body.
Fig. 2 is a diagram illustrating an exemplary configuration scenario of a virtual fitting system 200 according to an embodiment of the present invention. An outline of the virtual fitting system 200 will be described below with reference to fig. 2.
In fig. 2, a virtual fitting system 200 includes an imaging part 210 for imaging a subject (i.e., a user), an image processing part 220 for "putting on" a virtual garment on the subject, and a display part 230 for displaying an image of the subject with the virtual garment. There is no particular limitation on the place where the virtual fitting system 200 is installed, and for example, the virtual fitting system 200 may be installed at a user's home or may be installed in a public place such as a brand store, a mall, an airport, a train station, or the like.
According to an implementation of the present invention, the imaging section 210 may include a photographing device (e.g., a camera) and a sensor. The photographing means is used to photograph an object existing in a real space, and in the example shown in fig. 2, a subject (e.g., a person) as a real space object is photographed by a camera. The sensor has a function of detecting a parameter from the real space, and transmits the detected data (e.g., depth data) to the image processing section 220. For example, the sensor may be constituted by an infrared sensor.
Alternatively, the imaging section 210 may employ a Kinect somatosensory camera. The Kinect camera has three lenses: the middle lens is an RGB color camera for collecting color images, while the left and right lenses are an infrared emitter and an infrared CMOS camera, respectively, which together form a 3D structured-light depth sensor for collecting depth data (i.e., the distance from objects in the scene to the camera). The process and principle by which the Kinect camera collects the color images and depth data of the measured object are common knowledge in the field and are not detailed here.
It should be noted that the imaging part is not limited to the Kinect motion sensing camera, and any scheme capable of collecting the depth information of the scene in front of the shooting device (e.g., camera) is within the protection scope of the present invention.
According to an embodiment of the present invention, the image processing portion 220 and the display portion 230 may employ the computing device 100 to implement their functionality. The image processing part 220 synthesizes the virtual clothes to be tried on the photographed user body by the display control method of the present invention according to the data collected by the imaging part 210, and then the display part 230 displays the fitting effect.
Specifically, the image processing portion 220 calculates the position of the bone point of the user according to the user data (including, for example, the color image data and the depth data of the user) acquired by the imaging portion 210, and further constructs a three-dimensional model of the body of the user. Then, according to the obtained three-dimensional model of the clothes to be tried-on and the three-dimensional model of the body of the user, corresponding matching calculation is performed, the clothes to be tried-on are overlapped on the body of the user (for example, the clothes to be tried-on is bound to the skeleton points of the user), and the display part 230 displays the effect graph of the clothes to be tried-on of the user.
In some existing try-on effect displays, a penetration problem often occurs when the three-dimensional garment model and the three-dimensional body model of the user are rendered and shaded separately, as shown in fig. 3A and 3B. Fig. 3A is a normal effect diagram of a user trying on clothes, while fig. 3B shows the effect of improperly handled penetration: the clothes and the body penetrate each other at the user's shoulders (a body part shows through to the front of the clothes), and the garment is not displayed correctly.
Next, a procedure in which the computing apparatus 100 executes the display control method 400 according to the embodiment of the present invention is described with reference to the flowchart of fig. 4.
The method 400 begins with step S410, obtaining a three-dimensional garment model of a garment to be tried on and a corresponding hollow template thereof.
According to one implementation of the invention, before the user tries on any clothes, all the clothes to be tried on are three-dimensionally modeled in batch: the depth data of each garment is acquired using a camera and/or a sensor (such as the imaging part 210 described above), and a three-dimensional model of the garment is then constructed from the depth data. According to one embodiment of the present invention, batch modeling is performed with KinectFusion, which allows a user to hold and move a standard Kinect camera and quickly create a detailed 3D reconstruction of an indoor scene from the depth data the camera captures. Three-dimensional scanning and modeling of a garment are techniques disclosed in the art and are therefore not described here. It should be noted that the present invention is not limited to three-dimensional scanning and/or modeling by means of KinectFusion: any means capable of three-dimensionally modeling the clothes to be tried on or the body of the user can be combined with the embodiments of the present invention to complete the display control scheme of the present invention.
According to an implementation of the invention, for a given type of garment, the garment three-dimensional models may be obtained by scanning the garment in each of its sizes (yielding an S-size garment model, an M-size garment model, an L-size garment model, and so on); alternatively, only one size is scanned to obtain its three-dimensional model, and the models for the other sizes are then obtained by scaling. The embodiments of the present invention are not limited in this respect.
Meanwhile, different models (fit models) are selected to correspond to the different garment styles or sizes, and their figures are taken as the original standard figures; the bodies of the different models are three-dimensionally scanned and reconstructed to build the body three-dimensional models — for example, a body three-dimensional model corresponding to size S, a body three-dimensional model corresponding to size M, and so on. According to an embodiment of the invention, the body three-dimensional model is constructed by the same process as the garment three-dimensional model described above.
According to the embodiment of the invention, the hollow template corresponding to the clothes three-dimensional model is stored in the clothes three-dimensional model of each clothes to be tried on, and the hollow template represents the front-back relation of each vertex coordinate in the clothes three-dimensional model and the body three-dimensional model.
According to an implementation of the present invention, both the garment three-dimensional model and the body three-dimensional model established in advance may be stored in the image processing part 220. Or stored in a remote server, the server is connected to the virtual fitting system 200 through a network, and when the user wants to fit, the virtual fitting system 200 obtains the corresponding three-dimensional garment model and the corresponding three-dimensional body model stored in the remote server through a calling interface.
The process of the method 500 for creating a hollow template of a three-dimensional model of a garment will be described in detail below with reference to fig. 5.
In step S510, a second vertex in the three-dimensional model of the body that matches each first vertex in the three-dimensional model of the garment to be tried on is found.
Each vertex in the three-dimensional garment model of the clothes to be tried on is taken as a first vertex, and the vertex in the user's three-dimensional body model matching each first vertex is found and taken as the second vertex. For example, for a vertex A on the left shoulder of the S-size garment model of a certain garment, the distances from A to the body-model vertices are traversed to find the vertex B closest to A, i.e. the vertex corresponding to the left shoulder in the body three-dimensional model.
The process of finding matching vertices is described by the following example code segment:
(The example code segment appears in the source only as image references and is not reproduced here.)
In the code example above, clothVtx and bodyVtx represent the garment three-dimensional model and the body three-dimensional model, respectively, both being arrays of the three-dimensional vector type Vector3. The distance calculation formula is not limited by the present invention; any distance formula over three-dimensional vectors can be combined with the embodiments of the present invention to implement the display control scheme of the present invention.
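Since the original listing is not reproduced in this text, a minimal Python sketch of the brute-force nearest-vertex search it describes may be useful. The names clothVtx and bodyVtx come from the text; everything else — the plain-tuple vertex representation, the function names, and the sample coordinates — is an assumption for illustration:

```python
import math

def vertex_distance(a, b):
    """Euclidean distance between two (x, y, z) vertices."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def match_vertices(cloth_vtx, body_vtx):
    """For each garment vertex (first vertex), return the index of the
    nearest body-model vertex (second vertex), found by brute force."""
    return [
        min(range(len(body_vtx)), key=lambda i: vertex_distance(cv, body_vtx[i]))
        for cv in cloth_vtx
    ]

# Each garment vertex matches the body vertex closest to it; the distant
# third body vertex is never selected.
cloth = [(0.0, 1.0, 0.5), (1.0, 0.0, 0.2)]
body = [(0.0, 1.1, 0.4), (1.1, 0.0, 0.3), (5.0, 5.0, 5.0)]
print(match_vertices(cloth, body))  # → [0, 1]
```

A production system would use a spatial index (e.g. a k-d tree) rather than this O(n·m) scan, which is one reason the patent precomputes the template offline.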
Subsequently, in step S520, the front-back relationship between the first vertex and the second vertex is determined from the coordinate value of the first vertex and the coordinate value of the matched second vertex.
Specifically, the Z-axis value of the first vertex coordinate is compared with the Z-axis value of the matched second vertex coordinate. If the Z-axis value of the first vertex coordinate is less than the Z-axis value of the second vertex coordinate, the body three-dimensional model penetrates through the garment three-dimensional model at that vertex, and the second vertex is drawn in front of the matched first vertex; conversely, if the Z-axis value of the first vertex coordinate is not less than the Z-axis value of the second vertex coordinate, the garment three-dimensional model penetrates through the body three-dimensional model at that vertex, and the first vertex is drawn in front of the matched second vertex.
Subsequently, in step S530, the second vertices in the three-dimensional model of the user's body covered by the clothes to be tried on are determined according to the front-back relationship found in step S520. That is, when a first vertex is judged to be drawn in front of its matched second vertex, that second vertex is covered by the clothes to be tried on, and its coordinate value is recorded.
Subsequently, in step S540, the hollow template is generated from the recorded coordinate values of the second vertices. In other words, the hollow template stores the coordinates of the body-model vertices that are covered by the garment three-dimensional model. According to one embodiment of the invention, the hollow template is in JSON data format.
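A rough Python sketch of steps S520–S540 follows, representing vertices as plain (x, y, z) tuples and taking the vertex matching as given. The covered_vertices key and the exact JSON layout are illustrative guesses, since the patent does not specify the template's schema:

```python
import json

def build_hollow_template(cloth_vtx, body_vtx, matches):
    """For each matched pair, compare Z values (step S520): if the garment
    vertex's Z is not less than the body vertex's Z, the garment is drawn
    in front, so the body vertex is covered (step S530). The covered
    vertices' coordinates are recorded and serialized as JSON (step S540)."""
    covered = []
    for ci, bi in enumerate(matches):
        if cloth_vtx[ci][2] >= body_vtx[bi][2]:  # first vertex drawn in front
            covered.append({"index": bi, "coord": list(body_vtx[bi])})
    return json.dumps({"covered_vertices": covered})

# The first garment vertex lies in front of its body vertex (1.0 >= 0.5),
# so only that body vertex is recorded as covered.
cloth = [(0.0, 0.0, 1.0), (0.0, 0.0, 0.2)]
body = [(0.0, 0.0, 0.5), (0.0, 0.0, 0.5)]
print(build_hollow_template(cloth, body, [0, 1]))
```

Because the template is computed once per garment model and stored with it, none of this comparison work needs to happen at fitting time.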
Optionally, the hollow template corresponding to the three-dimensional garment model of each piece of clothes to be tried on is stored in the three-dimensional garment model in an encrypted manner. When a user tries on a certain piece of clothes, the hollow template is read by adopting a corresponding decryption method while the three-dimensional clothes model of the clothes is obtained.
In step S420, transparent display processing is performed on the user's three-dimensional body model according to the hollow template that was read, so as to obtain a hollow-out user body. According to one embodiment of the invention, the Alpha values of the UV channels of all the second vertexes recorded in the hollow template are set to transparent. Fig. 6A is a schematic diagram of a hollow-out user body generated from the hollow template of a certain garment to be tried on; as shown in fig. 6A, the covered part of the user's upper body is displayed transparently.
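As a rough illustration of this step — treating per-vertex alpha as a simple list of floats rather than a real UV-channel attribute of a mesh, which is an assumption since the patent gives no data structures — the transparent display processing could be sketched as:

```python
def hollow_out_body(vertex_alpha, covered_indices):
    """Return a copy of the per-vertex alpha list with every vertex recorded
    in the hollow template set to 0.0 (fully transparent)."""
    result = list(vertex_alpha)
    for i in covered_indices:
        result[i] = 0.0
    return result

# Four body vertices, of which vertices 1 and 2 are covered by the garment
# according to the template; they become transparent, the rest stay opaque.
print(hollow_out_body([1.0, 1.0, 1.0, 1.0], [1, 2]))  # → [1.0, 0.0, 0.0, 1.0]
```

In a real engine this would be applied to the mesh's vertex-color or UV alpha attribute before rendering, so the covered body region simply never appears in front of the garment.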
When the user selects a garment to try on, the virtual fitting system 200 acquires depth data of the user's body via the imaging section 210 and constructs a three-dimensional model of the user's body from that data. The user's figure may differ from the original standard figure (i.e., the model's figure), and in some extreme cases this difference may affect the subsequent display. Therefore, according to another embodiment of the present invention, the user's body three-dimensional model is taken as the current body model and the pre-collected model's body three-dimensional model as the original body model; the two are interpolated, the second vertex coordinates on the current body model are calculated from the second vertex coordinates in the hollow template, and the corresponding hollow-out user body is generated from the calculated coordinates on the current body model.
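One simple reading of this interpolation step — assuming the current and original body models share vertex indexing, and shifting each recorded template vertex by the per-vertex displacement between the two meshes; the function name and the displacement scheme are illustrative, not from the patent — is:

```python
def map_template_to_current(template_coords, original_body, current_body):
    """Shift each recorded second-vertex coordinate from the original (model)
    body onto the user's current body by the displacement of that vertex
    between the two meshes. Assumes both meshes share vertex indexing."""
    mapped = {}
    for idx, coord in template_coords.items():
        offset = tuple(c - o for c, o in zip(current_body[idx], original_body[idx]))
        mapped[idx] = tuple(t + d for t, d in zip(coord, offset))
    return mapped

# The template records vertex 0 at the model's shoulder; the user's shoulder
# sits elsewhere, so the recorded coordinate moves with it.
template = {0: (0.0, 1.0, 0.5)}
original = [(0.0, 1.0, 0.5)]
current = [(1.0, 2.0, 0.5)]
print(map_template_to_current(template, original, current))  # → {0: (1.0, 2.0, 0.5)}
```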
In the following step S430, the clothes to be tried on are overlaid on the hollow-out user body (for example, the three-dimensional garment model is bound to the user's skeleton points according to its skeleton-point positions) to show the user's try-on effect. Fig. 6B shows the display effect after the garment corresponding to fig. 6A is tried on: the hollow-out part of the body is covered by the garment, and the overall picture is unaffected.
According to the display control scheme of the present invention, when a user tries to wear a certain piece of clothes, the virtual fitting system 200 acquires a hollow template while acquiring a three-dimensional model of the piece of clothes to be tried. The image processing part 220 performs corresponding transparent display processing on the three-dimensional body model according to the hollow template to obtain a hollow user body, then covers the clothes to be tried on the hollow user body, and the display part 230 displays the clothes trying on effect of the user. The problem of penetration during virtual fitting is effectively solved by transparent display of the body part covered by the garment.
In addition, with improvements in hardware, users demand ever finer three-dimensional models: a detailed clothes model (or user body model) can easily contain on the order of 2000k vertices. If a matching operation between each user's three-dimensional body model and the three-dimensional clothes model were performed at fitting time, the amount of computation would grow geometrically — hundreds of thousands of calculations in modest cases, millions in finer ones. This imposes a high overhead on the user's device, and the resulting stutter degrades the fitting experience. According to the display control scheme of the present invention, the three-dimensional clothes model of each piece of clothes to be tried on and the corresponding three-dimensional body model are collected in advance, the hollow template is generated from the front-back relation of their vertex coordinates, and the template is stored together with the three-dimensional clothes model. This effectively avoids the system stutter caused by massive runtime computation; meanwhile, most of the body regions covered by the clothes need not be calculated at all, and the overall picture display effect is not affected.
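The offline template construction that removes the per-fitting matching cost can be sketched as follows. The brute-force nearest-vertex search and the function name are illustrative assumptions (a real pipeline would likely use a spatial index such as a k-d tree), but the front-back test mirrors the Z-axis comparison described above: a second vertex is recorded as covered when the matching first vertex is drawn in front of it.

```python
import numpy as np

def build_hollow_template(garment_vertices, body_vertices):
    """Offline construction of a hollow template: for each first vertex of
    the garment mesh, find the nearest second vertex of the body mesh and
    record it when the garment is drawn in front (garment z >= body z).
    Run once per garment, so no matching cost remains at fitting time."""
    covered = set()
    for g in garment_vertices:
        dists = np.linalg.norm(body_vertices - g, axis=1)
        i = int(np.argmin(dists))
        if g[2] >= body_vertices[i][2]:  # first vertex drawn before second vertex
            covered.add(i)
    return sorted(covered)
```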
In the description provided herein, numerous specific details are set forth. It is understood, however, that embodiments of the invention may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in the foregoing description of exemplary embodiments of the invention, various features of the invention are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the disclosure and aiding in the understanding of one or more of the various inventive aspects. However, the disclosed method should not be interpreted as reflecting an intention that the claimed invention requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single foregoing disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this invention.
Those skilled in the art will appreciate that the modules or units or components of the devices in the examples disclosed herein may be arranged in a device as described in this embodiment or alternatively may be located in one or more devices different from the devices in this example. The modules in the foregoing examples may be combined into one module or may be further divided into multiple sub-modules.
Those skilled in the art will appreciate that the modules in the device in an embodiment may be adaptively changed and disposed in one or more devices different from the embodiment. The modules or units or components of the embodiments may be combined into one module or unit or component, and furthermore they may be divided into a plurality of sub-modules or sub-units or sub-components. All of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or elements of any method or apparatus so disclosed, may be combined in any combination, except combinations where at least some of such features and/or processes or elements are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings) may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features included in other embodiments, rather than other features, combinations of features of different embodiments are meant to be within the scope of the invention and form different embodiments. For example, in the following claims, any of the claimed embodiments may be used in any combination.
The various techniques described herein may be implemented in connection with hardware or software or, alternatively, with a combination of both. Thus, the methods and apparatus of the present invention, or certain aspects or portions thereof, may take the form of program code (i.e., instructions) embodied in tangible media, such as floppy diskettes, CD-ROMs, hard drives, or any other machine-readable storage medium, wherein, when the program is loaded into and executed by a machine, such as a computer, the machine becomes an apparatus for practicing the invention.
In the case of program code execution on programmable computers, the mobile terminal generally includes a processor, a storage medium readable by the processor (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device. Wherein the memory is configured to store program code; the processor is configured to perform the method of the invention according to instructions in said program code stored in the memory.
Furthermore, some of the described embodiments are described herein as a method or combination of method elements that can be performed by a processor of a computer system or by other means of performing the described functions. A processor having the necessary instructions for carrying out the method or method elements thus forms a means for carrying out the method or method elements. Further, the elements of the apparatus embodiments described herein are examples of the following apparatus: the apparatus is used to implement the functions performed by the elements for the purpose of carrying out the invention.
As used herein, unless otherwise specified the use of the ordinal adjectives "first", "second", "third", etc., to describe a common object, merely indicate that different instances of like objects are being referred to, and are not intended to imply that the objects so described must be in a given sequence, either temporally, spatially, in ranking, or in any other manner.
While the invention has been described with respect to a limited number of embodiments, those skilled in the art, having benefit of this description, will appreciate that other embodiments can be devised which do not depart from the scope of the invention as described herein. Furthermore, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the inventive subject matter. Accordingly, many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the appended claims. The present invention has been disclosed in an illustrative rather than a restrictive sense, and the scope of the present invention is defined by the appended claims.

Claims (8)

1. A display control method, the method being adapted to be executed in a computing device, comprising the steps of:
acquiring depth data of a user's body;
constructing a three-dimensional model of the body of the user from the depth data;
acquiring a three-dimensional garment model of a garment to be tried on and a hollow template corresponding to the three-dimensional garment model, wherein the hollow template represents the front-back relation of coordinates of each vertex in the three-dimensional garment model and the three-dimensional body model of a user;
carrying out transparent display processing on the three-dimensional body model of the user according to the hollow template to obtain a hollow user body; and
overlapping the clothes to be tried on the hollow-out user body to display the effect of trying on the clothes by the user;
wherein the method further comprises the step of pre-establishing a hollow template corresponding to each piece of clothes to be tried on:
selecting different models corresponding to different clothes to be tried on, taking the model figure as an original standard figure, and performing three-dimensional scanning and reconstruction on the body of the model to construct a three-dimensional body model of the model;
searching a second vertex in the three-dimensional body model matched with each first vertex in the three-dimensional garment model of the clothes to be tried on;
judging the front-back relation between the first vertex and the second vertex through the coordinate value of the first vertex and the coordinate value of the matched second vertex;
determining a second vertex in the body three-dimensional model of the model covered by the clothes to be tried on according to the front-back relation; and
taking a body three-dimensional model of a user as a current body three-dimensional model, taking a body three-dimensional model of a model which is acquired in advance and corresponds to clothes to be tried as an original body three-dimensional model, carrying out interpolation operation on the current body three-dimensional model and the original body three-dimensional model, and correspondingly calculating a second vertex coordinate corresponding to the current body three-dimensional model according to a second vertex in the body three-dimensional model of the model covered by the clothes to be tried;
and generating a hollow template according to the determined second vertex coordinates corresponding to the current body three-dimensional model.
2. The method of claim 1, wherein before the step of obtaining the three-dimensional garment model and the corresponding hollow template, the method further comprises a step of pre-establishing the three-dimensional garment model:
acquiring depth data of clothes to be tried on; and
constructing a three-dimensional garment model of the clothes to be tried on according to the depth data.
3. The method of claim 1, wherein determining the front-back relation between the first vertex and the second vertex through the coordinate value of the first vertex and the coordinate value of the matching second vertex comprises:
comparing the Z-axis value of the first vertex coordinate with the Z-axis value of the matching second vertex coordinate:
when the Z-axis value of the first vertex coordinate is smaller than the Z-axis value of the second vertex coordinate, the second vertex is drawn before the matched first vertex;
when the Z-axis value of the first vertex coordinate is not less than the Z-axis value of the second vertex coordinate, the first vertex is drawn before the matching second vertex.
4. The method of claim 3, wherein the step of determining a second vertex in the three-dimensional model of the body covered by the clothes to be tried on according to the front-back relation comprises:
when the first vertex is drawn before the matching second vertex, the corresponding second vertex is covered by the clothes to be tried on, and the coordinate value of the second vertex is recorded.
5. The method of any one of claims 1-4, wherein the hollow template is in a JSON data format.
6. The method of claim 3 or 4, wherein the step of transparently displaying the three-dimensional model of the body of the user according to the hollow template comprises the following steps:
setting the Alpha value of the UV channel of each second vertex recorded in the hollow template to transparent.
7. A computing device, comprising:
one or more processors;
a memory; and
one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the one or more programs comprising instructions for performing any of the methods of claims 1-6.
8. A computer readable storage medium storing one or more programs, the one or more programs comprising instructions, which when executed by a computing device, cause the computing device to perform any of the methods of claims 1-6.
CN201710848204.4A 2017-09-19 2017-09-19 Display control method and computing device Expired - Fee Related CN107609946B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710848204.4A CN107609946B (en) 2017-09-19 2017-09-19 Display control method and computing device


Publications (2)

Publication Number Publication Date
CN107609946A CN107609946A (en) 2018-01-19
CN107609946B true CN107609946B (en) 2020-11-06

Family

ID=61060943

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710848204.4A Expired - Fee Related CN107609946B (en) 2017-09-19 2017-09-19 Display control method and computing device

Country Status (1)

Country Link
CN (1) CN107609946B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108404414B (en) * 2018-03-26 2021-09-24 网易(杭州)网络有限公司 Picture fusion method and device, storage medium, processor and terminal
CN108648061A (en) * 2018-05-18 2018-10-12 北京京东尚科信息技术有限公司 image generating method and device
CN109541812A (en) * 2018-11-12 2019-03-29 西安电子科技大学 A kind of body three-dimensional display apparatus and its control method
CN112704881B (en) * 2020-12-29 2024-07-09 苏州亿歌网络科技有限公司 Dress assembling display method, device, server and storage medium
CN113610960B (en) * 2021-07-19 2024-04-02 福建凯米网络科技有限公司 Method, device and storage medium for preventing mold penetration during three-dimensional model merging

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102156810A (en) * 2011-03-30 2011-08-17 北京触角科技有限公司 Augmented reality real-time virtual fitting system and method thereof
CN103049852A (en) * 2012-12-19 2013-04-17 武汉世纪炎龙网络科技有限公司 Virtual fitting system
US9123176B2 (en) * 2012-06-27 2015-09-01 Reallusion Inc. System and method for performing three-dimensional motion by two-dimensional character
CN106548392A (en) * 2016-10-27 2017-03-29 河海大学常州校区 A kind of virtual fitting implementation method based on webGL technologies
CN106652010A (en) * 2016-10-19 2017-05-10 武汉布偶猫科技有限公司 Transparent mapping method based on annular area


Also Published As

Publication number Publication date
CN107609946A (en) 2018-01-19

Similar Documents

Publication Publication Date Title
CN107609946B (en) Display control method and computing device
EP3370208B1 (en) Virtual reality-based apparatus and method to generate a three dimensional (3d) human face model using image and depth data
CN111787242B (en) Method and apparatus for virtual fitting
US9911220B2 (en) Automatically determining correspondences between three-dimensional models
US9928411B2 (en) Image processing apparatus, image processing system, image processing method, and computer program product
US12100156B2 (en) Garment segmentation
US20150279098A1 (en) Smart device and virtual experience providing server providing virtual experience service method using digitalexperience service method using digital clothes
US10360444B2 (en) Image processing apparatus, method and storage medium
CN113657357B (en) Image processing method, image processing device, electronic equipment and storage medium
US20150269759A1 (en) Image processing apparatus, image processing system, and image processing method
JP6640294B1 (en) Mixed reality system, program, portable terminal device, and method
CN107481280B (en) Correction method of skeleton points and computing device
JP2023065502A (en) Mixed reality display device and mixed reality display method
KR20170019917A (en) Apparatus, method and computer program for generating 3-dimensional model of clothes
JP2013008257A (en) Image composition program
KR102287939B1 (en) Apparatus and method for rendering 3dimensional image using video
KR20190001896A (en) Appartus and method for displaying hierarchical depth image in virtual realilty
US20230196602A1 (en) Real-time garment exchange
CN113781291B (en) Image processing method, device, electronic equipment and storage medium
US11127218B2 (en) Method and apparatus for creating augmented reality content
WO2021179919A1 (en) System and method for virtual fitting during live streaming
WO2021179936A9 (en) System and method for virtual fitting
US20240233272A9 (en) System and method for auto-generating and sharing customized virtual environments
CN112312110B (en) Non-transitory computer readable medium, image processing apparatus, and image processing method
WO2019044333A1 (en) Simulation device, simulation method, and computer program

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20201106