WO2021179936A9 - System and method for virtual fitting - Google Patents
System and method for virtual fitting
- Publication number
- WO2021179936A9 (PCT/CN2021/078533; CN2021078533W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- model
- user
- product
- fit
- segment
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/16—Cloth
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2021—Shape modification
Definitions
- Virtual fitting applications enable users to visualize various products in relation to a particular person.
- However, each of the available types of virtual fitting applications suffers from a number of disadvantages.
- Embodiments of the invention address these and other problems individually and collectively.
- The present invention relates generally to methods and systems related to virtual fitting applications. More particularly, embodiments of the present invention provide methods and systems for determining a level of fit for a user and a product. Embodiments of the present invention are applicable to a variety of applications in virtual reality and computer-based fitting systems.
- The methods involve obtaining a first 3D model associated with a user and obtaining a second 3D model associated with a product.
- The second 3D model may be segmented into a plurality of segments, with each segment of the plurality of segments being associated with one or more material properties.
- The methods further involve fitting the second 3D model onto the first 3D model such that each segment of the plurality of segments is deformed by the first 3D model in accordance with the one or more material properties associated with each segment, and determining a level of fit for the user and the product based on a deformation of each segment of the plurality of segments.
- One embodiment of the disclosure is directed to a method comprising receiving a fitting request comprising at least an indication of a user and a product, obtaining a first 3D model associated with the user, obtaining a second 3D model associated with the product, the second 3D model being segmented into a plurality of segments, wherein each segment of the plurality of segments is associated with one or more material properties, fitting the second 3D model onto the first 3D model such that each segment of the plurality of segments is deformed by the first 3D model in accordance with the one or more material properties associated with each segment, and determining a level of fit for the user and the product based on a deformation of each segment of the plurality of segments.
- Another embodiment of the disclosure is directed to a system comprising a processor, and a memory including instructions that, when executed with the processor, cause the system to, at least receive a fitting request comprising at least an indication of a user and a product, obtain a first 3D model associated with the user, obtain a second 3D model associated with the product, the second 3D model being segmented into a plurality of segments, wherein each segment of the plurality of segments is associated with one or more material properties, fit the second 3D model onto the first 3D model such that each segment of the plurality of segments is deformed by the first 3D model in accordance with the one or more material properties associated with each segment, and determine a level of fit for the user and the product based on a deformation of each segment of the plurality of segments.
- Yet another embodiment of the disclosure is directed to a non-transitory computer readable medium storing specific computer-executable instructions that, when executed by a processor, cause a computer system to at least receive a fitting request comprising at least an indication of a user and a product, obtain a first 3D model associated with the user, obtain a second 3D model associated with the product, the second 3D model being segmented based on material properties, fit the second 3D model onto the first 3D model such that each segment of the second 3D model is deformed by the first 3D model in accordance with the material properties for the segment, and determine a level of fit for the user and the product based on deformation of each segment.
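- As a rough illustration of the flow recited above, the following Python sketch scores a fitted product from per-segment stretch and per-segment material elasticity. The Segment type, its field names, and the averaging rule are assumptions for illustration, not definitions taken from this disclosure.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    """One segment of a product's 3D model, annotated with material properties."""
    name: str
    elasticity: float     # maximum fractional stretch the material tolerates (assumed)
    stretch: float = 0.0  # fractional stretch recorded while fitting onto the user

def level_of_fit(segments: list[Segment]) -> float | None:
    """Average ratio of applied stretch to allowed stretch across segments.
    Returns None when any segment is deformed beyond its material limit,
    i.e. the product cannot fit the user."""
    ratios = []
    for seg in segments:
        if seg.stretch > seg.elasticity:
            return None  # deformation exceeds what this segment's material allows
        ratios.append(seg.stretch / seg.elasticity)
    return sum(ratios) / len(ratios)

# Example: a two-segment garment after fitting; 0.0 is loose, 1.0 is at the limit.
print(level_of_fit([Segment("torso", 0.30, 0.12), Segment("sleeve", 0.20, 0.05)]))
```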
- Embodiments of the present disclosure involve methods and systems that provide an accurate level of fit for a particular product with respect to a user. While there are a number of available virtual fitting systems, they are ineffective at providing accurate fit information. For example, some systems use digital avatars of human users and then fit digital garments onto the avatars. Some systems use a camera or other device to track a user's movement or joint positions of the user (e.g., a knee or elbow) and use these joints to drive the motion of a virtual avatar. Some other systems track a contour of the user; images of virtual garments are then deformed and overlaid on top of the user image to create an augmented view as if the user is wearing the garment.
- FIG. 1 depicts an illustrative example of a system in which records of objects and their positioning data may be generated automatically in accordance with at least some embodiments.
- FIG. 2 depicts a system architecture for a system that determines a level of fit for a user and a product in accordance with at least some embodiments.
- FIG. 3 depicts techniques for segmenting a 3D model of a product in accordance with at least some embodiments.
- FIG. 4 is a simplified flowchart illustrating a method of determining a level of fit for a product and a user according to an embodiment of the present invention.
- FIG. 5 depicts an illustrative example of a technique for obtaining 3D models using sensor data in accordance with at least some embodiments.
- FIG. 6 illustrates an example technique for fitting a 3D model for a product onto a 3D model for a user in order to determine a level of fit in accordance with at least some embodiments.
- FIG. 7 illustrates a flow diagram depicting a process for determining a level of fit for a product and user in accordance with at least some embodiments.
- FIG. 8 illustrates examples of components of a computer system 800 according to certain embodiments.
- The present invention relates generally to methods and systems related to virtual reality applications. More particularly, embodiments of the present invention provide methods and systems for determining a level of fit for a user and a product. Embodiments of the present invention are applicable to a variety of applications in virtual reality and computer-based fitting systems.
- FIG. 1 depicts an illustrative example of a system in which a level of fit may be generated for a user and product in some embodiments of the present invention.
- a user device 102 may be used to provide a request to a mobile application server 104 for product fit information.
- the user device in some cases, may be used to obtain user data 106, which may be provided to the mobile application server 104 to be used in generating product fit information.
- the mobile application server 104 may include, or have access to, object model data 108 from which product data 110 may be obtained in order to complete the request.
- the mobile application server 104 may be configured to combine the user data 106 and the product data 110 in order to determine a level of fit for a particular user.
- the mobile application server 104 may, in some embodiments, provide virtual fit data 112 back to the user device 102, which may be rendered on a display for a user to view.
- the user device 102 represents a suitable computing device that includes one or more graphical processing units (GPUs) , one or more general purpose processors (GPPs) , and one or more memories storing computer-readable instructions that are executable by at least one of the processors to perform various functionalities of the embodiments of the present disclosure.
- user device 102 can be any of a smartphone, a tablet, a laptop, a personal computer, a gaming console, or a smart television.
- the user device 102 may additionally include a range camera (i.e., depth sensor) and/or a RGB optical sensor, such as a camera.
- the user device may be used to capture and/or generate user data 106.
- User data 106 may include information related to a particular user (e.g., a user of the user device 102) for which a level of fit should be determined with respect to a product.
- the user data 106 may include data about the user which may be used to determine the level of fit.
- user data 106 may include dimensions of the user.
- User data 106 may be captured in any suitable format.
- user data 106 may include a point cloud, a 3D mesh or model, or a string of characters that includes measurements at predetermined locations. In some cases, capturing user data 106 may involve receiving information about the user which is manually input into the user device 102.
- a user may input measurements for various parts of his or her body via a keypad.
- capturing user data 106 may involve using a camera and/or a depth sensor to capture images /depth information related to the user.
- the user device 102 may be further configured to generate a 3D model from the captured images/depth information. This process is described in greater detail with respect to FIG. 5 below.
- the mobile application server 104 may include any computing device capable of determining a level of fit for a user in accordance with the techniques described herein. In order to make such a determination, the mobile application server 104 may receive user data 106 from the user device 102 as well as product data 110 from object model data 108. It should be noted that while the user data 106 may be received by the mobile application server 104 at the same time that the mobile application server 104 receives a request to determine a level of fit for a particular user and a particular product, the mobile application server 104 may also receive user data 106 prior to, and independent of, any request to determine a level of fit. For example, the mobile application server 104 may receive the user data 106 during an enrollment phase during which a user establishes an account with the mobile application server 104.
- Object model data 108 may include any computer-readable storage medium having stored thereon one or more 3D models.
- the object model data 108 may be a database maintained by the mobile application server or another server.
- the 3D models stored in object model data 108 may be representative of products which can be worn by a user, such as clothing items (e.g., garments) or accessories.
- the object model data 108 may store 3D models for multiple versions of a product (e.g., different sizes and/or styles) .
- the mobile application server 104 retrieves product data 110, which includes a 3D model associated with the particular product, from the object model data 108.
- the mobile application server 104 determines a level of fit by fitting a 3D model of the product (product data 110) onto a 3D model of a user (user data 106) . This may involve deforming portions of the 3D model of the product based on material properties associated with those portions of the 3D model. For example, portions of the 3D model of the product may be associated with a level of elasticity or rigidity such that the portion of the 3D model may be deformed in accordance with that level of elasticity or rigidity.
- the level of fit can be determined based on a degree to which deformations are made to the 3D model. For example, where a portion of the 3D model is associated with a level of elasticity (e.g., an amount that the portion can be stretched), a level of fit may be determined based on the extent to which that portion of the 3D model is stretched. In other words, a level of fit may involve a measurement of how much a portion of the 3D model is stretched as a percentage of the total amount that the portion can be stretched. If the amount that the portion of the 3D model is stretched is high in relation to the total elasticity (e.g., the amount that the portion can be stretched), then it may be determined that the product is a tight fit. In some cases, if the amount that the portion is stretched is greater than a threshold fit value, it may be determined that the product is not a fit for this user. Different portions of the 3D model of the product may each be associated with different threshold fit values.
- the level of fit can be determined based on an amount of space between one or more portions of the 3D model of the product and the 3D model of the user. For example, a total volume of space may be determined for a volume located between the 3D model of the product and the 3D model of the user. In this example, if that total volume of space is greater than some threshold fit value, it may be determined that the product is not a fit for this user. In some cases, such a threshold fit value may be proportional to a size of the 3D model of the product or the 3D model of the user.
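- As an illustrative sketch of the space-based check described above, the following Python approximates the volume between the garment and the body by summing per-patch gap distance times patch area and compares it with a threshold proportional to the user model's volume. The sampling scheme and the ratio_limit constant are assumptions, not values from this disclosure.

```python
import numpy as np

def too_loose(gap_distances: np.ndarray, patch_areas: np.ndarray,
              user_volume: float, ratio_limit: float = 0.15) -> bool:
    """Approximate the volume between garment and body by summing
    (body-to-garment distance) x (local garment patch area), then compare
    it with a threshold proportional to the user model's size."""
    space = float(np.sum(gap_distances * patch_areas))
    return space > ratio_limit * user_volume

# Example: a uniform 2 cm gap over 1.2 m^2 of garment on a 0.07 m^3 body.
gaps = np.full(100, 0.02)          # metres, sampled at 100 patches
areas = np.full(100, 1.2 / 100)    # square metres per patch
print(too_loose(gaps, areas, user_volume=0.07))  # True: product is too large
```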
- determining that the product does not fit the user may prompt the mobile application server 104 to automatically (e.g., without further prompt by the user) determine a level of fit for a different version of the product (e.g., a different size or style of the product) by retrieving a second 3D model associated with that version of the product and repeating the process described above.
- For clarity, a certain number of components are shown in FIG. 1. It is understood, however, that embodiments of the invention may include more than one of each component. In addition, some embodiments of the invention may include fewer or more than all of the components shown in FIG. 1. The components in FIG. 1 may communicate via any suitable communication medium (including the Internet), using any suitable communication protocol.
- FIG. 2 depicts a system architecture for a system that determines a level of fit for a user and a product in accordance with at least some embodiments.
- a user device 202 may be in communication with a number of other components, including at least a mobile application server 204.
- the mobile application server 204 may perform at least a portion of the processing functions required by a mobile application installed upon the user device.
- the user device 202 and mobile application server 204 may be examples of the user device 102 and mobile application server 104 respectively described with respect to FIG. 1.
- a user device 202 may be any suitable electronic device that is capable of providing at least a portion of the capabilities described herein.
- the user device 202 may be any electronic device capable of capturing user data and/or presenting a rendered image.
- a user device may be capable of establishing a communication session with another electronic device (e.g., mobile application server 204) and transmitting /receiving data from that electronic device.
- a user device may include the ability to download and/or execute mobile applications.
- User devices may include mobile communication devices as well as personal computers and thin-client devices.
- a user device may comprise any portable electronic device that has a primary function related to communication.
- a user device may be a smart phone, a personal digital assistant (PDA), or any other suitable handheld device.
- the user device can be implemented as a self-contained unit with various components (e.g., input sensors, one or more processors, memory, etc. ) integrated into the user device.
- Reference in this disclosure to an “output” of a component or an “output” of a sensor does not necessarily imply that the output is transmitted outside of the user device. Outputs of various components might remain inside a self-contained unit that defines a user device.
- the user device 202 may include at least one memory 206 and one or more processing units (or processor (s) ) 208.
- the processor (s) 208 may be implemented as appropriate in hardware, computer-executable instructions, firmware or combinations thereof. Computer-executable instruction or firmware implementations of the processor (s) 208 may include computer-executable or machine executable instructions written in any suitable programming language to perform the various functions described.
- the user device 202 may also include one or more input sensors 210 for receiving user and/or environmental input. There may be a variety of input sensors 210 capable of detecting user or environmental input, such as an accelerometer, a camera device, a depth sensor, a microphone, a global positioning system (e.g., GPS) receiver, etc.
- the one or more input sensors 210 may include a range camera device (e.g., a depth sensor) capable of generating a range image, as well as a camera device configured to capture image information.
- a range camera may be any device configured to identify a distance or range of an object or objects from the range camera.
- the range camera may generate a range image (or range map) , in which pixel values correspond to the detected distance for that pixel.
- the pixel values can be obtained directly in physical units (e.g., meters) .
- the user device may employ a range camera that operates using structured light.
- in a range camera that operates using structured light, a projector projects light onto an object or objects in a structured pattern. The light may be of a wavelength outside of the visible range (e.g., infrared or ultraviolet).
- the range camera may be equipped with one or more camera devices configured to obtain an image of the object with the reflected pattern. Distance information may then be generated based on distortions in the detected pattern. It should be noted that although this disclosure focuses on the use of a range camera using structured light, any suitable type of range camera, including those that operate using stereo triangulation, sheet of light triangulation, time-of-flight, interferometry, coded aperture, or any other suitable technique for range detection, would be useable by the described system.
- the memory 206 may store program instructions that are loadable and executable on the processor (s) 208, as well as data generated during the execution of these programs.
- the memory 206 may be volatile (such as random access memory (RAM) ) and/or non-volatile (such as read-only memory (ROM) , flash memory, etc. ) .
- the user device 202 may also include additional storage 212, such as either removable storage or non-removable storage including, but not limited to, magnetic storage, optical disks, and/or tape storage.
- the disk drives and their associated computer-readable media may provide non-volatile storage of computer-readable instructions, data structures, program modules, and other data for the computing devices.
- the memory 206 may include multiple different types of memory, such as static random access memory (SRAM) , dynamic random access memory (DRAM) or ROM.
- the memory 206 may include an operating system 214 and one or more application programs or services for implementing the features disclosed herein including at least a mobile application 216.
- the memory 206 may also include application data 218, which provides information to be generated by and/or consumed by the mobile application 216.
- the application data 218 may be stored in a database.
- a mobile application may be any set of computer executable instructions installed upon, and executed from, a user device 202.
- Mobile applications may be installed on a user device by a manufacturer of the user device or by another entity.
- the mobile application 216 may cause a user device to establish a communication session with a mobile application server 204 that provides backend support for the mobile application 216.
- a mobile application server 204 may maintain account information associated with a particular user device and/or user.
- a user may be required to log into a mobile application in order to access functionality provided by the mobile application 216.
- the mobile application 216 is configured to provide user information to the mobile application server 204 and to present information received from the mobile application server 204 to a user. More particularly, the mobile application 216 is configured to obtain measurement data for a user and to submit that measurement data to the mobile application server 204 in relation to a request for a level of fit for a product. In some embodiments, the mobile application 216 may also receive an indication of a product from a user for which a level of fit is to be determined.
- the mobile application 216 may receive output from the input sensors 210 and generate a 3D model based upon that output.
- the mobile application 216 may receive depth information (e.g., a range image) from a depth sensor (e.g., a range camera) , such as the depth sensors previously described with respect to input sensors 210 as well as image information from a camera input sensor. Based on this information, the mobile application 216 may determine the bounds of an object (e.g., a user) to be identified. For example, a sudden variance in depth within the depth information may indicate a border or outline of an object. In another example, the mobile application 216 may utilize one or more machine vision techniques and/or machine learning to identify the bounds of an object.
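- A minimal sketch of the depth-variance idea above, assuming NumPy and a metric depth map: pixels whose depth differs sharply from a neighbour are marked as candidate object borders. The jump threshold is an assumed tunable value.

```python
import numpy as np

def depth_edges(depth: np.ndarray, jump: float = 0.10) -> np.ndarray:
    """Mark pixels whose depth differs from a neighbour by more than `jump`
    (same units as the depth map); such sudden variances suggest an object
    border or outline."""
    dy = np.abs(np.diff(depth, axis=0, prepend=depth[:1, :]))
    dx = np.abs(np.diff(depth, axis=1, prepend=depth[:, :1]))
    return np.maximum(dx, dy) > jump

# Example: a box-shaped object at 1 m in front of a flat background at 3 m.
d = np.full((6, 8), 3.0)
d[2:5, 3:6] = 1.0
print(depth_edges(d).astype(int))  # 1s trace the outline of the object
```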
- the mobile application 216 may receive image information from a camera input sensor 210 and may identify potential objects within the image information based on variances in color or texture data detected within the image or based on learned patterns. In some embodiments, the mobile application 216 may cause the user device 202 to transmit the output obtained from the input sensors 210 to the mobile application server 204, which may then perform one or more object recognition techniques upon that output.
- the user device 202 may also contain communications interface (s) 220 that enable the user device 202 to communicate with any other suitable electronic devices.
- the communication interface 220 may enable the user device 202 to communicate with other electronic devices on a network (e.g., on a private network) .
- the user device 202 may include a Bluetooth wireless communication module, which allows it to communicate with another electronic device.
- the user device 202 may also include input/output (I/O) device (s) and/or ports 222, such as for enabling connection with a keyboard, a mouse, a pen, a voice input device, a touch input device, a display, speakers, a printer, etc.
- the user device 202 may communicate with the mobile application server 204 via a communication network.
- the communication network may include any one or a combination of many different types of networks, such as cable networks, the Internet, wireless networks, cellular networks, and other private and/or public networks.
- the communication network may comprise multiple different networks.
- the user device 202 may utilize a wireless local area network (WLAN) to communicate with a wireless router, which may then route the communication over a public network (e.g., the Internet) to the mobile application server 204.
- the mobile application server 204 may be any computing device or plurality of computing devices configured to perform one or more calculations on behalf of the mobile application 216 on the user device 202.
- the mobile application 216 may be in periodic communication with the mobile application server 204.
- the mobile application 216 may receive updates, push notifications, or other instructions from the mobile application server 204.
- the mobile application 216 and mobile application server 204 may utilize a proprietary encryption and/or decryption scheme to secure communications between the two.
- the mobile application server 204 may be executed by one or more virtual machines implemented in a hosted computing environment.
- the hosted computing environment may include one or more rapidly provisioned and released computing resources, which computing resources may include computing, networking, and/or storage devices.
- a hosted computing environment may also be referred to as a cloud-computing environment.
- the mobile application server 204 may include at least one memory 224 and one or more processing units (or processor (s) ) 226.
- the processor (s) 226 may be implemented as appropriate in hardware, computer-executable instructions, firmware or combinations thereof.
- Computer-executable instruction or firmware implementations of the processor (s) 226 may include computer-executable or machine executable instructions written in any suitable programming language to perform the various functions described.
- the memory 224 may store program instructions that are loadable and executable on the processor (s) 226, as well as data generated during the execution of these programs.
- the memory 224 may be volatile (such as random access memory (RAM) ) and/or non-volatile (such as read-only memory (ROM) , flash memory, etc. ) .
- the mobile application server 204 may also include additional storage 228, such as either removable storage or non-removable storage including, but not limited to, magnetic storage, optical disks, and/or tape storage.
- the disk drives and their associated computer-readable media may provide non-volatile storage of computer-readable instructions, data structures, program modules, and other data for the computing devices.
- the memory 224 may include multiple different types of memory, such as static random access memory (SRAM) , dynamic random access memory (DRAM) or ROM.
- the memory 224 may include an operating system 230 and one or more application programs or services for implementing the features disclosed herein including at least a module for fitting a 3D model of a product onto a 3D model of a user (deforming module 232) and/or a module for determining a level of fit for a combination of a 3D model of a product and a 3D model of a user (fit estimation module 234) .
- the memory 224 may also include account data 236, which provides information associated with user accounts maintained by the described system, user model data 238, which maintains 3D models associated with each user of an account, and/or object model data 240, which maintains 3D models associated with a number of objects (products) .
- account data 236, the user model data 238, or the object model data 240 may be stored in a database.
- object model data 240 may be an electronic catalog that includes data related to objects available for sale from a resource provider, such as a retailer or other suitable merchant.
- the memory 224 and the additional storage 228, both removable and non-removable, are examples of computer-readable storage media.
- computer-readable storage media may include volatile or non-volatile, removable or non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data.
- modules may refer to programming modules executed by computing systems (e.g., processors) that are installed on and/or executed from the mobile application server 204.
- the mobile application server 204 may also contain communications connection (s) 242 that allow the mobile application server 204 to communicate with a stored database, another computing device or server, user terminals, and/or other components of the described system.
- the mobile application server 204 may also include input/output (I/O) device (s) and/or ports 244, such as for enabling connection with a keyboard, a mouse, a pen, a voice input device, a touch input device, a display, speakers, a printer, etc.
- the memory 224 may include the deforming module 232, the fit estimation module 234, the database containing account data 236, the database containing user model data 238, and/or the database containing object model data 240.
- the deforming module 232 may be configured to, in conjunction with the processors 226, apply deformations to a 3D model of a product in order to fit it onto a 3D model of a user.
- the deforming module 232 may have access to one or more rules which delineate how specific product types (e.g., shirt, pants, etc.) should be deformed (e.g., stretched and/or bent) in order to be fit onto a user model.
- the deforming module 232 may snap certain portions of the 3D model of the product onto specific portions of the 3D model of the user.
- the 3D model of a shirt may be positioned so that sleeves of the 3D model of the shirt encompass arms of a 3D model of a user. Additionally, the 3D model of a shirt may also be positioned so that the collar of the 3D model of the shirt encompasses a neck of a 3D model of a user. The remainder of the 3D model of the shirt may then be deformed such that the interior of the 3D model of the shirt lies outside of or along the exterior of the 3D model of the user by stretching and bending the portions of the 3D model of the shirt.
- the deforming module 232 may, after fitting the 3D model of the product onto the 3D model of a user, determine a level of deformation for each segment of the 3D model of the product.
- the deforming module 232 may record an amount of stretching or bending which has been performed for each segment of the 3D model of the product. For example, the deforming module 232 may subtract an amount of surface area for a particular segment of the 3D model of the product before fitting it onto the 3D model of the user from an amount of surface area for that same particular segment of the 3D model of the product after fitting it onto the 3D model of the user. In this example, the difference in surface area would represent the amount that the segment has been stretched.
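- The surface-area subtraction described above might be computed as in the following sketch, which measures triangle-mesh area before and after fitting and reports the change as a fractional stretch; the mesh representation (a vertex array plus face indices) is an assumption.

```python
import numpy as np

def mesh_area(vertices: np.ndarray, faces: np.ndarray) -> float:
    """Total surface area of a triangle mesh: half the norm of the cross
    product of each triangle's two edge vectors, summed over all faces."""
    tri = vertices[faces]  # shape (num_faces, 3 corners, 3 coords)
    cross = np.cross(tri[:, 1] - tri[:, 0], tri[:, 2] - tri[:, 0])
    return float(0.5 * np.linalg.norm(cross, axis=1).sum())

def segment_stretch(rest_vertices, fitted_vertices, faces) -> float:
    """Fractional stretch of one segment: the gain in surface area after
    fitting, relative to the segment's rest-pose area."""
    rest = mesh_area(rest_vertices, faces)
    return (mesh_area(fitted_vertices, faces) - rest) / rest

# Example: one triangle scaled by 1.1 in-plane gains about 21% surface area.
v = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
f = np.array([[0, 1, 2]])
print(round(segment_stretch(v, v * 1.1, f), 3))  # 0.21
```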
- the deforming module 232 may be further configured to change a pose of the 3D model of the user in order to estimate a level of deformation of the 3D model of the product for that pose. In some cases, the deforming module 232 may determine a level of deformation for several different poses.
- the fit estimation module 234 may be configured to, in conjunction with the processors 226, determine a level of fit for a product and a user. To do this, the fit estimation module 234 may determine whether the level of deformation determined by the deforming module 232 for each segment of the 3D model of the product is within an acceptable range of deformation based on material properties for that segment.
- Each 3D model of a product stored within the object model data 240 may be associated with various material properties.
- a 3D model of a product may be segmented into a plurality of segments, where each segment is associated with separate material properties.
- a 3D model of a product, or a segment of that 3D model of a product may be associated with a particular material and the fit estimation module 234 may be configured to determine one or more material properties based on that material (e.g., from a material information database) . In determining a level of fit, the fit estimation module 234 may determine a maximum level of deformation for each segment of the 3D model of the product based on its corresponding material properties.
- a maximum amount of available stretch may be determined for a segment based on an elasticity value for that segment.
- the fit estimation module 234 may then determine, for each segment of the 3D model of the product, whether the level of deformation determined by the deforming module 232 is greater than the maximum level of deformation for that segment. This may be repeated for a number of different poses of the 3D model of the user. If the level of deformation determined by the deforming module 232 is greater than the maximum level of deformation for a segment, then the fit estimation module 234 may determine that the product cannot fit the user. In these cases, the mobile application server 204 may select a different version of the product for which to determine a level of fit. For example, the mobile application server 204 may select the next larger size of the product and determine a level of fit for that product.
- the fit estimation module 234 may determine that a product is too large for a user. To do this, the fit estimation module 234 may determine that one or more segments of the 3D model of the product extends past one or more bounds associated with the product type. For example, if a sleeve segment of a 3D model of a shirt extends past a wrist of a 3D model of a user, then the fit estimation module 234 may determine that the product is too large for the user. In some embodiments, the fit estimation module 234 may determine the product is too large for the user based upon a volume of space located between the 3D model of the product and the 3D model of the user exceeding some threshold.
- the mobile application server 204 may select a different version of the product for which to determine a level of fit. For example, the mobile application server 204 may select the next smaller size of the product and determine a level of fit for that product. If the mobile application server 204 encounters a scenario in which the current product is too large and the next smaller size of the product is too small, then the mobile application server 204 may provide a notification that there is no good fit for this particular user and product combination.
- a level of fit may be represented by a numerical value. Such a value may correspond to a determined level of deformation divided by a maximum level of deformation.
- a level of fit may represent an overall value for the product. For example, the level of fit for a product may be generated as an average of the level of fit for each segment of the 3D model of the product.
- each of the object entries within the object model data 240 may be associated with a three-dimensional (3D) model of that object.
- the 3D model may be combined with a second 3D model of a user and provided to the mobile application 216 such that the user device 202 is caused to display the combination of the 3D models on a display of the user device.
- the mobile application 216 may enable a user of the user device 202 to move, rotate, or otherwise reposition the combination of 3D models in order to see how the combinations would appear in a new position.
- FIG. 3 depicts techniques for segmenting a 3D model of a product in accordance with at least some embodiments.
- FIG. 3 depicts an illustrative example of a 3D model for a product 302 which may be stored in an object model database (e.g., object model data 240 of FIG. 2) .
- the 3D model 302 is depicted as being segmented into a number of segments 304 (A-E) , each of which includes information about a different portion of the 3D model 302.
- Such a 3D model 302 may be segmented in any logical fashion.
- where the product is a garment, the 3D model may be segmented along seams of that garment.
- the 3D model may be segmented according to material type.
- the system described herein may store an indication of a number of material properties in relation to a 3D model, some of which may be stored in relation to particular segments of the 3D model. Some of the material properties may indicate a physical property of the material (such as elasticity or rigidity) which impact how the material can be deformed whereas some material properties may indicate a visual property of the material (such as sheen or reflectivity) which impact how the material is rendered.
- the segment may be associated with a particular material and the material properties of that segment may be determined by querying a material property database.
- Annotations may be associated with their respective segments in any suitable manner.
- such an annotation may be stored as metadata appended to particular segments of a 3D model.
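- For illustration only, a per-segment annotation of this kind might look like the following Python dictionary; every field name and value here is hypothetical and not taken from this disclosure.

```python
# Hypothetical annotation for one garment segment, of the sort that could be
# appended to the segment as metadata.
segment_annotation = {
    "segment": "304A",
    "material": "cotton-spandex jersey",
    "physical": {             # governs how the segment may be deformed
        "elasticity": 0.25,   # maximum fractional stretch
        "rigidity": 0.10,     # resistance to bending
    },
    "visual": {               # governs how the segment is rendered
        "sheen": 0.05,
        "reflectivity": 0.02,
    },
}
```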
- a user may manually generate annotations for a 3D model within a computer-aided design (CAD) software application.
- a manufacturer of a product may generate a 3D model of that product, segment that 3D model based on any logical feature, and then annotate each segment with its material properties.
- the 3D model may then be provided to a mobile application server, where it may be stored in an object model database in association with a particular product.
- annotation 306 represents various material information stored in relation to segment 304 (A) of 3D model 302.
- Each other segment 304 (B-E) may include an annotation similar to annotation 306 which includes an indication of a material and/or material properties.
- boundaries between segments may be separately associated with an annotation.
- seam 308 may represent a boundary between segment 304(B) and 304 (E) and may include an annotation 310 associated with a thread material used to connect the two segments.
- FIG. 4 is a simplified flowchart illustrating a method of determining a level of fit for a product and a user according to an embodiment of the present invention.
- the flow is described in connection with a computer system that is an example of the computer systems described herein.
- Some or all of the operations of the flows can be implemented via specific hardware on the computer system and/or can be implemented as computer-readable instructions stored on a non-transitory computer-readable medium of the computer system.
- the computer-readable instructions represent programmable modules that include code executable by a processor of the computer system. The execution of such instructions configures the computer system to perform the respective operations.
- Each programmable module in combination with the processor represents a means for performing a respective operation (s) . While the operations are illustrated in a particular order, it should be understood that no particular order is necessary and that one or more operations may be omitted, skipped, and/or reordered.
- the method includes receiving a request to determine a level of fit for a product and a user. For example, a user may, when considering whether to purchase a particular product offered by an online retailer, request a determination as to a level of fit for that product with respect to the user. Such a request may be submitted by the user in order to receive an appropriate size recommendation (since garment sizes often vary between brands) .
- the request is transmitted by a user device operated by the user and subsequently received by a mobile application server.
- the method includes identifying the user associated with the received request.
- the user maintains an account with a mobile application server and the user is identified by virtue of login credentials provided by the user.
- the user is identified via a user device identifier received in the request.
- the method includes retrieving a user model to be used in determining a level of fit for the request. In some embodiments, this involves retrieving a 3D model of a user stored in association with the identified user. In some embodiments, this involves generating a 3D model of a user based on information about the user (e.g., measurement data) . In some embodiments, a 3D model of the user may be received from a user device, either along with the request or separately. It should be noted that techniques for using a user device to generate a 3D model of an object (e.g., a user) are described in greater detail with respect to FIG. 5 below.
- the method includes identifying a product associated with the request.
- the request includes an identifier for the product which may be used to identify it.
- the request includes a stock-keeping unit (SKU) number or other suitable identifier.
- a user viewing a website dedicated to a particular product may initiate the request via a button on the website, in which case the product is identified by virtue of being associated with the website.
- the method includes retrieving a 3D model of the product. This involves querying an object model database for the identified product.
- the method includes fitting the retrieved 3D model of the product to the 3D model of the user.
- various contact points on the 3D model of the product are snapped to particular vertices or points located on the 3D model of the user.
- the 3D model of the product is then deformed such that the interior surface of the 3D model of the product lies outside of (i.e., does not overlap with) the exterior surface of the 3D model of the user.
- the amount that the 3D model of the product is deformed is noted.
- the method includes determining whether the noted amount that the 3D model of the product is deformed is greater than some threshold deformation value.
- the threshold deformation value is determined as a function of one or more material property values stored in relation to the 3D model (or a segment therein).
- if the noted amount of deformation exceeds the threshold deformation value, the method results in a determination that the product does not fit the user.
- in that case, the method further involves updating a size and/or style of the product at 416 and then repeating the method from step 410.
- the method includes determining a level of fit for the product and user. More particularly, the method includes determining a level of fit as a numerical value based on the relationship between the amount that the 3D model of the product is deformed and the threshold deformation value. This numerical level of fit is then provided in response to the request.
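- A compact sketch of the FIG. 4 loop in Python: try each available size or style in turn (step 416) until the noted deformation from the fitting step (step 410) falls within the threshold, then report the level of fit as a ratio. The deform callable and the toy values are stand-ins, not part of this disclosure.

```python
def fit_with_resizing(user_model, versions, deform, threshold):
    """Try each version (size/style) of the product in turn, per FIG. 4:
    `deform` stands in for the fitting step (step 410) and returns the noted
    amount of deformation; moving to the next version stands in for updating
    the size/style (step 416). Returns the first version that fits and its
    numerical level of fit."""
    for version in versions:
        amount = deform(user_model, version)
        if amount <= threshold:
            return version, amount / threshold  # level of fit as a ratio
    return None, None  # no available version fits this user

# Toy example: larger sizes require less deformation of the product model.
needed = {"S": 0.9, "M": 0.5, "L": 0.2}
print(fit_with_resizing(None, ["S", "M", "L"], lambda u, v: needed[v], 0.6))
```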
- FIG. 4 provides a particular method of determining a level of fit for a product and a user according to an embodiment of the present invention. As noted above, other sequences of steps may also be performed according to alternative embodiments. For example, alternative embodiments of the present invention may perform the steps outlined above in a different order. Moreover, the individual steps illustrated in FIG. 4 may include multiple sub-steps that may be performed in various sequences as appropriate to the individual step. Furthermore, additional steps may be added or removed depending on the particular applications. One of ordinary skill in the art would recognize many variations, modifications, and alternatives.
- FIG. 5 depicts an illustrative example of a technique for obtaining 3D models using sensor data in accordance with at least some embodiments.
- sensor data 502 may be obtained from one or more input sensors installed upon a user device.
- the captured sensor data 502 includes image information 504 captured by a camera device as well as depth map information 506 captured by a depth sensor.
- the sensor data 502 may include image information 504.
- One or more image processing techniques may be used on image information 504 in order to identify one or more objects within that image information 504.
- edge detection may be used to identify a section 508 within the image information 504 that includes an object.
- discontinuities in brightness, color, and/or texture may be identified across an image in order to detect edges of various objects within the image.
- Section 508 depicts an illustrative example image of a chair in which such discontinuities have been emphasized.
- the sensor data 502 may include depth information 506.
- within the depth information 506, a value may be assigned to each pixel that represents a distance between the user device and a particular point corresponding to the location of that pixel.
- the depth information 506 may be analyzed to detect sudden variances in depth within the depth information 506. For example, sudden changes in distance may indicate an edge or a border of an object within the depth information 506.
- the sensor data 502 may include both image information 504 and depth information 506.
- objects may first be identified in either the image information 504 or the depth information 506 and various attributes of the objects may be determined from the other information.
- edge detection techniques may be used to identify a section of the image information 504 that includes an object 508.
- the section 508 may then be mapped to a corresponding section 510 in the depth information to determine depth information for the identified object (e.g., a point cloud) .
- a section 510 that includes an object may first be identified within the depth information 506.
- the section 510 may then be mapped to a corresponding section 508 in the image information to determine appearance attributes for the identified object (e.g., color or texture values) .
- a point cloud for the object may be generated from the depth information and/or image information and compared to point cloud data stored in a database to identify a closest matching 3D model.
- a 3D model of an object (e.g., a user or a product) may be generated from the captured sensor data. For example, a mesh may be created from point cloud data obtained from a section 510 of depth information 506. The system may then map appearance data from the section of image information 504 corresponding to section 510 onto the mesh to generate a basic 3D model.
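- One plausible implementation of this mesh-plus-appearance step, assuming the open-source Open3D library (not named in this disclosure): build a coloured point cloud, estimate normals, and run Poisson surface reconstruction. The octree depth parameter is an assumed setting.

```python
import numpy as np
import open3d as o3d  # assumed third-party dependency

def basic_model_from_sensors(xyz: np.ndarray, rgb: np.ndarray):
    """Build a rough coloured mesh from point-cloud data (section 510) and
    per-point colours sampled from the matching image region (section 508)."""
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(xyz)
    pcd.colors = o3d.utility.Vector3dVector(rgb)  # appearance mapped onto points
    pcd.estimate_normals()                        # Poisson reconstruction needs normals
    mesh, _ = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
        pcd, depth=8)                             # assumed octree depth
    return mesh
```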
- sensor data captured by a user device may be used to generate a 3D model of a user using the techniques described above.
- This 3D model of a user may then be provided to a mobile application server as user data.
- sensor data may be used to generate a 3D model of a product, which may then be stored in an object model database (e.g., object model data 240 of FIG. 2).
- a user wishing to sell a product may capture sensor data related to the product from his or her user device.
- the user’s user device may then generate a 3D model in the manner outlined above and may provide that 3D model to the mobile application server.
- FIG. 6 illustrates an example technique for fitting a 3D model for a product onto a 3D model for a user in order to determine a level of fit in accordance with at least some embodiments.
- FIG. 6 depicts a 3D model of a user 602 and a 3D model of a product 604. It should be noted that unless the 3D models are to be rendered (which they may not be) , there is no need to include image data in a 3D model. Accordingly, a 3D model may include only structural (e.g., mesh) data for the purposes of this technique.
- the 3D model of the product 604 may include a number of contact points which are intended to be snapped to particular areas of the 3D model of the user 602. For example, a contact point 606 (A) on a shoulder strap of the 3D model of the product 604 may be snapped to a particular vertex or point 606 (B) located on a shoulder of the 3D model of the user 602. This may be repeated for a number of contact points located throughout the 3D model of the product 604.
- contact points for a product may be determined automatically based on a category or type of the 3D model of the product 604. For example, all 3D models of t-shirts may utilize the same set of contact points.
- contact points for a particular 3D model of a product 604 may be set by a user, such as a manufacturer of the product.
- a rigid body transformation can be estimated from the corresponding points.
- a rigid body transformation may consist of a translation and rotation.
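- The rigid body transformation mentioned above can be estimated from corresponding contact points with the standard Kabsch/SVD method, sketched below in NumPy; the choice of Kabsch here is an assumption, as this disclosure does not name a specific estimation algorithm.

```python
import numpy as np

def rigid_transform(src: np.ndarray, dst: np.ndarray):
    """Estimate the rotation R and translation t that best map the product's
    contact points (src) onto the matching user-model points (dst), using
    the Kabsch/SVD method on the corresponding point pairs."""
    sc, dc = src.mean(axis=0), dst.mean(axis=0)  # centroids
    U, _, Vt = np.linalg.svd((src - sc).T @ (dst - dc))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dc - R @ sc
    return R, t

# Example: recover a pure translation applied to three contact points.
src = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
R, t = rigid_transform(src, src + np.array([0.2, 0.5, 0.0]))
print(np.round(R, 3), np.round(t, 3))  # identity rotation, (0.2, 0.5, 0.0)
```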
- the 3D model of the product 604 may be deformed such that the interior surface of the 3D model of the product 604 lies against or outside of an exterior surface of the 3D model of the user 602. This may involve stretching and/or bending the 3D model of the product 604 in order to move one or more surfaces toward the exterior of the 3D model of the user 602 without removing the contact points of the 3D model of the product 604 from their current locations on the 3D model of the user 602. As the 3D model of the product 604 is deformed in this manner, the amount of deformation associated with each portion of the 3D model of the product 604 is noted.
- a physics engine may be used to accurately predict the deformation of the material in the 3D model of the product 604 as it is deformed. Adjustments of certain parts of the 3D model of the product 604 can be done using mesh editing techniques (e.g., Laplacian Surface Editing), in which one or more ROIs (regions of interest) of the product mesh can be adjusted in accordance with a set of parameters.
- the process of fitting can be done by adjusting a set of parameters that control the deformation of one or more ROIs of the product model until the product model fits the user model.
- the set of parameters can be defined as a set of measurements, such as the displacement for each of the vertices.
- This process can be formulated as an optimization process, where a few different optimization algorithms can be used to find the best set of parameters that minimizes one or more cost functions.
- the one or more cost functions can be defined in many ways. A cost function may be defined as the number of penetrations between the meshes of the two 3D models. It may also be defined as the average distance between the vertices of the body mesh and the garment mesh, etc.
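- A toy version of this optimization, assuming SciPy's general-purpose minimizer: per-vertex displacements along garment normals are chosen to minimise a cost that combines a penetration penalty with the average body-to-garment distance. The weighting, the one-dimensional displacement parameterisation, and the penalty form are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import minimize

def fit_roi(roi_vertices, body_points, normals):
    """Optimise per-vertex displacements of a region of interest along the
    garment normals, minimising a cost that mixes a crude penetration
    penalty (negative displacement pushes the garment into the body in
    this toy setup) with the average body-to-garment vertex distance."""
    def cost(d):
        moved = roi_vertices + d[:, None] * normals
        avg_gap = np.linalg.norm(moved - body_points, axis=1).mean()
        penetration = np.maximum(0.0, -d).sum()  # stand-in penetration count
        return 10.0 * penetration + avg_gap
    best = minimize(cost, x0=np.zeros(len(roi_vertices)), method="Powell")
    return roi_vertices + best.x[:, None] * normals

# Example: two garment vertices settle onto body points 2 cm away along +z.
v = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
b = v + np.array([0.0, 0.0, 0.02])
n = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
print(np.round(fit_roi(v, b, n), 3))
```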
- a fitted model 608 is generated.
- the technique involves determining whether the amount of deformation associated with each portion of the 3D model of the product 604 needed to generate the fitted model is within acceptable threshold values.
- material property values are obtained for each segment and threshold values are determined from those material property values.
- the amount of deformation associated with each segment of the 3D model may then be compared to the respective threshold value.
- material property values may indicate an elasticity value for a particular segment. In this example, a maximum amount that the segment may be stretched is determined from that elasticity value.
- the amount that the segment was stretched to create the fitted model 608 would then be compared to that maximum amount that the segment may be stretched. If the amount that the segment was stretched exceeds the maximum amount that the segment may be stretched, then the product cannot fit the user. If the amount of deformation associated with each segment of the 3D model is within acceptable threshold values (e.g., no threshold value is exceeded) , then a level of fit may be determined.
- the level of fit may be determined as a proportion of the amount that the segment was stretched divided by the maximum threshold by which the segment may be stretched.
- a level of fit may correspond to a level of comfort, in that a higher degree of deformation in relation to the maximum deformation threshold might equate to a lower level of comfort.
- a level of fit can be determined based on an amount of space between one or more portions of the 3D model of the product and the 3D model of the user. For example, a total volume may be determined for a space located between the 3D model of the product and the 3D model of the user. In this example, if that total volume of space is greater than some threshold fit value, it may be determined that the product is not a fit for this user (e.g., the product is too large or would be uncomfortable) . In some cases, such a threshold fit value may be proportional to a size of the 3D model of the product or the 3D model of the user.
- particular types of products may be associated with a number of fit reference points 610 (A-D) which represent particular reference locations along which deformation should be measured against a threshold deformation value.
- an amount of deformation of the 3D model of the product 604 needed to generate the fitted model 608 is measured across the reference location and then compared to a threshold value for that reference location.
- the threshold value for that reference location may be determined based on material properties associated with the segment of the 3D model of the product 604 that includes the reference location.
- FIG. 7 illustrates a flow diagram depicting a process for determining a level of fit for a product and user in accordance with at least some embodiments.
- the process 700 depicted in FIG. 7 may be performed by a mobile application server (e.g., mobile application server 204 of FIG. 2) in communication with a user device (e.g., user device 202 of FIG. 2) .
- Process 700 involves receiving a fitting request for a user and a product at 702.
- the request may be received from a user device and the user may be identified by virtue of being associated with the user device.
- Process 700 involves obtaining a first 3D model associated with the user at 704.
- the first 3D model is received and stored by a mobile application server prior to receiving the fitting request.
- the first 3D model may be received along with the fitting request.
- Process 700 involves obtaining a second 3D model associated with the product at 706.
- the second 3D model is segmented into a plurality of segments, wherein each segment of the plurality of segments is associated with one or more material properties.
- material properties may comprise one or more of an elasticity value and a rigidity value.
- Process 700 involves fitting the second 3D model onto the first 3D model at 708, such that each segment of the plurality of segments is deformed by the first 3D model in accordance with the one or more material properties associated with each segment.
- Process 700 involves determining a level of fit for the user and the product based on a deformation of each segment of the plurality of segments at 710.
- the level of fit is determined based at least in part on an amount of deformation needed to fit the second 3D model onto the first 3D model.
- the level of fit is determined based at least in part on a volume of space determined to be between the second 3D model and the first 3D model.
- the level of fit comprises a numerical value corresponding to the deformation of each segment of the plurality of segments in relation to a threshold level of deformation.
- Process 700 further involves providing a size recommendation to a user at 712.
- in some embodiments, the product is associated with a first size or style, and the method further comprises, upon determining that an amount of deformation needed to fit the second 3D model onto the first 3D model is greater than a threshold deformation value, recommending a second size or style of the product to the user.
Process 700 further involves providing the second 3D model fitted onto the first 3D model to a user device at 714. For example, the second 3D model fitted onto the first 3D model may be caused to be rendered on a display of the user device.
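For concreteness, the following is a minimal, non-authoritative sketch of how steps 702 through 714 might be orchestrated server-side. All names here (handle_fitting_request, fit_model, score_fit, next_size, the model stores, and the "M" default size) are hypothetical stand-ins, not part of the described system.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FitResult:
    level_of_fit: float              # e.g., worst segment deformation ratio
    fitted_model: object             # product model deformed onto user model
    recommended_size: Optional[str]  # populated when the requested size fails

def handle_fitting_request(request, user_models, product_models,
                           fit_model, score_fit, next_size):
    # 702: receive a fitting request identifying a user and a product.
    user_id, product_id = request["user"], request["product"]
    size = request.get("size", "M")  # illustrative default only

    # 704: obtain the first 3D model (stored at enrollment, or sent inline).
    body = request.get("user_model") or user_models[user_id]

    # 706: obtain the second 3D model, pre-segmented with material properties.
    garment = product_models[(product_id, size)]

    # 708: fit the product model onto the user model; each segment deforms
    # according to its material properties (e.g., elasticity, rigidity).
    fitted, deformation_per_segment = fit_model(body, garment)

    # 710: score the fit from the per-segment deformation.
    level = score_fit(garment, deformation_per_segment)

    # 712: if deformation exceeds what the material allows (encoded here as
    # a score above 1.0), recommend another size or style instead.
    recommendation = next_size(product_id, size) if level > 1.0 else None

    # 714: return the fitted model so the user device can render it.
    return FitResult(level, fitted, recommendation)
```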
FIG. 8 illustrates examples of components of a computer system 800 according to certain embodiments. The computer system 800 is an example of the computer systems described herein above. Although these components are illustrated as belonging to a same computer system 800, the computer system 800 can also be distributed. The computer system 800 includes at least a processor 802, a memory 804, a storage device 806, input/output peripherals (I/O) 808, communication peripherals 810, and an interface bus 812. The interface bus 812 is configured to communicate, transmit, and transfer data, controls, and commands among the various components of the computer system 800.
The memory 804 and the storage device 806 include computer-readable storage media, such as RAM, ROM, electrically erasable programmable read-only memory (EEPROM), hard drives, CD-ROMs, optical storage devices, magnetic storage devices, electronic non-volatile computer storage (for example, FLASH™ memory), and other tangible storage media. Any of such computer-readable storage media can be configured to store instructions or program code embodying aspects of the disclosure. The memory 804 and the storage device 806 also include computer-readable signal media. A computer-readable signal medium includes a propagated data signal with computer-readable program code embodied therein. Such a propagated signal takes any of a variety of forms including, but not limited to, electromagnetic, optical, or any combination thereof. A computer-readable signal medium includes any computer-readable medium that is not a computer-readable storage medium and that can communicate, propagate, or transport a program for use in connection with the computer system 800.
The memory 804 includes an operating system, programs, and applications. The processor 802 is configured to execute the stored instructions and includes, for example, a logical processing unit, a microprocessor, a digital signal processor, and other processors. The memory 804 and/or the processor 802 can be virtualized and can be hosted within another computer system of, for example, a cloud network or a data center.
The I/O peripherals 808 include user interfaces, such as a keyboard, screen (e.g., a touch screen), microphone, speaker, other input/output devices, and computing components, such as graphical processing units, serial ports, parallel ports, universal serial buses, and other input/output peripherals. The I/O peripherals 808 are connected to the processor 802 through any of the ports coupled to the interface bus 812. The communication peripherals 810 are configured to facilitate communication between the computer system 800 and other computing devices over a communications network and include, for example, a network interface controller, modem, wireless and wired interface cards, antenna, and other communication peripherals.
A computing device can include any suitable arrangement of components that provide a result conditioned on one or more inputs. Suitable computing devices include multipurpose microprocessor-based computer systems accessing stored software that programs or configures the computer system from a general-purpose computing apparatus to a specialized computing apparatus implementing one or more embodiments of the present subject matter. Any suitable programming, scripting, or other type of language or combination of languages may be used to implement the teachings contained herein in software to be used in programming or configuring a computing device. Embodiments of the methods disclosed herein may be performed in the operation of such computing devices. The order of the blocks presented in the examples above can be varied; for example, blocks can be re-ordered, combined, and/or broken into sub-blocks. Certain blocks or processes can be performed in parallel.
The use of “based on” and “based at least in part on” is meant to be open and inclusive, in that a process, step, calculation, or other action “based on” or “based at least in part on” one or more recited conditions or values may, in practice, be based on additional conditions or values beyond those recited. Headings, lists, and numbering included herein are for ease of explanation only and are not meant to be limiting.
Landscapes
- Engineering & Computer Science (AREA)
- Architecture (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Processing Or Creating Images (AREA)
Abstract
Described herein are a system and techniques for determining a level of fit for a product with respect to a user. The technique involves obtaining a first 3D model associated with a user and obtaining a second 3D model associated with a product. In this system, the second 3D model is segmented into a plurality of segments, with each segment of the plurality of segments being associated with one or more material properties. The technique further involves fitting the second 3D model onto the first 3D model such that each segment of the plurality of segments is deformed by the first 3D model in accordance with the one or more material properties associated with each segment, and determining a level of fit for the user and the product based on a deformation of each segment of the plurality of segments.
Description
CROSS-REFERENCE TO RELATED APPLICATION
This application claims priority to U.S. Provisional Patent Application No. 62/987,196, filed on March 9, 2020, entitled “SYSTEM AND METHOD FOR VIRTUAL FITTING,” which is hereby incorporated by reference in its entirety.
Virtual fitting applications enable users to visualize various products in relation to a particular person. There are several different types of virtual fitting applications based on various approaches. However, each of the available types of virtual fitting applications suffer from a number of disadvantages.
Embodiments of the invention address these and other problems individually and collectively.
SUMMARY
The present invention relates generally to methods and systems related to virtual fitting applications. More particularly, embodiments of the present invention provide methods and systems for determining a level of fit for a user and product. Embodiments of the present invention are applicable to a variety of applications in virtual reality and computer-based fitting systems.
The methods involve obtaining a first 3D model associated with a user and obtaining a second 3D model associated with a product. The second 3D model may be segmented into a plurality of segments, with each segment of the plurality of segments being associated with one or more material properties. The methods further involve fitting the second 3D model onto the first 3D model such that each segment of the plurality of segments is deformed by the first 3D model in accordance with the one or more material properties associated with each segment, and determining a level of fit for the user and the product based on a deformation of each segment of the plurality of segments. Embodiments of the present invention are applicable to a variety of applications in virtual reality and computer-based fitting systems.
One embodiment of the disclosure is directed to a method comprising receiving a fitting request comprising at least an indication of a user and a product, obtaining a first 3D model associated with the user, obtaining a second 3D model associated with the product, the second 3D model being segmented into a plurality of segments, wherein each segment of the plurality of segments is associated with one or more material properties, fitting the second 3D model onto the first 3D model such that each segment of the plurality of segments is deformed by the first 3D model in accordance with the one or more material properties associated with each segment, and determining a level of fit for the user and the product based on a deformation of each segment of the plurality of segments.
Another embodiment of the disclosure is directed to a system comprising a processor, and a memory including instructions that, when executed with the processor, cause the system to, at least receive a fitting request comprising at least an indication of a user and a product, obtain a first 3D model associated with the user, obtain a second 3D model associated with the product, the second 3D model being segmented into a plurality of segments, wherein each segment of the plurality of segments is associated with one or more material properties, fit the second 3D model onto the first 3D model such that each segment of the plurality of segments is deformed by the first 3D model in accordance with the one or more material properties associated with each segment, and determine a level of fit for the user and the product based on a deformation of each segment of the plurality of segments.
Yet another embodiment of the disclosure is directed to a non-transitory computer readable medium storing specific computer-executable instructions that, when executed by a processor, cause a computer system to at least receive a fitting request comprising at least an indication of a user and a product, obtain a first 3D model associated with the user, obtain a second 3D model associated with the product, the second 3D model being segmented based on material properties, fit the second 3D model onto the first 3D model such that each segment of the second 3D model is deformed by the first 3D model in accordance with the material properties for the segment, and determine a level of fit for the user and the product based on deformation of each segment.
Numerous benefits are achieved by way of the present system over conventional systems. For example, embodiments of the present disclosure involve methods and systems that provide an accurate level of fit for a particular product with respect to a user. While there are a number of available virtual fitting systems, they are ineffective at providing accurate fit information. For example, some systems use digital avatars of human users and then fit digital garments onto the avatars. Some systems use a camera or other device to track a user's movement or joint positions of the user (e.g., a knee or elbow) and use these joints to drive the motion of a virtual avatar. Some other systems track a contour of the user. Images of virtual garments are then deformed and overlaid on top of the user image to create an augmented view as if the user is wearing the garment. However, these virtual fitting systems fail to account for material properties of a garment model, which can lead to an inaccurate portrayal of the garment. In the described system, product models are segmented and material properties are stored in association with various segments of the product model. During a fitting process, an amount of deformation is determined based on those material properties. By accounting for properties of a product model based on the materials of the product, the product model can be adjusted during a virtual fitting, and is not limited to a fixed size. This improves the accuracy of fitting and can enable fitting based on body shape. These and other embodiments of the invention, along with many of their advantages and features, are described in more detail in conjunction with the text below and attached figures.
BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 depicts an illustrative example of a system in which a level of fit may be generated for a user and a product in accordance with at least some embodiments.
FIG. 2 depicts a system architecture for a system that determines a level of fit for a user and a product in accordance with at least some embodiments.
FIG. 3 depicts techniques for segmenting a 3D model of a product in accordance with at least some embodiments.
FIG. 4 is a simplified flowchart illustrating a method of determining a level of fit for a product and a user according to an embodiment of the present invention.
FIG. 5 depicts an illustrative example of a technique for obtaining 3D models using sensor data in accordance with at least some embodiments.
FIG. 6 illustrates an example technique for fitting a 3D model for a product onto a 3D model for a user in order to determine a level of fit in accordance with at least some embodiments.
FIG. 7 illustrates a flow diagram depicting a process for determining a level of fit for a product and user in accordance with at least some embodiments.
FIG. 8 illustrates examples of components of a computer system 800 according to certain embodiments.
DETAILED DESCRIPTION OF SPECIFIC EMBODIMENTS
The present invention relates generally to methods and systems related to virtual reality applications. More particularly, embodiments of the present invention provide methods and systems for determining a level of fit for a user and product. Embodiments of the present invention are applicable to a variety of applications in virtual reality and computer-based fitting systems.
FIG. 1 depicts an illustrative example of a system in which a level of fit may be generated for a user and product in some embodiments of the present invention. In FIG. 1, a user device 102 may be used to provide a request to a mobile application server 104 for product fit information. The user device, in some cases, may be used to obtain user data 106, which may be provided to the mobile application server 104 to be used in generating product fit information. The mobile application server 104 may include, or have access to, object model data 108 from which product data 110 may be obtained in order to complete the request. The mobile application server 104 may be configured to combine the user data 106 and the product data 110 in order to determine a level of fit for a particular user. The mobile application server 104 may, in some embodiments, provide virtual fit data 112 back to the user device 102, which may be rendered on a display for a user to view.
In an example, the user device 102 represents a suitable computing device that includes one or more graphical processing units (GPUs), one or more general-purpose processors (GPPs), and one or more memories storing computer-readable instructions that are executable by at least one of the processors to perform various functionalities of the embodiments of the present disclosure. For instance, user device 102 can be any of a smartphone, a tablet, a laptop, a personal computer, a gaming console, or a smart television. The user device 102 may additionally include a range camera (i.e., a depth sensor) and/or an RGB optical sensor, such as a camera.
The user device may be used to capture and/or generate user data 106. User data 106 may include information related to a particular user (e.g., a user of the user device 102) for which a level of fit should be determined with respect to a product. The user data 106 may include data about the user which may be used to determine the level of fit. For example, user data 106 may include dimensions of the user. User data 106 may be captured in any suitable format. For example, user data 106 may include a point cloud, a 3D mesh or model, or a string of characters that includes measurements at predetermined locations. In some cases, capturing user data 106 may involve receiving information about the user which is manually input into the user device 102. For example, a user may input measurements for various parts of his or her body via a keypad. In some cases, capturing user data 106 may involve using a camera and/or a depth sensor to capture images/depth information related to the user. The user device 102 may be further configured to generate a 3D model from the captured images/depth information. This process is described in greater detail with respect to FIG. 5 below.
The mobile application server 104 may include any computing device capable of determining a level of fit for a user in accordance with the techniques described herein. In order to make such a determination, the mobile application server 104 may receive user data 106 from the user device 102 as well as product data 110 from object model data 108. It should be noted that while the user data 106 may be received by the mobile application server 104 at the same time that the mobile application server 104 receives a request to determine a level of fit for a particular user and a particular product, the mobile application server 104 may also receive user data 106 prior to, and independent of, any request to determine a level of fit. For example, the mobile application server 104 may receive the user data 106 during an enrollment phase during which a user establishes an account with the mobile application server 104.
Upon receiving user data 106 and product data 110, the mobile application server 104 determines a level of fit by fitting a 3D model of the product (product data 110) onto a 3D model of a user (user data 106) . This may involve deforming portions of the 3D model of the product based on material properties associated with those portions of the 3D model. For example, portions of the 3D model of the product may be associated with a level of elasticity or rigidity such that the portion of the 3D model may be deformed in accordance with that level of elasticity or rigidity.
In some embodiments, the level of fit can be determined based on a degree to which deformations are made to the 3D model. For example, where a portion of the 3D model is associated with a level of elasticity (e.g., an amount that the portion can be stretched), a level of fit may be determined based on an extent to which the portion of the 3D model is stretched. In other words, a level of fit may involve a measurement of how much a portion of the 3D model is stretched as a percentage of the total amount that the portion can be stretched. If the amount that the portion of the 3D model is stretched is high in relation to the total elasticity (e.g., the amount that the portion can be stretched), then it may be determined that the product is a tight fit. In some cases, if the amount that the portion is stretched is greater than a threshold fit value, it may be determined that the product is not a fit for this user. Different portions of the 3D model of the product may each be associated with different threshold fit values.
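As an illustration of that ratio, the following sketch assumes elasticity is encoded as the maximum fractional increase in surface area the segment's material tolerates; that encoding is an assumption made for the example, not a definition from this disclosure.

```python
def segment_fit_ratio(rest_area, stretched_area, elasticity):
    """Fraction of a segment's available stretch consumed by the fitting.

    elasticity is assumed to be the maximum fractional area increase the
    material tolerates (e.g., 0.25 for 25%); an illustrative encoding only.
    """
    used = (stretched_area - rest_area) / rest_area  # actual stretch ratio
    return used / elasticity                         # 1.0 means at the limit

# Example: a segment stretched from 400 cm^2 to 440 cm^2 with 25% elasticity
# yields 0.10 / 0.25 = 0.4, suggesting a comfortable fit; a ratio above 1.0
# would mean the product cannot fit.
```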
In some embodiments, the level of fit can be determined based on an amount of space between one or more portions of the 3D model of the product and the 3D model of the user. For example, a total volume of space may be determined for a volume located between the 3D model of the product and the 3D model of the user. In this example, if that total volume of space is greater than some threshold fit value, it may be determined that the product is not a fit for this user. In some cases, such a threshold fit value may be proportional to a size of the 3D model of the product or the 3D model of the user.
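One rough way to estimate that intervening volume, assuming vertex arrays for both fitted meshes and using SciPy for nearest-neighbor queries (the disclosure does not prescribe any particular method), might look like:

```python
import numpy as np
from scipy.spatial import cKDTree

def gap_volume_estimate(product_verts, product_surface_area, body_verts):
    """Crude volume of the space between a fitted product mesh and the body.

    Approximates the volume as (mean vertex-to-body distance) multiplied by
    the product's surface area, with nearest-vertex distance standing in for
    a true signed distance; a simplification for illustration only.
    """
    gaps, _ = cKDTree(body_verts).query(product_verts)  # per-vertex gap
    return gaps.mean() * product_surface_area

# The threshold this is compared against could scale with body size, e.g.,
# threshold = k * body_surface_area, with k chosen per product type.
```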
In some cases, determining that the product does not fit the user (e.g., determining a low level of fit) may prompt the mobile application server 104 to automatically (e.g., without further prompt by the user) determine a level of fit for a different version of the product (e.g., a different size or style of the product) by retrieving a second 3D model associated with that version of the product and repeating the process described above.
For clarity, a certain number of components are shown in FIG. 1. It is understood, however, that embodiments of the invention may include more than one of each component. In addition, some embodiments of the invention may include fewer than or greater than all of the components shown in FIG. 1. In addition, the components in FIG. 1 may communicate via any suitable communication medium (including the internet) , using any suitable communication protocol.
FIG. 2 depicts a system architecture for a system that determines a level of fit for a user and a product in accordance with at least some embodiments. In FIG. 2, a user device 202 may be in communication with a number of other components, including at least a mobile application server 204. The mobile application server 204 may perform at least a portion of the processing functions required by a mobile application installed upon the user device. The user device 202 and mobile application server 204 may be examples of the user device 102 and mobile application server 104 respectively described with respect to FIG. 1.
A user device 202 may be any suitable electronic device that is capable of providing at least a portion of the capabilities described herein. In particular, the user device 202 may be any electronic device capable of capturing user data and/or presenting a rendered image. In some embodiments, a user device may be capable of establishing a communication session with another electronic device (e.g., mobile application server 204) and transmitting/receiving data from that electronic device. A user device may include the ability to download and/or execute mobile applications. User devices may include mobile communication devices as well as personal computers and thin-client devices. In some embodiments, a user device may comprise any portable electronic device that has a primary function related to communication. For example, a user device may be a smart phone, a personal digital assistant (PDA), or any other suitable handheld device. The user device can be implemented as a self-contained unit with various components (e.g., input sensors, one or more processors, memory, etc.) integrated into the user device. Reference in this disclosure to an “output” of a component or an “output” of a sensor does not necessarily imply that the output is transmitted outside of the user device. Outputs of various components might remain inside a self-contained unit that defines a user device.
In one illustrative configuration, the user device 202 may include at least one memory 206 and one or more processing units (or processor (s) ) 208. The processor (s) 208 may be implemented as appropriate in hardware, computer-executable instructions, firmware or combinations thereof. Computer-executable instruction or firmware implementations of the processor (s) 208 may include computer-executable or machine executable instructions written in any suitable programming language to perform the various functions described. The user device 202 may also include one or more input sensors 210 for receiving user and/or environmental input. There may be a variety of input sensors 210 capable of detecting user or environmental input, such as an accelerometer, a camera device, a depth sensor, a microphone, a global positioning system (e.g., GPS) receiver, etc. The one or more input sensors 210 may include a range camera device (e.g., a depth sensor) capable of generating a range image, as well as a camera device configured to capture image information.
For the purposes of this disclosure, a range camera (e.g., a depth sensor) may be any device configured to identify a distance or range of an object or objects from the range camera. In some embodiments, the range camera may generate a range image (or range map) , in which pixel values correspond to the detected distance for that pixel. The pixel values can be obtained directly in physical units (e.g., meters) . In at least some embodiments of the disclosure, the user device may employ a range camera that operates using structured light. In a range camera that operates using structured light, a projector projects light onto an object or objects in a structured pattern. The light may be of a range that is outside of the visible range (e.g., infrared or ultraviolet) . The range camera may be equipped with one or more camera devices configured to obtain an image of the object with the reflected pattern. Distance information may then be generated based on distortions in the detected pattern. It should be noted that although this disclosure focuses on the use of a range camera using structured light, any suitable type of range camera, including those that operate using stereo triangulation, sheet of light triangulation, time-of-flight, interferometry, coded aperture, or any other suitable technique for range detection, would be useable by the described system.
The memory 206 may store program instructions that are loadable and executable on the processor (s) 208, as well as data generated during the execution of these programs. Depending on the configuration and type of user device 202, the memory 206 may be volatile (such as random access memory (RAM) ) and/or non-volatile (such as read-only memory (ROM) , flash memory, etc. ) . The user device 202 may also include additional storage 212, such as either removable storage or non-removable storage including, but not limited to, magnetic storage, optical disks, and/or tape storage. The disk drives and their associated computer-readable media may provide non-volatile storage of computer-readable instructions, data structures, program modules, and other data for the computing devices. In some implementations, the memory 206 may include multiple different types of memory, such as static random access memory (SRAM) , dynamic random access memory (DRAM) or ROM. Turning to the contents of the memory 206 in more detail, the memory 206 may include an operating system 214 and one or more application programs or services for implementing the features disclosed herein including at least a mobile application 216. The memory 206 may also include application data 218, which provides information to be generated by and/or consumed by the mobile application 216. In some embodiments, the application data 218 may be stored in a database.
For the purposes of this disclosure, a mobile application may be any set of computer-executable instructions installed upon, and executed from, a user device 202. Mobile applications may be installed on a user device by a manufacturer of the user device or by another entity. In some embodiments, the mobile application 216 may cause a user device to establish a communication session with a mobile application server 204 that provides backend support for the mobile application 216. A mobile application server 204 may maintain account information associated with a particular user device and/or user. In some embodiments, a user may be required to log into a mobile application in order to access functionality provided by the mobile application 216.
In accordance with at least some embodiments, the mobile application 216 is configured to provide user information to the mobile application server 204 and to present information received from the mobile application server 204 to a user. More particularly, the mobile application 216 is configured to obtain measurement data for a user and to submit that measurement data to a mobile application server 104 in relation to a request for a level of fit for a product. In some embodiments, the mobile application 216 may also receive an indication of a product from a user for which a level of fit is to be determined.
In accordance with at least some embodiments, the mobile application 216 may receive output from the input sensors 210 and generate a 3D model based upon that output. For example, the mobile application 216 may receive depth information (e.g., a range image) from a depth sensor (e.g., a range camera) , such as the depth sensors previously described with respect to input sensors 210 as well as image information from a camera input sensor. Based on this information, the mobile application 216 may determine the bounds of an object (e.g., a user) to be identified. For example, a sudden variance in depth within the depth information may indicate a border or outline of an object. In another example, the mobile application 216 may utilize one or more machine vision techniques and/or machine learning to identify the bounds of an object. In this example, the mobile application 216 may receive image information from a camera input sensor 210 and may identify potential objects within the image information based on variances in color or texture data detected within the image or based on learned patterns. In some embodiments, the mobile application 216 may cause the user device 202 to transmit the output obtained from the input sensors 210 to the mobile application server 204, which may then perform one or more object recognition techniques upon that output.
The user device 202 may also contain communications interface (s) 220 that enable the user device 202 to communicate with any other suitable electronic devices. In some embodiments, the communication interface 220 may enable the user device 202 to communicate with other electronic devices on a network (e.g., on a private network) . For example, the user device 202 may include a Bluetooth wireless communication module, which allows it to communicate with another electronic device. The user device 202 may also include input/output (I/O) device (s) and/or ports 222, such as for enabling connection with a keyboard, a mouse, a pen, a voice input device, a touch input device, a display, speakers, a printer, etc.
In some embodiments, the user device 202 may communicate with the mobile application server 204 via a communication network. The communication network may include any one or a combination of many different types of networks, such as cable networks, the Internet, wireless networks, cellular networks, and other private and/or public networks. In addition, the communication network may comprise multiple different networks. For example, the user device 202 may utilize a wireless local area network (WLAN) to communicate with a wireless router, which may then route the communication over a public network (e.g., the Internet) to the mobile application server 204.
The mobile application server 204 may be any computing device or plurality of computing devices configured to perform one or more calculations on behalf of the mobile application 216 on the user device 202. In some embodiments, the mobile application 216 may be in periodic communication with the mobile application server 204. For example, the mobile application 216 may receive updates, push notifications, or other instructions from the mobile application server 204. In some embodiments, the mobile application 216 and mobile application server 204 may utilize a proprietary encryption and/or decryption scheme to secure communications between the two. In some embodiments, the mobile application server 204 may be executed by one or more virtual machines implemented in a hosted computing environment. The hosted computing environment may include one or more rapidly provisioned and released computing resources, which computing resources may include computing, networking, and/or storage devices. A hosted computing environment may also be referred to as a cloud-computing environment.
In one illustrative configuration, the mobile application server 204 may include at least one memory 224 and one or more processing units (or processor (s) ) 226. The processor (s) 226 may be implemented as appropriate in hardware, computer-executable instructions, firmware or combinations thereof. Computer-executable instruction or firmware implementations of the processor (s) 226 may include computer-executable or machine executable instructions written in any suitable programming language to perform the various functions described.
The memory 224 may store program instructions that are loadable and executable on the processor (s) 226, as well as data generated during the execution of these programs. Depending on the configuration and type of mobile application server 204, the memory 224 may be volatile (such as random access memory (RAM) ) and/or non-volatile (such as read-only memory (ROM) , flash memory, etc. ) . The mobile application server 204 may also include additional storage 228, such as either removable storage or non-removable storage including, but not limited to, magnetic storage, optical disks, and/or tape storage. The disk drives and their associated computer-readable media may provide non-volatile storage of computer-readable instructions, data structures, program modules, and other data for the computing devices. In some implementations, the memory 224 may include multiple different types of memory, such as static random access memory (SRAM) , dynamic random access memory (DRAM) or ROM. Turning to the contents of the memory 224 in more detail, the memory 224 may include an operating system 230 and one or more application programs or services for implementing the features disclosed herein including at least a module for fitting a 3D model of a product onto a 3D model of a user (deforming module 232) and/or a module for determining a level of fit for a combination of a 3D model of a product and a 3D model of a user (fit estimation module 234) . The memory 224 may also include account data 236, which provides information associated with user accounts maintained by the described system, user model data 238, which maintains 3D models associated with each user of an account, and/or object model data 240, which maintains 3D models associated with a number of objects (products) . In some embodiments, one or more of the account data 236, the user model data 238, or the object model data 240 may be stored in a database. In some embodiments, the object model data 240 may be an electronic catalog that includes data related to objects available for sale from a resource provider, such as a retailer or other suitable merchant.
The memory 224 and the additional storage 228, both removable and non-removable, are examples of computer-readable storage media. For example, computer-readable storage media may include volatile or non-volatile, removable or non-removable media implemented in any method or technology for storage of information such as computer-readable instructions, data structures, program modules or other data. As used herein, the term “modules” may refer to programming modules executed by computing systems (e.g., processors) that are installed on and/or executed from the mobile application server 204. The mobile application server 204 may also contain communications connection (s) 242 that allow the mobile application server 204 to communicate with a stored database, another computing device or server, user terminals, and/or other components of the described system. The mobile application server 204 may also include input/output (I/O) device (s) and/or ports 244, such as for enabling connection with a keyboard, a mouse, a pen, a voice input device, a touch input device, a display, speakers, a printer, etc.
Turning to the contents of the memory 224 in more detail, the memory 224 may include the deforming module 232, the fit estimation module 234, the database containing account data 236, the database containing user model data 238, and/or the database containing object model data 240.
In some embodiments, the deforming module 232 may be configured to, in conjunction with the processors 226, apply deformations to a 3D model of a product in order to fit it onto a 3D model of a user. The deforming module 232 may have access to one or more rules which delineate how specific product types (e.g., shirt, pants, etc.) should be deformed (e.g., stretched and/or bent) in order to be fit onto a user model. In order to fit a 3D model of a product onto a 3D model of a user, the deforming module 232 may snap certain portions of the 3D model of the product onto specific portions of the 3D model of the user. For example, the 3D model of a shirt may be positioned so that sleeves of the 3D model of the shirt encompass arms of a 3D model of a user. Additionally, the 3D model of a shirt may also be positioned so that the collar of the 3D model of the shirt encompasses a neck of a 3D model of a user. The remainder of the 3D model of the shirt may then be deformed such that the interior of the 3D model of the shirt lies outside of or along the exterior of the 3D model of the user by stretching and bending the portions of the 3D model of the shirt. The deforming module 232 may, after fitting the 3D model of the product onto the 3D model of a user, determine a level of deformation for each segment of the 3D model of the product. In other words, the deforming module 232 may record an amount of stretching or bending which has been performed for each segment of the 3D model of the product. For example, the deforming module 232 may subtract an amount of surface area for a particular segment of the 3D model of the product before fitting it onto the 3D model of the user from an amount of surface area for that same particular segment of the 3D model of the product after fitting it onto the 3D model of the user. In this example, the difference in surface area would represent the amount that the segment has been stretched. The deforming module 232 may be further configured to change a pose of the 3D model of the user in order to estimate a level of deformation of the 3D model of the product for that pose. In some cases, the deforming module 232 may determine a level of deformation for several different poses.
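The surface-area bookkeeping described above could be implemented along the following lines; the per-face segment labels and the preserved vertex correspondence before and after fitting are assumptions of this sketch.

```python
import numpy as np

def triangle_areas(vertices, faces):
    # Area of each triangle from the cross product of two edge vectors.
    a = vertices[faces[:, 1]] - vertices[faces[:, 0]]
    b = vertices[faces[:, 2]] - vertices[faces[:, 0]]
    return 0.5 * np.linalg.norm(np.cross(a, b), axis=1)

def segment_stretch(rest_verts, fitted_verts, faces, face_segment_ids):
    """Per-segment stretch: fitted surface area minus rest surface area.

    face_segment_ids assigns each face to a segment, mirroring the segmented
    product model; vertex order is assumed unchanged by the deformation.
    """
    rest = triangle_areas(rest_verts, faces)
    fitted = triangle_areas(fitted_verts, faces)
    stretch = {}
    for seg in np.unique(face_segment_ids):
        mask = face_segment_ids == seg
        stretch[seg] = fitted[mask].sum() - rest[mask].sum()
    return stretch
```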
In some embodiments, the fit estimation module 234 may be configured to, in conjunction with the processors 226, determine a level of fit for a product and a user. To do this, the fit estimation module 234 may determine whether the level of deformation determined by the deforming module 232 for each segment of the 3D model of the product is within an acceptable range of deformation based on material properties for that segment.
Each 3D model of a product stored within the object model data 240 may be associated with various material properties. In some cases, a 3D model of a product may be segmented into a plurality of segments, where each segment is associated with separate material properties. In some cases, a 3D model of a product, or a segment of that 3D model of a product, may be associated with a particular material and the fit estimation module 234 may be configured to determine one or more material properties based on that material (e.g., from a material information database). In determining a level of fit, the fit estimation module 234 may determine a maximum level of deformation for each segment of the 3D model of the product based on its corresponding material properties. For example, a maximum amount of available stretch may be determined for a segment based on an elasticity value for that segment. The fit estimation module 234 may then determine, for each segment of the 3D model of the product, whether the level of deformation determined by the deforming module 232 is greater than the maximum level of deformation for that segment. This may be repeated for a number of different poses of the 3D model of the user. If the level of deformation determined by the deforming module 232 is greater than the maximum level of deformation for a segment, then the fit estimation module 234 may determine that the product cannot fit the user. In these cases, the mobile application server 204 may select a different version of the product to determine a level of fit for. For example, the mobile application server 204 may select a next larger size of the product and determine a level of fit for that product.
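A compressed sketch of that decision loop follows; deformation_for and fits_size are hypothetical callables standing in for the deforming and fit-estimation modules.

```python
def fits(poses, deformation_for, max_deformation):
    """True if no segment exceeds its material-derived deformation ceiling
    in any tested pose."""
    for pose in poses:
        for seg, amount in deformation_for(pose).items():
            if amount > max_deformation[seg]:  # e.g., from elasticity
                return False
    return True

def recommend_size(sizes_small_to_large, fits_size):
    # Walk sizes until one passes, mirroring the server trying the next
    # larger size whenever a deformation threshold is exceeded.
    for size in sizes_small_to_large:
        if fits_size(size):
            return size
    return None  # no good fit for this user/product combination
```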
In some embodiments, the fit estimation module 234 may determine that a product is too large for a user. To do this, the fit estimation module 234 may determine that one or more segments of the 3D model of the product extend past one or more bounds associated with the product type. For example, if a sleeve segment of a 3D model of a shirt extends past a wrist of a 3D model of a user, then the fit estimation module 234 may determine that the product is too large for the user. In some embodiments, the fit estimation module 234 may determine the product is too large for the user based upon a volume of space located between the 3D model of the product and the 3D model of the user exceeding some threshold. In these cases, the mobile application server 204 may select a different version of the product to determine a level of fit for. For example, the mobile application server 204 may select a next smaller size of the product and determine a level of fit for that product. If the mobile application server 204 encounters a scenario in which the current product is too large and the next smaller size of the product is too small, then the mobile application server 204 may provide a notification that there is no good fit for this particular user and product combination.
As described herein, a level of fit may be represented by a numerical value. Such a value may correspond to a determined level of deformation divided by a maximum level of deformation. In some cases, a level of fit may represent an overall value for the product. For example, the level of fit for a product may be generated as an average of the level of fit for each segment of the 3D model of the product.
In some embodiments, each of the object entries within the object model data 240 may be associated with a three-dimensional (3D) model of that object. In these embodiments, the 3D model may be combined with a second 3D model of a user and provided to the mobile application 216 such that the user device 202 is caused to display the combination of the 3D models on a display of the user device. The mobile application 216 may enable a user of the user device 202 to move, rotate, or otherwise reposition the combination of 3D models in order to see how the combinations would appear in a new position.
FIG. 3 depicts techniques for segmenting a 3D model of a product in accordance with at least some embodiments. In particular, FIG. 3 depicts an illustrative example of a 3D model for a product 302 which may be stored in an object model database (e.g., object model data 240 of FIG. 2) . In FIG. 3, the 3D model 302 is depicted as being segmented into a number of segments 304 (A-E) , each of which includes information about a different portion of the 3D model 302. Such a 3D model 302 may be segmented in any logical fashion. For example, in the case that the 3D model is of a garment, the 3D model may be segmented along seams of that garment. In another example, the 3D model may be segmented according to material type.
The system described herein may store an indication of a number of material properties in relation to a 3D model, some of which may be stored in relation to particular segments of the 3D model. Some of the material properties may indicate a physical property of the material (such as elasticity or rigidity) which impact how the material can be deformed whereas some material properties may indicate a visual property of the material (such as sheen or reflectivity) which impact how the material is rendered. In some embodiments, the segment may be associated with a particular material and the material properties of that segment may be determined by querying a material property database.
Annotations may be associated with their respective segments in any suitable manner. In some embodiments, such an annotation may be stored as metadata appended to particular segments of a 3D model. In some cases, a user may manually generate annotations for a 3D model within a computer-aided design (CAD) software application. For example, a manufacturer of a product may generate a 3D model of that product, segment that 3D model based on any logical feature, and then annotate each segment with its material properties. The 3D model may then be provided to a mobile application server, where it may be stored in an object model database in association with a particular product.
By way of illustrating an example of material properties associated with a segment of a 3D model, annotation 306 represents various material information stored in relation to segment 304 (A) of 3D model 302. Each other segment 304 (B-E) may include an annotation similar to annotation 306 which includes an indication of a material and/or material properties. Additionally, boundaries between segments may be separately associated with an annotation. For example, seam 308 may represent a boundary between segment 304(B) and 304 (E) and may include an annotation 310 associated with a thread material used to connect the two segments.
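By way of example only, segment annotations such as annotation 306 and seam annotation 310 might be serialized as metadata along these lines; the field names and values are illustrative, not taken from this disclosure.

```python
# Hypothetical metadata layout for a segmented product model. Physical
# properties (elasticity, rigidity) drive deformation; visual properties
# (sheen) affect rendering only. Boundaries (seams) carry their own data.
product_model = {
    "product_id": "shirt-123",
    "segments": {
        "304A": {"material": "cotton-jersey",
                 "elasticity": 0.20,   # physical: limits stretch
                 "rigidity": 0.05,     # physical: limits bending
                 "sheen": 0.1},        # visual: used when rendering
        "304B": {"material": "rib-knit", "elasticity": 0.35, "rigidity": 0.02},
    },
    "seams": {
        ("304B", "304E"): {"thread": "polyester", "max_tension": 40.0},
    },
}
```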
FIG. 4 is a simplified flowchart illustrating a method of determining a level of fit for a product and a user according to an embodiment of the present invention. The flow is described in connection with a computer system that is an example of the computer systems described herein. Some or all of the operations of the flows can be implemented via specific hardware on the computer system and/or can be implemented as computer-readable instructions stored on a non-transitory computer-readable medium of the computer system. As stored, the computer-readable instructions represent programmable modules that include code executable by a processor of the computer system. The execution of such instructions configures the computer system to perform the respective operations. Each programmable module in combination with the processor represents a means for performing a respective operation (s) . While the operations are illustrated in a particular order, it should be understood that no particular order is necessary and that one or more operations may be omitted, skipped, and/or reordered.
At 402, the method includes receiving a request to determine a level of fit for a product and a user. For example, a user may, when considering whether to purchase a particular product offered by an online retailer, request a determination as to a level of fit for that product with respect to the user. Such a request may be submitted by the user in order to receive an appropriate size recommendation (since garment sizes often vary between brands) . The request is transmitted by a user device operated by the user and subsequently received by a mobile application server.
At 404, the method includes identifying the user associated with the received request. In some embodiments, the user maintains an account with a mobile application server and the user is identified by virtue of login credentials provided by the user. In some embodiments, the user is identified via a user device identifier received in the request.
At 406, the method includes retrieving a user model to be used in determining a level of fit for the request. In some embodiments, this involves retrieving a 3D model of a user stored in association with the identified user. In some embodiments, this involves generating a 3D model of a user based on information about the user (e.g., measurement data) . In some embodiments, a 3D model of the user may be received from a user device, either along with the request or separately. It should be noted that techniques for using a user device to generate a 3D model of an object (e.g., a user) are described in greater detail with respect to FIG. 5 below.
At 408, the method includes identifying a product associated with the request. In some embodiments, the request includes an identifier for the product which may be used to identify it. For example, the request includes a stock-keeping unit (SKU) number or other suitable identifier. Alternatively, a user viewing a website dedicated to a particular product may initiate the request via a button on the website, in which case the product is identified by virtue of being associated with the website. At 410, the method includes retrieving a 3D model of the product. This involves querying an object model database for the identified product.
At 412, the method includes fitting the retrieved 3D model of the product to the 3D model of the user. To do this, various contact points on the 3D model of the product are snapped to particular vertices or points located on the 3D model of the user. The 3D model of the product is then deformed such that the interior surface of the 3D model of the product lies outside of (i.e., does not overlap with) the exterior surface of the 3D model of the user. The amount that the 3D model of the product is deformed is noted. Techniques for fitting a 3D model of a product to a 3D model of the user are described in greater detail with respect to FIG. 6 below.
At 414, the method includes determining whether the noted amount that the 3D model of the product is deformed is greater than some threshold deformation value. The threshold deformation value is determined as a function of one or more material property values stored in relation to the 3D model (or a segment therein). Upon determining that the threshold deformation value has been exceeded by the amount that the 3D model of the product is deformed, the method results in a determination that the product does not fit the user. The method further involves updating a size and/or style of the product at 416 and then repeating the method from step 410.
At 418, provided that the threshold deformation value has not been exceeded by the amount that the 3D model of the product is deformed, the method includes determining a level of fit for the product and user. More particularly, the method includes determining a level of fit as a numerical value based on the relationship between the amount that the 3D model of the product is deformed and the threshold deformation value. This numerical level of fit is then provided in response to the request.
It should be appreciated that the specific steps illustrated in FIG. 4 provide a particular method of determining a level of fit for a product and a user according to an embodiment of the present invention. As noted above, other sequences of steps may also be performed according to alternative embodiments. For example, alternative embodiments of the present invention may perform the steps outlined above in a different order. Moreover, the individual steps illustrated in FIG. 4 may include multiple sub-steps that may be performed in various sequences as appropriate to the individual step. Furthermore, additional steps may be added or removed depending on the particular applications. One of ordinary skill in the art would recognize many variations, modifications, and alternatives.
FIG. 5 depicts an illustrative example of a technique for obtaining 3D models using sensor data in accordance with at least some embodiments. In accordance with at least some embodiments, sensor data 502 may be obtained from one or more input sensors installed upon a user device. The captured sensor data 502 includes image information 504 captured by a camera device as well as depth map information 506 captured by a depth sensor.
As stated above, the sensor data 502 may include image information 504. One or more image processing techniques may be used on image information 504 in order to identify one or more objects within that image information 504. For example, edge detection may be used to identify a section 508 within the image information 504 that includes an object. To do this, discontinuities in brightness, color, and/or texture may be identified across an image in order to detect edges of various objects within the image. Section 508 depicts an illustrative example image of a chair in which such discontinuities have been emphasized.
As also stated above, the sensor data 502 may include depth information 506. In depth information 506, a value may be assigned to each pixel that represents a distance between the user device and a particular point corresponding to the location of that pixel. The depth information 506 may be analyzed to detect sudden variances in depth within the depth information 506. For example, sudden changes in distance may indicate an edge or a border of an object within the depth information 506.
In some embodiments, the sensor data 502 may include both image information 504 and depth information 506. In at least some of these embodiments, objects may first be identified in either the image information 504 or the depth information 506 and various attributes of the objects may be determined from the other information. For example, edge detection techniques may be used to identify a section of the image information 504 that includes an object 508. The section 508 may then be mapped to a corresponding section 510 in the depth information to determine depth information for the identified object (e.g., a point cloud) . In another example, a section 510 that includes an object may first be identified within the depth information 506. In this example, the section 510 may then be mapped to a corresponding section 508 in the image information to determine appearance attributes for the identified object (e.g., color or texture values) .
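A minimal example of the depth-discontinuity test described above, assuming a depth map in meters and an illustrative 5 cm jump threshold:

```python
import numpy as np

def depth_edges(depth_map, jump=0.05):
    """Mark pixels where depth changes abruptly between neighbors.

    A jump of 0.05 (5 cm, assuming meters) between adjacent pixels is
    treated as an object border; the threshold is illustrative only.
    """
    dy, dx = np.gradient(depth_map.astype(float))  # per-pixel depth change
    return np.hypot(dx, dy) > jump                 # boolean edge mask
```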
In some embodiments, various attributes (e.g., color, texture, point cloud data, object edges) of an object identified in sensor data 502 may be used as input to a machine learning module in order to identify or generate a 3D model 512 that matches the identified object. In some embodiments, a point cloud for the object may be generated from the depth information and/or image information and compared to point cloud data stored in a database to identify a closest matching 3D model. Alternatively, a 3D model of an object (e.g., a user or a product) may be generated using the sensor data 502. To do this, a mesh may be created from point cloud data obtained from a section 510 of depth information 506. The system may then map appearance data from a section of image information 504 corresponding to section 510 to the mesh to generate a basic 3D model. Although particular techniques are described, it should be noted that there are a number of techniques for identifying particular objects from sensor output.
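One possible point-cloud-to-mesh path is sketched below using the Open3D library's Poisson surface reconstruction; this is one option among many, the described system is not tied to any particular library, and API details may vary between Open3D versions.

```python
import numpy as np
import open3d as o3d

def mesh_from_points(xyz, colors=None):
    """Build a basic mesh from an N x 3 point cloud (and optional colors)."""
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(xyz)
    if colors is not None:            # appearance data mapped from the image
        pcd.colors = o3d.utility.Vector3dVector(colors)
    pcd.estimate_normals()            # Poisson reconstruction needs normals
    mesh, _densities = o3d.geometry.TriangleMesh.create_from_point_cloud_poisson(
        pcd, depth=8)                 # depth controls reconstruction detail
    return mesh
```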
As described elsewhere, sensor data captured by a user device (e.g., user device 102 of FIG. 1) may be used to generate a 3D model of a user using the techniques described above. This 3D model of a user may then be provided to a mobile application server as user data. In some embodiments, sensor data may be used to generate a 3D model of a product, which may then be stored in an object model database (e.g., object model data 240 of FIG. 2). For example, a user wishing to sell a product may capture sensor data related to the product from his or her user device. That user device may then generate a 3D model in the manner outlined above and may provide that 3D model to the mobile application server.
FIG. 6 illustrates an example technique for fitting a 3D model for a product onto a 3D model for a user in order to determine a level of fit in accordance with at least some embodiments. Illustratively, FIG. 6 depicts a 3D model of a user 602 and a 3D model of a product 604. It should be noted that unless the 3D models are to be rendered (which they may not be) , there is no need to include image data in a 3D model. Accordingly, a 3D model may include only structural (e.g., mesh) data for the purposes of this technique.
In some embodiments, the 3D model of the product 604 may include a number of contact points which are intended to be snapped to particular areas of the 3D model of the user 602. For example, a contact point 606 (A) on a shoulder strap of the 3D model of the product 604 may be snapped to a particular vertex or point 606 (B) located on a shoulder of the 3D model of the user 602. This may be repeated for a number of contact points located throughout the 3D model of the product 604. In some embodiments, contact points for a product may be determined automatically based on a category or type of the 3D model of the product 604. For example, all 3D models of t-shirts may utilize the same set of contact points. In some embodiments, contact points for a particular 3D model of a product 604 may be set by a user, such as a manufacturer of the product.
Once the set of contact points has been matched to appropriate points on the 3D model of the user 602, a rigid body transformation can be estimated from the corresponding points. A rigid body transformation may consist of a translation and a rotation. By applying the rigid body transformation to the 3D model of the product 604, the 3D model of the product 604 and the 3D model of the user 602 are aligned approximately in 3D. Then the contact points of the 3D model of the product 604 can be snapped to the corresponding points on the 3D model of the user 602 by adding a small translation to the contact points of the 3D model of the product 604. Once a number of contact points (e.g., 606) have been snapped to an appropriate point on the 3D model of the user 602, the 3D model of the product 604 may be deformed such that the interior surface of the 3D model of the product 604 lies against or outside of an exterior surface of the 3D model of the user 602. This may involve stretching and/or bending the 3D model of the product 604 in order to move one or more surfaces toward the exterior of the 3D model of the user 602 without removing the contact points of the 3D model of the product 604 from their current locations on the 3D model of the user 602. As the 3D model of the product 604 is deformed in this manner, the amount of deformation associated with each portion of the 3D model of the product 604 is noted. During this step, a physics engine may be used to accurately predict the deformation of the material in the 3D model of the product 604 as it is deformed. Adjustments of certain parts of the 3D model of the product 604 can be done using mesh editing techniques (e.g., Laplacian Surface Editing), in which one or more ROIs (regions of interest) of the product mesh can be adjusted in accordance with a set of parameters.
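Estimating a rigid body transformation from corresponding point pairs is a classic least-squares alignment problem; a standard solution is the Kabsch algorithm, sketched here as one way (not necessarily the disclosed way) to compute the rotation and translation.

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src points onto
    dst points (Kabsch algorithm); src and dst are N x 3 arrays of
    corresponding contact points."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)        # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t                                # apply as R @ p + t
```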
In other words, the process of fitting can be done by adjusting a set of parameters that control the deformation of one or more ROIs of the product model until the product model fits the user model. The set of parameters can be defined as a set of measurements, such as the displacement of each of the vertices. This process can be formulated as an optimization problem, where any of several optimization algorithms can be used to find the set of parameters that minimizes one or more cost functions. The one or more cost functions can be defined in many ways. A cost function may be defined as the number of penetrations between the meshes of the two 3D models, or as the average distance from the vertices of the body mesh to the garment mesh, among other possibilities.
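By way of illustration, this optimization might be sketched as follows; the ROI indices, the signed_distance helper (assumed to return negative values for points inside the user mesh), and all weights are assumptions for the sake of example, not a prescribed implementation.

```python
# A sketch of fitting-as-optimization; signed_distance(points) is a
# hypothetical helper (negative inside the user mesh) and the weights
# are illustrative placeholders.
import numpy as np
from scipy.optimize import minimize

def fit_roi(product_vertices, roi_indices, signed_distance, clearance=0.002):
    def cost(flat_displacements):
        d = flat_displacements.reshape(-1, 3)
        moved = product_vertices.copy()
        moved[roi_indices] += d
        sd = signed_distance(moved[roi_indices])
        penetration = np.maximum(0.0, -sd).sum()    # vertices inside the body
        gap = np.abs(sd - clearance).mean()         # keep the garment near the body
        effort = np.linalg.norm(d, axis=1).mean()   # prefer small adjustments
        return 10.0 * penetration + gap + 0.1 * effort

    x0 = np.zeros(3 * len(roi_indices))
    result = minimize(cost, x0, method="L-BFGS-B")  # one of several possible solvers
    return result.x.reshape(-1, 3)                  # per-vertex displacements
```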
Once the interior surface of the 3D model of the product 604 lies against or outside of an exterior surface of the 3D model of the user 602, a fitted model 608 is generated. Once the fitted model has been generated, the technique involves determining whether the amount of deformation associated with each portion of the 3D model of the product 604 needed to generate the fitted model is within acceptable threshold values. To do this, material property values are obtained for each segment and threshold values are determined from those material property values. The amount of deformation associated with each segment of the 3D model may then be compared to the respective threshold value. For example, material property values may indicate an elasticity value for a particular segment. In this example, a maximum amount that the segment may be stretched is determined from that elasticity value. The amount that the segment was stretched to create the fitted model 608 is then compared to that maximum amount. If the amount that the segment was stretched exceeds the maximum amount that the segment may be stretched, then the product cannot fit the user. If the amount of deformation associated with each segment of the 3D model is within acceptable threshold values (e.g., no threshold value is exceeded), then a level of fit may be determined. By way of example, the level of fit may be determined as the ratio of the amount that the segment was stretched to the maximum amount by which the segment may be stretched. In some cases, a level of fit may correspond to a level of comfort, in that a higher degree of deformation in relation to the maximum deformation threshold might equate to a lower level of comfort.
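A minimal sketch of this per-segment threshold check follows; the stretch amounts and elasticity-derived limits are illustrative placeholders.

```python
# Per-segment threshold check and fit-level computation; all values are
# illustrative.
def level_of_fit(segments):
    """Each segment carries its measured 'stretch' and its 'max_stretch'
    derived from the segment's elasticity value.

    Returns None if any threshold is exceeded (the product cannot fit);
    otherwise the worst-case stretch ratio (higher means tighter, which
    may map to a lower level of comfort)."""
    ratios = []
    for segment in segments:
        if segment["stretch"] > segment["max_stretch"]:
            return None                      # deformation threshold exceeded
        ratios.append(segment["stretch"] / segment["max_stretch"])
    return max(ratios)

fit = level_of_fit([
    {"stretch": 0.04, "max_stretch": 0.10},  # shoulder segment
    {"stretch": 0.08, "max_stretch": 0.10},  # chest segment
])
# fit == 0.8: within limits, but a snug fit
```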
Alternatively, or in addition, a level of fit can be determined based on an amount of space between one or more portions of the 3D model of the product and the 3D model of the user. For example, a total volume may be determined for a space located between the 3D model of the product and the 3D model of the user. In this example, if that total volume of space is greater than some threshold fit value, it may be determined that the product is not a fit for this user (e.g., the product is too large or would be uncomfortable). In some cases, such a threshold fit value may be proportional to a size of the 3D model of the product or the 3D model of the user.
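One rough way to estimate that volume is to integrate the gap over the garment surface; the sketch below assumes triangle faces given as index triples and a hypothetical distance_to_body helper returning each query point's distance to the user-mesh surface.

```python
# A rough gap-volume estimate; distance_to_body(points) is a hypothetical
# helper, not an API defined by this disclosure.
import numpy as np

def gap_volume(garment_vertices, garment_faces, distance_to_body):
    tri = garment_vertices[garment_faces]                      # (F, 3, 3)
    centroids = tri.mean(axis=1)
    # Triangle areas via the cross product of two edge vectors.
    areas = 0.5 * np.linalg.norm(
        np.cross(tri[:, 1] - tri[:, 0], tri[:, 2] - tri[:, 0]), axis=1)
    # Approximate the enclosed volume as the sum of (area x gap at centroid).
    return float(np.sum(areas * distance_to_body(centroids)))

# Per the description above, the threshold may scale with product size:
# too_loose = gap_volume(v, f, dist_fn) > k * product_reference_volume
```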
In some embodiments, particular types of products may be associated with a number of fit reference points 610 (A-D) which represent particular reference locations along which deformation should be measured against a threshold deformation value. In these embodiments, an amount of deformation of the 3D model of the product 604 needed to generate the fitted model 608 is measured across the reference location and then compared to a threshold value for that reference location. As above, the threshold value for that reference location may be determined based on material properties associated with the segment of the 3D model of the product 604 that includes the reference location.
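For illustration, deformation across a reference location might be approximated as the relative change in distance between two marked vertices of the product mesh before and after fitting; the vertex-pair representation here is an assumption.

```python
# Relative elongation across a fit reference location, measured between two
# marked vertices (e.g., the endpoints of reference point 610 (A)).
import numpy as np

def stretch_at_reference(rest_vertices, fitted_vertices, reference_pair):
    i, j = reference_pair
    rest_len = np.linalg.norm(rest_vertices[i] - rest_vertices[j])
    fitted_len = np.linalg.norm(fitted_vertices[i] - fitted_vertices[j])
    return (fitted_len - rest_len) / rest_len

# The reference location passes the check if its measured stretch does not
# exceed the threshold derived from the enclosing segment's material properties:
# ok = stretch_at_reference(rest_v, fitted_v, (i, j)) <= threshold
```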
FIG. 7 illustrates a flow diagram depicting a process for determining a level of fit for a product and user in accordance with at least some embodiments. The process 700 depicted in FIG. 7 may be performed by a mobile application server (e.g., mobile application server 204 of FIG. 2) in communication with a user device (e.g., user device 202 of FIG. 2).
Process 700 involves receiving a fitting request for a user and a product at 702. The request may be received from a user device and the user may be identified by virtue of being associated with the user device.
Process 700 involves obtaining a first 3D model associated with the user at 704. In some embodiments, the first 3D model is received and stored by a mobile application server prior to receiving the fitting request. In some embodiments, the first 3D model may be received along with the fitting request.
Process 700 involves obtaining a second 3D model associated with the product at 706. The second 3D model is segmented into a plurality of segments, wherein each segment of the plurality of segments is associated with one or more material properties. Such material properties may comprise one or more of an elasticity value or a rigidity value.
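For illustration, such a segmented product model might be represented as follows; the segment names, vertex ranges, and property values are assumptions for the sake of example, not values prescribed by this disclosure.

```python
# An illustrative segmented product model; all names and values are assumed.
product_model = {
    "category": "t-shirt",
    "segments": [
        {"name": "torso",  "vertices": list(range(0, 800)),
         "elasticity": 0.15, "rigidity": 0.20},   # stretchy knit body panel
        {"name": "collar", "vertices": list(range(800, 900)),
         "elasticity": 0.05, "rigidity": 0.60},   # stiffer ribbed collar
    ],
}
```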
Process 700 involves fitting the second 3D model onto the first 3D model at 708, such that each segment of the plurality of segments is deformed by the first 3D model in accordance with the one or more material properties associated with each segment.
Process 700 involves determining a level of fit for the user and the product based on a deformation of each segment of the plurality of segments at 710. In some embodiments, the level of fit is determined based at least in part on an amount of deformation needed to fit the second 3D model onto the first 3D model. In some embodiments, the level of fit is determined based at least in part on a volume of space determined to be between the second 3D model and the first 3D model. In some embodiments, the level of fit comprises a numerical value corresponding to the deformation of each segment of the plurality of segments in relation to a threshold level of deformation.
Process 700 further involves providing a size recommendation to a user at 712. In some embodiments, the product is associated with a first size or style, and the method further comprises, upon determining that an amount of deformation needed to fit the second 3D model onto the first 3D model is greater than a threshold deformation value, recommending a second size or style of the product to the user.
Process 700 further involves providing the second 3D model fitted onto the first 3D model to a user device at 714. Upon being received at the user device, the second 3D model fitted onto the first 3D model may be caused to be rendered on a display of the user device.
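By way of illustration only, the overall flow of process 700 might be sketched as follows; model_db, fit_models, level_of_fit, and next_size_up are hypothetical stand-ins for the components described above, not an actual API.

```python
# A hypothetical end-to-end sketch of process 700.
def handle_fitting_request(request, model_db):
    user_model = model_db.get_user_model(request["user_id"])           # step 704
    product_model = model_db.get_product_model(request["product_id"])  # step 706
    fitted_model, deformations = fit_models(product_model, user_model) # step 708
    fit_level = level_of_fit(deformations)                             # step 710
    response = {"fitted_model": fitted_model, "level_of_fit": fit_level}
    if fit_level is None:                        # deformation threshold exceeded
        response["recommendation"] = next_size_up(product_model)       # step 712
    return response  # the fitted model is rendered on the user device (step 714)
```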
FIG. 8 illustrates examples of components of a computer system 800 according to certain embodiments. The computer system 800 is an example of the computer system described herein above. Although these components are illustrated as belonging to a same computer system 800, the computer system 800 can also be distributed.
The computer system 800 includes at least a processor 802, a memory 804, a storage device 806, input/output peripherals (I/O) 808, communication peripherals 810, and an interface bus 812. The interface bus 812 is configured to communicate, transmit, and transfer data, controls, and commands among the various components of the computer system 800. The memory 804 and the storage device 806 include computer-readable storage media, such as RAM, ROM, electrically erasable programmable read-only memory (EEPROM), hard drives, CD-ROMs, optical storage devices, magnetic storage devices, electronic non-volatile computer storage (for example, FLASH™ memory), and other tangible storage media. Any such computer-readable storage medium can be configured to store instructions or program code embodying aspects of the disclosure. The memory 804 and the storage device 806 also include computer-readable signal media. A computer-readable signal medium includes a propagated data signal with computer-readable program code embodied therein. Such a propagated signal takes any of a variety of forms including, but not limited to, electromagnetic, optical, or any combination thereof. A computer-readable signal medium includes any computer-readable medium that is not a computer-readable storage medium and that can communicate, propagate, or transport a program for use in connection with the computer system 800.
Further, the memory 804 includes an operating system, programs, and applications. The processor 802 is configured to execute the stored instructions and includes, for example, a logical processing unit, a microprocessor, a digital signal processor, and other processors. The memory 804 and/or the processor 802 can be virtualized and can be hosted within another computer system of, for example, a cloud network or a data center. The I/O peripherals 808 include user interfaces, such as a keyboard, screen (e.g., a touch screen), microphone, speaker, other input/output devices, and computing components, such as graphical processing units, serial ports, parallel ports, universal serial buses, and other input/output peripherals. The I/O peripherals 808 are connected to the processor 802 through any of the ports coupled to the interface bus 812. The communication peripherals 810 are configured to facilitate communication between the computer system 800 and other computing devices over a communications network and include, for example, a network interface controller, modem, wireless and wired interface cards, antenna, and other communication peripherals.
While the present subject matter has been described in detail with respect to specific embodiments thereof, it will be appreciated that those skilled in the art, upon attaining an understanding of the foregoing, may readily produce alterations to, variations of, and equivalents to such embodiments. Accordingly, it should be understood that the present disclosure has been presented for purposes of example rather than limitation, and does not preclude inclusion of such modifications, variations, and/or additions to the present subject matter as would be readily apparent to one of ordinary skill in the art. Indeed, the methods and systems described herein may be embodied in a variety of other forms; furthermore, various omissions, substitutions and changes in the form of the methods and systems described herein may be made without departing from the spirit of the present disclosure. The accompanying claims and their equivalents are intended to cover such forms or modifications as would fall within the scope and spirit of the present disclosure.
Unless specifically stated otherwise, it is appreciated that throughout this specification discussions utilizing terms such as “processing,” “computing,” “calculating,” “determining,” and “identifying” or the like refer to actions or processes of a computing device, such as one or more computers or a similar electronic computing device or devices, that manipulate or transform data represented as physical electronic or magnetic quantities within memories, registers, or other information storage devices, transmission devices, or display devices of the computing platform.
The system or systems discussed herein are not limited to any particular hardware architecture or configuration. A computing device can include any suitable arrangement of components that provide a result conditioned on one or more inputs. Suitable computing devices include multipurpose microprocessor-based computer systems accessing stored software that programs or configures the computer system from a general-purpose computing apparatus to a specialized computing apparatus implementing one or more embodiments of the present subject matter. Any suitable programming, scripting, or other type of language or combinations of languages may be used to implement the teachings contained herein in software to be used in programming or configuring a computing device.
Embodiments of the methods disclosed herein may be performed in the operation of such computing devices. The order of the blocks presented in the examples above can be varied; for example, blocks can be re-ordered, combined, and/or broken into sub-blocks. Certain blocks or processes can be performed in parallel.
Conditional language used herein, such as, among others, “can,” “could,” “might,” “may,” “e.g.,” and the like, unless specifically stated otherwise, or otherwise understood within the context as used, is generally intended to convey that certain examples include, while other examples do not include, certain features, elements, and/or steps. Thus, such conditional language is not generally intended to imply that features, elements, and/or steps are in any way required for one or more examples or that one or more examples necessarily include logic for deciding, with or without author input or prompting, whether these features, elements, and/or steps are included or are to be performed in any particular example.
The terms “including,” “having,” and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, acts, operations, and so forth. Also, the term “or” is used in its inclusive sense (and not in its exclusive sense) so that when used, for example, to connect a list of elements, the term “or” means one, some, or all of the elements in the list. The use of “adapted to” or “configured to” herein is meant as open and inclusive language that does not foreclose devices adapted to or configured to perform additional tasks or steps. Additionally, the use of “based on” is meant to be open and inclusive, in that a process, step, calculation, or other action “based on” one or more recited conditions or values may, in practice, be based on additional conditions or values beyond those recited. Similarly, the use of “based at least in part on” is meant to be open and inclusive, in that a process, step, calculation, or other action “based at least in part on” one or more recited conditions or values may, in practice, be based on additional conditions or values beyond those recited. Headings, lists, and numbering included herein are for ease of explanation only and are not meant to be limiting.
The various features and processes described above may be used independently of one another, or may be combined in various ways. All possible combinations and sub-combinations are intended to fall within the scope of the present disclosure. In addition, certain method or process blocks may be omitted in some implementations. The methods and processes described herein are also not limited to any particular sequence, and the blocks or states relating thereto can be performed in other sequences that are appropriate. For example, described blocks or states may be performed in an order other than that specifically disclosed, or multiple blocks or states may be combined in a single block or state. The example blocks or states may be performed in serial, in parallel, or in some other manner. Blocks or states may be added to or removed from the disclosed examples. Similarly, the example systems and components described herein may be configured differently than described. For example, elements may be added to, removed from, or rearranged compared to the disclosed examples.
Claims (20)
- A method comprising:
receiving a fitting request comprising at least an indication of a user and a product;
obtaining a first 3D model associated with the user;
obtaining a second 3D model associated with the product, the second 3D model being segmented into a plurality of segments, wherein each segment of the plurality of segments is associated with one or more material properties;
fitting the second 3D model onto the first 3D model such that each segment of the plurality of segments is deformed by the first 3D model in accordance with the one or more material properties associated with each segment; and
determining a level of fit for the user and the product based on a deformation of each segment of the plurality of segments.
- The method of claim 1 wherein the level of fit is determined based at least in part on an amount of deformation needed to fit the second 3D model onto the first 3D model.
- The method of claim 1 wherein the level of fit is determined based at least in part on a volume of space determined to be between the second 3D model and the first 3D model.
- The method of claim 1 wherein the product is associated with a first size or style, the method further comprising, upon determining that an amount of deformation needed to fit the second 3D model onto the first 3D model is greater than a threshold deformation value, recommending a second size or style of the product to the user.
- The method of claim 1 wherein the level of fit comprises a numerical value corresponding to the deformation of each segment of the plurality of segments in relation to a threshold level of deformation.
- The method of claim 1 wherein the material properties comprise one or more of an elasticity value or a rigidity value.
- The method of claim 1 further comprising providing the second 3D model fitted onto the first 3D model to a user device, such that the second 3D model fitted onto the first 3D model is caused to be rendered on a display of the user device.
- A system comprising:
a processor; and
a memory including instructions that, when executed with the processor, cause the system to, at least:
receive a fitting request comprising at least an indication of a user and a product;
obtain a first 3D model associated with the user;
obtain a second 3D model associated with the product, the second 3D model being segmented into a plurality of segments, wherein each segment of the plurality of segments is associated with one or more material properties;
fit the second 3D model onto the first 3D model such that each segment of the plurality of segments is deformed by the first 3D model in accordance with the one or more material properties associated with each segment; and
determine a level of fit for the user and the product based on a deformation of each segment of the plurality of segments.
- The system of claim 8 wherein the level of fit is determined based at least in part on an amount of deformation needed to fit the second 3D model onto the first 3D model.
- The system of claim 8 wherein the level of fit is determined based at least in part on a volume of space determined to be between the second 3D model and the first 3D model.
- The system of claim 8 wherein the product is associated with a first size or style, and wherein the instructions further cause the system to, upon determining that an amount of deformation needed to fit the second 3D model onto the first 3D model is greater than a threshold deformation value, recommend a second size or style of the product to the user.
- The system of claim 8 wherein the level of fit comprises a numerical value corresponding to the deformation of each segment of the plurality of segments in relation to a threshold level of deformation.
- The system of claim 8 wherein the material properties comprise one or more of an elasticity value or a rigidity value.
- The system of claim 8 wherein the instructions further cause the system to provide the second 3D model fitted onto the first 3D model to a user device, such that the second 3D model fitted onto the first 3D model is caused to be rendered on a display of the user device.
- A non-transitory computer readable medium storing specific computer-executable instructions that, when executed by a processor, cause a computer system to at least:
receive a fitting request comprising at least an indication of a user and a product;
obtain a first 3D model associated with the user;
obtain a second 3D model associated with the product, the second 3D model being segmented based on material properties;
fit the second 3D model onto the first 3D model such that each segment of the second 3D model is deformed by the first 3D model in accordance with the material properties for the segment; and
determine a level of fit for the user and the product based on deformation of each segment.
- The non-transitory computer readable medium of claim 15 wherein the level of fit is determined based at least in part on an amount of deformation needed to fit the second 3D model onto the first 3D model.
- The non-transitory computer readable medium of claim 15 wherein the level of fit is determined based at least in part on a volume of space determined to be between the second 3D model and the first 3D model.
- The non-transitory computer readable medium of claim 15 wherein the product is associated with a first size, and wherein the instructions further cause the computer system to, upon determining that an amount of deformation needed to fit the second 3D model onto the first 3D model is greater than a threshold deformation value, recommend a second size or style of the product to the user.
- The non-transitory computer readable medium of claim 15 wherein the level of fit comprises a numerical value corresponding to the deformation of each segment of the second 3D model in relation to a threshold level of deformation.
- The non-transitory computer readable medium of claim 15 wherein the material properties comprise one or more of an elasticity value or a rigidity value.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| CN202180016193.8A (CN115315728A) | 2020-03-09 | 2021-03-01 | System and method for virtual adaptation |

Applications Claiming Priority (2)

| Application Number | Priority Date | Filing Date | Title |
| --- | --- | --- | --- |
| US202062987196P | 2020-03-09 | 2020-03-09 | |
| US 62/987,196 | 2020-03-09 | | |

Publications (2)

| Publication Number | Publication Date |
| --- | --- |
| WO2021179936A1 | 2021-09-16 |
| WO2021179936A9 | 2022-09-09 |

Family ID: 77671213

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
| --- | --- | --- | --- |
| PCT/CN2021/078533 (WO2021179936A1) | System and method for virtual fitting | 2020-03-09 | 2021-03-01 |

Country Status (2)

| Country | Link |
| --- | --- |
| CN | CN115315728A |
| WO | WO2021179936A1 |