US10909743B2 - Multiscale 3D texture synthesis - Google Patents
Multiscale 3D texture synthesis
- Publication number
- US10909743B2 (application US15/856,759)
- Authority
- US
- United States
- Prior art keywords
- texture
- map
- hierarchical
- resolution
- hierarchical algorithm
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/04—Texture mapping
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/23—Clustering techniques
- G06F18/231—Hierarchical techniques, i.e. dividing or merging pattern sets so as to obtain a dendrogram
-
- G06K9/6219—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Definitions
- This disclosure relates to generating texture maps for use in rendering visual output. More particularly, the present disclosure relates to generating texture maps at multiple resolution scales using hierarchical algorithms.
- Texture maps add colouring and other properties to 3D models, turning them from flat un-textured surfaces into more detailed representations of the object being modelled.
- the texture is mapped onto the 3D model.
- Texture maps can comprise colour maps, such as RGB maps, that apply colouring to the 3D models. These define the surface colours for a particular texture, and are perhaps the most common form of texture map. Colour maps are also referred to as diffuse maps. Generally, they contain no lighting or shading information relating to the surface texture.
- Further maps on top of the colour maps can be used to add additional texture features to the 3D model.
- Normal maps are a map of the surface normal vectors of the texture, which can be used to enhance lighting effects on the model.
- Specular maps contain information relating to the shininess of areas on the surface.
- Displacement maps, bump maps, depth maps and height maps all add the effect of raised and lowered details on the surface.
- These additional texture maps are often referred to as additional channels of the texture map.
- the textures applied to an object need to have different visual qualities at different scales. For example, an object close by needs a more detailed texture, but on a smaller scale, than an object at a distance, which can appear less detailed but needs to be rendered on a larger scale.
- a large texture at a high resolution needs to be stored. This consumes a large amount of memory and can also be computationally expensive, especially when a large number of different textures are required for rendering objects.
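The memory cost described above can be made concrete with a quick back-of-the-envelope calculation; the resolutions and texel format below are illustrative assumptions, not figures from the patent:

```python
# Illustrative only: storage cost of keeping one high-resolution texture
# (or a whole resolution chain) resident versus synthesising it on demand.
def texture_bytes(width, height, channels=4, bytes_per_channel=1):
    # uncompressed texel storage: width x height x channels x bytes/channel
    return width * height * channels * bytes_per_channel

full = texture_bytes(8192, 8192)  # one uncompressed 8K RGBA texture
print(f"8K RGBA texture: {full / 2**20:.0f} MiB")  # prints "8K RGBA texture: 256 MiB"

# Storing every level of a 1K..8K chain costs even more.
chain = sum(texture_bytes(2**s, 2**s) for s in range(10, 14))
print(f"1K-8K chain: {chain / 2**20:.0f} MiB")
```

With many distinct textures in a scene, these per-texture costs multiply, which is the motivation for generating finer levels only when they are needed.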
- Machine learning is the field of study where a computer or computers learn to perform classes of tasks using the feedback generated from the experience or data gathered that the machine learning process acquires during computer performance of those tasks.
- machine learning can be broadly classed as supervised and unsupervised approaches, although there are particular approaches such as reinforcement learning and semi-supervised learning which have special rules, techniques and/or approaches.
- Supervised machine learning is concerned with a computer learning one or more rules or functions to map between example inputs and desired outputs as predetermined by an operator or programmer, usually where a data set containing the inputs is labelled.
- Unsupervised learning is concerned with determining a structure for input data, for example when performing pattern recognition, and typically uses unlabelled data sets.
- Reinforcement learning is concerned with enabling a computer or computers to interact with a dynamic environment, for example when playing a game or driving a vehicle.
- For unsupervised machine learning, there is a range of possible applications such as, for example, the application of computer vision techniques to image processing or video enhancement. Unsupervised machine learning is typically applied to solve problems where an unknown data structure might be present in the data. As the data is unlabelled, the machine learning process is required to operate to identify implicit relationships between the data, for example by deriving a clustering metric based on internally derived information.
- an unsupervised learning technique can be used to reduce the dimensionality of a data set and attempt to identify and model relationships between clusters in the data set, and can for example generate measures of cluster membership or identify hubs or nodes in or between clusters (for example using a technique referred to as weighted correlation network analysis, which can be applied to high-dimensional data sets, or using k-means clustering to cluster data by a measure of the Euclidean distance between each datum).
- Semi-supervised learning is typically applied to solve problems where there is a partially labelled data set, for example where only a subset of the data is labelled.
- Semi-supervised machine learning makes use of externally provided labels and objective functions as well as any implicit data relationships.
- the machine learning algorithm can be provided with some training data or a set of training examples, in which each example is typically a pair of an input signal/vector and a desired output value, label (or classification) or signal.
- the machine learning algorithm analyses the training data and produces a generalised function that can be used with unseen data sets to produce desired output values or signals for the unseen input vectors/signals. The user needs to decide what type of data is to be used as the training data, and to prepare a representative real-world set of data.
- the user must however take care to ensure that the training data contains enough information to accurately predict desired output values without providing too many features (which can result in too many dimensions being considered by the machine learning process during training, and could also mean that the machine learning process does not converge to good solutions for all or specific examples).
- the user must also determine the desired structure of the learned or generalised function, for example whether to use support vector machines or decision trees.
- aspects and/or embodiments seek to provide an improved method of texture generation for use in visual rendering.
- a method for generating textures for use in rendering visual output comprising the steps of: generating, using a first hierarchical algorithm, a first texture from one or more sets of initialisation data; and selectively refining the first texture, using one or more further hierarchical algorithms, to generate one or more further textures from at least a section of the first texture and one or more sets of further initialisation data; wherein at least a section of each of the one or more further textures differs from the first texture.
- one or more of the one or more further textures may differ from the first texture in being at a higher resolution and/or in being a different class of texture map and/or in comprising a region of texture not present in the first texture. This can enable a more varied set of textures to be generated from the first texture, which can provide a more aesthetically pleasing rendering of one or more 3D objects.
- the one or more sets of initialisation data may comprise one or more initial sets of random noise; the method may comprise the step of generating the one or more initial sets of random noise.
- the one or more sets of random noise may be generated from one or more sets of noise parameters, wherein the one or more sets of noise parameters may comprise at least one of: a random noise generator type; a random noise generator distribution; a seed or initialisation parameter; and/or one or more directional components and/or one or more noise generation procedures.
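Storing noise parameters rather than the noise itself is enough to regenerate identical initialisation data later. A minimal sketch of this idea, in which the parameter names (`generator_type`, `seed`, `scale`) are illustrative assumptions rather than terms from the patent:

```python
import numpy as np

def generate_noise(shape, generator_type="gaussian", seed=0, scale=1.0):
    # The seed makes the noise fully reproducible: the same parameters
    # always yield the same noise, so only the parameters need storing.
    rng = np.random.default_rng(seed)
    if generator_type == "gaussian":
        return rng.normal(0.0, scale, size=shape)
    if generator_type == "uniform":
        return rng.uniform(-scale, scale, size=shape)
    raise ValueError(f"unknown generator type: {generator_type}")

# Regenerating with the same stored parameters reproduces identical noise.
z1 = generate_noise((64, 64, 3), generator_type="gaussian", seed=42)
z2 = generate_noise((64, 64, 3), generator_type="gaussian", seed=42)
assert (z1 == z2).all()
```

A directional component or generation procedure, as mentioned above, could be layered on top of this by post-processing the raw noise field.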
- the method may further comprise the step of storing the one or more sets of noise parameters. This can enable the one or more further textures to be randomly modified which may result in a greater level of variation.
- the one or more sets of further initialisation data may comprise one or more sets of further random noise which may comprise the step of generating the one or more sets of further random noise.
- the one or more sets of further random noise may be generated from one or more sets of further noise parameters, wherein the one or more sets of further noise parameters may comprise at least one of: a random noise generator type; a random noise generator distribution; a seed; and/or a directional component or procedure.
- the method may further comprise the step of storing the one or more sets of further noise parameters. The generation of one or more sets of further random noise can enable different sets of noise to be generated based upon different parameters and/or based upon different hardware characteristics of a system or apparatus designed to execute the method.
- the one or more sets of initialisation data may comprise a known texture and/or wherein the one or more sets of further initialisation data may comprise a known texture.
- the known texture comprises at least a part of a previously generated first texture and/or previously generated one or more further textures. This can enable the one or more generated textures to be visually similar to the one or more known textures provided as initialisation data.
- the method may further comprise the step of storing the first texture and/or one or more further textures in a texture library after they have been generated. This can enable generated textures to be reused without the need for regenerating them.
- the first texture and/or one or more of the one or more further textures may comprise a colour map.
- the first texture and/or one or more of the one or more textures may further comprise one or more additional maps comprising at least one of: a normal map; a height map; a bump map; a displacement map; a specular map; or a depth map, and/or wherein the one or more additional maps may be jointly optimised with the colour map and any of the other one or more additional maps. This can enable a more varied texture to be generated based upon a number of factors, including the object to which the texture may be applied.
- the method may further comprise the step of generating one or more additional maps from the first texture using a separate hierarchical algorithm.
- the method may further comprise the step of generating one or more further additional maps from the second texture using an additional separate hierarchical algorithm, wherein the additional separate hierarchical algorithm may additionally take the one or more additional maps as an input.
- the one or more additional maps and/or one or more further additional maps may comprise at least one of: a normal map; a height map; a bump map; a displacement map; a specular map; or a depth map. This can enable one or more different hierarchical algorithms which may have a different set of parameters to generate one or more textures different from the one or more other hierarchical algorithms.
- the separate hierarchical algorithm used to generate the one or more additional maps from the first texture can be different to the first hierarchical algorithm, or identical to it.
- the first and one or more of the one or more further hierarchical algorithms share one or more layers.
- Having a number of layers, which may or may not be sequential, recurrent, recursive, branching or merging, allows different levels of processing to occur at different times; the layers can work in parallel, ensuring optimal efficiency when enhancing the resolution of the visual data.
- the first texture and/or one or more further textures may be output in real time.
- the one or more further textures may comprise a plurality of further textures, and the hierarchical algorithms may be texture specific, wherein the hierarchical algorithms may be selected from a library of hierarchical algorithms based on properties of a texture required to be generated. This enables any generated textures to be displayed and applied to an object quickly, and also enables more accurate generation of the textures based upon the parameters of the hierarchical algorithms and other system/texture requirements.
- the hierarchical algorithms may have been developed using a learned approach, and/or the hierarchical algorithms may comprise a feed forward process, and/or the hierarchical algorithms may form a part of an iterative optimisation process.
- hierarchical or non-hierarchical algorithms can be substantially accurate and therefore enable a more accurate reconstruction, for example produce higher quality visual data from the low-quality visual data that is transmitted, for example where quality can be measured by resolution, a perceptual measure or metric determining that the quality is sufficiently aesthetically pleasing or by a low reproduction error rate in comparison to the original high-quality visual data.
- the hierarchical or non-hierarchical algorithms can produce higher quality versions of visual data using the fidelity data.
- a down-sampled version of the resulting visual data comes out to be the same or similar as a down-sampled version of the original visual data.
- using a learned approach can substantially tailor the hierarchical model or models for each portion of visual data.
- each of the hierarchical algorithms may comprise at least one of: a nonlinear hierarchical algorithm; a neural network; a convolutional neural network; a layered algorithm; a recurrent neural network; a long short-term memory network; a multi-dimensional convolutional network; a memory network; or a gated recurrent network.
- a method of training hierarchical algorithms for use in texture generation comprising the steps of: generating one or more textures at a first resolution from a known texture at a second resolution using a hierarchical algorithm; and optimising at least a subset of the hierarchical algorithm parameters based on results of comparing the statistics of the one or more textures to statistics of one or more further known textures at the first resolution; wherein the first resolution is higher than the second resolution.
- the statistics of the one or more textures may be calculated using a descriptor network.
- the hierarchical algorithm may be stored in a library of hierarchical algorithms after being optimised. This enables the trained hierarchical algorithm to be reused without the need for regenerating it.
- the hierarchical algorithm may be trained to be texture specific, this may enable the hierarchical algorithm to be optimised for generating a texture based upon a specific texture provided as an input.
- hierarchical algorithm is preferably used to connote an algorithm with a hierarchical structure.
- examples of such algorithms include, but are not limited to, a nonlinear hierarchical algorithm; a neural network; a convolutional neural network; a layered algorithm; a recurrent neural network; a long short-term memory network; a multi-dimensional convolutional network; a memory network; or a gated recurrent network.
- FIG. 1 illustrates an example of a multiscale texture generation network
- FIG. 2 illustrates an embodiment of a texture generation process for generating multiple texture map types
- FIG. 3 illustrates an alternative embodiment of a texture generation process for generating multiple texture map types
- FIG. 4 illustrates an embodiment of a texture generation process for generating extensions of a known texture
- FIG. 5 shows an apparatus comprising a processing apparatus and memory according to an embodiment.
- FIG. 1 illustrates an example of a multiscale texture generation network.
- Initialisation data 101 is input into a first pretrained hierarchical algorithm 103 .
- the first hierarchical algorithm 103 uses the initialisation data 101 to generate a first texture map 105 at a first resolution, which it then outputs.
- This first texture map 105 is then used as an input, along with a further set of initialisation data 107 , for a second hierarchical algorithm 109 .
- the second hierarchical algorithm 109 uses the first texture map 105 and the further initialisation data 107 to generate a second texture map 111 , which is then output.
- the second texture map 111 is at a higher resolution than the first texture map 105 .
- This process can then be repeated on the second texture map 111 , to generate a third texture map 113 at a higher resolution than the second texture map 111 using a third hierarchical algorithm 115 and another set of initialisation data 117 .
- the process can be repeated N times to generate a series of texture maps at increasingly higher resolutions.
- the output texture map of the m-th hierarchical algorithm is used, along with a new set of initialisation data, as the input for the (m+1)-th hierarchical algorithm until texture maps at all the required resolutions have been generated, or until the highest resolution that the hierarchical algorithms have been trained for is reached.
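The chained generation described above can be sketched as a simple loop. The generator below is a placeholder (a 2x nearest-neighbour upsampler plus noise) standing in for a trained hierarchical algorithm; all names and the doubling factor are illustrative assumptions:

```python
import numpy as np

def placeholder_generator(texture, noise):
    # Stand-in for a trained hierarchical algorithm: upsample the coarser
    # texture 2x and inject the new initialisation noise as "detail".
    up = texture.repeat(2, axis=0).repeat(2, axis=1)
    return up + 0.1 * noise

def generate_chain(init_noise, n_levels, rng):
    # First texture: generated from initialisation data alone.
    textures = [placeholder_generator(np.zeros((8, 8, 3)), init_noise)]
    # Each further texture: previous output + fresh initialisation data.
    for _ in range(n_levels - 1):
        h, w, c = textures[-1].shape
        noise = rng.normal(size=(2 * h, 2 * w, c))
        textures.append(placeholder_generator(textures[-1], noise))
    return textures

rng = np.random.default_rng(0)
maps = generate_chain(rng.normal(size=(16, 16, 3)), n_levels=3, rng=rng)
print([m.shape for m in maps])  # resolution doubles at each level
```

The loop stops after `n_levels` steps, mirroring the description of repeating until the required, or highest trained-for, resolution is reached.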
- the initialisation data used at each step of the process comprises random noise, and/or a known texture sample.
- the initialisation data can be saved in order to generate identical textures in the future. This will be required when a user, for example a player in a game, views the same texture in the same context at a future point, in order to provide temporal consistency to an object's texture. Properties of the initialisation data, such as the amount of noise present, can be altered to provide a degree of control over the generated textures.
- the initialisation data can be one or more sets of parameters for generating random noise, such as the random generator type and/or distribution, a seed to start the generation from, and/or a directional component or procedure for how to proceed with the noise generation.
- Using a known texture as initialisation data allows variations on an existing artwork created by an artist to be created.
- the output textures comprise colour maps, such as an RGB map.
- These output textures can be saved in a library for future use to avoid regenerating them every time they are required.
- this library only stores the generated textures temporarily, for example only while a player of a computer game is in the vicinity of the area for which the textures were generated.
- the generation of the textures at multiple resolution scales can be performed on an ad-hoc, on the fly basis to meet a requirement for a particular texture at a certain resolution scale. For example, during a computer game a player may approach a textured object within the game from a distance. At this distance, the texture map for the object may only be required at a low resolution, so only the first texture map is generated. As the player approaches the object, higher resolution texture maps will be required to maintain the visual quality of the textured object, and these texture maps are required to be consistent with the previously generated coarser texture maps. These higher resolution texture maps will be generated from the coarser texture maps and initialisation data by the hierarchical algorithms as and when they are required for display to the player, in this example as the player approaches the textured object. In effect, the network of hierarchical algorithms generates a texture at a coarse scale, and then selectively refines the texture where needed.
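The on-demand behaviour above amounts to choosing a resolution level from viewing distance and refining only as far as that level, caching what has already been generated. A minimal sketch in which the distance thresholds and the `refine` callback are illustrative assumptions:

```python
cache = {}  # level index -> generated texture (here just a counter)

def level_for_distance(distance):
    # Nearer objects need finer levels (higher index); thresholds are
    # arbitrary for illustration.
    if distance > 100:
        return 0
    if distance > 50:
        return 1
    return 2

def get_texture(distance, refine):
    level = level_for_distance(distance)
    # Build any missing coarser levels first, so each refinement is
    # consistent with the previously generated coarser texture.
    for s in range(level + 1):
        if s not in cache:
            cache[s] = refine(cache.get(s - 1))
    return cache[level]

# Toy "refinement": each level is the previous value plus one.
tex = get_texture(distance=30.0,
                  refine=lambda prev: (0 if prev is None else prev) + 1)
print(tex)  # prints 3: levels 0, 1, 2 were generated in order
```

A real implementation would evict cache entries once the player leaves the vicinity, matching the temporary-library behaviour described above.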
- the series of hierarchical algorithms are trained on known sets of texture maps of a particular texture at different resolutions.
- the set contains known texture maps at a variety of resolutions, each indexed by an integer, s, up to the total number of texture resolutions that will be output by the series of hierarchical algorithms, S.
- a known texture map and a set of random noise is input.
- the texture map output of the s-th hierarchical algorithm in the series is denoted by f_θ^s(x_{s-1}, Z_{s-1}), where x_{s-1} is a known texture map at the resolution output by the (s-1)-th hierarchical algorithm in the sequence and Z_{s-1} is the set of random noise used as the input to the s-th hierarchical algorithm.
- the parameters of each of the hierarchical algorithms, θ, are varied in order to optimise the hierarchical algorithms.
- the texture map output by each of the hierarchical algorithms in the sequence is input into a descriptor network, which calculates some statistical properties, φ(f_θ^s(x_{s-1}, Z_{s-1})), of the generated texture maps. These are compared with statistical properties of the known texture maps at the same resolution, φ(x_s).
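The text does not specify which statistics the descriptor network computes. A common choice in neural texture synthesis, offered here purely as an assumption, is the Gram matrix of feature channels, which captures channel co-occurrence independently of texture size:

```python
import numpy as np

def gram_matrix(features):
    # features: (H, W, C) feature map from a descriptor network.
    # Returns the (C, C) matrix of channel inner products, normalised by
    # the number of spatial positions.
    h, w, c = features.shape
    flat = features.reshape(h * w, c)
    return flat.T @ flat / (h * w)

f = np.random.default_rng(0).normal(size=(32, 32, 8))
g = gram_matrix(f)
print(g.shape)  # prints (8, 8), independent of the 32x32 texture size
```

Because the statistic is independent of spatial extent, generated and known textures at the same resolution can be compared directly, as the comparison above requires.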
- the hierarchical algorithms are optimised by (approximately) minimising an expectation value, for example:
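The expectation value itself does not survive in this extract. A plausible form, consistent with the notation above and with standard multiscale texture-synthesis losses (a reconstruction, not the patent's verbatim equation), is:

```latex
\min_{\theta}\; \mathbb{E}_{x,\,Z}\!\left[\sum_{s=1}^{S}
  \left\| \phi\!\left(f_{\theta}^{\,s}(x_{s-1}, Z_{s-1})\right) - \phi(x_s) \right\|^{2}\right]
```

That is, the parameters θ are chosen so that, on average over training textures and noise draws, the generated statistics match the known-texture statistics at every scale s.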
- the training objective is to match image statistics at multiple scales.
- the hierarchical algorithms can be stored in a library of hierarchical algorithms, along with metadata relating to the sets of texture on which they were trained.
- This metadata can include, for example, the texture that the hierarchical algorithm was trained on, the resolutions of the training textures, and/or the texture map types output by the hierarchical algorithm.
- the method can also be used to generate texture maps that differ from each other in other properties instead.
- the second, and possibly subsequent, texture maps generated by this method may be at the same resolution as the first texture map, but contain additional visual content that is not present in the first texture map.
- FIG. 2 illustrates an embodiment of a texture generation process for generating multiple texture map types.
- one or more sets of initialisation data 201 such as random noise, one or more sets of parameters that can be used to generate random noise, or a known texture, are used as an input into a first hierarchical algorithm 203 .
- the first hierarchical algorithm 203 outputs a first set of texture maps 205 , which includes a colour map, such as an RGB map, and one or more further additional maps, for example at least one of: a normal map; a height map; a bump map; a displacement map; a specular map; and/or a depth map.
- a colour map such as an RGB map
- the first set of texture maps 205 can then be used as an input into a second hierarchical algorithm 207 , along with a new set of initialisation data 209 .
- the second hierarchical algorithm 207 will output a further set of textures 211 .
- At least a section of the further set of textures 211 differs from the first set of textures 205 . This difference can be, for example, that the further set of textures 211 is at a higher resolution than the first set of textures 205 , that it contains texture content that is not present in the first set of textures 205 , and/or that at least one of the additional texture maps is a different type to those in the first set of texture maps 205 .
- the set of further textures 211 output by the further hierarchical algorithm can be used as input for another hierarchical algorithm to generate more sets of further texture maps. This can be repeated iteratively, using the set of textures output by the (n-1)-th hierarchical algorithm in the sequence and a set of initialisation data as the input for the n-th hierarchical algorithm in the sequence. The n-th hierarchical algorithm will then output the next set of textures.
- the hierarchical algorithms used in this model are trained on sets of known texture maps, y.
- the set contains known texture maps with a variety of properties, each indexed by an integer, s, up to the total number, S.
- the sets of known texture maps include both colour maps, such as RGB maps, and known additional maps, as described above.
- each of the additional texture maps also referred to as additional channels, is input into a descriptor network to calculate statistical properties of that texture map.
- the descriptor network treats each of these channels as RGB channels when generating the statistics.
- let y be an M×N image/set of texture maps with extra channels corresponding to additional texture map types, i.e. instead of having M×N×3 dimensions, as would be the case for just an RGB map, y will have M×N×D channels.
- Each of the types of texture map in y corresponds to three of the D available channels. For example, y_{1:3} denotes the RGB channels and y_{4:6} could denote the normal map channels.
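The channel layout above maps directly onto array slicing. A small sketch (the specific map types stacked here are illustrative; note the text's indexing is 1-based while the code's is 0-based):

```python
import numpy as np

# Stacked texture maps: M x N spatial resolution, D = 9 channels, with
# each map type occupying three consecutive channels.
M, N = 64, 64
y = np.zeros((M, N, 9))  # e.g. RGB + normal + one further 3-channel map

rgb     = y[:, :, 0:3]   # y_{1:3} in the text's 1-based notation
normals = y[:, :, 3:6]   # y_{4:6}
print(rgb.shape, normals.shape)
```

Each three-channel slice can then be fed to the descriptor network as if it were an RGB image, as described above.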
- the known set of texture maps and a set of random noise is input.
- the texture map output into a particular set of channels (d to d+2) of the s-th hierarchical algorithm in the series is denoted by f_θ^s(y_{s-1}, Z_{s-1})_{d:d+2}, where y_{s-1} is a known texture map with the desired properties of the output of the (s-1)-th hierarchical algorithm in the sequence and Z_{s-1} is the set of random noise used as the input to the s-th hierarchical algorithm.
- the parameters of each of the hierarchical algorithms, θ, are varied in order to optimise the hierarchical algorithms.
- the channels associated with each of the texture maps output by the hierarchical algorithms in the sequence being trained are input into a descriptor network, which calculates some statistical properties, φ(f_θ^s(y_{s-1}, Z_{s-1})_{d:d+2}), of the generated texture maps. These are compared with statistical properties of the known texture maps with the desired properties in that channel, φ(y^s_{d:d+2}).
- the hierarchical algorithms are optimised by minimising an expectation value, for example:
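The equation is again absent from this extract. A reconstruction consistent with the per-channel notation above (an assumption following the same pattern as the single-map loss):

```latex
\min_{\theta}\; \mathbb{E}_{y,\,Z}\!\left[\sum_{s=1}^{S}\sum_{d}
  \left\| \phi\!\left(f_{\theta}^{\,s}(y_{s-1}, Z_{s-1})_{d:d+2}\right)
        - \phi\!\left(y^{s}_{d:d+2}\right) \right\|^{2}\right]
```

Summing over the channel triples d jointly optimises the colour map and every additional map in a single objective.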
- arbitrary subsets of channels can be selected to optimise the hierarchical algorithms; the channels used in the expectation value do not have to correspond with the types of texture maps being used.
- the hierarchical algorithms can be stored in a library of hierarchical algorithms, along with metadata relating to the sets of texture on which they were trained.
- This metadata can include, for example, the texture that the hierarchical algorithm was trained on, the resolutions of the training textures, and/or the texture map types output by the hierarchical algorithm.
- FIG. 3 illustrates an alternative embodiment of a texture generation process for generating multiple texture map types.
- one or more sets of initialisation data 301 such as random noise, one or more sets of parameters that can be used to generate random noise, or a known texture, are used as an input into a first hierarchical algorithm 303 .
- the first hierarchical algorithm 303 outputs a first texture map 305 , such as a colour map.
- This first texture map 305 is used as the input for hierarchical algorithm A 307 , which generates one or more further texture maps 309 of a different type to the first texture map 305 , but at the same resolution as the first texture map 305 .
- hierarchical algorithm A 307 generates a normal map from the colour map.
- the first texture map 305 , along with a new set of initialisation data 311 , is used as an input of a second hierarchical algorithm 313 to generate a second texture map 315 that is of the same type as the first texture map 305 , but at a higher resolution than the first texture map 305 .
- This second texture map 315 is used as the input for hierarchical algorithm B 317 , which generates one or more further texture maps 319 of a different type to the second texture map 315 , but at the same resolution as the second texture map 315 .
- the further texture maps 309 output by hierarchical algorithm A 307 can also be used as an input to hierarchical algorithm B 317 .
- the method can be repeated iteratively, using the texture map output by the (n-1)-th hierarchical algorithm in the sequence and a set of initialisation data as the input for the n-th hierarchical algorithm in the sequence.
- the n-th hierarchical algorithm will then output the next set of textures at a higher resolution than the (n-1)-th texture map.
- the n-th texture map will then be used as an input to an additional hierarchical algorithm that generates one or more further texture maps of a different type to the n-th texture map, but at the same resolution as the n-th texture map.
- These additional hierarchical algorithms can optionally take the one or more further texture maps generated from the (n-1)-th texture map as input as well.
- the hierarchical algorithms used to generate the texture maps from random noise and previously generated texture maps are trained as described in relation to the embodiment in FIG. 1 .
- the additional hierarchical algorithms such as hierarchical algorithm A and hierarchical algorithm B, are trained separately.
- the further texture map output of the s-th additional hierarchical algorithm in the series is denoted by g_Ψ^s(x_s, y_{s-1}),
- x_s is a known texture map of the same type as the first texture map and at the resolution of the further texture map being output by the additional hierarchical algorithm being trained.
- y_{s-1} denotes one or more known further texture maps at a resolution lower than the further texture map being output by the additional hierarchical algorithm being trained.
- ⁇ denotes a set of parameters of the additional hierarchical algorithm that are optimised, or have been optimised, during the training process.
- a descriptor network is applied to the further texture map output by the hierarchical algorithm being trained in order to calculate a set of statistical properties of the output further texture map, ⁇ (g ⁇ s (x s , y s-1 )). This is compared with statistical properties of known further texture maps at the same resolution as the output further texture map, ⁇ (y s ), in order to optimise the set of parameters of the additional hierarchical algorithm, ⁇ .
- the hierarchical algorithms are optimised by minimising an expectation value, for example:
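The expectation value itself appears as an equation image in the published patent and is not reproduced in this text. A plausible reconstruction, consistent with the descriptor notation of the preceding paragraphs (an assumption, not the patent's verbatim formula), is:

```latex
\theta^{*} \;=\; \arg\min_{\theta}\;
\mathbb{E}\!\left[\,\bigl\|\,\Phi\bigl(g_{\theta}^{s}(x_s,\, y_{s-1})\bigr)
\;-\; \Phi(y_s)\,\bigr\|^{2}\,\right]
```

where the expectation runs over the known texture maps x_s and further texture maps y_{s−1}, y_s in the training set.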
- the additional hierarchical algorithms can be stored in a library of hierarchical algorithms, along with metadata relating to the sets of textures on which they were trained.
- This metadata can include, for example, the texture that the additional hierarchical algorithm was trained on, the resolutions of the training textures, and/or the texture map types output by the additional hierarchical algorithm.
- the generation of the texture maps can be performed in real time on an ad-hoc basis as the requirement to display new textures becomes apparent.
- the textures are then generated when, for example, a player or viewer first sees them, rather than having exact textures pre-determined. As a result, a completely new look can be created each time the texture is generated, which can result, for example, in a different game world every time a player starts a new game.
- FIG. 4 illustrates an embodiment of a texture generation process for generating extensions of a known texture.
- texture maps at different resolutions ( 401 and 403 ) have previously been generated using any of the embodiments described in FIGS. 1 to 3 .
- a new texture map region 405 is requested for generation that partially overlaps with a previously generated texture map 403 at a particular resolution.
- the corresponding overlapping sections of the texture map at coarser resolutions 407 are determined, along with the corresponding sections of initialisation data used to generate the higher resolution texture maps from the lower resolution texture maps 409 .
- the initialisation data used to generate the previously generated texture maps was stored so that the previous initialisation data can be reused in this method.
- the complete coarser-scale texture maps required to generate the requested texture map will not all have been generated. These are therefore generated using a new first set of initialisation data 411 that is identical to the original set of initialisation data in the overlap regions 413 .
- This new first set of initialisation data 411 is input into the first hierarchical algorithm 415 , which outputs a new first texture map 417 that is identical to the previously generated first texture map in the overlapping regions 407 , but contains newly generated textures in the non-overlapping regions.
- the new first texture map 417 is then input into a second hierarchical algorithm 419 , along with an additional set of new initialisation data 421 that is identical in the overlapping regions 409 to the initialisation data used in generating the second texture map 423 .
- This is the initialisation data on which the requested new texture map directly depends.
- the non-overlapping regions contain newly generated additional initialisation data, such as new random noise.
- the output of this second hierarchical algorithm 419 is a second new texture map 405 that is identical to the previously generated second texture map 403 in the overlapping regions 425 , but contains newly generated textures in the non-overlapping regions.
- the process can be repeated iteratively to generate newly requested texture maps, at any of the possible resolution scales, that are identical to the previously calculated texture maps in the overlapping regions.
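The patent guarantees seam-free extension by storing the original initialisation data and reusing it in the overlap regions. One lightweight way to sketch the same guarantee (an alternative to explicit storage, shown here only for illustration) is to derive each region's initialisation noise deterministically from its coordinates, so any overlapping request regenerates identical values:

```python
import random

def initialisation_noise(tile_x, tile_y, seed=1234, size=16):
    # Derive a tile's initialisation data deterministically from its
    # coordinates: a later request that overlaps this tile reproduces
    # exactly the same noise, so the regenerated texture matches the
    # previously generated texture in the overlap region.
    rng = random.Random(f"{seed}:{tile_x}:{tile_y}")
    return [rng.gauss(0, 1) for _ in range(size)]

original = initialisation_noise(0, 0)      # used for the original texture map
regenerated = initialisation_noise(0, 0)   # overlap region of a new request
new_region = initialisation_noise(1, 0)    # fresh noise, non-overlapping region
```

Feeding such noise into the hierarchical algorithms yields outputs that agree wherever the input noise agrees, which is the property the method relies on.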
- the generation of the new texture map regions can be performed in real time on an ad-hoc basis as the requirement to extend the previously generated textures becomes apparent. For example, a player in a game is moving within a game world.
- the texture maps are first generated to render an object in the game world when it is first seen. However, as the view of the object changes, for example as the player moves past it, new textures are required to render previously unseen parts of the object that are consistent with the textures already applied to the object. The method will calculate these as required.
- the embodiment described in relation to FIG. 4 can be combined with any of the embodiments described in relation to FIGS. 1 to 3 .
- the further texture maps in FIG. 3 for the requested new texture map can be generated by determining the known texture maps in the overlapping regions.
- the hierarchical algorithms used can be trained to be texture specific. For example, a set of hierarchical algorithms may be trained to output a particular type of rock texture or a particular type of wood texture.
- the hierarchical algorithms can be stored in a library of hierarchical algorithms indexed by metadata, which includes the type of texture they are trained to output. In use, the hierarchical algorithms are selected from this library based on the desired texture to be generated.
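A library indexed by such metadata might look like the following sketch (entry names and fields are illustrative assumptions, not defined by the patent):

```python
# Hypothetical library of trained hierarchical algorithms, indexed by the
# metadata described above (texture type, training resolutions, output maps).
LIBRARY = [
    {"algorithm": "rock_v1", "texture_type": "rock",
     "resolutions": (256, 512), "outputs": ["albedo", "normal"]},
    {"algorithm": "wood_v2", "texture_type": "wood",
     "resolutions": (512, 1024), "outputs": ["albedo", "height"]},
]

def select_algorithms(texture_type, required_output=None):
    # Select algorithms trained for the desired texture, optionally
    # restricted to those trained to output a given texture-map type.
    return [
        entry["algorithm"]
        for entry in LIBRARY
        if entry["texture_type"] == texture_type
        and (required_output is None or required_output in entry["outputs"])
    ]
```

At run time, `select_algorithms("rock")` would retrieve every algorithm trained on the desired texture, matching the selection step described in the text.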
- the hierarchical algorithms can be used in a feed forward fashion, such as a pre-trained texture network.
- the hierarchical algorithms are trained to output texture maps when given an input (e.g. a texture/random noise).
- One hierarchical algorithm/network is trained per texture.
- the input is run through the network in a feed-forward fashion to get an output.
- a texture and/or random noise pattern may be used as the given input, and when the hierarchical algorithm has been trained for the texture or random noise pattern, the input may be provided to said algorithm in a feed-forward fashion.
- the hierarchical algorithms can be used as part of an iterative optimisation process, where textures are generated to represent properties of given existing textures as an input.
- at least 2 inputs are required, which may be representative of one or more textures and/or one or more random noise patterns.
- the output is iteratively optimised at run time to have similar features to the input texture map(s) producing a synthesised version of the content of any input visual data provided to the hierarchical algorithms.
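As a toy illustration of this run-time optimisation, the sketch below adjusts a synthesised texture until a simple statistic (the mean, standing in for the richer descriptor statistics a real implementation would match) agrees with that of the input texture map:

```python
def statistics(texture):
    # Stand-in for the descriptor statistics: mean and variance.
    n = len(texture)
    mean = sum(texture) / n
    var = sum((v - mean) ** 2 for v in texture) / n
    return mean, var

def optimise_texture(target, size=8, steps=200, lr=0.1):
    # Iteratively adjust a synthesised texture so its statistics approach
    # those of the input texture map (gradient descent on the squared
    # difference of means only, for brevity).
    target_mean, _ = statistics(target)
    texture = [0.0] * size
    for _ in range(steps):
        mean, _ = statistics(texture)
        grad = 2 * (mean - target_mean) / size  # d/dv of (mean - target_mean)**2
        texture = [v - lr * grad for v in texture]
    return texture

target = [0.2, 0.8, 0.5, 0.5]
synth = optimise_texture(target)
```

Note the synthesised output can differ in size and content from the input while still matching its statistics, which is what makes the result a synthesised version rather than a copy.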
- the methods described herein can be performed remotely from the output destination of the texture maps.
- the methods may also be performed on a distributed network.
- Any feature in one aspect may be applied to other aspects, in any appropriate combination.
- method aspects may be applied to system aspects, and vice versa.
- any, some and/or all features in one aspect can be applied to any, some and/or all features in any other aspect, in any appropriate combination.
- Some of the example embodiments are described as processes or methods depicted as diagrams. Although the diagrams describe the operations as sequential processes, operations may be performed in parallel, concurrently, or simultaneously. In addition, the order of operations may be re-arranged. The processes may be terminated when their operations are completed, but may also have additional steps not included in the figures. The processes may correspond to methods, functions, procedures, subroutines, subprograms, etc.
- Methods discussed above may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof.
- the program code or code segments to perform the relevant tasks may be stored in a machine or computer readable medium such as a storage medium.
- a processing apparatus may perform the relevant tasks.
- FIG. 5 shows an apparatus 500 comprising a processing apparatus 502 and memory 504 according to an embodiment.
- Computer-readable code 506 may be stored on the memory 504 and may, when executed by the processing apparatus 502 , cause the apparatus 500 to perform methods as described here, for example a method with reference to FIGS. 1 to 4 .
- the processing apparatus 502 may be of any suitable composition and may include one or more processors of any suitable type or suitable combination of types.
- the term “processing apparatus” should be understood to encompass computers having differing architectures such as single/multi-processor architectures and sequencers/parallel architectures.
- the processing apparatus may be a programmable processor that interprets computer program instructions and processes data.
- the processing apparatus may include plural programmable processors.
- the processing apparatus may be, for example, programmable hardware with embedded firmware.
- the processing apparatus may alternatively or additionally include Graphics Processing Units (GPUs), or one or more specialised circuits such as field programmable gate arrays (FPGAs), Application Specific Integrated Circuits (ASICs), signal processing devices, etc.
- processing apparatus may be referred to as computing apparatus or processing means.
- the processing apparatus 502 is coupled to the memory 504 and is operable to read/write data to/from the memory 504 .
- the memory 504 may comprise a single memory unit or a plurality of memory units, upon which the computer readable instructions (or code) are stored.
- the memory may comprise both volatile memory and non-volatile memory.
- the computer readable instructions/program code may be stored in the non-volatile memory and may be executed by the processing apparatus using the volatile memory for temporary storage of data or data and instructions.
- Examples of volatile memory include RAM, DRAM, and SDRAM.
- Examples of non-volatile memory include ROM, PROM, EEPROM, flash memory, optical storage, and magnetic storage.
- Methods described in the illustrative embodiments may be implemented as program modules or functional processes including routines, programs, objects, components, data structures, etc., that perform particular tasks or implement particular functionality, and may be implemented using existing hardware.
- Such existing hardware may include one or more processors (e.g. one or more central processing units), digital signal processors (DSPs), application-specific-integrated-circuits, field programmable gate arrays (FPGAs), computers, or the like.
- software implemented aspects of the example embodiments may be encoded on some form of non-transitory program storage medium or implemented over some type of transmission medium.
- the program storage medium may be magnetic (e.g. a floppy disk or a hard drive) or optical (e.g. a compact disk read only memory, or CD ROM), and may be read only or random access.
- the transmission medium may be twisted wire pair, coaxial cable, optical fibre, or other suitable transmission medium known in the art. The example embodiments are not limited by these aspects in any given implementation.
- a method for generating textures for use in rendering visual output comprising the steps of:
- one or more of the one or more further textures differs from the first texture in being a different class of texture map.
- the one or more sets of initialisation data comprises one or more initial sets of random noise.
- the one or more sets of noise parameters comprises at least one of: a random noise generator type; a random noise generator distribution; a seed or initialisation parameter; and/or one or more directional components and/or one or more noise generation procedures.
- a method according to example 12, wherein the one or more sets of further noise parameters comprises at least one of: a random noise generator type; a random noise generator distribution; a seed; and/or a directional component or procedure.
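The noise parameters enumerated above could be carried in a structure like the following (a hypothetical container, not defined by the patent); reusing the same seed reproduces the same initialisation data:

```python
from dataclasses import dataclass
import random

@dataclass
class NoiseParameters:
    generator_type: str = "gaussian"   # random noise generator type
    distribution: tuple = (0.0, 1.0)   # distribution parameters (mean, std dev)
    seed: int = 42                     # seed / initialisation parameter
    direction: tuple = (1.0, 0.0)      # optional directional component

def generate_noise(params, n=8):
    # Produce initialisation data according to the given parameters; the
    # same seed always yields the same noise sequence.
    rng = random.Random(params.seed)
    if params.generator_type == "gaussian":
        mu, sigma = params.distribution
        return [rng.gauss(mu, sigma) for _ in range(n)]
    return [rng.random() for _ in range(n)]

a = generate_noise(NoiseParameters(seed=7))
b = generate_noise(NoiseParameters(seed=7))
```

Storing such a parameter set (rather than the noise itself) is one way the initialisation data could be made reproducible for the texture-extension method described earlier.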
- a method according to any preceding example further comprising the step of storing the first texture and/or one or more further textures in a texture library after they have been generated.
- the first texture and/or one or more of the one or more further textures comprises one or more additional maps comprising at least one of: a normal map; a height map; a bump map; a displacement map; a specular map; or a depth map.
- a method according to any of examples 23 or 24, wherein the one or more additional maps and/or one or more further additional maps comprises at least one of: a normal map; a height map; a bump map; a displacement map; a specular map; or a depth map.
- the one or more further textures comprises a plurality of further textures.
- each of the hierarchical algorithms comprises at least one of: a nonlinear hierarchical algorithm; a neural network; a convolutional neural network; a layered algorithm; a recurrent neural network; a long short-term memory network; a multi-dimensional convolutional network; a memory network; or a gated recurrent network.
- a method of training hierarchical algorithms for use in texture generation comprising the steps of:
- the hierarchical algorithm comprises at least one of: a nonlinear hierarchical algorithm; a neural network; a convolutional neural network; a layered algorithm; a recurrent neural network; a long short-term memory network; a multi-dimensional convolutional network; a memory network; or a gated recurrent network.
- Apparatus comprising:
- at least one processor; and
- at least one memory including computer program code which, when executed by the at least one processor, causes the apparatus to perform the method of any one of examples 1 to 39.
- a computer readable medium having computer readable code stored thereon, the computer readable code, when executed by at least one processor, causing the performance of the method of any one of examples 1 to 39.
Abstract
Description
Claims (20)
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1608101.0 | 2016-05-09 | ||
GB1608101 | 2016-05-09 | ||
GBGB1608101.0A GB201608101D0 (en) | 2016-05-09 | 2016-05-09 | Multiscale 3D texture synthesis |
PCT/GB2017/051277 WO2017194921A1 (en) | 2016-05-09 | 2017-05-09 | Multiscale 3d texture synthesis |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/GB2017/051277 Continuation WO2017194921A1 (en) | 2016-05-09 | 2017-05-09 | Multiscale 3d texture synthesis |
Publications (2)
Publication Number | Publication Date |
---|---|
US20180122127A1 US20180122127A1 (en) | 2018-05-03 |
US10909743B2 true US10909743B2 (en) | 2021-02-02 |
Family
ID=56297396
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/856,759 Active 2037-05-15 US10909743B2 (en) | 2016-05-09 | 2017-12-28 | Multiscale 3D texture synthesis |
Country Status (5)
Country | Link |
---|---|
US (1) | US10909743B2 (en) |
EP (1) | EP3298587A1 (en) |
DE (1) | DE202017007534U1 (en) |
GB (1) | GB201608101D0 (en) |
WO (1) | WO2017194921A1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB201608101D0 (en) | 2016-05-09 | 2016-06-22 | Magic Pony Technology Ltd | Multiscale 3D texture synthesis |
US10628989B2 (en) * | 2018-07-16 | 2020-04-21 | Electronic Arts Inc. | Photometric image processing |
US11022861B2 (en) | 2018-07-16 | 2021-06-01 | Electronic Arts Inc. | Lighting assembly for producing realistic photo images |
CN110686633B (en) * | 2019-08-30 | 2021-09-14 | 深圳大学 | Landslide displacement prediction method and device and electronic equipment |
US11941780B2 (en) | 2020-05-11 | 2024-03-26 | Sony Interactive Entertainment LLC | Machine learning techniques to create higher resolution compressed data structures representing textures from lower resolution compressed data structures |
Citations (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5680475A (en) * | 1992-09-16 | 1997-10-21 | U.S. Philips Corporation | System for processing textured images, texture analyser and texture synthesizer |
US5889526A (en) * | 1994-11-25 | 1999-03-30 | Matsushita Electric Industrial Co., Ltd. | Interpolation apparatus and method, and image generation apparatus including such an apparatus |
US6292191B1 (en) | 1996-12-30 | 2001-09-18 | Cirrus Logic, Inc. | Dynamically selectable MIP map blending for a software graphics engine |
US6304268B1 (en) * | 1997-06-26 | 2001-10-16 | S3 Graphics Co., Ltd. | Trilinear texture filtering of two levels of detail based on a single level of detail |
US20030193496A1 (en) * | 2002-04-16 | 2003-10-16 | Sony Computer Entertainment Inc. | Image processing system, image processing method, semiconductor device, computer program, and recording medium |
US20040257364A1 (en) * | 2003-06-18 | 2004-12-23 | Basler Gregory A. | Shadow casting within a virtual three-dimensional terrain model |
US6853373B2 (en) * | 2001-04-25 | 2005-02-08 | Raindrop Geomagic, Inc. | Methods, apparatus and computer program products for modeling three-dimensional colored objects |
JP2005275797A (en) | 2004-03-24 | 2005-10-06 | Namco Ltd | Program, information storage medium, image generation system |
US20060038823A1 (en) * | 2004-08-20 | 2006-02-23 | Arcas Blaise A Y | System and method for upscaling low-resolution images |
US20070002071A1 (en) * | 2005-06-30 | 2007-01-04 | Microsoft Corporation | Parallel texture synthesis by upsampling pixel coordinates |
US20070002067A1 (en) * | 2005-06-30 | 2007-01-04 | Microsoft Corporation | Magnification of indirection textures |
US20070165035A1 (en) * | 1998-08-20 | 2007-07-19 | Apple Computer, Inc. | Deferred shading graphics pipeline processor having advanced features |
US20070223887A1 (en) * | 2005-09-09 | 2007-09-27 | Matsushita Electric Industrial Co., Ltd. | Image processing method, image recording method, image processing device and image file format |
US20080273042A1 (en) * | 2007-05-01 | 2008-11-06 | Giquila Corporation | Apparatus and method for texture level of detail computation |
US7710424B1 (en) * | 2004-11-18 | 2010-05-04 | Nvidia Corporation | Method and system for a texture-aware virtual memory subsystem |
US20110065506A1 (en) | 2009-09-15 | 2011-03-17 | Microsoft Corporation | Mega-mesh sculpting for environments |
US20110115806A1 (en) * | 2009-11-19 | 2011-05-19 | Rogers Douglas H | High-compression texture mapping |
US20130342553A1 (en) * | 2012-06-25 | 2013-12-26 | Intel Corporation | Texture mapping techniques |
US20140071124A1 (en) * | 2012-09-12 | 2014-03-13 | Fujitsu Semiconductor Limited | Image processing apparatus |
US20160358099A1 (en) * | 2015-06-04 | 2016-12-08 | The Boeing Company | Advanced analytical infrastructure for machine learning |
US9679362B2 (en) * | 2010-12-30 | 2017-06-13 | Tomtom Global Content B.V. | System and method for generating textured map object images |
WO2017194921A1 (en) | 2016-05-09 | 2017-11-16 | Magic Pony Technology Limited | Multiscale 3d texture synthesis |
US10726356B1 (en) * | 2016-08-01 | 2020-07-28 | Amazon Technologies, Inc. | Target variable distribution-based acceptance of machine learning test data sets |
2016
- 2016-05-09 GB GBGB1608101.0A patent/GB201608101D0/en not_active Ceased
2017
- 2017-05-09 WO PCT/GB2017/051277 patent/WO2017194921A1/en active Application Filing
- 2017-05-09 EP EP17723497.8A patent/EP3298587A1/en not_active Withdrawn
- 2017-05-09 DE DE202017007534.2U patent/DE202017007534U1/en active Active
- 2017-12-28 US US15/856,759 patent/US10909743B2/en active Active
Non-Patent Citations (8)
Title |
---|
Englert, et al, "A Model for Description and Synthesis of Heterogeneous Textures", Proceedings of the European Computer Graphics Conference, Sep. 4, 1989, 13 pages. |
Gatys, et al., "Texture Synthesis and the Controlled Generation of Natural Stimuli Using Convolutional Neural Networks", retrieved on Aug. 29, 2016 from https://pdfs.semanticscholar.org/ecef/28ddfcacf4f2ba9c22a3c8296d4e19322d3d.pdf, May 27, 2015, 9 pages. |
Gelenbe, et al., "Learning in the Multiple Class Random Neural Network", IEEE Transactions on Neural Networks, vol. 13, No. 6, Nov. 1, 2002, 23 pages. |
International Search Report and Written Opinion for PCT Application No. PCT/GB2017/051277, dated Aug. 8, 2017, 13 pages. |
Office Action for European Application No. 17723497.8, dated Jun. 10, 2020, 6 pages. |
Search Report for Application No. GB1608101.0, dated Oct. 27, 2016, 4 pages. |
Slot, et al., "Fast Generation of Natural Textures With Cellular Neural Networks-based Stitching", 2010 12th International Workshop on Cellular Nanoscale Networks and their Applications (CNNA), IEEE, Feb. 3, 2010, 4 pages. |
Yoo, et al., "Texture Enhancement for Improving Single-Image Super-Resolution Performance", Signal Processing: Image Communication, vol. 46, May 3, 2016, pp. 29-39. |
Also Published As
Publication number | Publication date |
---|---|
GB201608101D0 (en) | 2016-06-22 |
EP3298587A1 (en) | 2018-03-28 |
WO2017194921A1 (en) | 2017-11-16 |
US20180122127A1 (en) | 2018-05-03 |
DE202017007534U1 (en) | 2022-05-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10909743B2 (en) | Multiscale 3D texture synthesis | |
CN111489412B (en) | Semantic image synthesis for generating substantially realistic images using neural networks | |
Wang et al. | Efficient example-based painting and synthesis of 2d directional texture | |
US12125256B2 (en) | Generator exploitation for deepfake detection | |
US20110216976A1 (en) | Updating Image Segmentation Following User Input | |
Xu et al. | Designing one unified framework for high-fidelity face reenactment and swapping | |
US20210319090A1 (en) | Authenticator-integrated generative adversarial network (gan) for secure deepfake generation | |
Qian et al. | Aesthetic art simulation for embroidery style | |
US9013485B2 (en) | Systems and methods for synthesizing high fidelity stroke data for lower dimension input strokes | |
US10922852B2 (en) | Oil painting stroke simulation using neural network | |
Mousavi et al. | Ai playground: Unreal engine-based data ablation tool for deep learning | |
US20240193204A1 (en) | Adjusting attribution for content generated by an artificial intelligence (ai) | |
Ullah et al. | DSFMA: Deeply supervised fully convolutional neural networks based on multi-level aggregation for saliency detection | |
CN114565964A (en) | Emotion recognition model generation method, recognition method, device, medium and equipment | |
Wang et al. | Deep Learning-Based Scene Processing and Optimization for Virtual Reality Classroom Environments: A Study. | |
Ouyang et al. | Cartoon colorization with gray image generated from sketch | |
CN113763496A (en) | Image coloring method, device and computer readable storage medium | |
Mejjati et al. | Generating object stamps | |
US12086901B2 (en) | Generating digital paintings utilizing an intelligent painting pipeline for improved brushstroke sequences | |
Cui et al. | Traditional art design expression based on embedded system development | |
Zhang et al. | Zero-Shot Real Facial Attribute Separation and Transfer at Novel Views | |
Li et al. | Real-time image carrier generation based on generative adversarial network and fast object detection | |
Yang et al. | Instance-level image synthesis method based on multi-scale style transformation | |
Huang et al. | Research on Stylization of Landscape Drawing Rendering Based on Computer Vision Techniques | |
Huang et al. | Preset Emotional Experience: Adaptive Detection and Complexity Interpretation of Implicit Order in Facial Emotion Recognition Calculations |
Legal Events
Code | Title | Description
---|---|---
FEPP | Fee payment procedure | ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY
AS | Assignment | Owner name: MAGIC PONY TECHNOLOGY LIMITED, UNITED KINGDOM. ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: THEIS, LUCAS; WANG, ZEHAN; BISHOP, ROBERT DAVID; SIGNING DATES FROM 20180124 TO 20180130; REEL/FRAME: 044905/0882
STPP | Information on status: patent application and granting procedure in general | DOCKETED NEW CASE - READY FOR EXAMINATION
STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED
STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED
STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED
STPP | Information on status: patent application and granting procedure in general | RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP | Information on status: patent application and granting procedure in general | PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED
STCF | Information on status: patent grant | PATENTED CASE
CC | Certificate of correction |
FEPP | Fee payment procedure | MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY