
US20190347865A1 - Three-dimensional drawing inside virtual reality environment - Google Patents

Three-dimensional drawing inside virtual reality environment

Info

Publication number
US20190347865A1
US20190347865A1 (application US14/859,175; published as US 2019/0347865 A1)
Authority
US
United States
Prior art keywords
dimensional
user
reality environment
virtual reality
space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/859,175
Other languages
English (en)
Inventor
Patrick Ryan HACKETT
Andrew Lee SKILLMAN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Google LLC
Original Assignee
Google LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Google LLC
Priority to US14/859,175
Assigned to GOOGLE INC.: Assignment of assignors interest (see document for details). Assignors: SKILLMAN, ANDREW LEE; HACKETT, PATRICK RYAN
Assigned to GOOGLE LLC: Change of name (see document for details). Assignor: GOOGLE INC.
Publication of US20190347865A1
Legal status: Abandoned

Classifications

    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06F 30/00: Computer-aided design [CAD]
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/03543: Mice or pucks
    • G06F 3/03545: Pens or stylus
    • G06F 3/04815: Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/04842: Selection of displayed objects or displayed text elements
    • G06F 3/04845: Interaction techniques for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06T 11/001: Texturing; colouring; generation of texture or colour
    • G06T 13/20: 3D [Three Dimensional] animation
    • G06T 19/006: Mixed reality
    • G06T 19/20: Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06F 2113/12: Cloth (details relating to the application field)
    • G06T 2210/16: Cloth (indexing scheme for image generation or computer graphics)
    • G06T 2219/2012: Colour editing, changing, or manipulating; use of colour codes
    • G06T 2219/2021: Shape modification

Definitions

  • This description generally relates to the field of computer software and more specifically to the field of virtual reality computer software.
  • GUIs: Graphical User Interfaces
  • a computer-implemented method includes producing a representation of a display of a three-dimensional virtual reality environment and defining a plurality of virtual areas and at least one three-dimensional drawing plane within the virtual reality environment.
  • the method also includes providing a plurality of toolsets in the virtual reality environment.
  • the toolsets are configured to receive interactive commands from at least one two-dimensional input device coupled to a computing device and associated with a user.
  • the method also includes generating a three-dimensional drawing in at least one of the plurality of virtual areas. The drawing is generated according to a movement pattern and is depicted in the at least one virtual area as an object being drawn by the user on the three-dimensional drawing plane using the two-dimensional input device.
  • Generation of the drawing is in response to detecting a toolset selection and a movement pattern from the at least one two-dimensional input device.
  • the method further includes, in response to detecting additional movement patterns indicating a change to the drawing plane, tilting the at least one virtual area in a direction associated with at least one of the additional movement patterns to enable the user to generate a modified three-dimensional drawing.
  • the method additionally includes, in response to receiving a plurality of additional movement patterns indicating drawing motions, each drawing motion including at least one initial location, direction, and final location, generating one or more three-dimensional brush strokes, according to each drawing motion, using a tool selected from one of the toolsets.
  • Each of the one or more three-dimensional brush strokes is generated and displayed in real time in the at least one panel area and on the three-dimensional drawing plane according to the plurality of additional movement patterns.
  • the plurality of toolsets include a plurality of panels for drawing configured with selectable brushes, color hues, textures, visual effects, and movement controls.
  • the method can include presenting a color palette representation as a three-dimensional object for color selection.
  • Example implementations may include one or more of the following features.
  • the three-dimensional drawing plane is configured to be a planar drawing guide rotatable on at least three axes for the user to draw within the virtual reality environment.
  • the method may also include providing selectable portions on the three-dimensional drawing plane to enable the user to simulate moving to other virtual areas within the virtual reality environment surrounding the three-dimensional drawing to view another perspective of the three-dimensional drawing and to modify another perspective of the modified three-dimensional drawing.
  • the plurality of additional movement patterns are provided as hand gestures simulating brush strokes.
  • the at least one two-dimensional input device includes a mouse, a keyboard, a mobile device, a tablet pen, or any combination thereof.
  • the method may also provide a network interface for multiple computing devices to participate in the virtual reality environment shared by the multiple computing devices, wherein providing the network interface includes enabling multiple users each using at least one uniquely identified two-dimensional input device to collaborate in the virtual reality environment, create drawings in a shared virtual reality environment, and to affect change in the shared virtual reality environment.
  • the method may also include exporting the virtual reality environment for another computing device to provide access to the virtual reality environment for another user or system to navigate therein, or to print the virtual reality environment using a three-dimensional printer. Exporting is for filmmaking, rapid prototyping, game making, storytelling, or any combination thereof.
  • In another general aspect, a system includes a movement tracking module configured to detect location information pertaining to a plurality of user movements associated with a two-dimensional input device used to interface with a virtual reality environment and to generate triangular geometries for tracking position information within the virtual reality environment, the position information corresponding to an initial input location and a current input location for the two-dimensional input device, the triangular geometries being generated each time the two-dimensional input device is moved.
  • the triangular geometries include at least two triangles defining a three-dimensional starting point for a cursor of the two-dimensional input device, represented in the virtual reality environment, and a three-dimensional ending point for the cursor of the two-dimensional input device.
  • the system also includes a three-dimensional drawing plane including a planar drawing guide for receiving a first drawing generated by a user with the two-dimensional input device in the virtual reality environment.
  • the planar drawing guide is at a first angle and moveable to a second angle while the first drawing remains visible in the virtual environment at the first angle.
  • Upon being moved to the second angle, the planar drawing guide is adaptable to receive additional drawings at the second angle, the additional drawings being generated in a different plane than the first drawing and depicted as an overlay to the first drawing.
  • the three-dimensional drawing plane is configured to be a planar drawing guide rotatable on at least three axes for receiving drawing content within the virtual reality environment.
  • Example implementations may include one or more of the following features.
  • the three-dimensional drawing plane includes selectable portions to enable the user to simulate moving to a plurality of virtual areas within the virtual reality environment surrounding the first and the additional drawings to view another perspective of the first drawing and the additional drawings and to modify another perspective of the first drawing and the additional drawings.
  • FIG. 1 is a block diagram of an example system for providing a virtual reality environment (e.g., a VR space) in a three-dimensional environment in which a user can generate three-dimensional drawings.
  • FIG. 2 is a diagram that illustrates a head mounted display (HMD) device accessing VR content with a computing device in the VR space of FIG. 1 .
  • FIG. 3 is a perspective view of the object to be manipulated in virtual reality.
  • FIG. 4 is a perspective view of the computer keyboard and mouse input mechanisms.
  • FIG. 5 is a top-down view, showing a user pressing a key to modify the object in virtual reality.
  • FIG. 6 is a top-down view, showing a user releasing a key to revert modification of the object in virtual reality.
  • FIG. 7 is a top-down view, showing a user rotating the object in virtual reality by pressing a key and moving the computer mouse.
  • FIG. 8 is a top-down view, showing the direct relationship between computer mouse movement and object rotation.
  • FIG. 9 is a top down view, further explaining the relationship between computer mouse movement and object rotation.
  • FIG. 10 is a top-down view, showing a user pressing a key to modify the object in virtual reality.
  • FIG. 11 is a perspective view, showing a user translating the object in virtual reality by pressing a key and moving their head.
  • FIG. 12 is a perspective view, showing the direct relationship between user head movement and object translation.
  • FIG. 13 is a top-down view, showing a user pressing a button on the motion controller to modify the object in virtual reality, as well as a user releasing a button on the motion controller to revert modification of the object in virtual reality.
  • FIG. 14 is a top-down view, showing a user rotating the object in virtual reality by pressing a button on the motion controller and rotating the controller.
  • FIG. 15 is a top-down view, showing a user pressing a button on the smartphone to modify the object in virtual reality, as well as showing a user releasing a button on the smartphone to revert modification of the object in virtual reality.
  • FIG. 16 is a top-down view, showing a user rotating the object in virtual reality by pressing a button on the mobile device and rotating the mobile device.
  • FIG. 17 is a perspective view, showing a user translating the object in virtual reality by pressing a button on the mobile device and moving the mobile device.
  • FIGS. 18A-B are examples of various color spaces represented in two and three dimensions.
  • FIG. 19 is a perspective view of an example, three-dimensional color picker.
  • FIG. 20 is a screenshot of a functional prototype, three-dimensional color picker.
  • FIG. 21 is an example of a three-dimensional color picker augmented with additional information about the scene.
  • FIG. 22 is a perspective view of a GUI panel represented as a two-dimensional object in a virtual reality environment.
  • FIG. 23 is a perspective view, showing a user facing away from a GUI panel and the GUI panel is in a deactivated state.
  • FIG. 24 is a perspective view, showing a user facing a GUI panel and the GUI panel is in an active state.
  • FIG. 25 is a perspective view, showing the direct relationship between a user's two-dimensional input device and the location of a GUI panel in a virtual reality environment.
  • FIG. 26 is a perspective view, showing the direct relationship between the orientation of a user's three-dimensional input device and the orientation of a GUI panel in a virtual reality environment.
  • FIG. 27 is a perspective view of a user creating objects inside a virtual reality environment.
  • FIG. 28 is a perspective view of a user defining a set of objects as a frame.
  • FIG. 29 is a perspective view of a series of frames inside a virtual reality environment.
  • FIG. 30 is a perspective view of manipulation of a series of frames inside a virtual reality environment.
  • FIG. 31 is a perspective view of data associated with a frame inside a virtual reality environment.
  • FIG. 32 is a perspective view of playback of an ordered series of frames inside a virtual reality environment.
  • FIG. 33 is a perspective view of a user manipulating an invisible object inside a virtual reality environment.
  • FIG. 34 is a diagram showing a sequence of frames with associated data that can result in different outcomes during playback.
  • FIG. 35 represents perspective views of a user creating objects inside a virtual reality environment.
  • FIG. 36 represents perspective views of a user importing objects inside a virtual reality environment.
  • FIG. 37 is a perspective view of a head mounted display equipped with stereo cameras.
  • FIG. 38 is a side by side stereo view of the virtual reality environment.
  • FIG. 39 is a side by side stereo view of the virtual reality environment combined with the real world environment.
  • FIG. 40 is a side by side stereo view of the virtual reality environment combined with the real world environment, after a three-dimensional drawing has been created.
  • FIG. 41 is a side by side stereo view of the virtual reality environment with the created three-dimensional drawing, but with the real world environment turned off.
  • FIG. 42 is a perspective view of a user leaving the virtual reality tracking volume, and augmented reality elements being activated.
  • FIG. 43 is a perspective view of a user manually activating augmented reality.
  • FIG. 45 is a perspective view of object recognition activating augmented reality.
  • FIG. 46 is a perspective view of a tracking marker that drives augmented reality visibility.
  • FIG. 47 is a front view of specific elements of the stereo camera feed being rendered into the virtual reality environment.
  • FIG. 48 is a front view of additional data related to real-world objects being rendered into a virtual reality environment.
  • FIG. 49 is a perspective view of a user at a computer with a depth-sensing camera facing them.
  • FIG. 50 is a perspective view of a user connected to another user via direct cable or Internet.
  • FIG. 51 is a virtual reality view of a local and networked user represented by a three-dimensional point cloud inside the virtual reality environment.
  • FIG. 53 is a perspective view of a user at a computer with a light-sensing camera facing them.
  • FIG. 54 is a virtual reality view of a user represented by a two-dimensional mesh inside a virtual reality environment.
  • FIG. 55 is a virtual reality view of a networked user represented with stylized accessories.
  • FIG. 56 is a perspective view of a user at a computer, interacting with a virtual reality environment.
  • FIG. 57 is a perspective view of multiple users, all at computers, interacting with a virtual reality environment.
  • FIG. 58 is a perspective view of multiple users connected together via direct cable or through the Internet.
  • FIG. 59 is a perspective view of a user connected to another user, with their input modifying the virtual reality environment of the other user.
  • FIG. 60 is a perspective view of a user connected to another user, with their microphone recording sounds and playing it through the speakers of the other user.
  • FIG. 61 is a perspective view of a user receiving sounds in the virtual reality environment from a specific location.
  • FIG. 62 is a perspective view of the relationship between the position of a user and the sounds users hear from them in a virtual reality environment.
  • FIG. 63 is a flow chart diagramming one embodiment of a process to provide a virtual reality environment (e.g., a VR space) in a three-dimensional environment in which a user can generate three-dimensional drawings.
  • FIG. 64 is an example screenshot representing a drawing plane and a selected brush panel in a VR space.
  • FIG. 65 is an example screenshot representing a drawing plane and a selected color panel in a VR space.
  • FIG. 66 is an example screenshot representing a tilted drawing plane with user drawn content depicted in the plane and user drawn content in a second plane in a VR space.
  • FIG. 67 is an example screenshot representing user drawn content in a VR space.
  • FIG. 68 is another example screenshot representing user drawn content in a VR space.
  • FIG. 69 is another example screenshot representing user drawn content in a VR space.
  • FIG. 70 shows an example of a computer device and a mobile computer device that can be used to implement the techniques described herein.
  • Generating an environment in which a user can create three-dimensional drawings can include methods for controlling three-dimensional objects and content while inside a VR space, color representation and selection while inside the VR space, generating graphical user interfaces, sequencing frames for animation, augmenting existing three-dimensional virtual objects, augmenting real-world vision with computer graphics, representing a user's body while in a VR space, and collaborating amongst users and spectating while inside a VR space.
  • Content in the VR space can be generated and/or annotated by a user accessing toolsets (e.g., tool palettes) defined by the systems and methods described in this disclosure.
  • tool palettes may include brushes, cursors, panels, canvas simulators, shapes, surfaces, and texturizers, any or all of which can be used together to create and/or modify content in the VR space.
  • the tool palette can include mechanisms to import preexisting files containing 2D or 3D objects including, but not limited to images representing data, art, photographs, models, and/or augmented reality content.
  • a user may annotate portions of the VR space by accessing one or more tools to import images of objects and use the tools in the tool palettes to add content or modify those images by drawing, painting, scribbling, moving, illuminating or shadowing, or otherwise generating and manipulating portions of the images in the VR space.
  • the initially uploaded images and user-modified images can be manipulated around two or more axes during and after application of modifications/annotations.
  • such images can be shared with other users for review and/or collaboration.
  • Particular implementations described in this disclosure may enable a user to draw in 3D in the VR space.
  • the user can generate begin and end points by drawing in the air with an input device, such as a controller, a sensor, or another tracked input device.
  • the user can point and direct the tracked device such that portions of the VR space can be drawn upon (e.g., with brush strokes, objects, annotations, texturizers, etc.).
  • the systems described below can track the drawing motions of the user, generate artistic or annotated content based on those motions, and provide for moving around the content and the x-y plane or y-z plane (or other coordinate system) in which the content is being generated.
  • the user can lift the input device into the VR space (which can show the user her hands via the HMD device).
  • the user can begin to draw/paint on a selected surface normal (oriented in 3D space). If the user begins to draw a circle surrounding her body, the circle will appear from the input device as the user begins to draw the circle.
  • the input device may be depicted to the user as a paintbrush, pen, controller, or other selected tool.
  • the user can tilt the plane/surface normal to begin drawing in another vector space (e.g., another dimension or plane).
  • the user may complete the circle surrounding herself with a shape that appears to be a hula hoop. The user can move the hula hoop around within the VR space as she desires.
  • the systems and methods described in this disclosure can provide for importing objects into the VR space.
  • a user can upload objects into a system hosting a VR application.
  • the VR application can provide the objects for display in the VR space.
  • the display can be viewed by a user accessing an HMD device.
  • Imported objects can be used to provide a visual reference for a user beginning to draw in three-dimensions within the VR space.
  • the objects can be traced, or in some implementations, can be used as a guide in which the user can judge distances and shapes for recreating a drawing or other notation for the object.
  • the user can draw on the imported object to annotate portions of the object.
  • the imported objects may be 2D or 3D and can include 3D models, scans, mesh models, depth collages, etc.
  • Imported images can include any displayable file type including, but not limited to, a CAD file, a JPEG file, a PNG file, a bitmap file, or another file type.
  • the user can export images generated, modified, or otherwise changed within the VR space.
  • Example controls may include using a keyboard, a mouse, or a 3D controller to move a pointer.
  • the pointer may represent an area under a sketch tool depicted in the VR space.
  • the pointer may represent an area in which a sketch is being generated.
  • Example mouse motions can include using a left mouse click to draw, using a middle mouse click to pan along a VR space x-y plane, using a right mouse click to pan along world z-axis, and using a double click middle mouse button to reset the pointer to the center of a sketching surface.
  • Example keyboard keys that can control content in the VR space include holding the control key to rotate the sketching surface, holding the control key and left mouse button to rotate the sketching surface along a roll axis, holding the shift key to lock the sketching surface to a camera, using the caps lock key to toggle grid locked mode on the sketching surface, holding the tab key to adjust a brush size, pressing a spacebar to reset the sketching surface to the center of scene, double tapping the control key to reset the sketching surface orientation, selecting the (z) key to undo a stroke or action, pressing the (x) key to redo a stroke or action.
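  • The mouse and keyboard controls listed above amount to a binding table from input events to sketching-surface actions. The sketch below is a minimal, hypothetical illustration of such a table; the InputEvent type, action names, and dispatch function are assumptions for illustration and are not taken from the patent.

```python
# Hypothetical sketch of the mouse/keyboard bindings described above.
# The InputEvent type, action names, and dispatch() helper are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class InputEvent:
    kind: str              # "mouse" or "keyboard"
    button: str = ""       # e.g. "left_click", "middle_click", "double_middle"
    keys: tuple = ()       # e.g. ("ctrl",) or ("ctrl", "left_mouse")

MOUSE_BINDINGS = {
    "left_click": "draw_stroke",
    "middle_click": "pan_along_xy_plane",
    "right_click": "pan_along_world_z",
    "double_middle": "reset_pointer_to_surface_center",
}

KEY_BINDINGS = {
    ("ctrl",): "rotate_sketching_surface",
    ("ctrl", "left_mouse"): "rotate_surface_along_roll_axis",
    ("shift",): "lock_surface_to_camera",
    ("caps_lock",): "toggle_grid_locked_mode",
    ("tab",): "adjust_brush_size",
    ("space",): "reset_surface_to_scene_center",
    ("ctrl", "ctrl"): "reset_surface_orientation",  # double tap of the control key
    ("z",): "undo_stroke_or_action",
    ("x",): "redo_stroke_or_action",
}

def dispatch(event: InputEvent):
    """Map a raw input event to a sketching action name, or None if unbound."""
    if event.kind == "mouse":
        return MOUSE_BINDINGS.get(event.button)
    return KEY_BINDINGS.get(tuple(event.keys))

# Example: a left click while sketching maps to drawing a stroke.
assert dispatch(InputEvent(kind="mouse", button="left_click")) == "draw_stroke"
```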
  • Such controls can also be configured to translate a surface or object in the VR space while the particular surface or object is locked.
  • systems and methods described herein can be configured to detect and react to movements such as head tilt behavior and/or eye gaze behavior associated with a user and an HMD device being worn by the user.
  • the systems and methods can be used to detect and react accordingly to particular tool palettes generated for drawing in 3D space.
  • FIG. 1 is a block diagram of an example system for providing a virtual reality environment (e.g., a VR space) in a 3D environment in which a user can generate 3D drawings.
  • the system 100 may provide the 3D VR space, drawing tools, and VR content for a user to access, view, and interact with using the methods, components, and techniques described herein.
  • system 100 can provide the user with options for accessing the images, content, virtual objects, and VR controls using eye gaze, hand gestures, head movements, and/or other user-based movements within the VR space.
  • a user can generate 3D drawings in portions of the VR space and interact with such drawings using input devices, and tools configured to generate artistic drawings or annotations on drawings or other VR objects.
  • the example system 100 includes a plurality of computing devices that can exchange data over a network 101 .
  • the devices may represent clients or servers and can communicate via network 101 , or other network.
  • the client devices may include a mobile device, an electronic tablet, a laptop, a camera, a game controller, VR glasses or HMD device, or other such electronic device that may be used to access VR content.
  • the example system 100 includes a mobile device 102 , a game controller 103 , a laptop computing device 104 , head mounted display (HMD) device 106 , and VR drawing system 108 .
  • Devices 102 , 103 , 104 , and 106 may represent client devices.
  • Mobile device 102 , game controller 103 , laptop 104 , and HMD device 106 can include one or more processors and one or more memory devices.
  • the devices 102 - 106 can execute a client operating system and one or more client applications that can access, control, and/or display VR content on a display device included in each respective device.
  • the VR drawing system 108 may represent a server device.
  • VR drawing system 108 may include any number of repositories storing images, objects, content and/or virtual reality software modules that can generate, modify, or execute display of virtual reality scenes and content.
  • the HMD device 106 may represent a virtual reality headset, glasses, eyepiece, or other wearable device capable of displaying virtual reality content.
  • the HMD device 106 can execute a VR application 110 , which can playback received and/or processed images to a user.
  • the VR application 110 can be hosted by or interfaced with one or more of the devices 102, 103, 104, 106, or 108 shown in FIG. 1.
  • the mobile device 102 can be placed and/or located within the HMD device 106 .
  • the mobile device 102 can include a display device that can be used as the screen for the HMD device 106 .
  • the mobile device 102 can include hardware and/or software for executing the VR application 110 .
  • the devices 102 , 103 , 104 , 106 , and 108 can be laptop or desktop computers, smartphones, personal digital assistants, portable media players, tablet computers, gaming devices, or other appropriate computing devices that can communicate, using the network 101 , with other computing devices or computer systems.
  • the VR drawing system 108 can include a VR application 110 .
  • the VR application 110 can be configured to execute on or interface to any or all of devices 102 , 103 , 104 , 106 , and 108 .
  • the HMD device 106 can be connected to device 102 , device 103 , or device 104 to access VR content on VR drawing system 108 , for example.
  • Devices 102 - 104 can be connected (wired or wirelessly) to HMD device 106 , which can provide VR content for display and interactive drawing.
  • the connection may include use of one or more of the high-speed wireless communication protocols described herein.
  • a wired connection can include a cable with an appropriate connector on either end for plugging into devices 102 - 104 .
  • the cable can include a Universal Serial Bus (USB) connector on both ends.
  • USB connectors can be the same USB type connector or the USB connectors can each be a different type of USB connector.
  • USB connectors can include, but are not limited to, USB A-type connectors, USB B-type connectors, micro-USB A connectors, micro-USB B connectors, micro-USB AB connectors, USB five pin Mini-b connectors, USB four pin Mini-b connectors, USB 3.0 A-type connectors, USB 3.0 B-type connectors, USB 3.0 Micro B connectors, and USB C-type connectors.
  • the wired connection can include a cable with an appropriate connector on either end for plugging into the HMD device 106 and devices 102 - 104 .
  • the cable can include a Universal Serial Bus (USB) connector on both ends.
  • USB connectors can be the same USB type connector or the USB connectors can each be a different type of USB connector.
  • one or more content/drawing servers (e.g., VR drawing system 108) and one or more computer-readable storage devices can communicate with the computing devices 102 or 104 using network 101 to provide VR content and selectable drawing tools to the devices 102-106.
  • the network 101 can be a public communications network (e.g., the Internet, cellular data network, dialup modems over a telephone network) or a private communications network (e.g., private LAN, leased lines).
  • the computing devices 102 - 108 can communicate with the network 101 using one or more high-speed wired and/or wireless communications protocols (e.g., 802.11 variations, WiFi, Bluetooth, Transmission Control Protocol/Internet Protocol (TCP/IP), Ethernet, IEEE 802.3, etc.).
  • the mobile device 102 can execute the VR application 110 and provide content and drawing capabilities to a user accessing the VR space.
  • the laptop computing device 104 can execute the VR application 110 and can provide content and drawing capabilities to a user accessing the VR space.
  • the one or more servers and one or more computer-readable storage devices can communicate with the mobile device 102 and/or laptop computing device 104 using the network 101 to provide content and drawing capabilities for display in HMD device 106 .
  • the VR drawing system 108 includes a movement tracking module 112 that can be configured to track user position and motion within the VR space, as well as to track drawing content.
  • the movement tracking module 112 can employ a geometrical concept to determine user movement of input devices to generate drawing content and brush strokes, in particular.
  • the geometrical concept is described as a quad.
  • Quads can be generated and manipulated by quad generator 113 .
  • the quad generator may be configured to generate triangular geometries for tracking position information within the virtual reality environment.
  • the position information may correspond to an initial input location and a current input location for the three-dimensional input device.
  • the triangular geometries may be generated each time the three-dimensional input device is moved.
  • the quad generator can generate triangular geometries that are adapted to be combined to generate drawing content in the virtual reality environment.
  • the drawing content can be configured with a user-selected texture, color, and/or shade.
  • Quads can include at least two triangular geometries (i.e., triangles) that can be used to define positional information for the pointer object (e.g., represented as a brush tip or input mechanism position).
  • the triangular geometries include at least two triangles defining a three dimensional starting point for a cursor, represented in the virtual reality environment, and a three-dimensional ending point for the cursor.
  • the positional information can include a beginning pointer location and a current pointer location.
  • the system 100 can generate a quad and positional information corresponding to the quad.
  • the normal of one or both of the triangles that define the quad can be used to define a forward vector.
  • the normal of the pointer object represents the normal of a first triangle in the quad.
  • the normal of the pointer object in the current position represents the normal of the second triangle.
  • a right vector can be obtained by performing the cross product of the two normals.
  • Each movement the user makes can be used to generate quads, and each quad can be stitched or appended together to generate a smooth brushstroke (e.g., ribbon of color, texture, line drawing, or other object or artifact representing user movement when generating 3D drawing content in the VR space).
  • the look of a quad can be defined by the texture, material, color, and shade or luminance.
  • the texture is a property of the material, with the material being unique per brush and functioning to define the behavior the texture may have with respect to lighting in a VR space (e.g., scene).
  • the color of a quad is set per vertex, and defined by the user, as described in detail below.
  • the shade can be applied to the quad using various inputs from the VR space to modify the look of the quad. Inputs that can affect the shade include color, time, audio input, world space position, model space position, and light/luminance values, as described in detail below.
  • the movement tracking module 112 can include capability for head tracking.
  • the HMD device 106 can determine directions that a user's head is moving. The user can nod, turn, or tilt her head to indicate which tool to select, which panel to access, and/or which other functionality to invoke or revoke.
  • the movement tracking module 112 may also include capability for gaze tracking. Gaze tracking can interface with one or more sensors, computing systems, cameras, and/or controls to detect gaze/eye movement associated with the user while the user is in the VR space.
  • the one or more sensors, computing systems, cameras, and/or controls may be housed in HMD device 106 , for example.
  • the gaze tracking can track or monitor the direction of a user's eye gaze (i.e., tracking or monitoring where/which direction the user is looking). In general, gaze tracking may include tracking both the orientation and location of one eye or both eyes with respect to a defined coordinate system.
  • the user is in control of the pointer object.
  • the system 108 can record the pointer object position.
  • the system 108 can measure the difference from a previously recorded pointer object position and generate a new quad in response to the pointer object being moved by the user.
  • the generated new quad may represent two triangles, with the forward vector defined by the vector between the points, the pointer forward direction as the quad normal, and the cross product of those two defining the right-hand vector.
  • the width of the quad may be defined by the right-hand vector multiplied by the current brush size, which may be controlled by the user.
  • the system 108 can stitch the quads together to create a smooth, ribbon effect. Stitching the quads together may include matching a leading edge of a previous quad with a trailing edge of the current quad. Midpoint mathematical calculations can be used to ensure quad triangles do not fold in on other quad triangles. In addition, if the dot product of the forward vectors of two sequential quads is greater than an amount relative to the vector size multiplied by a scalar, the system 108 can trigger a break of the ribbon, which can begin a new sequence of quads. In some implementations, smoothing algorithms can be applied to the normals of sequential quads to generate a consistent look to the ribbons/brushstrokes.
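  • As a concrete illustration of the quad construction and stitching just described, the sketch below builds one quad from the previous and current pointer positions (forward vector between the points, right-hand vector from the cross product with the pointer normal, width scaled by brush size) and stitches it onto a ribbon, starting a new ribbon when the direction changes sharply, which is one plausible reading of the dot-product break test described above. The exact threshold, data layout, and function names are assumptions made for illustration, not the patent's implementation.

```python
# Minimal sketch (with assumed names and thresholds) of the quad/ribbon construction
# described above, using plain 3-vectors via numpy.
import numpy as np

BREAK_COS = 0.0  # assumed threshold: a sharp direction change starts a new quad sequence

def make_quad(prev_pos, curr_pos, pointer_normal, brush_size):
    """Build one quad (two triangles) between the previous and current pointer positions."""
    prev_pos, curr_pos = np.asarray(prev_pos, float), np.asarray(curr_pos, float)
    forward = curr_pos - prev_pos                                  # forward vector between the points
    right = np.cross(np.asarray(pointer_normal, float), forward)  # right-hand vector
    norm = np.linalg.norm(right)
    if norm == 0:
        return None
    half_width = right / norm * (brush_size * 0.5)                 # width scaled by current brush size
    corners = np.array([prev_pos - half_width, prev_pos + half_width,   # trailing edge
                        curr_pos - half_width, curr_pos + half_width])  # leading edge
    return corners, forward

def extend_ribbon(ribbons, quad):
    """Stitch a quad onto the current ribbon, or break and start a new ribbon."""
    corners, forward = quad
    if ribbons and ribbons[-1]:
        prev_corners, prev_forward = ribbons[-1][-1]
        cos = float(np.dot(forward, prev_forward)) / (
            np.linalg.norm(forward) * np.linalg.norm(prev_forward) + 1e-9)
        if cos < BREAK_COS:
            ribbons.append([])                                     # break: begin a new sequence of quads
        else:
            corners[0], corners[1] = prev_corners[2], prev_corners[3]  # match trailing to leading edge
    else:
        ribbons.append([])
    ribbons[-1].append((corners, forward))
    return ribbons

# Example: two small pointer moves produce two stitched quads in one ribbon.
ribbons = []
q1 = make_quad([0, 0, 0], [0.1, 0, 0], [0, 0, 1], brush_size=0.02)
q2 = make_quad([0.1, 0, 0], [0.2, 0.02, 0], [0, 0, 1], brush_size=0.02)
ribbons = extend_ribbon(extend_ribbon(ribbons, q1), q2)
```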
  • the system 108 may not stitch the quads together and instead may assign random orientations to a forward vector, which can function to generate a spray paint effect. Such an effect may be associated with particle brushes that can be selected from a brush tool palette.
  • instead of generating and stitching quads, the system 108 can generate billboard stripes.
  • the VR drawing system 108 also includes tool palettes 114 including, but not limited to panels 118 .
  • the VR application 110 can provide a number of panels 118 within the VR space.
  • Panels 118 may be represented as 2D or 3D interactive images in the VR space.
  • panels are 3D objects that can include a number of user-selectable controls or content.
  • the panels may be affixed to content in the VR space or may appear floating in the VR space and at the ready to receive user selections or simply provide visual guidance.
  • the panels may light up to indicate available tools, drawing planes, pointer locations, or other indicators that can trigger head or eye movement from a user.
  • a pointer object can be positioned on a panel and a user can move around the pointer object on the panel to select portions of the panel.
  • one or more panels may be positioned by the system 108 (or the user). For example, the user may select a first panel by grabbing the panel and moving it in the 3D VR space. The user may choose to make the first panel a focus and as such may center the panel in a central position. The user may choose to select another panel that hosts brushes and paint tools, for example, and may position that panel to a left or right side, depending on the user's handedness. Similarly, a third panel, such as a color panel 116 can be selected and moved around at the convenience of the user or at a default instantiated by the system 108 .
  • Color panel 116 may represent a three-dimensional tool palette configured to provide, in a VR space, at least one color palette menu represented as a three-dimensional cube in three-dimensional space.
  • the cube may include a two dimensional saturation area including a cross section of spaces representing an intensity for a number of different hues.
  • the intensity may define a degree to which each hue differs from white.
  • the intensity may be depicted numerically, graphically, textually, or in any combination of these forms.
  • the cube also includes a one-dimensional hue area including selectable hues. Upon selection of one of the hues, the color panel 116/cube may automatically adjust the two-dimensional saturation area to reflect a position of the selected hue in the three-dimensional cube.
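  • The hue/saturation behavior of the color cube described above can be illustrated with a small sketch: picking a value on the one-dimensional hue area regenerates the two-dimensional saturation cross-section for that hue. The function name, grid resolution, and use of the standard-library colorsys module are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical sketch of regenerating the 2D saturation area for a selected hue.
import colorsys

def saturation_cross_section(hue, steps=8):
    """Return a steps x steps grid of RGB triples for one hue slice of an HSV cube.

    Rows vary saturation; columns vary intensity (how far the color is from white/black).
    """
    grid = []
    for i in range(steps):           # saturation axis
        row = []
        for j in range(steps):       # intensity axis
            s = i / (steps - 1)
            v = j / (steps - 1)
            row.append(colorsys.hsv_to_rgb(hue, s, v))
        grid.append(row)
    return grid

# Example: the user selects a blue-ish hue on the hue strip; the panel would then
# redraw this cross-section as the selectable two-dimensional saturation area.
blue_slice = saturation_cross_section(hue=0.6)
```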
  • a number of panels can be configured to attach to a location of a 3D motion/position tracked controller.
  • the panels can be attached in a wand formation around the controller, for example.
  • the panels may be arranged around the controller (e.g., and the user's hand if the user is holding the controller) similar to a painter's palette, in which the user can rotate her hand to rotate the controller and trigger display of additional panels and options in a circular fashion.
  • the system 100 can detect a user eye gaze or head movement and use such input to activate/select a particular panel. For example, the user can nod toward a panel to activate the panel and begin moving, selecting, or otherwise interface with the panel.
  • the tool palettes 114 also include, but are not limited to brushes 120 .
  • Brushes 120 can apply to any tool used to generate drawings, objects, and/or content in the VR application 110 .
  • a brush may be defined by the material and shader associated with the brush.
  • the material optionally contains a texture.
  • the brush color is defined by the user.
  • the material, shader, texture, and color may define the look of a quad generated by a selected brush.
  • Toggle Grid Lock: toggles Grid Locked mode on the Sketching Surface.
  • Auto-Orient: automatically adjusts the orientation of the sketching surface after a rotation to ensure 'up' on the mouse is 'up' on the Sketching Surface.
  • the focal point of the exported .gif is the current center of the Sketching Surface.
  • Animated .gif files are saved in the Gifs folder
  • a user can select a constraining tool to paint or draw a particular shape.
  • One such example includes a straight edge tool that can be selected to provide a straight line from a beginning to ending point selected by the user and in the selected brush stroke.
  • Another example includes a mirror brush that can be selected to free form mirror a drawing that the user is actively drawing in the VR environment.
  • the mirror brush can mirror such a drawing left to right, top to bottom, or any other 2D or 3D mirroring angle.
  • the mirror brush can replicate to any number of axes.
  • the mirror brush can be set to mirror across axes such that a 3D mirrored drawing can be replicated across all three axes in 3D.
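  • As a sketch of the mirror-brush behavior described above, the code below replicates the points of a stroke across one or more axes through the origin: one axis gives a left/right (or top/bottom) mirror, and all three axes replicate the drawing across all three axes in 3D. The function name and axis convention are assumptions for illustration.

```python
# Hypothetical sketch of mirror-brush replication across selected axes.
from itertools import product

def mirror_points(points, axes=(0,)):
    """Replicate 3D stroke points across the given axes (0 = x, 1 = y, 2 = z)."""
    copies = []
    # Each combination of +1/-1 signs over the mirrored axes yields one replica
    # (the all-+1 combination is the original stroke).
    for signs in product(*[(1, -1)] * len(axes)):
        replica = []
        for p in points:
            q = list(p)
            for axis, sign in zip(axes, signs):
                q[axis] *= sign
            replica.append(tuple(q))
        copies.append(replica)
    return copies

stroke = [(0.10, 0.20, 0.00), (0.20, 0.30, 0.10)]
left_right = mirror_points(stroke, axes=(0,))      # 2 copies: original + mirrored
full_3d = mirror_points(stroke, axes=(0, 1, 2))    # 8 copies across all three axes
```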
  • the brush panel can include a brush selector in which brushes, patterns, colors, and textures can be selected for use in the VR space.
  • the selection mechanism can appear to the user in the VR space as a pointer that turns into a spherical shape indicating that selections are possible.
  • As the user positions the sphere to select a brush, for example, the VR application 110 can provide a pop-up or tool tip message that indicates sizes, colors, and other attributes.
  • the user can switch between brushes using the brush panel that can appear and reappear in 3D space as requested by the user.
  • FIG. 2 is a diagram that illustrates an HMD device 106 (or VR device) accessing VR content with a mobile device 102 , for example.
  • a user 202 may be accessing VR drawing system 108 by interfacing with content in system 108 (with controller 103 ).
  • the user 202 may be accessing a color palette 204 and may be drawing content within a panel 206 .
  • Color palette 204 and panel 206 are shown as dotted line figures because the depicted content is provided within the VR space that the user 202 is viewing in HMD 106 .
  • the user 202 can put on the HMD device 106 by placing the device 106 over the eyes of the user 202 .
  • the HMD device 106 can interface with/connect to mobile device 102 and/or controller 103, for example, using one or more high-speed wired and/or wireless communications protocols (e.g., WiFi, Bluetooth, Bluetooth LE, USB, etc.) or by using an HDMI interface.
  • the connection can provide the content to the HMD device 106 for display to the user on a screen included in the device 106 .
  • One or more sensors can be included on controller 103 and can be triggered, by users accessing device 103 and HMD device 106 , to provide input to the VR space.
  • the sensors can include, but are not limited to, a touchscreen, accelerometers, gyroscopes, pressure sensors, biometric sensors, temperature sensors, humidity sensors, and ambient light sensors.
  • the controller 103 can use the sensors to determine an absolute position and/or a detected rotation of the controller 103 in the VR space that can then be used as input to the VR space.
  • the controller 103 may be incorporated into the VR space as a mobile phone, a paint brush, a pencil or pen, a drawing tool, a controller, a remote, or another object. Positioning of the controller 103 by the user when incorporated into the VR space can allow the user to position the mobile phone, paint brush, pencil or pen, drawing tool, controller, remote, or other object in the VR space.
  • one or more input devices can be used to access content and provide input to the VR space.
  • the input devices can include, but are not limited to, a touchscreen, a keyboard, one or more buttons, a trackpad, a touchpad, a pointing device, a mouse, a trackball, a joystick, a camera, and a microphone.
  • a user interacting with an input device can cause a particular action to occur in the VR space.
  • control mechanisms of objects in a virtual reality environment may not translate well to conventional input mechanisms.
  • Virtual reality environments are typically built in three dimensions, but conventional input mechanisms, like the computer keyboard and mouse, are built for two-dimensional environments.
  • the following describes a method of manipulating an object in a three-dimensional virtual reality environment with computer keyboard and two-dimensional input mechanisms.
  • a two-dimensional input mechanism can be, but is not limited to, a computer mouse, a stylus, or a touchpad.
  • In FIG. 3, there is shown an object 302 that is viewable in a VR space 304.
  • FIG. 4 shows the conventional computer keyboard 402 and mouse 404 input mechanisms.
  • In FIG. 5, it is shown that the use of a key 502 on the keyboard 504 can cause a visible change on the object 506 in the VR space.
  • FIG. 6 shows the converse of this concept, in that when the use of the key 502 on the keyboard 504 has stopped, the visible change on the object will cease to be visible, as shown by unselected object 508 .
  • In FIG. 7, it is shown that movement of the mouse 702 while a key 704 on the keyboard 708 is pressed can rotate the object 710.
  • the amount of movement 802 of mouse 804a-c is directly related to the amount of rotation on the object, shown by object 806a and object 806b.
  • In FIG. 10, it is shown that the use of a key 1002 on the keyboard 1004 may cause a visible change on the object 1006 in the VR space. If the user were to lift her finger 1008, the object 1006 would be removed from view, illustrating the converse: when the key on the keyboard is no longer pressed, the visible change on the object ceases to be visible.
  • a drawing plane/cutting plane may be locked to the user's head position. In particular, if the user holds the shift key and leans back, the system 100 can bring the cutting plane forward. If the user holds the shift key on the keyboard and turns her head to the left, the cutting plane can be adapted to rotate leftward.
  • the system 108 can switch to a pivoting mode on the cutting plane.
  • the object and the plane depicting the object
  • the system 108 can generate a bisecting line on the cutting plane.
  • a movement (indicated by arrow 1102 ) of the user's head 1104 while a key 1106 on the keyboard 1108 is pressed can translate (e.g., move, tilt, change, etc.) an object 1110 in some fashion.
  • the amount of movement of the user's head is directly related to the amount of translation on the object.
  • the method by which the object 1110 is translated is shown in FIG. 12 and described as follows: the position of the user's head moves from the position shown at head position 1214 to the position shown at head position 1216 .
  • the position change from 1214 to 1216 is recorded when the use of the key on the keyboard begins.
  • the vector of change from this position change is used as input to the translation amount of the object in virtual reality.
  • the translation is applied to the object when the use of the key on the keyboard ends.
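  • A minimal sketch of this head-driven translation, under the assumption of a simple record-on-press, apply-on-release handler (the class and method names are illustrative, not from the patent):

```python
# Hypothetical sketch: record the head position when the key press begins and apply
# the resulting head-movement vector to the object as a translation when it ends.
import numpy as np

class HeadTranslation:
    def __init__(self):
        self.start_head = None

    def on_key_down(self, head_position):
        """Record the head position (e.g., position 1214) when the key press begins."""
        self.start_head = np.asarray(head_position, dtype=float)

    def on_key_up(self, head_position, object_position, gain=1.0):
        """Return the translated object position when the key press ends."""
        if self.start_head is None:
            return np.asarray(object_position, dtype=float)
        delta = np.asarray(head_position, dtype=float) - self.start_head  # vector of change
        self.start_head = None
        return np.asarray(object_position, dtype=float) + gain * delta

# Example: leaning forward 0.2 m along z while the key is held moves the object by 0.2 m.
handler = HeadTranslation()
handler.on_key_down([0.0, 1.7, 0.0])
new_pos = handler.on_key_up([0.0, 1.7, 0.2], object_position=[1.0, 1.0, 1.0])
```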
  • Example use cases of the object control with two-dimensional input feature include the ability to position dialog windows in a 3D program, a sketching surface for drawing, building walls in an architecture tool, or game elements in a video game.
  • the keyboard and mouse can allow movement in three dimensions.
  • the keyboard represents a 1-dimensional input while the mouse represents a 2-dimensional input.
  • Because the VR space is 3D, combining the keyboard and mouse can allow for movement in all three dimensions.
  • a 2-dimensional drawing/cutting plane can be accessed in the 3D VR space and when a user moves the mouse around, the pointer of the mouse (and a pointer or beginning point for creating drawings in the VR space) can move around on that plane.
  • a drawing can be generated by clicking and dragging the mouse and additional movements and cutting plane can be accessed by holding down certain keys on the keyboard.
  • the orientation and position of a particular cutting plane can be manipulated using keystrokes and mouse movement and input.
  • a user can begin to draw a side of a building and then can rotate the cutting plane (e.g., tilt the cutting plane) and then begin to draw or paint additional content, which can appear in the VR space as if the user is generating/drawing/painting two sides of the building.
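  • One way to picture the cutting-plane manipulation described above is a small rotation update driven by mouse movement while a modifier key is held. The pixels-to-degrees mapping and axis choices below are assumptions; the patent does not specify them.

```python
# Hypothetical sketch: tilt the cutting/sketching plane's normal from mouse deltas
# while a modifier key (e.g., Ctrl) is held.
import numpy as np

def rotate_plane_normal(normal, mouse_dx, mouse_dy, degrees_per_pixel=0.25):
    """Yaw the plane normal for horizontal mouse motion and pitch it for vertical motion."""
    yaw = np.radians(mouse_dx * degrees_per_pixel)
    pitch = np.radians(mouse_dy * degrees_per_pixel)
    rot_y = np.array([[np.cos(yaw),  0.0, np.sin(yaw)],
                      [0.0,          1.0, 0.0        ],
                      [-np.sin(yaw), 0.0, np.cos(yaw)]])
    rot_x = np.array([[1.0, 0.0,            0.0           ],
                      [0.0, np.cos(pitch), -np.sin(pitch)],
                      [0.0, np.sin(pitch),  np.cos(pitch)]])
    tilted = rot_x @ rot_y @ np.asarray(normal, dtype=float)
    return tilted / np.linalg.norm(tilted)

# Example: while Ctrl is held, a mouse move of (40, -15) pixels tilts the plane,
# after which new strokes land on the re-oriented surface (e.g., a second building wall).
plane_normal = rotate_plane_normal([0.0, 0.0, 1.0], mouse_dx=40, mouse_dy=-15)
```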
  • the cutting plane may be locked to the user's head position.
  • For example, if the user holds the shift key and leans back, the system 100 can bring the cutting plane forward. If the user holds the shift key on the keyboard and turns her head to the left, the cutting plane can be adapted to rotate leftward.
  • the user may select a stylus and use the stylus as a two-dimensional input mechanism to interact similar to the mouse input described above.
  • a mobile device can be used to draw in the VR space and can function to determine orientation information during use and to communicate such orientation data and movement data to VR application 110 , for example.
  • a user can perform a pinch, a swipe, a shift, a tilt, or other input to signal the VR application about object movements within the VR environment.
  • tilting the mobile device to the right may cause the object being drawn to be tilted to the right.
  • tilting the mobile device to the right may begin a rotation of the object to the right.
  • the user may select a controller, such as controller 103 , configured to provide three-dimensional input.
  • the controller can generally determine orientation and position information within the VR space.
  • a user can select one or more buttons or controls on the controller to activate movement of objects and drawing content within the VR space.
  • In FIG. 13 there is shown an object 1302a that is viewable in a VR space and is modified according to user manipulation of controller 1304a.
  • the use of a button on controller 1304a-b may cause a visible change on the object in the VR space. FIG. 13 also shows the converse of this concept, in that when the use of the button on the motion controller 1304a has stopped, the visible change on the object (shown at 1302b) may cease to be visible.
  • rotation of a motion controller from 1404a to 1404b while a button on the motion controller is in use can rotate a VR object from 1402a to 1402b.
  • the amount of rotation 1406 on the motion controller may be directly related to the amount of rotation 1408 on the object 1402 b .
  • the method by which the object 1402a is rotated includes recording the orientation of the motion controller when the use of the button on the motion controller 1404a begins. The quaternion difference from this recorded orientation is used as input to the rotation amount of the object in the VR space. The rotation is applied to the object 1402a when the use of the button on the motion controller ends, as sketched below.
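A minimal sketch of the quaternion-difference approach, using SciPy's `Rotation` type; the event names are illustrative assumptions rather than the patent's implementation:

```python
from scipy.spatial.transform import Rotation as R

class ControllerRotateTool:
    """Rotate a VR object by the orientation change of a motion controller while a button is held."""

    def __init__(self, obj_rotation=R.identity()):
        self.obj_rotation = obj_rotation
        self.start_controller = None

    def on_button_down(self, controller_rotation):
        # Record the controller orientation when the button press begins.
        self.start_controller = controller_rotation

    def on_button_up(self, controller_rotation):
        # The quaternion difference between the current and recorded orientations
        # is the rotation amount applied to the object when the press ends.
        delta = controller_rotation * self.start_controller.inv()
        self.obj_rotation = delta * self.obj_rotation
        self.start_controller = None
        return self.obj_rotation


tool = ControllerRotateTool()
tool.on_button_down(R.from_euler("y", 0, degrees=True))         # orientation at 1404a
final = tool.on_button_up(R.from_euler("y", 30, degrees=True))  # orientation at 1404b
print(final.as_euler("xyz", degrees=True))                      # object rotated ~30 degrees about y
```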
  • buttons on the motion controller 1404a can cause a visible change on the object 1402a in the VR space.
  • the visible change on the object is shown by 1410.
  • Example use cases of the object control with three-dimensional input feature include the ability to position dialog windows in a 3D program, a sketching surface for drawing, building walls in an architecture tool, or game elements in a video game.
  • rotation 1602 of the mobile device from 1502a to 1502b, while a touch on the mobile device is held, will rotate the object from 1504a to 1504b.
  • the amount of rotation 1602 on the mobile device is directly related to the amount of rotation 1604 on the object.
  • the orientation of the mobile device is recorded when the use of the touch on the mobile device begins.
  • the quaternion difference from this rotation is used as input to the rotation amount of the object in virtual reality.
  • the rotation is applied to the object when the use of the touch on the mobile device ends.
  • In FIG. 17 it is shown that the use of a touch on the mobile device 1502a can cause a visible change on the object 1504a in the VR space.
  • the converse of this concept may also hold: when the touch on the mobile device has stopped (at 1502b), the visible change on the object may cease to be visible (at 1504b). Movement of the mobile device while a touch on the mobile device is held will translate the object. The amount of movement of the mobile device may be directly related to the amount of translation on the object.
  • Example use cases of the object control with smartphone feature include the ability to position dialog windows in a 3D program, a sketching surface for drawing, building walls in an architecture tool, or game elements in a video game.
  • Color is a three dimensional quantity but is traditionally represented on a two dimensional surface.
  • the following description includes a method of representing, and choosing, a color in a 3D VR space by representing that color in 3D.
  • Color can be a complex concept, and may commonly be reduced to a 3D color space for use in computer graphics. By defining a color space, colors can be identified numerically by particular coordinates. In virtual reality, true 3D color objects can be generated and manipulated.
  • In order to render the color space without obscuring colors inside the volume, users can select a two-dimensional slice by positioning a cross section inside the volume. Colors in front of the slice should not be represented, as they will obscure colors on the cross section. Colors behind the cross section can be rendered with partial, or full, transparency.
  • one way of positioning the cross section is to define its position along one of the color space axes 1902, 1904a, and 1904b.
  • Manipulating the value of one or more of these axes changes the position in 3D space of the cross section from cross section 1906 a to cross section 1906 b .
  • the colors on the cross section 1906b update accordingly.
  • a single 2D position on the cross section 1906 a combined with the value of that third axis 1904 a or 1904 b , fully describes the coordinate of a desired color.
  • the cross section 1906 b can also be positioned in other ways described above with respect to object control. For example, motion controller or head position can be used to manipulate the cross section.
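A small illustration of how a 2D pick on the cross section plus the slice position can fully describe a color, assuming an RGB cube color space (the patent does not fix a particular color space):

```python
def color_from_cross_section(u, v, slice_value, slice_axis=2):
    """Return an RGB triple for a point on a 2D cross section of an RGB color cube.

    u, v          -- 2D position on the cross section, each in [0, 1]
    slice_value   -- position of the cross section along the remaining axis, in [0, 1]
    slice_axis    -- which color axis (0=R, 1=G, 2=B) the cross section cuts across
    """
    color = [0.0, 0.0, 0.0]
    color[slice_axis] = slice_value
    remaining = [a for a in range(3) if a != slice_axis]
    color[remaining[0]] = u
    color[remaining[1]] = v
    return tuple(color)


# Moving the cross section (e.g., from 1906a to 1906b) changes slice_value;
# the 2D pick on the slice plus that value fully describes the chosen color.
print(color_from_cross_section(0.25, 0.75, slice_value=0.5))  # (0.25, 0.75, 0.5)
```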
  • the concept of a three dimensional color space, and color picker can be expanded to contain information more complex than just the three-dimensional color spaces.
  • colors that are in use in the VR drawing system 108 in the VR space are highlighted allowing users to visualize their color palette in three dimensions.
  • Hues 2004 are shown here on the right side of color space 2002 .
  • the hues range from light/bright to dark, which can be depicted as a color, shade, or numerical value. Sliding a hue slider (not shown) up and down can cause the color palette 2006a to physically (e.g., virtually) move in the VR space, as shown by growing color palette 2006b.
  • a screenshot 2100 shows a color palette 2100 that can be depicted in 3D and as a 3D colorbox (e.g., color volume).
  • the palette 2100 may appear 3D to the user and be modeled and depicted as a cross-sectioned space 2102 representing a color selector.
  • space 2102 may be represented as a cross-section of a cube that translates according to a user-selectable hue and then the texture on the cross-section updates color according to the position that it is cross-sectioning the cube.
  • a user can select a hue 2104 to begin painting the drawing 2106 and can reselect additional hues to change colors and begin drawing in the reselected hues, accordingly.
  • hues may be textures, rather than colors.
  • the quads described above can be generated with a number of textures.
  • a 3D geometry can be applied to the quads as the drawings are generated by a user.
  • brightness (e.g., ultraviolet numerical values)
  • the system 100 can be used to stretch ultraviolet hues from 0 (darkness) to 1 (sunlight brightness) across an entire brushstroke.
  • the system 100 can repeat a swatch of UV from 0 to 1 by resetting the hue to generate a repeating light swatch of quads.
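Reading the UV values above as texture coordinates laid along the stroke's quads (an interpretation, not code from the patent), a short sketch contrasts stretching one 0-to-1 swatch across the whole brushstroke with resetting it per quad:

```python
def stroke_u_coordinates(num_quads, repeat=False):
    """Texture u-coordinates for each quad in a brushstroke.

    repeat=False stretches one 0-to-1 swatch across the entire stroke;
    repeat=True resets the swatch on every quad, producing a repeating pattern.
    """
    coords = []
    for i in range(num_quads):
        if repeat:
            coords.append((0.0, 1.0))
        else:
            coords.append((i / num_quads, (i + 1) / num_quads))
    return coords


print(stroke_u_coordinates(4))               # [(0.0, 0.25), (0.25, 0.5), (0.5, 0.75), (0.75, 1.0)]
print(stroke_u_coordinates(4, repeat=True))  # four (0.0, 1.0) pairs, i.e. a repeating swatch
```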
  • Color may be represented on a cube in triangular form.
  • a bottom left corner of a portion of a cube may be a triangle in which one vertex (triangle tip) is colored one hue, while the remaining portions of the triangle fade to additional hues.
  • the vertex color of the triangles in each quad is shown as a hue/color that the user has selected.
  • the vertex may be tinted blue so that a brush stroke painted with such a hue-texture combination can be shown as blue.
  • Graphical User Interfaces (GUIs)
  • VR spaces are built in three dimensions, but conventional GUIs are built for two-dimensional screens.
  • the following is a method for displaying and interacting with a GUI in a VR space.
  • FIG. 22 shows a user 2202 accessing a VR space with an HMD device.
  • the content and panels shown floating around the user 2202 depicts an example of what the user would see within the VR space.
  • a relationship is shown between input devices (HMD device 2204, keyboard 2206, and mouse 2208) and a pointer 2210.
  • a GUI panel 2212 is shown and is represented as a two-dimensional object in the VR space.
  • the GUI panel 2212 contains selectable components (e.g., controls showing numbers 1 , 2 , 3 , and 4 ) as well as a representation of the user pointing device 2210 .
  • input from the user's two-dimensional input device can be mapped directly to the location of the pointing device on the two-dimensional object in the VR space, as shown by arrow 2214 .
  • GUI panel 2216 is shown as an example of a menu shown as a three-dimensional object in a VR space.
  • the user can select controls, buttons, or content shown on panel 2216 in a similar fashion to panel 2212 .
  • the user can move panels 2212 and 2216 and alternatively can move around panels 2212 and 2216 .
  • GUI panels can be, but are not limited to, two-dimensional objects or three-dimensional objects.
  • user input devices can be, but are not limited to, two-dimensional, like a computer mouse, three-dimensional, or one-dimensional, in the case of a keyboard key, as described in detail above.
  • a user 2302 is shown inside a VR space facing away (arrow 2304 ) from a GUI panel 2306 .
  • the GUI panel 2306 is in a ‘deactivated’ state, and as such, may be unresponsive to user inputs.
  • FIG. 24 shows the user 2302 inside a VR space, facing the GUI panel 2402.
  • the GUI panel 2402 is in an ‘active’ state, and may be responsive to user inputs.
  • the GUI panel 2306 has changed size when switching to the active state, shown by GUI panel 2402 .
  • Other characteristics the GUI panel can change when switching from deactivated to active include, but are not limited to, shape, color, orientation, ambient sound, and position.
  • the method by which a GUI panel switches from deactivated to active is as follows: the current user position and orientation is used to construct a ray, with the user position being the ray origin and the user facing direction being the ray direction. This ray is cast into the scene and checked for intersection with the GUI panels. Upon receiving a successful intersection with a panel, that panel switches from the deactivated state to the active state. In the event that the ray intersects with two panels, the panel nearest the ray origin may be selected. Once a panel is in the active state, it may remain in the active state until the current user orientation ray does not intersect with the panel. At that point the panel may switch from the active state to the deactivated state.
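A minimal sketch of the described gaze-ray activation test, treating each GUI panel as a bounded rectangle; the panel fields and helper name are illustrative assumptions:

```python
import numpy as np

def activate_panel(user_pos, user_forward, panels):
    """Return the panel hit first by the user's gaze ray, or None.

    Each panel is a dict with 'center', 'normal', 'right', 'up',
    'half_width', 'half_height' -- a bounded rectangle in world space.
    """
    origin = np.asarray(user_pos, float)
    direction = np.asarray(user_forward, float)
    direction = direction / np.linalg.norm(direction)

    best, best_t = None, np.inf
    for panel in panels:
        normal = np.asarray(panel["normal"], float)
        denom = direction.dot(normal)
        if abs(denom) < 1e-6:
            continue                      # ray parallel to panel plane
        t = (np.asarray(panel["center"]) - origin).dot(normal) / denom
        if t <= 0:
            continue                      # panel is behind the user
        hit = origin + t * direction
        local = hit - np.asarray(panel["center"])
        if (abs(local.dot(panel["right"])) <= panel["half_width"]
                and abs(local.dot(panel["up"])) <= panel["half_height"]
                and t < best_t):
            best, best_t = panel, t       # nearest intersected panel wins
    return best


panel = {"center": [0, 1.5, 2], "normal": [0, 0, -1], "right": [1, 0, 0],
         "up": [0, 1, 0], "half_width": 0.5, "half_height": 0.3}
print(activate_panel([0, 1.5, 0], [0, 0, 1], [panel]) is panel)  # True -> panel switches to active
```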
  • FIG. 25 shows a GUI panel in an active state, with the representation of the user's pointing device (e.g., the pointer 2502) visibly modified to show that it is in a transformation state.
  • the modification of the pointer 2502 to show that it is in a transformation state can be, but is not limited to, change of size, shape, color, orientation, position, or ambient sound.
  • the pointer 2502 can be switched to the transformation state by pressing a key on a keyboard, pressing a mouse button, or pressing a button on a three-dimensional input device.
  • the GUI panel 2504 can be transformed with additional input from the user.
  • FIG. 26 shows the user 2602 rotating the orientation of a GUI panel 2604 by rotating a three-dimensional input device 2606 while the pointer 2608 is in a transformation state. This can result in a rotation of panel 2604 to end up in a position shown by panel 2610 . It is also shown that the amount of rotation on the three-dimensional input device 2606 may be directly related to the amount of rotation on the GUI panel 2604 .
  • The types of transformations a user can apply to a GUI panel include, but are not limited to, translation, rotation, scale, color adjustment, ambient sound, or a state change relative to the function of the GUI panel as applied to the VR space.
  • Example use cases of the graphical user interface feature include the ability to position dialog windows in a 3D program, the ability to emulate multiple monitors inside a VR space, and the ability to organize items, such as photos or file folders, inside a VR space.
  • FIG. 27 shows a user 2702 creating a set of objects 2704 , 2706 in a VR space.
  • FIG. 28 shows the user 2702 defining this set of objects 2704 , 2706 as a frame.
  • the user 2702 can clear content from her workspace at anytime. For example, the user 2702 can clear the VR space of all objects by swiping them away or removing them to another area or memory location.
  • In FIG. 29 it is shown that a sequence of frames 2902 is displayed to the user in the VR space as a series of objects 1-5.
  • FIG. 30 shows these frames can be re-ordered, duplicated, removed, and otherwise manipulated, as shown by frames 3002 .
  • FIG. 31 shows these frames 2902 and/or 3002 each have data 3102 associated with them, including, but not limited to, a name, a display duration, and a reference to the next frame or possible frames to show.
  • Frames 3202 , 3204 , and 3206 represent VR space manipulation that the user performed, as shown in FIGS. 27 and 28 .
  • the frame playback may repeat from the beginning, stop, or have other behaviors, determined by the configuration of the VR space.
  • FIG. 33 shows a user 3302 manipulating the location of a sound object 3304 in the VR space (i.e., moving the sound object to location 3306 when played).
  • if the user 3302 defines a series of objects as a frame and one of the objects is a sound object, the sound can be played from that 3D position in the VR space when the frame is viewed during playback.
  • Other types of invisible frame objects include, but are not limited to, two-dimensional sound, particle effects, full-screen visual effects, request for input from user, input device haptic feedback, launching of an external application, network connection of another user, and loading of another sequence of frames.
  • frame 3402 can be combined with frame 3404 and frame 3408 to generate an animated sequence.
  • frame 3402 can be combined with 3406 and frame 3410 or 3412 to make additional animated sequences.
  • Other combinations are possible and the user can combine any or all data generated within the VR space.
  • Example use cases of the animation feature include the ability to create a slideshow of images for a presentation, tell a story through a sequence of frames like a comic book, show an animated character or scene, create a choose-your-own-adventure story, or connect users that are viewing the same type of material.
  • a user 3502 is shown creating and importing objects 3504 and 3506 inside the VR space.
  • the user can import a three-dimensional model 3504 into the environment from the user's computer, a networked location, or a location on the Internet.
  • In FIG. 36 it is shown that the user can create objects 3602 inside the VR space with an imported three-dimensional model 3504. These objects can be saved to a file on the user's computer with a reference to the imported three-dimensional model, so that if the file is loaded at another time or on another user's computer, the VR space can import the appropriate three-dimensional model.
  • the method by which the two-dimensional image is mapped to the three-dimensional objects is as follows: the two-dimensional image is placed at a location inside the three-dimensional VR space.
  • the two-dimensional image is given a three-dimensional bounding box, of appropriate size relative to the height and width of the image. This bounding box is not restricted from change by the user and follows the position and orientation of the image.
  • the textures on the object are replaced by the image, with UV coordinates relative to the world space pixel positions inside the bounding box, projected from the image.
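A small sketch of the described projection step, assuming the image's bounding box is described by an origin corner and two spanning axes (the names and layout are illustrative):

```python
import numpy as np

def project_world_to_uv(point, image_origin, image_right, image_up, width, height):
    """Project a world-space vertex into the image's UV space.

    image_origin     -- world position of the image's lower-left corner (assumed convention)
    image_right/up   -- unit vectors spanning the image plane
    width, height    -- world-space extents of the image
    Returns (u, v); values outside [0, 1] fall outside the image.
    """
    local = np.asarray(point, float) - np.asarray(image_origin, float)
    u = local.dot(np.asarray(image_right, float)) / width
    v = local.dot(np.asarray(image_up, float)) / height
    return u, v


# A vertex halfway across a 2 m x 1 m image placed at the origin maps to uv (0.5, 0.5).
print(project_world_to_uv([1.0, 0.5, 0.0], [0, 0, 0], [1, 0, 0], [0, 1, 0], 2.0, 1.0))
```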
  • the user 3502 can bring in images such as bitmap files, jpeg files, png files or a 3D models or images from another application.
  • the user can draw and/or annotate on those images.
  • the user can import blueprints for a house plan.
  • the user and his architect for example, can annotate and modify (in 2D and 3D) the blueprints in the VR space.
  • the user can scale the model to life-sized and walk around within the drawn model to get a feel for actual content spacing, etc.
  • the imported content can be shrunk or grown according to user input.
  • the user can share the experience with one or more users. Multiple users can access the VR space to view the imported files and can begin to annotate the imported objects for other users to view. Such annotations can be near real time and other users can spectate while a first user performs modifications or annotations on the imported objects.
  • the system 100 can store user-generated objects and modifications and mark-ups performed on such objects.
  • the object and the VR space can be stored.
  • the system 100 can save all input positions that correlate to user performed brush strokes, object annotations, object generations, etc.
  • the input positions may include information such as a position in the VR space and a pointer location.
  • the system 100 can store which brush or brush sizes generated particular content. Accordingly, when system 100 loads the stored sketch/object, rather than simply providing the sketch as a new object in the VR space, the system 100 understands how the content was created and so the system 100 can draw such content in real time. For markup/annotations, a thought process, flow, or reasoning can be gained because the content is generated in the VR space in the same way the original user generated the content.
  • annotations performed by users on a map may be useful to other users hiking near locations on the annotated map.
  • a hiker can access the VR space to view the annotated map to find amenities or locations before continuing in a specific direction.
  • Other example use cases of the mark up feature include the ability to add notations to three-dimensional objects such as maps, architecture, or 3D models, or to build depth into two-dimensional images.
  • the user cannot see real world objects, such as input devices or other humans, while in a three-dimensional VR space.
  • the following is a method for selectively providing a representation of the real world environment inside of a VR space based on various user triggered, or environmentally triggered, events.
  • FIG. 37 shows an example of the type of HMD hardware 3700 used to augment a virtual reality world with elements of the real world.
  • in this example, a pair of cameras is placed at the user's eye positions.
  • In FIG. 38 an example of a typical side-by-side stereo VR space is shown. This example depicts a left eye view and a right eye view as a user would see content in a VR space using an HMD device, for example.
  • FIG. 39 shows the VR space and real world environment combined into a single view. Rendering the VR space on top of the real world environment is a good approximation of the order objects should appear, but if depth data is available for the real world it should be properly sorted with the virtual environment.
  • In FIG. 40 it is shown that a user can trace real world objects 4002 in three dimensions in the virtual world. This is an example of life, or reference, drawing in three dimensions.
  • In FIG. 41 it is shown that the user or software can disable the real life augmented reality view, revealing just the three-dimensional representation of that real life environment that is captured in the drawing application.
  • the system 108 removed the real world table from the VR space after the user drew the dragon 4102 .
  • Various methods for activating the augmented reality environment are now shown, beginning with FIG. 42.
  • the user presses the spacebar to reveal the real world environment.
  • Any other combination of keys, in application menu options, voice commands, or other natural motion gestures could also manually activate the augmented reality environment.
  • FIG. 43 shows how the software activates the augmented reality environment when users leave, or come within a certain tolerance of, a defined boundary.
  • Another method for activating the augmented reality environment is shown in FIG. 44.
  • the user's view direction is used to reveal the real world environment as they turn away from forward direction defined by the application.
  • This also shows that the software can render augmented reality elements at partial opacity, for example when the user is looking ninety degrees away from the forward vector.
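A small sketch of angle-based partial opacity for the augmented reality layer; the 45- and 120-degree thresholds are illustrative assumptions, with ninety degrees giving partial opacity as described above:

```python
import numpy as np

def ar_opacity(view_dir, forward_dir, start_deg=45.0, full_deg=120.0):
    """Fade the real-world (AR) view in as the user turns away from the app's forward direction.

    Below start_deg the AR layer is invisible; by full_deg it is fully opaque.
    """
    v = np.asarray(view_dir, float)
    f = np.asarray(forward_dir, float)
    cos_angle = v.dot(f) / (np.linalg.norm(v) * np.linalg.norm(f))
    angle = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))
    return float(np.clip((angle - start_deg) / (full_deg - start_deg), 0.0, 1.0))


print(ar_opacity([0, 0, 1], [0, 0, 1]))   # 0.0 -> looking forward, no AR overlay
print(ar_opacity([1, 0, 0], [0, 0, 1]))   # 0.6 -> ninety degrees away, partial opacity
```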
  • In FIG. 45 a more sophisticated method of activating the augmented reality environment is shown.
  • the user looks down, and shape recognition is performed on the stereo camera video feed looking for a specific match, in this case a keyboard. If a keyboard shape is detected, then the augmented reality environment is shown. Because the location of the keyboard in the stereo camera feed is known due to the shape recognition step, it can selectively be shown while other elements of the augmented reality environment are not.
  • In FIG. 46 another method of activating the augmented reality environment is shown.
  • image recognition is performed looking for markers, or fiducials, in the stereo camera feed.
  • if a marker is detected, the augmented reality environment is shown. Similar to FIG. 45, with additional markers placed on the input devices, the software can estimate the screen space extents of those markers, and the input device can be shown while the remaining augmented reality environment is not.
  • In FIG. 46 it is also shown how the above techniques can be used to isolate specific objects in the real world, and augment the VR space with only those objects. It is noted that any three-dimensional positional or orientation information from real world objects can be used to selectively reveal those objects, via augmented reality, in the virtual world.
  • a properly tracked six-axis motion controller can have its position and size projected onto screen space—then used as a mask to reveal the real world video feed.
  • the inverse is also true. Given the three-dimensional position and/or orientation of real world objects identified via the stereo cameras, these objects can be rendered as virtual objects, inside the virtual world.
  • the software can render a representation of a keyboard or mouse, identified through stereo cameras or any other method, with additional controls and other metadata rendered alongside it, as shown in FIGS. 47 and 48 .
  • stereo video from the real world inside of virtual reality applications has a variety of valuable uses. Some examples include, but are not limited to: For safety—If the user gets out of the chair they should be able to see environment obstacles. For convenience—If the user wants to find an input device like the keyboard to compose an email, they can use AR to do so without removing their HMD.
  • For tracking: by tracking the position and orientation of fiducial and other markers placed on real world objects, representations of those objects can be brought into the virtual world. Alternatively, the stereo video feed of those objects can be brought into the virtual world.
  • a keyboard used by a flight simulator could have the input controls represented as virtual tags above the keyboard.
  • a user's physical body is not meaningfully represented in a VR space.
  • the following is a method of capturing a representation of a user's body suitable for display in a VR space.
  • the device used for capture can be, but is not limited to, a depth-sensing camera, a light-sensing camera of any wavelength, or any combination thereof.
  • a user in the VR space is outfitted with an HMD device for viewing virtual content and for detecting eye movements, head movements, etc.
  • an HMD device for viewing virtual content and for detecting eye movements, head movements, etc.
  • each user's head or full body can be represented and located within the VR space.
  • a depth scanning camera can be used to determine user locations and body part locations. From this information, the system 100 can attach wardrobe pieces, jewelry, sketched clothing, accessories, costumes, etc.
  • system 100 can track 6 degrees of freedom using 3D position tracked controllers in combination with the HMD device.
  • the VR drawing system 108 allows for generating brushstrokes in midair and in 3D, and those strokes can be roughly assigned to a portion of a user's body. When that portion of the user's body is transformed, rotated, walking around, the brushstrokes can move relative to the body part movement.
  • One example can include a first user represented in the VR space drawing on or near other users represented in the VR space.
  • the first user can add costumes, horns, clothing, wigs, objects, or other drawing object or content to other users in the VR space.
  • Another aspect of body representation can include allowing a user to draw with another body part, instead of holding a brush or tool in their hand.
  • the user may invoke entire body movements to draw content in the VR space.
  • In FIG. 49 there is shown a user at a computer in a VR space with a depth-sensing camera 4904 facing the user.
  • the depth-sensing camera 4904 is active and recording input.
  • In FIG. 50 there is shown another user 5002, at a different computer, connected to the first user via direct cable or Internet 5004.
  • In FIG. 51 and FIG. 52 there is shown a networked user 5102 in a VR space, with a local user 5104 displayed in that environment as a three-dimensional object.
  • the method by which the users are shown in the VR space is as follows: the depth-sensing camera facing the user is recording input and sending it to the other user's VR space via direct cable or Internet.
  • the VR space receiving the input displays a three-dimensional representation of the player.
  • FIG. 51 and FIG. 52 show two particular useful methods.
  • the users are shown as point clouds.
  • Each point represents a three dimensional x, y, and depth position detected by the camera. Depending on the light-sensing properties of the depth-sensing camera, these points could be colored to reflect the actual color of the user, or any other color for stylistic purposes.
  • the mesh is scaled and positioned according to the specifications of the VR space.
  • the users are shown as polygonal meshes, where each control point position defined by the x and y coordinates of the image and the z coordinate taken from the depth calculation. Certain polygonal edges are hidden in areas of the mesh where there is not enough information to accurately rebuild the three-dimensional shape at that point. Depending on the light-sensing properties of the depth-sensing camera, this mesh could be colored to reflect the actual color of the user, or any other color for stylistic purposes.
  • the mesh is scaled and positioned according to the specifications of the VR space.
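A minimal sketch of building control points from a depth image as described above (x and y from the image, z from the depth reading); the scale factor stands in for fitting the result to the VR space:

```python
import numpy as np

def depth_image_to_control_points(depth, scale=1.0):
    """Build point-cloud or mesh control points from a depth image.

    x and y come from the image coordinates, z from the depth value.
    Points with no depth reading (zero) are dropped; mesh edges touching
    them would be hidden, as described above.
    """
    h, w = depth.shape
    xs, ys = np.meshgrid(np.arange(w), np.arange(h))
    points = np.stack([xs, ys, depth], axis=-1).reshape(-1, 3).astype(float)
    valid = points[:, 2] > 0
    return points[valid] * scale


depth = np.array([[1.8, 1.9],
                  [0.0, 2.0]])   # one missing reading is dropped
print(depth_image_to_control_points(depth, scale=0.01))
```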
  • In FIG. 53 there is shown a user 5302 at a computer in a VR space with a light-sensing camera 5304 facing them.
  • the light-sensing camera 5304 is active and recording input.
  • In FIG. 54 there is shown a user in a VR space with another user displayed in that environment as a two-dimensional object.
  • the method by which the user is shown in the VR space is as follows: the light-sensing camera facing the user is recording input and sending it to the other user's VR space via direct cable or Internet.
  • the VR space receiving the input displays a two-dimensional mesh, with each control point position defined by the x and y coordinates of the image. Depending on the light-sensing properties of the camera, this mesh could be colored to reflect the actual color of the user, or any other color for stylistic purposes.
  • the mesh is scaled and positioned according to the specifications of the VR space.
  • the method for accessorizing a user is to obtain the transform (position/rotation/scale) of a body part, either by accessing that user's skeleton data if that information is provided by the camera API, or by identifying a specific, tracked feature (for example the head) if that information is provided by the camera API. Once the transform is obtained, any render-able entity can be parented to that transform with an appropriate child transform.
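A minimal sketch of the parenting step, using a toy transform type; a real engine would also propagate rotation, which is omitted here for brevity:

```python
import numpy as np

class Transform:
    """Minimal position/scale transform with parenting (illustrative only)."""

    def __init__(self, position=(0, 0, 0), scale=1.0, parent=None):
        self.position = np.asarray(position, float)
        self.scale = float(scale)
        self.parent = parent                      # body-part transform this entity follows

    def world_position(self):
        # A child follows its parent: parent position plus the scaled local offset.
        if self.parent is None:
            return self.position
        return self.parent.world_position() + self.parent.scale * self.position


# Obtain the head transform from tracked skeleton data (values here are made up),
# then parent a render-able "hat" entity to it with an appropriate child offset.
head = Transform(position=(0.0, 1.7, 0.0))
hat = Transform(position=(0.0, 0.12, 0.0), parent=head)
head.position += np.array([0.0, 0.0, 0.3])        # the user walks forward...
print(hat.world_position())                       # ...and the hat follows the head
```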
  • the player is stylized to be shown wearing a hat.
  • the transform is used to rescale the body representation of the head, shrinking it in size. This type of effect can be achieved by attaching a render-able object that uses distortion effects to shrink the objects that appear behind it, or by modifying the raw body representation data before the player is rendered.
  • the third example shows the player's body representation rendered using a stylized, shader-based effect.
  • Example use cases of the body representation feature are the ability to show a representation of multiple users inside a VR space, to allow a viewing space for three-dimensional users, and to support augmentation of a user while they are in a VR space. Users can also customize their appearance, and other users' appearance, by stylizing and accessorizing the body representations.
  • Collaboration and spectating are difficult in a VR space. Without networking, collaboration and spectating would require users to be in the same location, taking turns viewing the VR space.
  • the following is a method of allowing multiple users to be able to collaborate and spectate in a VR space.
  • a user may wish to share a drawing session or completed drawing with other users via networking.
  • the system 100 can enable positioning of a sharing user in the 3D VR space relative to another user who wishes to view the sharing user's content. If the sharing user wishes to speak while drawing, the system 100 can provide audio sounds from the portion of the VR space in which the sharing user is speaking.
  • if an instructor wishes to draw a solar system in the classroom to show size and distance information, she can do so and draw planets and a sun.
  • the students may be in the VR space viewing the objects that the instructor is creating.
  • the instructor may be standing near the sun and asking the students questions.
  • the questions can be provided as audio content emanating from the area where the instructor is standing.
  • FIG. 56 there is shown a user at a computer, interacting with a VR space.
  • FIG. 57 shows another user, or multiple users, at different computers, interacting with a VR space.
  • FIG. 56 shows the flow with which input is received by the input device of the user and interpreted by the VR space.
  • FIG. 58 shows the users have established a connection with their computers via direct cable or through the Internet.
  • the input generated from one user's input device is interpreted by the other users' VR space.
  • movement from the input device of the user has created an object in the other user's VR space.
  • the method by which the object has been created is as follows: the input generated by the user's input devices is recorded by the user's VR space and sent to the other users' VR space via direct cable or Internet.
  • the VR space receiving the input from the other users interprets this as direct input into its environment, acting as if the current user had modified the environment.
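A minimal sketch of serializing one user's input and replaying it in another user's VR space; the packet fields and the `handle_input` hook are illustrative assumptions, not an API from the patent:

```python
import json

def encode_input_event(device_id, position, orientation, button_state):
    """Serialize one input-device sample for sending to the other users' VR spaces."""
    return json.dumps({"device": device_id, "pos": position,
                       "rot": orientation, "buttons": button_state})

def apply_remote_event(vr_space, packet):
    """Interpret a received packet as if the local user had produced the input."""
    event = json.loads(packet)
    vr_space.handle_input(event["device"], event["pos"], event["rot"], event["buttons"])


class DemoSpace:
    def handle_input(self, device, pos, rot, buttons):
        print(f"remote {device}: pos={pos} buttons={buttons}")


packet = encode_input_event("brush-1", [0.1, 1.2, 0.4], [0, 0, 0, 1], {"draw": True})
apply_remote_event(DemoSpace(), packet)   # creates/extends an object in the local VR space
```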
  • In FIG. 60 it is shown that if a user is using a computer microphone and a connected user is using computer speakers or headphones, sounds recorded by the microphone are played through the computer speakers or headphones.
  • In FIG. 61 it is shown in further detail that the sound is played from a specific location inside the VR space, determined by the location of the user in that environment.
  • each player is given a specific location inside the VR space. This position can be augmented by movements the user makes in the real world.
  • the sounds are played through the other users' speakers at the augmented location of the user in the VR space.
  • Example use cases of this feature include the ability to teach a class, to read from a book to a group of people, or for multiple people to simultaneously share details on an idea inside a VR space.
  • FIG. 63 is a flow chart diagramming one embodiment of a process to provide a virtual reality environment (e.g., a VR space) in a three-dimensional environment in which a user can generate three-dimensional drawings.
  • the system 100 can produce a representation of a display of a three-dimensional virtual reality environment.
  • the system 100 can also define a number of virtual areas configured to be modified as well as at least one three-dimensional drawing plane within the virtual reality environment.
  • the system 100 can generate a 3D VR space with an interactive graphical user interface in which a user can generate (e.g., draw) 3D content (e.g., 3D objects) and interact with such content.
  • the system 100 can provide a plurality of toolsets in the virtual reality environment.
  • the toolsets may be configured to receive interactive commands from at least one two-dimensional input device coupled to a computing device and associated with a user.
  • the system 100 can generate a three-dimensional drawing in at least one of the plurality of virtual areas, in response to detecting a toolset selection and a movement pattern from at least one two-dimensional input device.
  • the at least one two-dimensional input device may include a mouse, a keyboard, a mobile device, a tablet pen, or any combination thereof.
  • the three-dimensional drawing may be generated according to the movement pattern and depicted in the at least one virtual area as an object being drawn on the three-dimensional drawing plane.
  • Input from a user operating the two-dimensional input device can cause the display of the drawing to be generated.
  • movement patterns and additional movement patterns may be provided as hand gestures simulating brush strokes.
  • the system 100 can tilt the at least one virtual area in a direction associated with at least one of the additional movement patterns to enable the user to generate a modified three-dimensional drawing. This may be in response to detecting additional movement patterns indicating a change to the drawing plane.
  • the system 100 can generate one or more three-dimensional brush strokes, in response to receiving a number of additional movement patterns indicating drawing motions.
  • Each drawing motion may include at least one initial location, direction, and final location.
  • Each brush stroke can be generated according to each drawing motion using a tool selected from one of the toolsets.
  • Each of the one or more three-dimensional brush strokes may be generated and displayed in real time in the at least one panel area and on the three-dimensional drawing plane according to the plurality of additional movement patterns.
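A minimal sketch of turning a sampled drawing motion into stroke quads; the fixed 'up' offset is a simplifying assumption, since the patent does not prescribe how the quads are oriented:

```python
import numpy as np

def stroke_quads(samples, width=0.02):
    """Turn sampled 3D pointer positions (initial location -> final location) into quad vertices.

    Each consecutive pair of samples becomes one quad, offset by half the brush
    width along a fixed 'up' vector for simplicity; a full implementation would
    orient the quads toward the viewer or along the stroke's motion.
    """
    up = np.array([0.0, 1.0, 0.0]) * (width / 2.0)
    quads = []
    for a, b in zip(samples[:-1], samples[1:]):
        a, b = np.asarray(a, float), np.asarray(b, float)
        quads.append([a - up, a + up, b + up, b - up])   # one quad per segment
    return quads


# Three samples of a drawing motion produce two quads rendered in real time.
print(len(stroke_quads([[0, 1, 0], [0.1, 1.05, 0], [0.2, 1.1, 0]])))  # 2
```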
  • the three-dimensional drawing plane may be configured to be a planar drawing guide rotatable on at least three axes for the user to draw within the virtual reality environment.
  • the method 6300 also includes providing selectable portions on the three-dimensional drawing plane to enable the user to simulate moving to other virtual areas within the virtual reality environment surrounding the three-dimensional drawing to view another perspective of the three-dimensional drawing and to modify another perspective of the modified three-dimensional drawing.
  • the method 6300 may also include having system 100 provide a network interface for multiple computing devices to participate in the virtual reality environment shared by the multiple computing devices, wherein providing the network interface includes enabling multiple users each using at least one uniquely identified two-dimensional input device to collaborate in the virtual reality environment, create drawings in a shared virtual reality environment, and to affect change in the shared virtual reality environment.
  • the method 6300 can include exporting the three-dimensional space for another computing device to provide access to the three-dimensional space for another user or system to navigate therein, or to print the three-dimensional space using a three-dimensional printer.
  • the exporting may be for filmmaking, rapid prototyping, game making, storytelling, or any combination thereof.
  • additional movement patterns can be received from a user.
  • the patterns may be from an input feed associated with the user of an input device, such as a game controller, a mobile device, a mouse, etc.
  • the systems described herein can tilt at least one virtual area in a direction associated with the movement pattern.
  • the VR application 110 in combination with the movement tracking module 112 can function to receive user input and trigger a tilting of a portion of the graphical user interface in the VR application 110 .
  • the tilt may correspond to a degree of user movement, for example.
  • the system 100 can, in response to receiving the user-generated input, generate one or more three-dimensional brush strokes using the tool selected from the toolset.
  • Each of the one or more three-dimensional brush strokes may be generated and displayed in real time in a panel area beginning at an initial location, generated toward a corresponding direction, and ending at a corresponding final location associated with each of the one or more three-dimensional brush strokes.
  • the number of virtual areas represent portions of the virtual reality environment configured to receive user input.
  • the toolset includes selectable input mechanisms to generate a brushstroke, generate a drawing, create an object, modify an object, delete an object, clone an object, import an object, or any combination thereof, in at least one virtual area in the three-dimensional space.
  • a two-dimensional representation of a physical body of the user can be converted into a three-dimensional representation in the three-dimensional space, and the three-dimensional representation can be animated in the three-dimensional space.
  • the conversion and animation may occur in response to detecting, with the input device, a movement of the physical body of the user, the movement associated with user interactions with content in the three-dimensional space.
  • the system 100 can then enable transformation of the three-dimensional representation based on the user interactions in the three-dimensional space, including rescaling, styling, and instantaneous accessorizing. This can allow the user to generate content in the VR space and then resize such content during a VR session or at a future VR session.
  • user content can be accessed and shared between other VR users.
  • the system 100 can provide a network interface for multiple computing devices to participate in the three-dimensional space shared by the multiple computing devices.
  • the interface can enable multiple users each using one or more uniquely identified input devices to collaborate in the three-dimensional space.
  • the interface can also allow users to interact with three-dimensional objects in a shared three-dimensional space and to affect change in the shared three-dimensional space.
  • the system 100 can generate a set of three-dimensional objects to represent a number of animations inside the three-dimensional space for the user to interact with.
  • FIG. 64 is an example screenshot 6400 representing a drawing plane and a selected brush panel in a VR space.
  • FIG. 65 is an example screenshot 6500 representing a drawing plane and a selected color panel in a VR space.
  • FIG. 66 is an example screenshot 6600 representing a tilted drawing plane with user drawn content depicted in the plane and user drawn content in a second plane in a VR space.
  • FIG. 67 is an example screenshot 6700 representing user drawn content in a VR space.
  • FIG. 68 is another example screenshot 6800 representing user drawn content in a VR space.
  • FIG. 69 is another example screenshot 6900 representing user drawn content in a VR space.
  • FIG. 70 shows an example of a generic computer device 7000 and a generic mobile computer device 7050 , which may be used with the techniques described here.
  • Computing device 7000 includes a processor 7002 , memory 7004 , a storage device 7006 , a high-speed interface 7008 connecting to memory 7004 and high-speed expansion ports 7010 , and a low speed interface 7012 connecting to low speed bus 7014 and storage device 7006 .
  • Each of the components 7002 , 7004 , 7006 , 7008 , 7010 , and 7012 are interconnected using various busses, and may be mounted on a common motherboard or in other manners as appropriate.
  • the processor 7002 can process instructions for execution within the computing device 7000 , including instructions stored in the memory 7004 or on the storage device 7006 to display graphical information for a GUI on an external input/output device, such as display 7016 coupled to high speed interface 7008 .
  • multiple processors and/or multiple buses may be used, as appropriate, along with multiple memories and types of memory.
  • multiple computing devices 7000 may be connected, with each device providing portions of the necessary operations (e.g., as a server bank, a group of blade servers, or a multi-processor system).
  • the memory 7004 stores information within the computing device 7000 .
  • the memory 7004 is a volatile memory unit or units.
  • the memory 7004 is a non-volatile memory unit or units.
  • the memory 7004 may also be another form of computer-readable medium, such as a magnetic or optical disk.
  • the high speed controller 7008 manages bandwidth-intensive operations for the computing device 7000 , while the low speed controller 7012 manages lower bandwidth-intensive operations.
  • the high-speed controller 7008 is coupled to memory 7004 , display 7016 (e.g., through a graphics processor or accelerator), and to high-speed expansion ports 7010 , which may accept various expansion cards (not shown).
  • low-speed controller 7012 is coupled to storage device 7006 and low-speed expansion port 7014 .
  • the low-speed expansion port, which may include various communication ports (e.g., USB, Bluetooth, Ethernet, wireless Ethernet), may be coupled to one or more input/output devices, such as a keyboard, a pointing device, a scanner, or a networking device such as a switch or router, e.g., through a network adapter.
  • Computing device 7050 includes a processor 7052 , memory 7064 , an input/output device such as a display 7054 , a communication interface 7066 , and a transceiver 7068 , among other components.
  • the device 7050 may also be provided with a storage device, such as a microdrive or other device, to provide additional storage.
  • Each of the components 7050 , 7052 , 7064 , 7054 , 7066 , and 7068 are interconnected using various buses, and several of the components may be mounted on a common motherboard or in other manners as appropriate.
  • the processor 7052 can execute instructions within the computing device 7050 , including instructions stored in the memory 7064 .
  • the processor may be implemented as a chipset of chips that include separate and multiple analog and digital processors.
  • the processor may provide, for example, for coordination of the other components of the device 7050 , such as control of user interfaces, applications run by device 7050 , and wireless communication by device 7050 .
  • Processor 7052 may communicate with a user through control interface 7058 and display interface 7056 coupled to a display 7054 .
  • the display 7054 may be, for example, a TFT LCD (Thin-Film-Transistor Liquid Crystal Display) or an OLED (Organic Light Emitting Diode) display, or other appropriate display technology.
  • the display interface 7056 may comprise appropriate circuitry for driving the display 7054 to present graphical and other information to a user.
  • the control interface 7058 may receive commands from a user and convert them for submission to the processor 7052 .
  • an external interface 7062 may be provided in communication with processor 7052 , so as to enable near area communication of device 7050 with other devices.
  • External interface 7062 may provide, for example, for wired communication in some implementations, or for wireless communication in other implementations, and multiple interfaces may also be used.
  • the memory 7064 stores information within the computing device 7050 .
  • the memory 7064 can be implemented as one or more of a computer-readable medium or media, a volatile memory unit or units, or a non-volatile memory unit or units.
  • Expansion memory 7074 may also be provided and connected to device 7050 through expansion interface 7072 , which may include, for example, a SIMM (Single In Line Memory Module) card interface.
  • expansion memory 7074 may provide extra storage space for device 7050 , or may also store applications or other information for device 7050 .
  • expansion memory 7074 may include instructions to carry out or supplement the processes described above, and may include secure information also.
  • expansion memory 7074 may be provided as a security module for device 7050 , and may be programmed with instructions that permit secure use of device 7050 .
  • secure applications may be provided via the SIMM cards, along with additional information, such as placing identifying information on the SIMM card in a non-hackable manner.
  • the memory may include, for example, flash memory and/or NVRAM memory, as discussed below.
  • a computer program product is tangibly embodied in an information carrier.
  • the computer program product contains instructions that, when executed, perform one or more methods, such as those described above.
  • the information carrier is a computer- or machine-readable medium, such as the memory 7064 , expansion memory 7074 , or memory on processor 7052 , that may be received, for example, over transceiver 7068 or external interface 7062 .
  • Device 7050 may communicate wirelessly through communication interface 7066 , which may include digital signal processing circuitry where necessary. Communication interface 7066 may provide for communications under various modes or protocols, such as GSM voice calls, SMS, EMS, or MMS messaging, CDMA, TDMA, PDC, WCDMA, CDMA2000, or GPRS, among others. Such communication may occur, for example, through radio-frequency transceiver 7068 . In addition, short-range communication may occur, such as using a Bluetooth, Wi-Fi, or other such transceiver (not shown). In addition, GPS (Global Positioning System) receiver module 7070 may provide additional navigation- and location-related wireless data to device 7050 , which may be used as appropriate by applications running on device 7050 .
  • Device 7050 may also communicate audibly using audio codec 7060 , which may receive spoken information from a user and convert it to usable digital information. Audio codec 7060 may likewise generate audible sound for a user, such as through a speaker, e.g., in a handset of device 7050 . Such sound may include sound from voice telephone calls, may include recorded sound (e.g., voice messages, music files, etc.) and may also include sound generated by applications operating on device 7050 .
  • the computing device 7050 may be implemented in a number of different forms, as shown in the figure. For example, it may be implemented as a cellular telephone 7080 . It may also be implemented as part of a smart phone 7082 , personal digital assistant, or other similar mobile device.
  • implementations of the systems and techniques described here can be realized in digital electronic circuitry, integrated circuitry, specially designed ASICs (application specific integrated circuits), computer hardware, firmware, software, and/or combinations thereof.
  • These various implementations can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
  • the systems and techniques described here can be implemented on a computer having a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to the user and a keyboard and a pointing device (e.g., a mouse or a trackball) by which the user can provide input to the computer.
  • Other kinds of devices can be used to provide for interaction with a user as well; for example, feedback provided to the user can be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user can be received in any form, including acoustic, speech, or tactile input.
  • the systems and techniques described here can be implemented in a computing system that includes a back end component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front end component (e.g., a client computer having a graphical user interface or a Web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back end, middleware, or front end components.
  • the components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include a local area network (“LAN”), a wide area network (“WAN”), and the Internet.
  • the computing devices depicted in FIG. 70 can include sensors that interface with a virtual reality headset (VR headset 7090 ).
  • one or more sensors included on a computing device 7050 or other computing device depicted in FIG. 70 can provide input to VR headset 7090 or in general, provide input to a VR space.
  • the sensors can include, but are not limited to, a touchscreen, accelerometers, gyroscopes, pressure sensors, biometric sensors, temperature sensors, humidity sensors, and ambient light sensors.
  • the computing device 7050 can use the sensors to determine an absolute position and/or a detected rotation of the computing device in the VR space that can then be used as input to the VR space.
  • the computing device 7050 may be incorporated into the VR space as a virtual object, such as a controller, a laser pointer, a keyboard, a weapon, etc.
  • Positioning of the computing device/virtual object by the user when incorporated into the VR space can allow the user to position the computing device to view the virtual object in certain manners in the VR space.
  • when the virtual object represents a laser pointer, the user can manipulate the computing device as if it were an actual laser pointer.
  • the user can move the computing device left and right, up and down, in a circle, etc., and use the device in a similar fashion to using a laser pointer.
  • a touchscreen of the computing device 7050 can be rendered as a touchpad in VR space.
  • a user can interact with the touchscreen of the computing device 7050 .
  • the interactions are rendered, in VR headset 7090 for example, as movements on the rendered touchpad in the VR space.
  • the rendered movements can control objects in the VR space.
  • one or more output devices included on the computing device 7050 can provide output and/or feedback to a user of the VR headset 7090 in the VR space.
  • the output and feedback can be visual, tactile, or audio.
  • the output and/or feedback can include, but is not limited to, vibrations, turning on and off or blinking and/or flashing of one or more lights or strobes, sounding an alarm, playing a chime, playing a song, and playing of an audio file.
  • the output devices can include, but are not limited to, vibration motors, vibration coils, piezoelectric devices, electrostatic devices, light emitting diodes (LEDs), strobes, and speakers.
  • the computing device 7050 may appear as another object in a computer-generated, 3D environment. Interactions by the user with the computing device 7050 (e.g., rotating, shaking, touching a touchscreen, swiping a finger across a touch screen) can be interpreted as interactions with the object in the VR space.
  • the computing device 7050 appears as a virtual laser pointer in the computer-generated, 3D environment.
  • as the user manipulates the computing device 7050 , the user in the VR space sees movement of the laser pointer.
  • the user receives feedback from interactions with the computing device 7050 in the VR space on the computing device 7050 or on the VR headset 7090 .
  • one or more input devices in addition to the computing device can be rendered in a computer-generated, 3D environment.
  • the rendered input devices (e.g., the rendered mouse, the rendered keyboard) can be used as rendered in the VR space to control objects in the VR space.
  • Computing device 7000 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers.
  • Computing device 7050 is intended to represent various forms of mobile devices, such as personal digital assistants, cellular telephones, smart phones, and other similar computing devices.
  • the components shown here, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed in this document.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Geometry (AREA)
  • Architecture (AREA)
  • Evolutionary Computation (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)
US14/859,175 2014-09-18 2015-09-18 Three-dimensional drawing inside virtual reality environment Abandoned US20190347865A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/859,175 US20190347865A1 (en) 2014-09-18 2015-09-18 Three-dimensional drawing inside virtual reality environment

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201462052338P 2014-09-18 2014-09-18
US14/859,175 US20190347865A1 (en) 2014-09-18 2015-09-18 Three-dimensional drawing inside virtual reality environment

Publications (1)

Publication Number Publication Date
US20190347865A1 true US20190347865A1 (en) 2019-11-14

Family

ID=57588130

Family Applications (2)

Application Number Title Priority Date Filing Date
US14/859,167 Active 2037-05-26 US10509865B2 (en) 2014-09-18 2015-09-18 Dress form for three-dimensional drawing inside virtual reality environment
US14/859,175 Abandoned US20190347865A1 (en) 2014-09-18 2015-09-18 Three-dimensional drawing inside virtual reality environment

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US14/859,167 Active 2037-05-26 US10509865B2 (en) 2014-09-18 2015-09-18 Dress form for three-dimensional drawing inside virtual reality environment

Country Status (8)

Country Link
US (2) US10509865B2 (ja)
EP (1) EP3350775A1 (ja)
JP (1) JP6553212B2 (ja)
KR (1) KR20170132840A (ja)
CN (1) CN107636585B (ja)
DE (1) DE112016004249T5 (ja)
GB (1) GB2555021B (ja)
WO (1) WO2017048685A1 (ja)

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160189427A1 (en) * 2014-12-31 2016-06-30 Immersion Corporation Systems and methods for generating haptically enhanced objects for augmented and virtual reality applications
US20190378334A1 (en) * 2018-06-08 2019-12-12 Vulcan Inc. Augmented reality portal-based applications
US10632682B2 (en) * 2017-08-04 2020-04-28 Xyzprinting, Inc. Three-dimensional printing apparatus and three-dimensional printing method
US10860090B2 (en) * 2018-03-07 2020-12-08 Magic Leap, Inc. Visual tracking of peripheral devices
US10909769B1 (en) * 2019-09-18 2021-02-02 Industry Academy Cooperation Foundation Of Sejong University Mixed reality based 3D sketching device and method
US10909876B2 (en) * 2018-02-05 2021-02-02 Envision Technologies, LLC Spray paint simulator and training aid
US10996831B2 (en) 2018-06-29 2021-05-04 Vulcan Inc. Augmented reality cursors
US11086392B1 (en) * 2019-04-09 2021-08-10 Facebook Technologies, Llc Devices, systems, and methods for virtual representation of user interface devices
US11095855B2 (en) * 2020-01-16 2021-08-17 Microsoft Technology Licensing, Llc Remote collaborations with volumetric space indications
US11194400B2 (en) * 2017-04-25 2021-12-07 Tencent Technology (Shenzhen) Company Limited Gesture display method and apparatus for virtual reality scene
US11226722B2 (en) * 2017-09-25 2022-01-18 Tencent Technology (Shenzhen) Company Limited Information interaction method and apparatus, storage medium, and electronic apparatus
US11232643B1 (en) * 2020-12-22 2022-01-25 Facebook Technologies, Llc Collapsing of 3D objects to 2D images in an artificial reality environment
US11257295B2 (en) * 2019-09-20 2022-02-22 Facebook Technologies, Llc Projection casting in virtual environments
US11262857B2 (en) * 2017-11-21 2022-03-01 Wacom Co., Ltd. Rendering device and rendering method
US11276206B2 (en) 2020-06-25 2022-03-15 Facebook Technologies, Llc Augmented reality effect resource sharing
US11422530B2 (en) * 2018-08-20 2022-08-23 Dell Products, L.P. Systems and methods for prototyping a virtual model
US11461973B2 (en) 2020-12-22 2022-10-04 Meta Platforms Technologies, Llc Virtual reality locomotion via hand gesture
US11468644B2 (en) 2019-09-20 2022-10-11 Meta Platforms Technologies, Llc Automatic projection type selection in an artificial reality environment
US20230005224A1 (en) * 2021-07-01 2023-01-05 Lenovo (Singapore) Pte. Ltd. Presenting real world view during virtual reality presentation
WO2023009492A1 (en) * 2021-07-27 2023-02-02 Apple Inc. Method and device for managing interactions directed to a user interface with a physical object
US11580700B2 (en) 2016-10-24 2023-02-14 Snap Inc. Augmented reality object manipulation
US20230089635A1 (en) * 2016-10-14 2023-03-23 Vr-Chitect Limited Virtual reality system and method
US11734886B1 (en) * 2021-02-18 2023-08-22 Splunk Inc. Interaction tools in networked remote collaboration
EP4156113A4 (en) * 2020-07-27 2023-11-29 Wacom Co., Ltd. METHOD EXECUTED BY COMPUTER, COMPUTER AND PROGRAM
US20230386158A1 (en) * 2022-05-31 2023-11-30 Snap Inc. Cross-modal shape and color manipulation
US11991222B1 (en) 2023-05-02 2024-05-21 Meta Platforms Technologies, Llc Persistent call control user interface element in an artificial reality environment
US12003585B2 (en) 2018-06-08 2024-06-04 Vale Group Llc Session-based information exchange
US12093501B2 (en) * 2022-06-13 2024-09-17 Illuscio, Inc. Systems and methods for interacting with three-dimensional graphical user interface elements to control computer operation
US12099693B2 (en) 2019-06-07 2024-09-24 Meta Platforms Technologies, Llc Detecting input in artificial reality systems based on a pinch and pull gesture
US12147616B2 (en) 2017-11-21 2024-11-19 Wacom Co., Ltd. Rendering device and rendering method

Families Citing this family (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
ES2597173B2 (es) * 2015-07-15 2017-10-16 Universidad Rey Juan Carlos Computer-implemented method, system and computer program product for simulating the behavior of woven textile at the yarn level
US10838502B2 (en) * 2016-03-29 2020-11-17 Microsoft Technology Licensing, Llc Sharing across environments
US20180096505A1 (en) * 2016-10-04 2018-04-05 Facebook, Inc. Controls and Interfaces for User Interactions in Virtual Spaces
GB2605299B (en) * 2016-10-14 2022-12-28 Vr Chitect Ltd Virtual reality system and method
CN107123152B (zh) * 2017-04-06 2023-01-06 Tencent Technology (Shenzhen) Company Limited Editing processing method and apparatus
US11094136B2 (en) * 2017-04-28 2021-08-17 Linden Research, Inc. Virtual reality presentation of clothing fitted on avatars
US11145138B2 (en) 2017-04-28 2021-10-12 Linden Research, Inc. Virtual reality presentation of layers of clothing on avatars
CN107273033A (zh) * 2017-06-28 2017-10-20 Wang Ruping Garment design method, apparatus and system
US10140392B1 (en) 2017-06-29 2018-11-27 Best Apps, Llc Computer aided systems and methods for creating custom products
US10930078B1 (en) * 2017-11-01 2021-02-23 Bentley Systems, Incorporated Techniques for improving perception of projections of subsurface features on a terrain surface in augmented reality
KR101964446B1 (ko) * 2017-11-28 2019-04-01 Funwave Co., Ltd. Art activity system using virtual reality
US20190236222A1 (en) * 2018-01-27 2019-08-01 Walmart Apollo, Llc System for augmented apparel design
US20190304154A1 (en) * 2018-03-30 2019-10-03 First Insight, Inc. Importation and transformation tool for utilizing computer-aided design files in a web browser or customized client interface
US20190339837A1 (en) * 2018-05-04 2019-11-07 Oculus Vr, Llc Copy and Paste in a Virtual Reality Environment
US10768426B2 (en) * 2018-05-21 2020-09-08 Microsoft Technology Licensing, Llc Head mounted display system receiving three-dimensional push notification
CN108898680B (zh) * 2018-05-30 2019-06-18 Beike Zhaofang (Beijing) Technology Co., Ltd. Method and apparatus for automatically correcting captured images in a virtual three-dimensional space
CN108919948A (zh) * 2018-06-20 2018-11-30 Zhuhai Kingsoft Online Game Technology Co., Ltd. Mobile-phone-based VR system, storage medium and input method
KR102083504B1 (ko) * 2018-06-22 2020-03-02 Z-Emotion Co., Ltd. Three-dimensional clothing modeling method and computer program taking environmental factors into account
US11227435B2 (en) 2018-08-13 2022-01-18 Magic Leap, Inc. Cross reality system
CN112805750B (zh) 2018-08-13 2024-09-27 Magic Leap, Inc. Cross reality system
US11232635B2 (en) * 2018-10-05 2022-01-25 Magic Leap, Inc. Rendering location specific virtual content in any location
US11030796B2 (en) * 2018-10-17 2021-06-08 Adobe Inc. Interfaces and techniques to retarget 2D screencast videos into 3D tutorials in virtual reality
CN111083391A (zh) * 2018-10-19 2020-04-28 Sunny Optical (Zhejiang) Research Institute Co., Ltd. Virtual-real fusion system and method thereof
US10867081B2 (en) 2018-11-21 2020-12-15 Best Apps, Llc Computer aided systems and methods for creating custom products
CN109584376B (zh) * 2018-12-03 2023-04-07 Guangdong University of Technology Composition teaching method, apparatus, device and storage medium based on VR technology
US11195308B2 (en) * 2018-12-05 2021-12-07 Sony Group Corporation Patcher tool
CA3115710A1 (en) 2018-12-10 2020-06-18 Quality Executive Partners, Inc. Virtual reality simulation and method
US11004269B2 (en) * 2019-04-22 2021-05-11 Microsoft Technology Licensing, Llc Blending virtual environments with situated physical reality
CN114721511A (zh) * 2019-04-24 2022-07-08 彼乐智慧科技(北京)有限公司 Method and apparatus for positioning a three-dimensional object
DE102020001882A1 (de) * 2019-04-26 2020-11-12 Mitutoyo Corporation Visualization apparatus and program
US11422669B1 (en) * 2019-06-07 2022-08-23 Facebook Technologies, Llc Detecting input using a stylus in artificial reality systems based on a stylus movement after a stylus selection action
CN110070777B (zh) * 2019-06-13 2021-07-09 Dalian Minzu University Hezhe fish-skin painting simulation training system and implementation method
US11494953B2 (en) * 2019-07-01 2022-11-08 Microsoft Technology Licensing, Llc Adaptive user interface palette for augmented reality
US11244516B2 (en) 2019-09-16 2022-02-08 Magic Leap, Inc. Object interactivity in virtual space
CN114586071A (zh) 2019-10-15 2022-06-03 Magic Leap, Inc. Cross reality system supporting multiple device types
WO2021076748A1 (en) 2019-10-15 2021-04-22 Magic Leap, Inc. Cross reality system with wireless fingerprints
WO2021076754A1 (en) 2019-10-15 2021-04-22 Magic Leap, Inc. Cross reality system with localization service
EP4052086A4 (en) 2019-10-31 2023-11-15 Magic Leap, Inc. EXTENDED REALITY SYSTEM PROVIDING QUALITY INFORMATION ABOUT PERSISTENT COORDINATE FRAMES
JP7525603B2 (ja) 2019-11-12 2024-07-30 Magic Leap, Inc. Cross reality system with localization service and shared location-based content
US11562542B2 (en) 2019-12-09 2023-01-24 Magic Leap, Inc. Cross reality system with simplified programming of virtual content
US11562525B2 (en) 2020-02-13 2023-01-24 Magic Leap, Inc. Cross reality system with map processing using multi-resolution frame descriptors
EP4104145A4 (en) 2020-02-13 2024-01-24 Magic Leap, Inc. CROSS-REALLY SYSTEM PRIORITIZING GEOLOCALIZATION INFORMATION FOR LOCALIZATION
US11410395B2 (en) 2020-02-13 2022-08-09 Magic Leap, Inc. Cross reality system with accurate shared maps
US11551430B2 (en) 2020-02-26 2023-01-10 Magic Leap, Inc. Cross reality system with fast localization
CN115803788A (zh) 2020-04-29 2023-03-14 Magic Leap, Inc. Cross reality system for large-scale environments
US11514203B2 (en) * 2020-05-18 2022-11-29 Best Apps, Llc Computer aided systems and methods for creating custom products
EP3926432A1 (en) * 2020-06-16 2021-12-22 Hexagon Geosystems Services AG Touch control of unmanned aerial vehicles
US11954268B2 (en) * 2020-06-30 2024-04-09 Snap Inc. Augmented reality eyewear 3D painting
CN111773669B (zh) * 2020-07-03 2024-05-03 Zhuhai Kingsoft Digital Network Technology Co., Ltd. Method and apparatus for generating a virtual object in a virtual environment
CN112306230A (zh) * 2020-09-10 2021-02-02 Shanghai Fengyuzhu Culture Technology Co., Ltd. Virtual spray-painting interactive exhibition device based on VR technology
US11157163B1 (en) * 2020-09-29 2021-10-26 X Development Llc Paintbrush-like techniques for determining fabricable segmented designs
JP7434134B2 (ja) * 2020-11-04 2024-02-20 SoftBank Corp. Data processing apparatus, program, and data processing method
CN112241993B (zh) * 2020-11-30 2021-03-02 Chengdu Perfect World Network Technology Co., Ltd. Game image processing method, apparatus and electronic device
CN114973922B (zh) * 2021-02-25 2024-03-15 Beijing Institute of Fashion Technology Virtual reality teaching system and method for three-dimensional garment draping
CN115034954A (zh) * 2021-02-25 2022-09-09 Beijing Institute of Fashion Technology Three-dimensional garment design and pattern-making system and method
CN114882153B (zh) * 2022-04-01 2024-09-24 NetEase (Hangzhou) Network Co., Ltd. Animation generation method and apparatus
CN115657851B (zh) * 2022-10-31 2023-08-29 Capital Normal University Three-dimensional drawing method and system based on two-handed operation in virtual reality
CN117830535A (zh) * 2024-01-15 2024-04-05 Inner Mongolia University of Technology VR system for displaying Mongolian robes

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AUPO850197A0 (en) 1997-08-11 1997-09-04 Silverbrook Research Pty Ltd Image processing method and apparatus (art30)
JP2000003376A (ja) 1998-06-15 2000-01-07 Toyobo Co Ltd Garment correction method and garment correction apparatus using the method
US6222465B1 (en) 1998-12-09 2001-04-24 Lucent Technologies Inc. Gesture-based computer interface
US6404426B1 (en) * 1999-06-11 2002-06-11 Zenimax Media, Inc. Method and system for a computer-rendered three-dimensional mannequin
CA2289413C (en) * 1999-11-15 2004-06-22 Public Technologies Multimedia, Inc. System and method for displaying selected garments on a computer-simulated mannequin
KR100403332B1 (ko) 2001-07-14 2003-11-07 (주)아트윈텍 Design method with a three-dimensional mapping function for the design of fashion goods
KR20030039970A (ko) 2001-11-16 2003-05-22 주식회사 해피투웨어 Business method for a clothing design collaboration service in virtual space, and computer-readable recording medium storing a program for implementing the same
FR2837593B1 (fr) 2002-03-22 2004-05-28 Kenneth Kuk Kei Wang Method and device for viewing, archiving and transmitting a garment model over a computer network
SE0203908D0 (sv) 2002-12-30 2002-12-30 Abb Research Ltd An augmented reality system and method
ES2211357B1 (es) * 2002-12-31 2005-10-16 Reyes Infografica, S.L. Computer-assisted method for designing garments.
JP2006215146A (ja) 2005-02-02 2006-08-17 Matsushita Electric Ind Co Ltd Image display device and image display method
CN101398942A (zh) * 2008-04-24 2009-04-01 Sun Yat-sen University Three-dimensional fitting simulation system
JP4879946B2 (ja) 2008-09-04 2012-02-22 Square Enix Co., Ltd. Three-dimensional design support apparatus and program
US8947455B2 (en) 2010-02-22 2015-02-03 Nike, Inc. Augmented reality design system
GB201102794D0 (en) * 2011-02-17 2011-03-30 Metail Ltd Online retail system
US8606645B1 (en) 2012-02-02 2013-12-10 SeeMore Interactive, Inc. Method, medium, and system for an augmented reality retail application
US9811854B2 (en) * 2013-07-02 2017-11-07 John A. Lucido 3-D immersion technology in a virtual store
US20150316579A1 (en) 2014-05-02 2015-11-05 Qualcomm Incorporated Motion direction determination and application

Cited By (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160189427A1 (en) * 2014-12-31 2016-06-30 Immersion Corporation Systems and methods for generating haptically enhanced objects for augmented and virtual reality applications
US20230089635A1 (en) * 2016-10-14 2023-03-23 Vr-Chitect Limited Virtual reality system and method
US11580700B2 (en) 2016-10-24 2023-02-14 Snap Inc. Augmented reality object manipulation
US11194400B2 (en) * 2017-04-25 2021-12-07 Tencent Technology (Shenzhen) Company Limited Gesture display method and apparatus for virtual reality scene
US10632682B2 (en) * 2017-08-04 2020-04-28 Xyzprinting, Inc. Three-dimensional printing apparatus and three-dimensional printing method
US20220100334A1 (en) * 2017-09-25 2022-03-31 Tencent Technology (Shenzhen) Company Limited Information interaction method and apparatus, storage medium, and electronic apparatus
US11809685B2 (en) * 2017-09-25 2023-11-07 Tencent Technology (Shenzhen) Company Limited Information interaction method and apparatus, storage medium, and electronic apparatus
US11226722B2 (en) * 2017-09-25 2022-01-18 Tencent Technology (Shenzhen) Company Limited Information interaction method and apparatus, storage medium, and electronic apparatus
US12147616B2 (en) 2017-11-21 2024-11-19 Wacom Co., Ltd. Rendering device and rendering method
US11681383B2 (en) 2017-11-21 2023-06-20 Wacom Co., Ltd. Rendering device and rendering method
US11262857B2 (en) * 2017-11-21 2022-03-01 Wacom Co., Ltd. Rendering device and rendering method
US11145218B2 (en) 2018-02-05 2021-10-12 Envision Technologies, LLC Spray paint simulator and training aid
US10909876B2 (en) * 2018-02-05 2021-02-02 Envision Technologies, LLC Spray paint simulator and training aid
US11181974B2 (en) * 2018-03-07 2021-11-23 Magic Leap, Inc. Visual tracking of peripheral devices
US20230195212A1 (en) * 2018-03-07 2023-06-22 Magic Leap, Inc. Visual tracking of peripheral devices
US11625090B2 (en) * 2018-03-07 2023-04-11 Magic Leap, Inc. Visual tracking of peripheral devices
US10860090B2 (en) * 2018-03-07 2020-12-08 Magic Leap, Inc. Visual tracking of peripheral devices
US11989339B2 (en) * 2018-03-07 2024-05-21 Magic Leap, Inc. Visual tracking of peripheral devices
US11195336B2 (en) 2018-06-08 2021-12-07 Vulcan Inc. Framework for augmented reality applications
US12003585B2 (en) 2018-06-08 2024-06-04 Vale Group Llc Session-based information exchange
US20190378334A1 (en) * 2018-06-08 2019-12-12 Vulcan Inc. Augmented reality portal-based applications
US10996831B2 (en) 2018-06-29 2021-05-04 Vulcan Inc. Augmented reality cursors
US11422530B2 (en) * 2018-08-20 2022-08-23 Dell Products, L.P. Systems and methods for prototyping a virtual model
US11086392B1 (en) * 2019-04-09 2021-08-10 Facebook Technologies, Llc Devices, systems, and methods for virtual representation of user interface devices
US12099693B2 (en) 2019-06-07 2024-09-24 Meta Platforms Technologies, Llc Detecting input in artificial reality systems based on a pinch and pull gesture
US10909769B1 (en) * 2019-09-18 2021-02-02 Industry Academy Cooperation Foundation Of Sejong University Mixed reality based 3D sketching device and method
US11947111B2 (en) 2019-09-20 2024-04-02 Meta Platforms Technologies, Llc Automatic projection type selection in an artificial reality environment
US11468644B2 (en) 2019-09-20 2022-10-11 Meta Platforms Technologies, Llc Automatic projection type selection in an artificial reality environment
US11257295B2 (en) * 2019-09-20 2022-02-22 Facebook Technologies, Llc Projection casting in virtual environments
US11095855B2 (en) * 2020-01-16 2021-08-17 Microsoft Technology Licensing, Llc Remote collaborations with volumetric space indications
US11276206B2 (en) 2020-06-25 2022-03-15 Facebook Technologies, Llc Augmented reality effect resource sharing
US12026802B2 (en) 2020-06-25 2024-07-02 Meta Platforms Technologies, Llc Sharing of resources for generating augmented reality effects
EP4156113A4 (en) * 2020-07-27 2023-11-29 Wacom Co., Ltd. METHOD EXECUTED BY COMPUTER, COMPUTER AND PROGRAM
US12001646B2 (en) * 2020-07-27 2024-06-04 Wacom Co., Ltd. Computer-implemented method, computer, and program for rendering a three-dimensional object in a virtual reality space
US11461973B2 (en) 2020-12-22 2022-10-04 Meta Platforms Technologies, Llc Virtual reality locomotion via hand gesture
US11232643B1 (en) * 2020-12-22 2022-01-25 Facebook Technologies, Llc Collapsing of 3D objects to 2D images in an artificial reality environment
US11734886B1 (en) * 2021-02-18 2023-08-22 Splunk Inc. Interaction tools in networked remote collaboration
US20230005224A1 (en) * 2021-07-01 2023-01-05 Lenovo (Singapore) Pte. Ltd. Presenting real world view during virtual reality presentation
WO2023009492A1 (en) * 2021-07-27 2023-02-02 Apple Inc. Method and device for managing interactions directed to a user interface with a physical object
US12094073B2 (en) * 2022-05-31 2024-09-17 Snap Inc. Cross-modal shape and color manipulation
US20230386158A1 (en) * 2022-05-31 2023-11-30 Snap Inc. Cross-modal shape and color manipulation
US12093501B2 (en) * 2022-06-13 2024-09-17 Illuscio, Inc. Systems and methods for interacting with three-dimensional graphical user interface elements to control computer operation
US12149511B2 (en) 2022-08-12 2024-11-19 Bank Of America Corporation Provisioning secured data access to authorized users through light fidelity (LiFi) data transmission and a virtual reality device
US11991222B1 (en) 2023-05-02 2024-05-21 Meta Platforms Technologies, Llc Persistent call control user interface element in an artificial reality environment

Also Published As

Publication number Publication date
GB2555021B (en) 2021-04-28
EP3350775A1 (en) 2018-07-25
WO2017048685A1 (en) 2017-03-23
DE112016004249T5 (de) 2018-06-07
US10509865B2 (en) 2019-12-17
US20160370971A1 (en) 2016-12-22
JP2018535458A (ja) 2018-11-29
JP6553212B2 (ja) 2019-07-31
GB2555021A (en) 2018-04-18
CN107636585B (zh) 2020-12-29
CN107636585A (zh) 2018-01-26
KR20170132840A (ko) 2017-12-04
GB201717379D0 (en) 2017-12-06

Similar Documents

Publication Publication Date Title
US20190347865A1 (en) Three-dimensional drawing inside virtual reality environment
TWI827633B (zh) System and method for a three-dimensional graphical user interface with broad usability, and corresponding readable medium
US11043031B2 (en) Content display property management
Linowes et al. Augmented reality for developers: Build practical augmented reality applications with unity, ARCore, ARKit, and Vuforia
US10325407B2 (en) Attribute detection tools for mixed reality
EP3814876B1 (en) Placement and manipulation of objects in augmented reality environment
US9582142B2 (en) System and method for collaborative computing
Spencer ZBrush character creation: advanced digital sculpting
US20180165877A1 (en) Method and apparatus for virtual reality animation
US20230038709A1 (en) System and Method for Authoring Freehand Interactive Augmented Reality Applications
JP6062589B1 (ja) Program, information processing apparatus, influence degree derivation method, image generation method, and recording medium
Piekarski et al. Tinmith-mobile outdoor augmented reality modelling demonstration
Caudron et al. Blender 3D: Designing Objects
GB2595445A (en) Digital sandtray
Zhang Colouring the sculpture through corresponding area from 2D to 3D with augmented reality
Arora Creative visual expression in immersive 3D environments
Xin 3D sketching and collaborative design with napkin sketch
Costello ARCAD: Augmented Reality Computer Aided Design
Cassab-Gheta Three-Dimensional Sketching within an Iterative Design Workflow
Belmonte et al. Federate resource management in a distributed virtual environment
Ucchesu A Mixed Reality application to support TV Studio Production
Syed et al. Digital sand model using virtual reality workbench
Sun et al. HoloLens-Based Visualization Teaching System for Algorithms of Computer Animation
Lewis et al. Maya 5 fundamentals
CN118379415A (zh) Method and apparatus for generating a polka-dot pattern, storage medium, and electronic apparatus

Legal Events

Date Code Title Description
AS Assignment

Owner name: GOOGLE INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HACKETT, PATRICK RYAN;SKILLMAN, ANDREW LEE;SIGNING DATES FROM 20150921 TO 20151012;REEL/FRAME:036835/0391

AS Assignment

Owner name: GOOGLE LLC, CALIFORNIA

Free format text: CHANGE OF NAME;ASSIGNOR:GOOGLE INC.;REEL/FRAME:044567/0001

Effective date: 20170929

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION