CN109478103A - Display of three-dimensional model information in virtual reality - Google Patents

Display of three-dimensional model information in virtual reality

Info

Publication number
CN109478103A
CN109478103A
Authority
CN
China
Prior art keywords
user
model
component
space
information
Prior art date
Legal status
Granted
Application number
CN201780045798.3A
Other languages
Chinese (zh)
Other versions
CN109478103B (en)
Inventor
马雷克·克里茨勒
马蒂亚斯·迈尔
Current Assignee
Siemens AG
Original Assignee
Siemens AG
Priority date
Filing date
Publication date
Application filed by Siemens AG
Publication of CN109478103A
Application granted
Publication of CN109478103B
Status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0481 Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/04815 Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 30/00 Computer-aided design [CAD]
    • G06F 30/10 Geometric CAD
    • G06F 30/15 Vehicle, aircraft or watercraft design
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 Manipulating 3D models or images for computer graphics
    • G06T 19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 30/00 Computer-aided design [CAD]
    • G06F 30/20 Design optimisation, verification or simulation
    • G06F 30/23 Design optimisation, verification or simulation using finite element methods [FEM] or finite difference methods [FDM]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2200/00 Indexing scheme for image data processing or generation, in general
    • G06T 2200/24 Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2219/00 Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T 2219/20 Indexing scheme for editing of 3D models
    • G06T 2219/2016 Rotation, translation, scaling

Abstract

For displaying a 3D model (20) in virtual reality (VR), merely changing a CAD display from a 2D screen to a 3D rendering may not be enough to reduce information clutter. To present metadata for a 3D CAD model (20) with less occlusion or clutter, a separate space (32) is generated in the virtual reality environment. Metadata and information about the 3D CAD model (20) and/or a selected part (36) of the 3D CAD model (20) are displayed (58) in the separate space (32). The user may view the 3D CAD model (20) in one space (30) and view the metadata, with or without a representation of the component, in the other space (32), or combinations thereof.

Description

Display of three-dimensional model information in virtual reality
Related application
This patent document claims the benefit of the filing date under 35 U.S.C. § 119(e) of Provisional U.S. Patent Application Serial No. 62/353,073, filed June 22, 2016, which is hereby incorporated by reference.
Technical field
The present embodiments relate to the display of three-dimensional models. Rockets, wind turbines, automobiles, bicycles, shoes, slotted screws, and other objects are designed using computer-aided design (CAD) software. CAD is used to create models of any size, for any industry and any purpose. Engineers use CAD to design, analyze, and simulate the attributes of objects. Engineers may modify individual components of a three-dimensional (3D) CAD model. The components may be combined into an assembly.
Background
3D CAD models are designed on a computer using a keyboard and mouse. For example, a computer screen and CAD software provide a high density of information and convenient access to different tools for creation, manipulation, and visualization. Vendors of CAD software provide tools for viewing CAD files as three-dimensional (3D) models rendered to the two-dimensional (2D) screen of a desktop computer. Metadata about the 3D model is displayed on panels, which may overlap the 3D model or require the 3D model to be shown at less than real-world scale to make room for the panels. As the complexity of engineered objects increases, interacting with these models and viewing their linked information becomes much more difficult. Representing a 3D CAD model on a 2D computer screen neglects the third dimension, and displayed metadata may become cluttered.
Summary of the invention
By way of introduction, the preferred embodiments described below include methods, systems, instructions, and computer-readable media for displaying a 3D model in virtual reality (VR). Merely changing the CAD display from a 2D screen to a 3D rendering may not be enough to reduce information clutter. To present metadata for a 3D CAD model with less occlusion or clutter, an additional space is generated in the virtual reality environment. Metadata about the 3D CAD model and/or a selected part of the 3D CAD model is displayed in the additional space. The user may view the 3D CAD model in one space and view the metadata, with or without a representation of the component, in the other space, or combinations thereof.
In a first aspect, a system is provided for display of a three-dimensional model in virtual reality. A memory is configured to store a three-dimensional model of an object. A virtual reality headset is configured to display a virtual environment to a user wearing the virtual reality headset. A processor is configured to generate the virtual environment with (1) a representation of the three-dimensional model on a first side of a surface and (2) a representation of a user-selected part of the three-dimensional model and metadata for the user-selected part on a second side of the surface.
In a second aspect, a method is provided for display of a three-dimensional model with a virtual reality headset. In the virtual reality headset, a first view of a three-dimensional computer-aided design of an arrangement of parts is displayed as resting on a floor at a first level of a three-dimensional space. A user selection of a first part of the arrangement of parts is received. The user transitions from the first view through the floor to a second view at a second level of the three-dimensional space, on a side of the floor opposite the first view. The first part and information about the first part are displayed in the second view in the virtual reality headset.
In a third aspect, a virtual reality system includes a stereoscopic display configured to represent an interior of a computer-aided design model in a first space and to represent information about the computer-aided design model in a second space separated from the first space. A user input sensor is configured to receive interactive input from the user. A processor is configured to change a user focus from the interior to the information in response to the interactive input.
The present invention is defined by the following claims, and nothing in this section should be taken as a limitation on those claims. Further aspects and advantages of the invention are described below in conjunction with the preferred embodiments and may be claimed later, independently or in combination.
Brief description of the drawings
The components and the figures are not necessarily to scale, emphasis instead being placed upon illustrating the principles of the invention. Moreover, in the figures, like reference numerals designate corresponding parts throughout the different views.
Fig. 1 shows an embodiment of a virtual reality system with a multi-space environment;
Fig. 2 shows one example of a multi-space virtual environment;
Fig. 3 shows an example interior view of a 3D CAD model;
Fig. 4 is another example of a multi-space virtual environment;
Fig. 5 shows an example view of an information center for metadata, where components of the 3D CAD model are viewed through a translucent surface; and
Fig. 6 is a flow chart of one embodiment of a method for displaying a three-dimensional model with a virtual reality headset.
Detailed description
The 3D CAD model and model information are displayed in a VR environment. CAD is an important part of engineering. CAD programs help to create, analyze, and optimize models of anything from a screw to an aircraft. VR is a computer-simulated world viewed by a user wearing a headset. The headset is characterized by a screen for each eye and an inertial measurement unit (IMU) that tracks head movement for navigating the VR. Because the application reacts to movements of the body and head, the artificial world may feel real. The combination of CAD and VR allows the user to explore 3D CAD models in VR. The model may be experienced at actual size or at another size.
Compared to visualization on a 2D computer screen, VR may assist in more immersive and more intuitive exploration of the 3D CAD model. The user may interact with and manipulate a virtual representation based on the 3D CAD model. VR systems allow users to experience a virtual world in which space is unlimited and the laws of physics may be excluded or altered. This rich experimental field allows users to explore the attributes of objects that have not yet been built. In one embodiment, a gravity-free environment allows the user to navigate the displayed model in 3D space and to experience both the visual setting and related information. VR is a well-suited technology to close the gap between the limited visualization of 3D models on 2D computer screens and the lack of immersive exploration.
When exploring a 3D CAD model in VR, the user may enter the 3D model to understand interior components and details of the 3D model. In VR, the user may be entirely or mostly surrounded by the 3D model. In that case, it is not possible to display information about the 3D model as a whole or about its components without occluding the 3D model or fading out large parts of the virtual 3D model. Furthermore, space inside the 3D model is very limited. Without repositioning, there may not be enough room to display information panels. These information panels may hinder exploration of the rest of the 3D model. Panels cover the view and interfere with the possibility of interacting with other parts of the 3D model. In addition, the component being studied may be attached to, or built into, other parts of the 3D model, so those other parts may occlude some of the part of interest.
To handle the large amount of available information with less occlusion, a multi-space (e.g., two-floor) representation separates the 3D virtual representation from the related information. The second level or space supplements the "classic" viewing of 3D CAD data in VR. This approach allows display of the 3D CAD model and related information without covering components of the presented 3D CAD model. Models of all sizes may be explored immersively while still providing enough room to examine a particular component of the 3D model in the separate space. Adding a separate space that shows the entire 3D model prevents the user from getting lost in the VR world. A direct connection between the two spaces links them together. For example, the connection may be realized by a translucent floor. The user may position the separate space and himself or herself in the virtual world. Introducing a separate space for displaying a part of the 3D CAD model avoids occlusion by the rest of the 3D model and occlusion caused by panels of information displayed next to a component. This separate space may overcome the disadvantage of displaying pop-up panels next to the 3D model, since the 3D model may be a large object (e.g., a train, automobile, airplane, or wind turbine).
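As a rough illustration of the two-level layout just described (a sketch only, not the patented implementation; all class and field names are hypothetical), the scene can be modeled as two stacked spaces separated by a floor plane, with the camera's height deciding which space the user currently occupies:

```python
# Minimal sketch of a two-space VR scene: a model space above a
# translucent floor and an information center directly below it.
from dataclasses import dataclass, field

@dataclass
class Space:
    name: str
    y_min: float                      # lower bound of the level (up axis)
    y_max: float                      # upper bound of the level
    contents: list = field(default_factory=list)

@dataclass
class Scene:
    floor_y: float = 0.0              # the surface separating the two levels

    def __post_init__(self):
        # Model space sits on top of the floor; the information center
        # occupies the level directly below, sharing the same orientation.
        self.model_space = Space("model", self.floor_y, self.floor_y + 50.0)
        self.info_space = Space("info_center", self.floor_y - 50.0, self.floor_y)

    def space_for(self, y: float) -> Space:
        """Return the space containing a camera at height y."""
        return self.model_space if y >= self.floor_y else self.info_space

scene = Scene()
scene.model_space.contents.append("3D CAD model 20")
scene.info_space.contents.append("metadata panel for part 36")
print(scene.space_for(1.7).name)   # camera above the floor -> model
print(scene.space_for(-2.0).name)  # camera below the floor -> info_center
```

Keeping both levels in one coordinate system (only the height differs) is one way to preserve the "known relationship" between the spaces that the text emphasizes.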
Fig. 1 shows one embodiment of a system for displaying a 3D model in VR. The VR system is configured to display the 3D model in one space and to display additional information in an additional, separate space. The spaces are connected in a known or intuitive way, such as one on top of the other, to ease navigation.
The system includes a VR headset 22 with one or more sensors 12, one or more sensors 16, a processor 14, a memory 18, and a stereoscopic display 26. Additional, different, or fewer components may be provided. For example, motion-detecting gloves or motion sensors, a microphone, speakers (e.g., earphones), or other VR hardware are provided. As another example, the sensor 12 is not provided.
The system implements the method of Fig. 6 or a different method. For example, the processor 14 and stereoscopic display 26 implement acts 50, 56, 58, and 60. The sensor 12 implements act 54. The sensor 12 and/or processor 14 implement act 52. Other components or combinations of components may implement the acts.
The processor 14 and/or memory 18 are part of the VR headset 22. The processor 14 and/or memory 18 are in the same housing as the stereoscopic display 26 or in a separate housing. In a separate housing, the processor 14 and/or memory 18 may be worn by the user, such as in a backpack, mounted on a belt, or strapped on. Alternatively, the processor 14 and/or memory 18 are a computer, server, workstation, or other processing device separate from the user, communicating with the VR headset 22, the stereoscopic display 26, and/or the sensor 12. Wired or wireless communication is used for interaction between the processor 14, the memory 18, the sensor 12, the display 26, and any other electronic components of the VR headset 22. Separate processors may be used for any of the components.
The memory 18 is a graphics processing memory, video random access memory, random access memory, system memory, cache memory, hard drive, optical media, magnetic media, flash drive, buffer, database, combinations thereof, or other now known or later developed memory device for storing the 3D model, metadata, camera information, stereo images, sensor signals, and/or other information. The memory 18 is part of a computer associated with the processor 14, part of the VR headset 22, or a standalone device.
The memory 18 is configured to store a 3D model of an object, such as a CAD model of the object 20. During navigation in virtual reality, the 3D CAD model or 3D data is used to render or generate a three-dimensional view from any potential user perspective.
The object 20 is represented by 3D data. For example, a building, an assembly of interrelated parts, or a manufactured object 20 is represented by CAD data and/or other engineering data. The 3D data represents the object by segments parameterized by size, shape, and/or length. Other 3D data parameterizations may be used, such as a mesh or interconnected triangles. As another example, the 3D data is voxels distributed along a uniform or non-uniform grid. Alternatively, segmentation is performed, and the 3D data is a fitted model or mesh.
The 3D model represents the geometry of the object 20. One or more surfaces are represented. The 3D model includes the surfaces of the object 20 and is not based on a virtual camera. 3D CAD is usually represented by XYZ coordinates (vertices) in 3D space. The connections between vertices are known, either as geometric primitives such as triangles/tetrahedra or as more complex 3D representations that make up the 3D CAD model. CAD data is clean and watertight and does not include noise. CAD data is usually drawn and represented at metric scale. Engineering or GIS data may likewise include little or no noise. More than one model may be presented, such as two or more 3D models positioned adjacent to each other.
The 3D model may include metadata. The metadata is encoded in the model or, depending on the file format, stored separately from the 3D model. The metadata may be one or more labels. A label is information other than the geometry of the physical object. A label may be part information (e.g., part number, available options, material, manufacturer, recall notices, performance information, operating instructions, assembly/disassembly instructions, cost, and/or availability). A label may be other information, such as the shipping date of a part. A label may identify only the object 20 or a part of the object 20. Different metadata may be provided for different parts of an assembly. The metadata may include reference documents, such as material data sheets, simulation results, test results, recall notices, quality control information, design alternatives, and/or other engineering information. Metadata is information about the 3D model other than the geometry.
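The labels described above can be pictured as per-part records kept alongside, not inside, the geometry. The following sketch is purely illustrative; the record fields, part ids, and example values are all made up:

```python
# Hypothetical sketch of per-part metadata ("labels") stored separately
# from the geometry and looked up by part id.
from dataclasses import dataclass, field

@dataclass
class PartLabel:
    part_number: str
    material: str = ""
    manufacturer: str = ""
    documents: list = field(default_factory=list)  # e.g. data sheets, test results

# Metadata keyed by part id; the geometry file stays untouched.
metadata = {
    "rotor_blade_03": PartLabel("WT-0042", material="GFRP",
                                manufacturer="Example GmbH",
                                documents=["material data sheet", "test results"]),
}

def labels_for(part_id):
    """Look up the labels for a selected part, if any."""
    return metadata.get(part_id)

print(labels_for("rotor_blade_03").part_number)  # -> WT-0042
```

A lookup like this is what the information center would query when the user selects a component.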
Alternatively or additionally, the memory 18 or other memory is a computer-readable storage medium storing data representing instructions executable by the programmed processor 14 or another processor. The instructions for implementing the processes, methods, acts, and/or techniques discussed herein are provided on non-transitory computer-readable storage media or memories, such as a cache, buffer, RAM, removable media, hard drive, or other computer-readable storage media. Non-transitory computer-readable storage media include various types of volatile and nonvolatile storage media. The functions, acts, or tasks illustrated in the figures or described herein are executed in response to one or more sets of instructions stored in or on computer-readable storage media. The functions, acts, or tasks are independent of the particular type of instruction set, storage media, processor, or processing strategy and may be performed by software, hardware, integrated circuits, firmware, microcode, and the like, operating alone or in combination. Likewise, processing strategies may include multiprocessing, multitasking, parallel processing, and the like.
In one embodiment, the instructions are stored on a removable media device for reading by local or remote systems. In other embodiments, the instructions are stored in a remote location for transfer through a computer network or over telephone lines. In yet other embodiments, the instructions are stored within a given computer, CPU, GPU, or system.
The VR headset 22 includes the stereoscopic head-mounted display 26 and the inertial motion sensor 16. Speakers, a microphone, and/or other devices may be included. Any now known or later developed VR headset may be used, such as Oculus, Google Cardboard, HTC Vive, PlayStation VR, or Samsung Gear VR. In alternative embodiments, one or more projectors are used instead of the stereoscopic display 26. The projectors project graphics onto the retina of the user. As another example, an eye tracker (e.g., a camera directed at the user's eyes) is used to align the viewing direction and user focus instead of using head motion.
The VR headset 22 is configured to display the virtual environment to a user wearing the VR headset 22. For example, the VR headset 22 is a pair of goggles that restricts all or most of the user's field of view to the stereoscopic display 26. The head-mounted and/or eyewear device may cover the user's entire field of view. Part of the user's field of view may be restricted, such as blocking peripheral vision. The VR headset is head-mounted or otherwise wearable. For example, the VR headset includes a cap or band for resting on the user's head and positioning the display 26 in front of the user's eyes. As a head-mounted display, a harness or helmet supports the display.
The stereoscopic display 26 includes a separate display for each eye of the user. A barrier limits interference of one eye's display with the display for the other eye. A single display may be used to show two images, one for each eye. The eye-specific images are rendered from different camera angles and/or positions, providing a stereoscopic image. In alternative embodiments, other stereoscopic displays may be used.
The inertial motion sensor 16 is a gyroscope, accelerometer, structured-light sensor, gravity-field sensor, and/or other device for detecting changes in the position and/or motion of the user's head or of the VR headset 22. A magnetic field sensor may be used to detect position and/or motion. The inertial motion sensor 16 measures position or change in position, such as a gyroscope measuring six degrees of freedom. The perspective of the field of view in virtual reality is adjusted with the head motion of the user. In alternative embodiments, a user input device (such as arrow keys) is used instead of inertial movement.
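As an illustrative (non-authoritative) sketch of how IMU readings drive the viewpoint, a yaw/pitch camera can integrate angular rates each frame. A real headset pipeline also fuses accelerometer data and corrects drift; this shows only the basic idea, with all names invented here:

```python
# Sketch: integrate gyroscope rates into a camera orientation per frame.
import math

class HeadCamera:
    def __init__(self):
        self.yaw = 0.0    # radians, rotation about the up axis
        self.pitch = 0.0  # radians, look up/down

    def apply_gyro(self, yaw_rate, pitch_rate, dt):
        """Advance orientation by angular rates over a frame of dt seconds."""
        self.yaw = (self.yaw + yaw_rate * dt) % (2 * math.pi)
        # Clamp pitch so the view cannot flip over the poles.
        self.pitch = max(-math.pi / 2,
                         min(math.pi / 2, self.pitch + pitch_rate * dt))

    def forward(self):
        """Unit view direction derived from yaw/pitch."""
        return (math.cos(self.pitch) * math.sin(self.yaw),
                math.sin(self.pitch),
                math.cos(self.pitch) * math.cos(self.yaw))

cam = HeadCamera()
cam.apply_gyro(yaw_rate=math.pi / 2, pitch_rate=0.0, dt=1.0)  # turn head 90 deg
x, y, z = cam.forward()
print(round(x, 3), round(y, 3), round(z, 3))  # -> 1.0 0.0 0.0
```

The renderer would rebuild the stereo view frustum from this orientation on every frame, which is what makes the world appear to stay still as the head turns.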
A source of the VR environment, such as the graphics processor 14, is provided. The graphics processor 14 generates images for the stereoscopic display 26 based on input from the inertial motion sensor 16. The processor 14 generates the multi-space environment for displaying the CAD model and the metadata for a user-selected part of the CAD model.
The sensor 12 is a user input sensor configured to receive interactive (e.g., navigation) input from the user. A keyboard, mouse, trackball, trackpad, touch screen, gesture sensor, or other user input device is used. The user input sensor measures or receives input from the user for navigating and/or interacting with the virtual environment.
Alternatively or additionally, the sensor 12 is a gesture sensor. For example, a Leap Motion gesture sensor is used to recognize the user's hands. Two cameras and software running on the graphics processor 14 or a separate processor estimate the positions of both hands, including the fingers. The sensor 12 is fixedly or releasably connected to the VR headset 22, such as glued or mounted to the front of the VR headset 22. A cable or wireless connection provides the measurements from the sensor 12. In other embodiments, controllers or sensors are mounted on the user's hands or in gloves for gesture sensing.
The interaction scheme recognizes gestures of the user. For example, a pattern (such as a pinch) produced by the motion or position of one hand, two hands, one finger, and/or multiple fingers is detected. Any number of distinct gestures may be used, such as pinch (e.g., move, rotate, scale), grab (e.g., move the user within the scene), and thumbs-up (e.g., switch between the ground level and the information center). Depending on context, each gesture is mapped to a reaction. Once a gesture has been recognized, direct feedback may be provided to the user. For example, if a pinch or grab is recognized, the hand is highlighted by a flash, and a thumbs-up gesture immediately triggers the associated action. Alternatively or additionally, the feedback is a highlight or other response.
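The context-dependent gesture-to-action mapping described above can be sketched as a small dispatch table. The gesture names follow the text (pinch, grab, thumbs-up); the context keys and action strings are placeholders for real scene operations:

```python
# Sketch: map recognized gestures to actions depending on context.
def dispatch(gesture, context):
    table = {
        ("pinch", "part_focused"): "move/rotate/scale selected part",
        ("grab", "any"): "move user through the scene",
        ("thumbs_up", "any"): "toggle ground level <-> information center",
    }
    # Prefer a context-specific binding, fall back to a context-free one.
    action = table.get((gesture, context)) or table.get((gesture, "any"))
    return action if action else "ignored"

print(dispatch("thumbs_up", "part_focused"))  # -> toggle ground level <-> information center
print(dispatch("pinch", "part_focused"))      # -> move/rotate/scale selected part
print(dispatch("wave", "any"))                # -> ignored
```

Returning "ignored" for unrecognized patterns matters in practice: hand trackers produce many candidate poses per second, and only deliberate gestures should trigger actions.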
In one embodiment, the VR headset 22 is a smartphone serving as the processor 14, memory 18, inertial motion sensor 16, and stereoscopic display 26. The gesture sensor 12 is mounted to the front of the VR headset 22 for viewing the user's hands and/or fingers. Any hardware and/or software may be used, such as a 3D game engine (e.g., Unity), a VR headset (e.g., Google Cardboard), a stereoscopic display 26 (e.g., the smartphone), a processor 14 (e.g., the smartphone), a memory 18 (e.g., the smartphone), a gesture sensor (e.g., Leap Motion), a menu interface (e.g., Hovercast as part of the Hover-VR interface kit), and a streaming application (e.g., Trinus VR for generating the images). Other arrangements may be used.
The processor 14 is a general processor, central processing unit, control processor, graphics processor, graphics processing unit, digital signal processor, three-dimensional rendering processor, image processor, application-specific integrated circuit, field-programmable gate array, digital circuit, analog circuit, combinations thereof, or other now known or later developed device. The processor 14 is a single device or multiple devices operating in serial, parallel, or separately. The processor 14 may be a main processor of a computer, such as a smartphone, laptop, or desktop computer, or may be a processor for handling some tasks in a larger system, such as in the VR headset 22. The processor 14 is configured by instructions, design, firmware, hardware, and/or software to perform the acts discussed herein.
The processor 14 generates the virtual environment using three-dimensional rendering. The virtual environment has a representation or display of the 3D CAD model from the memory 18. By providing an open space that does not limit the size of the displayed 3D CAD model, the representation may have any size. The VR environment includes multiple spaces 30, 32. The spaces 30, 32 may be viewed simultaneously. For example, the user's viewpoint is zoomed out to include both spaces. As another example, when the user's viewpoint is placed in or directed at one space, the other space may be at least partly visible due to the orientation of the user.
The spaces 30, 32 are separate. One space 30, 32 may be viewed without viewing the other space 32, 30. Any separation may be used, such as by a wall or surface 34. Alternatively, the spaces 30, 32 are separated by disconnection (i.e., no component or graphic of one extends into the other, or the spaces are separated by a region or distance).
The spaces 30, 32 have a known relationship to each other, such as having the same orientation (e.g., up is the same in both spaces) and/or being aligned by coordinate system (e.g., the same scale, so that a component in one space is at the same coordinates in the other space). One space 32 may be offset relative to the other space based on the selected component (e.g., the center of the information center space 32 is placed at the center of the component 36 in the 3D model 20). In one embodiment, the spaces 30, 32 have a known relationship to each other by adjacency, such as one to the left of the other or one on top of the other.
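One way to realize the component-centered offset mentioned above is to translate the information space so that its center lines up with the selected component's center, one level down. This is a hypothetical sketch under that assumption, not the claimed method; the `level_drop` value is invented:

```python
# Sketch: place the information center space under the selected component,
# aligned with the component's center in the model space.
def center_of(bounds):
    """Center of an axis-aligned bounding box ((min_xyz), (max_xyz))."""
    (x0, y0, z0), (x1, y1, z1) = bounds
    return ((x0 + x1) / 2, (y0 + y1) / 2, (z0 + z1) / 2)

def info_space_origin(component_bounds, level_drop=5.0):
    """Origin for the info space: centered on the component, one level down."""
    cx, cy, cz = center_of(component_bounds)
    return (cx, cy - level_drop, cz)

# Component 36 occupies a 2 x 1 x 1 box near the front of the model.
bounds = ((1.0, 0.0, 2.0), (3.0, 1.0, 3.0))
print(info_space_origin(bounds))  # -> (2.0, -4.5, 2.5)
```

Because only the height changes, the user's lateral position carries over between the spaces, which supports the orientation-preserving switch the text describes.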
In one embodiment shown in Fig. 1, the relationship is layered, with one space 30 above the other space 32. A visible translucent floor or surface 34 supports or holds the entire 3D CAD model 20. The 3D CAD model 20 may be scaled as currently needed, allowing both closer inspection and the experience of actual size. The other space 32 is an information center below the floor or surface 34. The second, lower level is used to examine and display a single component 36 or a subassembly of components, and to show information 42 (e.g., metadata) about the component 36, the subassembly, and/or the entire 3D CAD model 20. Display in this information center space 32 avoids occluding the 3D CAD model 20.
In the 3D model space, the representation of the 3D model is located on the top side of the surface 34 or is otherwise located in the space 30. The representation is a 3D representation, such as showing the 3D model from a given perspective. The 3D model may be presented at life size to assist in orientation, but any scale, size, height, or other dimension of the working model may be used.
The surface 34 is a floor, but may be a wall, ceiling, or other divider. The floor divides the three-dimensional space into different levels. The other space 32 is below the surface 34. The surface 34 is the divider separating the spaces 30, 32.
The surface 34 is opaque or translucent. For example, the surface 34 is opaque when the user views it from a camera positioned above the surface 34 and/or within the space 30. When the camera is positioned away from the 3D model 20, such as at a lateral position beyond the border of the surface 34, the surface 34 is shown as translucent. Alternatively, the surface 34 is always translucent. Translucent includes being shown as a flat surface with a uniform level of transparency and/or with points or grid lines that are more opaque than the rest.
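The opacity rule above (opaque when viewed from above and within the footprint, translucent otherwise) can be sketched as a simple camera test. The circular footprint, radius, and alpha values are made-up illustrations, not values from the patent:

```python
# Sketch: choose the floor surface's opacity from the camera position.
# The floor footprint is modeled as a circle of radius r at height 0.
def surface_alpha(cam_x, cam_y, cam_z, r=10.0):
    """1.0 = fully opaque; lower values = translucent (illustrative only)."""
    inside_footprint = cam_x ** 2 + cam_z ** 2 <= r ** 2
    above = cam_y > 0.0
    if above and inside_footprint:
        return 1.0   # viewing from within the model space: opaque
    return 0.35      # beyond the border or below: translucent

print(surface_alpha(0.0, 1.7, 0.0))    # inside, above -> 1.0
print(surface_alpha(15.0, 1.7, 0.0))   # laterally beyond the border -> 0.35
print(surface_alpha(0.0, -2.0, 0.0))   # in the info center below -> 0.35
```

The translucent case is what lets a user standing outside the border, or in the information center, see both levels at once, as in Fig. 2 and Fig. 5.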
The surface 34 has any size and shape, such as an ellipse wider and deeper than the entire 3D model. The size and/or shape of the surface 34 is fitted to the size of the 3D model 20. Fig. 2 shows an automobile 3D model 20, where the surface 34 extends laterally beyond the 3D model 20 by less than 1/2 of the width or depth of the 3D model 20. Fig. 2 shows a camera positioned above the surface 34 but placed beyond the border of the surface 34. From that camera position, the VR environment includes both spaces 30, 32 viewed simultaneously by the user.
Based on the input from the sensors 12, 16, the processor 14 generates the virtual environment on the stereoscopic display 26 for navigation around, within, and/or through either of the spaces 30, 32, the component 36, and/or the 3D model 20. The user's perspective may be changed to look around the 3D model 20 and/or to enter the 3D model 20. Navigation may be continuous between and through the different spaces 30, 32, or discontinuous navigation between the spaces 30, 32 may be provided. The user may explore the 3D model as a whole and select a component or set of parts of interest. Different types of movement allow the user to choose the most comfortable and suitable method of movement. A provided set of tools enables the user to interact with the 3D CAD model 20. In the virtual environment, a menu may be displayed at the position of the user's hand. A panel with menu options may be activated and displayed for selection of menu items.
The user may select a component 36 or sub-assembly. The sensors 12, 16 are activated, triggering the selection. The method may not use gravity, allowing the user to place a component anywhere in the scene. Alternatively, gravity is used. If a component is moving, the physics may or may not allow components to collide. By not allowing collision, any part may more easily be selected and moved rather than disassembling the entire 3D CAD model piece by piece. The user may deactivate a component or assembly to expose internal components.
As the user navigates through or relative to the 3D model, the information center space 32 stays at the same relative height, but the content of the information center space 32 may follow the user while the 3D CAD model 20 is explored. As the user navigates or selects, the information center is updated to include metadata associated with the currently selected component. This helps the user not lose their orientation when switching to the information center space 32.
Fig. 3 shows an example view of the 3D CAD model 20 with the camera and corresponding user viewpoint inside the 3D CAD model 20. Several different components 36 are shown in the stereoscopic display of the interior of the 3D model 20. Only the first space 30 is shown in the perspective view of Fig. 3. When the user navigates within the 3D model, geometric information is shown without other metadata. Some metadata may be shown on one or more of the components 36.
When the user selects a component 36, the component 36 may be highlighted. If metadata were displayed within space 30 from the interior view, the metadata (such as metadata on a panel or other annotation) would occlude components of the 3D model 20 that are otherwise visible from the given viewpoint. When virtually exploring the 3D model, the user may be fully inside the scene and surrounded by the 3D model. In that case, displaying information about the 3D model or a part of it is not possible without virtual plates, screens, or panels occluding the 3D model, without shrinking the 3D model to free screen space for the metadata, or without disabling large components of the virtual 3D model. Panels or plates beside the model may not work for large 3D CAD models (such as aircraft, trains, or ocean liners). By providing the different space 32 for at least some of the metadata, views of the interior may be maintained with little or no occlusion by the metadata.
The virtual environment includes the information center space 32 for information other than the geometry of the CAD model 20. A 3D representation of a user-selected part 36 of the 3D model 20 and/or metadata for the user-selected part 36 is positioned in the other space 32. In the examples of Figs. 1 and 2, the user-selected part 36 and the metadata are positioned on the side of the surface 34 opposite the 3D model 20. Three-dimensional information about the 3D CAD model 20 is presented in the second space 32 separated from the first space 30. By adding a second floor, a space below the ground level or surface 34 is provided as the information center space 32 for viewing a single component or set of components, but this space is not attached to the rest of the 3D model 20 (while still being able to refer to the complete model above). The selected part 36 and/or metadata may be viewed without occluding the rest of the 3D model 20 while still being within the frame of reference of the 3D model.
The information center space 32 allows the user to select a particular component 36 or assembly of the 3D CAD model 20 and look at it more closely. The component 36 or assembly is not covered by other components appearing in the 3D CAD model 20, and the user is less likely to be distracted by the rest of the 3D CAD model 20. While the user's viewpoint or field of view includes or is within the information center space 32, metadata may be displayed without occluding the 3D CAD model 20 and/or the component 36.
Fig. 4 shows an example in which the surface 34 is a ground plane of limited extent separating the space 30 for the 3D model 20 from the information center space 32. The information center space 32 includes the user-selected component 36 on a pedestal 38. The component 36 is displayed above a surface 40, such as a floor for the information center space 32. The other surface 40 is translucent or opaque. The component 36 is displayed at eye level in an interior region (i.e., in the center of the floor surface 40 and/or within the lateral boundary of the floor surface). In other embodiments, the pedestal 38 is not provided. The component 36 is shown disconnected from the 3D model 20, so the component 36 may be viewed from any angle without being occluded by other components of the 3D model 20.
Fig. 5 shows an example in which the camera is positioned below the surface 34 and outside the lateral boundary of the floor surface 40. The user's viewpoint shows the component 36 on the pedestal 38 below the surface 34, through which components of the 3D model 20 can be seen. The addition of the floor surface 40 allows viewing the selected object or component 36 without occlusion while referring to the entire 3D model 20. The floor surface 40 is parallel to the ground surface 34, providing the spatial relationship between the spaces 30, 32.
Metadata is presented in any form. For example, annotations of alphanumeric text or links are shown overlaid on the component 36. Images, text, links, lists, graphs, charts, or combinations thereof of the metadata are presented in the information center space 32.
In the embodiments shown in Figs. 1, 4, and 5, the metadata is rendered as one or more panels or plates 42. In Fig. 1, the plates 42 are positioned around the boundary of the floor surface 40. In Figs. 4 and 5, the plates 42 are shown in an interconnected tree or menu structure. The tree of plates 42 is shown behind the component 36 (at least at the initial camera position) and laterally within the boundary of the floor surface 40. Floating text or information without a background board or panel may be used. Any positioning of the panels or plates may be used, such as stacked and/or positioned laterally outside the boundary of the floor surface 40.
The information center allows display of the 3D model component 36 and the metadata for the component 36 without any occlusion. The information center space 32 may be extended to convey information about the entire 3D model 20, such as including a list of components and other metadata. The size of the floor 40 may be limited to better orient the user while still showing enough space to display information. The selected part 36 or set of parts is presented at eye level (optionally on top of a platform 38 holding a virtual keyboard for text input), and the virtual plates 42 around the component 36 show available information or metadata without occluding the displayed component 36. Other arrangements may be provided, with different plates or panels, different placement, different surfaces, and/or multiple components with or without pedestals.
To populate the information center space 32 with a particular component 36 or sub-assembly, the user selects the component 36 and/or sub-assembly while navigating the model space 30 or viewing the 3D model 20. For example, a selection-to-viewing tool allows the user to extract a component or assembly from the 3D CAD model 20. This tool copies the component 36 or assembly and moves or places the copy into the information center space 32. Alternatively, the focus of the user's eyes 28 indicates a component 36 in the 3D model 20, and whichever component 36 is at the focus is selected and copied into the information center space 32. In other embodiments, the component 36 is shown moving from the 3D model 20 to the pedestal 38. Metadata and/or a menu is also provided in the information center space 32, the menu being for navigating the metadata of the selected component 36 or sub-assembly.
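A minimal sketch of the extract-to-viewing tool described above, under the assumption that components are stored in a dictionary keyed by name; the data structure and names are hypothetical, not the disclosed implementation:

```python
import copy

def extract_to_info_center(model_components, name, pedestal_position):
    """Copy the selected component out of the 3D model and place the
    duplicate at the pedestal in the information center space; the
    original component in the model is left untouched."""
    duplicate = copy.deepcopy(model_components[name])
    duplicate["position"] = pedestal_position
    return duplicate
```

Working on a deep copy is what lets the original 3D model 20 remain intact while the duplicate is viewed and manipulated below the surface.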
In one embodiment, a panel list, tree structure, or graphic displays the component names of the 3D CAD model 20 and allows the user to select a specific component 36, which is highlighted in the 3D model 20 and quickly moved or placed in the information center space 32. The list, graphic, or structure is provided in the information center space 32. Rather than spending time searching for the component 36 in the entire 3D CAD model 20, the user navigates to the plate in the information center space 32. This approach may avoid fixedly mounting a menu or plate beside the displayed 3D CAD model 20, or avoid exposing the menu between the components 36 in the 3D model 20 and occluding some components of the 3D model 20.
Using the spatial relationship between the two spaces 30, 32, the user's camera position and/or viewpoint may be shifted between the two spaces 30, 32. The user's viewpoint changes between the opposite sides of the ground surface 34. Any trigger may be used, such as activation by a gesture, voice command, or keystroke. In one embodiment, selection of the component 36 or sub-assembly acts as the trigger. The trigger causes the change in viewpoint. The viewpoint shifts from the 3D model 20 (e.g., from within the interior of the 3D model 20) to the same view or orientation before the component 36 or sub-assembly in the information center space 32, and vice versa. The viewpoint shift (a quick transition from the model space to the information center, to avoid motion sickness and user irritation) does not (but may) change scale or orientation between the spaces 30, 32.
Where the two spaces are related top to bottom, the shift may simulate an elevator or elevator movement. The shift moves the user directly up or down. The elevator change may be instantaneous or occur at any rate. For instantaneous, the user is "teleported" (i.e., a quick change of the camera coordinates, which are the position of the user) from the upper 3D model space to the selected part on the lower floor surface 40 of the information center space 32. For less instantaneous elevator motion, the camera gradually shifts up or down, passing down or up through the surface 34. Other movements may be used, such as changing the angle or orientation of the camera or user viewpoint, shifting from one side to the other, zooming out and/or in, or following an arc between the spaces 30, 32 and across the boundary of the surface 34.
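The instantaneous versus gradual elevator shift can be sketched as a one-dimensional interpolation of the camera height, where `steps=1` reproduces the "teleport" case and larger values pass through intermediate positions. The function and parameter names are illustrative assumptions, not from the disclosure:

```python
def elevator_transition(z_start: float, z_end: float, steps: int = 1):
    """Yield successive camera heights for an elevator-style shift
    between the two levels. Only the height dimension changes;
    orientation and scale are untouched, matching the shift above."""
    if steps < 1:
        raise ValueError("steps must be >= 1")
    for i in range(1, steps + 1):
        yield z_start + (z_end - z_start) * i / steps
```

An arc-shaped or zooming transition, as mentioned above, would interpolate additional dimensions in the same loop.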
Given the spatial relationship between the spaces, the user may more easily perceive the change. In response to interactive input of the trigger, the processor 14 changes the user focus from the 3D model 20 (e.g., from the interior view — see Fig. 3) to the component 36 and information (e.g., metadata panels) of the information center space 32. The information is displayed initially after the change, during the change, or before the change. The user may interact with the information, such as by selecting, highlighting, and/or moving. The information provided in the information center space 32 is provided without occluding the interior of the 3D model 20 because the 3D model 20 is in the separate space 30.
Fig. 6 shows one embodiment of a flow chart of a method for display of a three-dimensional model with a virtual reality headset. In general, the method involves navigating around and/or within a 3D CAD model and providing specific component and/or metadata information in a different region, avoiding occlusion of the 3D CAD model by metadata and/or providing an unobstructed view of the selected part from any angle.
The method is performed by the system of Fig. 1, a graphics processor, a VR headset, or combinations thereof. For example, the stereoscopic display of the VR headset performs acts 50, 56, 58, and 60. The processor of the VR headset performs act 52. A gesture sensor, with or without the processor of the VR headset, performs act 54.
The method is performed in the order presented (top to bottom or numerical) or a different order. For example, Fig. 6 shows navigating in the 3D model, transitioning to the information center, and then transitioning back. In other embodiments, an initial transition from the information center to the 3D model or a reverse order of transitions is used.
Additional, different, or fewer acts may be provided. For example, only one transition is performed (e.g., from the 3D model to the information center, or vice versa). As another example, any of the transitions is repeated. In another example, selections other than the user selection of act 52 are used. In other examples, a gesture is not detected in act 54, but instead the transition occurs automatically or in response to other input.
In act 50, a view of 3D CAD of an arrangement of components resting on a surface is displayed in the virtual reality headset. Any camera position may be used. The user may adjust or change the view from different positions and/or directions to view the 3D CAD. The 3D CAD may be viewed from outside and/or inside.
The view is of the 3D CAD of the arrangement on a surface, such as a ground surface. The surface has any lateral extent, such as larger than the 3D CAD but less than twice the lateral extent of the longest width or longest depth of the 3D CAD. The surface may be unlimited. The surface is opaque or translucent. Any pattern may be provided, such as including a shaded grid. In other embodiments, no pattern is provided.
The camera position is above the surface. As the user navigates to view the 3D CAD, the camera remains above the surface. The camera may be positioned below the surface and pointed at the surface, such as for viewing the bottom of the 3D CAD through a translucent surface. Alternatively, the 3D CAD floats just above the surface or may be separated from the surface to allow viewing the bottom while the camera is above the surface. The user may lift, rotate, activate, select, change, or otherwise manipulate the 3D CAD or its components.
In act 52, a user input device and/or processor receives a user selection of a part of the arrangement of components. Alternatively, a user selection of the arrangement is received. The user selects the component in the VR environment based on a detected hand position and a trigger. In other embodiments, eye focus and/or gestures may be used. Selection from a parts list or other navigation to the component of interest may be used. In other embodiments, the selection is automatic or occurs without user input.
In act 54, a sensor, with or without the processor, detects a gesture of the user. The user moves their hand, hands, and/or fingers in a particular way. This movement, or the placement resulting from the movement, is detected. For example, the user forms a fist with the thumb extended upward. The thumbs-up gesture is detected. In alternative embodiments, other user input is detected, such as activation of a key or a voice command. In other embodiments, the user selection of act 52 is used as, or instead of, the gesture detection of act 54.
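A toy version of the thumbs-up trigger test, assuming an upstream gesture sensor reports which fingers are extended; the dictionary encoding is an assumption for illustration, not the disclosed sensor interface:

```python
def is_thumbs_up(extended):
    """True when the hand forms a fist with only the thumb extended,
    i.e. the trigger gesture described above. `extended` maps each
    finger name to whether the sensor reports it as extended."""
    return bool(extended.get("thumb")) and not any(
        is_out for finger, is_out in extended.items() if finger != "thumb")
```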
In act 56, using the graphics processor of the display in the VR headset, the user's viewpoint is transitioned from the view of the 3D CAD through the surface to a view on the side of the surface opposite the view of the 3D CAD. The user's view transitions from the 3D CAD to the information and/or component on the opposite side of the surface. The transition is accomplished by changing the position of the camera from one side of the surface to the other. Where the surface is a wall, the change may be lateral from side to side. Alternatively or additionally, the transition is accomplished by changing the orientation from pointing above the surface to pointing below the surface. A change in zoom may be used.
In one embodiment, the coordinate system on the opposite side of the surface is copied or mirrored. The same coordinates for the camera, or a reflection of the same coordinates, are used on the opposite side of the surface. The camera position X1, Y1, Z1 and 3D angle θ (yaw, pitch, and roll) from which the component is viewed in the 3D CAD above the surface are repositioned to the X1, Y1, Z1 position and 3D angle θ below the surface, viewing the component without seeing the rest of the 3D CAD. The user may view the component at the same height, lateral displacement, depth, and angle in both views, but one view has the component within the 3D CAD while the other view has the component alone in the information center (i.e., without the rest of the 3D CAD). In alternative embodiments, different relative positioning is provided for the views on the opposite sides of the surface.
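The copied coordinate system can be sketched as keeping the camera's lateral position, relative height, and 3D angles unchanged while translating it by the fixed vertical offset between the two levels. The offset value and all names here are assumptions for illustration:

```python
LEVEL_OFFSET = -10.0  # assumed vertical separation between the two levels

def mirror_into_info_space(position, angles):
    """Reposition the camera into the information center space with the
    same X, Y, and relative height and the same (yaw, pitch, roll)
    angles, so the selected component is seen from an identical
    viewpoint but without the rest of the 3D CAD."""
    x, y, z = position
    return (x, y, z + LEVEL_OFFSET), angles
```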
Any transition motion may be used. For example, the transition moves the user's viewpoint down or up as an elevator. This transition changes the position along only one dimension. The orientation and/or zoom is constant or unchanged. The user may continue to navigate, causing changes in other dimensions, angle, or zoom during the transition. The transition is instantaneous, such as moving from one position to the next without any intermediate view. Alternatively, the transition is gradual, including one or more intermediate positions as the camera position changes. In other embodiments, the transition motion is along an arc or other non-straight path, with or without change in scale (i.e., zoom) and/or angle.
The transition occurs in response to the detection of the gesture in act 54. Other triggers may be used.
In act 58, on the display of the VR headset, the graphics processor displays the user-selected part and information about the user-selected part at the view after the transition. Any camera position may be used. Initially, the camera position at the end of the transition is used. The user may adjust or change the view to see the 3D component and/or information from different positions, scales, and/or directions.
The view of the component is above a surface and/or pedestal. The surface and/or pedestal has any lateral extent, such as the surface being at least twice the lateral extent of the pedestal and the pedestal being sized less than 1.5 times the lateral extent of the longest width or longest depth of the component. The surface may be unlimited. The surface and/or pedestal is opaque or translucent. Any pattern may be provided, such as including a shaded grid. In other embodiments, no pattern is provided.
The camera position is above the surface of the information center. As the user navigates to view the component or information, the camera remains above the information center surface and below the surface of the 3D CAD. The camera may be positioned below the information center surface and pointed at that surface, such as for viewing the bottom of the component through a translucent surface. Alternatively, the component floats just above the surface or may be separated from an adjacent surface or from the surface to allow viewing the bottom while the camera is above the surface. The user may lift, rotate, activate, select, change, or otherwise control the component. The camera may be positioned above the 3D CAD surface to view the component and/or metadata of the information center.
The information is displayed on one or more plates. The panels or plates may be opaque, translucent, or transparent. Links, alphanumeric text, graphs, charts, and/or other representations of the information about the component are provided on the one or more virtual plates. The plates have any placement, such as around the surface boundary of the information center (e.g., as part of a wall). The plates are separated from the pedestal and component to avoid occluding the view of the component. The plates may be floating or supported (e.g., presented as a sign). The plates may be flat or curved. The user may interact with the plates or the content of the plates, such as navigating through the content on one or more plates (e.g., selecting information on one plate activates another plate and/or replaces text with new text).
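One illustrative way to place metadata plates around the pedestal without occluding the component is to distribute them evenly on a circle at the boundary of the information center floor. The circular layout is only one of the arbitrary placements mentioned above, not a required one, and the function names are assumptions:

```python
import math

def plate_positions(center, radius, count):
    """Return (x, y) floor positions for `count` plates spaced evenly
    on a circle of the given radius around the pedestal center, so no
    plate sits between the pedestal and any radial viewing direction's
    opposite side."""
    cx, cy = center
    return [(cx + radius * math.cos(2 * math.pi * i / count),
             cy + radius * math.sin(2 * math.pi * i / count))
            for i in range(count)]
```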
In act 60, on the display of the VR headset, the processor transitions from the view of the component and/or information to the 3D CAD of the arrangement. The same or a different trigger (e.g., thumbs up) is used. In response to detection of the trigger, the view transitions between the spaces to the other side of the surface. The transition changes the camera in only one dimension without changing orientation or scale, but other changes may be used. The user may transition the view to select a different component and/or to view the interaction or placement of the previously selected component relative to other parts of the 3D CAD. The user may change the component (e.g., its color) and transition to view the result of the change relative to the 3D CAD.
While the invention has been described above by reference to various embodiments, it should be understood that many changes and modifications may be made without departing from the scope of the invention. It is therefore intended that the foregoing detailed description be regarded as illustrative rather than limiting, and that it be understood that it is the following claims, including all equivalents, that are intended to define the spirit and scope of this invention.

Claims (20)

1. A system for display of a three-dimensional model (20) in virtual reality, the system comprising:
a memory (18) configured to store a three-dimensional model (20) of an object;
a virtual reality headset (22) configured to display a virtual environment to a user wearing the virtual reality headset (22); and
a processor (14) configured to generate the virtual environment, the virtual environment having (1) a representation of the three-dimensional model (20) positioned on a first side of a surface (34) and (2) a representation of a user-selected part of the three-dimensional model (20) and metadata for the user-selected part positioned on a second side of the surface (34).
2. The system of claim 1, wherein the three-dimensional model (20) comprises a computer-aided design model (20) of the object, the object comprising a plurality of interconnected components, the plurality of interconnected components including the user-selected part.
3. The system of claim 1, wherein the virtual reality headset (22) comprises a stereoscopic head-mounted display and inertial motion sensors.
4. The system of claim 1, wherein the virtual reality headset (22) comprises a gesture sensor, the processor (14) being configured to change a viewpoint of the user between the first side and the second side of the surface (34) in response to a gesture detected by the gesture sensor.
5. The system of claim 1, wherein the surface (34) is a ground plane, the three-dimensional model (20) rests on the ground plane, and the second side is below the ground plane, wherein the processor (14) is configured to change a viewpoint of the user between the first side and the second side as an elevator movement.
6. The system of claim 1, wherein the surface (34) is translucent.
7. The system of claim 1, wherein the representation of the metadata comprises plates (42) of images, text, or images and text of the metadata, the plates (42) being positioned around a perimeter region, and the representation of the user-selected part being positioned in an interior region.
8. The system of claim 1, wherein the representation of the user-selected part is disconnected from the three-dimensional model (20) and is displayed on a pedestal positioned on the second side of the surface (34).
9. The system of claim 1, wherein the processor (14) is configured to navigate a viewpoint around the three-dimensional model (20) and within the three-dimensional model and to select the user-selected part.
10. The system of claim 1, wherein the representation of the three-dimensional model (20) is unoccluded by the metadata.
11. A method for display of a three-dimensional model (20) with a virtual reality headset (22), the method comprising:
displaying (50), in the virtual reality headset (22), a first view of a three-dimensional computer-aided design of an arrangement of components resting on a floor of a first level of a three-dimensional space;
receiving (52) a user selection of a first component of the arrangement of the components;
transitioning (56) a user viewpoint from the first view through the floor of the first level to a second view on a side of the floor opposite the first view, the second view being of a second level of the three-dimensional space; and
displaying (58), in the virtual reality headset (22), the first component and information about the first component at the second view.
12. The method of claim 11, wherein the floor comprises a ground surface (34) on which the arrangement of the components rests, and wherein the transitioning (56) comprises transitioning (56) as an elevator, the elevator moving the user viewpoint down.
13. The method of claim 11, wherein the transitioning (56) comprises transitioning (56) without a change in user orientation.
14. The method of claim 11, further comprising detecting (54) a gesture with a gesture sensor, and wherein the transitioning (56) occurs in response to the detection of the gesture.
15. The method of claim 11, further comprising transitioning (60) from the second view back to the first view.
16. The method of claim 11, wherein displaying (58) the first component and the information comprises displaying (58) the first component on a pedestal and displaying (58) the information on virtual plates (42) separate from and positioned around the pedestal.
17. The method of claim 11, wherein the floor is translucent, and wherein displaying (58) the second view comprises displaying (58) the first component, the floor, and at least part of the arrangement of the components on the other side of the floor.
18. A virtual reality system comprising:
a stereoscopic display (26) configured to represent an interior of a computer-aided design model (20) in a first space (30) and to represent information about the computer-aided design model (20) in a second space (32) separate from the first space (30);
a user input sensor (12, 16) configured to receive interactive input from a user; and
a processor (14) configured to change, in response to the interactive input, a user focus from the interior to the information.
19. The virtual reality system of claim 18, wherein the interior is displayed without the information prior to the change, and wherein the information is displayed after the change, the information not occluding the interior.
20. The virtual reality system of claim 18, wherein a semi-transparent plane (34) separates the first space (30) from the second space (32), and wherein the change simulates an elevator moving from the first space (30) to the second space (32).
CN201780045798.3A 2016-06-22 2017-06-09 System and method for displaying three-dimensional model in virtual reality Active CN109478103B (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201662353073P 2016-06-22 2016-06-22
US62/353,073 2016-06-22
PCT/US2017/036721 WO2017222829A1 (en) 2016-06-22 2017-06-09 Display of three-dimensional model information in virtual reality

Publications (2)

Publication Number Publication Date
CN109478103A true CN109478103A (en) 2019-03-15
CN109478103B CN109478103B (en) 2022-07-08

Family

ID=59078246

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780045798.3A Active CN109478103B (en) 2016-06-22 2017-06-09 System and method for displaying three-dimensional model in virtual reality

Country Status (4)

Country Link
US (1) US10747389B2 (en)
EP (1) EP3458942B1 (en)
CN (1) CN109478103B (en)
WO (1) WO2017222829A1 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018223229A1 (en) 2017-06-05 2018-12-13 2689090 Canada Inc. System and method for displaying an asset of an interactive electronic technical publication synchronously in a plurality of extended reality display devices
US11727166B2 (en) * 2018-05-08 2023-08-15 Autodesk, Inc. Techniques for generating graph-based representations of complex mechanical assemblies
EP3594831A1 (en) * 2018-07-09 2020-01-15 VOLKE - Entwicklungsring SE Method for displaying cad data and computer program product
US11544425B2 (en) 2019-04-12 2023-01-03 Cnh Industrial America Llc Systems and methods for expediting design of physical components through use of computationally efficient virtual simulations
US11295046B2 (en) 2019-04-12 2022-04-05 Cnh Industrial America Llc Systems and methods for expediting design of physical components through use of computationally efficient virtual simulations
CN110503709A (en) * 2019-08-26 2019-11-26 杭州师范大学 A method of realizing that extensive Web3D model is presented in data center's load
CN111198615A (en) * 2020-03-04 2020-05-26 重庆一七科技开发有限公司 Portable wireless device for virtual reality experience
CN111523267B (en) * 2020-04-21 2023-05-23 重庆邮电大学 Fan main shaft structure optimization method based on parameterized finite element model
CN114385052B (en) * 2020-10-19 2023-10-20 聚好看科技股份有限公司 Dynamic display method of Tab column and three-dimensional display device
KR102422834B1 (en) * 2020-12-02 2022-07-19 (주)유비컴 Gesture recognition device using 3D virtual space and wrist band and recognition method using the same
US20230108256A1 (en) * 2021-08-11 2023-04-06 MeetKai, Inc. Conversational artificial intelligence system in a virtual reality space

Citations (5)

Publication number Priority date Publication date Assignee Title
US20090187389A1 (en) * 2008-01-18 2009-07-23 Lockheed Martin Corporation Immersive Collaborative Environment Using Motion Capture, Head Mounted Display, and Cave
CN101901290A (en) * 2010-07-22 2010-12-01 西北师范大学 Machine design method based on three-dimensional virtual space of integrated mechanism
US20130141428A1 (en) * 2011-11-18 2013-06-06 Dale L. Gipson Computer-implemented apparatus, system, and method for three dimensional modeling software
US20140002351A1 (en) * 2012-07-02 2014-01-02 Sony Computer Entertainment Inc. Methods and systems for interaction with an expanded information space
US20140085298A1 (en) * 2008-09-09 2014-03-27 Canon Kabushiki Kaisha Mixed reality space image providing apparatus

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
US20040104935A1 (en) * 2001-01-26 2004-06-03 Todd Williamson Virtual reality immersion system
US9729850B2 (en) * 2015-02-17 2017-08-08 Nextvr Inc. Methods and apparatus for receiving and/or using reduced resolution images
US10156908B2 (en) * 2015-04-15 2018-12-18 Sony Interactive Entertainment Inc. Pinch and hold gesture navigation on a head-mounted display
KR20160133230A (en) * 2015-05-12 2016-11-22 엘지전자 주식회사 Mobile terminal
WO2017082457A1 (en) * 2015-11-11 2017-05-18 엘지전자 주식회사 Hmd and method for controlling same

Patent Citations (5)

Publication number Priority date Publication date Assignee Title
US20090187389A1 (en) * 2008-01-18 2009-07-23 Lockheed Martin Corporation Immersive Collaborative Environment Using Motion Capture, Head Mounted Display, and Cave
US20140085298A1 (en) * 2008-09-09 2014-03-27 Canon Kabushiki Kaisha Mixed reality space image providing apparatus
CN101901290A (en) * 2010-07-22 2010-12-01 西北师范大学 Machine design method based on three-dimensional virtual space of integrated mechanism
US20130141428A1 (en) * 2011-11-18 2013-06-06 Dale L. Gipson Computer-implemented apparatus, system, and method for three dimensional modeling software
US20140002351A1 (en) * 2012-07-02 2014-01-02 Sony Computer Entertainment Inc. Methods and systems for interaction with an expanded information space

Also Published As

Publication number Publication date
CN109478103B (en) 2022-07-08
US20190179510A1 (en) 2019-06-13
EP3458942B1 (en) 2021-06-02
US10747389B2 (en) 2020-08-18
WO2017222829A1 (en) 2017-12-28
EP3458942A1 (en) 2019-03-27

Similar Documents

Publication Publication Date Title
CN109478103A (en) Displaying three-dimensional model information in virtual reality
US10928974B1 (en) System and method for facilitating user interaction with a three-dimensional virtual environment in response to user input into a control device having a graphical interface
US20220084279A1 (en) Methods for manipulating objects in an environment
US10521028B2 (en) System and method for facilitating virtual interactions with a three-dimensional virtual environment in response to sensor input into a control device having sensors
US10417812B2 (en) Systems and methods for data visualization using three-dimensional displays
US11275481B2 (en) Collaborative augmented reality system
US10678340B2 (en) System and method for providing user interface tools
US20190279424A1 (en) Collaborative augmented reality system
JP2022540315A (en) Virtual User Interface Using Peripheral Devices in Artificial Reality Environment
CN106687886B (en) Three-dimensional mixed-reality viewport
CN105637564B (en) Generating augmented reality content for an unknown object
Millette et al. DualCAD: integrating augmented reality with a desktop GUI and smartphone interaction
CN105359082B (en) System and method for user interface navigation
Beattie et al. Taking the LEAP with the Oculus HMD and CAD - Plucking at thin air?
JP6369842B2 (en) Multi-depth-interval refocusing method, apparatus, and electronic device
EP3814876B1 (en) Placement and manipulation of objects in augmented reality environment
US20220155881A1 (en) Sensing movement of a hand-held controller
Hernoux et al. A seamless solution for 3D real-time interaction: design and evaluation
Lacolina et al. Natural exploration of 3D models
Akdaş et al. Virtual Reality in Product Design: Researches between 1997 and 2021
WO2020084192A1 (en) Method, arrangement, and computer program product for three-dimensional visualization of augmented reality and virtual reality environments
JP2020077365A (en) Image display system, operating system, and control method for image display system
Lee et al. Mirage: A touch screen based mixed reality interface for space planning applications
US10768721B2 (en) Model controller
Gheorghe et al. Exploring interactions specific to mixed reality 3D modeling systems

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant