CN112270767A - Building virtual display control method and device, wearable device and storage medium - Google Patents


Info

Publication number
CN112270767A
Authority
CN
China
Prior art keywords
target
building
virtual display
user
action
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011239605.8A
Other languages
Chinese (zh)
Inventor
李磊
张媛
徐俊
谢晓旭
刘海棠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Chongqing Wisdom Source Technology Co ltd
Original Assignee
Chongqing Wisdom Source Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Chongqing Wisdom Source Technology Co ltd filed Critical Chongqing Wisdom Source Technology Co ltd
Priority claimed from CN202011239605.8A
Publication of CN112270767A
Current legal status: Pending


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00: Manipulating 3D models or images for computer graphics
    • G06T 19/006: Mixed reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/10: Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a building virtual display control method and device, a wearable device, and a storage medium. VR virtual display of a selected building is performed; the building components present in the current viewing-angle display area are determined; the operable target components among them are screened out; and the executable operable actions corresponding to the target components are determined. The icon corresponding to each target component and the icons corresponding to its executable operable actions are displayed, in association, in a set display area of the current VR virtual display image. The user's somatosensory information is monitored to obtain an operation instruction, which comprises a target operation component and a target operation action, and the target operation action is executed on the target operation component according to the instruction so as to update the VR virtual display image. The scheme is simple to operate: the user controls the corresponding building component through limb movements alone, so that the corresponding operation is carried out and intuitive display and quick operation are achieved.

Description

Building virtual display control method and device, wearable device and storage medium
Technical Field
The invention relates to the technical field of VR (virtual reality), and in particular to a building virtual display control method and device, a wearable device, and a storage medium.
Background
In the building design and construction stages, the internal and external structure of a building is displayed intuitively to help designers and builders better understand the building structure, especially for large building clusters. At present this design and display work is done mainly with 3D software, for example rendering and 3D display with 3ds Max. Compared with traditional paper documents, this is more efficient, more intuitive, and easier to store. However, it requires the personnel involved to have some grounding in the 3D software and reasonably mature operating experience; the software's functions are complex and hard to master in a short time, so it is unfriendly to beginners and occasional users, who cannot get a quick, timely, and intuitive understanding of the building structure. How to help beginners and occasional users with no prior experience to conveniently view and operate a 3D stereoscopic image of a building is therefore an important problem.
Disclosure of Invention
The invention provides a building virtual display control method and device, a wearable device, and a storage medium, and mainly solves the technical problem of how to help beginners and occasional users with no prior experience to conveniently view and operate a 3D stereoscopic image of a building.
In order to solve the above technical problem, the present invention provides a building virtual display control method, including:
performing VR virtual display of a selected building, determining the building components present in the current viewing-angle display area, screening out the operable target components, and determining the executable operable actions corresponding to the target components;
displaying, in association, the icon corresponding to each target component and the icons corresponding to its executable operable actions in a set display area of the current VR virtual display image;
monitoring the user somatosensory information and obtaining an operation instruction, the operation instruction comprising a target operation component and a target operation action;
and executing the target operation action on the target operation component according to the operation instruction, so as to update the VR virtual display image.
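Assuming components and instructions are represented as simple Python values (all names and data shapes below are illustrative, not taken from the patent's implementation), the four steps can be sketched as a single processing pass:

```python
# Illustrative sketch of the claimed steps; component names and the
# OPERABLE set are assumptions for demonstration only.

OPERABLE = {"door", "window", "furniture", "appliance"}

def process_frame(visible_components, instruction):
    """visible_components: {name: component_type}
    instruction: (target_name, action) recognised from somatosensory input, or None.
    Returns (icons_to_display, display_updates)."""
    # Screen out the operable target components in the current viewing area.
    targets = [n for n, t in visible_components.items() if t in OPERABLE]
    # Icons for the targets (and, in a real system, their executable actions)
    # would be shown in a set display area; names stand in for icons here.
    icons = sorted(targets)
    # Execute the recognised action on the target component, if valid,
    # producing updates that would refresh the VR display image.
    updates = {}
    if instruction is not None:
        name, action = instruction
        if name in targets:
            updates[name] = action
    return icons, updates

icons, updates = process_frame(
    {"door_1": "door", "wall_1": "wall"}, ("door_1", "open")
)
# icons == ["door_1"], updates == {"door_1": "open"}
```

A real device would run this per frame, with `visible_components` derived from the current viewing angle and `instruction` from the gesture recognizer described below in the embodiments.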
Optionally, the operable target component includes at least one of a door, a window, an elevator, a lighting device, furniture, and a home appliance, and the operable action includes at least one of opening, closing, moving, deleting, and size measurement.
Optionally, the monitoring of the user somatosensory information and the obtaining of the operation instruction include:
collecting an image of the user's limb and loading it into the VR virtual display image in real time for superimposed display; tracking the user's limb; when the trigger condition of a target component is met, taking that target component as the target operation component and simultaneously starting the limb-action recognition function; matching the user's limb action against the executable actions corresponding to the target operation component; and taking the executable action that matches the user's limb action as the target operation action.
Optionally, the process of detecting whether the trigger condition of a target component is met includes:
when a specific part of the user's limb is located over the icon area corresponding to that target component and stays there for more than a set length of time, judging that the trigger condition of the target component is met.
Optionally, the user somatosensory information includes a user gesture.
Optionally, the specific part of the user's limb is the tip of the user's index finger.
The present invention also provides a building virtual display control apparatus, including:
a display module, used for performing VR virtual display of the selected building, and for displaying, in association and in a set display area, the icons corresponding to the target components present in the current viewing-angle display area and the icons corresponding to their executable operable actions;
a processing module, used for determining the building components present in the current viewing-angle display area, screening out the operable target components, and determining the executable operable actions corresponding to the target components; and for executing the target operation action on the target operation component according to the operation instruction, so as to update the VR virtual display image;
a somatosensory monitoring module, used for monitoring the user somatosensory information and obtaining an operation instruction, the operation instruction comprising a target operation component and a target operation action.
The invention also provides wearable equipment comprising the building virtual display control device.
Optionally, the wearable device is VR glasses or a VR helmet.
The present invention also provides a storage medium storing one or more programs executable by one or more processors to implement the steps of the building virtual display control method as described above.
The invention has the beneficial effects that:
according to the building virtual display control method, the building virtual display control device, the wearable equipment and the storage medium, VR virtual display is carried out on a selected building base, building components existing in a current visual angle display area are determined based on the current visual angle display area, target components capable of being operated are screened out, and operable actions capable of being executed corresponding to the target components are determined; displaying icons corresponding to the target component and icons corresponding to the operable actions which can be executed and correspond to the target component in a set display area of the current VR virtual display image in a related manner; monitoring the somatosensory information of the user and acquiring an operation instruction; the operation instruction comprises a target operation component and a target operation action; and executing the target operation action on the target operation component according to the operation instruction so as to update the VR virtual display image. The scheme is simple to operate, and the user can control the corresponding building component by triggering the limb, so that the corresponding operation of the related building component is realized, and the aims of visual display and quick operation of the 3D building image are further fulfilled.
Drawings
Fig. 1 is a schematic flow chart of a building virtual display control method according to a first embodiment of the present invention;
fig. 2 is a schematic structural diagram of a virtual display control apparatus for a building according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram of a wearable device according to a third embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following detailed description and accompanying drawings. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Example one:
the present embodiment provides a building virtual display control method; referring to fig. 1, the method includes the following steps:
and S101, performing VR virtual display on the selected building base.
The selected building is stored in a data form readable by computer software for reading and displaying by VR virtual display equipment, and a user can view a 3D virtual image of the building by wearing the VR virtual display equipment.
S102, determining the building components present in the current viewing-angle display area.
The area of the building's 3D virtual image corresponding to the user's current viewing angle is displayed, and the building components present in that area are obtained by a module such as the device's processor or controller. It should be understood that a building component is any part required to make up the building as a whole, including but not limited to floors, walls, beams, columns, water and electrical pipelines, doors, windows, furniture, and home appliances. Based on their different properties, components are divided into those that can be operated by a viewer and those that cannot. Floors, walls, beams, columns, and water and electrical pipelines are fixed in position and function once the design is complete, and are therefore not operable. Doors, windows, furniture, and home appliances can be opened, closed, or moved, and are therefore operable components.
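The operable/non-operable split described above amounts to a simple screening step. In the hypothetical sketch below, the type names and record format are assumptions for illustration; a real system would read these attributes from the building model itself:

```python
# Component types assumed from the examples in the text, not from the
# patent's actual data model.
OPERABLE_TYPES = {"door", "window", "furniture", "appliance"}
FIXED_TYPES = {"floor", "wall", "beam", "column", "pipeline"}

def screen_operable(components):
    """Keep only the components a viewer is allowed to operate."""
    return [c for c in components if c["type"] in OPERABLE_TYPES]

visible = [
    {"id": "wall_3", "type": "wall"},
    {"id": "door_2", "type": "door"},
    {"id": "sofa_1", "type": "furniture"},
]
targets = screen_operable(visible)
# targets keeps door_2 and sofa_1; wall_3 is fixed and dropped
```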
S103, screening out the operable target components and determining the executable operable actions corresponding to each target component.
The executable operable actions of an operable target component are determined from its attributes. A door or a window, for example, has two states, open and closed, and correspondingly two actions, opening and closing. A furniture component's position can be designed flexibly, so it can be moved; since it is movable, it can of course also be deleted, which corresponds to removing it from the virtual image. A home appliance has the two states on (started) and off (stopped), so its operable actions include opening and closing; its position can also be designed flexibly, so it can likewise be moved and deleted.
For any building component, the operable actions also include size measurement: every component occupies a specific amount of space and can therefore be measured, which helps the user understand the space occupied by each part of the building.
The operable target component includes at least one of a door, a window, an elevator, a lighting device, furniture, and a home appliance, and the operable actions include at least one of opening, closing, moving, deleting, and size measurement.
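The attribute-to-action reasoning above amounts to a lookup table. The table below is a hypothetical encoding (the type and action names are assumed), with size measurement available for every component as the text describes:

```python
# Hypothetical action table derived from the examples in the text.
ACTIONS_BY_TYPE = {
    "door":      {"open", "close"},
    "window":    {"open", "close"},
    "furniture": {"move", "delete"},
    "appliance": {"open", "close", "move", "delete"},
}

def executable_actions(component_type):
    """Size measurement applies to every building component; the other
    actions depend on the component type."""
    return ACTIONS_BY_TYPE.get(component_type, set()) | {"measure"}

# executable_actions("door") -> {"open", "close", "measure"}
# executable_actions("wall") -> {"measure"}
```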
S104, displaying, in association, the icon corresponding to each target component and the icons corresponding to its executable operable actions in a set display area of the current VR virtual display image.
To support viewing and operation, the system pre-constructs and stores a mapping between each building component and a component icon, and between each operable action and an action icon. For the operable target components present in the current viewing angle, the corresponding icons are retrieved from this mapping and superimposed on a set area of the building's 3D virtual image, so that while viewing the 3D virtual image the user can also see which components in the current visual range are operable. Each component icon is displayed in association with the icons of its executable operable actions.
To prevent the component and action icons from obscuring the normal building view, the set area may be placed at the edge of the view, for example the left-hand area, with icons arranged sequentially from top to bottom.
If several identical components are located at different positions within the visual range, a single component icon may be displayed for all of them to reduce the display area. When the user wants to operate one of these components, the user first moves a finger onto the area of the component icon to trigger an operation intent; the system then highlights all components at the different positions corresponding to that icon, for example by changing their colour or enlarging them, so that the user can confirm which specific component to operate. When the user then moves a finger onto a specific component, that component is selected, and the user performs different gestures to issue different control actions. Different gestures correspond to different operation actions: "open" can be performed by opening the palm; "close" by making a fist; "move" by continuously displacing the palm over a certain distance; "delete" by displacing the palm over a certain distance and then closing it into a fist; and "size measurement" by a pinching motion of the thumb and index finger.
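The gesture-to-action correspondence just described can be sketched as a small dispatch table. The gesture labels below are assumed outputs of a separate hand-pose recognizer, not part of the patent:

```python
# Assumed gesture labels; a real recognizer would classify hand poses
# from the collected limb images.
GESTURE_TO_ACTION = {
    "open_palm":          "open",
    "fist":               "close",
    "palm_displacement":  "move",
    "displace_then_fist": "delete",
    "thumb_index_pinch":  "measure",
}

def resolve_action(gesture, target_actions):
    """Map a recognised gesture to an action, accepting it only when the
    selected target component actually supports that action."""
    action = GESTURE_TO_ACTION.get(gesture)
    return action if action in target_actions else None

# resolve_action("fist", {"open", "close"}) -> "close"
# resolve_action("palm_displacement", {"open", "close"}) -> None
```

Gating the gesture against the target's own action set mirrors the matching step in S105: a limb action only becomes the target operation action if it corresponds to one of the target component's executable actions.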
S105, monitoring the user somatosensory information and obtaining an operation instruction; the operation instruction includes a target operation component and a target operation action.
In this embodiment, the user somatosensory information includes user gestures; optionally, it may also include eyeball (gaze) information, motion information of other body parts, and the like.
An image of the user's limb is collected and loaded into the VR virtual display image in real time for superimposed display, and the user's limb is tracked. When the trigger condition of a target component is detected to be met, that component is taken as the target operation component and the limb-action recognition function is started at the same time; the user's limb action is matched against the executable actions corresponding to the target operation component, and the matching executable action is taken as the target operation action.
Detecting whether the trigger condition of a target component is met proceeds as follows: when a specific part of the user's limb is located over the icon area corresponding to that target component and stays there for more than a set length of time, the trigger condition of that target component is judged to be met.
The specific part of the user's limb is, for example, the tip of the user's index finger.
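The dwell-based trigger condition can be sketched as follows; the sample format, the icon rectangle, and the one-second threshold are assumptions for illustration:

```python
DWELL_SECONDS = 1.0  # assumed value for the "set length of time"

def dwell_triggered(samples, icon_rect, threshold=DWELL_SECONDS):
    """samples: time-ordered (t, x, y) fingertip positions in screen space.
    icon_rect: (x0, y0, x1, y1) rectangle of a component icon.
    Returns True once the fingertip has stayed inside the rectangle
    continuously for at least `threshold` seconds."""
    x0, y0, x1, y1 = icon_rect
    entered_at = None
    for t, x, y in samples:
        if x0 <= x <= x1 and y0 <= y <= y1:
            if entered_at is None:
                entered_at = t  # fingertip just entered the icon area
            if t - entered_at >= threshold:
                return True
        else:
            entered_at = None  # leaving the area resets the dwell timer
    return False

# Staying on the icon from t=0.0 to t=1.2 seconds meets the condition;
# leaving the icon area partway through resets the timer.
```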
S106, executing the target operation action on the target operation component according to the operation instruction, so as to update the VR virtual display image.
According to the building virtual display control method provided by the invention, VR virtual display of a selected building is performed; the building components present in the current viewing-angle display area are determined; the operable target components are screened out; and the executable operable actions corresponding to the target components are determined. The icon corresponding to each target component and the icons corresponding to its executable operable actions are displayed, in association, in a set display area of the current VR virtual display image. The user somatosensory information is monitored to obtain an operation instruction comprising a target operation component and a target operation action, and the target operation action is executed on the target operation component according to the instruction so as to update the VR virtual display image. The scheme is simple to operate: the user controls the corresponding building component through limb movements alone, so that the corresponding operation on the building component is carried out and intuitive display and quick operation of the 3D building image are achieved.
Example two:
on the basis of the first embodiment, this embodiment provides a building virtual display control device for implementing the steps of the building virtual display control method of the first embodiment; referring to fig. 2, the device includes the following modules:
the display module 21 is configured to perform VR virtual display of the selected building, and to display, in association and in a set display area, the icons corresponding to the target components present in the current viewing-angle display area and the icons corresponding to their executable operable actions;
the processing module 22 is configured to determine the building components present in the current viewing-angle display area, screen out the operable target components, and determine the executable operable actions corresponding to the target components; and to execute the target operation action on the target operation component according to the operation instruction, so as to update the VR virtual display image;
the somatosensory monitoring module 23 is configured to monitor the user somatosensory information and obtain an operation instruction; the operation instruction includes a target operation component and a target operation action.
For details, please refer to the description in the first embodiment, which is not repeated herein.
Example three:
on the basis of the second embodiment, this embodiment provides a wearable device; referring to fig. 3, the wearable device is VR glasses or a VR helmet and includes the building virtual display control device of the second embodiment.
Example four:
the present embodiment provides a computer-readable storage medium on the basis of the first embodiment, where the computer-readable storage medium stores one or more programs, and the one or more programs are executable by one or more processors to implement the steps of the building virtual display control method according to the first embodiment. For details, please refer to the description in the first embodiment, which is not repeated herein.
It will be apparent to those skilled in the art that the modules or steps of the invention described above may be implemented on a general-purpose computing device; they may be centralized on a single computing device or distributed across a network of computing devices. Optionally, they may be implemented in program code executable by a computing device, so that they can be stored on a computer storage medium (ROM/RAM, magnetic disk, optical disc) and executed by a computing device; in some cases the steps shown or described may be performed in a different order from that described here. They may also be fabricated as individual integrated-circuit modules, or several of them may be fabricated as a single integrated-circuit module. The invention is thus not limited to any specific combination of hardware and software.
The foregoing describes the invention in further detail with reference to specific embodiments, but the practice of the invention is not to be considered limited to these descriptions. For those skilled in the art to which the invention pertains, several simple deductions or substitutions can be made without departing from the spirit of the invention, and all of these shall be considered to fall within the protection scope of the invention.

Claims (10)

1. A building virtual display control method, characterized by comprising:
performing VR virtual display of a selected building, determining the building components present in the current viewing-angle display area, screening out the operable target components, and determining the executable operable actions corresponding to the target components;
displaying, in association, the icon corresponding to each target component and the icons corresponding to its executable operable actions in a set display area of the current VR virtual display image;
monitoring the user somatosensory information and obtaining an operation instruction, the operation instruction comprising a target operation component and a target operation action;
and executing the target operation action on the target operation component according to the operation instruction, so as to update the VR virtual display image.
2. The building virtual display control method according to claim 1, wherein the operable target component includes at least one of a door, a window, an elevator, a lighting device, furniture, and a home appliance, and the operable action includes at least one of opening, closing, moving, deleting, and size measurement.
3. The building virtual display control method according to claim 1 or 2, wherein the monitoring of the user somatosensory information and the obtaining of the operation instruction include:
collecting an image of the user's limb and loading it into the VR virtual display image in real time for superimposed display; tracking the user's limb; when the trigger condition of a target component is met, taking that target component as the target operation component and simultaneously starting the limb-action recognition function; matching the user's limb action against the executable actions corresponding to the target operation component; and taking the executable action that matches the user's limb action as the target operation action.
4. The building virtual display control method according to claim 3, wherein the process of detecting whether the trigger condition of a target component is met includes:
when a specific part of the user's limb is located over the icon area corresponding to that target component and stays there for more than a set length of time, judging that the trigger condition of the target component is met.
5. The building virtual display control method of claim 4, wherein the user somatosensory information comprises a user gesture.
6. The method for controlling virtual display of a building according to claim 5, wherein the specific portion of the limb of the user is a tip of an index finger of the user.
7. A building virtual display control device, characterized by comprising:
a display module, used for performing VR virtual display of the selected building, and for displaying, in association and in a set display area, the icons corresponding to the target components present in the current viewing-angle display area and the icons corresponding to their executable operable actions;
a processing module, used for determining the building components present in the current viewing-angle display area, screening out the operable target components, and determining the executable operable actions corresponding to the target components; and for executing the target operation action on the target operation component according to the operation instruction, so as to update the VR virtual display image;
a somatosensory monitoring module, used for monitoring the user somatosensory information and obtaining an operation instruction, the operation instruction comprising a target operation component and a target operation action.
8. A wearable device comprising the building virtual display control apparatus of claim 7.
9. The wearable device of claim 8, wherein the wearable device is VR glasses or a VR headset.
10. A storage medium storing one or more programs executable by one or more processors to implement the steps of the building virtual display control method according to any one of claims 1 to 6.
CN202011239605.8A 2020-11-09 2020-11-09 Building virtual display control method and device, wearable device and storage medium Pending CN112270767A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011239605.8A CN112270767A (en) 2020-11-09 2020-11-09 Building virtual display control method and device, wearable device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202011239605.8A CN112270767A (en) 2020-11-09 2020-11-09 Building virtual display control method and device, wearable device and storage medium

Publications (1)

Publication Number Publication Date
CN112270767A true CN112270767A (en) 2021-01-26

Family

ID=74339657

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011239605.8A Pending CN112270767A (en) 2020-11-09 2020-11-09 Building virtual display control method and device, wearable device and storage medium

Country Status (1)

Country Link
CN (1) CN112270767A (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106251185A * 2016-08-24 2016-12-21 四川见山科技有限责任公司 UE4-engine-based VR real-estate display and interaction system
CN109657387A * 2018-12-27 2019-04-19 重庆上丞科技有限公司 Mixed-reality-scene-based method for orienting and placing home-furnishing models
US10489931B1 * 2016-03-02 2019-11-26 Meta View, Inc. Systems and methods for reducing processing load when simulating user interaction with virtual objects in an augmented reality space and/or evaluating user interaction with virtual objects in an augmented reality space
CN111611647A * 2020-05-05 2020-09-01 李力 Virtual-real interactive vehicle display system and method of use
CN111862341A * 2020-07-09 2020-10-30 北京市商汤科技开发有限公司 Virtual object driving method and device, display device, and computer storage medium
CN111880720A * 2020-07-31 2020-11-03 北京市商汤科技开发有限公司 Virtual display method, device, and equipment, and computer-readable storage medium


Similar Documents

Publication Publication Date Title
JP6659644B2 (en) Low latency visual response to input by pre-generation of alternative graphic representations of application elements and input processing of graphic processing unit
EP2631739B1 (en) Contactless gesture-based control method and apparatus
CN110162236B (en) Display method and device between virtual sample boards and computer equipment
TWI623877B (en) Virtual reality device and virtual reality method
EP2972669B1 (en) Depth-based user interface gesture control
WO2017222208A1 (en) Remote hover touch system and method
KR101554082B1 (en) Natural gesture based user interface methods and systems
EP2638461B1 (en) Apparatus and method for user input for controlling displayed information
WO2013118373A1 (en) Image processing apparatus, image processing method, and program
EP2717140B1 (en) Equipment control device, operation reception method, and program
KR20130108604A (en) Apparatus and method for user input for controlling displayed information
JP2012003742A (en) Input device, input method, program and recording medium
US9544556B2 (en) Projection control apparatus and projection control method
CN103631893B (en) A kind of browser control method and browser
CN102650906B (en) A kind of control method of user interface and device
US20150054784A1 (en) Method and apparatus for executing application using multiple input tools on touchscreen device
US20150234567A1 (en) Information processing apparatus, information processing method and computer program
CN113680065B (en) Map processing method and device in game
CN107450804A (en) A kind of method and terminal for responding touch control operation
CN113849112A (en) Augmented reality interaction method and device suitable for power grid regulation and control and storage medium
CN109144390A (en) Information processing equipment and information processing method
CN105046748B (en) The 3D photo frame apparatus of image can be formed in a kind of three-dimensional geologic scene
CN112270767A (en) Building virtual display control method and device, wearable device and storage medium
KR101188871B1 (en) Touch screen apparatus for patients with low vision and the method of displaying the same
CN104951211A (en) Information processing method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210126