CN109782910B - VR scene interaction method and device - Google Patents

VR scene interaction method and device

Publication number: CN109782910B
Application number: CN201811642097.0A
Authority: CN (China)
Prior art keywords: scene, user, prop, virtual, target
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis)
Inventors: 郭瑽, 刘晨阳, 戴若犁
Assignee (original and current): BEIJING NOITOM TECHNOLOGY Ltd
Original language: Chinese (zh)
Other versions: CN109782910A
Priority: CN201811642097.0A
Classifications:
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)
Abstract

The application relates to a VR scene interaction method and device. The method comprises: acquiring the position of a user in a real scene, as collected by a position acquisition device; displaying a user position marker centered on the user position in a VR scene corresponding to the real scene; and, if the user position in the real scene changes, controlling the user position marker in the VR scene to move with the change. According to the embodiments of the invention, the user's position in the VR scene is determined from the user's position in the real scene, a user position marker is displayed at that position, and the marker moves as the user moves, so that the user always knows where they are in the VR scene. This strengthens the sense of immersion in the VR scene and improves the experience of interacting with it.

Description

VR scene interaction method and device
Technical Field
The application relates to the technical field of virtual reality, and in particular to a VR scene interaction method and device.
Background
With the development of the internet, Virtual Reality (VR) technology has become a computer simulation technology with which a virtual world can be created and experienced: a computer generates a simulated environment that fuses multi-source information into interactive three-dimensional dynamic views and simulated entity behaviors, immersing the user in that environment.
However, current interactive scenes generally ignore the real environment in which the user is located, so the user's sense of immersion is weak.
Disclosure of Invention
To solve, or at least partially solve, the above technical problem, the present application provides a VR scene interaction method and apparatus.
In a first aspect, the present application provides a VR scene interaction method, including:
acquiring the position of a user in a real scene, as collected by a position acquisition device;
displaying a user position marker centered on the user position in a VR scene corresponding to the real scene;
and if the user position in the real scene changes, controlling the user position marker in the VR scene to move with the change of the user position.
Optionally, the method further comprises:
determining whether the moving speed of the user exceeds a preset speed threshold;
and if the moving speed of the user exceeds the preset speed threshold, displaying a high-speed-movement special effect based on the user position marker.
Optionally, the method further comprises:
acquiring environment information collected by an environment acquisition device in the real scene;
creating virtual background props in a target VR scene based on the environment information;
determining a target prop in the real scene based on the environment information;
and matching a virtual interaction prop preset in a VR scene template with the target prop to obtain the target VR scene.
Optionally, the method further comprises:
determining whether the user position marker enters a safe area where the target prop is located;
and if the user position marker enters the safe area where the target prop is located, controlling virtual props in the VR scene to collide with the boundary of the safe area.
Optionally, matching the virtual interaction prop preset in the VR scene template with the target prop includes:
determining the actual shape of the target prop based on the environment information;
matching the virtual shape of the virtual interaction prop with the actual shape of the target prop;
determining the actual center position of the target prop in the real scene based on the environment information;
and matching the virtual center position of the virtual interaction prop with the actual center position.
In a second aspect, an embodiment of the present invention further provides a VR scene interaction apparatus, including:
an acquisition module, configured to acquire the position of the user in the real scene as collected by the position acquisition device;
a first display module, configured to display a user position marker centered on the user position in a VR scene corresponding to the real scene;
and a first control module, configured to control the user position marker in the VR scene to move with the change of the user position if the user position in the real scene changes.
Optionally, the apparatus further comprises:
a judging module, configured to determine whether the moving speed of the user exceeds a preset speed threshold;
and a second display module, configured to display a high-speed-movement special effect based on the user position marker if the moving speed of the user exceeds the preset speed threshold.
Optionally, the apparatus further comprises:
an acquisition module, configured to acquire environment information collected by the environment acquisition device in the real scene;
a creating module, configured to create virtual background props in a target VR scene based on the environment information;
a determining module, configured to determine a target prop in the real scene based on the environment information;
and a matching module, configured to match a virtual interaction prop preset in the VR scene template with the target prop to obtain the target VR scene.
Optionally, the apparatus further comprises:
a judging module, configured to determine whether the user position marker enters a safe area where the target prop is located;
and a second control module, configured to control virtual props in the VR scene to collide with the boundary of the safe area if the user position marker enters the safe area where the target prop is located.
Optionally, the matching module includes:
a shape determination unit, configured to determine the actual shape of the target prop based on the environment information;
a shape matching unit, configured to match the virtual shape of the virtual interaction prop with the actual shape of the target prop;
a position determination unit, configured to determine the actual center position of the target prop in the real scene based on the environment information;
and a position matching unit, configured to match the virtual center position of the virtual interaction prop with the actual center position.
Compared with the prior art, the technical scheme provided by the embodiment of the application has the following advantages:
according to the embodiment of the invention, the position of the user in the VR scene can be determined according to the position of the user in the real scene, the user position mark is displayed at the position of the user, and the user position mark can move along with the movement of the user, so that the user can know the position of the user in the VR scene, the immersion feeling of the VR scene is improved, and the experience of interaction with the VR scene is improved.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the invention and together with the description, serve to explain the principles of the invention.
To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings needed for describing the embodiments or the prior art are briefly introduced below; those skilled in the art can obtain other drawings from these drawings without inventive effort.
Fig. 1 is a flowchart of a VR scene interaction method according to an embodiment of the present application;
fig. 2 is a schematic view of an application scenario provided in an embodiment of the present application;
fig. 3 is a schematic view of another application scenario provided in the embodiment of the present application;
fig. 4 is a schematic view of another application scenario provided in the embodiment of the present application;
fig. 5 is a structural diagram of a VR scene interaction apparatus according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Because current interactive scenes generally ignore the real environment in which the user is located, the user's sense of immersion is weak. The VR scene interaction method and apparatus provided in the embodiments of the present application may therefore be applied to a processing device, such as a computer. The processing device may be communicatively connected with an environment acquisition device and a VR interaction device: the environment acquisition device scans the real scene to obtain environment information, and the VR interaction device may be provided with a position acquisition apparatus, a speed acquisition apparatus, and the like. As shown in fig. 1, the method includes:
step S101, acquiring the position of a user in a real scene acquired by a position acquisition device;
in the embodiment of the present invention, a real scene may refer to a space where a user performs VR interaction, and a user position may refer to an (X, Y) coordinate with a center of the real scene as an origin; it may also refer to (X, Y) coordinates with an origin at any corner of the real scene, etc.
Step S102, displaying a user position marker centered on the user position in a VR scene corresponding to the real scene;
For example, referring to fig. 2, the user position marker may be a ring of light, a human body model, or the like, and may be set according to actual needs; the present invention does not limit this.
In one embodiment of the present invention, to help the user know their own position in the VR scene, the user position marker may accompany the user from the moment the user logs into the VR scene.
In another embodiment of the invention, to save system resources of the processing device, the user position marker may be displayed only when the user enters the VR scene for certain interactions, and hidden at other times.
Step S103, if the user position in the real scene changes, controlling the user position marker in the VR scene to move with the change of the user position.
In one embodiment of the present invention, to help the user know their own position in the VR scene, the user position marker may move in real time with the user's movement in the real scene. For example, if the user is at coordinates (10, 50) at a first time, at (20, 55) at a second time, at (30, 60) at a third time, ... and at (100, 100) at a tenth time, then the user position marker likewise starts from (10, 50) at the first time, passes through (20, 55) at the second time and (30, 60) at the third time, ... and reaches (100, 100) at the tenth time, in real time. Because the marker occupies the same position as the user at every moment, its motion trajectory is also the same.
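As a hedged sketch of the real-time following behavior described above (the class and method names are illustrative assumptions, not from the patent):

```python
class UserPositionMarker:
    """Sketch of a VR marker that mirrors the user's tracked real-world position."""

    def __init__(self, position):
        self.position = position
        self.trajectory = [position]  # the marker visits the same positions as the user

    def on_position_changed(self, new_position):
        # Invoked each time the position acquisition device reports a new position.
        self.position = new_position
        self.trajectory.append(new_position)

# Replay the example from the text: (10, 50) -> (20, 55) -> (30, 60) -> (100, 100).
marker = UserPositionMarker((10, 50))
for reported in [(20, 55), (30, 60), (100, 100)]:
    marker.on_position_changed(reported)
print(marker.position)  # (100, 100)
```

Because the marker simply replays the reported positions, its trajectory is identical to the user's, as the text notes.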
According to the embodiments of the invention, the user's position in the VR scene can be determined from the user's position in the real scene, a user position marker is displayed at that position, and the marker moves as the user moves, so that the user always knows where they are in the VR scene; this strengthens the sense of immersion in the VR scene and improves the experience of interacting with it.
In practical applications, a user may move quickly. If the user intends to perform a collision interaction with a virtual interaction prop in the VR scene, moving too fast may cause the user to actually collide, in the real scene, with the target prop corresponding to that virtual prop, threatening the user's safety. For this reason, in another embodiment of the present invention, the method further includes:
determining whether the moving speed of the user exceeds a preset speed threshold;
In the embodiment of the invention, the moving speed of the user can be collected by a speed acquisition apparatus in the VR interaction device held or worn by the user, and the preset speed threshold can be a maximum safe moving speed obtained by analyzing, in advance, users moving at different speeds.
and if the moving speed of the user exceeds the preset speed threshold, displaying a high-speed-movement special effect based on the user position marker.
In the embodiment of the invention, the high-speed-movement special effect can be an enlarged version of the user position marker, or a protective cover displayed around the user, prompting the user that collision contact with the virtual interaction prop is about to occur, so that the user can stop in front of the target prop in time and avoid an actual collision.
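The speed check and effect selection described above can be sketched as follows; the threshold value, effect names, and function signature are assumptions for illustration only:

```python
def high_speed_effect(prev_pos, cur_pos, dt, speed_threshold):
    """Pick the marker effect to display based on the user's moving speed."""
    dx = cur_pos[0] - prev_pos[0]
    dy = cur_pos[1] - prev_pos[1]
    speed = (dx * dx + dy * dy) ** 0.5 / dt  # Euclidean distance per unit time
    # Exceeding the preset threshold triggers the high-speed special effect,
    # e.g. an enlarged position marker or a protective cover around the user.
    return "enlarged_marker" if speed > speed_threshold else "normal_marker"

# Moving 5 m in 1 s against an assumed 2 m/s threshold triggers the effect.
print(high_speed_effect((0, 0), (3, 4), 1.0, 2.0))  # enlarged_marker
```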
In yet another embodiment of the present invention, the method further comprises:
acquiring environment information collected by an environment acquisition device in the real scene;
In the embodiment of the invention, the environment acquisition device can scan the position and object information of each object in the real scene, such as walls, doors, murals, kettles, and televisions.
creating virtual background props in a target VR scene based on the environment information;
In the embodiment of the invention, virtual background props can be created in the VR scene from those real-scene props that do not need to interact with the user in the VR scene.
determining a target prop in the real scene based on the environment information;
The target prop may refer to a prop in the real scene that interacts with the user in the VR scene; for example, a carpet, a sofa, or a seat.
and matching a virtual interaction prop preset in a VR scene template with the target prop to obtain the target VR scene.
In this step: the actual shape of the target prop is determined based on the environment information; the virtual shape of the virtual interaction prop is matched with that actual shape; the actual center position of the target prop in the real scene is determined based on the environment information; and the virtual center position of the virtual interaction prop is matched with that actual center position.
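The four matching sub-steps above can be sketched as a single alignment function; the prop representation (a dict with shape and center fields) is an assumption for illustration, not the patent's data model:

```python
def match_prop(virtual_prop, target_shape, target_center):
    """Align a template's virtual interaction prop with a scanned real prop."""
    matched = dict(virtual_prop)
    matched["shape"] = target_shape    # match virtual shape to the actual shape
    matched["center"] = target_center  # match virtual center to the actual center
    return matched

# Assumed example: a scanned sofa of 2.0 x 0.9 x 0.8 m centered at (1.5, 3.0).
sofa = match_prop({"name": "virtual_sofa", "shape": None, "center": None},
                  target_shape=(2.0, 0.9, 0.8), target_center=(1.5, 3.0))
print(sofa["center"])  # (1.5, 3.0)
```

With shape and center aligned, touching the virtual prop in the VR scene coincides with touching the real target prop.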
The embodiment of the invention can construct the VR scene from the environment information of the real scene, fusing the VR scene with the real scene; this improves the user's immersion in the VR scene, achieves a higher degree of fusion, and enhances the user experience.
In a further embodiment of the invention, referring to fig. 3 and 4, when the user interacts with the VR scene, the user may come under collision attack from a virtual object in the VR scene; for example, in a game stage, a bullet may fly toward the user. The method therefore further comprises:
determining whether the user position marker enters a safe area where the target prop is located;
and if the user position marker enters the safe area where the target prop is located, controlling virtual props in the VR scene to collide with the boundary of the safe area.
According to the embodiment of the invention, a safe area can be constructed around the target prop to block collision damage from virtual objects, enriching the VR interaction scene and making it more convenient for users.
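A hedged sketch of the safe-area check described above, assuming a circular area around the target prop (the shape of the area and all names are illustrative, not specified by the patent):

```python
def marker_in_safe_area(marker_pos, area_center, area_radius):
    """Check whether the user position marker lies inside a prop's safe area."""
    dx = marker_pos[0] - area_center[0]
    dy = marker_pos[1] - area_center[1]
    return dx * dx + dy * dy <= area_radius * area_radius

def resolve_attack(marker_pos, area_center, area_radius):
    # Virtual projectiles collide with the safe-area boundary instead of the user.
    if marker_in_safe_area(marker_pos, area_center, area_radius):
        return "collide_with_boundary"
    return "reach_user"

# A user standing about 1.4 m from the prop center, inside an assumed 2 m radius.
print(resolve_attack((1.0, 1.0), (0.0, 0.0), 2.0))  # collide_with_boundary
```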
In another embodiment of the present invention, there is also provided a VR scene interaction apparatus, as shown in fig. 5, including:
an acquisition module 11, configured to acquire the position of the user in the real scene as collected by the position acquisition device;
a first display module 12, configured to display a user position marker centered on the user position in a VR scene corresponding to the real scene;
and a first control module 13, configured to control the user position marker in the VR scene to move with the change of the user position if the user position in the real scene changes.
In yet another embodiment of the present invention, the apparatus further comprises:
a judging module, configured to determine whether the moving speed of the user exceeds a preset speed threshold;
and a second display module, configured to display a high-speed-movement special effect based on the user position marker if the moving speed of the user exceeds the preset speed threshold.
In another embodiment of the present invention, the apparatus further comprises:
an acquisition module, configured to acquire environment information collected by the environment acquisition device in the real scene;
a creating module, configured to create virtual background props in a target VR scene based on the environment information;
a determining module, configured to determine a target prop in the real scene based on the environment information;
and a matching module, configured to match a virtual interaction prop preset in the VR scene template with the target prop to obtain the target VR scene.
In yet another embodiment of the present invention, the apparatus further comprises:
a judging module, configured to determine whether the user position marker enters a safe area where the target prop is located;
and a second control module, configured to control virtual props in the VR scene to collide with the boundary of the safe area if the user position marker enters the safe area where the target prop is located.
In another embodiment of the present invention, the matching module includes:
a shape determination unit, configured to determine the actual shape of the target prop based on the environment information;
a shape matching unit, configured to match the virtual shape of the virtual interaction prop with the actual shape of the target prop;
a position determination unit, configured to determine the actual center position of the target prop in the real scene based on the environment information;
and a position matching unit, configured to match the virtual center position of the virtual interaction prop with the actual center position.
It is noted that, in this document, relational terms such as "first" and "second" may be used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between those entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such a process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises that element.
The foregoing are merely exemplary embodiments of the present invention, which enable those skilled in the art to understand or practice the present invention. Various modifications to these embodiments will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other embodiments without departing from the spirit or scope of the invention. Thus, the present invention is not intended to be limited to the embodiments shown herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (8)

1. A VR scene interaction method, comprising:
acquiring the position of a user in a real scene, as collected by a position acquisition device;
displaying, in a VR scene corresponding to the real scene, a user position marker centered on the user position when the user enters the VR scene for interaction;
if the user position in the real scene changes, controlling the user position marker in the VR scene to move with the change of the user position;
the method further comprising:
determining whether the moving speed of the user exceeds a preset speed threshold;
and if the moving speed of the user exceeds the preset speed threshold, displaying a high-speed-movement special effect based on the user position marker, wherein the high-speed-movement special effect is an enlarged version of the user position marker, and the user position marker is a ring of light.
2. The VR scene interaction method of claim 1, further comprising:
acquiring environment information collected by an environment acquisition device in the real scene;
creating virtual background props in a target VR scene based on the environment information;
determining a target prop in the real scene based on the environment information;
and matching a virtual interaction prop preset in a VR scene template with the target prop to obtain the target VR scene.
3. The VR scene interaction method of claim 2, further comprising:
determining whether the user position marker enters a safe area where the target prop is located;
and if the user position marker enters the safe area where the target prop is located, controlling virtual props in the VR scene to collide with the boundary of the safe area.
4. The VR scene interaction method of claim 2, wherein matching the virtual interaction prop preset in the VR scene template with the target prop comprises:
determining the actual shape of the target prop based on the environment information;
matching the virtual shape of the virtual interaction prop with the actual shape of the target prop;
determining the actual center position of the target prop in the real scene based on the environment information;
and matching the virtual center position of the virtual interaction prop with the actual center position.
5. A VR scene interaction device comprising:
an acquisition module, configured to acquire the position of the user in the real scene as collected by the position acquisition device;
a first display module, configured to display, in a VR scene corresponding to the real scene, a user position marker centered on the user position when the user enters the VR scene for interaction;
a first control module, configured to control the user position marker in the VR scene to move with the change of the user position if the user position in the real scene changes;
a judging module, configured to determine whether the moving speed of the user exceeds a preset speed threshold;
and a second display module, configured to display a high-speed-movement special effect based on the user position marker if the moving speed of the user exceeds the preset speed threshold, wherein the high-speed-movement special effect is an enlarged version of the user position marker, and the user position marker is a ring of light.
6. The VR scene interaction device of claim 5, further comprising:
an acquisition module, configured to acquire environment information collected by the environment acquisition device in the real scene;
a creating module, configured to create virtual background props in a target VR scene based on the environment information;
a determining module, configured to determine a target prop in the real scene based on the environment information;
and a matching module, configured to match a virtual interaction prop preset in the VR scene template with the target prop to obtain the target VR scene.
7. The VR scene interaction device of claim 6, further comprising:
a judging module, configured to determine whether the user position marker enters a safe area where the target prop is located;
and a second control module, configured to control virtual props in the VR scene to collide with the boundary of the safe area if the user position marker enters the safe area where the target prop is located.
8. The VR scene interaction device of claim 6, wherein the matching module includes:
a shape determination unit, configured to determine the actual shape of the target prop based on the environment information;
a shape matching unit, configured to match the virtual shape of the virtual interaction prop with the actual shape of the target prop;
a position determination unit, configured to determine the actual center position of the target prop in the real scene based on the environment information;
and a position matching unit, configured to match the virtual center position of the virtual interaction prop with the actual center position.
Application CN201811642097.0A, filed 2018-12-29: VR scene interaction method and device. Status: Active. Granted as CN109782910B.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811642097.0A CN109782910B (en) 2018-12-29 2018-12-29 VR scene interaction method and device

Publications (2)

Publication Number Publication Date
CN109782910A (en) 2019-05-21
CN109782910B (en) 2021-04-06

Family

ID=66499534

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811642097.0A Active CN109782910B (en) 2018-12-29 2018-12-29 VR scene interaction method and device


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111340598B (en) * 2020-03-20 2024-01-16 北京爱笔科技有限公司 Method and device for adding interactive labels
CN113064955B (en) * 2020-08-26 2022-02-25 视伴科技(北京)有限公司 Method and device for displaying geographic marking information

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103970268A (en) * 2013-02-01 2014-08-06 索尼公司 Information processing device, client device, information processing method, and program
US20160012645A1 (en) * 2010-03-24 2016-01-14 Sony Corporation Image processing device, image processing method, and program
CN107209564A (en) * 2015-01-20 2017-09-26 微软技术许可有限责任公司 Real world ratio is applied to virtual content
CN108762492A (en) * 2018-05-14 2018-11-06 歌尔科技有限公司 Method, apparatus, equipment and the storage medium of information processing are realized based on virtual scene

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10725297B2 (en) * 2015-01-28 2020-07-28 CCP hf. Method and system for implementing a virtual representation of a physical environment using a virtual reality environment
CN106295581A (en) * 2016-08-15 2017-01-04 联想(北京)有限公司 Obstacle detection method, device and virtual reality device
CN107422942A (en) * 2017-08-15 2017-12-01 吴金河 A kind of control system and method for immersion experience
CN108008820B (en) * 2017-12-14 2021-09-21 深圳位形空间科技有限公司 Redirection walking method, redirection walking server and redirection walking system
CN108269307B (en) * 2018-01-15 2023-04-07 歌尔科技有限公司 Augmented reality interaction method and equipment
CN108427501B (en) * 2018-03-19 2022-03-22 网易(杭州)网络有限公司 Method and device for controlling movement in virtual reality




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant