CN115981518B - VR demonstration user operation method and related equipment - Google Patents

VR demonstration user operation method and related equipment

Info

Publication number
CN115981518B
CN115981518B (application number CN202310281632.9A)
Authority
CN
China
Prior art keywords
touch
panorama
interface
target
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310281632.9A
Other languages
Chinese (zh)
Other versions
CN115981518A
Inventor
余大飞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Tongchuang Blue Sky Cloud Technology Co ltd
Original Assignee
Beijing Tongchuang Blue Sky Cloud Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Tongchuang Blue Sky Cloud Technology Co ltd
Priority to CN202310281632.9A
Publication of CN115981518A
Application granted
Publication of CN115981518B
Legal status: Active
Anticipated expiration

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02D: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00: Energy efficient computing, e.g. low power processors, power management or thermal management

Abstract

The embodiments of the application provide a VR presentation user operation method and related equipment, which can solve the problems of poor interactivity and poor interaction matching in current VR panorama touch interfaces. The method comprises the following steps: when a target VR panorama touch interface receives a touch instruction from a user, determining, based on the touch instruction, the touch type of the instruction and the control information within a preset area corresponding to the instruction; acquiring user information when the touch type is a click type and the control information indicates that a walking control lies within the preset area corresponding to the touch instruction; and determining a theoretical walking duration based on the user information, and switching the target VR panorama touch interface from the current panorama interface to the next panorama interface indicated by the walking control according to the theoretical walking duration.

Description

VR demonstration user operation method and related equipment
Technical Field
The application relates to the field of computer technology, and in particular to a VR presentation user operation method and related equipment.
Background
With the surge in popularity of VR (virtual reality) technology, VR panorama functions such as house viewing, car viewing, exhibition viewing, and shopping have been put into use on all major platforms, have almost become an industry-standard feature, and are leading a new trend across marketing fields. This mode is convenient: it saves travel costs for customers, improves operating efficiency for enterprises, improves the customer experience, and saves labor costs.
However, at present, when a user experiences a real scene through a VR panorama, switching between different areas uses a single, fixed transition, and the switching effect is poor.
Disclosure of Invention
The embodiments of the application provide a VR presentation user operation method and related equipment, which can solve the problems of poor interactivity and poor interaction matching in current VR panorama touch interfaces.
A first aspect of the embodiments of the present application provides a VR presentation user operation method, including:
when a target VR panorama touch interface receives a touch instruction from a user, determining, based on the touch instruction, the touch type of the instruction and the control information within a preset area corresponding to the instruction;
acquiring user information when the touch type is a click type and the control information within the preset area corresponding to the touch instruction indicates that a walking control lies within that area;
and determining a theoretical walking duration based on the user information, and switching the target VR panorama touch interface from the current panorama interface to the next panorama interface indicated by the walking control according to the theoretical walking duration.
Optionally, the determining a theoretical walking duration based on the user information includes:
determining a theoretical stride of the user based on the user information;
and determining the theoretical walking duration from the theoretical stride of the user.
Optionally, when the touch type is a click type and the control information within the preset area corresponding to the touch instruction indicates that a walking control lies within that area, acquiring the user information includes:
acquiring image information through a front-facing camera of the device to which the target VR panorama touch interface belongs, wherein the image information comprises at least two different images acquired when the touch type is a click type and the control information indicates that a walking control lies within the preset area corresponding to the touch instruction;
and, when the images include a preset calibration object, acquiring the user information based on the preset calibration object, wherein the user information comprises at least one of the height, age, and gender of the user.
Optionally, the method further comprises:
determining an actual walking distance in the real scene based on the current panorama interface of the target VR panorama touch interface and the next panorama interface indicated by the walking control;
the determining the theoretical walking duration based on the user information then includes:
determining the theoretical walking duration based on the user information and the actual walking distance in the real scene.
Optionally, the method further comprises:
generating a virtual-reality transition image based on the current panorama interface of the target VR panorama touch interface and the next panorama interface indicated by the walking control;
and switching the target VR panorama touch interface from the current panorama interface to the next panorama interface indicated by the walking control according to the theoretical walking duration and the virtual-reality transition image.
Optionally, the method further comprises:
acquiring position information of the real house corresponding to the target VR panorama touch interface when the target VR panorama touch interface is a real-house panorama interface;
simulating outdoor light brightness based on the position information of the real house and the current time;
and generating the exterior view through the real daylighting window on the target VR panorama touch interface according to the light brightness.
Optionally, the method further comprises:
acquiring position information of the real house corresponding to the target VR panorama touch interface when the target VR panorama touch interface is a real-house panorama interface;
acquiring information about surrounding occluders based on the position information of the real house;
simulating the incidence angle and intensity of outdoor light according to the real house's position information, height information, surrounding-occluder information, and the current time;
and generating the exterior view through the real daylighting window, together with an indoor perspective view, on the target VR panorama touch interface according to the incidence angle and intensity.
A second aspect of the embodiments of the present application provides a VR presentation user operation device, including:
a receiving unit, configured to receive real interaction information for a preset area corresponding to a touch instruction when a target VR panorama touch interface receives the touch instruction of a first user, wherein the VR panorama touch interface is obtained by panoramic shooting of a real scene;
an acquisition unit, configured to acquire instruction content information based on the touch track corresponding to the touch instruction;
and a determining unit, configured to determine, when there are at least two items of instruction content information, target instruction content for the first user according to the identity information of the first user and the real interaction information, and to execute the target instruction content in the VR panorama touch interface.
A third aspect of the embodiments of the present application provides an electronic device, including a memory, and a processor, where the processor is configured to implement the steps of the VR presentation user operation method described above when executing a computer program stored in the memory.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the VR presentation user operation method described above.
In summary, according to the VR presentation user operation method provided by the embodiments of the present application, when a target VR panorama touch interface receives a touch instruction from a user, the touch type of the instruction and the control information within the preset area corresponding to the instruction are determined based on the touch instruction; user information is acquired when the touch type is a click type and the control information indicates that a walking control lies within the preset area; and a theoretical walking duration is determined based on the user information, with the target VR panorama touch interface then switched from the current panorama interface to the next panorama interface indicated by the walking control according to that duration. A user can switch between scene points through the touch screen while viewing or interacting with a real scene in the panorama interface, but current scene-point switching does not convey a walking effect well: the previous scene picture typically shrinks while the next one expands, and the whole switch completes in a very short time. For example, when switching from the center of one room to the center of the next, the switch runs at the same speed regardless of the distance between the two scene points and regardless of the user.
With this method, by acquiring the user information, the walking time the user would theoretically need between the points corresponding to the current panorama interface and the next panorama interface indicated by the walking control can be predicted, the walking effect can be adapted to different types of users and different scene distances, and the problems of poor interactivity and poor interaction matching in current VR panorama touch interfaces are solved.
Accordingly, the VR presentation user operation device, the electronic device, and the computer-readable storage medium provided by the embodiments of the invention have the same technical effects.
Drawings
Fig. 1 is a flowchart of a possible VR display user operation method provided in an embodiment of the present application;
fig. 2 is a schematic block diagram of one possible VR display user operating device provided by an embodiment of the present application;
fig. 3 is a schematic hardware structure diagram of a possible VR display user operation device according to an embodiment of the present application;
FIG. 4 is a schematic block diagram of one possible electronic device provided in an embodiment of the present application;
fig. 5 is a schematic block diagram of one possible computer-readable storage medium provided in an embodiment of the present application.
Detailed Description
The embodiments of the application provide a VR presentation user operation method and related equipment, which can solve the problems of poor interactivity and poor interaction matching in current VR panorama touch interfaces.
The terms "first," "second," "third," "fourth" and the like in the description and in the claims of this application and in the above-described figures, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments described herein may be implemented in other sequences than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus. The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application.
Referring to fig. 1, a flowchart of a VR display user operation method provided in an embodiment of the present application may specifically include: S110-S130.
S110, when a target VR panorama touch interface receives a touch instruction from a user, determining, based on the touch instruction, the touch type of the instruction and the control information within a preset area corresponding to the instruction.
S120, acquiring user information when the touch type is a click type and the control information within the preset area corresponding to the touch instruction indicates that a walking control lies within that area.
For example, when the user wants to switch from the current panorama interface to the next panorama interface indicated by a walking control, the user may click the walking control pointing toward that next interface.
S130, determining a theoretical walking duration based on the user information, and switching the target VR panorama touch interface from the current panorama interface to the next panorama interface indicated by the walking control according to the theoretical walking duration.
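The S110-S130 flow can be sketched as follows. This is an illustrative outline only: every class, function, and default value (including the fallback walking speed) is an assumption, not something specified by this application.

```python
from dataclasses import dataclass

@dataclass
class TouchInstruction:
    touch_type: str   # e.g. "click" or "swipe"
    x: float
    y: float

@dataclass
class WalkControl:
    target_interface: str    # identifier of the next panorama interface
    real_distance_m: float   # real-scene walking distance to that scene point

def handle_touch(instr, controls_in_area, get_user_info, switch_interface):
    """S110-S130: dispatch a touch instruction on the VR panorama touch interface."""
    if instr.touch_type != "click":
        return None  # S110/S120: only a click-type touch can trigger the walking control
    walk = next((c for c in controls_in_area if isinstance(c, WalkControl)), None)
    if walk is None:
        return None  # S120: the preset area holds no walking control
    user = get_user_info()                       # S120: acquire user information
    speed = user.get("walking_speed_mps", 1.4)   # assumed default adult walking pace
    duration = walk.real_distance_m / speed      # S130: theoretical walking duration
    switch_interface(walk.target_interface, duration)
    return duration
```

Here `get_user_info` and `switch_interface` merely stand in for the user-information acquisition and interface-switching behavior the method describes.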
According to the VR presentation user operation method provided by this embodiment, when the target VR panorama touch interface receives a touch instruction from a user, the touch type of the instruction and the control information within the preset area corresponding to the instruction are determined based on the touch instruction; user information is acquired when the touch type is a click type and the control information indicates that a walking control lies within the preset area; and a theoretical walking duration is determined based on the user information, with the target VR panorama touch interface then switched from the current panorama interface to the next panorama interface indicated by the walking control according to that duration. A user can switch between scene points through the touch screen while viewing or interacting with a real scene in the panorama interface, but current scene-point switching does not convey a walking effect well: the previous scene picture typically shrinks while the next one expands, and the whole switch completes in a very short time. For example, when switching from the center of one room to the center of the next, the switch runs at the same speed regardless of the distance between the two scene points and regardless of the user.
With this method, by acquiring the user information, the walking time the user would theoretically need between the points corresponding to the current panorama interface and the next panorama interface indicated by the walking control can be predicted, the walking effect can be adapted to different types of users and different scene distances, and the problems of poor interactivity and poor interaction matching in current VR panorama touch interfaces are solved.
According to some embodiments, the determining a theoretical walking duration based on the user information includes:
determining a theoretical stride of the user based on the user information;
and determining the theoretical walking duration from the theoretical stride of the user.
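As a concrete illustration of mapping user information to a theoretical stride, the sketch below uses a common anthropometric rule of thumb (stride roughly 0.41 to 0.42 times height). The exact factors and the age adjustment are assumptions, since the application does not specify any formula.

```python
def theoretical_stride_m(height_m, age=None, gender=None):
    """Heuristic stride estimate from user information.

    The 0.415/0.413 height factors are a widely used anthropometric rule of
    thumb, and the reduction for older users is an assumed adjustment; neither
    value comes from the patent itself.
    """
    factor = 0.413 if gender == "female" else 0.415
    stride = height_m * factor
    if age is not None and age >= 65:
        stride *= 0.9  # assumed shorter stride for older users
    return stride
```

A 1.8 m user would thus get a theoretical stride of roughly 0.75 m, which the later steps combine with the real-scene distance.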
According to some embodiments, when the touch type is a click type and the control information within the preset area corresponding to the touch instruction indicates that a walking control lies within that area, acquiring the user information includes:
acquiring image information through a front-facing camera of the device to which the target VR panorama touch interface belongs, wherein the image information comprises at least two different images acquired when the touch type is a click type and the control information indicates that a walking control lies within the preset area corresponding to the touch instruction;
and, when the images include a preset calibration object, acquiring the user information based on the preset calibration object, wherein the user information comprises at least one of the height, age, and gender of the user. The calibration object captured in at least two different images can be used to calculate the user's height: the different depths of field between the preset calibration object and the user can be obtained from the two images, so the user's height can be predicted via the calibration object. In this way, the user information can conveniently be predicted when it cannot be obtained from information pre-stored by the user. In addition, when the registered user differs from the current user, the current user's information can still be predicted, so that the theoretical walking duration better matches the user actually operating the interface.
For example, the age and gender of the user may be predicted by image recognition.
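One way to realize the depth idea above is a stereo sketch: with two images taken from slightly different positions, depth follows from disparity (depth = focal length x baseline / disparity), and a pixel height converts to a real height via that depth. The entire geometry below is an illustrative assumption, not the application's stated method.

```python
def estimate_user_height(f_px, baseline_m,
                         cal_height_m, cal_px_height, cal_disparity_px,
                         user_px_height, user_disparity_px):
    """Estimate user height from two images via the calibration object.

    Stereo pinhole model: depth = f * B / disparity, and a real size is
    pixel_size * depth / f. The stereo pair stands in for the 'at least two
    different images'; all parameters here are illustrative.
    """
    depth_cal = f_px * baseline_m / cal_disparity_px
    depth_user = f_px * baseline_m / user_disparity_px
    # Sanity-check the model against the calibration object of known height.
    assert abs(cal_px_height * depth_cal / f_px - cal_height_m) < 1e-6
    return user_px_height * depth_user / f_px
```

The calibration object's known real size lets the code confirm the scale before applying the same model to the user.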
According to some embodiments, the method further comprises:
determining an actual walking distance in the real scene based on the current panorama interface of the target VR panorama touch interface and the next panorama interface indicated by the walking control;
the determining the theoretical walking duration based on the user information then includes:
determining the theoretical walking duration based on the user information and the actual walking distance in the real scene.
As an example, the actual walking distance in the real scene can be determined by querying pre-stored scene information of the target VR panorama touch interface for the current panorama interface and the next panorama interface indicated by the walking control; it can also be predicted from the scene pictures of the two interfaces. Determining the theoretical walking duration in combination with the actual walking distance of the real scene makes the duration accurate and realistic, so the user experience feels more real.
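Combining the theoretical stride with the actual walking distance, the theoretical walking duration can be sketched as distance divided by walking speed, where speed = stride x cadence. The default cadence is an assumed typical value, not one given by the application.

```python
def theoretical_walking_duration_s(distance_m, stride_m, cadence_steps_per_s=1.8):
    """duration = distance / (stride * cadence).

    The cadence default (~1.8 steps/s) is an assumed typical walking rate.
    """
    speed_mps = stride_m * cadence_steps_per_s
    return distance_m / speed_mps
```

For a 5.4 m room-to-room distance and a 0.75 m stride at 2 steps/s, this yields a 3.6 s transition instead of a fixed-speed switch.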
According to some embodiments, the method further comprises:
generating a virtual-reality transition image based on the current panorama interface of the target VR panorama touch interface and the next panorama interface indicated by the walking control;
and switching the target VR panorama touch interface from the current panorama interface to the next panorama interface indicated by the walking control according to the theoretical walking duration and the virtual-reality transition image.
It should be noted that switching the target VR panorama touch interface from the current panorama interface to the next panorama interface indicated by the walking control according to the theoretical walking duration and the virtual-reality transition image makes the experience between the two interfaces feel more real and avoids any sense of jumping or discontinuity.
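A minimal sketch of playing a transition over the theoretical walking duration: the blend weights below could drive a cross-fade or zoom from the current panorama to the next one. The frame rate and linear easing are assumptions for illustration.

```python
def transition_frames(duration_s, fps=60):
    """Evenly spaced blend weights (0 -> 1) for a transition played over the
    theoretical walking duration; weight i/n blends current into next panorama.

    Frame timing and linear interpolation are assumed, not specified.
    """
    n = max(1, round(duration_s * fps))
    return [i / n for i in range(1, n + 1)]
```

A renderer would apply each weight to composite the current and next panorama (or the generated transition image) frame by frame.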
According to some embodiments, the method further comprises:
acquiring position information of the real house corresponding to the target VR panorama touch interface when the target VR panorama touch interface is a real-house panorama interface;
simulating outdoor light brightness based on the position information of the real house and the current time;
and generating the exterior view through the real daylighting window on the target VR panorama touch interface according to the light brightness.
As an example, the position information of the real house corresponding to the target VR panorama touch interface is obtained, and the outdoor illumination condition can be determined from that position information and the current time. By simulating the outdoor light brightness and generating the exterior view through the real daylighting window according to that brightness, a more realistic house-viewing experience can be provided to the user.
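A rough stand-in for the outdoor-brightness simulation: compute a simplified solar elevation from latitude, longitude, and time, then map it to a normalized brightness. This sketch omits atmospheric refraction, the equation of time, and weather, all of which a real implementation would need to consider.

```python
import math

def solar_elevation_deg(lat_deg, utc_hour, day_of_year, lon_deg=0.0):
    """Simplified solar-position formula (no refraction or equation of time)."""
    decl = 23.44 * math.sin(math.radians(360 / 365 * (day_of_year - 81)))
    solar_hour = utc_hour + lon_deg / 15.0      # crude local solar time
    hour_angle = 15.0 * (solar_hour - 12.0)
    lat, d, h = map(math.radians, (lat_deg, decl, hour_angle))
    return math.degrees(math.asin(math.sin(lat) * math.sin(d)
                                  + math.cos(lat) * math.cos(d) * math.cos(h)))

def outdoor_brightness(lat_deg, utc_hour, day_of_year, lon_deg=0.0):
    """Normalized 0-1 brightness from solar elevation, assuming a clear sky."""
    elev = solar_elevation_deg(lat_deg, utc_hour, day_of_year, lon_deg)
    return max(0.0, math.sin(math.radians(elev)))
```

The resulting 0-1 value could scale the rendered brightness of the view through the daylighting window.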
According to some embodiments, the method further comprises:
acquiring position information of the real house corresponding to the target VR panorama touch interface when the target VR panorama touch interface is a real-house panorama interface;
acquiring information about surrounding occluders based on the position information of the real house;
simulating the incidence angle and intensity of outdoor light according to the real house's position information, height information, surrounding-occluder information, and the current time;
and generating the exterior view through the real daylighting window, together with an indoor perspective view, on the target VR panorama touch interface according to the incidence angle and intensity.
As an example, the incidence angle and intensity of outdoor light through the house's different daylighting windows can be simulated from the real house's position information, height information, surrounding-occluder information, and the current time, providing the user a more realistic house-viewing experience. In addition, the incidence angle and intensity of outdoor light through the different daylighting windows at different times of day can be demonstrated within a short time by compressing the time scale.
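The incidence-angle-and-intensity step can be sketched with a Lambertian model on a vertical window, where a single occluder is reduced to the minimum solar elevation it blocks. All of the geometry and the occluder model are illustrative assumptions.

```python
import math

def window_light(sun_elev_deg, sun_azim_deg, window_azim_deg, occluder_elev_deg=0.0):
    """Return (incidence angle in degrees, Lambertian intensity factor) for a
    vertical window facing `window_azim_deg`; the occluder is modeled only by
    the minimum sun elevation it blocks (an assumed simplification)."""
    if sun_elev_deg <= occluder_elev_deg:
        return 0.0, 0.0                  # sun below horizon or behind the occluder
    rel_azim = math.radians(sun_azim_deg - window_azim_deg)
    elev = math.radians(sun_elev_deg)
    # Cosine of the angle between the sun direction and the window's outward normal.
    cos_incidence = math.cos(elev) * math.cos(rel_azim)
    if cos_incidence <= 0:
        return 90.0, 0.0                 # sun behind the window plane
    return math.degrees(math.acos(cos_incidence)), cos_incidence
```

Running this per window and per compressed time step would yield the changing angles and intensities the demonstration mode describes.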
The VR display user operation method in the embodiment of the present application is described above, and the VR display user operation device in the embodiment of the present application is described below.
Referring to fig. 2, one embodiment of a VR display user operation device is described in embodiments of the present application, and may include:
the determining unit 201 is configured to determine, based on a touch instruction, a touch type of the touch instruction and control information in a preset area range corresponding to the touch instruction, when the target VR panorama touch interface receives the touch instruction of a user;
the obtaining unit 202 is configured to obtain user information when the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the preset area range corresponding to the touch instruction includes a step control;
and the execution unit 203 is configured to determine a theoretical walking duration based on the user information, and switch the target VR panoramic touch interface from the current panoramic interface to the next panoramic interface indicated by the walking control according to the theoretical walking duration.
According to the VR presentation user operation device provided by this embodiment, when the target VR panorama touch interface receives a touch instruction from a user, the touch type of the instruction and the control information within the preset area corresponding to the instruction are determined based on the touch instruction; user information is acquired when the touch type is a click type and the control information indicates that a walking control lies within the preset area; and a theoretical walking duration is determined based on the user information, with the target VR panorama touch interface then switched from the current panorama interface to the next panorama interface indicated by the walking control according to that duration. A user can switch between scene points through the touch screen while viewing or interacting with a real scene in the panorama interface, but current scene-point switching does not convey a walking effect well: the previous scene picture typically shrinks while the next one expands, and the whole switch completes in a very short time. For example, when switching from the center of one room to the center of the next, the switch runs at the same speed regardless of the distance between the two scene points and regardless of the user.
With this device, by acquiring the user information, the walking time the user would theoretically need between the points corresponding to the current panorama interface and the next panorama interface indicated by the walking control can be predicted, the walking effect can be adapted to different types of users and different scene distances, and the problems of poor interactivity and poor interaction matching in current VR panorama touch interfaces are solved.
The VR display user operation device in the embodiment of the present application is described above from the perspective of the modularized functional entity in fig. 2; it is now described in detail from the perspective of hardware processing. Referring to fig. 3, an embodiment of the VR display user operation device 300 in the embodiment of the present application includes:
an input device 301, an output device 302, a processor 303, and a memory 304, where there may be one or more processors 303, with one processor 303 taken as an example in fig. 3. In some embodiments of the present application, the input device 301, output device 302, processor 303, and memory 304 may be connected by a bus or in another manner, with a bus connection taken as an example in fig. 3.
Wherein, by calling the operation instruction stored in the memory 304, the processor 303 is configured to execute the following steps:
when a target VR panorama touch interface receives a touch instruction from a user, determining, based on the touch instruction, the touch type of the instruction and the control information within a preset area corresponding to the instruction;
acquiring user information when the touch type is a click type and the control information within the preset area corresponding to the touch instruction indicates that a walking control lies within that area;
and determining a theoretical walking duration based on the user information, and switching the target VR panorama touch interface from the current panorama interface to the next panorama interface indicated by the walking control according to the theoretical walking duration.
Optionally, the determining a theoretical walking duration based on the user information includes:
determining a theoretical stride of the user based on the user information;
and determining the theoretical walking duration from the theoretical stride of the user.
Optionally, when the touch type is a click type and the control information within the preset area corresponding to the touch instruction indicates that a walking control lies within that area, acquiring the user information includes:
acquiring image information through a front-facing camera of the device to which the target VR panorama touch interface belongs, wherein the image information comprises at least two different images acquired when the touch type is a click type and the control information indicates that a walking control lies within the preset area corresponding to the touch instruction;
and, when the images include a preset calibration object, acquiring the user information based on the preset calibration object, wherein the user information comprises at least one of the height, age, and gender of the user.
Optionally, the method further comprises:
determining an actual walking distance in the real scene based on the current panorama interface of the target VR panorama touch interface and the next panorama interface indicated by the walking control;
the determining the theoretical walking duration based on the user information then includes:
determining the theoretical walking duration based on the user information and the actual walking distance in the real scene.
Optionally, the method further comprises:
generating a virtual reality transition image based on the current panorama interface of the target VR panorama touch interface and the next panorama interface indicated by the walking control;
and switching the target VR panorama touch interface from the current panorama interface to the next panorama interface indicated by the walking control according to the theoretical walking duration and the virtual reality transition image.
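One simple way to realize a timed transition is to precompute per-frame blend factors that fade from the current panorama (factor 0) to the next panorama (factor 1) over the theoretical walking duration. The sketch below assumes a fixed frame rate; the actual transition image generation in the patent is not specified.

```python
def transition_alphas(duration_s, fps=30):
    """Per-frame blend factors for a crossfade lasting duration_s seconds.

    Returns a list of values rising from just above 0 to exactly 1.0;
    frame i blends (1 - alpha) of the current panorama with alpha of the next.
    """
    n_frames = max(1, round(duration_s * fps))
    return [i / n_frames for i in range(1, n_frames + 1)]
```

A renderer would consume one factor per frame, so the crossfade's length tracks the theoretical walking duration exactly.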
Optionally, the method further comprises:
when the target VR panorama touch interface is a panorama interface of a real house, acquiring position information of the corresponding real house;
simulating the outdoor light brightness based on the position information of the real house and the current time;
and rendering the view through the real lighting window on the target VR panorama touch interface according to the light brightness.
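The disclosure does not give the simulation model. A rough approximation computes the solar elevation from the house's latitude/longitude and the current UTC time, and maps it to a relative brightness. The declination and hour-angle formulas below are standard low-accuracy approximations, used here only as an assumed sketch.

```python
import math
from datetime import datetime, timezone

def solar_elevation_deg(lat_deg, lon_deg, when_utc):
    """Approximate solar elevation angle (degrees) for a location at a UTC time."""
    day = when_utc.timetuple().tm_yday
    # Low-accuracy solar declination approximation, in degrees.
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day + 10)))
    hours = when_utc.hour + when_utc.minute / 60.0
    hour_angle = 15.0 * (hours - 12.0) + lon_deg  # degrees from local solar noon
    lat, dec, ha = (math.radians(x) for x in (lat_deg, decl, hour_angle))
    sin_elev = (math.sin(lat) * math.sin(dec)
                + math.cos(lat) * math.cos(dec) * math.cos(ha))
    return math.degrees(math.asin(sin_elev))

def outdoor_brightness(lat_deg, lon_deg, when_utc):
    """Relative outdoor brightness in [0, 1]: 0 below the horizon, sin(elevation) above."""
    elev = solar_elevation_deg(lat_deg, lon_deg, when_utc)
    return max(0.0, math.sin(math.radians(elev)))
```

The returned brightness could scale the rendered window view, so a house shown at night appears dark through its real lighting windows.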
Optionally, the method further comprises:
when the target VR panorama touch interface is a panorama interface of a real house, acquiring position information of the corresponding real house;
acquiring information about surrounding occluders based on the position information of the real house;
simulating the incidence angle and intensity of the outdoor light according to the position information and height information of the real house, the surrounding occluder information, and the current time;
and rendering, on the target VR panorama touch interface, the view through the real lighting window and the resulting indoor lighting effect according to the incidence angle and intensity.
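Occlusion can be approximated by describing each surrounding obstacle as the elevation angle its top subtends in a given direction, and testing whether the sun clears it. Everything below (the 10-degree azimuth cone, the facing-hemisphere test, the input format for occluders) is an assumed simplification for illustration, not the patent's method.

```python
import math

def light_reaches_window(sun_elevation_deg, sun_azimuth_deg,
                         window_azimuth_deg, occluders):
    """Return True if direct sunlight reaches the window.

    occluders: list of (azimuth_deg, top_elevation_deg) pairs giving the
    direction of each obstacle and the elevation angle its top subtends
    as seen from the window (assumed input format).
    """
    if sun_elevation_deg <= 0:
        return False  # sun below the horizon
    # Assumed: a window receives direct light only from its facing hemisphere.
    if abs((sun_azimuth_deg - window_azimuth_deg + 180) % 360 - 180) > 90:
        return False
    for occ_az, occ_top_elev in occluders:
        same_direction = abs((sun_azimuth_deg - occ_az + 180) % 360 - 180) < 10
        if same_direction and sun_elevation_deg < occ_top_elev:
            return False  # the obstacle blocks the sun
    return True

def direct_intensity(sun_elevation_deg):
    """Relative direct-light intensity on a horizontal surface, in [0, 1]."""
    return max(0.0, math.sin(math.radians(sun_elevation_deg)))
```

A neighbouring building taller than the sun's elevation in the sun's direction would thus zero out the direct component, leaving only ambient light for the indoor effect.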
The processor 303 is further configured to perform the method of any embodiment corresponding to fig. 1 by calling the operation instructions stored in the memory 304.
Referring to fig. 4, fig. 4 is a schematic diagram of an embodiment of an electronic device according to an embodiment of the present application.
As shown in fig. 4, the embodiment of the present application provides an electronic device 400, including a memory 410, a processor 420, and a computer program 411 stored on the memory 410 and executable on the processor 420, wherein the processor 420 implements the following steps when executing the computer program 411:
when the target VR panorama touch interface receives a touch instruction from a user, determining, based on the touch instruction, a touch type of the touch instruction and control information in a preset area range corresponding to the touch instruction;
acquiring user information when the touch type is a click type and the control information indicates that the walking control is included in the preset area range corresponding to the touch instruction;
and determining a theoretical walking duration based on the user information, and switching the target VR panorama touch interface from the current panorama interface to the next panorama interface indicated by the walking control according to the theoretical walking duration.
Optionally, the determining the theoretical walking duration based on the user information includes:
determining a theoretical stride of the user based on the user information;
and determining the theoretical walking duration according to the theoretical stride of the user.
Optionally, when the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the walking control is included in the preset area range, acquiring the user information includes:
acquiring image information through a front camera of the device displaying the target VR panorama touch interface, wherein the image information includes at least two different images;
and, when the images include a preset calibration object, acquiring the user information based on the preset calibration object, wherein the user information includes at least one of the user's height, age, and gender.
Optionally, the method further comprises:
determining an actual walking distance in the real scene based on the current panorama interface of the target VR panorama touch interface and the next panorama interface indicated by the walking control;
and the determining the theoretical walking duration based on the user information includes:
determining the theoretical walking duration based on the user information and the actual walking distance in the real scene.
Optionally, the method further comprises:
generating a virtual reality transition image based on the current panorama interface of the target VR panorama touch interface and the next panorama interface indicated by the walking control;
and switching the target VR panorama touch interface from the current panorama interface to the next panorama interface indicated by the walking control according to the theoretical walking duration and the virtual reality transition image.
Optionally, the method further comprises:
when the target VR panorama touch interface is a panorama interface of a real house, acquiring position information of the corresponding real house;
simulating the outdoor light brightness based on the position information of the real house and the current time;
and rendering the view through the real lighting window on the target VR panorama touch interface according to the light brightness.
Optionally, the method further comprises:
when the target VR panorama touch interface is a panorama interface of a real house, acquiring position information of the corresponding real house;
acquiring information about surrounding occluders based on the position information of the real house;
simulating the incidence angle and intensity of the outdoor light according to the position information and height information of the real house, the surrounding occluder information, and the current time;
and rendering, on the target VR panorama touch interface, the view through the real lighting window and the resulting indoor lighting effect according to the incidence angle and intensity.
In a specific implementation, when the processor 420 executes the computer program 411, any implementation of the embodiment corresponding to fig. 1 may be implemented.
Since the electronic device described in this embodiment is the device used to implement the method of this embodiment, based on the method described herein, those skilled in the art can understand the specific implementation of this electronic device and its various modifications; therefore, how the electronic device implements the method is not described in detail here. Any device used by those skilled in the art to implement the method of this embodiment falls within the intended scope of protection of this application.
Referring to fig. 5, fig. 5 is a schematic diagram of an embodiment of a computer readable storage medium according to an embodiment of the present application.
As shown in fig. 5, the present embodiment provides a computer-readable storage medium 500 having stored thereon a computer program 511, which computer program 511 when executed by a processor implements the steps of:
when the target VR panorama touch interface receives a touch instruction from a user, determining, based on the touch instruction, a touch type of the touch instruction and control information in a preset area range corresponding to the touch instruction;
acquiring user information when the touch type is a click type and the control information indicates that the walking control is included in the preset area range corresponding to the touch instruction;
and determining a theoretical walking duration based on the user information, and switching the target VR panorama touch interface from the current panorama interface to the next panorama interface indicated by the walking control according to the theoretical walking duration.
Optionally, the determining the theoretical walking duration based on the user information includes:
determining a theoretical stride of the user based on the user information;
and determining the theoretical walking duration according to the theoretical stride of the user.
Optionally, when the touch type is a click type and the control information in the preset area range corresponding to the touch instruction indicates that the walking control is included in the preset area range, acquiring the user information includes:
acquiring image information through a front camera of the device displaying the target VR panorama touch interface, wherein the image information includes at least two different images;
and, when the images include a preset calibration object, acquiring the user information based on the preset calibration object, wherein the user information includes at least one of the user's height, age, and gender.
Optionally, the method further comprises:
determining an actual walking distance in the real scene based on the current panorama interface of the target VR panorama touch interface and the next panorama interface indicated by the walking control;
and the determining the theoretical walking duration based on the user information includes:
determining the theoretical walking duration based on the user information and the actual walking distance in the real scene.
Optionally, the method further comprises:
generating a virtual reality transition image based on the current panorama interface of the target VR panorama touch interface and the next panorama interface indicated by the walking control;
and switching the target VR panorama touch interface from the current panorama interface to the next panorama interface indicated by the walking control according to the theoretical walking duration and the virtual reality transition image.
Optionally, the method further comprises:
when the target VR panorama touch interface is a panorama interface of a real house, acquiring position information of the corresponding real house;
simulating the outdoor light brightness based on the position information of the real house and the current time;
and rendering the view through the real lighting window on the target VR panorama touch interface according to the light brightness.
Optionally, the method further comprises:
when the target VR panorama touch interface is a panorama interface of a real house, acquiring position information of the corresponding real house;
acquiring information about surrounding occluders based on the position information of the real house;
simulating the incidence angle and intensity of the outdoor light according to the position information and height information of the real house, the surrounding occluder information, and the current time;
and rendering, on the target VR panorama touch interface, the view through the real lighting window and the resulting indoor lighting effect according to the incidence angle and intensity.
In a specific implementation, the computer program 511 may implement any of the embodiments corresponding to fig. 1 when executed by a processor.
In the foregoing embodiments, each embodiment is described with its own emphasis; for portions of an embodiment that are not described in detail, reference may be made to the related descriptions of the other embodiments.
It will be appreciated by those skilled in the art that embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
Embodiments of the present application also provide a computer program product comprising computer software instructions that, when run on a processing device, cause the processing device to perform a flow in a VR demonstration user operation method as in the corresponding embodiment of fig. 1.
The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general purpose computer, a special purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another by wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium that a computer can access, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk (SSD)).
It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the above-described systems, apparatuses and units may refer to corresponding procedures in the foregoing method embodiments, which are not repeated herein.
In the several embodiments provided in this application, it should be understood that the disclosed systems, apparatuses, and methods may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the units is merely a logical function division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be embodied in essence or a part contributing to the prior art or all or part of the technical solution in the form of a software product stored in a storage medium, including several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods described in the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing program codes.
The above embodiments are merely for illustrating the technical solution of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical scheme described in the foregoing embodiments can be modified or some of the technical features thereof can be replaced equivalently; such modifications and substitutions do not depart from the spirit and scope of the corresponding technical solutions.

Claims (8)

1. A VR display user operation method, comprising:
when the target VR panorama touch interface receives a touch instruction from a user, determining, based on the touch instruction, a touch type of the touch instruction and control information in a preset area range corresponding to the touch instruction;
acquiring user information when the touch type is a click type and the control information indicates that the walking control is included in the preset area range corresponding to the touch instruction;
determining a theoretical walking duration based on the user information, and switching the target VR panorama touch interface from the current panorama interface to the next panorama interface indicated by the walking control according to the theoretical walking duration;
wherein the determining the theoretical walking duration based on the user information includes:
determining a theoretical stride of the user based on the user information;
determining the theoretical walking duration according to the theoretical stride of the user;
and wherein, when the touch type is a click type and the control information indicates that the walking control is included in the preset area range corresponding to the touch instruction, the acquiring the user information includes:
acquiring image information through a front camera of the device displaying the target VR panorama touch interface, wherein the image information includes at least two different images;
and, when the images include a preset calibration object, acquiring the user information based on the preset calibration object, wherein the user information includes at least one of the user's height, age, and gender.
2. The method as recited in claim 1, further comprising:
determining an actual walking distance in the real scene based on the current panorama interface of the target VR panorama touch interface and the next panorama interface indicated by the walking control;
wherein the determining the theoretical walking duration based on the user information includes:
determining the theoretical walking duration based on the user information and the actual walking distance in the real scene.
3. The method as recited in claim 1, further comprising:
generating a virtual reality transition image based on the current panorama interface of the target VR panorama touch interface and the next panorama interface indicated by the walking control;
and switching the target VR panorama touch interface from the current panorama interface to the next panorama interface indicated by the walking control according to the theoretical walking duration and the virtual reality transition image.
4. The method as recited in claim 1, further comprising:
when the target VR panorama touch interface is a panorama interface of a real house, acquiring position information of the corresponding real house;
simulating the outdoor light brightness based on the position information of the real house and the current time;
and rendering the view through the real lighting window on the target VR panorama touch interface according to the light brightness.
5. The method as recited in claim 1, further comprising:
when the target VR panorama touch interface is a panorama interface of a real house, acquiring position information of the corresponding real house;
acquiring information about surrounding occluders based on the position information of the real house;
simulating the incidence angle and intensity of the outdoor light according to the position information and height information of the real house, the surrounding occluder information, and the current time;
and rendering, on the target VR panorama touch interface, the view through the real lighting window and the resulting indoor lighting effect according to the incidence angle and intensity.
6. A VR display user operation device employing the method of any one of claims 1-5, the device comprising:
a determining unit, configured to, when the target VR panorama touch interface receives a touch instruction from a user, determine, based on the touch instruction, a touch type of the touch instruction and control information in a preset area range corresponding to the touch instruction;
an acquisition unit, configured to acquire user information when the touch type is a click type and the control information indicates that the walking control is included in the preset area range corresponding to the touch instruction;
and an execution unit, configured to determine a theoretical walking duration based on the user information, and switch the target VR panorama touch interface from the current panorama interface to the next panorama interface indicated by the walking control according to the theoretical walking duration.
7. An electronic device comprising a memory, a processor, wherein the processor is configured to implement the steps of the VR presentation user operation method of any one of claims 1 to 5 when executing a computer program stored in the memory.
8. A computer-readable storage medium having stored thereon a computer program, characterized by: the computer program, when executed by a processor, implements the steps of the VR presentation user operating method of any one of claims 1 to 5.
CN202310281632.9A 2023-03-22 2023-03-22 VR demonstration user operation method and related equipment Active CN115981518B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310281632.9A CN115981518B (en) 2023-03-22 2023-03-22 VR demonstration user operation method and related equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310281632.9A CN115981518B (en) 2023-03-22 2023-03-22 VR demonstration user operation method and related equipment

Publications (2)

Publication Number Publication Date
CN115981518A CN115981518A (en) 2023-04-18
CN115981518B true CN115981518B (en) 2023-06-02

Family

ID=85972575

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310281632.9A Active CN115981518B (en) 2023-03-22 2023-03-22 VR demonstration user operation method and related equipment

Country Status (1)

Country Link
CN (1) CN115981518B (en)

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3329976A1 (en) * 2016-11-30 2018-06-06 Thomson Licensing 3d immersive method and device for a user in a virtual 3d scene
CN106652043A (en) * 2016-12-29 2017-05-10 深圳前海弘稼科技有限公司 Method and device for virtual touring of scenic region
CN108310768B (en) * 2018-01-16 2020-04-07 腾讯科技(深圳)有限公司 Virtual scene display method and device, storage medium and electronic device
CN108830692B (en) * 2018-06-20 2020-04-14 厦门市超游网络科技股份有限公司 Remote panoramic house-viewing method and device, user terminal, server and storage medium
CN109745699A (en) * 2018-12-29 2019-05-14 维沃移动通信有限公司 A kind of method and terminal device responding touch control operation
CN109814713A (en) * 2019-01-10 2019-05-28 重庆爱奇艺智能科技有限公司 A kind of method and apparatus for the switching of VR user perspective
US10974149B2 (en) * 2019-01-22 2021-04-13 Electronic Arts Inc. Controlling character movement in a video-game
CN111840989B (en) * 2020-08-05 2023-10-27 网易(杭州)网络有限公司 Virtual object moving route processing method and device and electronic equipment
CN112402963B (en) * 2020-11-20 2022-08-30 腾讯科技(深圳)有限公司 Information sending method, device, equipment and storage medium in virtual scene
CN115793928A (en) * 2021-09-09 2023-03-14 北京字跳网络技术有限公司 Page switching method, device, equipment and storage medium
CN114371800B (en) * 2021-12-15 2022-09-09 北京城市网邻信息技术有限公司 Space display method, device, terminal and medium based on VR panorama roaming

Also Published As

Publication number Publication date
CN115981518A (en) 2023-04-18


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant