CN115888093A - Control method and device - Google Patents

Control method and device

Info

Publication number
CN115888093A
Authority
CN
China
Prior art keywords
space scene
target virtual
user
target
virtual object
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310101253.7A
Other languages
Chinese (zh)
Inventor
徐浩煜
李盼盼
曹凯
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN202310101253.7A priority Critical patent/CN115888093A/en
Publication of CN115888093A publication Critical patent/CN115888093A/en
Pending legal-status Critical Current

Abstract

The application provides a control method and a control device. The method comprises: when a target virtual object in a virtual space scene is determined to be an object of interest to a user, controlling at least a camera corresponding to the target virtual object in a real space scene to adjust its shooting orientation, based on the viewing angle and viewing-angle parameters toward the target virtual object in the virtual space scene; acquiring a first real image captured by the camera after the shooting orientation is adjusted; and controlling a spatial region corresponding to the target virtual object in the virtual space scene to display the first real image.

Description

Control method and device
Technical Field
The present disclosure relates to the field of electronic technologies, and in particular, to a control method and apparatus.
Background
With the continuous development of science and technology, physical scenes in the real world can be mapped into virtual space scenes through digital technology, giving users an immersive experience in the virtual space scene. However, some objects in the physical scene render poorly in the virtual space scene, which degrades the user experience.
Disclosure of Invention
The application provides the following technical scheme:
one aspect of the present application provides a control method, including:
when a target virtual object in a virtual space scene is determined to be an object of interest to a user, controlling at least a camera corresponding to the target virtual object in a real space scene to adjust its shooting orientation, based on the viewing angle and viewing-angle parameters toward the target virtual object in the virtual space scene;
acquiring a first real image captured by the camera after the shooting orientation is adjusted; and
controlling a spatial region corresponding to the target virtual object in the virtual space scene to display the first real image.
Determining that a target virtual object in a virtual space scene is an object of interest to the user includes:
determining a target range in the virtual space scene based on the user's field of view in the virtual space scene; and
determining, from within the target range, a target virtual object corresponding to the user's line of sight in the virtual space scene as the object of interest to the user.
A target virtual object corresponding to the user's line of sight in the virtual space scene satisfies at least one of the following conditions:
the dwell duration of the user's line of sight on the object in the virtual space scene meets a set duration threshold; the distance between the dwell point of the user's line of sight in the virtual space scene and the output position of the object meets a set distance threshold.
Controlling at least the camera corresponding to the target virtual object in the real space scene to adjust its shooting orientation, based on the viewing angle and viewing-angle parameters toward the target virtual object in the virtual space scene, includes:
controlling the camera corresponding to the target virtual object in the real space scene to adjust both its shooting orientation and its shooting parameters, based on the viewing angle and viewing-angle parameters toward the target virtual object in the virtual space scene.
If the camera corresponding to the target virtual object in the real space scene comprises at least two cameras, the shooting parameters at least include:
the distance between the at least two cameras.
The method further comprises the following steps:
determining that the viewing angle and viewing-angle parameters toward the target virtual object in the virtual space scene have changed, and updating the camera's shooting orientation based on the changed viewing angle and viewing-angle parameters;
acquiring a second real image captured by the camera after the shooting orientation is updated; and
updating, based on the second real image, the first real image displayed in the spatial region corresponding to the target virtual object in the virtual space scene.
The method further comprises the following steps:
determining that the viewing angle between the orientation-adjusted camera and the target real object corresponding to the target virtual object in the real space scene does not satisfy a consistency condition with the viewing angle toward the target virtual object in the virtual space scene, determining a target virtual position in the virtual space scene, and outputting prompt information at the target virtual position, wherein the prompt information prompts the virtual user to move to the target virtual position;
wherein, when the virtual user is at the target virtual position, the virtual user's viewing angle toward the target virtual object and the viewing angle between the orientation-adjusted camera and the target real object in the real space scene satisfy the consistency condition.
The method further comprises the following steps:
and controlling the virtual user to move to the target virtual position based on the prompt information.
The method further comprises the following steps:
and under the condition that the target virtual object in the virtual space scene is determined to be switched from the object which is interested by the user to the object which is not interested by the user, controlling a space area corresponding to the target virtual object in the virtual space scene to stop displaying the first real image.
Another aspect of the present application provides a control apparatus, including:
a first control module, configured to, when a target virtual object in a virtual space scene is determined to be an object of interest to a user, control at least a camera corresponding to the target virtual object in a real space scene to adjust its shooting orientation, based on the viewing angle and viewing-angle parameters toward the target virtual object in the virtual space scene;
a first acquisition module, configured to acquire a first real image captured by the camera after the shooting orientation is adjusted; and
a second control module, configured to control a spatial region corresponding to the target virtual object in the virtual space scene to display the first real image.
With the method and device of the application, when the target virtual object in the virtual space scene is determined to be an object of interest to the user, at least the camera corresponding to the target virtual object in the real space scene is controlled to adjust its shooting orientation, based on the viewing angle and viewing-angle parameters toward the target virtual object in the virtual space scene; the first real image captured by the orientation-adjusted camera is acquired; and the spatial region corresponding to the target virtual object in the virtual space scene is controlled to display it. The target virtual object therefore appears more lifelike in the virtual space scene, its presentation is ensured, and the user experience is improved.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present application; those skilled in the art can obtain other drawings based on them without inventive labor.
Fig. 1 is a schematic flowchart of a control method provided in embodiment 1 of the present application;
Fig. 2 is a schematic diagram of an implementation scenario of a control method provided in the present application;
Fig. 3 is a schematic diagram of another implementation scenario of a control method provided in the present application;
Fig. 4 is a schematic flowchart of a control method provided in embodiment 3 of the present application;
Fig. 5 is a schematic flowchart of a control method provided in embodiment 4 of the present application;
Fig. 6 is a schematic flowchart of a control method provided in embodiment 5 of the present application;
Fig. 7 is a schematic flowchart of a control method provided in embodiment 6 of the present application;
Fig. 8 is a schematic flowchart of a control method provided in embodiment 7 of the present application;
Fig. 9 is a schematic structural diagram of a control device provided in the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings. The described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art from these embodiments without creative effort fall within the protection scope of the present application.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, the present application is described in further detail with reference to the accompanying drawings and the detailed description.
Referring to Fig. 1, a flowchart of a control method provided in embodiment 1 of the present application is illustrated. The method may be applied to an electronic device; the present application does not limit the product type of the electronic device. As shown in Fig. 1, the method may include, but is not limited to, the following steps:
step S101, under the condition that the target virtual object in the virtual space scene is determined to be the object which is interested by the user, at least controlling the camera corresponding to the target virtual object in the real space scene to adjust the shooting direction based on the visual angle and the visual angle parameter of the target virtual object in the virtual space scene.
A virtual space scene can be understood as a three-dimensional, stereoscopic virtual space. At least one virtual object is shown in the virtual space scene, and the virtual object can be a virtual character, an article and the like.
The target virtual object is one or more of the at least one virtual object.
The perspective to the target virtual object in the virtual space scene may be understood as: and simulating the visual angle of a user viewing the target virtual object in the virtual space scene.
The view angle parameters may include, but are not limited to: at least one of a size, direction and distance of the viewing angle. The direction of the viewing angle can be understood as: simulating the direction of the virtual position of the user in the virtual space scene relative to the target virtual object; the distance of the viewing angle can be understood as: the distance of the virtual position of the user in the virtual space scene relative to the target virtual object is simulated.
Controlling at least the camera corresponding to the target virtual object in the real space scene to adjust its shooting orientation, based on the viewing angle and viewing-angle parameters toward the target virtual object in the virtual space scene, may include, but is not limited to:
S1011, determining at least the shooting-orientation adjustment parameters of the camera corresponding to the target virtual object in the real space scene, based on the viewing angle and viewing-angle parameters toward the target virtual object in the virtual space scene; and
S1012, sending the shooting-orientation adjustment parameters to the camera, so that the camera adjusts its shooting orientation based on them.
Alternatively, it may include, but is not limited to:
S1013, determining at least the shooting-orientation adjustment parameters of the camera corresponding to the target virtual object in the real space scene, based on the viewing angle and viewing-angle parameters toward the target virtual object in the virtual space scene; and
S1014, adjusting the shooting orientation of that camera based on the shooting-orientation adjustment parameters.
After the shooting orientation is adjusted, the camera can capture at least part of the real object corresponding to the target virtual object.
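As an illustration of how shooting-orientation adjustment parameters might be derived from a desired viewing direction, the sketch below converts two direction vectors into pan/tilt angles and returns the deltas. The vector convention (y up, z forward) and the function name are assumptions for illustration, not taken from the patent:

```python
import math

def orientation_adjustment(view_dir, cam_dir):
    """Pan/tilt deltas (degrees) that rotate a camera from its current
    optical axis cam_dir onto the desired viewing direction view_dir.
    Both arguments are unit vectors (x, y, z), with y up and z forward."""
    def pan_tilt(v):
        x, y, z = v
        pan = math.degrees(math.atan2(x, z))                     # rotation about the vertical axis
        tilt = math.degrees(math.asin(max(-1.0, min(1.0, y))))   # elevation above the horizon
        return pan, tilt

    pan_v, tilt_v = pan_tilt(view_dir)
    pan_c, tilt_c = pan_tilt(cam_dir)
    return pan_v - pan_c, tilt_v - tilt_c
```

The resulting deltas would then either be sent to the camera (the first variant above) or applied to it directly (the second variant).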
In this embodiment, when the virtual space scene is constructed, the configuration information, coordinates and pose of the camera corresponding to each real object in the real space scene may be imported. On this basis, the camera corresponding to the target virtual object in the real space scene may be determined in the following manner, but not limited to it:
from the cameras corresponding to the real objects in the real space scene, determine a target camera whose coordinates and pose match the viewing angle and viewing-angle parameters toward the target virtual object in the virtual space scene, and take that target camera as the camera corresponding to the target virtual object in the real space scene.
The configuration information of the camera corresponding to a real object may include, but is not limited to, the camera's access address and identifier, the image-transmission protocols it supports, and so on.
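As a concrete illustration of this matching step, the sketch below picks, from a set of registered cameras, the one whose position best aligns with the desired viewing direction toward the target real object. The dictionary keys ('id', 'address', 'protocol', 'position') and the dot-product alignment score are illustrative assumptions, not the patent's method; cameras are assumed steerable, so only position is scored here.

```python
import math

def select_target_camera(cameras, target_pos, desired_view_dir):
    """Pick the camera whose position best matches the desired viewing
    direction toward the target real object.

    cameras: list of dicts with hypothetical keys such as 'id', 'address',
    'protocol' and 'position' (x, y, z). Pose matching would refine this."""
    def alignment(cam):
        # Unit vector from the camera toward the target ...
        delta = [t - c for t, c in zip(target_pos, cam['position'])]
        norm = math.sqrt(sum(d * d for d in delta)) or 1.0
        to_target = [d / norm for d in delta]
        # ... scored against the desired viewing direction.
        return sum(a * b for a, b in zip(to_target, desired_view_dir))

    return max(cameras, key=alignment)
```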
Step S102: acquire the first real image captured by the camera after the shooting orientation is adjusted.
In this embodiment, the first real image captured by the orientation-adjusted camera may be acquired based on that camera's configuration information.
The first real image may be, but is not limited to, a single frame or a plurality of consecutive frames.
Step S103: control the spatial region corresponding to the target virtual object in the virtual space scene to display the first real image.
In this embodiment, the spatial region corresponding to the target virtual object in the virtual space scene may be, but is not limited to: the spatial region in which the target virtual object is shown, or a target spatial region other than the one in which the target virtual object is shown.
In one implementation, the spatial region corresponding to the target virtual object is the region in which the target virtual object is shown. Controlling that region to display the first real image makes the first real image cover the target virtual object, so that the first real image is presented in place of the target virtual object in the virtual space scene. For example, part (a) of Fig. 2 shows a target virtual object in a virtual space scene; part (b) of Fig. 2 shows the corresponding effect of the first real image covering the target virtual object.
Alternatively, the spatial region corresponding to the target virtual object may be a target spatial region other than the one showing the target virtual object, so that the first real image and the target virtual object are displayed together in the virtual space scene. For example, part (a) of Fig. 3 shows a target virtual object in a virtual space scene; in part (b) of Fig. 3, the first real image is displayed in a target spatial region other than the region showing the target virtual object.
In this embodiment, when the target virtual object in the virtual space scene is determined to be an object of interest to the user, at least the camera corresponding to the target virtual object in the real space scene is controlled to adjust its shooting orientation, based on the viewing angle and viewing-angle parameters toward the target virtual object in the virtual space scene; the first real image captured by the orientation-adjusted camera is acquired; and the spatial region corresponding to the target virtual object in the virtual space scene is controlled to display it. The target virtual object therefore appears more lifelike in the virtual space scene, the user can observe the real state corresponding to the target virtual object, and the user experience is improved. In addition, the amount of computation needed to simulate the real state of the target virtual object is reduced, saving the resources and computing power used to construct the virtual space scene.
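The three steps of this embodiment can be sketched as a single control pass. The class and method names (adjust_orientation, capture, display) are hypothetical stand-ins for the scene and camera services, not an API defined by the patent:

```python
class SpatialRegion:
    """Minimal stand-in for the spatial region that displays the image."""
    def __init__(self):
        self.displayed = None

    def display(self, image):
        self.displayed = image

def control_step(object_of_interest, camera, region):
    """One pass of steps S101-S103. `camera` is any object exposing
    adjust_orientation(view_angle, view_params) and capture()."""
    if object_of_interest is None:
        return None
    camera.adjust_orientation(object_of_interest['view_angle'],
                              object_of_interest['view_params'])  # S101
    first_real_image = camera.capture()                           # S102
    region.display(first_real_image)                              # S103
    return first_real_image
```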
As another optional embodiment of the present application, this embodiment mainly refines the step in embodiment 1 of determining that the target virtual object in the virtual space scene is an object of interest to the user. That determination may include, but is not limited to, the following steps:
and S1015, determining a target range in the virtual space scene based on the visual field range of the user in the virtual space scene.
This step may include, but is not limited to:
s10151, determining the visual field range of the user in the virtual space scene as the target range in the virtual space scene.
This step may also include, but is not limited to:
s10152, determining a scanning range of the sight line of the user in the virtual space scene in the visual field range of the user in the virtual space scene as a target range in the virtual space scene.
And S1016, determining a target virtual object corresponding to the sight of the user in the virtual space scene from the target range as the object of interest of the user.
The target virtual objects corresponding to the user's line of sight in the virtual space scene may include, but are not limited to:
and the time length of stay of the sight line of the user in the virtual space scene meets at least one of a set time length threshold value and the distance from the output position of the sight line of the user in the virtual space scene meets a set distance threshold value.
The set time length threshold and the set distance threshold may be set as needed, and are not limited in this application.
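A minimal sketch of this gaze-based test follows, matching the "at least one of" wording above. The threshold values and the use of the latest gaze point as the dwell point are illustrative assumptions, not values from the patent:

```python
def is_object_of_interest(gaze_samples, object_output_pos,
                          dwell_threshold_s=1.5, distance_threshold=0.3):
    """True if the line of sight has dwelt long enough on the object, or
    if the latest gaze point is close enough to the object's output position.

    gaze_samples: list of (timestamp_s, (x, y, z)) gaze points in scene
    coordinates, oldest first."""
    if not gaze_samples:
        return False
    dwell = gaze_samples[-1][0] - gaze_samples[0][0]        # dwell duration
    last_point = gaze_samples[-1][1]                        # dwell point
    dist = sum((a - b) ** 2
               for a, b in zip(last_point, object_output_pos)) ** 0.5
    return dwell >= dwell_threshold_s or dist <= distance_threshold
```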
In this embodiment, the target range in the virtual space scene is determined based on the user's field of view, and the target virtual object corresponding to the user's line of sight is determined from within that range as the object of interest. The user can thus designate an object of interest simply by changing the line of sight in the virtual space scene, without switching viewpoints; the manual display-control operations required of the user are reduced, and the user experience is improved.
As another alternative embodiment of the present application, Fig. 4 shows a flowchart of a control method provided in embodiment 3. This embodiment mainly refines step S101 of embodiment 1. As shown in Fig. 4, step S101 may include, but is not limited to, the following steps:
s1017, controlling a camera corresponding to the target virtual object in the real space scene to adjust the shooting direction and the shooting parameters based on the visual angle and the visual angle parameters of the target virtual object in the virtual space scene.
This step may include, but is not limited to:
s10171, at least determining shooting orientation adjustment parameters and shooting parameters of a camera corresponding to the target virtual object in the real space scene based on the visual angle and the visual angle parameters of the target virtual object in the virtual space scene.
S10172, sending the shooting orientation adjusting parameters and the shooting parameters to the camera, so that the camera adjusts the shooting orientation based on the shooting orientation adjusting parameters and carries out shooting setting based on the shooting parameters.
Of course, based on the view angle and the view angle parameters of the target virtual object in the virtual space scene, at least the camera corresponding to the target virtual object in the real space scene is controlled to adjust the shooting orientation, which may include but is not limited to:
s10173, at least determining shooting azimuth adjusting parameters and shooting parameters of a camera corresponding to the target virtual object in the real space scene based on the visual angle and the visual angle parameters of the target virtual object in the virtual space scene.
And S10174, adjusting the shooting orientation of the camera corresponding to the target virtual object in the real space scene based on the shooting orientation adjusting parameter.
S10175, based on the shooting parameters, shooting setting is carried out on the camera corresponding to the target virtual object in the real space scene.
The shooting parameters may include, but are not limited to, at least one of a zoom parameter, a focus-following parameter and an exposure parameter.
If the camera corresponding to the target virtual object in the real space scene comprises at least two cameras, the shooting parameters may at least include, but are not limited to:
the distance between the at least two cameras.
The distance between the at least two cameras affects the parallax of the real images they capture.
The parallax of the real images captured by the at least two cameras is adjusted by adjusting the distance between them, so that a setting condition is satisfied between this parallax and the parallax at which the user views the target virtual object in the virtual space scene. Specifically, the two cameras collect a left-eye image and a right-eye image respectively, and the user perceives the real image as three-dimensional through this pair. The viewing-angle parameters are those corresponding to the user observing the target virtual object in the virtual space scene and include first distance information, which represents the distance between the user and the target virtual object. The distance between the two cameras is adjusted according to the first distance information: by adjusting the parallax between the left-eye and right-eye images, the distance the user perceives from the real image is made consistent with the distance in the virtual space scene. The setting condition may be chosen as needed; for example, it may be, but is not limited to, a consistency condition.
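Under a simplified pinhole model (an illustration, not a formula given in the patent), equating the captured disparity, which is proportional to baseline / real distance, with the disparity the viewer's eyes would produce at the virtual distance, proportional to interpupillary distance / virtual distance, gives a candidate baseline:

```python
def stereo_baseline(virtual_distance, real_distance, ipd=0.063):
    """Camera separation (metres) that makes the captured stereo parallax
    consistent with the user-to-object distance in the virtual scene.

    Simplified pinhole model: captured disparity ~ baseline / real_distance,
    perceived disparity ~ ipd / virtual_distance; equating the two gives the
    baseline below. The default ipd is a typical interpupillary distance."""
    if virtual_distance <= 0 or real_distance <= 0:
        raise ValueError("distances must be positive")
    return ipd * real_distance / virtual_distance
```

Under this model, halving the virtual distance doubles the required camera separation, which matches the intuition that a closer virtual object needs larger parallax.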
Corresponding to this implementation of step S101, step S102 may include, but is not limited to:
S1021, acquiring the first real image captured by the camera after its shooting orientation and shooting parameters are adjusted.
As another alternative embodiment of the present application, Fig. 5 shows a flowchart of a control method provided in embodiment 4. This embodiment mainly extends embodiment 1. As shown in Fig. 5, the method includes, but is not limited to, the following steps:
step S201, when it is determined that the target virtual object in the virtual space scene is the object of interest to the user, based on the viewing angle and the viewing angle parameter of the target virtual object in the virtual space scene, at least controlling the camera corresponding to the target virtual object in the real space scene to adjust the shooting orientation.
And step S202, acquiring a first real image shot by the camera after the shooting direction is adjusted.
And S203, controlling a space area corresponding to the target virtual object in the virtual space scene to display a first real image.
The detailed processes of steps S201 to S203 can refer to the related descriptions of steps S101 to S103 in embodiment 1, and are not described herein again.
Step S205: determine that the viewing angle and viewing-angle parameters toward the target virtual object in the virtual space scene have changed, and update the camera's shooting orientation based on the changed viewing angle and viewing-angle parameters.
If the user wants to observe the real state of the target virtual object from a different angle, the user can adjust the line of sight to the target virtual object in the virtual space scene. After the line of sight is adjusted, the viewing angle and viewing-angle parameters toward the target virtual object change, and the electronic device can accordingly determine that they have changed.
Updating the camera's shooting orientation based on the changed viewing angle and viewing-angle parameters may include, but is not limited to: re-determining the camera's shooting-orientation adjustment parameters based on the changed viewing angle and viewing-angle parameters, and either sending the re-determined parameters to the camera so that it adjusts its shooting orientation accordingly, or directly adjusting the camera's shooting orientation based on the re-determined parameters.
Step S206: acquire the second real image captured by the camera after the shooting orientation is updated.
Step S207: update, based on the second real image, the first real image displayed in the spatial region corresponding to the target virtual object in the virtual space scene.
This step may include, but is not limited to: replacing the first real image with the second real image, and controlling the spatial region corresponding to the target virtual object in the virtual space scene to display the second real image.
Alternatively, it may include, but is not limited to: controlling the spatial region corresponding to the target virtual object in the virtual space scene to display both the first real image and the second real image.
Displaying both the first and second real images in that spatial region lets the user observe how the real state of the target virtual object changes, improving the user experience.
In this embodiment, when the target virtual object in the virtual space scene is determined to be an object of interest to the user, at least the camera corresponding to the target virtual object in the real space scene is controlled to adjust its shooting orientation, based on the viewing angle and viewing-angle parameters toward the target virtual object; the first real image captured by the orientation-adjusted camera is acquired; and the spatial region corresponding to the target virtual object is controlled to display it. The target virtual object therefore appears more lifelike, the user can observe its real state, and the user experience is improved. In addition, the amount of computation needed to simulate the real state of the target virtual object is reduced, saving the resources and computing power used to construct the virtual space scene.
Furthermore, when the viewing angle and viewing-angle parameters toward the target virtual object change, the camera's shooting orientation is updated accordingly, the second real image captured after the update is acquired, and the displayed first real image is updated based on it. The user is thus able to observe the real states corresponding to the target virtual object from different angles in the virtual space scene, which further improves the user experience.
As another alternative embodiment of the present application, Fig. 6 shows a flowchart of a control method provided in embodiment 5. This embodiment mainly extends embodiment 1. As shown in Fig. 6, the method includes, but is not limited to, the following steps:
step S301, under the condition that the target virtual object in the virtual space scene is determined to be the object which is interested by the user, at least controlling the camera corresponding to the target virtual object in the real space scene to adjust the shooting direction based on the visual angle and the visual angle parameter of the target virtual object in the virtual space scene.
The detailed process of step S301 may refer to the related description of step S101 in embodiment 1, and is not described herein again.
Step S302, determining that the visual angle between the camera after the shooting direction is adjusted and the target real object corresponding to the target virtual object in the real space scene and the visual angle to the target virtual object in the virtual space scene do not meet the consistency condition, determining the target virtual position in the virtual space scene, and outputting prompt information at the target virtual position, wherein the prompt information is used for prompting a virtual user to move to the target virtual position.
Specifically, a target view angle to the target virtual object in the virtual space scene may be determined based on a view angle between the camera after the shooting orientation is adjusted and the target real object corresponding to the target virtual object in the real space scene, and the target virtual position in the virtual space scene may be determined based on the target view angle.
When the virtual user is at the target virtual position, the virtual user's viewing angle toward the target virtual object satisfies the consistency condition with the viewing angle between the re-oriented camera and the target real object corresponding to the target virtual object in the real space scene.
Based on the prompt information, the user may choose to control the virtual user to move to the target virtual position. Moving the virtual user to the target virtual position can be understood as moving the user's virtual position in the virtual space scene to the target virtual position.
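The consistency check and the choice of target virtual position can be illustrated with a simple 2-D sketch. The angular tolerance and the planar geometry are assumptions for illustration; the application fixes neither:

```python
import math

ANGLE_TOLERANCE_DEG = 5.0  # assumed tolerance; the application does not specify one

def angles_consistent(camera_angle_deg, virtual_angle_deg, tol=ANGLE_TOLERANCE_DEG):
    """Consistency condition: the camera's angle toward the real object and the
    user's angle toward the virtual object agree within a tolerance (mod 360)."""
    diff = abs(camera_angle_deg - virtual_angle_deg) % 360.0
    return min(diff, 360.0 - diff) <= tol

def target_virtual_position(obj_pos, camera_angle_deg, distance):
    """Place the virtual user at `distance` from the virtual object so that the
    user's line of sight toward it matches the camera's angle (2-D sketch)."""
    a = math.radians(camera_angle_deg)
    return (obj_pos[0] + distance * math.cos(a),
            obj_pos[1] + distance * math.sin(a))
```

When `angles_consistent` is false, `target_virtual_position` gives the point at which the prompt information would be output.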
And step S303, acquiring a first real image shot by the camera after the shooting direction is adjusted.
And S304, controlling a space area corresponding to the target virtual object in the virtual space scene to display a first real image.
The detailed processes of steps S303 to S304 can be referred to the related descriptions of steps S102 to S103 in embodiment 1, and are not described herein again.
When the virtual user has moved to the target virtual position, the user views the first real image from that position, so the first real image the user sees corresponds to a viewing angle consistent with the viewing angle toward the target virtual object in the virtual space scene. This guarantees the viewing effect for the user and improves the user experience.
As another alternative embodiment of the present application, fig. 7 shows a flowchart of a control method provided in embodiment 6 of the present application. This embodiment is mainly an extension of the foregoing embodiment 5. As shown in fig. 7, the method includes, but is not limited to, the following steps:
Step S401: when it is determined that the target virtual object in the virtual space scene is an object the user is interested in, at least control, based on the viewing angle and viewing-angle parameters of the target virtual object in the virtual space scene, the camera corresponding to the target virtual object in the real space scene to adjust its shooting orientation.
Step S402: when it is determined that the viewing angle between the re-oriented camera and the target real object corresponding to the target virtual object in the real space scene does not satisfy the consistency condition with the viewing angle toward the target virtual object in the virtual space scene, determine a target virtual position in the virtual space scene and output prompt information at that position, where the prompt information prompts the virtual user to move to the target virtual position.
The detailed processes of steps S401 to S402 can refer to the related descriptions of steps S301 to S302 in embodiment 5, and are not described herein again.
And S403, controlling the virtual user to move to the target virtual position based on the prompt message.
In this embodiment, the user does not need to choose to control the virtual user to move: the electronic device controls the virtual user to move to the target virtual position based on the prompt information, and the user moves in correspondence with the virtual user, so that the user's virtual position in the virtual space scene moves to the target virtual position.
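Automatically moving the virtual user can be as simple as stepping its virtual position toward the target each frame. A minimal linear-motion sketch; the step model and speed are illustrative, since the application does not describe the motion:

```python
def step_toward(pos, target, speed):
    """Advance the virtual user's position one step toward the target virtual
    position, snapping exactly to the target once it is within one step."""
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= speed:
        return target
    return (pos[0] + speed * dx / dist, pos[1] + speed * dy / dist)

def move_to_target(pos, target, speed=0.5):
    """Repeat the step until the virtual user reaches the target virtual position."""
    while pos != target:
        pos = step_toward(pos, target, speed)
    return pos
```

Snapping to the target when it is within one step guarantees the loop terminates despite floating-point drift.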
And S404, acquiring a first real image shot by the camera after the shooting direction is adjusted.
And S405, controlling a space area corresponding to the target virtual object in the virtual space scene to display a first real image.
The detailed processes of steps S404-S405 can be referred to the related descriptions of steps S303-S304 in embodiment 5, and are not described herein again.
In this embodiment, when the viewing angle between the re-oriented camera and the target real object corresponding to the target virtual object in the real space scene does not satisfy the consistency condition with the viewing angle toward the target virtual object in the virtual space scene, a target virtual position in the virtual space scene is determined and prompt information is output at that position. The virtual user is then controlled, based on the prompt information, to move to the target virtual position, and once the virtual user has moved there, the user views the first real image from the target virtual position. The first real image the user sees therefore corresponds to the viewing angle toward the target virtual object in the virtual space scene, which guarantees the viewing effect and improves the user experience.
As another alternative embodiment of the present application, fig. 8 shows a flowchart of a control method provided in embodiment 7 of the present application. This embodiment is mainly an extension of the foregoing embodiment 1. As shown in fig. 8, the method includes, but is not limited to, the following steps:
Step S501: when it is determined that the target virtual object in the virtual space scene is an object the user is interested in, at least control, based on the viewing angle and viewing-angle parameters of the target virtual object in the virtual space scene, the camera corresponding to the target virtual object in the real space scene to adjust its shooting orientation.
And S502, acquiring a first real image shot by the camera after the shooting direction is adjusted.
And S503, controlling a space area corresponding to the target virtual object in the virtual space scene to display a first real image.
The detailed processes of steps S501 to S503 can refer to the related descriptions of steps S101 to S103 in embodiment 1, and are not described herein again.
Step S504: when it is determined that the target virtual object in the virtual space scene has switched from an object the user is interested in to one the user is not interested in, control the spatial region corresponding to the target virtual object in the virtual space scene to stop displaying the first real image.
In this embodiment, determining that the target virtual object in the virtual space scene has switched from an object the user is interested in to one the user is not interested in may include, but is not limited to:
determining that the dwell time of the user's line of sight on the target virtual object in the virtual space scene changes from satisfying the set duration threshold to no longer satisfying it;
and/or,
determining that the distance between the target virtual object in the virtual space scene and the position where the user's line of sight falls in the virtual space scene changes from satisfying the set distance threshold to no longer satisfying it.
The set time length threshold and the set distance threshold may be set as needed, and are not limited in this application.
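The switch from "object of interest" to "not of interest" can be expressed as a predicate over the two criteria above. The threshold values here are examples only, since the application explicitly leaves both open:

```python
DWELL_THRESHOLD_S = 2.0    # example duration threshold; not fixed by the application
DISTANCE_THRESHOLD = 0.3   # example distance threshold; not fixed by the application

def became_uninteresting(prev_dwell_s, curr_dwell_s, prev_gaze_dist, curr_gaze_dist,
                         dwell_th=DWELL_THRESHOLD_S, dist_th=DISTANCE_THRESHOLD):
    """True when the object switches from 'of interest' to 'not of interest':
    it met at least one criterion before (gaze dwell time or distance to the
    gaze point) and meets neither now."""
    was_interesting = prev_dwell_s >= dwell_th or prev_gaze_dist <= dist_th
    is_interesting = curr_dwell_s >= dwell_th or curr_gaze_dist <= dist_th
    return was_interesting and not is_interesting
```

When the predicate fires, the spatial region would stop displaying the first real image, as step S504 describes.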
In this embodiment, when it is determined that the target virtual object in the virtual space scene has switched from an object the user is interested in to one the user is not interested in, the spatial region corresponding to the target virtual object is controlled to stop displaying the first real image, ensuring that the display switches back to the target virtual object itself in the virtual space scene.
Next, a control device provided in the present application will be described, and the control device described below and the control method described above may be referred to in correspondence with each other.
Referring to fig. 9, the control device includes: a first control module 100, a first acquisition module 200, and a second control module 300.
The first control module 100 is configured to, when it is determined that a target virtual object in a virtual space scene is an object of interest to a user, control at least a camera corresponding to the target virtual object in a real space scene to adjust a shooting orientation based on an angle of view and an angle of view parameter of the target virtual object in the virtual space scene.
The first obtaining module 200 is configured to obtain a first real image shot by the camera after the shooting direction is adjusted.
The second control module 300 is configured to control a spatial region corresponding to a target virtual object in a virtual spatial scene to display a first real image.
In this embodiment, determining that the target virtual object in the virtual space scene is an object that is interested by the user may include, but is not limited to:
determining a target range in the virtual space scene based on the visual field range of the user in the virtual space scene;
and determining a target virtual object corresponding to the sight of the user in the virtual space scene from the target range as the object of interest of the user.
The target virtual objects corresponding to the user's line of sight in the virtual space scene may include, but are not limited to:
and the time length of stay of the sight line of the user in the virtual space scene meets at least one of a set time length threshold value and the distance from the output position of the sight line of the user in the virtual space scene meets a set distance threshold value.
In this embodiment, the first control module 100 may specifically be configured to:
and controlling a camera corresponding to the target virtual object in the real space scene to adjust the shooting direction and the shooting parameters based on the visual angle and the visual angle parameters of the target virtual object in the virtual space scene.
If the target virtual object corresponds to at least two cameras in the real space scene, the shooting parameters include at least:
a distance between at least two cameras.
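For the two-camera case, the inter-camera distance (the stereo baseline) can be related to the object's depth and the desired convergence angle. This is a toy geometric model for illustration; the application does not fix how the distance is derived:

```python
import math

def stereo_baseline(depth, convergence_angle_deg):
    """Distance between two cameras whose optical axes converge on an object
    at `depth`, for a desired convergence (viewing) angle — the angle
    subtended at the object by the two camera positions."""
    half = math.radians(convergence_angle_deg) / 2.0
    return 2.0 * depth * math.tan(half)
```

A wider viewing angle or a deeper object both call for a larger distance between the two cameras.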
In this embodiment, the control device may further include:
and the first updating module is used for determining that the visual angle and the visual angle parameters of the target virtual object in the virtual space scene are changed, and updating the shooting direction of the camera based on the changed visual angle and the changed visual angle parameters.
And the second acquisition module is used for acquiring a second real image shot by the camera after the shooting direction is updated.
And the second updating module is used for updating the first real image displayed in the space area corresponding to the target virtual object in the virtual space scene based on the second real image.
In this embodiment, the control device may further include:
The prompting module is configured to: when the viewing angle between the re-oriented camera and the target real object corresponding to the target virtual object in the real space scene does not satisfy the consistency condition with the viewing angle toward the target virtual object in the virtual space scene, determine a target virtual position in the virtual space scene and output prompt information at that position, where the prompt information prompts the virtual user to move to the target virtual position;
when the virtual user is at the target virtual position, the virtual user's viewing angle toward the target virtual object satisfies the consistency condition with the viewing angle between the re-oriented camera and the target real object corresponding to the target virtual object in the real space scene.
In this embodiment, the control device may further include:
and the third control module is used for controlling the virtual user to move to the target virtual position based on the prompt information.
In this embodiment, the control device may further include:
and the fourth control module is used for controlling a space area corresponding to the target virtual object in the virtual space scene to stop displaying the first real image under the condition that the target virtual object in the virtual space scene is determined to be switched from the object which is interested by the user to the object which is not interested by the user.
Corresponding to the embodiment of the control method provided by the application, the application also provides an embodiment of the electronic device using the control method.
The electronic device may include the following structure:
a memory and a processor.
A memory for storing at least one set of instructions;
a processor for calling and executing the instruction set in the memory, and executing the control method as described in any one of the above method embodiments 1-7 by executing the instruction set.
Corresponding to the above embodiment of the control method provided by the present application, the present application further provides an embodiment of a storage medium.
In this embodiment, a storage medium stores a computer program which, when executed by a processor, implements the control method described in any one of method embodiments 1 to 7.
It should be noted that the focus of each embodiment is different from that of other embodiments, and the same and similar parts between the embodiments may be referred to each other. For the device-like embodiment, since it is basically similar to the method embodiment, the description is simple, and for the relevant points, reference may be made to the partial description of the method embodiment.
Finally, it should also be noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a … …" does not exclude the presence of another identical element in a process, method, article, or apparatus that comprises the element.
For convenience of description, the above devices are described as being divided into various modules by functions, and are described separately. Of course, the functionality of the various modules may be implemented in the same one or more software and/or hardware implementations as the present application.
From the above description of the embodiments, it is clear to those skilled in the art that the present application can be implemented by software plus necessary general hardware platform. Based on such understanding, the technical solutions of the present application or portions thereof that contribute to the prior art may be embodied in the form of a software product, which may be stored in a storage medium, such as a ROM/RAM, a magnetic disk, an optical disk, or the like, and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute the method according to the embodiments or some portions of the embodiments of the present application.
The above detailed description is provided for a control method and apparatus provided by the present application, and the principle and the implementation of the present application are explained by applying specific examples, and the description of the above examples is only used to help understanding the method and the core idea of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (10)

1. A control method, comprising:
under the condition that a target virtual object in a virtual space scene is determined to be an object which is interested by a user, at least controlling a camera corresponding to the target virtual object in a real space scene to adjust a shooting direction based on a visual angle and visual angle parameters of the target virtual object in the virtual space scene;
acquiring a first real image shot by a camera after the shooting direction is adjusted;
and controlling a space area corresponding to the target virtual object in the virtual space scene to display the first real image.
2. The method of claim 1, determining that a target virtual object in the virtual space scene is an object of interest to the user, comprising:
determining a target range in a virtual space scene based on a visual field range of a user in the virtual space scene;
and determining a target virtual object corresponding to the sight of the user in the virtual space scene as an object of interest of the user from the target range.
3. The method of claim 2, a target virtual object corresponding to the user's gaze in the virtual space scene, comprising:
the time length of stay of the sight line of the user in the virtual space scene meets at least one of a set time length threshold value and a distance between the stay of the sight line of the user in the virtual space scene and an output position of the sight line of the user in the virtual space scene meets a set distance threshold value.
4. The method of claim 1, controlling at least a camera corresponding to the target virtual object in a real space scene to adjust a shooting orientation based on a view angle and view angle parameters of the target virtual object in the virtual space scene, comprising:
and controlling a camera corresponding to the target virtual object in the real space scene to adjust the shooting direction and the shooting parameters based on the visual angle and the visual angle parameters of the target virtual object in the virtual space scene.
5. The method of claim 4, wherein if the cameras corresponding to the target virtual object in the real space scene include at least two cameras, the capturing parameters at least include:
a distance between the at least two cameras.
6. The method of claim 1, further comprising:
determining that the visual angle and the visual angle parameter of the target virtual object in the virtual space scene change, and updating the shooting orientation of the camera based on the changed visual angle and the changed visual angle parameter;
acquiring a second real image shot by the camera after the shooting direction is updated;
updating the first real image displayed in the space region corresponding to the target virtual object in the virtual space scene based on the second real image.
7. The method of claim 1, further comprising:
determining that a visual angle between a camera after the shooting direction is adjusted and a target real object corresponding to the target virtual object in the real space scene and the visual angle to the target virtual object in the virtual space scene do not meet a consistency condition, determining a target virtual position in the virtual space scene, and outputting prompt information at the target virtual position, wherein the prompt information is used for prompting the virtual user to move to the target virtual position;
and under the condition that the virtual user is at the target virtual position, the visual angle of the virtual user to the target virtual object and the visual angle between the camera after the shooting direction is adjusted and the target real object corresponding to the target virtual object in the real space scene meet the consistency condition.
8. The method of claim 7, further comprising:
and controlling the virtual user to move to the target virtual position based on the prompt information.
9. The method of claim 1, further comprising:
and under the condition that the target virtual object in the virtual space scene is determined to be switched from the object which is interested by the user to the object which is not interested by the user, controlling a space area corresponding to the target virtual object in the virtual space scene to stop displaying the first real image.
10. A control device, comprising:
the first control module is used for controlling at least a camera corresponding to a target virtual object in a real space scene to adjust a shooting orientation based on a visual angle and visual angle parameters of the target virtual object in the virtual space scene under the condition that the target virtual object in the virtual space scene is determined to be an object interested by a user;
the first acquisition module is used for acquiring a first real image shot by the camera after the shooting direction is adjusted;
and the second control module is used for controlling a space area corresponding to the target virtual object in the virtual space scene to display the first real image.
CN202310101253.7A 2023-01-18 2023-01-18 Control method and device Pending CN115888093A (en)

Publications (1)

Publication Number Publication Date
CN115888093A true CN115888093A (en) 2023-04-04



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination