CN112967404A - Method and device for controlling movement of virtual object, electronic equipment and storage medium - Google Patents

Method and device for controlling movement of virtual object, electronic equipment and storage medium

Info

Publication number
CN112967404A
CN112967404A (application CN202110208654.3A)
Authority
CN
China
Prior art keywords
virtual object, information, equipment, determining, movement
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110208654.3A
Other languages
Chinese (zh)
Inventor
卢金莲
韦豪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen TetrasAI Technology Co Ltd
Original Assignee
Shenzhen TetrasAI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen TetrasAI Technology Co Ltd
Priority to CN202110208654.3A
Publication of CN112967404A
Legal status: Pending

Classifications

    • G06T 19/006: Mixed reality (under G06T 19/00, Manipulating 3D models or images for computer graphics)
    • G06T 19/003: Navigation within 3D models or images
    • G06T 17/05: Geographic models (under G06T 17/00, Three dimensional [3D] modelling)
    • G01C 21/005: Navigation with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C 21/165: Dead reckoning by integrating acceleration or speed (inertial navigation) combined with non-inertial navigation instruments (under G01C 21/16)

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Physics & Mathematics (AREA)
  • Software Systems (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Automation & Control Theory (AREA)
  • Geometry (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The present disclosure provides a method, an apparatus, an electronic device, and a storage medium for controlling the movement of a virtual object. The method comprises: acquiring inertial measurement unit (IMU) data of an AR device while the AR device displays an AR scene containing a target virtual object; determining movement state information of the AR device based on the IMU data; and controlling the target virtual object to move according to the movement state information. With this method and apparatus, in the AR scene displayed by the AR device, the movement state of the target virtual object changes with the state of the AR device, making the presented AR scene more intuitive and vivid and improving the quality of the AR service.

Description

Method and device for controlling movement of virtual object, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of computer application technologies, and in particular, to a method and an apparatus for controlling movement of a virtual object, an electronic device, and a storage medium.
Background
Augmented Reality (AR) technology superimposes corresponding images, video, and three-dimensional (3D) models onto a camera view according to the camera's position and angle calculated in real time, fusing the virtual world with the real world and providing users with a new interactive experience.
AR technology can be applied in various scenarios (e.g., AR navigation and AR games). Taking AR navigation as an example: as a navigation mode that combines AR technology with map information, it can provide users with more vivid, intuitive, and safe navigation services. After AR navigation is started on a navigation device, an AR navigation route superimposed on the real scene can be displayed on the device. How to make full use of AR technology so that the AR navigation process becomes more intuitive and visual is a problem requiring further research.
Disclosure of Invention
The embodiments of the present disclosure provide at least one scheme for controlling the movement of a virtual object. The scheme controls the movement state of the virtual object in an AR scene (such as an AR navigation scene) based on IMU data of an AR device, so that the presented AR scene is more intuitive and vivid and the AR service quality is improved.
The scheme mainly comprises the following aspects:
in a first aspect, an embodiment of the present disclosure provides a method for controlling movement of a virtual object, where the method includes:
acquiring inertial measurement unit (IMU) data of an AR device while the AR device displays an AR scene containing a target virtual object;
determining movement state information of the AR device based on the IMU data;
and controlling the target virtual object to move according to the movement state information.
With this method of controlling the movement of a virtual object, while the AR device displays an AR scene containing the target virtual object, the movement state information of the AR device can be determined from the acquired IMU data of the AR device, and the target virtual object can then be controlled to move according to that movement state information.
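For illustration only, the sketch below wires the three steps together; every identifier in it (the state fields, the object interface, the 30 fps update rate) is an assumption made for this example and is not defined by the disclosure.

```python
# Hypothetical sketch of the claimed pipeline: IMU data -> movement state
# information -> movement of the target virtual object.
from dataclasses import dataclass

@dataclass
class MovementState:
    speed: float    # m/s, derived upstream from the IMU acceleration data
    heading: float  # rad, derived upstream from the IMU angular velocity data

@dataclass
class VirtualObject:
    position: float = 0.0  # 1-D position in the AR scene, for illustration

    def move(self, state: MovementState, dt: float = 1 / 30) -> None:
        # Step 3: advance the object in step with the device's movement state.
        self.position += state.speed * dt

def on_imu_update(obj: VirtualObject, speed: float, heading: float) -> None:
    # Step 1 happens upstream: `speed`/`heading` stand in for processed IMU data.
    state = MovementState(speed=speed, heading=heading)  # step 2
    obj.move(state)                                      # step 3
```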
In one possible implementation, the IMU data includes three-axis acceleration information and three-axis angular velocity information, and determining the movement state information of the AR device based on the IMU data includes:
determining acceleration information of the AR device based on the three-axis acceleration information, and determining angular velocity information of the AR device based on the three-axis angular velocity information;
determining movement state information of the AR device based on the acceleration information and the angular velocity information.
Here, the movement state information of the AR device may be determined based on the three-axis acceleration information and the three-axis angular velocity information, and control of the target virtual object may then be achieved based on that movement state information, improving the AR service quality.
In a possible implementation, controlling the target virtual object to move according to the movement state information includes:
controlling the target virtual object to move synchronously according to the movement state information of the AR device.
Here, the target virtual object moves in step with the movement state of the AR device; for example, when the user holds the AR device while running, the target virtual object can also present a running posture, which greatly improves the user's AR experience.
In a possible implementation, controlling the target virtual object to move synchronously according to the movement state information of the AR device includes:
determining a target movement state type to which the AR device currently belongs based on the movement state information of the AR device;
determining target special effect data corresponding to the target movement state type based on the correspondence between each movement state type and each piece of special effect data;
and controlling the target virtual object to move synchronously according to the target movement state information and the target special effect data.
In the embodiments of the present disclosure, different special effect data can correspond to different movement state types, making the presented AR experience more engaging and further improving the quality of the AR service.
In one possible embodiment, the movement state information includes a movement speed and a movement direction, and the movement state type includes at least one of walking, running, and turning.
In one possible implementation, the AR scene is an AR navigation scene and the target virtual object is a virtual navigator; the method further comprises:
determining the position information of the AR device in a pre-constructed three-dimensional scene map based on a real scene image captured by the AR device and the three-dimensional scene map;
determining the position information of the virtual navigator in the three-dimensional scene map based on the position information of the AR device in the three-dimensional scene map;
and displaying an AR navigation scene containing the virtual navigator on the AR device according to the position information of the virtual navigator in the three-dimensional scene map.
Accordingly, acquiring the inertial measurement unit (IMU) data of the AR device while the AR device displays the AR scene containing the target virtual object comprises:
acquiring the IMU data of the AR device while the AR device displays the AR navigation scene containing the virtual navigator.
Here, when the AR scene is an AR navigation scene, the position information of the virtual navigator in the three-dimensional scene map may be determined based on the position information of the AR device in that map; for example, the navigator may be placed a preset distance in front of the AR device, so that it can guide the user through AR navigation more vividly, further improving the quality of the navigation service.
In one possible embodiment, determining the position information of the virtual navigator in the three-dimensional scene map based on the position information of the AR device in the three-dimensional scene map includes:
determining a navigation route according to the position information of the AR device in the world coordinate system corresponding to the three-dimensional scene map and the set destination information;
determining the position information of the virtual navigator in the three-dimensional scene map based on the position information of the AR device in the three-dimensional scene map and the navigation route, where the virtual navigator is located on the navigation route at a preset distance from the AR device.
Here, the position information of the virtual navigator on the navigation route may be determined based on the navigation route determined in three-dimensional space and the position information of the AR device itself, which facilitates the virtual navigator performing AR navigation along the determined navigation route.
In one possible embodiment, presenting the AR navigation scene containing the virtual navigator on the AR device according to the position information of the virtual navigator in the three-dimensional scene map includes:
displaying an AR navigation scene containing the virtual navigator and the navigation route on the AR device according to the position information of the virtual navigator in the three-dimensional scene map and the navigation route.
The displayed AR navigation scene can visually present the virtual navigator guiding along the navigation route, which greatly improves the interactive experience between the user and the AR device.
In a second aspect, an embodiment of the present disclosure further provides an apparatus for controlling movement of a virtual object, where the apparatus includes:
an acquisition module, configured to acquire inertial measurement unit (IMU) data of an AR device while the AR device displays an AR scene containing a target virtual object;
a determination module, configured to determine movement state information of the AR device based on the IMU data;
and a control module, configured to control the target virtual object to move according to the movement state information.
In a third aspect, an embodiment of the present disclosure further provides an electronic device, including a processor, a memory, and a bus. The memory stores machine-readable instructions executable by the processor. When the electronic device runs, the processor and the memory communicate via the bus, and the machine-readable instructions, when executed by the processor, perform the steps of the method for controlling the movement of a virtual object described in the first aspect or any of its implementations.
In a fourth aspect, the embodiments of the present disclosure further provide a computer-readable storage medium on which a computer program is stored; when the computer program is executed by an electronic device, the electronic device performs the steps of the method for controlling the movement of a virtual object described in the first aspect or any of its implementations.
For the description of the effects of the above apparatus, electronic device, and computer-readable storage medium for controlling the movement of the virtual object, reference is made to the above description of the method for controlling the movement of the virtual object, and details are not repeated here.
In order to make the aforementioned objects, features and advantages of the present disclosure more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
To illustrate the technical solutions of the embodiments of the present disclosure more clearly, the drawings required in the embodiments are briefly described below. The drawings, which are incorporated in and form a part of the specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the technical solutions of the present disclosure. It should be understood that the following drawings depict only certain embodiments of the disclosure and are therefore not to be considered limiting of its scope; those of ordinary skill in the art can derive other related drawings from them without inventive effort.
Fig. 1 is a flowchart illustrating a method for controlling movement of a virtual object according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram illustrating an application of a method for controlling movement of a virtual object according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram illustrating an apparatus for controlling movement of a virtual object according to a second embodiment of the disclosure;
fig. 4 shows a schematic diagram of an electronic device provided in a third embodiment of the present disclosure.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present disclosure more clear, the technical solutions of the embodiments of the present disclosure will be described clearly and completely with reference to the drawings in the embodiments of the present disclosure, and it is obvious that the described embodiments are only a part of the embodiments of the present disclosure, not all of the embodiments. The components of embodiments of the present disclosure, as generally described and illustrated herein, may be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present disclosure is not intended to limit the scope of the disclosure, as claimed, but is merely representative of selected embodiments of the disclosure. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the disclosure without making creative efforts, shall fall within the protection scope of the disclosure.
Research shows that Augmented Reality (AR) navigation in the prior art, as a navigation mode realized by combining AR technology with map information, can display an AR navigation route superimposed on the real scene on the navigation device after AR navigation is started. How to make full use of AR technology so that the AR navigation process becomes more intuitive and visual is a problem requiring further research.
Based on the above research, the present disclosure provides at least one solution for controlling the movement of a virtual object, which controls the movement state of the virtual object in an AR scene (e.g., an AR navigation scene) based on IMU data of an AR device, so that the presented AR scene is more intuitive and vivid and the AR service quality is improved.
The drawbacks described above were identified by the inventors through practice and careful study; therefore, the discovery of these problems and the solutions proposed below should both be regarded as the inventors' contribution to the present disclosure.
It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures.
To facilitate understanding of the present embodiment, the method for controlling the movement of a virtual object disclosed in the embodiments of the present disclosure is first described in detail. The execution subject of the method is generally an electronic device with certain computing capability, such as a user terminal, a server, or another processing device. The server may, for example, be connected to the user terminal; the user terminal may be a tablet computer, a smartphone, a smart wearable device, an augmented reality (AR) device (e.g., AR glasses or an AR helmet), or another device having a display function and data processing capability, and may connect to the server through an application program. In some possible implementations, the method may be implemented by a processor calling computer-readable instructions stored in a memory.
The following describes the method for controlling the movement of a virtual object according to an embodiment of the present disclosure, taking an AR device as the execution subject by way of example.
Example one
Referring to fig. 1, which is a flowchart of a method for controlling movement of a virtual object according to an embodiment of the present disclosure, the method includes steps S101 to S103, where:
s101, acquiring Inertial Measurement Unit (IMU) data of AR equipment in the process of displaying an AR scene containing a target virtual object by the AR equipment;
s102, determining the movement state information of the AR equipment based on the IMU data;
and S103, controlling the target virtual object to move according to the movement state information.
Here, to facilitate understanding of the method for controlling the movement of a virtual object provided by the embodiments of the present disclosure, its application scenarios are first described. The method can be applied to various scenarios in which an AR object needs to be controlled: for example, to a virtual game scene to control a game character, to an AR navigation scene to control a virtual object (such as a landmark) in that scene, or to various other application scenarios, which are not specifically limited here.
Considering that AR navigation is widely applied, for example to road navigation, navigation in exhibition halls, and navigation in other scenes, AR navigation is used as the example in the following.
In the method provided by the embodiments of the present disclosure, before the target virtual object in the AR scene displayed by the AR device is controlled, inertial measurement unit (IMU) data of the AR device may first be obtained, and the movement state information of the AR device may be determined based on the IMU data; the movement state information may include information such as a movement speed and a movement direction. The target virtual object in the AR scene can then be controlled to move based on this movement state information.
It should be noted that, in the embodiments of the present disclosure, the movement state of the target virtual object may be synchronized with the movement state indicated by the movement state information of the AR device, or may be derived from it after adjustment; for example, the target virtual object may move at double the movement speed of the AR device.
Considering that, in practice, the movement state of the AR device is largely governed by the motion of the user, and that synchronizing the two states is more conducive to improving the user's AR experience, in the embodiments of the present disclosure the target virtual object may be controlled to move synchronously according to the movement state information of the AR device.
For example, when the moving speed of the AR device is 5 km/h, the number of pixels by which the target virtual object moves in the presented AR scene may be determined from that speed, thereby realizing movement control of the target virtual object.
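A hedged sketch of such a speed-to-pixels mapping follows; the pixels-per-meter scale and the frame rate are assumed values, not figures from the disclosure.

```python
# Convert the device's physical speed into per-frame on-screen motion of the
# target virtual object, under assumed rendering parameters.
def pixels_per_frame(speed_kmh: float,
                     pixels_per_meter: float = 40.0,  # assumed screen scale
                     fps: float = 30.0) -> float:     # assumed frame rate
    """Per-frame pixel displacement of the virtual object for a device speed."""
    meters_per_second = speed_kmh / 3.6
    return meters_per_second * pixels_per_meter / fps

# At the 5 km/h example above this yields roughly 1.85 pixels per frame.
step = pixels_per_frame(5.0)
```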
The IMU data in the embodiments of the present disclosure may be measured by an IMU arranged on the AR device. As a device for measuring the three-axis attitude angle and three-axis acceleration of an object, the IMU may include three single-axis accelerometers and three single-axis gyroscopes: the accelerometers detect the acceleration of the object along the three independent axes of the carrier coordinate system, while the gyroscopes detect the angular velocity of the carrier relative to the navigation coordinate system. In other words, the angular velocity and acceleration of the object in three-dimensional space can be measured, and the posture of the object can be calculated from them. Here, the acceleration information and angular velocity information of the AR device may be measured in this manner, and the movement state information of the AR device corresponds to the calculated posture.
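A minimal sketch, under assumed units and a crude gravity model, of reducing the three-axis readings to the acceleration and angular velocity information described above; a production system would additionally filter and fuse these signals (e.g., with a complementary or Kalman filter).

```python
import math

GRAVITY = 9.81  # m/s^2, subtracted to approximate user-induced acceleration

def movement_state(accel_xyz, gyro_xyz):
    """accel_xyz: (ax, ay, az) in m/s^2; gyro_xyz: (gx, gy, gz) in rad/s."""
    ax, ay, az = accel_xyz
    gx, gy, gz = gyro_xyz
    # Coarse magnitude of linear acceleration: total magnitude minus gravity.
    linear_accel = abs(math.sqrt(ax * ax + ay * ay + az * az) - GRAVITY)
    # Magnitude of rotation; a sustained large value suggests the device is turning.
    angular_rate = math.sqrt(gx * gx + gy * gy + gz * gz)
    return {"acceleration": linear_accel, "angular_velocity": angular_rate}
```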
In the embodiments of the present disclosure, once the movement state information of the AR device has been determined in the above manner, the target virtual object may be controlled to move synchronously according to that information, and the special effect data corresponding to the movement state type may additionally be combined to realize the synchronous movement. The main consideration is that the degree of immersion offered by AR, a technology that fuses virtual information with the real world, directly influences how widely the technology is applied, and special effect data corresponding to the movement state type can improve the immersion of the AR scene.
In the embodiments of the present disclosure, the target virtual object may be controlled to move synchronously by combining the movement state information and the corresponding special effect data according to the following steps:
step one, determining a target movement state type to which the AR device currently belongs based on the movement state information of the AR device;
step two, determining target special effect data corresponding to the target movement state type based on the correspondence between each movement state type and each piece of special effect data;
and step three, controlling the target virtual object to move synchronously according to the target movement state information and the target special effect data.
The movement state type may be determined from the movement state information, and different movement state information corresponds to different movement state types. For example, if the moving speed of the AR device stays around 5 km/h for a period of time, the movement state type of the AR device may be determined to be walking; when the moving speed reaches 20 km/h, the movement state type may be determined to be running; and when the moving direction of the AR device changes, the AR device may be considered to have turned, for example turning while walking forward, or jumping while running. In addition, the movement state type in the embodiments of the present disclosure may be of other types, which are not specifically limited here.
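A minimal classification sketch using the walking and running speeds given above; the turning and idle thresholds are assumptions made for illustration.

```python
# Classify the device's target movement state type from its movement state
# information (speed plus rate of heading change).
def classify_movement_state(speed_kmh: float, heading_change_deg_s: float) -> str:
    if heading_change_deg_s > 30.0:  # assumed threshold for a direction change
        return "turning"
    if speed_kmh >= 20.0:            # running, per the 20 km/h example above
        return "running"
    if speed_kmh >= 1.0:             # walking, around the 5 km/h example above
        return "walking"
    return "idle"
```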
Here, different special effect data may be set for different movement state types. For example, when the device is determined to be entering a running state, the target virtual object may be controlled to transition to a running state so as to stay synchronized with the AR device; as another example, when the device is determined to be turning, the target virtual object in the AR scene may be controlled to turn accordingly to remain synchronized with the state of the AR device.
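For illustration, a sketch of such a correspondence between movement state types and special effect data, applied to a toy virtual-object class; the effect names and the object interface are invented for this example and do not come from the disclosure.

```python
# Assumed correspondence table: movement state type -> special effect data.
EFFECT_BY_STATE = {
    "walking": "walk_cycle",
    "running": "run_cycle",
    "turning": "turn_in_place",
    "idle": "idle_pose",
}

class TargetVirtualObject:
    def __init__(self):
        self.current_effect = "idle_pose"
        self.speed_kmh = 0.0

    def apply(self, state_type: str, speed_kmh: float) -> None:
        # Look up the target special effect data for the target state type...
        self.current_effect = EFFECT_BY_STATE.get(state_type, "idle_pose")
        # ...and move synchronously with the device's movement state information.
        self.speed_kmh = speed_kmh
```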
It should be noted that, in the embodiments of the present disclosure, different target virtual objects may correspond to different application scenarios. For example, for an AR navigation scene, the corresponding target virtual object may be a virtual object such as a virtual road sign or a virtual guide arrow, or a virtual character such as a virtual navigator; for an AR game scene, the corresponding target virtual object may be a virtual game character.
In the embodiments of the present disclosure, the AR scene containing the target virtual object displayed on the AR device also differs across application scenarios. Considering that the virtual navigator of an AR navigation scene is widely used for visual navigation in indoor exhibition halls and similar settings, the process of presenting the AR navigation scene containing the virtual navigator is described in detail; it can be realized through the following steps:
step one, determining the position information of the AR device in a pre-constructed three-dimensional scene map based on a real scene image captured by the AR device and the three-dimensional scene map;
step two, determining the position information of the virtual navigator in the three-dimensional scene map based on the position information of the AR device in the three-dimensional scene map;
and step three, displaying the AR navigation scene containing the virtual navigator on the AR device according to the position information of the virtual navigator in the three-dimensional scene map.
Here, to determine the position information of the AR device in the three-dimensional scene map, the map data corresponding to the real scene image captured by the AR device may first be retrieved from the pre-constructed three-dimensional scene map. The position information of the AR device in the three-dimensional scene map can then be determined from the retrieved map data together with the conversion relationships among the image coordinate system of the captured real scene image, the camera coordinate system of the AR device, and the world coordinate system of the three-dimensional scene map; that is, at that position, the map data corresponds to the real scene image captured by the AR device.
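For illustration only, one standard way to realize this localization step is a perspective-n-point (PnP) solve over matched 2D-3D correspondences; the sketch below assumes OpenCV and already-matched points, neither of which the disclosure specifies.

```python
import numpy as np
import cv2

def locate_device(points_2d, points_3d, K):
    """points_2d: Nx2 pixel coordinates from the captured real scene image;
    points_3d: Nx3 world coordinates retrieved from the pre-built 3D scene map;
    K: 3x3 camera intrinsics of the AR device."""
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(points_3d, dtype=np.float64),
        np.asarray(points_2d, dtype=np.float64),
        K, None)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    # Camera (device) position in the world coordinate system of the scene map.
    return (-R.T @ tvec).ravel()
```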
In the embodiments of the present disclosure, once the position information of the AR device in the three-dimensional scene map is determined, the position information of the virtual navigator in the three-dimensional scene map may be determined. The main consideration is that constructing the virtual navigator usually requires reference to the position of the user to whom the AR device belongs; that is, given a preset relative position relationship between the AR device and the virtual navigator, and the position information of the AR device in the three-dimensional scene map, the position information of the virtual navigator can be determined.
In a specific application, the preset relative position relationship may be configured. For example, the virtual navigator may be placed 1 meter in front of the AR device so that it can lead the navigation; alternatively, it may be placed 1 meter behind the AR device, following the user and ready to provide navigation assistance at any time. The position of the AR device itself may also be used as the position of the virtual navigator, so that the navigator serves as a virtual character matched with the user in three-dimensional space, improving the AR immersion and thereby the AR service quality.
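A minimal sketch of such a preset relative position relationship, using the 1-meter offsets mentioned above; the vector representation of the heading is an assumption.

```python
import numpy as np

def navigator_position(device_pos, device_heading, offset_m=1.0):
    """device_pos: (3,) world position of the AR device; device_heading: (3,)
    heading vector. offset_m > 0 places the navigator ahead, < 0 behind,
    and 0 co-locates it with the device."""
    h = np.asarray(device_heading, dtype=float)
    h = h / np.linalg.norm(h)  # normalize so offset_m is in meters
    return np.asarray(device_pos, dtype=float) + offset_m * h
```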
The virtual navigator in the embodiments of the present disclosure can perform AR navigation guidance along a navigation route, which can be realized through the following steps:
step one, determining a navigation route according to the position information of the AR device in the world coordinate system corresponding to the three-dimensional scene map and the set destination information;
and step two, determining the position information of the virtual navigator in the three-dimensional scene map based on the position information of the AR device in the three-dimensional scene map and the navigation route, where the virtual navigator is located on the navigation route at a preset distance from the AR device.
Here, given the position information of the AR device and the preset destination information, a route from the position of the AR device (i.e., the position of the user) to the user's destination may be planned with a suitable navigation route planning method, thereby determining the navigation route. In this way, given the preset distance between the virtual navigator and the AR device, the position information of the virtual navigator on the navigation route in the three-dimensional scene map can be determined.
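A minimal sketch, under the assumption that the navigation route is a polyline of 3D vertices, of pinning the navigator to the route at the preset distance ahead of the point nearest the device.

```python
import numpy as np

def navigator_on_route(route, device_pos, preset_distance):
    """route: list of (x, y, z) vertices along the planned navigation route;
    returns the navigator's position on the polyline."""
    pts = np.asarray(route, dtype=float)
    # Route vertex closest to the AR device (i.e., to the user).
    nearest = int(np.argmin(np.linalg.norm(
        pts - np.asarray(device_pos, dtype=float), axis=1)))
    # Walk the preset distance further along the polyline from that vertex.
    remaining = preset_distance
    for i in range(nearest, len(pts) - 1):
        seg = np.linalg.norm(pts[i + 1] - pts[i])
        if remaining <= seg:
            return pts[i] + (pts[i + 1] - pts[i]) * (remaining / seg)
        remaining -= seg
    return pts[-1]  # clamp at the destination
```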
In the embodiments of the present disclosure, once the position information of the virtual navigator on the navigation route in the three-dimensional scene map is determined, the AR navigation scene containing the virtual navigator and the navigation route may be displayed on the AR device.
Fig. 2 is a schematic diagram of an AR navigation scene containing a virtual navigator and a navigation route, provided by the embodiment of the present disclosure. The sign S in fig. 2 indicates the position of the virtual navigator and the sign D indicates the destination, so that once the navigation route is determined, the virtual navigator can guide the user to the destination.
As can be seen, the AR navigation scene presented by the method for controlling the movement of a virtual object provided by the embodiments of the present disclosure can visualize not only the navigation route but also a virtual navigator with a guiding role, and the movement state of the virtual navigator stays synchronized with the movement state of the AR device, further improving the immersion of the AR navigation scene and the quality of the AR navigation service.
It will be understood by those skilled in the art that, in the methods of the present disclosure, the order in which the steps are written does not imply a strict order of execution or impose any limitation on the implementation; the specific order of execution of the steps should be determined by their functions and possible inherent logic.
Based on the same inventive concept, the embodiment of the present disclosure further provides a device for controlling movement of a virtual object corresponding to the method for controlling movement of a virtual object, and since the principle of solving the problem of the device in the embodiment of the present disclosure is similar to the method for controlling movement of a virtual object in the embodiment of the present disclosure, the implementation of the device may refer to the implementation of the method, and repeated details are omitted.
Example two
Referring to fig. 3, a schematic diagram of an apparatus for controlling the movement of a virtual object according to an embodiment of the present disclosure is shown. The apparatus includes an acquisition module 301, a determination module 302, and a control module 303, wherein:
the acquisition module 301 is configured to acquire inertial measurement unit (IMU) data of an augmented reality (AR) device while the AR device displays an AR scene containing a target virtual object;
the determination module 302 is configured to determine movement state information of the AR device based on the IMU data;
and the control module 303 is configured to control the target virtual object to move according to the movement state information.
With this apparatus for controlling the movement of a virtual object, while the augmented reality (AR) device displays an AR scene containing the target virtual object, the movement state information of the AR device can be determined from the acquired IMU data of the AR device, and the target virtual object can then be controlled to move according to that movement state information.
In one possible implementation, the IMU data includes three-axis acceleration information and three-axis angular velocity information, and the determination module 302 is configured to determine the movement state information of the AR device based on the IMU data according to the following steps:
determining acceleration information of the AR device based on the three-axis acceleration information, and determining angular velocity information of the AR device based on the three-axis angular velocity information;
and determining the movement state information of the AR device based on the acceleration information and the angular velocity information.
In one possible implementation, the control module 303 is configured to control the target virtual object to move according to the movement state information according to the following step:
controlling the target virtual object to move synchronously according to the movement state information of the AR device.
In a possible implementation, the control module 303 is configured to control the target virtual object to move synchronously according to the movement state information of the AR device according to the following steps:
determining a target movement state type to which the AR device currently belongs based on the movement state information of the AR device;
determining target special effect data corresponding to the target movement state type based on the correspondence between each movement state type and each piece of special effect data;
and controlling the target virtual object to move synchronously according to the target movement state information and the target special effect data.
In one possible embodiment, the movement state information includes a movement speed and a movement direction, and the movement state type includes at least one of walking, running, and turning.
In one possible implementation, the AR scene is an AR navigation scene and the target virtual object is a virtual navigator; the acquisition module 301 is further configured to:
determine the position information of the AR device in a pre-constructed three-dimensional scene map based on a real scene image captured by the AR device and the three-dimensional scene map;
determine the position information of the virtual navigator in the three-dimensional scene map based on the position information of the AR device in the three-dimensional scene map;
display an AR navigation scene containing the virtual navigator on the AR device according to the position information of the virtual navigator in the three-dimensional scene map;
and acquire the inertial measurement unit (IMU) data of the AR device while the AR device displays the AR navigation scene containing the virtual navigator.
In one possible implementation, the acquisition module 301 is configured to determine the position information of the virtual navigator in the three-dimensional scene map based on the position information of the AR device in the three-dimensional scene map according to the following steps:
determining a navigation route according to the position information of the AR device in the world coordinate system corresponding to the three-dimensional scene map and the set destination information;
and determining the position information of the virtual navigator in the three-dimensional scene map based on the position information of the AR device in the three-dimensional scene map and the navigation route, where the virtual navigator is located on the navigation route at a preset distance from the AR device.
In a possible implementation, the acquisition module 301 is configured to display, on the AR device, the AR navigation scene containing the virtual navigator according to the determined position information of the virtual navigator in the three-dimensional scene map according to the following step:
displaying the AR navigation scene containing the virtual navigator and the navigation route on the AR device according to the position information of the virtual navigator in the three-dimensional scene map and the navigation route.
The description of the processing flow of each module in the device and the interaction flow between the modules may refer to the related description in the above method embodiments, and will not be described in detail here.
EXAMPLE III
An embodiment of the present disclosure further provides an electronic device. As shown in fig. 4, a schematic structural diagram of the electronic device, the electronic device includes a processor 401, a memory 402, and a bus 403. The memory 402 stores machine-readable instructions executable by the processor 401 (for example, execution instructions corresponding to the acquisition module 301, the determination module 302, and the control module 303 in the apparatus for controlling the movement of a virtual object in fig. 3). When the electronic device runs, the processor 401 communicates with the memory 402 via the bus 403, and the machine-readable instructions, when executed by the processor 401, perform the following processes:
acquiring inertial measurement unit (IMU) data of an AR device while the AR device displays an AR scene containing a target virtual object;
determining movement state information of the AR device based on the IMU data;
and controlling the target virtual object to move according to the movement state information.
In one possible implementation, the IMU data includes three-axis acceleration information and three-axis angular velocity information, and in the instructions executed by the processor 401, determining the movement state information of the AR device based on the IMU data includes:
determining acceleration information of the AR device based on the three-axis acceleration information, and determining angular velocity information of the AR device based on the three-axis angular velocity information;
and determining the movement state information of the AR device based on the acceleration information and the angular velocity information.
In one possible implementation, in the instructions executed by the processor 401, controlling the target virtual object to move according to the movement state information includes:
controlling the target virtual object to move synchronously according to the movement state information of the AR device.
In a possible implementation, in the instructions executed by the processor 401, controlling the target virtual object to move synchronously according to the movement state information of the AR device includes:
determining a target movement state type to which the AR device currently belongs based on the movement state information of the AR device;
determining target special effect data corresponding to the target movement state type based on the correspondence between each movement state type and each piece of special effect data;
and controlling the target virtual object to move synchronously according to the target movement state information and the target special effect data.
In one possible embodiment, the movement state information includes a movement speed and a movement direction, and the movement state type includes at least one of walking, running, and turning.
In one possible implementation, the AR scene is an AR navigation scene and the target virtual object is a virtual navigator; the instructions executed by the processor 401 further include:
determining the position information of the AR device in a pre-constructed three-dimensional scene map based on a real scene image captured by the AR device and the three-dimensional scene map;
determining the position information of the virtual navigator in the three-dimensional scene map based on the position information of the AR device in the three-dimensional scene map;
and displaying an AR navigation scene containing the virtual navigator on the AR device according to the position information of the virtual navigator in the three-dimensional scene map.
In the instructions executed by the processor 401, acquiring the inertial measurement unit (IMU) data of the AR device while the augmented reality (AR) device displays the AR scene containing the target virtual object includes:
acquiring the IMU data of the AR device while the AR device displays the AR navigation scene containing the virtual navigator.
In one possible implementation, in the instructions executed by the processor 401, determining the position information of the virtual navigator in the three-dimensional scene map based on the position information of the AR device in the three-dimensional scene map includes:
determining a navigation route according to the position information of the AR device in the world coordinate system corresponding to the three-dimensional scene map and the set destination information;
and determining the position information of the virtual navigator in the three-dimensional scene map based on the position information of the AR device in the three-dimensional scene map and the navigation route, where the virtual navigator is located on the navigation route at a preset distance from the AR device.
In a possible implementation, in the instructions executed by the processor 401, presenting the AR navigation scene containing the virtual navigator on the AR device according to the position information of the virtual navigator in the three-dimensional scene map includes:
displaying the AR navigation scene containing the virtual navigator and the navigation route on the AR device according to the position information of the virtual navigator in the three-dimensional scene map and the navigation route.
For the specific execution process of the instructions, reference may be made to the steps of the method for controlling the movement of a virtual object in the first embodiment of the present disclosure, which are not repeated here.
The embodiments of the present disclosure further provide a computer-readable storage medium on which a computer program is stored; when executed by a processor, the computer program performs the steps of the method for controlling the movement of a virtual object described in the first method embodiment above. The storage medium may be a volatile or non-volatile computer-readable storage medium.
A computer program product of the method for controlling the movement of a virtual object provided by the embodiments of the present disclosure includes a computer-readable storage medium storing program code; the instructions included in the program code may be used to perform the steps of the method described in the first method embodiment.
The embodiments of the present disclosure also provide a computer program, which when executed by a processor implements any one of the methods of the foregoing embodiments. The computer program product may be embodied in hardware, software or a combination thereof. In an alternative embodiment, the computer program product is embodied in a computer storage medium, and in another alternative embodiment, the computer program product is embodied in a Software product, such as a Software Development Kit (SDK), or the like.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the system and the apparatus described above may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again. In the several embodiments provided in the present disclosure, it should be understood that the disclosed system, apparatus, and method may be implemented in other ways. The above-described embodiments of the apparatus are merely illustrative, and for example, the division of the units is only one logical division, and there may be other divisions when actually implemented, and for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of devices or units through some communication interfaces, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a non-volatile computer-readable storage medium executable by a processor. Based on such understanding, the technical solution of the present disclosure may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing an electronic device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present disclosure. And the aforementioned storage medium includes: various media capable of storing program codes, such as a usb disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
Finally, it should be noted that: the above-mentioned embodiments are merely specific embodiments of the present disclosure, which are used for illustrating the technical solutions of the present disclosure and not for limiting the same, and the scope of the present disclosure is not limited thereto, and although the present disclosure is described in detail with reference to the foregoing embodiments, those skilled in the art should understand that: any person skilled in the art can modify or easily conceive of the technical solutions described in the foregoing embodiments or equivalent technical features thereof within the technical scope of the present disclosure; such modifications, changes or substitutions do not depart from the spirit and scope of the embodiments of the present disclosure, and should be construed as being included therein. Therefore, the protection scope of the present disclosure shall be subject to the protection scope of the claims.

Claims (11)

1. A method of controlling movement of a virtual object, the method comprising:
acquiring inertial measurement unit (IMU) data of an AR device while the AR device displays an AR scene containing a target virtual object;
determining movement state information of the AR device based on the IMU data;
and controlling the target virtual object to move according to the movement state information.
2. The method according to claim 1, wherein the IMU data includes three-axis acceleration information and three-axis angular velocity information, and determining the movement state information of the AR device based on the IMU data includes:
determining acceleration information of the AR device based on the three-axis acceleration information, and determining angular velocity information of the AR device based on the three-axis angular velocity information;
determining movement state information of the AR device based on the acceleration information and the angular velocity information.
3. The method according to claim 1 or 2, wherein controlling the target virtual object to move according to the movement state information comprises:
controlling the target virtual object to move synchronously according to the movement state information of the AR device.
4. The method according to claim 3, wherein controlling the target virtual object to move synchronously according to the movement state information of the AR device comprises:
determining a target movement state type to which the AR device currently belongs based on the movement state information of the AR device;
determining target special effect data corresponding to the target movement state type based on the correspondence between each movement state type and each piece of special effect data;
and controlling the target virtual object to move synchronously according to the target movement state information and the target special effect data.
5. The method according to claim 4, wherein the movement state information includes a movement speed and a movement direction, and the movement state type includes at least one of walking, running, and turning.
6. The method according to any one of claims 1 to 5, wherein the AR scene is an AR navigation scene and the target virtual object is a virtual navigator, the method further comprising:
determining the position information of the AR device in a pre-constructed three-dimensional scene map based on a real scene image captured by the AR device and the three-dimensional scene map;
determining the position information of the virtual navigator in the three-dimensional scene map based on the position information of the AR device in the three-dimensional scene map;
and displaying an AR navigation scene containing the virtual navigator on the AR device according to the position information of the virtual navigator in the three-dimensional scene map;
wherein acquiring the inertial measurement unit (IMU) data of the AR device while the AR device displays the AR scene containing the target virtual object comprises:
acquiring the IMU data of the AR device while the AR device displays the AR navigation scene containing the virtual navigator.
7. The method according to claim 6, wherein determining the position information of the virtual navigator in the three-dimensional scene map based on the position information of the AR device in the three-dimensional scene map comprises:
determining a navigation route according to the position information of the AR device in the world coordinate system corresponding to the three-dimensional scene map and the set destination information;
and determining the position information of the virtual navigator in the three-dimensional scene map based on the position information of the AR device in the three-dimensional scene map and the navigation route, wherein the virtual navigator is located on the navigation route at a preset distance from the AR device.
8. The method according to claim 7, wherein presenting the AR navigation scene containing the virtual navigator on the AR device according to the position information of the virtual navigator in the three-dimensional scene map comprises:
displaying an AR navigation scene containing the virtual navigator and the navigation route on the AR device according to the position information of the virtual navigator in the three-dimensional scene map and the navigation route.
9. An apparatus for controlling movement of a virtual object, the apparatus comprising:
an acquisition module, configured to acquire inertial measurement unit (IMU) data of an AR device while the AR device displays an AR scene containing a target virtual object;
a determination module, configured to determine movement state information of the AR device based on the IMU data;
and a control module, configured to control the target virtual object to move according to the movement state information.
10. An electronic device, comprising a processor, a memory, and a bus, wherein the memory stores machine-readable instructions executable by the processor; when the electronic device operates, the processor and the memory communicate via the bus, and the machine-readable instructions, when executed by the processor, perform the method of controlling movement of a virtual object according to any one of claims 1 to 8.
11. A computer-readable storage medium having stored thereon a computer program which, when executed by an electronic device, causes the electronic device to perform the method of controlling movement of a virtual object according to any one of claims 1 to 8.
CN202110208654.3A 2021-02-24 2021-02-24 Method and device for controlling movement of virtual object, electronic equipment and storage medium Pending CN112967404A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110208654.3A CN112967404A (en) 2021-02-24 2021-02-24 Method and device for controlling movement of virtual object, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN112967404A true CN112967404A (en) 2021-06-15

Family

ID=76286012

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110208654.3A Pending CN112967404A (en) 2021-02-24 2021-02-24 Method and device for controlling movement of virtual object, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN112967404A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016187477A1 (en) * 2015-05-20 2016-11-24 Daqri, Llc Virtual personification for augmented reality system
US20170270711A1 (en) * 2016-03-16 2017-09-21 Michael John Schoenberg Virtual object pathing
CN112034988A (en) * 2017-08-31 2020-12-04 苹果公司 Systems, methods, and graphical user interfaces for interacting with augmented and virtual reality environments
CN111325849A (en) * 2018-12-14 2020-06-23 广东虚拟现实科技有限公司 Virtual content display method and device, terminal equipment and storage medium
CN111078003A (en) * 2019-11-27 2020-04-28 Oppo广东移动通信有限公司 Data processing method and device, electronic equipment and storage medium
CN111176465A (en) * 2019-12-25 2020-05-19 Oppo广东移动通信有限公司 Use state identification method and device, storage medium and electronic equipment
CN112146649A (en) * 2020-09-23 2020-12-29 北京市商汤科技开发有限公司 Navigation method and device in AR scene, computer equipment and storage medium

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113362475A (en) * 2021-06-29 2021-09-07 视伴科技(北京)有限公司 Method and device for verifying generated virtual road directing identification
CN114661398A (en) * 2022-03-22 2022-06-24 上海商汤智能科技有限公司 Information display method and device, computer equipment and storage medium
CN114661398B (en) * 2022-03-22 2024-05-17 上海商汤智能科技有限公司 Information display method and device, computer equipment and storage medium
CN115640953A (en) * 2022-09-13 2023-01-24 深圳会邦科技有限公司 Intelligent digital processing method, system, equipment and medium for conference
CN115640953B (en) * 2022-09-13 2023-08-08 深圳会邦科技有限公司 Intelligent digital processing method, system, equipment and medium for meeting business

Similar Documents

Publication Publication Date Title
CN112967404A (en) Method and device for controlling movement of virtual object, electronic equipment and storage medium
CN107407567B (en) Augmented reality navigation
JP6329343B2 (en) Image processing system, image processing apparatus, image processing program, and image processing method
CN112146649B (en) Navigation method and device in AR scene, computer equipment and storage medium
Vera et al. Augmented mirror: interactive augmented reality system based on kinect
KR20210047278A (en) AR scene image processing method, device, electronic device and storage medium
US20160163063A1 (en) Mixed-reality visualization and method
CN111610998A (en) AR scene content generation method, display method, device and storage medium
CN112148189A (en) Interaction method and device in AR scene, electronic equipment and storage medium
WO2014204330A1 (en) Methods and systems for determining 6dof location and orientation of head-mounted display and associated user movements
JP2012128779A (en) Virtual object display device
CN112729327A (en) Navigation method, navigation device, computer equipment and storage medium
US20210304509A1 (en) Systems and methods for virtual and augmented reality
CN108388347B (en) Interaction control method and device in virtual reality, storage medium and terminal
KR20210148074A (en) AR scenario content creation method, display method, device and storage medium
CN108830944A (en) Optical perspective formula three-dimensional near-eye display system and display methods
CN111693063A (en) Navigation interaction display method and device, electronic equipment and storage medium
Ayyanchira et al. Toward cross-platform immersive visualization for indoor navigation and collaboration with augmented reality
CN113359983A (en) Augmented reality data presentation method and device, electronic equipment and storage medium
CN111569414B (en) Flight display method and device of virtual aircraft, electronic equipment and storage medium
CN113345108A (en) Augmented reality data display method and device, electronic equipment and storage medium
CN111815783A (en) Virtual scene presenting method and device, electronic equipment and storage medium
CN110956702A (en) 3D visual editor and editing method based on time axis
CN113256710B (en) Method and device for displaying foresight in game, computer equipment and storage medium
KR101914660B1 (en) Method and apparatus for controlling displaying of augmented reality contents based on gyro sensor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination