WO2018201825A1 - Method, device and system for turn control - Google Patents

Method, device and system for turn control

Info

Publication number
WO2018201825A1
Authority
WO
WIPO (PCT)
Prior art keywords
target object
scene
steering
real
virtual scene
Prior art date
Application number
PCT/CN2018/080627
Other languages
French (fr)
Chinese (zh)
Inventor
韩振泽
王晓阳
杨俊�
张佳宁
张道宁
Original Assignee
北京凌宇智控科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京凌宇智控科技有限公司 filed Critical 北京凌宇智控科技有限公司
Publication of WO2018201825A1 publication Critical patent/WO2018201825A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Definitions

  • the present application relates to, but is not limited to, information processing technology, and in particular, to a steering control method, apparatus and system.
  • the positioning technology currently in use locates the user only within the 180-degree range in which the user faces the camera or the signal-transmitting base station. If the user wants to turn around, for example because in a virtual game there is a table behind the user and the user wants to turn and pick something up from that table,
  • the locator of the user's head-mounted display device (hereinafter, the head display) and the handle locator can then no longer be recognized by the camera, or can no longer receive the positioning signal transmitted by the signal-transmitting base station, so neither locator can be positioned. With this kind of positioning, the user therefore cannot turn around to operate on the scene behind them, which degrades the user experience.
  • the embodiments of the present application provide a steering control method, device, and system, which solve the problem that positioning fails after the user turns around under such positioning schemes.
  • the embodiment of the present application provides a steering control method, used to adjust the scene information faced by the character corresponding to a target object in a virtual scene. The method includes: after a steering trigger instruction is received, determining, according to the position of the target object in the real scene and the steering angle indicated by the instruction, the position of the corresponding character in the virtual scene after turning by that angle; and adjusting, according to that position, the scene information faced by the corresponding character in the virtual scene.
  • At least two positioning devices may be disposed on the target object
  • determining, according to the position of the target object in the real scene and the steering angle indicated by the steering trigger instruction, the position of the corresponding character in the virtual scene after turning by the steering angle may include: determining, from the coordinates of the at least two positioning devices in the real scene and the steering angle, the coordinates of the at least two positioning devices in the virtual scene after the corresponding character has turned by the steering angle.
  • this determining step may further include: determining, from the coordinates of the at least two positioning devices in the real scene and the steering angle, the poses of the at least two positioning devices in the virtual scene after the corresponding character has turned by the steering angle.
  • after the scene information faced by the corresponding character in the virtual scene has been adjusted, the method may further include:
  • when the target object moves in the real scene, determining the coordinates of the at least two positioning devices in the virtual scene while the target object is at its current real position, according to the coordinates of the at least two positioning devices in the real scene at that position, the coordinates of at least one of them in the real scene at the moment the steering trigger instruction was received, and the steering angle.
  • after the coordinates of the at least two positioning devices in the virtual scene at that real-time position have been determined, the method may further include: further adjusting, according to those coordinates, the scene information faced by the corresponding character in the virtual scene.
  • the number of the positioning devices may be three, one of the positioning devices may be mounted on the head mounted display device, and the other two positioning devices may be respectively mounted on the two handles.
  • the steering angle may be 180 degrees.
  • the embodiment of the present application further provides a steering control apparatus, used to adjust the scene information faced by the character corresponding to a target object in a virtual scene;
  • the steering control apparatus includes:
  • a receiving module configured to receive a steering trigger command
  • a position adjustment module configured to determine, after the receiving module receives the steering trigger instruction and according to the position of the target object in the real scene and the steering angle indicated by the instruction, the position of the corresponding character of the target object in the virtual scene after turning by the steering angle;
  • a display control module configured to adjust the scene information faced by the corresponding character in the virtual scene according to the position of the corresponding character after turning by the steering angle.
  • At least two positioning devices may be disposed on the target object
  • the position adjustment module may be configured to determine, from the coordinates of the at least two positioning devices in the real scene and the steering angle, the coordinates and poses of the at least two positioning devices in the virtual scene after the corresponding character has turned by the steering angle.
  • the position adjustment module may be further configured to, when the target object moves in the real scene, determine the coordinates of the at least two positioning devices in the virtual scene while the target object is at its current real position, according to their coordinates in the real scene at that position, the coordinates of at least one of them at the moment the steering trigger instruction was received, and the steering angle.
  • the embodiment of the present application further provides a steering control system, including: a steering control device and at least two positioning devices disposed on the target object;
  • the at least two positioning devices are configured to determine a position of the target object in a real scene
  • the steering control device is configured to determine, after receiving the steering trigger instruction and according to the position of the target object in the real scene and the steering angle indicated by the instruction, the position of the corresponding character in the virtual scene after turning by the steering angle; and to adjust, according to that position, the scene information faced by the corresponding character in the virtual scene.
  • the positioning device may be configured to determine coordinates in a real-world scene by receiving a positioning signal transmitted by the signal transmitter.
  • the steering control device may be configured to determine the position of the corresponding character in the virtual scene after turning by the steering angle in the following manner: determining, from the coordinates of the at least two positioning devices in the real scene and the steering angle, the coordinates and poses of the at least two positioning devices in the virtual scene after the corresponding character has turned by the steering angle.
  • the embodiment of the present application further provides a terminal, including a memory and a processor, where the memory stores a steering control program that, when executed by the processor, implements the steps of the steering control method.
  • Embodiments of the present application also provide a machine readable medium storing a steering control program that implements the steps of the steering control method when the steering control program is executed by a processor.
  • in the embodiments of the present application, when the target object (such as a user) needs to turn in the virtual scene, the target object does not need to turn in the real scene; only the corresponding character in the virtual scene is made to turn, so that, for example,
  • the corresponding character can face the scene that was previously behind the target object. This avoids the loss of positioning that turning in the real scene would cause, and thereby improves the user experience.
  • FIG. 1 is a flowchart of a steering control method provided by an embodiment of the present application.
  • FIG. 2 is a schematic diagram of the position conversion when the character corresponding to a target object turns, according to an embodiment of the present application;
  • FIG. 3 is a schematic diagram of the positional movement of the character corresponding to a target object after the turn, according to an embodiment of the present application;
  • FIG. 4 is a schematic diagram of a steering control apparatus according to an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a steering control system according to an embodiment of the present application.
  • the embodiment provides a steering control method for adjusting scene information that a target object faces in a virtual scene.
  • the steering control method of this embodiment includes:
  • the target object may be a person, such as a controller of a virtual game.
  • in an exemplary embodiment, the target object changes the scene information it sees in the virtual scene by moving in the real scene; or there is a character in the virtual scene corresponding to the target object, and
  • the motion of the target object in the real scene controls the motion of that character in the virtual scene.
  • when the steering trigger instruction is received, the user does not need to turn in the real scene; only the position of the character corresponding to the user in the virtual scene is adjusted, so that the corresponding character turns in the virtual scene and the user's view turns along with the scene that the corresponding character sees.
  • the position of the target object in the real scene can be determined by the positioning device.
  • the positioning device may adopt a binocular positioning method or a laser positioning method, which is not limited in this application.
  • At least two positioning devices may be disposed on the target object. At least two positioning devices are arranged at different positions on the target object.
  • for example, taking a virtual-game player as the target object, the two positioning devices may both be handle locators, held in the player's left and right hands; or the two positioning devices may be a handle locator and a head locator,
  • with the handle locator held in one of the player's hands and the head locator worn on the player's head.
  • this application is not limited thereto.
  • in an exemplary embodiment, the number of positioning devices may be three, one of which may be mounted on the head-mounted display device while the other two are mounted on the two handles. In this way, the user can be positioned in the real scene by holding the two handles and wearing the head display.
  • the positioning device can determine its position in the real scene by receiving the positioning signal transmitted by a signal transmitter (for example, a signal-transmitting base station).
  • a preset spatial coordinate system can be determined in a real scene.
  • in one example, taking a cuboid signal transmitter as an example, the origin of the preset spatial coordinate system may be the center of gravity of the signal transmitter; the first coordinate axis (for example, the X axis) may be perpendicular to the side panel of the signal transmitter; the second coordinate axis (for example, the Z axis) may be perpendicular to the front panel, with the direction pointing out of the front panel as its positive direction; the positive directions of the first and second coordinate axes satisfy the right-hand rule; and the third coordinate axis (for example, the Y axis) is perpendicular to the plane defined by the first and second coordinate axes.
  • the positioning signal transmitted by the signal transmitter may include: a first laser plane signal, a second laser plane signal, and an ultrasonic signal.
  • the first laser plane signal and the second laser plane signal are emitted while rotating, and their rotation axes may be perpendicular to each other.
  • the signal transmitter can transmit a synchronization signal, an ultrasonic signal, a first laser plane signal, and a second laser plane signal in each signal period.
  • from the times at which these signals are received, the positioning device obtains the moment of the synchronization signal, the moments of the laser plane signals, and the moment of the ultrasonic signal, and a corresponding algorithm can then compute the coordinates of the positioning device in the preset spatial coordinate system.
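  • the application does not spell out this algorithm. The following is a minimal sketch of one way such timings could be turned into coordinates, assuming constant-rate laser sweeps that yield azimuth and elevation and an ultrasonic time of flight that yields range; the sweep rate, axis conventions and angle model are illustrative assumptions, not taken from the application.

```python
import math

# Assumed model: each laser plane sweeps at a constant rate starting from a
# reference angle at the synchronization moment, and the ultrasonic signal
# gives range via time of flight.
SWEEP_RATE = 2 * math.pi * 60      # assumed sweep rate, rad/s (60 revolutions/s)
SPEED_OF_SOUND = 343.0             # m/s

def locate(t_sync, t_laser1, t_laser2, t_ultra):
    """Estimate receiver coordinates in the transmitter's coordinate system."""
    azimuth = (t_laser1 - t_sync) * SWEEP_RATE      # sweep about the Y axis
    elevation = (t_laser2 - t_sync) * SWEEP_RATE    # sweep about the X axis
    distance = (t_ultra - t_sync) * SPEED_OF_SOUND  # range from sound time of flight
    # Convert the two sweep angles plus range to Cartesian (X, Y, Z).
    x = distance * math.cos(elevation) * math.sin(azimuth)
    y = distance * math.sin(elevation)
    z = distance * math.cos(elevation) * math.cos(azimuth)
    return (x, y, z)
```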
  • S101 can include: determining, from the coordinates of the at least two positioning devices in the real scene and the steering angle, the coordinates of the at least two positioning devices in the virtual scene after the corresponding character has turned by the steering angle.
  • the preset spatial coordinate system (hereinafter referred to as the first coordinate system) in the real scene in which the target object is located may be identical or different from the spatial coordinate system (hereinafter referred to as the second coordinate system) in the virtual scene.
  • when the first coordinate system and the second coordinate system differ, a conversion relationship may exist between them; that is, coordinates in the first coordinate system can be mapped through this conversion relationship to the corresponding coordinates in the second coordinate system.
  • when the first coordinate system and the second coordinate system coincide, the position in the virtual scene, that is, the coordinates in the second coordinate system, can be determined directly from the coordinates of the positioning device in the first coordinate system.
  • when a conversion relationship exists between the first and second coordinate systems, the position of the positioning device in the virtual scene, that is, its coordinates in the second coordinate system, can be obtained from its coordinates in the first coordinate system and the conversion relationship.
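  • the application does not specify the form of this conversion relationship. The sketch below assumes a rigid transform (rotation plus translation) expressed as a 4x4 homogeneous matrix; the matrix values and the example point are illustrative only.

```python
import numpy as np

def make_conversion(rotation_3x3, translation_xyz):
    """Build a 4x4 homogeneous transform from the first to the second coordinate system."""
    m = np.eye(4)
    m[:3, :3] = rotation_3x3
    m[:3, 3] = translation_xyz
    return m

def real_to_virtual(conversion, point_xyz):
    """Map a locator coordinate from the real-scene system to the virtual-scene system."""
    p = np.append(np.asarray(point_xyz, dtype=float), 1.0)
    return (conversion @ p)[:3]

# Example: virtual scene rotated 90 degrees about the Y axis and shifted 2 m along Z.
R = np.array([[0.0, 0.0, 1.0],
              [0.0, 1.0, 0.0],
              [-1.0, 0.0, 0.0]])
conv = make_conversion(R, [0.0, 0.0, 2.0])
print(real_to_virtual(conv, (1.0, 1.5, 0.5)))   # locator coordinate in the virtual scene
```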
  • taking two positioning devices D1 and D2 as an example, the coordinates of D1 in the first coordinate system at the moment the steering trigger instruction is received are taken as the replacement origin; from this replacement origin and the steering angle, the coordinates that D1 and D2 would have in the first coordinate system if the target object turned are determined. When the first coordinate system coincides with the second coordinate system of the virtual scene, these coordinates are exactly the coordinates of D1 and D2 after the corresponding character of the target object has turned in the virtual scene. For example, when the steering angle is 180 degrees, the coordinates of D1 remain unchanged after the turn, since they are the replacement origin, while the coordinates of D2 become the position reached by turning 180 degrees about the replacement origin. When the first coordinate system differs from the second coordinate system of the virtual scene, the coordinates that D1 and D2 would have in the real scene after such a turn may first be computed from their coordinates in the first coordinate system and then mapped through the conversion relationship to obtain the coordinates of D1 and D2 in the virtual scene; alternatively, the coordinates of D1 and D2 in the second coordinate system of the virtual scene may first be computed from their coordinates in the first coordinate system and the conversion relationship between the two coordinate systems, and the turn then applied to those coordinates in the second coordinate system to obtain the coordinates after the turn.
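  • a minimal sketch of this replacement-origin turn, assuming the turn is a rotation about the vertical (Y) axis through the replacement origin; this assumption matches the 180-degree formulas in the worked example later in this description, and the coordinate values below are illustrative.

```python
import math

def turn_about_origin(point, replacement_origin, steering_angle):
    """Rotate a locator coordinate about the vertical axis through the replacement origin."""
    x, y, z = point
    x0, _, z0 = replacement_origin
    dx, dz = x - x0, z - z0
    c, s = math.cos(steering_angle), math.sin(steering_angle)
    # For steering_angle = pi this reduces to x' = 2*x0 - x, z' = 2*z0 - z, y' = y.
    return (x0 + c * dx + s * dz, y, z0 - s * dx + c * dz)

# Turn both locators by 180 degrees; D1 sits at the replacement origin and stays fixed.
d1 = (1.0, 1.6, 2.0)                              # e.g. D1 at trigger time
d2 = (1.2, 1.1, 2.3)                              # e.g. D2 at trigger time
d1_turned = turn_about_origin(d1, d1, math.pi)    # unchanged (up to rounding)
d2_turned = turn_about_origin(d2, d1, math.pi)    # mirrored about D1 in the X-Z plane
```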
  • this application is not limited thereto.
  • S101 may further include: determining, from the coordinates of the at least two positioning devices in the real scene and the steering angle, the poses of the at least two positioning devices in the virtual scene after the corresponding character has turned by the steering angle.
  • after S102, the method of this embodiment may further include: when the target object moves in the real scene, determining the coordinates of the at least two positioning devices in the virtual scene while the target object is at its current real position, according to their coordinates in the real scene at that position, the coordinates of at least one of them at the moment the steering trigger instruction was received, and the steering angle.
  • in this example, the scene information faced by the corresponding character in the virtual scene may be further adjusted according to those coordinates; that is, the scene information is dynamically adjusted according to the real-time coordinates of the target object as it moves in the real scene.
  • after the corresponding character has turned in the virtual scene, since the target object has not turned in the real scene, it is actually the virtual scene that is rotated, so that the target object can see the scene faced by the corresponding character after the turn.
  • when the target object moves after the corresponding character has turned in the virtual scene, if the first and second coordinate systems coincide, the coordinates of the positioning device in the second coordinate system after the turn can be determined from the positioning device's coordinates in the first coordinate system after the movement together with the original replacement origin.
  • if the first and second coordinate systems differ, the coordinates that the positioning device would have in the first coordinate system after turning in the real scene may first be computed from its coordinates in the first coordinate system and the replacement origin, and then mapped through the conversion relationship to obtain its coordinates in the second coordinate system of the virtual scene; alternatively, its coordinates in the second coordinate system may first be computed from its coordinates in the first coordinate system and the conversion relationship, and the turn then applied to those coordinates in the second coordinate system to obtain the coordinates after the turn.
  • this application is not limited thereto.
  • the steering control method of the present embodiment can be applied in the head display, or in a control device independent of the head display and the handles.
  • this application is not limited thereto.
  • a steering trigger button may be provided on the handle, or a steering trigger button may be provided on the head display, and the user may press the steering trigger button when a scene requiring the user to turn around occurs.
  • the handle can send a steering trigger command to the head display or control device.
  • a plurality of steering trigger buttons may be provided on the handle or the head display, or a combination button may be set on the handle or the head display, to allow the user to set the steering angle.
  • the steering angle may be 180 degrees, however, this application is not limited thereto. In practical applications, the steering angle that needs to be processed by the solution of the present application may be set according to actual conditions, for example, the steering angle may be greater than 180 degrees.
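  • for a general steering angle θ, a turn of this kind can be written as follows, under the assumption (consistent with the 180-degree worked example below, but not stated in the application for other angles) that the turn is a rotation about the vertical axis through the replacement origin (X0, Y0, Z0):

```latex
% Assumed model: rotation by the steering angle \theta about the vertical (Y)
% axis through the replacement origin (X_0, Y_0, Z_0).
\begin{aligned}
X' &= X_0 + (X - X_0)\cos\theta + (Z - Z_0)\sin\theta,\\
Y' &= Y,\\
Z' &= Z_0 - (X - X_0)\sin\theta + (Z - Z_0)\cos\theta.
\end{aligned}
% For \theta = 180^{\circ} this reduces to X' = 2X_0 - X, Y' = Y, Z' = 2Z_0 - Z,
% matching the worked example below.
```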
  • after the corresponding character of the target object has turned 180 degrees in the virtual scene, the scene faced by the corresponding character, and the character's attitude, are likewise rotated by 180 degrees.
  • the following description takes as an example the case where the target object is the user and the number of positioning devices is three, one mounted on the head display and the other two mounted on the two handles.
  • a signal transmitting base station is provided.
  • the signal transmitting base station defines the X, Y, and Z axes of the real space, and establishes a spatial coordinate system (corresponding to the first coordinate system described above).
  • the positioning device determines the coordinates in the spatial coordinate system by receiving a positioning signal transmitted by the base station.
  • the following also assumes, as an example, that the first coordinate system and the second coordinate system of the virtual scene coincide.
  • the target object (user) corresponds to a human character
  • the positioning device mounted on the head display corresponds to the position of the human head
  • the positioning device mounted on the handle corresponds to the position of the human hand.
  • the user can press the trigger device, which can be a button on the handle or a button on the head display or the like.
  • the position conversion is started after the head display receives the steering trigger command.
  • the original position of the head display is T0, and the original positions of the two handles are A0 and B0; T0 is set as the replacement origin of the new coordinate system, and the steering angle is 180 degrees.
  • after the character (the human) corresponding to the user is turned in the virtual scene, the position of the human head becomes T0', and the positions of the two hands become A0' and B0', respectively.
  • the coordinates of T0 are (X0, Y0, Z0), the coordinates of A0 are (XA0, YA0, ZA0), and the coordinates of B0 are (XB0, YB0, ZB0).
  • after the turn, the coordinates of T0' remain (X0, Y0, Z0), since T0 is the replacement origin; the coordinates of A0' are (XA0', YA0', ZA0'), and the coordinates of B0' are (XB0', YB0', ZB0').
  • the posture of the character (the human) corresponding to the user in the virtual scene also rotates by 180 degrees.
  • the scene information faced by the corresponding character (human) in the virtual scene is further adjusted according to the coordinates and posture of the head display and the two handles in the virtual scene when the user performs the position conversion.
  • when the user moves in the real scene, the corresponding character (the human) moves accordingly in the virtual scene; for example, the head display and the handles move in the real scene from the positions T0, A0 and B0 of FIG. 2 to the positions T1, A1 and B1 shown in FIG. 3, where the coordinates of the two handles in the real scene are A1 (XA1, YA1, ZA1) and B1 (XB1, YB1, ZB1), and the coordinates of the head display are T1 (XT1, YT1, ZT1).
  • the coordinates of the head display in the virtual scene after the turn, T1' (XT1', YT1', ZT1'), are then given by: XT1' = 2X0 - XT1, YT1' = YT1, ZT1' = 2Z0 - ZT1.
  • according to these virtual-scene coordinates, the scene information faced by the corresponding character (the human) in the virtual scene is further adjusted.
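  • a short sketch of this real-time update, applying the 180-degree formulas above to each newly reported position; only the head-display formula is written out in the text, and applying the same mapping to the handle coordinates follows the earlier description. All coordinate values are illustrative.

```python
def mirror_about(origin, point):
    """Apply X' = 2*X0 - X, Y' = Y, Z' = 2*Z0 - Z about the replacement origin."""
    x0, _, z0 = origin
    x, y, z = point
    return (2 * x0 - x, y, 2 * z0 - z)

t0 = (1.0, 1.6, 2.0)                 # replacement origin: head display at trigger time
t1 = (1.1, 1.6, 2.4)                 # head display after the user moves
a1 = (1.3, 1.1, 2.5)                 # one handle after the move
b1 = (0.9, 1.1, 2.5)                 # the other handle after the move

t1_virtual = mirror_about(t0, t1)    # (0.9, 1.6, 1.6)
a1_virtual = mirror_about(t0, a1)    # (0.7, 1.1, 1.5)
b1_virtual = mirror_about(t0, b1)    # (1.1, 1.1, 1.5)
```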
  • in this way, the turning of the user's corresponding character is implemented in the virtual scene without the user having to turn in the real scene, which avoids the loss of positioning that turning in the real scene would cause, and thereby improves the user experience.
  • the steering control device is configured to adjust the scene information faced by the corresponding character of the target object in the virtual scene.
  • the steering control apparatus in this embodiment includes:
  • the receiving module 401 is configured to receive a steering trigger instruction
  • the position adjustment module 402 is configured to, after the receiving module 401 receives the steering trigger command, determine, according to the position of the target object in the real scene and the steering angle indicated by the steering trigger instruction, the position of the corresponding character of the target object in the virtual scene after turning by the steering angle;
  • the display control module 403 is configured to adjust the scene information faced by the corresponding character in the virtual scene according to the position of the corresponding character after turning by the steering angle.
  • At least two positioning devices may be disposed on the target object
  • the position adjustment module 402 can be configured to determine, from the coordinates of the at least two positioning devices in the real scene and the steering angle, the coordinates of the at least two positioning devices in the virtual scene after the corresponding character has turned by the steering angle.
  • the position adjustment module 402 may be further configured to, when the target object moves in the real scene, determine the coordinates of the at least two positioning devices in the virtual scene while the target object is at its current real position, according to their coordinates in the real scene at that position, the coordinates of at least one of them at the moment the steering trigger command was received, and the steering angle.
  • the number of positioning devices may be three, one of which may be mounted on the head mounted display device, and the other two positioning devices may be mounted on the two handles, respectively.
  • the position adjustment module 402 can be configured to determine, from the coordinates of the three positioning devices in the real scene and the steering angle, the coordinates of the three positioning devices in the virtual scene after the corresponding character has turned by the steering angle.
  • the position adjustment module 402 can also be configured to, when the target object moves in the real scene, determine the coordinates of the three positioning devices in the virtual scene while the target object is at its current real position, according to the coordinates of the three positioning devices in the real scene at that position, the coordinates of the positioning device mounted on the head-mounted display device at the moment the steering trigger command was received, and the steering angle.
  • the display control module 403 may be further configured to further adjust the scene information faced by the corresponding character in the virtual scene according to the coordinates of the at least two positioning devices in the virtual scene when the target object is at its real-time position; that is, the scene information faced by the corresponding character is adjusted in real time according to the real-time coordinates of the target object as it moves in the real scene.
  • the steering angle may be 180 degrees.
  • the embodiment provides a steering control system including: a steering control device and at least two positioning devices disposed on the target object.
  • At least two positioning devices are configured to determine a position of the target object in a real scene
  • the steering control device is configured to, after receiving the steering trigger command, determine, according to the position of the target object in the real scene and the steering angle indicated by the command, the position of the corresponding character in the virtual scene after turning by the steering angle, and to adjust, according to that position, the scene information faced by the corresponding character in the virtual scene.
  • the positioning device may be configured to determine coordinates in a real scene by receiving a positioning signal transmitted by the signal transmitter.
  • the steering control device may be configured to determine the position of the corresponding character in the virtual scene after turning by the steering angle in the manner described above, that is, by determining, from the coordinates of the at least two positioning devices in the real scene and the steering angle, their coordinates and poses in the virtual scene after the turn.
  • the positioning devices can include two handle locators 502a and 502b and a head locator 503.
  • the positioning device determines the coordinates in the first coordinate system in the real scene by receiving the positioning signal transmitted by the signal transmitter 501.
  • the signal transmitter 501 can periodically transmit a synchronization signal, laser plane signals, and an ultrasonic signal; the handle locators 502a, 502b and the head locator 503 can each receive the synchronization signal, the laser plane signals, and the ultrasonic signal and calculate their respective coordinates in the first coordinate system; after calculating these coordinates, the handle locators 502a, 502b and the head locator 503 send the coordinate information to the control device 504 (for example, a smartphone or a head display).
  • the control device 504 can convert the coordinate information of the handle locators 502a, 502b and the head locator 503 into the corresponding positions in the image displayed on the control device 504.
  • when the user presses the trigger button, a steering trigger command is sent to the control device 504; after the control device 504 receives the command, the steering control device 5040 provided inside it performs the steering process, so that the image displayed on the control device 504 is the image after the user's corresponding character has turned in the virtual scene.
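  • a sketch of this per-frame flow, under the same vertical-axis (180-degree mirror) assumption as the earlier sketches; the class name, state fields and call structure are illustrative and not taken from the application.

```python
class SteeringControlDevice:
    """Illustrative per-frame flow of a steering control device such as 5040."""

    def __init__(self):
        self.turned = False
        self.replacement_origin = None       # head-display coordinate at trigger time

    @staticmethod
    def _mirror(origin, point):
        # 180-degree turn about the replacement origin: X' = 2*X0 - X, Z' = 2*Z0 - Z.
        x0, _, z0 = origin
        x, y, z = point
        return (2 * x0 - x, y, 2 * z0 - z)

    def on_steering_trigger(self, head_coord):
        # Record the replacement origin and enter the "turned" state.
        self.turned = True
        self.replacement_origin = head_coord

    def on_frame(self, head_coord, left_coord, right_coord):
        # Coordinates arrive each signal period from locators 503, 502a and 502b;
        # after the trigger they are mirrored before driving the displayed character.
        if self.turned:
            head_coord = self._mirror(self.replacement_origin, head_coord)
            left_coord = self._mirror(self.replacement_origin, left_coord)
            right_coord = self._mirror(self.replacement_origin, right_coord)
        return head_coord, left_coord, right_coord
```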
  • the embodiment of the present application further provides a terminal (such as a positioning device, a smartphone, or a head display), including a memory and a processor, where the memory stores a steering control program that, when executed by the processor, implements the steps of the steering control method described above.
  • the embodiment of the present application further provides a machine readable medium storing a steering control program, where the steering control program is executed by the processor to implement the steps of the steering control method.
  • Such software may be distributed on a machine-readable medium, such as a computer-readable medium, which may include computer storage media (or non-transitory media) and communication media (or transitory media).
  • computer storage media include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storing information such as computer-readable instructions, data structures, program modules, or other data.
  • Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disc (DVD) or other optical disc storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by a computer.
  • communication media typically embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and can include any information delivery media.
  • the embodiments of the present application provide a steering control method, device, and system that solve the problem of positioning being lost after the user turns around under certain positioning schemes, thereby improving the user experience.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Disclosed in the present application is a method, a device and system for turn control, used for adjusting information of a scene faced by a character in a virtual scene who corresponds to a target object, wherein the method comprises: after receiving a triggering instruction for a turn, according to the position of a target object in an actual scene and a turning degree indicated by the triggering instruction, determining the position of a character in the virtual scene who corresponds to the target object after turning according to the turning degree; and on the basis of the position of the character in the virtual scene who corresponds to the target object after turning according to the turning degree, adjusting the information of the scene faced by the character in the virtual scene who corresponds to the target object.

Description

一种转向控制方法、装置及系统Steering control method, device and system 技术领域Technical field
本申请涉及但不限于信息处理技术,尤其涉及一种转向控制方法、装置及系统。The present application relates to, but is not limited to, information processing technology, and in particular, to a steering control method, apparatus and system.
背景技术Background technique
随着虚拟现实(VR,Virtual Reality)领域的日益繁荣,虚拟游戏开始出现,在虚拟游戏提供的沉浸式交互体验中,精确的空间定位追踪技术显得尤为关键。目前使用的定位技术,无论是双目识别还是激光定位,都是在用户面对摄像机或者信号发射基站的180度范围内实现定位;如果用户想转身向后,例如虚拟游戏中用户身后有个桌子,用户想转身拿身后桌子上的东西,此时,用户的头戴式显示设备(以下简称头显)定位器和手柄定位器,不能被摄像机识别或者不能接收到信号发射基站发射的定位信号,导致头显定位器和手柄定位器无法进行定位。因此,对于这类定位方式,无法实现用户转身对身后场景进行操作,从而影响了用户体验。With the growing prosperity of virtual reality (VR), virtual games are beginning to emerge. In the immersive interactive experience provided by virtual games, accurate spatial location tracking technology is particularly critical. The positioning technology currently used, whether binocular recognition or laser positioning, is positioned within 180 degrees of the user facing the camera or the signal transmitting base station; if the user wants to turn backwards, for example, there is a table behind the user in the virtual game. The user wants to turn around and take things on the table behind the body. At this time, the user's head-mounted display device (hereinafter referred to as the head display) locator and the handle locator cannot be recognized by the camera or cannot receive the positioning signal transmitted by the signal transmitting base station. This prevents the head locator and the handle locator from being positioned. Therefore, for this type of positioning, the user cannot be turned to operate the scene behind the scene, thereby affecting the user experience.
发明概述Summary of invention
以下是对本文详细描述的主题的概述。本概述并非是为了限制权利要求的保护范围。The following is an overview of the topics detailed in this document. This Summary is not intended to limit the scope of the claims.
本申请实施例提供一种转向控制方法、装置及系统,解决了针对特定定位方式下用户转身后无法进行定位的问题。The embodiment of the present application provides a steering control method, device, and system, which solves the problem that the positioning cannot be performed after the user turns around in a specific positioning mode.
本申请实施例提供一种转向控制方法,用于调整目标物体在虚拟场景中对应角色所面对的场景信息,上述方法包括:The embodiment of the present application provides a steering control method, which is used to adjust scene information that a target object faces in a virtual scene, and the method includes:
在接收到转向触发指令之后,根据所述目标物体在现实场景中的位置以及所述转向触发指令所指示的转向角度,确定所述目标物体在所述虚拟场景中对应角色按照所述转向角度进行转向后的位置;After receiving the steering triggering command, determining, according to the position of the target object in the real scene and the steering angle indicated by the steering triggering instruction, determining that the target object is in the virtual scene according to the steering angle The position after the turn;
根据所述目标物体在所述虚拟场景中对应角色按照所述转向角度进行转 向后的位置,调整所述目标物体在所述虚拟场景中对应角色所面对的场景信息。And adjusting the scene information that the target object faces in the virtual scene according to the position that the target object is rotated in the virtual scene according to the steering angle.
在示例性实施方式中,所述目标物体上可以设置有至少两个定位设备;In an exemplary embodiment, at least two positioning devices may be disposed on the target object;
所述根据所述目标物体在现实场景中的位置以及所述转向触发指令所指示的转向角度,确定所述目标物体在所述虚拟场景中对应角色按照所述转向角度进行转向后的位置,可以包括:Determining, according to a position of the target object in a real scene and a steering angle indicated by the steering triggering instruction, a position of the target object in the virtual scene after the steering position is turned according to the steering angle, include:
根据所述至少两个定位设备在现实场景中的坐标以及所述转向角度,确定所述目标物体在所述虚拟场景中对应角色按照所述转向角度进行转向后,所述至少两个定位设备在所述虚拟场景中的坐标。Determining, according to the coordinates of the at least two positioning devices in the real scene and the steering angle, that the target object is steered according to the steering angle in the virtual scene, the at least two positioning devices are The coordinates in the virtual scene.
在示例性实施方式中,所述根据所述目标物体在现实场景中的位置以及所述转向触发指令所指示的转向角度,确定所述目标物体在所述虚拟场景中对应角色按照所述转向角度进行转向后的位置,还可以包括:In an exemplary embodiment, the determining, according to a position of the target object in a real scene and a steering angle indicated by the steering triggering instruction, determining a corresponding character of the target object in the virtual scene according to the steering angle The position after the steering can also include:
根据所述至少两个定位设备在现实场景中的坐标以及所述转向角度,确定所述目标物体在所述虚拟场景中对应角色按照所述转向角度进行转向后,所述至少两个定位设备在所述虚拟场景中的姿态。Determining, according to the coordinates of the at least two positioning devices in the real scene and the steering angle, that the target object is steered according to the steering angle in the virtual scene, the at least two positioning devices are The pose in the virtual scene.
在示例性实施方式中,所述调整所述目标物体在所述虚拟场景中对应角色所面对的场景信息之后,上述方法还可以包括:In an exemplary embodiment, after the adjusting the scene information that the target object faces in the virtual scene, the method may further include:
当所述目标物体在现实场景中移动时,根据所述目标物体处于现实实时位置时所述至少两个定位设备在所述现实场景中的坐标、在接收到转向触发指令时所述至少两个定位设备中的至少一个在所述现实场景中的坐标以及所述转向角度,确定所述目标物体处于所述现实实时位置时在所述虚拟场景中所述至少两个定位设备的坐标。When the target object moves in a real scene, the coordinates of the at least two positioning devices in the real scene when the target object is in a real-time position, the at least two positioning when receiving the steering triggering instruction The coordinates of the at least one of the devices in the real-life scene and the steering angle determine coordinates of the at least two positioning devices in the virtual scene when the target object is in the real-time real-time position.
在示例性实施方式中,所述确定所述目标物体处于所述现实实时位置时在所述虚拟场景中所述至少两个定位设备的坐标之后,上述方法还可以包括:In an exemplary embodiment, after determining the coordinates of the at least two positioning devices in the virtual scene when the target object is in the real-time position, the method may further include:
根据所述虚拟场景中所述至少两个定位设备的坐标,进一步调整所述目标物体在所述虚拟场景中对应角色所面对的场景信息。And further adjusting, according to coordinates of the at least two positioning devices in the virtual scenario, scene information that the target object faces in the virtual scene.
在示例性实施方式中,所述定位设备的数目可以为三个,其中一个定位设备可以安装在头戴式显示设备上,其余两个定位设备可以分别安装在两个 手柄上。In an exemplary embodiment, the number of the positioning devices may be three, one of the positioning devices may be mounted on the head mounted display device, and the other two positioning devices may be respectively mounted on the two handles.
在示例性实施方式中,所述转向角度可以为180度。In an exemplary embodiment, the steering angle may be 180 degrees.
本申请实施例还提供一种转向控制装置,用于调整目标物体在虚拟场景中对应角色所面对的场景信息;所述转向控制装置包括:The embodiment of the present application further provides a steering control apparatus, which is used to adjust scene information that a target object faces in a virtual scene; the steering control apparatus includes:
接收模块,配置为接收转向触发指令;a receiving module configured to receive a steering trigger command;
位置调整模块,配置为在所述接收模块接收到转向触发指令之后,根据所述目标物体在现实场景中的位置以及所述转向触发指令所指示的转向角度,确定所述目标物体在所述虚拟场景中对应角色按照所述转向角度进行转向后的位置;a position adjustment module configured to determine, after the receiving module receives the steering triggering instruction, the target object in the virtual according to a position of the target object in a real scene and a steering angle indicated by the steering triggering instruction a position in the scene in which the corresponding character performs the steering according to the steering angle;
显示控制模块,配置为根据所述目标物体在所述虚拟场景中对应角色按照所述转向角度进行转向后的位置,调整所述目标物体在所述虚拟场景中对应角色所面对的场景信息。The display control module is configured to adjust the scene information that the target object faces in the virtual scene according to the position of the target object in the virtual scene according to the steering angle.
在示例性实施方式中,所述目标物体上可以设置有至少两个定位设备;In an exemplary embodiment, at least two positioning devices may be disposed on the target object;
所述位置调整模块可以配置为根据所述至少两个定位设备在现实场景中的坐标以及所述转向角度,确定所述目标物体在所述虚拟场景中对应角色按照所述转向角度进行转向后,所述至少两个定位设备在所述虚拟场景中的坐标和姿态。The position adjustment module may be configured to determine, after the coordinates of the at least two positioning devices in the real scene and the steering angle, that the target object is steered according to the steering angle in the virtual scene. Coordinates and poses of the at least two positioning devices in the virtual scene.
在示例性实施方式中,所述位置调整模块还可以配置为当所述目标物体在现实场景中移动时,根据所述目标物体处于现实实时位置时所述至少两个定位设备在所述现实场景中的坐标、在接收到转向触发指令时所述至少两个定位设备中的至少一个在所述现实场景中的坐标以及所述转向角度,确定所述目标物体处于所述现实实时位置时在所述虚拟场景中所述至少两个定位设备的坐标。In an exemplary embodiment, the position adjustment module may be further configured to: when the target object moves in a real scene, the at least two positioning devices are in the real scene according to the target object being in a real-time position Coordinates, coordinates of the at least one of the at least two positioning devices when the steering trigger command is received, and the steering angle, determining that the target object is in the real-time real-time position The coordinates of the at least two positioning devices in the virtual scene.
本申请实施例还提供一种转向控制系统,包括:转向控制装置以及设置在目标物体上的至少两个定位设备;The embodiment of the present application further provides a steering control system, including: a steering control device and at least two positioning devices disposed on the target object;
所述至少两个定位设备配置为确定所述目标物体在现实场景中的位置;The at least two positioning devices are configured to determine a position of the target object in a real scene;
所述转向控制装置配置为在接收到转向触发指令之后,根据所述目标物体在现实场景中的位置以及所述转向触发指令所指示的转向角度,确定所述 目标物体在所述虚拟场景中对应角色按照所述转向角度进行转向后的位置;根据所述目标物体在所述虚拟场景中对应角色按照所述转向角度进行转向后的位置,调整所述目标物体在所述虚拟场景中对应角色所面对的场景信息。The steering control device is configured to determine, after receiving the steering triggering instruction, the target object in the virtual scene according to a position of the target object in a real scene and a steering angle indicated by the steering triggering instruction Positioning the position according to the steering angle according to the steering angle; adjusting the corresponding position of the target object in the virtual scene according to the position of the target object in the virtual scene according to the steering angle Facing the scene information.
在示例性实施方式中,所述定位设备可以配置为通过接收信号发射器发射的定位信号,确定在现实场景中的坐标。In an exemplary embodiment, the positioning device may be configured to determine coordinates in a real-world scene by receiving a positioning signal transmitted by the signal transmitter.
在示例性实施方式中,所述转向控制装置可以配置为在接收到转向触发指令之后,通过以下方式根据所述目标物体在现实场景中的位置以及所述转向触发指令所指示的转向角度,确定所述目标物体在所述虚拟场景中对应角色按照所述转向角度进行转向后的位置:In an exemplary embodiment, the steering control device may be configured to determine, after receiving the steering triggering command, a position of the target object in a real scene and a steering angle indicated by the steering triggering command, by: The position of the target object in the virtual scene after the corresponding character follows the steering angle:
根据所述至少两个定位设备在现实场景中的坐标以及所述转向角度,确定所述目标物体在所述虚拟场景中对应角色按照所述转向角度进行转向后,所述至少两个定位设备在所述虚拟场景中的坐标和姿态。Determining, according to the coordinates of the at least two positioning devices in the real scene and the steering angle, that the target object is steered according to the steering angle in the virtual scene, the at least two positioning devices are The coordinates and poses in the virtual scene.
本申请实施例还提供一种终端,包括:存储器以及处理器,所述存储器存储有转向控制程序,在所述转向控制程序被处理器执行时实现上述转向控制方法的步骤。The embodiment of the present application further provides a terminal, including: a memory and a processor, where the memory stores a steering control program, and the step of implementing the steering control method when the steering control program is executed by the processor.
本申请实施例还提供一种机器可读介质,存储有转向控制程序,在所述转向控制程序被处理器执行时实现上述转向控制方法的步骤。Embodiments of the present application also provide a machine readable medium storing a steering control program that implements the steps of the steering control method when the steering control program is executed by a processor.
在本申请实施例中,当目标物体(比如用户)需要在虚拟场景中进行转向时,目标物体在现实场景中无需转向,仅控制在虚拟场景中对应角色进行转向,比如可以使得目标物体在虚拟场景中对应角色可以面向之前背对的场景,避免由于在现实场景中进行转向导致无法进行定位的问题,从而提高了用户体验。In the embodiment of the present application, when the target object (such as a user) needs to perform steering in the virtual scene, the target object does not need to turn in the real scene, and only controls the corresponding character in the virtual scene to perform steering, for example, the target object may be virtualized. The corresponding characters in the scene can face the scenes that are facing away from each other, and avoid the problem that the positioning cannot be performed due to the steering in the real scene, thereby improving the user experience.
本申请的其它特征和优点将在随后的说明书中阐述,并且,部分地从说明书中变得显而易见,或者通过实施本申请而了解。本申请的目的和其他优点可通过在说明书、权利要求书以及附图中所特别指出的结构来实现和获得。Other features and advantages of the present application will be set forth in the description which follows. The objectives and other advantages of the present invention can be realized and obtained by the structure of the invention.
附图概述BRIEF abstract
附图用来提供对本申请技术方案的进一步理解,并且构成说明书的一部 分,与本申请的实施例一起用于解释本申请的技术方案,并不构成对本申请技术方案的限制。The drawings are used to provide a further understanding of the technical solutions of the present application, and constitute a part of the specification, which is used together with the embodiments of the present application to explain the technical solutions of the present application, and does not constitute a limitation of the technical solutions of the present application.
图1为本申请实施例提供的转向控制方法的流程图;1 is a flowchart of a steering control method provided by an embodiment of the present application;
图2为本申请实施例的目标物体对应角色进行转向时的位置转换示意图;2 is a schematic diagram of position conversion when a target object corresponding to a character is turned according to an embodiment of the present application;
图3为本申请实施例的目标物体对应角色在转向后的位置移动示意图;FIG. 3 is a schematic diagram of positional movement of a target object corresponding character after steering according to an embodiment of the present application; FIG.
图4为本申请实施例提供的转向控制装置的示意图;4 is a schematic diagram of a steering control apparatus according to an embodiment of the present application;
图5为本申请实施例提供的转向控制系统的示意图。FIG. 5 is a schematic diagram of a steering control system according to an embodiment of the present application.
详述Detailed
以下结合附图对本申请实施例进行详细说明,应当理解,以下所说明的实施例仅用于说明和解释本申请,并不用于限定本申请。需要说明的是,在不冲突的情况下,本申请中的实施例及实施例中的特征可以相互任意组合。The embodiments of the present application are described in detail below with reference to the accompanying drawings. It should be noted that, in the case of no conflict, the features in the embodiments and the embodiments in the present application may be arbitrarily combined with each other.
在附图的流程图示出的步骤可以在诸如一组计算机可执行指令的计算机系统中执行。并且,虽然在流程图中示出了逻辑顺序,但是在某些情况下,可以以不同于此处的顺序执行所示出或描述的步骤。The steps illustrated in the flowchart of the figures may be executed in a computer system such as a set of computer executable instructions. Also, although logical sequences are shown in the flowcharts, in some cases the steps shown or described may be performed in a different order than the ones described herein.
实施例一Embodiment 1
本实施例提供一种转向控制方法,用于调整目标物体在虚拟场景中对应角色所面对的场景信息。如图1所示,本实施例的转向控制方法包括:The embodiment provides a steering control method for adjusting scene information that a target object faces in a virtual scene. As shown in FIG. 1, the steering control method of this embodiment includes:
S101、在接收到转向触发指令之后,根据目标物体在现实场景中的位置以及转向触发指令所指示的转向角度,确定目标物体在虚拟场景中对应角色按照该转向角度进行转向后的位置;S101. After receiving the steering triggering instruction, determine, according to the position of the target object in the real scene and the steering angle indicated by the steering triggering instruction, the position of the target object in the virtual scene after the steering position is turned according to the steering angle;
S102、根据目标物体在虚拟场景中对应角色按照该转向角度进行转向后的位置,调整对应角色在虚拟场景中所面对的场景信息。S102. Adjust the scene information that the corresponding character faces in the virtual scene according to the position of the target object in the virtual scene according to the steering angle.
本实施例中,目标物体可以为人,比如,虚拟游戏的操控者。在示例性实施方式中,目标物体通过在现实场景中进行运动,使得看到的虚拟场景中的场景信息发生变化;或者,在虚拟场景中存在目标物体对应的角色,目标物体在现实场景中的运动控制对应角色在虚拟场景中的运动。In this embodiment, the target object may be a person, such as a controller of a virtual game. In an exemplary embodiment, the target object changes the scene information in the seen virtual scene by moving in a real scene; or, in the virtual scene, there is a character corresponding to the target object, and the target object is in the real scene. Motion control corresponds to the movement of the character in the virtual scene.
在本实施例中,在接收到转向触发指令时,用户在现实场景中的位置无需转向,仅需调整用户对应在虚拟场景中的角色的位置,以实现用户在虚拟场景中对应的角色进行转向,使得用户通过对应角色在虚拟场景中看到的场景进行转向。In this embodiment, when the steering triggering command is received, the position of the user in the real scene does not need to be turned, and only the position of the character corresponding to the user in the virtual scene needs to be adjusted, so that the user turns to the corresponding character in the virtual scene. , causing the user to turn through the scene that the corresponding character sees in the virtual scene.
在本实施例中,目标物体在现实场景中的位置可以通过定位设备进行定位确定。定位设备可以采用双目定位方式或激光定位方式,本申请对此并不限定。In this embodiment, the position of the target object in the real scene can be determined by the positioning device. The positioning device may adopt a binocular positioning method or a laser positioning method, which is not limited in this application.
其中,目标物体上可以设置有至少两个定位设备。至少两个定位设备在目标物体上的设置位置不同。比如,以目标物体为虚拟游戏的操控者为例,两个定位设备可以分别为手柄定位仪,两个手柄定位仪可以分别由虚拟游戏操控者的左右两个手握持;或者,两个定位设备可以分别为手柄定位仪和头显定位仪,手柄定位仪可以由虚拟游戏操控者的一个手握持,头显定位仪可以戴在虚拟游戏操控者的头上。然而,本申请对此并不限定。Wherein, at least two positioning devices may be disposed on the target object. At least two positioning devices are arranged at different positions on the target object. For example, taking the target object as the controller of the virtual game as an example, the two positioning devices can be respectively the handle locator, and the two handle locators can be respectively held by the left and right hands of the virtual game controller; or, two positioning The device can be a handle locator and a head locator respectively. The handle locator can be held by one hand of the virtual game controller, and the head locator can be worn on the head of the virtual game controller. However, this application is not limited thereto.
在示例性实施方式中,定位设备的数目可以为三个,其中一个定位设备可以安装在头戴式显示设备上,其余两个定位设备可以分别安装在两个手柄上。如此,用户可以通过握持两个手柄,头戴头显实现自身在现实场景中的定位。In an exemplary embodiment, the number of positioning devices may be three, one of which may be mounted on the head mounted display device, and the other two positioning devices may be mounted on the two handles, respectively. In this way, the user can grasp the position of the real scene by holding the two handles and wearing the head.
其中,定位设备可以通过接收信号发射器(比如,信号发射基站)发射的定位信号确定在现实场景中的位置。比如,在现实场景中可以确定预设空间坐标系。在一示例中,以信号发射器为长方体为例,预设空间坐标系的原点可以为信号发射器的重心、第一坐标轴(例如称为X轴)可以垂直于信号发射器的侧面板,第二坐标轴(例如称为Z轴)可以垂直于信号发射器的前面板,且指向前面板前侧的方向为第二坐标轴的正方向,第一坐标轴的正方向和第二坐标轴的正方向满足右手定则,第三坐标轴(例如称为Y轴)垂直于第一坐标轴和第二坐标轴所确定平面。其中,关于预设空间坐标系的设置方式存在多种,本申请对此并不限定。在实际应用中,可以根据现实场景的实际情况以及所采用的定位方式确定预设空间坐标系。Wherein, the positioning device can determine the position in the real scene by the positioning signal transmitted by the receiving signal transmitter (for example, the signal transmitting base station). For example, a preset spatial coordinate system can be determined in a real scene. In an example, taking the signal transmitter as a cuboid, the origin of the preset space coordinate system may be the center of gravity of the signal transmitter, and the first coordinate axis (for example, referred to as the X axis) may be perpendicular to the side panel of the signal transmitter. The second coordinate axis (for example, referred to as the Z axis) may be perpendicular to the front panel of the signal transmitter, and the direction pointing to the front side of the front panel is the positive direction of the second coordinate axis, the positive direction of the first coordinate axis and the second coordinate axis The positive direction satisfies the right hand rule, and the third coordinate axis (for example, the Y axis) is perpendicular to the plane defined by the first coordinate axis and the second coordinate axis. There are a plurality of ways for setting the preset space coordinate system, which is not limited in this application. In practical applications, the preset spatial coordinate system can be determined according to the actual situation of the actual scene and the positioning method adopted.
在上述示例中,信号发射器发射的定位信号可以包括:第一激光平面信号、第二激光平面信号以及超声波信号。其中,第一激光平面信号以及第二 激光平面信号旋转发射,且两者的旋转轴可以相互垂直。信号发射器可以在每个信号周期内发射同步信号、超声波信号、第一激光平面信号以及第二激光平面信号。定位设备可以根据接收到这些信号的时间得到收到同步信号的时刻、收到激光平面信号的时刻以及收到超声波信号的时刻,采用相应的算法可以计算得到定位设备在预设空间坐标系中的坐标。In the above example, the positioning signal transmitted by the signal transmitter may include: a first laser plane signal, a second laser plane signal, and an ultrasonic signal. Wherein, the first laser plane signal and the second laser plane signal are rotated and emitted, and the rotation axes of the two may be perpendicular to each other. The signal transmitter can transmit a synchronization signal, an ultrasonic signal, a first laser plane signal, and a second laser plane signal in each signal period. The positioning device can obtain the time of receiving the synchronization signal, the time of receiving the laser plane signal, and the time of receiving the ultrasonic signal according to the time of receiving the signals, and the corresponding algorithm can be used to calculate the positioning device in the preset space coordinate system. coordinate.
其中,S101可以包括:Wherein, S101 can include:
根据至少两个定位设备在现实场景中的坐标以及转向角度,确定目标物体在虚拟场景中对应角色按照转向角度进行转向后,至少两个定位设备在虚拟场景中的坐标。Determining, according to the coordinates of the at least two positioning devices in the real scene and the steering angle, the coordinates of the at least two positioning devices in the virtual scene after the corresponding object in the virtual scene is turned according to the steering angle.
在本实施例中,目标物体所在的现实场景中的预设空间坐标系(以下称为第一坐标系)与虚拟场景中的空间坐标系(以下称为第二坐标系)可以一致或不同。在第一坐标系和第二坐标系不一致时,两者之间可以存在转换关系。即第一坐标系中的坐标通过该转换关系可以得到在第二坐标系中对应的坐标。In this embodiment, the preset spatial coordinate system (hereinafter referred to as the first coordinate system) in the real scene in which the target object is located may be identical or different from the spatial coordinate system (hereinafter referred to as the second coordinate system) in the virtual scene. When the first coordinate system and the second coordinate system are inconsistent, there may be a conversion relationship between the two. That is, the coordinates in the first coordinate system can obtain the corresponding coordinates in the second coordinate system by the conversion relationship.
在第一坐标系和第二坐标系一致时,根据定位设备在第一坐标系中的坐标即可以确定在虚拟场景中的位置,即在第二坐标系中的坐标。When the first coordinate system and the second coordinate system are coincident, the position in the virtual scene, that is, the coordinates in the second coordinate system, can be determined according to the coordinates of the positioning device in the first coordinate system.
在第一坐标系和第二坐标系之间存在转换关系时,根据定位设备在第一坐标系中的坐标以及上述转换关系,可以得到定位设备在虚拟场景中的位置,即在第二坐标系中的坐标。When there is a conversion relationship between the first coordinate system and the second coordinate system, according to the coordinates of the positioning device in the first coordinate system and the above conversion relationship, the position of the positioning device in the virtual scene can be obtained, that is, in the second coordinate system. The coordinates in .
以两个定位设备D1、D2为例,将接收到转向触发指令时定位设备D1在第一坐标系中的坐标作为置换原点;根据置换原点以及转向角度,确定若目标物体在第一坐标系中进行转向后,两个定位设备D1、D2在第一坐标系下的坐标;在第一坐标系与虚拟场景中的第二坐标系一致时,确定定位设备D1、D2在第一坐标系下的坐标即为目标物体在虚拟场景中对应角色进行转向后两个定位设备D1、D2的坐标。例如,转向角度为180度时,在转向后,由于定位设备D1的坐标为置换原点,则定位设备D1的坐标不变,定位设备D2的坐标变为以定位设备D1的坐标为置换原点经过180度转向后的位置。在第一坐标系与虚拟场景中的第二坐标系不一致时,可以先根据定位设备D1、D2在第一坐标系下的坐标,计算假设目标物体在现实场景进行转向后的坐标, 再将该坐标根据转换关系映射得到虚拟场景中定位设备D1、D2的坐标;或者,可以先根据定位设备D1、D2在第一坐标系下的坐标以及两个坐标系之间的转换关系,计算定位设备D1、D2在虚拟场景中第二坐标系下的坐标,然后,在第二坐标系下对定位设备D1、D2的坐标进行转向,得到转向后的坐标。然而,本申请对此并不限定。Taking two positioning devices D1 and D2 as an example, the coordinates of the positioning device D1 in the first coordinate system when receiving the steering trigger command are used as the replacement origin; according to the replacement origin and the steering angle, it is determined that the target object is in the first coordinate system. After the steering is performed, the coordinates of the two positioning devices D1, D2 in the first coordinate system; when the first coordinate system is consistent with the second coordinate system in the virtual scene, determining that the positioning devices D1, D2 are in the first coordinate system The coordinates are the coordinates of the two positioning devices D1 and D2 after the target object is turned in the corresponding character in the virtual scene. For example, when the steering angle is 180 degrees, after the steering, since the coordinates of the positioning device D1 are the replacement origin, the coordinates of the positioning device D1 are unchanged, and the coordinates of the positioning device D2 become the replacement origin with the coordinates of the positioning device D1. The position after turning to the degree. When the first coordinate system is inconsistent with the second coordinate system in the virtual scene, the coordinates of the assumed target object in the real scene may be calculated according to the coordinates of the positioning device D1 and D2 in the first coordinate system, and then the coordinates are calculated. The coordinate is obtained according to the conversion relationship mapping to obtain the coordinates of the positioning devices D1 and D2 in the virtual scene; or, the positioning device D1 can be calculated according to the coordinates of the positioning device D1 and D2 in the first coordinate system and the conversion relationship between the two coordinate systems. D2 is the coordinate in the second coordinate system in the virtual scene, and then the coordinates of the positioning devices D1, D2 are steered in the second coordinate system to obtain the coordinates after the steering. However, this application is not limited thereto.
In an exemplary embodiment, S101 may further include:
determining, based on the coordinates of the at least two positioning devices in the real scene and the steering angle, the poses of the at least two positioning devices in the virtual scene after the corresponding character of the target object turns in the virtual scene according to the steering angle.
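As an illustration of the pose update, the sketch below reduces each device pose to a yaw angle about the vertical axis. This reduction, and the function name, are assumptions made only for illustration, since the embodiment does not fix a particular pose representation; a full implementation might apply the same rotation to a quaternion or rotation matrix instead.

```python
def turn_pose_yaw(yaw_deg, steering_angle_deg):
    """Rotate a device's orientation about the vertical axis by the steering
    angle, with the orientation reduced to a yaw angle in degrees."""
    return (yaw_deg + steering_angle_deg) % 360.0
```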
In an exemplary embodiment, after S102, the method of this embodiment may further include: when the target object moves in the real scene, determining the coordinates of the at least two positioning devices in the virtual scene while the target object is at its real-time position, based on the coordinates of the at least two positioning devices in the real scene at that real-time position, the coordinates of at least one of the at least two positioning devices in the real scene when the steering trigger instruction was received, and the steering angle. In this example, the scene information faced by the corresponding character of the target object in the virtual scene can be further adjusted according to those virtual-scene coordinates; that is, the scene information faced by the corresponding character in the virtual scene is adjusted dynamically according to the real-time coordinates of the target object as it moves in the real scene.
After the corresponding character of the target object turns in the virtual scene, the target object itself has not turned in the real scene; what actually happens is that the virtual scene is rotated, so that the scene the corresponding character faces after the turn becomes visible.
When the target object moves after its corresponding character has turned in the virtual scene, and the first coordinate system and the second coordinate system are identical, the coordinates of the positioning devices in the second coordinate system after the turn can be determined from their coordinates in the first coordinate system after the movement together with the original replacement origin.
If the first coordinate system and the second coordinate system differ, one option is to first compute, from the positioning devices' coordinates in the first coordinate system and the replacement origin, the coordinates the devices would have in the first coordinate system after the assumed turn in the real scene, and then map those coordinates into the second coordinate system of the virtual scene through the conversion relationship; another option is to first convert the devices' coordinates into the second coordinate system using the conversion relationship, and then perform the turn in the second coordinate system to obtain the post-turn coordinates. However, the present application is not limited thereto.
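The first of the two options above can be sketched as follows, reusing the turn_about_pivot helper from the earlier sketch. The conversion relationship is assumed here to be a rigid transform made up of a yaw rotation and a translation, with both coordinate systems sharing the vertical axis; the embodiment only requires that some conversion relationship exists, so this particular form and the numeric values are assumptions for illustration.

```python
import math

def frame1_to_frame2(p, yaw_deg, t):
    """Map a point from the first (real-scene) coordinate system into the
    second (virtual-scene) coordinate system, assuming the two frames differ
    only by a yaw rotation about the shared vertical axis and a translation."""
    a = math.radians(yaw_deg)
    x = p[0] * math.cos(a) - p[2] * math.sin(a) + t[0]
    z = p[0] * math.sin(a) + p[2] * math.cos(a) + t[2]
    return (x, p[1] + t[1], z)

# Option one: perform the turn in the first coordinate system, then map the
# result into the virtual scene through the conversion relationship.
pivot = (1.0, 1.6, 2.0)        # replacement origin in the first coordinate system
moved = (1.4, 1.1, 2.5)        # a device position after the target object moved
turned = turn_about_pivot(moved, pivot, 180)
virtual = frame1_to_frame2(turned, yaw_deg=30.0, t=(0.5, 0.0, -0.2))
```

The second option (convert first, then turn in the second coordinate system) gives the same result provided the replacement origin and the rotation axis are also expressed in the second coordinate system.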
The steering control method of this embodiment may be applied to the head display, or to a control device independent of the head display and the handles. However, the present application is not limited thereto.
In an exemplary embodiment, a steering trigger button may be provided on a handle, or a steering trigger button may be provided on the head display; when a scene requiring the user to turn around occurs, the user may press the steering trigger button. For example, when the steering trigger button on a handle is pressed, the handle may send a steering trigger instruction to the head display or to the control device.
In an exemplary embodiment, depending on the steering angle, multiple steering trigger buttons may be provided on the handle or on the head display, or a button combination may be provided on the handle or on the head display so that the user can set the steering angle.
In an exemplary embodiment, the steering angle may be 180 degrees; however, the present application is not limited thereto. In practice, the steering angles to be handled by the solution of the present application may be set according to actual needs; for example, the steering angle may be greater than 180 degrees.
It should be noted that, taking a steering angle of 180 degrees as an example in this embodiment, after the corresponding character of the target object turns 180 degrees in the virtual scene, the scene faced by the corresponding character not only changes position but also undergoes a 180-degree rotation in attitude.
The solution of the present application is described below by way of an example.
In this example, the target object is the user, and there are three positioning devices: one mounted on the head display and the other two mounted on the handles. In the real scene, a signal transmitting base station is provided; it defines the X, Y and Z axes of the real space and establishes a spatial coordinate system (corresponding to the first coordinate system described above). Each positioning device determines its coordinates in this coordinate system by receiving the positioning signals transmitted by the base station. In this example, the first coordinate system and the second coordinate system of the virtual scene are assumed to be identical. In the virtual scene, the character corresponding to the target object (the user) is a human: the positioning device mounted on the head display corresponds to the position of the human head, and the positioning devices mounted on the handles correspond to the positions of the human hands.
In this example, when a scene requiring a turn occurs, the user may press a trigger device, which may be, for example, a button on a handle or a button on the head display.
In this example, the head display executes the solution of the present application. After the trigger device is pressed, a steering trigger instruction is generated and sent to the head display.
The position conversion starts after the head display receives the steering trigger instruction. As shown in FIG. 2, the original position of the head display is T0, and the original positions of the two handles are A0 and B0. With T0 set as the replacement origin of the new coordinate system and a steering angle of 180 degrees, after the user's corresponding character (the human) turns in the virtual scene, the head position of the human becomes T0', and the positions of the two hands become A0' and B0', respectively.
As shown in FIG. 2, the coordinates of T0 are (X0, Y0, Z0), the coordinates of A0 are (XA0, YA0, ZA0), and the coordinates of B0 are (XB0, YB0, ZB0). The coordinates of A0' are (XA0', YA0', ZA0'), and the coordinates of B0' are (XB0', YB0', ZB0').
When the steering angle is 180 degrees, the calculation gives:
XA0' = 2X0 - XA0, YA0' = YA0, ZA0' = 2Z0 - ZA0;
XB0' = 2X0 - XB0, YB0' = YB0, ZB0' = 2Z0 - ZB0;
T0' = (X0, Y0, Z0).
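These formulas can be collected into a small helper as an illustration; the numeric coordinates below are placeholders, since the figure does not give concrete values.

```python
def mirror_180_about(pivot, p):
    """180-degree special case of the turn: X and Z are mirrored through the
    replacement origin (X' = 2*X0 - X, Z' = 2*Z0 - Z) and Y is unchanged."""
    return (2 * pivot[0] - p[0], p[1], 2 * pivot[2] - p[2])

# Placeholder coordinates for T0, A0 and B0.
T0 = (0.0, 1.6, 0.0)
A0 = (-0.3, 1.1, 0.4)
B0 = (0.3, 1.1, 0.4)
T0_prime = mirror_180_about(T0, T0)   # the head stays at the replacement origin
A0_prime = mirror_180_about(T0, A0)
B0_prime = mirror_180_about(T0, B0)
```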
At the same time, the hand poses of the user's corresponding character (the human) in the virtual scene are also rotated by 180 degrees.
Based on the calculated coordinates and poses of the head display and the two handles in the virtual scene after the position conversion, the scene information faced by the user's corresponding character (the human) in the virtual scene is further adjusted.
In this example, after the user's corresponding character has turned in the virtual scene, the user moves in the real scene, which corresponds to movement of the corresponding character (the human) in the virtual scene. For example, when the head display and the handles move in the real scene from the positions T0, A0 and B0 shown in FIG. 2 to the positions T1, A1 and B1 shown in FIG. 3, the coordinates of the two handles in the real scene are A1 (XA1, YA1, ZA1) and B1 (XB1, YB1, ZB1), and the coordinates of the head display are T1 (XT1, YT1, ZT1). The corresponding coordinates of the character's hands in the virtual scene are A1' (XA1', YA1', ZA1') and B1' (XB1', YB1', ZB1'), and the coordinates of the head are T1' (XT1', YT1', ZT1').
When the steering angle is 180 degrees, the calculation gives:
XA1' = 2X0 - XA1, YA1' = YA1, ZA1' = 2Z0 - ZA1;
XB1' = 2X0 - XB1, YB1' = YB1, ZB1' = 2Z0 - ZB1;
XT1' = 2X0 - XT1, YT1' = YT1, ZT1' = 2Z0 - ZT1.
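Continuing the sketch above, the same replacement origin T0 is kept as the pivot and applied to the real-time positions after the movement; the numeric values are again placeholders for illustration.

```python
# Real-scene positions after the user moves (placeholder values); the original
# replacement origin T0 is still used as the pivot for the mapping.
T1 = (0.2, 1.6, 0.5)
A1 = (-0.1, 1.1, 0.9)
B1 = (0.5, 1.1, 0.9)
T1_prime = mirror_180_about(T0, T1)
A1_prime = mirror_180_about(T0, A1)
B1_prime = mirror_180_about(T0, B1)
```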
Based on the calculated coordinates of the head display and the two handles in the virtual scene when the user is at the real-time position, the scene information faced by the user's corresponding character (the human) in the virtual scene is further adjusted.
In summary, this embodiment performs the turn of the user's corresponding character in the virtual scene, without requiring the user to turn in the real scene, which avoids the problem that positioning fails when the user turns around in the real scene, thereby improving the user experience.
Embodiment 2
This embodiment provides a steering control apparatus for adjusting the scene information faced by the corresponding character of a target object in a virtual scene. As shown in FIG. 4, the steering control apparatus of this embodiment includes:
a receiving module 401, configured to receive a steering trigger instruction;
a position adjustment module 402, configured to, after the receiving module 401 receives the steering trigger instruction, determine, based on the position of the target object in the real scene and the steering angle indicated by the steering trigger instruction, the position of the corresponding character of the target object in the virtual scene after turning according to the steering angle; and
a display control module 403, configured to adjust, based on the position of the corresponding character of the target object in the virtual scene after turning according to the steering angle, the scene information faced by the corresponding character of the target object in the virtual scene.
At least two positioning devices may be provided on the target object.
The position adjustment module 402 may be configured to determine, based on the coordinates of the at least two positioning devices in the real scene and the steering angle, the coordinates of the at least two positioning devices in the virtual scene after the corresponding character of the target object turns in the virtual scene according to the steering angle.
The position adjustment module 402 may further be configured to, when the target object moves in the real scene, determine the coordinates of the at least two positioning devices in the virtual scene while the target object is at its real-time position, based on the coordinates of the at least two positioning devices in the real scene at that real-time position, the coordinates of at least one of the at least two positioning devices in the real scene when the steering trigger instruction was received, and the steering angle.
In an exemplary embodiment, there may be three positioning devices: one mounted on the head-mounted display device and the other two mounted on the two handles, respectively.
The position adjustment module 402 may be configured to determine, based on the coordinates of the three positioning devices in the real scene and the steering angle, the coordinates of the three positioning devices in the virtual scene after the corresponding character of the target object turns in the virtual scene according to the steering angle. The position adjustment module 402 may further be configured to, when the target object moves in the real scene, determine the coordinates of the three positioning devices in the virtual scene while the target object is at its real-time position, based on the coordinates of the three positioning devices in the real scene at that real-time position, the coordinates of the positioning device on the head-mounted display device in the real scene when the steering trigger instruction was received, and the steering angle.
In an exemplary embodiment, the display control module 403 may further be configured to further adjust the scene information faced by the corresponding character of the target object in the virtual scene based on the coordinates of the at least two positioning devices in the virtual scene while the target object is at its real-time position; that is, the scene information faced by the corresponding character in the virtual scene is adjusted in real time according to the real-time coordinates of the target object as it moves in the real scene.
In an exemplary embodiment, the steering angle may be 180 degrees.
In addition, for the processing performed by the steering control apparatus provided in this embodiment, reference may be made to the description of the steering control method of Embodiment 1, which is not repeated here.
Embodiment 3
This embodiment provides a steering control system, including a steering control apparatus and at least two positioning devices provided on a target object.
The at least two positioning devices are configured to determine the position of the target object in the real scene.
The steering control apparatus is configured to, after receiving a steering trigger instruction, determine, based on the position of the target object in the real scene and the steering angle indicated by the steering trigger instruction, the position of the corresponding character of the target object in the virtual scene after turning according to the steering angle, and to adjust, based on that position, the scene information faced by the corresponding character of the target object in the virtual scene.
The positioning devices may be configured to determine their coordinates in the real scene by receiving positioning signals transmitted by a signal transmitter.
In an exemplary embodiment, the steering control apparatus may be configured to, after receiving the steering trigger instruction, determine the position of the corresponding character of the target object in the virtual scene after turning according to the steering angle, based on the position of the target object in the real scene and the steering angle indicated by the steering trigger instruction, in the following manner:
determining, based on the coordinates of the at least two positioning devices in the real scene and the steering angle, the coordinates and poses of the at least two positioning devices in the virtual scene after the corresponding character of the target object turns in the virtual scene according to the steering angle.
The steering control system of the present application is described below by way of an example with reference to FIG. 5.
As shown in FIG. 5, the positioning devices may include two handle locators 502a and 502b and one head display locator 503. Each positioning device determines its coordinates in the first coordinate system of the real scene by receiving the positioning signals transmitted by the signal transmitter 501.
The signal transmitter 501 may periodically transmit a synchronization signal, a laser plane signal and an ultrasonic signal. The handle locators 502a and 502b and the head display locator 503 may each compute their own coordinates in the first coordinate system from the received synchronization, laser plane and ultrasonic signals, and then send the coordinate information to the control device 504 (for example, a smartphone or the head display). The control device 504 may convert the coordinate information of the handle locators 502a and 502b and the head display locator 503 into relative positions in the image it displays. When the steering trigger button on handle locator 502a or 502b or on the head display locator 503 is triggered, a steering trigger instruction is sent to the control device 504; after receiving the instruction, the control device 504 performs the steering processing through its internal steering control apparatus 5040, so that the image displayed on the control device 504 is the image seen after the user's corresponding character has turned in the virtual scene.
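One possible shape for the steering control apparatus 5040 inside the control device 504 is sketched below, reusing the turn_about_pivot helper from the earlier sketch. The class name, callback interface and device identifiers are assumptions made only for illustration; the embodiment does not prescribe a particular software structure.

```python
class SteeringController:
    """Stores the replacement origin when a steering trigger arrives and maps
    every subsequent locator update into the virtual scene."""

    def __init__(self, steering_angle_deg=180):
        self.steering_angle_deg = steering_angle_deg
        self.pivot = None        # replacement origin, set when the trigger fires
        self.latest = {}         # device id -> latest real-scene coordinates

    def on_locator_update(self, device_id, coords):
        """Called when a handle locator or the head display locator reports
        new coordinates in the first coordinate system."""
        self.latest[device_id] = coords
        return self.virtual_coords(device_id)

    def on_steering_trigger(self, head_device_id="head"):
        """Called when a steering trigger button is pressed; assumes the head
        display locator has already reported a position."""
        self.pivot = self.latest[head_device_id]

    def virtual_coords(self, device_id):
        coords = self.latest[device_id]
        if self.pivot is None:
            return coords        # no turn requested yet
        return turn_about_pivot(coords, self.pivot, self.steering_angle_deg)
```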
In addition, an embodiment of the present application further provides a terminal (for example, a positioning device, a smartphone or a head display), including a memory and a processor, where the memory stores a steering control program which, when executed by the processor, implements the steps of the steering control method described above.
In addition, an embodiment of the present application further provides a machine-readable medium storing a steering control program which, when executed by a processor, implements the steps of the steering control method described above.
Those of ordinary skill in the art will appreciate that all or some of the steps of the methods disclosed above, and the functional modules/units of the systems and apparatuses, may be implemented as software, firmware, hardware, or suitable combinations thereof. In a hardware implementation, the division between the functional modules/units mentioned in the above description does not necessarily correspond to the division of physical components; for example, one physical component may have multiple functions, or one function or step may be performed by several physical components in cooperation. Some or all of the components may be implemented as software executed by a processor, such as a digital signal processor or a microprocessor, or as hardware, or as an integrated circuit, such as an application-specific integrated circuit. Such software may be distributed on machine-readable media (for example, computer-readable media), which may include computer storage media (or non-transitory media) and communication media (or transitory media). As is well known to those of ordinary skill in the art, the term computer storage media includes volatile and non-volatile, removable and non-removable media implemented in any method or technology for storing information (such as computer-readable instructions, data structures, program modules or other data). Computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile discs (DVD) or other optical disc storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by a computer. Furthermore, as is well known to those of ordinary skill in the art, communication media typically embody computer-readable instructions, data structures, program modules or other data in a modulated data signal such as a carrier wave or other transport mechanism, and may include any information delivery media.
The above shows and describes the basic principles and main features of the present application as well as its advantages. The present application is not limited by the above embodiments; the above embodiments and the description only illustrate the principles of the present application, and various changes and improvements may be made without departing from the spirit and scope of the application, all of which fall within the scope of the claimed invention.
Industrial Applicability
The embodiments of the present application provide a steering control method, apparatus and system, which solve the problem that positioning cannot be performed after the user turns around in a specific positioning mode, thereby improving the user experience.

Claims (15)

1. A steering control method for adjusting scene information faced by a corresponding character of a target object in a virtual scene, the method comprising:
after receiving a steering trigger instruction, determining, based on a position of the target object in a real scene and a steering angle indicated by the steering trigger instruction, a position of the corresponding character of the target object in the virtual scene after turning according to the steering angle; and
adjusting, based on the position of the corresponding character of the target object in the virtual scene after turning according to the steering angle, the scene information faced by the corresponding character of the target object in the virtual scene.
2. The method according to claim 1, wherein at least two positioning devices are provided on the target object; and
the determining, based on the position of the target object in the real scene and the steering angle indicated by the steering trigger instruction, the position of the corresponding character of the target object in the virtual scene after turning according to the steering angle comprises:
determining, based on coordinates of the at least two positioning devices in the real scene and the steering angle, coordinates of the at least two positioning devices in the virtual scene after the corresponding character of the target object turns in the virtual scene according to the steering angle.
3. The method according to claim 2, wherein the determining, based on the position of the target object in the real scene and the steering angle indicated by the steering trigger instruction, the position of the corresponding character of the target object in the virtual scene after turning according to the steering angle further comprises:
determining, based on the coordinates of the at least two positioning devices in the real scene and the steering angle, poses of the at least two positioning devices in the virtual scene after the corresponding character of the target object turns in the virtual scene according to the steering angle.
4. The method according to claim 2, wherein, after the adjusting the scene information faced by the corresponding character of the target object in the virtual scene, the method further comprises:
when the target object moves in the real scene, determining coordinates of the at least two positioning devices in the virtual scene while the target object is at a real-time position, based on coordinates of the at least two positioning devices in the real scene at the real-time position, coordinates of at least one of the at least two positioning devices in the real scene when the steering trigger instruction was received, and the steering angle.
5. The method according to claim 4, wherein, after the determining the coordinates of the at least two positioning devices in the virtual scene while the target object is at the real-time position, the method further comprises: further adjusting, based on the coordinates of the at least two positioning devices in the virtual scene, the scene information faced by the corresponding character of the target object in the virtual scene.
6. The method according to any one of claims 1 to 5, wherein there are three positioning devices, one of which is mounted on a head-mounted display device and the other two of which are mounted on two handles, respectively.
7. The method according to any one of claims 1 to 5, wherein the steering angle is 180 degrees.
8. A steering control apparatus for adjusting scene information faced by a corresponding character of a target object in a virtual scene, the steering control apparatus comprising:
a receiving module, configured to receive a steering trigger instruction;
a position adjustment module, configured to, after the receiving module receives the steering trigger instruction, determine, based on a position of the target object in a real scene and a steering angle indicated by the steering trigger instruction, a position of the corresponding character of the target object in the virtual scene after turning according to the steering angle; and
a display control module, configured to adjust, based on the position of the corresponding character of the target object in the virtual scene after turning according to the steering angle, the scene information faced by the corresponding character of the target object in the virtual scene.
9. The apparatus according to claim 8, wherein at least two positioning devices are provided on the target object; and
the position adjustment module is configured to determine, based on coordinates of the at least two positioning devices in the real scene and the steering angle, coordinates and poses of the at least two positioning devices in the virtual scene after the corresponding character of the target object turns in the virtual scene according to the steering angle.
10. The apparatus according to claim 9, wherein the position adjustment module is further configured to, when the target object moves in the real scene, determine coordinates of the at least two positioning devices in the virtual scene while the target object is at a real-time position, based on coordinates of the at least two positioning devices in the real scene at the real-time position, coordinates of at least one of the at least two positioning devices in the real scene when the steering trigger instruction was received, and the steering angle.
11. A steering control system, comprising a steering control apparatus and at least two positioning devices provided on a target object;
wherein the at least two positioning devices are configured to determine a position of the target object in a real scene; and
the steering control apparatus is configured to, after receiving a steering trigger instruction, determine, based on the position of the target object in the real scene and a steering angle indicated by the steering trigger instruction, a position of a corresponding character of the target object in a virtual scene after turning according to the steering angle, and to adjust, based on the position of the corresponding character of the target object in the virtual scene after turning according to the steering angle, scene information faced by the corresponding character of the target object in the virtual scene.
12. The system according to claim 11, wherein the positioning devices are configured to determine their coordinates in the real scene by receiving positioning signals transmitted by a signal transmitter.
13. The system according to claim 11, wherein the steering control apparatus is configured to, after receiving the steering trigger instruction, determine the position of the corresponding character of the target object in the virtual scene after turning according to the steering angle, based on the position of the target object in the real scene and the steering angle indicated by the steering trigger instruction, in the following manner:
determining, based on coordinates of the at least two positioning devices in the real scene and the steering angle, coordinates and poses of the at least two positioning devices in the virtual scene after the corresponding character of the target object turns in the virtual scene according to the steering angle.
14. A terminal, comprising a memory and a processor, wherein the memory stores a steering control program which, when executed by the processor, implements the steps of the steering control method according to any one of claims 1 to 7.
15. A machine-readable medium storing a steering control program which, when executed by a processor, implements the steps of the steering control method according to any one of claims 1 to 7.
PCT/CN2018/080627 2017-05-05 2018-03-27 Method, device and system for turn control WO2018201825A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710312560.4A CN107247512B (en) 2017-05-05 2017-05-05 A kind of rotating direction control method, apparatus and system
CN201710312560.4 2017-05-05

Publications (1)

Publication Number Publication Date
WO2018201825A1 true WO2018201825A1 (en) 2018-11-08

Family

ID=60016899

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/080627 WO2018201825A1 (en) 2017-05-05 2018-03-27 Method, device and system for turn control

Country Status (2)

Country Link
CN (1) CN107247512B (en)
WO (1) WO2018201825A1 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107247512B (en) * 2017-05-05 2019-11-29 北京凌宇世纪信息科技有限公司 A kind of rotating direction control method, apparatus and system
CN108245890B (en) * 2018-02-28 2021-04-27 网易(杭州)网络有限公司 Method and device for controlling movement of object in virtual scene
CN108595022B (en) * 2018-04-27 2021-06-18 网易(杭州)网络有限公司 Virtual character advancing direction adjusting method and device, electronic equipment and storage medium
CN109710056A (en) * 2018-11-13 2019-05-03 宁波视睿迪光电有限公司 The display methods and device of virtual reality interactive device
CN109529320B (en) * 2018-12-29 2022-04-12 网易(杭州)网络有限公司 Steering control method and device in game
CN111383344A (en) * 2018-12-29 2020-07-07 深圳市优必选科技有限公司 Virtual scene generation method and device, computer equipment and storage medium
CN112569598A (en) * 2020-12-22 2021-03-30 上海幻电信息科技有限公司 Target object control method and device
CN113466788B (en) * 2021-07-20 2024-05-24 三星电子(中国)研发中心 Method, device and system for specifying object

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103823705A (en) * 2013-11-08 2014-05-28 广州菲动软件科技有限公司 Virtual character turning control method and virtual character turning control system
US20160260252A1 (en) * 2015-03-06 2016-09-08 Electronics And Telecommunications Research Institute System and method for virtual tour experience
CN106406525A (en) * 2016-09-07 2017-02-15 讯飞幻境(北京)科技有限公司 Virtual reality interaction method, device and equipment
CN107247512A (en) * 2017-05-05 2017-10-13 北京凌宇智控科技有限公司 A kind of rotating direction control method, apparatus and system

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN2892214Y (en) * 2006-04-30 2007-04-25 吴铁励 Entertainment machine by human's body gesture operation

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103823705A (en) * 2013-11-08 2014-05-28 广州菲动软件科技有限公司 Virtual character turning control method and virtual character turning control system
US20160260252A1 (en) * 2015-03-06 2016-09-08 Electronics And Telecommunications Research Institute System and method for virtual tour experience
CN106406525A (en) * 2016-09-07 2017-02-15 讯飞幻境(北京)科技有限公司 Virtual reality interaction method, device and equipment
CN107247512A (en) * 2017-05-05 2017-10-13 北京凌宇智控科技有限公司 A kind of rotating direction control method, apparatus and system

Also Published As

Publication number Publication date
CN107247512A (en) 2017-10-13
CN107247512B (en) 2019-11-29

Similar Documents

Publication Publication Date Title
WO2018201825A1 (en) Method, device and system for turn control
US11703886B2 (en) Control method, apparatus, and device, and UAV
US20230341930A1 (en) Systems and methods for tracking a controller
US20190011908A1 (en) Control method, control system, and smart glasses for first person view unmanned aerial vehicle flight
CN108292489B (en) Information processing apparatus and image generating method
US10665021B2 (en) Augmented reality apparatus and system, as well as image processing method and device
US20200404179A1 (en) Motion trajectory determination and time-lapse photography methods, device, and machine-readable storage medium
US20230384872A1 (en) Coordinating alignment of coordinate systems used for a computer generated reality device and a haptic device
US20190215505A1 (en) Information processing device, image generation method, and head-mounted display
WO2017213070A1 (en) Information processing device and method, and recording medium
JP6687751B2 (en) Image display system, image display device, control method thereof, and program
JP6113897B1 (en) Method for providing virtual space, method for providing virtual experience, program, and recording medium
US20220164981A1 (en) Information processing device, information processing method, and recording medium
WO2019176035A1 (en) Image generation device, image generation system, and image generation method
JP6159455B1 (en) Method, program, and recording medium for providing virtual space
EP3887926B1 (en) Smart head-mounted display alignment system and method
WO2020012997A1 (en) Information processing device, program, and information processing method
US11854234B2 (en) Calibration of mobile electronic devices connected to headsets wearable by users
WO2020213363A1 (en) Device provided with plurality of markers
WO2021177132A1 (en) Information processing device, information processing system, information processing method, and program
JP6332658B1 (en) Display control apparatus and program
JP6205047B1 (en) Information processing method and program for causing computer to execute information processing method
JP6448478B2 (en) A program that controls the head-mounted display.
CN105204609B (en) Depth camera chain
KR20240062865A (en) Wearable electronic device and operating method thereof

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18794493

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 17.03.2020)

122 Ep: pct application non-entry in european phase

Ref document number: 18794493

Country of ref document: EP

Kind code of ref document: A1