WO2024082996A1 - Interaction method, in-vehicle infotainment system, vehicle comprising same, and storage medium - Google Patents

Interaction method, in-vehicle infotainment system, vehicle comprising same, and storage medium Download PDF

Info

Publication number
WO2024082996A1
WO2024082996A1 PCT/CN2023/123723 CN2023123723W
Authority
WO
WIPO (PCT)
Prior art keywords
inertial measurement
measurement data
data
control
reality content
Prior art date
Application number
PCT/CN2023/123723
Other languages
French (fr)
Chinese (zh)
Inventor
孙艘
袁安贝
刘潇
Original Assignee
蔚来汽车科技(安徽)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 蔚来汽车科技(安徽)有限公司 filed Critical 蔚来汽车科技(安徽)有限公司
Publication of WO2024082996A1 publication Critical patent/WO2024082996A1/en

Links

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/023Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the present application relates to the field of extended reality display, and more specifically, to an interaction method, a vehicle system, a vehicle including the same, and a storage medium.
  • Bluetooth ring controllers are commonly used external control devices in the field of extended reality, but they are rarely used in vehicle environments.
  • Embodiments of the present application provide an interaction method, a vehicle system, a vehicle including the same, and a storage medium, for improving the operating accuracy of a control device such as a ring controller in a vehicle environment.
  • an interaction method includes: receiving control data about extended reality content, wherein the control data includes first inertial measurement data in a current environment; generating second inertial measurement data in the current environment; and generating a control ray for interacting with the extended reality content based on the first inertial measurement data and the second inertial measurement data.
  • control data also includes key control data and/or touch control data for interacting with the extended reality content.
  • control data comes from a ring controller.
  • generating a control ray for interacting with the extended reality content based on the first inertial measurement data and the second inertial measurement data includes: correcting the first inertial measurement data using the second inertial measurement data to generate drawing data about the control ray.
  • the method further includes: sending the drawing data to a rendering device of the extended reality content for rendering the control ray.
  • correcting the first inertial measurement data using the second inertial measurement data includes: extracting the overall motion in the current environment from the second inertial measurement data; and removing the motion data caused by that overall motion from the first inertial measurement data to generate drawing data about the control ray.
  • the first inertial measurement data and the second inertial measurement data are input into a neural network to generate the drawing data.
  • the extended reality content includes at least one of the following: augmented reality content, virtual reality content.
  • a vehicle system includes: a receiving unit configured to receive control data about extended reality content, wherein the control data includes first inertial measurement data in a current environment; a measuring unit configured to generate second inertial measurement data in the current environment; and a generating unit configured to generate a control ray for interacting with the extended reality content based on the first inertial measurement data and the second inertial measurement data.
  • control data also includes key control data and/or touch control data for interacting with the extended reality content.
  • control data comes from a ring controller.
  • the receiving unit communicates with the ring controller based on a Bluetooth protocol.
  • the generating unit is configured to correct the first inertial measurement data using the second inertial measurement data to generate drawing data about the control ray.
  • the generating unit is further configured to send the drawing data to a rendering device of the extended reality content for rendering the control ray.
  • the generating unit is configured to: extract the overall motion situation in the current environment according to the second inertial measurement data; and remove the motion data caused by the overall motion situation from the first inertial measurement data to generate drawing data about the control ray.
  • the generating unit is based on a neural network, and the generating unit inputs the first inertial measurement data and the second inertial measurement data into the neural network to generate the drawing data.
  • the extended reality content includes at least one of the following: augmented reality content, virtual reality content.
  • a vehicle system comprising: a memory configured to store instructions; and a processor configured to execute the instructions so as to perform any one of the methods described above.
  • a vehicle comprising any vehicle system as described above.
  • a computer-readable storage medium wherein instructions are stored in the computer-readable storage medium, and wherein when the instructions are executed by a processor, the processor is caused to execute any one of the methods described above.
  • the interaction method, vehicle system, vehicle including the same, and storage medium provided according to some embodiments of the present application can improve the operating accuracy of control devices such as ring controllers in a vehicle environment, thereby correctly reflecting the control intentions of the occupants and improving the user experience.
  • FIG1 shows an interaction scenario according to an embodiment of the present application
  • FIG2 shows an interaction method according to an embodiment of the present application
  • FIG3 shows a vehicle system according to an embodiment of the present application
  • FIG. 4 shows a vehicle system according to an embodiment of the present application.
  • FIG1 shows an interaction scenario according to an embodiment of the present application, which depicts the use of various extended reality devices (e.g., virtual reality devices, augmented reality devices) in a vehicle.
  • the vehicle is also in motion, and the extended reality device and the vehicle have the same speed in a certain direction.
  • unexpected vehicle movements may significantly affect the control direction of the occupant operator.
  • when the occupant operator wants to point to a fixed point in the virtual reality scene, a sudden acceleration or a bump may jolt the occupant operator's torso so that the controller can no longer point at the originally intended location.
  • the vehicle 100 may include multiple seats 101, and the number of passenger operators 200 (also referred to as objects 200 in this article) may not be unique.
  • the number of objects 200 can be at most equal to the number of seats 101 equipped in the vehicle 100.
  • the object 200 can interact with the extended reality screen 300 by wearing a ring controller 201, and the extended reality screen 300 can be generated by the head-mounted extended reality device (rendering device) of the object 200.
  • the object 200 can wear multiple ring controllers 201 to implement complex operations.
  • the extended reality screen 300 can be shared by multiple objects 200 in the vehicle 100. At this time, each object 200 in the vehicle 100 can wear a ring controller 201, and these ring controllers 201 can interact with the extended reality screen 300.
  • the extended reality screen 300 may display control rays 301 and 302 representing the current direction of the object 200. As shown in the figure, the root (vertex) of the control ray 302 will be connected to the ring controller 201 of the object 200 or its graphical representation.
  • the extended reality screen 300 may display a virtual hand, gun, laser pen, etc. as a graphical representation of the ring controller 201 of the object 200.
  • the ring controller 201 directly communicates with the head-mounted extended reality device of the object 200.
  • the ring controller 201 may not directly communicate with the head-mounted extended reality device; instead, the control data may be processed by the vehicle system of the vehicle 100 and the processing result reflected on the extended reality screen 300.
  • the interaction method 20 includes the following steps: receiving control data about extended reality content in step S22, wherein the control data includes first inertial measurement data in the current environment; generating second inertial measurement data in the current environment in step S24; and generating control rays for interacting with the extended reality content based on the first inertial measurement data and the second inertial measurement data in step S26.
  • the interaction method 20 can be executed by a vehicle system in the vehicle 100. By executing the above steps, the vehicle system can accurately reflect the original operation intention of the object 200 in the extended reality screen 300, thereby improving the interaction experience.
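The three steps of interaction method 20 (S22/S24/S26) can be sketched as follows. The function names and the 3-component displacement vectors are illustrative assumptions, not details from the patent; the plain subtraction in S26 is the simplest form of the correction discussed later.

```python
# Illustrative sketch of interaction method 20 (steps S22, S24, S26).
# All names and values are assumptions for demonstration only.

def receive_control_data():
    # S22: first inertial measurement data from the ring controller,
    # here a fixed displacement vector (X, Y, Z) plus optional key/touch data.
    return {"imu1": [0.2, 0.05, 0.0], "key": None, "touch": None}

def measure_vehicle_motion():
    # S24: second inertial measurement data from the vehicle system's own IMU.
    return [0.1, 0.0, 0.0]

def generate_control_ray(imu1, imu2):
    # S26: deduct the vehicle's motion from the controller's motion.
    return [a - b for a, b in zip(imu1, imu2)]

control = receive_control_data()
imu2 = measure_vehicle_motion()
ray = generate_control_ray(control["imu1"], imu2)
```

With the assumed values, the resulting ray displacement keeps only the occupant's own hand motion.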
  • the interaction method 20 receives control data about the extended reality content in step S22, wherein the control data includes inertial measurement data in the current environment (referred to herein as first inertial measurement data to distinguish it from the second inertial measurement data described below).
  • the control data here is generated by the object 200 through a device such as a ring controller 201, and the control data reflects the content of the interaction that the object 200 wants to perform with the extended reality screen 300.
  • the positions of the control rays 301 and 302 are generated based on the motion data of the ring controller 201.
  • the position of the corresponding control ray can be determined based on the displacement of the ring controller 201 relative to the original position.
  • the inertial measurement data generated by the ring controller 201 can be sent to the vehicle system of the vehicle 100 in real time for processing.
  • the control data (specifically, the first inertial measurement data) received in step S22 reflects, to some extent, a superposition of the torso movement of the object 200 and the movement of the vehicle 100.
  • the movement of the vehicle 100 is not what the object 200 expects when operating the ring controller 201. In other words, the movement of the vehicle 100 "interferes" with the object 200 operating the ring controller 201, and thus this interference factor needs to be eliminated.
  • the control data received in step S22 may also include key control data and touch control data for interacting with the extended reality content.
  • the ring controller 201 may also have keys and touch control components.
  • keys can be used to perform operations such as clicking, and touch control can also achieve similar effects. Therefore, the control data may also include key control data and touch control data to achieve fine interaction with the extended reality content.
  • the interaction method 20 generates inertial measurement data in the current environment in step S24, which is referred to herein as the second inertial measurement data.
  • the inertial measurement data generated by the vehicle system in step S24 is called the second inertial measurement data only to indicate that the two sets of data have different measuring subjects, that is, different sources.
  • the first inertial measurement data and the second inertial measurement data can have the same data format, etc. Since the vehicle system is fixedly arranged relative to the vehicle 100, the second inertial measurement data is actually a representation of the movement of the vehicle 100 measured by the vehicle system.
  • step S26 the interaction method 20 generates a control ray for interacting with the extended reality content based on the first inertial measurement data and the second inertial measurement data.
  • the first inertial measurement data received in step S22 reflects the superposition property of the torso movement of the object 200 and the movement of the vehicle 100
  • the second inertial measurement data generated in step S24 is the vehicle system's measurement of the movement of the vehicle 100. Therefore, the influence of the second inertial measurement data can be deducted from the first inertial measurement data, and the result obtained is the expected interaction action of the object 200 with the extended reality screen 300.
  • This interaction action can be represented by the control rays 301 and 302 in FIG. 1 .
  • the displacement of the control ray 301 or 302 can be obtained by subtracting the displacement in a certain direction (the direction of movement of the vehicle 100) from the first inertial measurement data, wherein the displacement in this direction can be derived through the second inertial measurement data.
  • the displacement of the first inertial measurement data in other directions can be filtered using an anti-shake algorithm.
  • the displacement of the first inertial measurement data in other directions can be smoothed. The smoothing method can be carried out according to the existing scheme, which will not be described in detail in this article.
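The subtraction-plus-smoothing described above can be sketched as follows. The moving-average window is an assumed anti-shake scheme, chosen only for illustration, since the patent defers to existing smoothing methods.

```python
# Sketch of the step-S26 correction: deduct the vehicle's displacement
# (second inertial measurement data) and smooth the residual with a
# simple moving average. Window size and smoothing scheme are assumptions.
from collections import deque

class RayCorrector:
    def __init__(self, window=3):
        self.history = deque(maxlen=window)  # recent corrected samples

    def correct(self, imu1, imu2):
        # Remove the displacement caused by the vehicle's overall motion.
        residual = [a - b for a, b in zip(imu1, imu2)]
        self.history.append(residual)
        # Anti-shake: average the recent residuals component-wise.
        n = len(self.history)
        return [sum(s[i] for s in self.history) / n for i in range(3)]

corrector = RayCorrector()
ray = corrector.correct([0.2, 0.05, 0.0], [0.1, 0.0, 0.0])
```

The deque keeps only the most recent samples, so sudden spikes (bumps) are damped rather than passed straight into the drawn ray.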
  • the first inertial measurement data can be corrected using the second inertial measurement data to generate drawing data about the control ray.
  • this correction can be mathematically expressed as an operation of motion vectors in three dimensions of space.
  • the correction can also be performed in a way that cannot be clearly expressed as a mathematical formula.
  • a correction scheme can be established under different second inertial measurement data scenarios based on preliminary experiments. When it is confirmed that a certain second inertial measurement data scenario has been entered, the correction scheme for the first inertial measurement data can be determined by methods such as a table lookup method. In addition, this "correction" will also filter out interference caused by vehicle bumps.
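A table-lookup scheme of the kind described above could look like the following sketch. The scenario names, thresholds, and coefficients are hypothetical stand-ins for values that would be established in preliminary experiments.

```python
# Hypothetical lookup of a correction scheme keyed by the current
# second-inertial-measurement scenario. All names, thresholds, and
# coefficients below are assumed for illustration.
CORRECTION_TABLE = {
    "cruising":     0.9,
    "accelerating": 0.8,
    "bumpy":        0.6,   # stronger correction also suppresses bump interference
}

def classify_scenario(longitudinal_accel, vertical_accel):
    # Decide which pre-established scenario the vehicle has entered.
    if abs(vertical_accel) > 0.5:
        return "bumpy"
    if abs(longitudinal_accel) > 1.0:
        return "accelerating"
    return "cruising"

def correction_coefficient(longitudinal_accel, vertical_accel):
    return CORRECTION_TABLE[classify_scenario(longitudinal_accel, vertical_accel)]

k = correction_coefficient(1.5, 0.0)
```

Once the scenario is identified, the coefficient is read from the table and applied to the first inertial measurement data instead of evaluating a closed-form model online.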
  • step S26 using the second inertial measurement data to correct the first inertial measurement data includes: extracting the overall motion situation in the current environment according to the second inertial measurement data; and removing the motion data caused by the overall motion situation from the first inertial measurement data to generate drawing data about the control ray.
  • for example, suppose the first inertial measurement data has a displacement of 0.2 on the spatial X-axis at a certain moment, while the second inertial measurement data has a displacement of 0.1 on the X-axis at the same moment (the overall motion of the vehicle 100 and the object 200); a correction coefficient of 0.8 is applied, and this coefficient need not be fixed at 0.8 but can be adjusted with acceleration.
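One plausible reading of this numeric example (the source gives the values 0.2, 0.1, and 0.8 but does not state the formula itself) is that the vehicle-induced displacement is scaled by the coefficient and deducted:

```python
# Assumed form of the per-axis correction: corrected = imu1 - k * imu2.
def corrected_x(imu1_x, imu2_x, k=0.8):
    return imu1_x - k * imu2_x

# Hypothetical acceleration-dependent coefficient schedule (gain is assumed).
def coefficient(accel, k0=0.8, gain=0.05):
    return min(1.0, k0 + gain * abs(accel))

value = corrected_x(0.2, 0.1)  # ≈ 0.12
```

Under this reading, a harder acceleration raises the coefficient toward 1.0, deducting more of the vehicle's motion from the controller's measurement.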
  • in step S26, the first inertial measurement data and the second inertial measurement data are input into a neural network to generate the drawing data.
  • processing the inertial measurement data with a neural network offers adaptive self-learning, automatically captures characteristics of the environment, and provides good fault tolerance and strong anti-interference ability. Since the neural network does not require an explicit mathematical model, the development process can be greatly shortened.
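As a toy stand-in for such a network, a single linear layer fusing the two data sources already expresses the "deduct the vehicle's motion" behavior when its weights are (1, −1). A real network architecture and its trained weights are of course not specified by the source.

```python
# Toy linear fusion of first and second inertial measurement data.
# In a real system the weights would be learned from data, not hand-set.
def fuse(imu1, imu2, weights):
    # weights: one (w1, w2) pair per axis, applied to the two data sources
    return [w1 * a + w2 * b for (w1, w2), a, b in zip(weights, imu1, imu2)]

weights = [(1.0, -1.0)] * 3        # reduces to plain subtraction
drawing = fuse([0.2, 0.05, 0.0], [0.1, 0.0, 0.0], weights)
```

Training would replace the fixed pairs with weights that also absorb bump interference, which is where the claimed anti-interference benefit would come from.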
  • the interaction method 20 further includes the following steps (not shown in FIG. 2 ): sending the drawing data to the rendering device of the extended reality content for rendering the control ray.
  • the drawing data generated by the vehicle system is a mathematical expression of the control rays 301 and 302, which can be sent to the head-mounted rendering device to realize imaging, so that the object 200 can interact with the extended reality screen 300 according to the control rays 301 and 302.
  • the extended reality device can be a virtual reality device, an augmented reality device, etc.
  • the extended reality content in this application can be augmented reality content, virtual reality content, or a combination of the two.
  • the extended reality content can also be visual content generated by other extended reality solutions developed in the future.
  • the vehicle system 30 includes a receiving unit 31, a measuring unit 32, and a generating unit 33.
  • the vehicle system 30 can accurately reflect the original operation intention of the object 200 in the extended reality screen 300, thereby improving the interactive experience.
  • other features of the vehicle system 30 can be understood with reference to the interaction method 20 described above.
  • the receiving unit 31 of the vehicle system 30 is configured to receive control data about the extended reality content, wherein the control data includes the first inertial measurement data in the current environment.
  • the control data here is generated by the object 200 through a device such as a ring controller 201, and the control data reflects the interaction content that the object 200 wants to perform with the extended reality screen 300.
  • the control data received by the receiving unit 31 may also include key control data and touch control data for interacting with the extended reality content.
  • the ring controller 201 may also have keys and touch control components. For example, keys can be used to perform operations such as clicking, and touch control can also achieve similar effects. Therefore, the control data may also include key control data and touch control data to achieve fine interaction with the extended reality content.
  • the receiving unit 31 communicates with the ring controller 201 based on the Bluetooth protocol. In other examples, other wireless communication protocols may be used to implement communication between the vehicle system 30 and the ring controller 201.
  • the measuring unit 32 of the vehicle system 30 is configured to generate the second inertial measurement data in the current environment.
  • the inertial measurement data generated by the vehicle system 30 through the measuring unit 32 is called the second inertial measurement data only to indicate that the two sets of data have different measuring subjects, that is, different sources.
  • the first inertial measurement data and the second inertial measurement data can have the same data format, etc. Since the vehicle system 30 is fixedly arranged relative to the vehicle 100, the second inertial measurement data is actually a representation of the movement of the vehicle 100 measured by the vehicle system 30.
  • the generation unit 33 of the vehicle system 30 is configured to generate a control ray for interacting with the extended reality content based on the first inertial measurement data and the second inertial measurement data.
  • the first inertial measurement data received by the receiving unit 31 reflects the superposition property of the torso movement of the object 200 and the movement of the vehicle 100
  • the second inertial measurement data generated by the measuring unit 32 is a representation of the movement of the vehicle 100 measured by the vehicle system 30. Therefore, the influence of the second inertial measurement data can be deducted from the first inertial measurement data, and the result obtained is the expected interaction action of the object 200 with the extended reality screen 300.
  • This interaction action can be expressed as control rays 301 and 302 in Figure 1.
  • the displacement of the control ray 301 or 302 can be obtained by subtracting the displacement in a certain direction (the direction of movement of the vehicle 100) from the first inertial measurement data, wherein the displacement in this direction can be derived through the second inertial measurement data.
  • the displacement of the first inertial measurement data in other directions can be filtered using an anti-shake algorithm.
  • the displacement of the first inertial measurement data in other directions can be smoothed. The smoothing method can be carried out according to the existing scheme, which is not described in detail in this article.
  • the generation unit 33 is configured to correct the first inertial measurement data using the second inertial measurement data to generate drawing data about the control ray.
  • this correction can be mathematically expressed as an operation of a motion vector in three dimensions of space.
  • the correction can also be performed in a way that cannot be clearly expressed as a mathematical formula.
  • a correction scheme can be established under different second inertial measurement data scenarios based on preliminary experiments. When it is confirmed that a certain second inertial measurement data scenario has been entered, the correction scheme for the first inertial measurement data can be determined by a method such as a table lookup. In addition, this "correction" will also filter out interference caused by vehicle bumps.
  • the generating unit 33 is configured to: extract the overall motion in the current environment from the second inertial measurement data; and remove the motion data caused by that overall motion from the first inertial measurement data to generate drawing data about the control ray.
  • for example, suppose the first inertial measurement data has a displacement of 0.2 on the spatial X-axis at a certain moment, while the second inertial measurement data has a displacement of 0.1 on the X-axis at the same moment (the overall motion of the vehicle 100 and the object 200); a correction coefficient of 0.8 is applied, and this coefficient need not be fixed at 0.8 but can be adjusted with acceleration.
  • the generating unit 33 is based on a neural network, and the generating unit 33 inputs the first inertial measurement data and the second inertial measurement data into the neural network to generate drawing data.
  • processing inertial measurement data with a neural network offers adaptive self-learning, automatically captures environmental characteristics, and provides good fault tolerance and strong anti-interference ability. Since the neural network does not require an explicit mathematical model, the development process can be greatly shortened.
  • the generation unit 33 is further configured to send the drawing data to the rendering device of the extended reality content for rendering the control ray.
  • the drawing data generated by the vehicle system 30 is a mathematical expression of the control rays 301 and 302, which can be sent to the head-mounted rendering device to realize imaging, so that the object 200 can interact with the extended reality screen 300 according to the control rays 301 and 302.
  • the extended reality content can be augmented reality content, virtual reality content, or a combination of the two.
  • the extended reality content can also be visual content generated by other extended reality solutions developed in the future.
  • the vehicle system 40 includes a memory 41 and a processor 42.
  • the processor 42 can read data from the memory 41 and can write data thereto.
  • the memory 41 is configured to store instructions, and the processor 42 is configured to execute any one of the interaction methods described above when executing the instructions stored in the memory 41.
  • the memory 41 can have the characteristics of a computer-readable storage medium as described below, and the details will be described below.
  • Another aspect of the present application provides a vehicle, wherein the vehicle includes any vehicle system as described above.
  • a computer-readable storage medium wherein instructions are stored, and when the instructions are executed by a processor, the processor executes any one of the interaction methods described above.
  • the computer-readable medium referred to in this application includes various types of computer storage media, which can be any available medium that can be accessed by a general-purpose or special-purpose computer.
  • the computer-readable medium may include RAM, ROM, EPROM, EEPROM, registers, a hard disk, a removable disk, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other transitory or non-transitory medium that can carry or store desired program code in the form of instructions or data structures and that can be accessed by a general-purpose or special-purpose computer or processor.
  • disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
  • the above combination should also be included in the protection scope of the computer-readable medium.
  • the exemplary storage medium is coupled to the processor so that the processor can read and write information from/to the storage medium.
  • the storage medium can be integrated into the processor.
  • the processor and the storage medium can reside in the ASIC.
  • the ASIC can reside in the user terminal.
  • the processor and the storage medium can reside in the user terminal as discrete components.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An interaction method (20), an in-vehicle infotainment system (30), a vehicle (100) comprising same, and a storage medium. The interaction method (20) comprises: receiving control data about extended reality content, the control data comprising first inertial measurement data in the current environment (S22); generating second inertial measurement data in the current environment (S24); and, on the basis of the first inertial measurement data and the second inertial measurement data, generating control rays for interacting with the extended reality content (S26).

Description

交互方法、车机系统及包括其的车辆、存储介质Interaction method, vehicle system, vehicle including the same, and storage medium 技术领域Technical Field
本申请涉及扩展现实显示的领域,具体而言,涉及交互方法、车机系统及包括其的车辆、存储介质。The present application relates to the field of extended reality display, and more specifically, to an interaction method, a vehicle system, a vehicle including the same, and a storage medium.
背景技术Background technique
目前,诸如蓝牙指环控制器等是扩展现实领域中常用的外用控制设备,但是车载环境下的应用却很少。一是因为车载环境下对蓝牙指环控制器等的需求不大;二是由于行车状态存在不稳定性,乘员的操作容易受到行车状态的干扰,因而蓝牙指环控制器等产生的交互信息也会受到较大影响。At present, Bluetooth ring controllers are commonly used external control devices in the field of extended reality, but they are rarely used in vehicle environments. First, there is little demand for Bluetooth ring controllers in vehicle environments; second, due to the instability of driving conditions, the operation of passengers is easily affected by the driving conditions, so the interactive information generated by Bluetooth ring controllers will also be greatly affected.
有鉴于此,需要提出一种改进的交互机制。In view of this, an improved interaction mechanism needs to be proposed.
发明内容Summary of the invention
本申请的实施例提供了一种交互方法、车机系统及包括其的车辆、存储介质,用于提高诸如指环控制器等控制设备在车载环境下的操作精度。Embodiments of the present application provide an interactive method, a vehicle system, a vehicle including the same, and a storage medium, for improving the operating accuracy of a control device such as a ring controller in a vehicle environment.
According to one aspect of the present application, an interaction method is provided. The method includes: receiving control data concerning extended reality content, wherein the control data includes first inertial measurement data acquired in the current environment; generating second inertial measurement data in the current environment; and generating, based on the first inertial measurement data and the second inertial measurement data, a control ray for interacting with the extended reality content.
In some embodiments of the present application, optionally, the control data further includes key-press control data and/or touch control data for interacting with the extended reality content.
In some embodiments of the present application, optionally, the control data comes from a ring controller.
In some embodiments of the present application, optionally, generating the control ray for interacting with the extended reality content based on the first inertial measurement data and the second inertial measurement data includes: correcting the first inertial measurement data using the second inertial measurement data, so as to generate drawing data for the control ray.
In some embodiments of the present application, optionally, the method further includes: sending the drawing data to a rendering device of the extended reality content for rendering the control ray.
In some embodiments of the present application, optionally, correcting the first inertial measurement data using the second inertial measurement data includes: extracting the overall motion of the current environment from the second inertial measurement data; and removing, from the first inertial measurement data, the motion data caused by the overall motion, so as to generate the drawing data for the control ray.
In some embodiments of the present application, optionally, the first inertial measurement data and the second inertial measurement data are input into a neural network to generate the drawing data.
In some embodiments of the present application, optionally, the extended reality content includes at least one of the following: augmented reality content and virtual reality content.
According to another aspect of the present application, an in-vehicle infotainment system is provided. The system includes: a receiving unit configured to receive control data concerning extended reality content, wherein the control data includes first inertial measurement data acquired in the current environment; a measuring unit configured to generate second inertial measurement data in the current environment; and a generating unit configured to generate, based on the first inertial measurement data and the second inertial measurement data, a control ray for interacting with the extended reality content.
In some embodiments of the present application, optionally, the control data further includes key-press control data and/or touch control data for interacting with the extended reality content.
In some embodiments of the present application, optionally, the control data comes from a ring controller.
In some embodiments of the present application, optionally, the receiving unit communicates with the ring controller via the Bluetooth protocol.
In some embodiments of the present application, optionally, the generating unit is configured to correct the first inertial measurement data using the second inertial measurement data, so as to generate drawing data for the control ray.
In some embodiments of the present application, optionally, the generating unit is further configured to send the drawing data to a rendering device of the extended reality content for rendering the control ray.
In some embodiments of the present application, optionally, the generating unit is configured to: extract the overall motion of the current environment from the second inertial measurement data; and remove, from the first inertial measurement data, the motion data caused by the overall motion, so as to generate the drawing data for the control ray.
In some embodiments of the present application, optionally, the generating unit is based on a neural network, and the generating unit inputs the first inertial measurement data and the second inertial measurement data into the neural network to generate the drawing data.
In some embodiments of the present application, optionally, the extended reality content includes at least one of the following: augmented reality content and virtual reality content.
According to another aspect of the present application, an in-vehicle infotainment system is provided. The system includes: a memory configured to store instructions; and a processor configured to execute the instructions so as to perform any one of the methods described above.
According to another aspect of the present application, a vehicle is provided. The vehicle includes any one of the in-vehicle infotainment systems described above.
According to another aspect of the present application, a computer-readable storage medium is provided, in which instructions are stored, wherein when the instructions are executed by a processor, the processor is caused to perform any one of the methods described above.
The interaction method, in-vehicle infotainment system, vehicle including the same, and storage medium provided according to some embodiments of the present application can improve the operating accuracy of control devices such as ring controllers in an in-vehicle environment, so that an occupant's control intention is correctly reflected and the user experience is improved.
BRIEF DESCRIPTION OF THE DRAWINGS
The above and other objects and advantages of the present application will become more fully apparent from the following detailed description taken in conjunction with the accompanying drawings, in which identical or similar elements are denoted by the same reference numerals.
FIG. 1 shows an interaction scenario according to an embodiment of the present application;
FIG. 2 shows an interaction method according to an embodiment of the present application;
FIG. 3 shows an in-vehicle infotainment system according to an embodiment of the present application;
FIG. 4 shows an in-vehicle infotainment system according to an embodiment of the present application.
DETAILED DESCRIPTION
For the sake of brevity and illustration, the principles of the present application are described herein mainly with reference to exemplary embodiments thereof. However, those skilled in the art will readily recognize that the same principles are equally applicable to all types of interaction methods, in-vehicle infotainment systems, vehicles including the same, and storage media, and that these same or similar principles may be implemented therein, without any such variation departing from the true spirit and scope of the present application.
FIG. 1 shows an interaction scenario according to an embodiment of the present application, depicting the use of various extended reality devices (for example, virtual reality devices and augmented reality devices) in a vehicle. Unlike in a stationary space, when an extended reality device is used in a vehicle environment the vehicle itself is also in motion, so the extended reality device shares the vehicle's velocity along a certain direction.
Moreover, the use of an extended reality device in a vehicle environment is susceptible to various unexpected movements of the vehicle. For example, an unexpected movement of the vehicle may significantly disturb the pointing direction controlled by an occupant operator. When the occupant operator intends to point at a fixed point in a virtual reality scene, a sudden acceleration or a bump may jolt the operator's torso, so that the operator can no longer point at the originally intended location.
The principle of using various extended reality devices in the vehicle 100 is described below with reference to FIG. 1. The vehicle 100 may include a plurality of seats 101, and there may be more than one occupant operator 200 (also referred to herein as an object 200); the number of objects 200 may be at most equal to the number of seats 101 provided in the vehicle 100. An object 200 may interact with an extended reality screen 300 by wearing a device such as a ring controller 201, and the extended reality screen 300 may be generated by the head-mounted extended reality device (rendering device) of the object 200. An object 200 may wear a plurality of ring controllers 201 to carry out complex operations. In addition, the extended reality screen 300 may be shared by a plurality of objects 200 in the vehicle 100; in that case, each object 200 in the vehicle 100 may wear a ring controller 201, and all of these ring controllers 201 can interact with the extended reality screen 300.
When an object interacts with the extended reality screen 300 through the ring controller 201, the extended reality screen 300 may display control rays 301 and 302 that represent the current pointing direction of the object 200. As shown in the figure, the root (vertex) of the control ray 302 is connected to the ring controller 201 of the object 200 or to its graphical representation. For example, the extended reality screen 300 may display a virtual hand, gun, laser pointer, or the like as the graphical representation of the ring controller 201 of the object 200.
In conventional solutions, the ring controller 201 communicates directly with the head-mounted extended reality device of the object 200. In the present application, a device such as the ring controller 201 need not communicate directly with the head-mounted extended reality device; instead, the control data is processed by the in-vehicle infotainment system of the vehicle 100 and the processing result is reflected on the extended reality screen 300.
One aspect of the present application provides an interaction method. As shown in FIG. 2, the interaction method 20 includes the following steps: in step S22, receiving control data concerning extended reality content, wherein the control data includes first inertial measurement data acquired in the current environment; in step S24, generating second inertial measurement data in the current environment; and in step S26, generating, based on the first inertial measurement data and the second inertial measurement data, a control ray for interacting with the extended reality content. The interaction method 20 may be executed by the in-vehicle infotainment system of the vehicle 100; by performing these steps, the system can accurately reflect the original operating intention of the object 200 in the extended reality screen 300, thereby improving the interaction experience.
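The three steps above can be sketched in code. This is an illustrative sketch only, not the patented implementation; the function names, the dictionary layout of the control data, and the representation of inertial measurement data as per-axis displacement vectors are all assumptions made for the example.

```python
# Sketch of interaction method 20 (steps S22, S24, S26). The data layout
# and names are assumptions; IMU data is modeled as [x, y, z] displacements.

def generate_control_ray(first_imu, second_imu):
    """Step S26: subtract the vehicle's motion (second IMU data) from the
    controller's motion (first IMU data) to recover the occupant's intent."""
    return [a - b for a, b in zip(first_imu, second_imu)]

def interaction_method(control_data, vehicle_imu):
    # Step S22: receive control data; it carries the first IMU data.
    first_imu = control_data["first_imu"]
    # Step S24: the infotainment system produces the second IMU data.
    second_imu = vehicle_imu
    # Step S26: fuse both into a direction vector for the control ray.
    return generate_control_ray(first_imu, second_imu)

ray = interaction_method({"first_imu": [0.2, 0.05, 0.0]}, [0.1, 0.0, 0.0])
```

The per-component subtraction is only the simplest possible fusion; the embodiments below refine it with correction coefficients, filtering, or a neural network.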
In step S22, the interaction method 20 receives control data concerning the extended reality content, wherein the control data includes inertial measurement data acquired in the current environment (referred to herein as first inertial measurement data to distinguish it from the second inertial measurement data described below). This control data is generated by the object 200 through a device such as the ring controller 201 and reflects the interaction that the object 200 intends to perform with the extended reality screen 300.
As described with reference to FIG. 1, the positions of the control rays 301 and 302 are derived from the motion data of the ring controller 201. For example, the position of the corresponding control ray may be determined from the displacement of the ring controller 201 relative to its original position. To determine the real-time positions of the control rays 301 and 302, the inertial measurement data produced by the ring controller 201 may be sent in real time to the in-vehicle infotainment system of the vehicle 100 for processing.
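As a hedged illustration of how raw controller IMU samples might be turned into the displacement that positions a control ray, the sketch below double-integrates acceleration along one axis using the trapezoidal rule. The patent does not specify an integration scheme; this is an assumption, and a production system would additionally need bias and drift compensation.

```python
# Hypothetical conversion of accelerometer samples to a displacement along
# one axis: integrate acceleration to velocity, then velocity to position.

def displacement_from_accel(accel_samples, dt):
    """Trapezoidal double integration of per-sample acceleration (m/s^2)."""
    velocity = 0.0
    position = 0.0
    prev_a = 0.0
    for a in accel_samples:
        velocity += 0.5 * (prev_a + a) * dt  # trapezoidal velocity update
        position += velocity * dt            # accumulate displacement
        prev_a = a
    return position

# Constant 1 m/s^2 for 1 s sampled at 100 Hz gives roughly 0.5 m.
d = displacement_from_accel([1.0] * 100, 0.01)
```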
It should be understood that when an occupant attempts to interact with the extended reality screen 300 inside a jolting vehicle 100, the control data received in step S22 (specifically, the first inertial measurement data) reflects, to some extent, the superposition of the torso movement of the object 200 and the motion of the vehicle 100. The motion of the vehicle 100, however, is not what the object 200 intends when operating the ring controller 201. In other words, the motion of the vehicle 100 "interferes" with the object 200's operation of the ring controller 201, and this interference therefore needs to be eliminated.
In some embodiments of the present application, the control data received in step S22 may further include key-press control data and touch control data for interacting with the extended reality content. In addition to indicating direction, the ring controller 201 may be provided with key and touch control components. For example, keys may be used to perform operations such as selecting items, and touch control can serve a similar purpose. The control data may therefore also include key-press control data and touch control data for fine-grained interaction with the extended reality content.
In step S24, the interaction method 20 generates inertial measurement data in the current environment, referred to herein as second inertial measurement data. Calling the data generated by the in-vehicle infotainment system in step S24 "second" is intended only to indicate that the two sets of data are produced by different entities, i.e., that their sources differ; the first and second inertial measurement data may nevertheless share the same data format and so on. Since the in-vehicle infotainment system is fixed relative to the vehicle 100, the second inertial measurement data is in fact a characterization, measured by the system, of the motion of the vehicle 100.
In step S26, the interaction method 20 generates, based on the first inertial measurement data and the second inertial measurement data, a control ray for interacting with the extended reality content. As described above, the first inertial measurement data received in step S22 reflects the superposition of the torso movement of the object 200 and the motion of the vehicle 100, while the second inertial measurement data generated in step S24 characterizes the motion of the vehicle 100 as measured by the in-vehicle infotainment system. The influence of the second inertial measurement data can therefore be subtracted from the first inertial measurement data, and the result is the interaction that the object 200 intends to perform with the extended reality screen 300. This interaction may be expressed as the control rays 301 and 302 in FIG. 1.
For example, in some examples, the displacement of the control ray 301 or 302 can be obtained by subtracting from the first inertial measurement data the displacement along a certain direction (the direction of motion of the vehicle 100), where the displacement along that direction can be derived from the second inertial measurement data. Specifically, if the second inertial measurement data shows a large displacement in a certain direction, that displacement is likely caused by the motion of the vehicle 100, whereas smaller displacements of the second inertial measurement data in other directions may be caused by bumps of the vehicle 100 and the like. The displacements of the first inertial measurement data in those other directions can therefore be filtered with an anti-shake algorithm; for example, they can be smoothed. The smoothing may follow existing schemes and is not described in detail here.
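The per-axis treatment described above can be sketched as follows: the vehicle's displacement is subtracted outright along its direction of motion, while the remaining axes pass through a simple exponential moving average standing in for the anti-shake filter. The axis layout, the smoothing factor `alpha`, and the stateful filter design are assumptions made for illustration, not the patent's algorithm.

```python
# One time step of the hypothetical per-axis correction. first_imu and
# second_imu are [x, y, z] displacements; state holds the filter memory.

def correct_sample(first_imu, second_imu, motion_axis, state, alpha=0.5):
    corrected = []
    for axis in range(3):
        if axis == motion_axis:
            # Vehicle's forward motion: subtract it outright.
            value = first_imu[axis] - second_imu[axis]
        else:
            # Residual jitter from bumps: low-pass filter it.
            value = alpha * first_imu[axis] + (1 - alpha) * state[axis]
        state[axis] = value
        corrected.append(value)
    return corrected

state = [0.0, 0.0, 0.0]
out = correct_sample([0.2, 0.04, -0.02], [0.1, 0.0, 0.0],
                     motion_axis=0, state=state)
```

An exponential moving average is only one of many smoothing choices; any existing anti-shake scheme could be substituted without changing the overall flow.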
In some embodiments of the present application, in step S26 the first inertial measurement data may be corrected using the second inertial measurement data to generate drawing data for the control ray. In some examples, this correction can be expressed mathematically as operations on motion vectors in the three spatial dimensions. In other examples, the correction may be performed in a way that cannot be expressed explicitly as a mathematical formula; for example, correction schemes for different second-inertial-measurement-data scenarios can be established from prior experiments, and once the current scenario is identified as matching one of them, the correction scheme for the first inertial measurement data can be determined by, for example, a table lookup. In addition, this "correction" also filters out the interference caused by vehicle bumps.
In some embodiments of the present application, correcting the first inertial measurement data using the second inertial measurement data in step S26 includes: extracting the overall motion of the current environment from the second inertial measurement data; and removing, from the first inertial measurement data, the motion data caused by that overall motion, so as to generate the drawing data for the control ray. For example, if the first inertial measurement data shows a displacement of 0.2 along the spatial X-axis at some moment while the second inertial measurement data shows a displacement of 0.1 along the same axis (the overall motion of the vehicle 100 and the object 200), it can be inferred that the ring controller 201 was displaced by 0.2 - 0.1 = 0.1. In other examples, considering that the object 200 also performs a certain degree of self-correction when the speed changes sharply, the displacement of the ring controller 201 may be inferred as 0.2 - 0.1 * 0.8 = 0.12, where 0.8 is a correction coefficient. These examples are for illustration only; the actual situation is much more complex, and the coefficient, for instance, need not be fixed at 0.8 but may be adjusted with acceleration.
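The arithmetic in this paragraph can be made concrete with a small sketch. The coefficient schedule below (full removal at low acceleration, 0.8 at sharper acceleration) is a hypothetical placeholder; the paragraph only notes that the coefficient may vary with acceleration, without specifying how.

```python
# Hypothetical acceleration-dependent correction coefficient, worked on the
# paragraph's single-axis example.

def correction_coefficient(acceleration):
    """Assumed schedule: the occupant self-corrects more under sharp
    acceleration, so less of the vehicle's displacement is removed."""
    if abs(acceleration) < 1.0:
        return 1.0   # gentle motion: remove the vehicle displacement fully
    return 0.8       # sharp motion: occupant compensates, remove only 80%

def controller_displacement(first_x, second_x, acceleration):
    return first_x - second_x * correction_coefficient(acceleration)

# The paragraph's example: 0.2 - 0.1 * 0.8 = 0.12 under sharp acceleration.
d_sharp = controller_displacement(0.2, 0.1, acceleration=3.0)
# Without self-correction: 0.2 - 0.1 = 0.1.
d_gentle = controller_displacement(0.2, 0.1, acceleration=0.5)
```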
In some embodiments of the present application, in step S26 the first and second inertial measurement data may also be input into a neural network to generate the drawing data. Processing inertial measurement data with neural network techniques is adaptive and self-learning; such processing can also capture environmental characteristics automatically, tolerates faults well, and resists interference. Since a neural network does not require an explicit mathematical relationship to be established, the development process is also greatly shortened.
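A minimal sketch of this neural-network variant is given below: a tiny fixed-weight multilayer perceptron in plain Python that maps the concatenated first and second inertial measurement vectors to a three-dimensional drawing vector. The architecture, the hand-picked weights, and the single hidden unit are illustrative assumptions; in practice the network would be trained on recorded in-vehicle data rather than constructed by hand.

```python
import math

def mlp_drawing_data(first_imu, second_imu, w_hidden, w_out):
    """Forward pass of a toy MLP over the concatenated IMU streams."""
    x = first_imu + second_imu              # concatenate the two 3-D inputs
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)))
              for row in w_hidden]
    return [sum(w * h for w, h in zip(row, hidden)) for row in w_out]

# Toy weights that approximate "first minus second" on the X-axis.
w_hidden = [[1.0, 0.0, 0.0, -1.0, 0.0, 0.0]]  # 1 hidden unit, 6 inputs
w_out = [[1.0], [0.0], [0.0]]                  # 3 outputs, 1 hidden unit
ray = mlp_drawing_data([0.2, 0.0, 0.0], [0.1, 0.0, 0.0], w_hidden, w_out)
```

Because tanh is nearly linear around zero, the toy network's X output approximates the 0.2 - 0.1 subtraction of the earlier example; a trained network would instead learn such corrections, including bump filtering, from data.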
In some embodiments of the present application, the interaction method 20 further includes the following step (not shown in FIG. 2): sending the drawing data to the rendering device of the extended reality content for rendering the control ray. The drawing data produced by the in-vehicle infotainment system is the mathematical expression of the control rays 301 and 302; it can be sent to the head-mounted rendering device for imaging, so that the object 200 can interact with the extended reality screen 300 according to the control rays 301 and 302.
As mentioned above, the extended reality device may be a virtual reality device, an augmented reality device, or the like, so the extended reality content in the present application may be augmented reality content, virtual reality content, or some combination of the two. The extended reality content may also be visual content produced by other extended reality solutions developed in the future.
Another aspect of the present application provides an in-vehicle infotainment system. As shown in FIG. 3, the system 30 includes a receiving unit 31, a measuring unit 32, and a generating unit 33. The system 30 can accurately reflect the original operating intention of the object 200 in the extended reality screen 300, thereby improving the interaction experience. Apart from the detailed description below, other features of the system 30 may be implemented with reference to the interaction method 20 described above.
The receiving unit 31 of the in-vehicle infotainment system 30 is configured to receive control data concerning extended reality content, wherein the control data includes first inertial measurement data acquired in the current environment. This control data is generated by the object 200 through a device such as the ring controller 201 and reflects the interaction that the object 200 intends to perform with the extended reality screen 300.
In some embodiments of the present application, the control data received by the receiving unit 31 may further include key-press control data and touch control data for interacting with the extended reality content. In addition to indicating direction, the ring controller 201 may be provided with key and touch control components. For example, keys may be used to perform operations such as selecting items, and touch control can serve a similar purpose. The control data may therefore also include key-press control data and touch control data for fine-grained interaction with the extended reality content.
In some embodiments of the present application, the receiving unit 31 communicates with the ring controller 201 via the Bluetooth protocol. In other examples, other wireless communication protocols may also be used for communication between the in-vehicle infotainment system 30 and the ring controller 201.
The measuring unit 32 of the in-vehicle infotainment system 30 is configured to generate second inertial measurement data in the current environment. Calling the data produced by the system 30 through the measuring unit 32 "second" is intended only to indicate that the two sets of data are produced by different entities, i.e., that their sources differ; the first and second inertial measurement data may nevertheless share the same data format and so on. Since the in-vehicle infotainment system 30 is fixed relative to the vehicle 100, the second inertial measurement data is in fact a characterization, measured by the system 30, of the motion of the vehicle 100.
The generating unit 33 of the in-vehicle infotainment system 30 is configured to generate, based on the first inertial measurement data and the second inertial measurement data, a control ray for interacting with the extended reality content. As described above, the first inertial measurement data received by the receiving unit 31 reflects the superposition of the torso movement of the object 200 and the motion of the vehicle 100, while the second inertial measurement data produced by the measuring unit 32 characterizes the motion of the vehicle 100 as measured by the system 30. The influence of the second inertial measurement data can therefore be subtracted from the first inertial measurement data, and the result is the interaction that the object 200 intends to perform with the extended reality screen 300. This interaction may be expressed as the control rays 301 and 302 in FIG. 1.
For example, in some examples, the displacement of the control ray 301 or 302 can be obtained by subtracting from the first inertial measurement data the displacement along a certain direction (the direction of motion of the vehicle 100), where the displacement along that direction can be derived from the second inertial measurement data. Specifically, if the second inertial measurement data shows a large displacement in a certain direction, that displacement is likely caused by the motion of the vehicle 100, whereas smaller displacements of the second inertial measurement data in other directions may be caused by bumps of the vehicle 100 and the like. The displacements of the first inertial measurement data in those other directions can therefore be filtered with an anti-shake algorithm; for example, they can be smoothed. The smoothing may follow existing schemes and is not described in detail here.
In some embodiments of the present application, the generating unit 33 is configured to correct the first inertial measurement data using the second inertial measurement data, so as to generate drawing data for the control ray. In some examples, this correction can be expressed mathematically as operations on motion vectors in the three spatial dimensions. In other examples, the correction may be performed in a way that cannot be expressed explicitly as a mathematical formula; for example, correction schemes for different second-inertial-measurement-data scenarios can be established from prior experiments, and once the current scenario is identified as matching one of them, the correction scheme for the first inertial measurement data can be determined by, for example, a table lookup. In addition, this "correction" also filters out the interference caused by vehicle bumps.
In some embodiments of the present application, the generating unit 33 is configured to: extract the overall motion of the current environment from the second inertial measurement data; and remove, from the first inertial measurement data, the motion data caused by that overall motion, so as to generate the drawing data for the control ray. For example, if the first inertial measurement data shows a displacement of 0.2 along the spatial X-axis at some moment while the second inertial measurement data shows a displacement of 0.1 along the same axis (the overall motion of the vehicle 100 and the object 200), it can be inferred that the ring controller 201 was displaced by 0.2 - 0.1 = 0.1. In other examples, considering that the object 200 also performs a certain degree of self-correction when the speed changes sharply, the displacement of the ring controller 201 may be inferred as 0.2 - 0.1 * 0.8 = 0.12, where 0.8 is a correction coefficient. These examples are for illustration only; the actual situation is much more complex, and the coefficient, for instance, need not be fixed at 0.8 but may be adjusted with acceleration.
In some embodiments of the present application, the generating unit 33 is based on a neural network, and the generating unit 33 inputs the first inertial measurement data and the second inertial measurement data into the neural network to generate the drawing data. Processing inertial measurement data with neural network techniques is adaptive and self-learning; such processing can also capture environmental characteristics automatically, tolerates faults well, and resists interference. Since a neural network does not require an explicit mathematical relationship to be established, the development process is also greatly shortened.
在本申请的一些实施例中,生成单元33还配置成将绘制数据发送至扩展现实内容的渲染设备供渲染控制射线。由车机系统30产生的绘制数据是控制射线301和302的数学表达,可以发送到头戴渲染设备实现成像,便于对象200根据控制射线301和302实现与扩展现实画面300的交互。In some embodiments of the present application, the generation unit 33 is further configured to send the drawing data to the rendering device of the extended reality content for rendering the control ray. The drawing data generated by the vehicle system 30 is a mathematical expression of the control rays 301 and 302, which can be sent to the head-mounted rendering device to realize imaging, so that the object 200 can interact with the extended reality screen 300 according to the control rays 301 and 302.
如上文所述,扩展现实内容可以是增强现实内容、虚拟现实内容,还可以是二者的某种组合形式。扩展现实内容还可以是以后开发的其他扩展现实方案产生的视觉内容。As mentioned above, the extended reality content can be augmented reality content, virtual reality content, or some combination of the two. The extended reality content can also be visual content generated by other extended reality solutions developed in the future.
本申请的另一方面提供了一种车机系统40。如图4所示,车机系统40包括存储器41和处理器42。其中,处理器42可以从存储器41中读取数据并且可以向其中写入数据。存储器41被配置成存储指令,而处理器42被配置成在执行存储器41中存储的指令时将使得执行如上文所述的任意一种交互方法。存储器41可以具有如下文所述的计算机可读存储介质的特征,详细内容将在下文加以描述。Another aspect of the present application provides a vehicle system 40. As shown in FIG4 , the vehicle system 40 includes a memory 41 and a processor 42. The processor 42 can read data from the memory 41 and can write data thereto. The memory 41 is configured to store instructions, and the processor 42 is configured to execute any one of the interaction methods described above when executing the instructions stored in the memory 41. The memory 41 can have the characteristics of a computer-readable storage medium as described below, and the details will be described below.
本申请的另一方面提供了一种车辆。车辆包括如上文所述的任意一种车机系统。Another aspect of the present application provides a vehicle, wherein the vehicle includes any vehicle system as described above.
根据本申请的另一方面,提供一种计算机可读存储介质,其中存储有指令,当所述指令由处理器执行时,使得所述处理器执行如上文所述的任意一种交互方法。本申请中所称的计算机可读介质包括各种类型的计算机存储介质,可以是通用或专用计算机能够存取的任何可用介质。举例而言,计算机可读介质可以包括RAM、ROM、EPROM、E2PROM、寄存器、硬盘、可移动盘、CD-ROM或其他光盘存储器、磁盘存储器或其他磁存储设备、或者能够用于携带或存储具有指令或数据结构形式的期望的程序代码单元并能够由通用或专用计算机、或者通用或专用处理器进行存取的任何其他临时性或者非临时性介质。如本文所使用的盘通常磁性地复制数据,而碟则用激光来光学地复制数据。上述的组合也应当包括在计算机可读介质的保护范围之内。示例性存储介质耦合到处理器以使得该处理器能从/向该存储介质读写信息。在替换方案中,存储介质可以被整合到处理器。处理器和存储介质可驻留在ASIC中。ASIC可驻留在用户终端中。在替换方案中,处理器和存储介质可作为分立组件驻留在用户终端中。According to another aspect of the present application, a computer-readable storage medium is provided, in which instructions are stored; when the instructions are executed by a processor, they cause the processor to perform any one of the interaction methods described above. The computer-readable medium referred to in this application includes various types of computer storage media, and can be any available medium that can be accessed by a general-purpose or special-purpose computer. By way of example, the computer-readable medium may include RAM, ROM, EPROM, E2PROM, registers, a hard disk, a removable disk, a CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other transitory or non-transitory medium that can be used to carry or store desired program code units in the form of instructions or data structures and that can be accessed by a general-purpose or special-purpose computer, or by a general-purpose or special-purpose processor. As used herein, a disk usually reproduces data magnetically, while a disc reproduces data optically with a laser. Combinations of the above should also be included within the scope of computer-readable media. An exemplary storage medium is coupled to the processor so that the processor can read and write information from/to the storage medium. In the alternative, the storage medium may be integrated into the processor. The processor and the storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In the alternative, the processor and the storage medium may reside in a user terminal as discrete components.
以上仅为本申请的具体实施方式,但本申请的保护范围并不局限于此。本领域的技术人员可以根据本申请所披露的技术范围想到其他可行的变化或替换,此等变化或替换皆涵盖于本申请的保护范围之中。在不冲突的情况下,本申请的实施方式及实施方式中的特征还可以相互组合。本申请的保护范围以权利要求的记载为准。 The above are only specific implementations of the present application, but the protection scope of the present application is not limited thereto. Those skilled in the art can think of other feasible changes or substitutions based on the technical scope disclosed in the present application, and such changes or substitutions are all included in the protection scope of the present application. In the absence of conflict, the implementation modes of the present application and the features in the implementation modes can also be combined with each other. The protection scope of the present application shall be subject to the description of the claims.

Claims (20)

  1. 一种交互方法,其特征在于,所述方法包括:An interactive method, characterized in that the method comprises:
    接收关于扩展现实内容的控制数据,其中所述控制数据包括在当前环境下的第一惯性测量数据;receiving control data regarding the extended reality content, wherein the control data includes first inertial measurement data in a current environment;
    在所述当前环境下产生第二惯性测量数据;以及generating second inertial measurement data under the current environment; and
    基于所述第一惯性测量数据和所述第二惯性测量数据生成用于与所述扩展现实内容交互的控制射线。A control ray for interacting with the extended reality content is generated based on the first inertial measurement data and the second inertial measurement data.
  2. 根据权利要求1所述的方法,其中,所述控制数据还包括与所述扩展现实内容交互的按键控制数据和/或触摸控制数据。The method according to claim 1, wherein the control data further includes key control data and/or touch control data for interacting with the extended reality content.
  3. 根据权利要求1所述的方法,其中,所述控制数据来自指环控制器。The method of claim 1, wherein the control data is from a finger ring controller.
  4. 根据权利要求1所述的方法,其中,基于所述第一惯性测量数据和所述第二惯性测量数据生成用于与所述扩展现实内容交互的控制射线包括:The method according to claim 1, wherein generating a control ray for interacting with the extended reality content based on the first inertial measurement data and the second inertial measurement data comprises:
    利用所述第二惯性测量数据对所述第一惯性测量数据进行修正,以生成关于所述控制射线的绘制数据。The first inertial measurement data is corrected using the second inertial measurement data to generate rendering data related to the control ray.
  5. 根据权利要求4所述的方法,还包括:将所述绘制数据发送至所述扩展现实内容的渲染设备供渲染所述控制射线。The method according to claim 4 further comprises: sending the drawing data to a rendering device of the extended reality content for rendering the control ray.
  6. 根据权利要求4所述的方法,其中,利用所述第二惯性测量数据对所述第一惯性测量数据进行修正包括:The method according to claim 4, wherein correcting the first inertial measurement data using the second inertial measurement data comprises:
    根据所述第二惯性测量数据提取在所述当前环境下的整体运动情况;以及Extracting the overall motion situation in the current environment according to the second inertial measurement data; and
    从所述第一惯性测量数据中去除由所述整体运动情况引起的运动数据,以生成关于所述控制射线的绘制数据。The motion data caused by the overall motion condition is removed from the first inertial measurement data to generate rendering data about the control ray.
  7. 根据权利要求4所述的方法,其中,将所述第一惯性测量数据和所述第二惯性测量数据输入到神经网络中以生成所述绘制数据。The method of claim 4, wherein the first inertial measurement data and the second inertial measurement data are input into a neural network to generate the rendering data.
  8. 根据权利要求1所述的方法,其中,所述扩展现实内容包括如下至少一种:增强现实内容、虚拟现实内容。The method according to claim 1, wherein the extended reality content includes at least one of the following: augmented reality content, virtual reality content.
  9. 一种车机系统,其特征在于,所述系统包括: A vehicle computer system, characterized in that the system comprises:
    接收单元,其配置成接收关于扩展现实内容的控制数据,其中所述控制数据包括在当前环境下的第一惯性测量数据;a receiving unit configured to receive control data regarding the extended reality content, wherein the control data includes first inertial measurement data in a current environment;
    测量单元,其配置成在所述当前环境下产生第二惯性测量数据;以及a measurement unit configured to generate second inertial measurement data under the current environment; and
    生成单元,其配置成基于所述第一惯性测量数据和所述第二惯性测量数据生成用于与所述扩展现实内容交互的控制射线。A generating unit is configured to generate a control ray for interacting with the extended reality content based on the first inertial measurement data and the second inertial measurement data.
  10. 根据权利要求9所述的系统,其中,所述控制数据还包括与所述扩展现实内容交互的按键控制数据和/或触摸控制数据。The system according to claim 9, wherein the control data further includes key control data and/or touch control data for interacting with the extended reality content.
  11. 根据权利要求9所述的系统,其中,所述控制数据来自指环控制器。The system of claim 9, wherein the control data is from a finger ring controller.
  12. 根据权利要求11所述的系统,其中,所述接收单元基于蓝牙协议与所述指环控制器通信。The system according to claim 11, wherein the receiving unit communicates with the ring controller based on a Bluetooth protocol.
  13. 根据权利要求9所述的系统,其中,所述生成单元被配置成利用所述第二惯性测量数据对所述第一惯性测量数据进行修正,以生成关于所述控制射线的绘制数据。The system according to claim 9, wherein the generating unit is configured to correct the first inertial measurement data by using the second inertial measurement data to generate the rendering data about the control ray.
  14. 根据权利要求13所述的系统,其中,所述生成单元还配置成将所述绘制数据发送至所述扩展现实内容的渲染设备供渲染所述控制射线。The system according to claim 13, wherein the generating unit is further configured to send the drawing data to a rendering device of the extended reality content for rendering the control ray.
  15. 根据权利要求13所述的系统,其中,所述生成单元被配置成:The system according to claim 13, wherein the generating unit is configured to:
    根据所述第二惯性测量数据提取在所述当前环境下的整体运动情况;以及Extracting the overall motion situation in the current environment according to the second inertial measurement data; and
    从所述第一惯性测量数据中去除由所述整体运动情况引起的运动数据,以生成关于所述控制射线的绘制数据。The motion data caused by the overall motion condition is removed from the first inertial measurement data to generate rendering data about the control ray.
  16. 根据权利要求13所述的系统,其中,所述生成单元基于神经网络,并且所述生成单元将所述第一惯性测量数据和所述第二惯性测量数据输入到所述神经网络中以生成所述绘制数据。The system according to claim 13, wherein the generating unit is based on a neural network, and the generating unit inputs the first inertial measurement data and the second inertial measurement data into the neural network to generate the drawing data.
  17. 根据权利要求9所述的系统,其中,所述扩展现实内容包括如下至少一种:The system according to claim 9, wherein the extended reality content includes at least one of the following:
    增强现实内容、虚拟现实内容。Augmented reality content, virtual reality content.
  18. 一种车机系统,其特征在于,所述系统包括:A vehicle computer system, characterized in that the system comprises:
    存储器,其配置成存储指令;以及a memory configured to store instructions; and
    处理器,其配置成执行所述指令以便执行如权利要求1-8中任一项所述的方法。A processor configured to execute the instructions so as to perform the method according to any one of claims 1 to 8.
  19. 一种车辆,其特征在于,所述车辆包括如权利要求9-18中任一项所述的车机系统。 A vehicle, characterized in that the vehicle comprises the vehicle system as described in any one of claims 9-18.
  20. 一种计算机可读存储介质,所述计算机可读存储介质中存储有指令,其特征在于,当所述指令由处理器执行时,使得所述处理器执行如权利要求1-8中任一项所述的方法。 A computer-readable storage medium stores instructions, wherein when the instructions are executed by a processor, the processor executes the method according to any one of claims 1 to 8.
PCT/CN2023/123723 2022-10-19 2023-10-10 Interaction method, in-vehicle infotainment system, vehicle comprising same, and storage medium WO2024082996A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211281641.X 2022-10-19
CN202211281641.XA CN115686205A (en) 2022-10-19 2022-10-19 Interaction method, vehicle-mounted machine system, vehicle comprising vehicle-mounted machine system and storage medium

Publications (1)

Publication Number Publication Date
WO2024082996A1 true WO2024082996A1 (en) 2024-04-25

Family

ID=85065835

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/123723 WO2024082996A1 (en) 2022-10-19 2023-10-10 Interaction method, in-vehicle infotainment system, vehicle comprising same, and storage medium

Country Status (2)

Country Link
CN (1) CN115686205A (en)
WO (1) WO2024082996A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115686205A (en) * 2022-10-19 2023-02-03 蔚来汽车科技(安徽)有限公司 Interaction method, vehicle-mounted machine system, vehicle comprising vehicle-mounted machine system and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110603167A (en) * 2017-05-05 2019-12-20 奥迪股份公司 Mobile sensor device for a head-mounted visual output device usable in a vehicle and method for operating a display system
CN111149041A (en) * 2017-09-26 2020-05-12 奥迪股份公司 Method for operating a head-mountable electronic display device and display system for displaying virtual content
US20210174590A1 (en) * 2019-12-09 2021-06-10 At&T Intellectual Property I, L.P. Cognitive stimulation in vehicles
CN114003126A (en) * 2021-09-26 2022-02-01 歌尔光学科技有限公司 Interaction control method, device and equipment for virtual reality equipment
CN115080484A (en) * 2022-06-24 2022-09-20 蔚来汽车科技(安徽)有限公司 Vehicle machine system, data processing method thereof and storage medium
CN115686205A (en) * 2022-10-19 2023-02-03 蔚来汽车科技(安徽)有限公司 Interaction method, vehicle-mounted machine system, vehicle comprising vehicle-mounted machine system and storage medium


Also Published As

Publication number Publication date
CN115686205A (en) 2023-02-03
