WO2024082996A1 - Interaction method, in-vehicle infotainment system, vehicle comprising same, and storage medium - Google Patents

Interaction method, in-vehicle infotainment system, vehicle comprising same, and storage medium

Info

Publication number
WO2024082996A1
WO2024082996A1 (PCT/CN2023/123723)
Authority
WO
WIPO (PCT)
Prior art keywords
inertial measurement
measurement data
data
control
reality content
Prior art date
Application number
PCT/CN2023/123723
Other languages
English (en)
Chinese (zh)
Inventor
孙艘
袁安贝
刘潇
Original Assignee
蔚来汽车科技(安徽)有限公司
Priority date
Filing date
Publication date
Application filed by 蔚来汽车科技(安徽)有限公司
Publication of WO2024082996A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/023Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • the present application relates to the field of extended reality display, and more specifically, to an interaction method, a vehicle system, a vehicle including the same, and a storage medium.
  • Bluetooth ring controllers are commonly used external control devices in the field of extended reality, but they are rarely used in vehicle environments.
  • Embodiments of the present application provide an interaction method, a vehicle system, a vehicle including the same, and a storage medium, for improving the operating accuracy of a control device such as a ring controller in a vehicle environment.
  • an interaction method includes: receiving control data about extended reality content, wherein the control data includes first inertial measurement data in a current environment; generating second inertial measurement data in the current environment; and generating a control ray for interacting with the extended reality content based on the first inertial measurement data and the second inertial measurement data.
  • control data also includes key control data and/or touch control data for interacting with the extended reality content.
  • control data comes from a ring controller.
  • generating a control ray for interacting with the extended reality content based on the first inertial measurement data and the second inertial measurement data includes: correcting the first inertial measurement data using the second inertial measurement data to generate drawing data about the control ray.
  • the method further includes: sending the drawing data to a rendering device of the extended reality content for rendering the control ray.
  • in some embodiments, using the second inertial measurement data to correct the first inertial measurement data comprises: extracting the overall motion situation in the current environment according to the second inertial measurement data; and removing the motion data caused by the overall motion situation from the first inertial measurement data to generate drawing data about the control ray.
  • the first inertial measurement data and the second inertial measurement data are input into a neural network to generate the drawing data.
  • the extended reality content includes at least one of the following: augmented reality content, virtual reality content.
  • a vehicle system includes: a receiving unit configured to receive control data about extended reality content, wherein the control data includes first inertial measurement data in a current environment; a measuring unit configured to generate second inertial measurement data in the current environment; and a generating unit configured to generate a control ray for interacting with the extended reality content based on the first inertial measurement data and the second inertial measurement data.
  • control data also includes key control data and/or touch control data for interacting with the extended reality content.
  • control data comes from a ring controller.
  • the receiving unit communicates with the ring controller based on a Bluetooth protocol.
  • the generating unit is configured to correct the first inertial measurement data using the second inertial measurement data to generate drawing data about the control ray.
  • the generating unit is further configured to send the drawing data to a rendering device of the extended reality content for rendering the control ray.
  • the generating unit is configured to: extract the overall motion situation in the current environment according to the second inertial measurement data; and remove the motion data caused by the overall motion situation from the first inertial measurement data to generate drawing data about the control ray.
  • the generating unit is based on a neural network, and the generating unit inputs the first inertial measurement data and the second inertial measurement data into the neural network to generate the drawing data.
  • the extended reality content includes at least one of the following: augmented reality content, virtual reality content.
  • a vehicle system comprising: a memory configured to store instructions; and a processor configured to execute the instructions so as to perform any one of the methods described above.
  • a vehicle comprising any vehicle system as described above.
  • a computer-readable storage medium wherein instructions are stored in the computer-readable storage medium, and wherein when the instructions are executed by a processor, the processor is caused to execute any one of the methods described above.
  • the interactive method, vehicle system, vehicle including the same, and storage medium provided according to some embodiments of the present application can improve the operating accuracy of control devices such as ring controllers in a vehicle environment, thereby correctly reflecting the control intentions of the occupants and improving the user experience.
  • FIG. 1 shows an interaction scenario according to an embodiment of the present application;
  • FIG. 2 shows an interaction method according to an embodiment of the present application;
  • FIG. 3 shows a vehicle system according to an embodiment of the present application;
  • FIG. 4 shows a vehicle system according to an embodiment of the present application.
  • FIG. 1 shows an interaction scenario according to an embodiment of the present application, depicting the use of various extended reality devices (e.g., virtual reality devices, augmented reality devices) in a vehicle.
  • in such an in-vehicle scenario, the vehicle itself is also in motion, so the extended reality device moves with the vehicle and shares its speed in a certain direction.
  • unexpected vehicle movements may significantly affect the control direction of the occupant operator.
  • for example, when the occupant operator wants to point at a fixed point in the virtual reality scene, a sudden acceleration or a bump may jolt the occupant operator's torso so that the controller can no longer point at the originally intended location.
  • the vehicle 100 may include multiple seats 101, and there may be more than one occupant operator 200 (also referred to herein as object 200).
  • the number of objects 200 can be at most equal to the number of seats 101 equipped in the vehicle 100.
  • the object 200 can interact with the extended reality screen 300 by wearing a ring controller 201, and the extended reality screen 300 can be generated by the head-mounted extended reality device (rendering device) of the object 200.
  • the object 200 can wear multiple ring controllers 201 to implement complex operations.
  • the extended reality screen 300 can be shared by multiple objects 200 in the vehicle 100. At this time, each object 200 in the vehicle 100 can wear a ring controller 201, and these ring controllers 201 can interact with the extended reality screen 300.
  • the extended reality screen 300 may display control rays 301 and 302 representing the current direction of the object 200. As shown in the figure, the root (vertex) of the control ray 302 will be connected to the ring controller 201 of the object 200 or its graphical representation.
  • the extended reality screen 300 may display a virtual hand, gun, laser pen, etc. as a graphical representation of the ring controller 201 of the object 200.
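  • Purely as an illustration of what the drawing data behind such a control ray might contain, below is a minimal Python sketch; the field names and layout are assumptions for illustration and are not specified by the patent.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class ControlRay:
    """Drawing data for a control ray such as 301/302 (illustrative layout)."""
    origin: Tuple[float, float, float]     # root (vertex) of the ray, anchored at the
                                           # ring controller 201 or its graphical avatar
    direction: Tuple[float, float, float]  # unit vector pointing where the user aims
    length: float = 5.0                    # how far the ray extends into screen 300

def ray_from_pose(position, forward):
    """Build a ray whose vertex follows the controller's tracked pose."""
    # Normalize the forward vector so the renderer can scale it freely.
    norm = sum(c * c for c in forward) ** 0.5
    direction = tuple(c / norm for c in forward)
    return ControlRay(origin=tuple(position), direction=direction)
```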
  • in some implementations, the ring controller 201 communicates directly with the head-mounted extended reality device of the object 200.
  • in other implementations, the ring controller 201 may not communicate directly with the head-mounted extended reality device; instead, the control data may be processed by the vehicle system of the vehicle 100, and the processing result is reflected on the extended reality screen 300.
  • the interaction method 20 includes the following steps: receiving control data about extended reality content in step S22, wherein the control data includes first inertial measurement data in the current environment; generating second inertial measurement data in the current environment in step S24; and generating control rays for interacting with the extended reality content based on the first inertial measurement data and the second inertial measurement data in step S26.
  • the interaction method 20 can be executed by a vehicle system in the vehicle 100. By executing the above steps, the vehicle system can accurately reflect the original operation intention of the object 200 in the extended reality screen 300, thereby improving the interaction experience.
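  • Purely as a sketch of how steps S22, S24 and S26 could be wired together on the vehicle side, consider the following Python outline; every function name here is hypothetical, and the simple element-wise subtraction is only a placeholder for the correction detailed below.

```python
def interaction_method_20(receive_control_data, read_vehicle_imu, render):
    """Hypothetical wiring of steps S22/S24/S26; the helpers are injected."""
    control_data = receive_control_data()          # S22: from ring controller 201
    first_imu = control_data["imu"]                # controller + vehicle motion mixed
    second_imu = read_vehicle_imu()                # S24: vehicle system's own IMU
    drawing_data = correct(first_imu, second_imu)  # S26: remove vehicle motion
    render(drawing_data)                           # draw rays 301/302 on screen 300

def correct(first_imu, second_imu):
    # Simplest possible model: subtract the vehicle's contribution axis by axis.
    return [a - b for a, b in zip(first_imu, second_imu)]
```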
  • the interaction method 20 receives control data about the extended reality content in step S22, wherein the control data includes inertial measurement data in the current environment (referred to herein as first inertial measurement data to distinguish it from the second inertial measurement data described below).
  • the control data here is generated by the object 200 through a device such as a ring controller 201, and the control data reflects the content of the interaction that the object 200 wants to perform with the extended reality screen 300.
  • the positions of the control rays 301 and 302 are generated based on the motion data of the ring controller 201.
  • the position of the corresponding control ray can be determined based on the displacement of the ring controller 201 relative to the original position.
  • the inertial measurement data generated by the ring controller 201 can be sent to the vehicle system of the vehicle 100 in real time for processing.
  • the control data (specifically, the first inertial measurement data) received in step S22 will reflect the superposition properties of the torso movement of the object 200 and the movement of the vehicle 100 to a certain extent.
  • the movement of the vehicle 100 is not what the object 200 expects when operating the ring controller 201. In other words, the movement of the vehicle 100 "interferes" with the object 200 operating the ring controller 201, and thus this interference factor needs to be eliminated.
  • the control data received in step S22 may also include key control data and touch control data for interacting with the extended reality content.
  • the ring controller 201 may also have keys and touch control components.
  • keys can be used to perform operations such as clicking, and touch control can also achieve similar effects. Therefore, the control data may also include key control data and touch control data to achieve fine interaction with the extended reality content.
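  • To make the composition of the control data concrete, here is one hypothetical packet layout combining the first inertial measurement data with optional key and touch data; the field names and units are assumptions, not the controller's actual protocol.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Tuple

@dataclass
class ControlData:
    """One hypothetical control-data sample sent by the ring controller 201."""
    accel: Tuple[float, float, float]   # first IMU data: accelerometer (m/s^2)
    gyro: Tuple[float, float, float]    # first IMU data: gyroscope (rad/s)
    timestamp_ms: int                   # when the sample was taken
    keys_pressed: List[int] = field(default_factory=list)  # key control data
    touch_xy: Optional[Tuple[float, float]] = None         # touch control data
```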
  • the interaction method 20 generates inertial measurement data in the current environment in step S24, which is referred to herein as the second inertial measurement data.
  • the inertial measurement data generated by the vehicle system in step S24 is referred to here as the second inertial measurement data only to indicate that the two sets of data are produced by different entities, that is, the sources of the inertial measurement data are different.
  • the first inertial measurement data and the second inertial measurement data can have the same data format, etc. Since the vehicle system is fixedly arranged relative to the vehicle 100, the second inertial measurement data is actually a representation of the movement of the vehicle 100 measured by the vehicle system.
  • in step S26, the interaction method 20 generates a control ray for interacting with the extended reality content based on the first inertial measurement data and the second inertial measurement data.
  • the first inertial measurement data received in step S22 reflects the superposition property of the torso movement of the object 200 and the movement of the vehicle 100
  • the second inertial measurement data generated in step S24 is the vehicle system's measurement of the movement of the vehicle 100. Therefore, the influence of the second inertial measurement data can be subtracted from the first inertial measurement data, and the result is the interaction action that the object 200 intended to perform on the extended reality screen 300.
  • This interaction action can be represented by the control rays 301 and 302 in FIG. 1 .
  • the displacement of the control ray 301 or 302 can be obtained by subtracting the displacement in a certain direction (the direction of movement of the vehicle 100) from the first inertial measurement data, wherein the displacement in this direction can be derived through the second inertial measurement data.
  • the displacement of the first inertial measurement data in other directions can be filtered using an anti-shake algorithm.
  • the displacement of the first inertial measurement data in other directions can also be smoothed; the smoothing method can follow existing schemes and is not described in detail here.
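  • A minimal sketch of the correction just described, assuming per-axis displacements are already available and using an exponential moving average as a stand-in for the anti-shake/smoothing step; the decomposition along the travel direction and the smoothing constant are illustrative assumptions.

```python
import numpy as np

def correct_displacement(first_disp, vehicle_disp_along, vehicle_dir,
                         prev_smoothed, alpha=0.3):
    """Remove vehicle motion along its travel direction, then smooth the rest."""
    d = np.asarray(first_disp, dtype=float)
    u = np.asarray(vehicle_dir, dtype=float)
    u = u / np.linalg.norm(u)              # unit vector of the vehicle's motion
    # Subtract the vehicle-induced displacement along the direction of travel,
    # which is derived from the second inertial measurement data.
    d = d - vehicle_disp_along * u
    # Smooth the remaining components to suppress bumps (anti-shake stand-in).
    smoothed = alpha * d + (1.0 - alpha) * np.asarray(prev_smoothed, dtype=float)
    return smoothed
```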
  • the first inertial measurement data can be corrected using the second inertial measurement data to generate drawing data about the control ray.
  • this correction can be mathematically expressed as an operation of motion vectors in three dimensions of space.
  • the correction can also be performed in a way that cannot be clearly expressed as a mathematical formula.
  • a correction scheme can be established under different second inertial measurement data scenarios based on preliminary experiments. When it is confirmed that a certain second inertial measurement data scenario has been entered, the correction scheme for the first inertial measurement data can be determined by methods such as a table lookup method. In addition, this "correction" will also filter out interference caused by vehicle bumps.
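  • One way to realize such a non-analytic correction is a lookup table keyed by a coarse classification of the second inertial measurement data, as in this sketch; the scenario classes, thresholds, and coefficients are made-up placeholders, not values from the patent.

```python
# Hypothetical correction schemes established from preliminary experiments,
# keyed by a coarse description of the vehicle's motion state.
CORRECTION_TABLE = {
    "cruising": {"coeff": 0.8, "smooth": 0.2},
    "accelerating": {"coeff": 0.9, "smooth": 0.3},
    "bumpy_road": {"coeff": 0.7, "smooth": 0.5},
}

def classify_scenario(second_imu_accel):
    """Very rough classifier for which second-IMU scenario we are in."""
    ax, ay, az = second_imu_accel
    if abs(az) > 2.0:          # strong vertical motion -> bumps
        return "bumpy_road"
    if abs(ax) > 1.5:          # strong longitudinal motion -> acceleration
        return "accelerating"
    return "cruising"

def lookup_correction(second_imu_accel):
    return CORRECTION_TABLE[classify_scenario(second_imu_accel)]
```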
  • step S26 using the second inertial measurement data to correct the first inertial measurement data includes: extracting the overall motion situation in the current environment according to the second inertial measurement data; and removing the motion data caused by the overall motion situation from the first inertial measurement data to generate drawing data about the control ray.
  • for example, suppose that at a certain moment the first inertial measurement data has a displacement of 0.2 on the spatial X-axis, the second inertial measurement data has a displacement of 0.1 on the spatial X-axis at the same moment (the overall motion situation of the vehicle 100 and the object 200), and 0.8 is a correction coefficient.
  • the coefficient need not be fixed at 0.8; it can be adjusted with acceleration.
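  • Read numerically, one plausible interpretation of this example (an assumption on our part; the text does not spell out the exact formula) is that the vehicle's displacement, scaled by the coefficient, is subtracted from the controller's displacement:

```python
def corrected_x(first_x, second_x, coeff=0.8):
    """E.g. 0.2 - 0.8 * 0.1 = 0.12 remains as the occupant's own motion."""
    return first_x - coeff * second_x

def coeff_from_accel(accel_x, base=0.8, gain=0.05, lo=0.5, hi=1.0):
    """The coefficient need not be fixed; here it grows with acceleration."""
    return min(hi, max(lo, base + gain * abs(accel_x)))
```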
  • in some embodiments, in step S26 the first inertial measurement data and the second inertial measurement data are input into a neural network to generate the drawing data.
  • processing the inertial measurement data with a neural network brings adaptive self-learning, automatically captures characteristics of the environment, and offers good fault tolerance and strong anti-interference ability. Since the neural network does not require an explicit mathematical relationship to be established, the development process can be greatly shortened.
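  • A minimal PyTorch sketch of this idea, assuming 6-axis samples from each IMU are concatenated and mapped to a 3-D displacement used as drawing data; the architecture and sizes are illustrative only and are not prescribed by the patent.

```python
import torch
import torch.nn as nn

class RayCorrectionNet(nn.Module):
    """Maps (first IMU, second IMU) samples to a corrected 3-D displacement."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(12, 64),   # 6 axes from each IMU, concatenated
            nn.ReLU(),
            nn.Linear(64, 64),
            nn.ReLU(),
            nn.Linear(64, 3),    # displacement used as drawing data
        )

    def forward(self, first_imu, second_imu):
        x = torch.cat([first_imu, second_imu], dim=-1)
        return self.net(x)

# Usage: one batch of samples -> one batch of corrected displacements.
model = RayCorrectionNet()
first = torch.randn(8, 6)       # accel + gyro from the ring controller
second = torch.randn(8, 6)      # accel + gyro from the vehicle system
drawing = model(first, second)  # shape (8, 3)
```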
  • the interaction method 20 further includes the following steps (not shown in FIG. 2 ): sending the drawing data to the rendering device of the extended reality content for rendering the control ray.
  • the drawing data generated by the vehicle system is a mathematical expression of the control rays 301 and 302, which can be sent to the head-mounted rendering device to realize imaging, so that the object 200 can interact with the extended reality screen 300 according to the control rays 301 and 302.
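  • How the drawing data reaches the head-mounted rendering device is not fixed by the description; as one hedged possibility, it could be serialized and pushed over a socket as below, reusing the hypothetical ControlRay sketch shown earlier (the endpoint address and message format are invented for illustration).

```python
import json
import socket

def send_drawing_data(ray, host="192.168.1.50", port=9000):
    """Push one control-ray update to a hypothetical rendering endpoint."""
    message = json.dumps({
        "origin": list(ray.origin),
        "direction": list(ray.direction),
        "length": ray.length,
    }).encode("utf-8")
    with socket.create_connection((host, port), timeout=0.1) as sock:
        sock.sendall(message)
```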
  • the extended reality device can be a virtual reality device, an augmented reality device, etc.
  • the extended reality content in this application can be augmented reality content, virtual reality content, or a combination of the two.
  • the extended reality content can also be visual content generated by other augmented reality solutions developed in the future.
  • the vehicle system 30 includes a receiving unit 31, a measuring unit 32, and a generating unit 33.
  • the vehicle system 30 can accurately reflect the original operation intention of the object 200 in the extended reality screen 300, thereby improving the interactive experience.
  • other features of the vehicle system 30 can be implemented with reference to the interaction method 20 described above.
  • the receiving unit 31 of the vehicle system 30 is configured to receive control data about the extended reality content, wherein the control data includes the first inertial measurement data in the current environment.
  • the control data here is generated by the object 200 through a device such as a ring controller 201, and the control data reflects the interaction content that the object 200 wants to perform with the extended reality screen 300.
  • the control data received by the receiving unit 31 may also include key control data and touch control data for interacting with the extended reality content.
  • the ring controller 201 may also have keys and touch control components. For example, keys can be used to perform operations such as clicking, and touch control can also achieve similar effects. Therefore, the control data may also include key control data and touch control data to achieve fine interaction with the extended reality content.
  • the receiving unit 31 communicates with the ring controller 201 based on the Bluetooth protocol. In other examples, other wireless communication protocols may be used to implement communication between the vehicle system 30 and the ring controller 201.
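  • For illustration, receiving controller notifications over Bluetooth Low Energy might look like the following sketch using the `bleak` library; the device address and characteristic UUID are placeholders, since the ring controller's actual GATT layout is not disclosed here.

```python
import asyncio
from bleak import BleakClient

RING_ADDRESS = "AA:BB:CC:DD:EE:FF"                      # placeholder device address
IMU_CHAR_UUID = "0000ffe1-0000-1000-8000-00805f9b34fb"  # placeholder characteristic

def on_imu_notification(_sender, data: bytearray):
    # Hand the raw first-IMU payload to the receiving unit for decoding.
    print(f"received {len(data)} bytes of control data")

async def receive_control_data():
    async with BleakClient(RING_ADDRESS) as client:
        await client.start_notify(IMU_CHAR_UUID, on_imu_notification)
        await asyncio.sleep(10.0)   # listen for a while
        await client.stop_notify(IMU_CHAR_UUID)

asyncio.run(receive_control_data())
```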
  • the measuring unit 32 of the vehicle system 30 is configured to generate the second inertial measurement data in the current environment.
  • the inertial measurement data generated by the vehicle system 30 through the measuring unit 32 is referred to as the second inertial measurement data only to indicate that the two sets of data are produced by different entities, that is, the sources of the inertial measurement data are different.
  • the first inertial measurement data and the second inertial measurement data can have the same data format, etc. Since the vehicle system 30 is fixedly arranged relative to the vehicle 100, the second inertial measurement data is actually a representation of the movement of the vehicle 100 measured by the vehicle system 30.
  • the generation unit 33 of the vehicle system 30 is configured to generate a control ray for interacting with the extended reality content based on the first inertial measurement data and the second inertial measurement data.
  • the first inertial measurement data received by the receiving unit 31 reflects the superposition property of the torso movement of the object 200 and the movement of the vehicle 100
  • the second inertial measurement data generated by the measuring unit 32 is a representation of the movement of the vehicle 100 as measured by the vehicle system 30. Therefore, the influence of the second inertial measurement data can be subtracted from the first inertial measurement data, and the result is the interaction action that the object 200 intended to perform on the extended reality screen 300.
  • This interaction action can be expressed as control rays 301 and 302 in Figure 1.
  • the displacement of the control ray 301 or 302 can be obtained by subtracting the displacement in a certain direction (the direction of movement of the vehicle 100) from the first inertial measurement data, wherein the displacement in this direction can be derived through the second inertial measurement data.
  • the displacement of the first inertial measurement data in other directions can be filtered using an anti-shake algorithm.
  • the displacement of the first inertial measurement data in other directions can also be smoothed; the smoothing method can follow existing schemes and is not described in detail here.
  • the generation unit 33 is configured to correct the first inertial measurement data using the second inertial measurement data to generate drawing data about the control ray.
  • this correction can be mathematically expressed as an operation of a motion vector in three dimensions of space.
  • the correction can also be performed in a way that cannot be clearly expressed as a mathematical formula.
  • a correction scheme can be established under different second inertial measurement data scenarios based on preliminary experiments. When it is confirmed that a certain second inertial measurement data scenario has been entered, the correction scheme for the first inertial measurement data can be determined by a method such as a table lookup. In addition, this "correction" will also filter out interference caused by vehicle bumps.
  • in some embodiments, the generating unit 33 is configured to: extract the overall motion situation in the current environment according to the second inertial measurement data; and remove the motion data caused by the overall motion situation from the first inertial measurement data to generate drawing data about the control ray.
  • for example, suppose that at a certain moment the first inertial measurement data has a displacement of 0.2 on the spatial X-axis, while the second inertial measurement data has a displacement of 0.1 on the spatial X-axis at the same moment (the overall motion situation of the vehicle 100 and the object 200), with a correction coefficient of 0.8.
  • the coefficient need not be fixed at 0.8; it can be adjusted with acceleration.
  • the generating unit 33 is based on a neural network, and the generating unit 33 inputs the first inertial measurement data and the second inertial measurement data into the neural network to generate drawing data.
  • processing the inertial measurement data with a neural network brings adaptive self-learning, automatically captures environmental characteristics, and offers good fault tolerance and strong anti-interference ability. Since the neural network does not require an explicit mathematical relationship to be established, the development process can be greatly shortened.
  • the generation unit 33 is further configured to send the drawing data to the rendering device of the extended reality content for rendering the control ray.
  • the drawing data generated by the vehicle system 30 is a mathematical expression of the control rays 301 and 302, which can be sent to the head-mounted rendering device to realize imaging, so that the object 200 can interact with the extended reality screen 300 according to the control rays 301 and 302.
  • the extended reality content can be augmented reality content, virtual reality content, or a combination of the two.
  • the extended reality content can also be visual content generated by other augmented reality solutions developed in the future.
  • the vehicle system 40 includes a memory 41 and a processor 42.
  • the processor 42 can read data from the memory 41 and can write data thereto.
  • the memory 41 is configured to store instructions, and the processor 42 is configured to execute any one of the interaction methods described above when executing the instructions stored in the memory 41.
  • the memory 41 can have the characteristics of a computer-readable storage medium as described below, and the details will be described below.
  • Another aspect of the present application provides a vehicle, wherein the vehicle includes any vehicle system as described above.
  • a computer-readable storage medium wherein instructions are stored, and when the instructions are executed by a processor, the processor executes any one of the interaction methods described above.
  • the computer-readable medium referred to in this application includes various types of computer storage media, which can be any available medium that can be accessed by a general or special computer.
  • the computer-readable medium may include RAM, ROM, EPROM, EEPROM, registers, a hard disk, a removable disk, a CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other transitory or non-transitory medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a general-purpose or special-purpose computer, or a general-purpose or special-purpose processor.
  • as used here, a disk usually reproduces data magnetically, while a disc reproduces data optically with a laser.
  • combinations of the above should also be included within the scope of computer-readable media.
  • the exemplary storage medium is coupled to the processor so that the processor can read and write information from/to the storage medium.
  • the storage medium can be integrated into the processor.
  • the processor and the storage medium can reside in the ASIC.
  • the ASIC can reside in the user terminal.
  • the processor and the storage medium can reside in the user terminal as discrete components.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Mechanical Engineering (AREA)
  • User Interface Of Digital Computer (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to an interaction method (20), an in-vehicle infotainment system (30), a vehicle (100) comprising the same, and a storage medium. The interaction method (20) comprises: receiving control data about extended reality content, the control data including first inertial measurement data in the current environment (S22); generating second inertial measurement data in the current environment (S24); and generating, based on the first inertial measurement data and the second inertial measurement data, control rays for interacting with the extended reality content (S26).
PCT/CN2023/123723 2022-10-19 2023-10-10 Interaction method, in-vehicle infotainment system, vehicle comprising same, and storage medium WO2024082996A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202211281641.X 2022-10-19
CN202211281641.XA CN115686205A (zh) 2022-10-19 2022-10-19 Interaction method, in-vehicle infotainment system, vehicle comprising same, and storage medium

Publications (1)

Publication Number Publication Date
WO2024082996A1 (fr)

Family

ID=85065835

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/123723 WO2024082996A1 (fr) 2022-10-19 2023-10-10 Interaction method, in-vehicle infotainment system, vehicle comprising same, and storage medium

Country Status (2)

Country Link
CN (1) CN115686205A (fr)
WO (1) WO2024082996A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115686205A (zh) * 2022-10-19 2023-02-03 蔚来汽车科技(安徽)有限公司 Interaction method, in-vehicle infotainment system, vehicle comprising same, and storage medium

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110603167A (zh) * 2017-05-05 2019-12-20 奥迪股份公司 Mobile sensor apparatus for a head-mounted visual output device usable in a vehicle, and method for operating a display system
CN111149041A (zh) * 2017-09-26 2020-05-12 奥迪股份公司 Method for operating a head-wearable electronic display device, and display system for displaying virtual content
US20210174590A1 (en) * 2019-12-09 2021-06-10 At&T Intellectual Property I, L.P. Cognitive stimulation in vehicles
CN114003126A (zh) * 2021-09-26 2022-02-01 歌尔光学科技有限公司 Interaction control method, apparatus, and device for a virtual reality device
CN115080484A (zh) * 2022-06-24 2022-09-20 蔚来汽车科技(安徽)有限公司 In-vehicle infotainment system, data processing method therefor, and storage medium
CN115686205A (zh) * 2022-10-19 2023-02-03 蔚来汽车科技(安徽)有限公司 Interaction method, in-vehicle infotainment system, vehicle comprising same, and storage medium

Also Published As

Publication number Publication date
CN115686205A (zh) 2023-02-03

Similar Documents

Publication Publication Date Title
WO2024082996A1 Interaction method, in-vehicle infotainment system, vehicle comprising same, and storage medium
CN114502335B Method and system for trajectory optimization of nonlinear robot systems with geometric constraints
CN105555222A Medical robot arm apparatus, medical robot arm control system, medical robot arm control method, and program
Huynh et al. Direct method for updating flexible multibody systems applied to a milling robot
EP2885059B1 Dynamic magnetometer calibration
US11995536B2 Learning device, estimating device, estimating system, learning method, estimating method, and storage medium to estimate a state of vehicle-occupant with respect to vehicle equipment
CN102192740A Posture information calculation device, posture information calculation system, and posture information calculation method
CN104793017A Acceleration correction method and terminal
WO2014028789A1 Dynamic magnetometer calibration
CN111723716B Method, apparatus, system, medium, and electronic device for determining the orientation of a target object
JP2022177202A Calibration method and apparatus for laser radar and positioning device, and autonomous driving vehicle
CN113119096B Method and apparatus for adjusting the spatial position of a robot arm, robot arm, and storage medium
US10890981B2 Gesture-based vehicle control
US20190204931A1 Sign language inputs to a vehicle user interface
CN110096134A VR handle ray jitter correction method, apparatus, terminal, and medium
EP1540511A1 System and method for software-implementable simulation of nonlinear dynamic systems
Fründ et al. Using augmented reality technology to support the automobile development
KR102333768B1 Deep learning-based hand recognition augmented reality interaction apparatus and method
CN108196678A Gesture operation method and electronic device with gesture operation function
CN114489412A Method and apparatus for bias correction of a motion sensor, and interaction method
JPH11120385A Integrated 2D/3D CAD system and storage medium recording a drawing creation program
CN110813647A Five-axis motion control method and apparatus, and dispensing device
TW202040506A Virtual reality viewing-angle compensation method, non-transitory computer-readable medium, and virtual reality device
CN110162251A Image scaling method and apparatus, storage medium, and electronic device
CN113496165B User gesture recognition method and apparatus, smart hand-worn device, and storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23878988

Country of ref document: EP

Kind code of ref document: A1