CN117193515A - Equipment interaction method, equipment and vehicle - Google Patents

Equipment interaction method, equipment and vehicle

Info

Publication number
CN117193515A
Authority
CN
China
Prior art keywords
vehicle
pose data
seat
data
wearable device
Prior art date
Legal status
Pending
Application number
CN202210611765.3A
Other languages
Chinese (zh)
Inventor
余以翔
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202210611765.3A
Priority to PCT/CN2023/096129 (published as WO2023231875A1)
Publication of CN117193515A

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present application discloses a device interaction method, a device, and a vehicle, relating to the field of vehicle technologies. The method can accurately identify the relative pose between a wearable device and a specific position in a vehicle, so that the wearable device can perform corresponding processing according to that relative pose, improving the experience of using a wearable device in a vehicle. In this solution, a vehicle-mounted device establishes a communication connection with the wearable device. The vehicle-mounted device can determine pose data of a target position in the vehicle according to the vehicle pose data and the vehicle driving data, and can send the pose data of the target position to the wearable device, so that the wearable device can determine the relative pose data between itself and the target position according to the pose data of the target position.

Description

Equipment interaction method, equipment and vehicle
Technical Field
Embodiments of the present application relate to the field of vehicle technologies, and in particular, to a device interaction method, a device, and a vehicle.
Background
With the popularization of wearable devices such as smart glasses (e.g., virtual reality (VR) glasses) and smart bracelets, more and more users choose to wear such devices while traveling. For example, a user may wear VR glasses during a ride and relieve boredom through the conference, social, gaming, video, and other services that the VR glasses provide.
Currently, a wearable device often needs to recognize its own pose information in order to implement various applications based on it. For example, VR glasses need to calculate the pose of a virtual-world character from their own pose information in order to display the character in the correct position. However, when the user wears the wearable device in a moving vehicle, the pose information of the wearable device changes with the motion of the vehicle even if the user does not actively move or shake, which introduces deviations into the pose recognition of the wearable device and in turn affects the applications that rely on that pose information.
Disclosure of Invention
The device interaction method, device, and vehicle provided by the present application can not only identify the pose information of a specific position in a vehicle, but also accurately identify the relative pose information between a wearable device and that specific position, thereby improving the experience of using a wearable device in a vehicle.
In order to achieve the above purpose, the application adopts the following technical scheme:
In a first aspect, a device interaction method is provided. The method may be applied to a vehicle-mounted device that establishes a communication connection with a wearable device, and the method includes: acquiring vehicle pose data and vehicle driving data; determining pose data of a target position in the vehicle according to the vehicle pose data and the vehicle driving data; and sending the pose data of the target position to the wearable device, where the pose data of the target position is used by the wearable device to determine relative pose data between the wearable device and the target position.
In the foregoing solution, the vehicle-mounted device may determine the pose data of a target position in the vehicle by using the pose data of the vehicle and the driving data of the vehicle, where the target position may be any position in the vehicle, for example a seat position; the present application does not limit the specific target position. The vehicle-mounted device can then send the pose data of the target position to the wearable device, so that the wearable device can determine the relative pose data between itself and the target position and realize various application scenarios based on that relative pose.
For example, the wearable device can accurately identify the position of the user wearing it according to the relative pose data between the wearable device and the target position, and then implement a corresponding function based on that position, such as triggering the vehicle-mounted display screen of the corresponding seat to display entertainment information, or, upon identifying that the user is in the driver's seat, having the VR wearable device display navigation information.
For another example, the wearable device may calculate the pose of a virtual-world character according to the relative pose data between the wearable device and the target position, thereby displaying the character's pose accurately. The specific application scenario is not limited by the present application. In this way, the vehicle-mounted device sends pose information that is accurate down to a specific position in the vehicle to the wearable device, so that the wearable device can use that information to refine the subsequent applications of its own pose data, improving the experience of using the wearable device in the vehicle.
In one possible implementation, determining the pose data of the target position in the vehicle according to the vehicle pose data and the vehicle driving data may include: determining the pose data of the target position according to the vehicle pose data, the vehicle driving data, and a preset pose conversion relation. The preset pose conversion relation comprises the conversion relations between the vehicle pose data under different vehicle driving data and the pose data of different positions in the vehicle. By querying this preset conversion relation, the vehicle-mounted device can quickly identify the pose data of a specific position in the vehicle, with short calculation time and high real-time performance.
In one possible implementation, the vehicle driving data may include the steering angle of the vehicle, and the preset pose conversion relation may include the conversion relations between the vehicle pose data under different steering angles and the pose data of different positions in the vehicle. By pre-establishing the conversion relations between the vehicle pose under different steering angles and the poses at different positions in the vehicle, the vehicle-mounted device can use the current steering angle of the vehicle to quickly look up the applicable conversion relation, and thereby quickly identify the pose data of a specific position in the vehicle, with short calculation time and high real-time performance. A sketch of such a lookup appears below.
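As an illustration only: one way to realize such a preset pose conversion relation is a table keyed by discretized steering angle, each entry holding a rigid transform from the measured vehicle pose to the pose of one in-vehicle position. The sketch below is a minimal Python rendering under that assumption; the table contents, the nearest-key lookup, and all names are hypothetical rather than taken from the application.

```python
import numpy as np

# Hypothetical lookup table: discretized steering angle (degrees) -> 4x4
# homogeneous transform from the vehicle pose to one in-vehicle position.
# In practice such a table would be calibrated offline per vehicle model.
POSE_CONVERSION_TABLE = {
    0:  np.array([[1.0, 0.0, 0.0, 1.2],        # driving straight: fixed seat offset
                  [0.0, 1.0, 0.0, 0.5],
                  [0.0, 0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0]]),
    15: np.array([[0.966, -0.259, 0.0, 1.2],   # 15 deg: offset plus a small yaw
                  [0.259,  0.966, 0.0, 0.5],
                  [0.0,    0.0,   1.0, 0.0],
                  [0.0,    0.0,   0.0, 1.0]]),
    # ... further calibrated entries for other steering angles
}

def target_pose(vehicle_pose: np.ndarray, steering_angle_deg: float) -> np.ndarray:
    """Look up the nearest calibrated steering angle and apply its transform."""
    nearest = min(POSE_CONVERSION_TABLE, key=lambda a: abs(a - steering_angle_deg))
    return vehicle_pose @ POSE_CONVERSION_TABLE[nearest]
```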
In one possible implementation, the vehicle includes a first sensor configured to detect the vehicle pose data, and determining the pose data of the target position in the vehicle according to the vehicle pose data and the vehicle driving data may include: determining the pose data of the target position according to the vehicle pose data, the vehicle driving data, and the position of the first sensor in the vehicle. In this way, the vehicle-mounted device can calculate the pose data of a specific position in the vehicle in real time based on where the first sensor is installed, improving the accuracy of pose recognition for that position.
In one possible implementation, the vehicle driving data may include the steering angle of the vehicle, and determining the pose data of the target position according to the vehicle pose data, the vehicle driving data, and the position of the first sensor in the vehicle may include: determining the pose data of the target position according to the vehicle pose data, the steering angle, the position of the target position in the vehicle, and the position of the first sensor in the vehicle. In this way, the vehicle-mounted device can combine the current steering angle with the target position and the sensor position to accurately calculate, in real time, the pose data of a specific position in the vehicle, as sketched below.
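For intuition, carrying a pose or an acceleration measured at the sensor to another fixed point on the same rigid body is a standard lever-arm computation; the sketch below assumes poses are 4×4 homogeneous transforms and in-vehicle positions are fixed offsets in the body frame. The steering angle would enter through the vehicle's angular rate (e.g., via a bicycle model), which is omitted here; all names are illustrative, not the application's.

```python
import numpy as np

def pose_at_target(T_world_imu: np.ndarray,
                   p_target_body: np.ndarray,
                   p_imu_body: np.ndarray) -> np.ndarray:
    """Carry the IMU pose to another fixed point on the rigid vehicle body.

    T_world_imu: 4x4 homogeneous pose of the IMU in the world frame.
    p_target_body, p_imu_body: 3-vectors in the vehicle body frame.
    """
    T = T_world_imu.copy()
    R = T[:3, :3]                                  # orientation shared by the rigid body
    T[:3, 3] += R @ (p_target_body - p_imu_body)   # lever arm rotated into the world frame
    return T

def accel_at_target(a_imu, omega, alpha, r):
    """Rigid-body acceleration transfer a_p = a_imu + alpha x r + omega x (omega x r),
    with r the lever arm from the IMU to the target, all in the body frame."""
    return a_imu + np.cross(alpha, r) + np.cross(omega, np.cross(omega, r))
```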
In one possible implementation, the pose data of the target position may include the pose data of a plurality of seats, and the device interaction method of the present application may further include: receiving the device pose data sent by the wearable device; and determining, according to the device pose data and the pose data of the plurality of seats, the seat in the vehicle where the wearable device is located. In this way, after identifying the poses of the plurality of seats, the vehicle-mounted device can match the pose of the wearable device against them, identify the seat where the wearable device is located, and thereby realize in-vehicle positioning of the wearable device.
In one possible implementation, the pose data of the target position may be the pose data of a target seat, where the target seat is the seat in the vehicle where the wearable device is located. In this way, the vehicle-mounted device may identify only the pose of that seat and send it to the wearable device, so that the wearable device directly obtains the pose information of its own seat and can determine the relative pose data between itself and that seat. This not only realizes relative pose recognition of the wearable device with respect to the vehicle, but also improves the accuracy of the relative pose.
In a second aspect, a device interaction method is provided. The method may be applied to a wearable device that establishes a communication connection with a vehicle-mounted device, and the method includes: acquiring the device pose data of the wearable device; receiving pose data of a target position in the vehicle sent by the vehicle-mounted device, where the pose data of the target position is determined by the vehicle-mounted device according to the vehicle pose data and the vehicle driving data; and determining relative pose data between the wearable device and the target position according to the pose data of the target position and the device pose data of the wearable device.
According to the solution provided in the second aspect, when the vehicle-mounted device identifies the pose data of a target position in the vehicle by using the vehicle pose data and the vehicle driving data, the wearable device can receive that pose data, and then determine the relative pose data between itself and the target position from the pose data of the target position and its own device pose data (a minimal sketch of this composition follows). The wearable device can then realize various application scenarios based on that relative pose.
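Under the same homogeneous-transform assumption as in the first-aspect sketches, the relative pose is simply the composition of the device pose with the inverse of the target pose (names illustrative):

```python
import numpy as np

def relative_pose(T_world_target: np.ndarray, T_world_device: np.ndarray) -> np.ndarray:
    """Pose of the wearable device expressed in the target position's frame."""
    return np.linalg.inv(T_world_target) @ T_world_device
```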
For example, the wearable device may accurately identify, according to the relative pose data between itself and the target position, the position of the user wearing it, so as to trigger the vehicle-mounted display screen of the corresponding seat to display entertainment information, or, upon identifying that the user is in the driver's seat, display navigation information on the VR wearable device.
For another example, the wearable device may calculate the pose of a virtual-world character according to the relative pose data between itself and the target position, thereby displaying the character's pose accurately. The specific application scenario is not limited by the present application. In this way, the wearable device refines the subsequent applications of its own pose information by combining it with pose information that is accurate down to a specific position in the vehicle, improving the experience of using the wearable device in the vehicle.
In one possible implementation, the wearable device may be a virtual reality device, and the pose data of the target position may be the pose data of a target seat, where the target seat is the seat in the vehicle where the wearable device is located. The device interaction method of the present application may then further include: generating a virtual reality picture according to the relative pose data between the wearable device and the target seat. In this way, when the wearable device used in the vehicle is a virtual reality device, the vehicle-mounted device can identify the pose of the seat where the virtual reality device is located and send it to the device, so that the device can use that seat pose to identify the relative pose data between itself and its seat. The virtual reality device can then use this relative pose to render the virtual reality picture more accurately, improving the experience of using the device; a sketch follows.
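One plausible reading of this step, sketched under the same assumptions: because the seat frame moves with the vehicle, driving the render camera from the device-relative-to-seat pose alone cancels the vehicle's own motion, so only the user's head movement changes the rendered picture. The function below is illustrative, not the application's rendering pipeline.

```python
import numpy as np

def view_matrix(T_seat_device: np.ndarray) -> np.ndarray:
    """Build a render view matrix from the device pose relative to its seat.

    T_seat_device already excludes the vehicle's motion, so the virtual world
    stays still while the car turns or brakes; only head motion moves the view.
    """
    return np.linalg.inv(T_seat_device)
```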
In one possible implementation, the pose data of the target position may include the pose data of a plurality of seats, and the device interaction method of the present application may further include: determining, according to the relative pose data between the wearable device and the plurality of seats, the seat in the vehicle where the wearable device is located. In this way, the vehicle-mounted device can identify the poses of the plurality of seats and send them to the wearable device, so that the wearable device can match its own pose against them and take the best-matching seat as the seat it occupies, realizing in-vehicle positioning of the wearable device.
In one possible implementation, determining the seat where the wearable device is located according to the relative pose data between the wearable device and the plurality of seats may include: determining the seat corresponding to the smallest relative pose data as the seat in the vehicle where the wearable device is located. Through the smallest relative pose data, the wearable device identifies, among the plurality of seats, the seat whose pose is closest to its own as the seat it occupies, realizing in-vehicle positioning of the wearable device.
In one possible implementation, determining the seat corresponding to the smallest relative pose data as the seat where the wearable device is located may include: determining, according to the variances of the relative pose data corresponding to the plurality of seats, the seat with the smallest variance as the seat in the vehicle where the wearable device is located. In other words, the wearable device matches its own pose trace against the pose traces of the seats, computes the variance of each residual, and locates itself at the seat with the smallest variance, as sketched below.
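A minimal sketch of such variance-based matching, assuming both sides log accelerations over a common time window; the array shapes, seat identifiers, and function names are all assumptions for illustration.

```python
import numpy as np

def locate_seat(device_accels: np.ndarray, seat_accels: dict) -> str:
    """Pick the seat whose motion trace best matches the device's.

    device_accels: (N, 3) device accelerations over a time window.
    seat_accels: mapping seat id -> (N, 3) seat accelerations over the same window.
    The seat with the smallest variance of the residual trace is taken as the
    seat the wearable device occupies.
    """
    def residual_variance(seat_trace: np.ndarray) -> float:
        residual = device_accels - seat_trace
        return float(np.var(residual, axis=0).sum())

    return min(seat_accels, key=lambda seat: residual_variance(seat_accels[seat]))
```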
In one possible implementation, the wearable device may be a virtual reality device, and after determining the seat in the vehicle where the wearable device is located according to the relative pose data between the wearable device and the plurality of seats, the device interaction method of the present application may further include: generating a virtual reality picture according to the relative pose data between the wearable device and the seat where it is located. In this way, when the wearable device used in the vehicle is a virtual reality device, the vehicle-mounted device can identify the poses of the plurality of seats and send them to the device, so that the device can match its own pose against them to locate its seat, and then use the relative pose data between itself and that seat to render the virtual reality picture more accurately. This optimizes the virtual display of the wearable device while also realizing its in-vehicle positioning.
In one possible implementation, before determining the seat in the vehicle where the wearable device is located according to the relative pose data between the wearable device and the plurality of seats, the method further includes: detecting that the wearable device is in an automatic positioning mode, where the triggering condition of the automatic positioning mode includes detecting that the wearable device has not been positioned, detecting that the wearable device has moved, or detecting a repositioning instruction for the wearable device. In this way, in the automatic positioning mode, the wearable device can perform its in-vehicle positioning using the seat poses of the plurality of seats and its own device pose data.
In one possible implementation, the pose data of the target position may be the pose data of a target seat, where the target seat is the seat in the vehicle where the wearable device is located, and before determining the relative pose data between the wearable device and the target position according to the pose data of the target position and the device pose data, the method further includes: detecting that the wearable device is in a non-automatic positioning mode. In this way, when in the non-automatic positioning mode, the wearable device may receive only the pose of the seat it occupies, from which it can identify the relative pose data between itself and that seat.
In one possible implementation, before determining the seat in the vehicle where the wearable device is located according to the relative pose data between the wearable device and the plurality of seats, the method further includes: detecting that the vehicle meets a positioning success condition, where the positioning success condition indicates that the vehicle has acceleration in different directions within the horizontal plane, or has acceleration in both the horizontal plane and a vertical plane, the vertical plane being perpendicular to the horizontal plane.
It will be appreciated that the pose changes at different locations in the vehicle differ only when the vehicle has acceleration in more than one direction. When the vehicle has acceleration in only one direction, the pose changes at all seat positions are identical, so the wearable device cannot position itself in the vehicle from the seat pose data of the plurality of seats. That is, the condition for the wearable device to successfully perform in-vehicle positioning is that the vehicle has acceleration in different directions: either different directions within the same horizontal plane, or directions in two different planes, namely the horizontal plane and the vertical plane. On this basis, the wearable device can define a positioning success condition, so that when the vehicle meets it, the wearable device can successfully position itself in the vehicle using the seat poses of the plurality of seats and its own device pose data.
In one possible implementation, the pose data of the target position includes the acceleration of the target position, and the positioning success condition may include the vehicle being in at least one of a turning, uphill, or downhill driving state, where the accelerations at different positions in the vehicle differ in such a state. Because turning adds acceleration in the horizontal direction and driving up or down a slope adds acceleration in the vertical direction, the pose changes at the various positions of the vehicle then differ, so the wearable device can successfully position itself in the vehicle using the seat poses of the plurality of seats and its own device pose data while the vehicle is turning or on a slope. A sketch of such a check follows.
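Purely as an illustration, a heuristic check of this positioning success condition over a short window of gravity-compensated vehicle accelerations might look like the following; the axis convention (x forward, y left, z up), the threshold, and all names are assumptions, not the application's.

```python
import numpy as np

def positioning_possible(accels: np.ndarray, threshold: float = 0.5) -> bool:
    """Check whether seat traces are distinguishable from an (N, 3) window of
    vehicle accelerations with gravity removed, in m/s^2."""
    longitudinal = np.abs(accels[:, 0]).max() > threshold   # braking/accelerating
    lateral = np.abs(accels[:, 1]).max() > threshold        # turning
    vertical = np.abs(accels[:, 2]).max() > threshold       # up/down a slope

    # Accelerations in two horizontal directions, or horizontal plus vertical.
    return (longitudinal and lateral) or ((longitudinal or lateral) and vertical)
```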
In a third aspect, the present application provides a vehicle-mounted device that establishes a communication connection with a wearable device. The vehicle-mounted device may include: a data acquisition unit configured to acquire vehicle pose data and vehicle driving data; a data calculation unit configured to determine pose data of a target position in the vehicle according to the vehicle pose data and the vehicle driving data; and a data sending unit configured to send the pose data of the target position to the wearable device, where the pose data of the target position is used by the wearable device to determine relative pose data between the wearable device and the target position.
In one possible implementation, the data calculation unit may be configured to determine the pose data of the target position in the vehicle according to the vehicle pose data, the vehicle driving data, and a preset pose conversion relation, where the preset pose conversion relation comprises the conversion relations between the vehicle pose data under different vehicle driving data and the pose data of different positions in the vehicle. By querying this preset conversion relation, the vehicle-mounted device can quickly identify the pose data of a specific position in the vehicle, with short calculation time and high real-time performance.
In one possible implementation, the vehicle driving data may include the steering angle of the vehicle, and the preset pose conversion relation may include the conversion relations between the vehicle pose data under different steering angles and the pose data of different positions in the vehicle. By pre-establishing these conversion relations, the vehicle-mounted device can use the current steering angle of the vehicle to quickly look up the applicable conversion relation, and thereby quickly identify the pose data of a specific position in the vehicle, with short calculation time and high real-time performance.
In one possible implementation, the vehicle includes a first sensor for detecting the vehicle pose data, and the data calculation unit may also be configured to determine the pose data of the target position in the vehicle according to the vehicle pose data, the vehicle driving data, and the position of the first sensor in the vehicle. In this way, the vehicle-mounted device can calculate the pose data of a specific position in the vehicle in real time based on where the first sensor is installed, improving the accuracy of pose recognition for that position.
In one possible implementation, the vehicle driving data may include the steering angle of the vehicle, and the data calculation unit may be configured to determine the pose data of the target position according to the vehicle pose data, the steering angle, the position of the target position in the vehicle, and the position of the first sensor in the vehicle. In this way, the vehicle-mounted device can combine the current steering angle with the target position and the sensor position to accurately calculate, in real time, the pose data of a specific position in the vehicle.
In one possible implementation, the pose data of the target position may include the pose data of a plurality of seats, and the vehicle-mounted device of the present application may further include: a data receiving unit configured to receive the device pose data sent by the wearable device; and a device positioning unit configured to determine, according to the device pose data and the pose data of the plurality of seats, the seat in the vehicle where the wearable device is located. In this way, after identifying the poses of the plurality of seats, the vehicle-mounted device can match the pose of the wearable device against them, identify the seat where the wearable device is located, and thereby realize in-vehicle positioning of the wearable device.
In one possible implementation, the pose data of the target position may be the pose data of a target seat, where the target seat is the seat in the vehicle where the wearable device is located. In this way, the vehicle-mounted device may identify only the pose of that seat and send it to the wearable device, so that the wearable device directly obtains the pose information of its own seat and can determine the relative pose data between itself and that seat. This not only realizes relative pose recognition of the wearable device with respect to the vehicle, but also improves the accuracy of the relative pose.
In a fourth aspect, the present application provides a wearable device that establishes a communication connection with a vehicle-mounted device. The wearable device may include: a data acquisition unit configured to acquire the device pose data of the wearable device; a data receiving unit configured to receive pose data of a target position in the vehicle sent by the vehicle-mounted device, where the pose data of the target position is determined by the vehicle-mounted device according to the vehicle pose data and the vehicle driving data; and a data calculation unit configured to determine relative pose data between the wearable device and the target position according to the pose data of the target position and the device pose data of the wearable device.
In one possible implementation, the wearable device may be a virtual reality device, and the pose data of the target position may be the pose data of a target seat, where the target seat is the seat in the vehicle where the wearable device is located. The wearable device of the present application may further include: a virtual rendering unit configured to generate a virtual reality picture according to the relative pose data between the wearable device and the target seat. In this way, when the wearable device used in the vehicle is a virtual reality device, the vehicle-mounted device can identify the pose of the seat where the virtual reality device is located and send it to the device, so that the device can use that seat pose to identify the relative pose data between itself and its seat, and then use this relative pose to render the virtual reality picture more accurately, improving the experience of using the virtual reality device.
In one possible implementation, the pose data of the target position may include the pose data of a plurality of seats, and the wearable device of the present application may further include: a device positioning unit configured to determine, according to the relative pose data between the wearable device and the plurality of seats, the seat in the vehicle where the wearable device is located. In this way, the vehicle-mounted device can identify the poses of the plurality of seats and send them to the wearable device, so that the wearable device can match its own pose against them and take the best-matching seat as the seat it occupies, realizing in-vehicle positioning of the wearable device.
In one possible implementation, the device positioning unit may also be configured to determine, according to the relative pose data between the wearable device and the plurality of seats, the seat corresponding to the smallest relative pose data as the seat in the vehicle where the wearable device is located. Through the smallest relative pose data, the wearable device identifies, among the plurality of seats, the seat whose pose is closest to its own as the seat it occupies, realizing in-vehicle positioning of the wearable device.
In one possible implementation, the device positioning unit may also be configured to determine, according to the variances of the relative pose data corresponding to the plurality of seats, the seat with the smallest variance as the seat in the vehicle where the wearable device is located. By matching its own pose against the poses of the plurality of seats and computing the variances, the wearable device locates itself at the seat with the smallest variance, realizing its in-vehicle positioning.
In one possible implementation, the wearable device may be a virtual reality device, and the virtual rendering unit may be further configured to generate a virtual reality picture according to the relative pose data between the wearable device and the seat where it is located. In this way, when the wearable device used in the vehicle is a virtual reality device, the vehicle-mounted device can identify the poses of the plurality of seats and send them to the device, so that the device can match its own pose against them to locate its seat, and then use the relative pose data between itself and that seat to render the virtual reality picture more accurately. This optimizes the virtual display of the wearable device while also realizing its in-vehicle positioning.
In one possible implementation, the wearable device may further include a first detection unit configured to detect that the wearable device is in an automatic positioning mode, where the triggering condition of the automatic positioning mode includes detecting that the wearable device has not been positioned, detecting that the wearable device has moved, or detecting a repositioning instruction for the wearable device. In this way, in the automatic positioning mode, the wearable device can perform its in-vehicle positioning using the seat poses of the plurality of seats and its own device pose data.
In one possible implementation, the wearable device may further comprise a second detection unit configured to detect that the wearable device is in a non-automatic positioning mode. In this way, when in the non-automatic positioning mode, the wearable device may receive only the pose of the seat it occupies, from which it can identify the relative pose data between itself and that seat.
In one possible implementation, the wearable device may further include a third detection unit configured to detect that the vehicle meets a positioning success condition, where the positioning success condition indicates that the vehicle has acceleration in different directions within the horizontal plane, or has acceleration in both the horizontal plane and a vertical plane, the vertical plane being perpendicular to the horizontal plane.
It will be appreciated that the pose changes at different locations in the vehicle differ only when the vehicle has acceleration in more than one direction. When the vehicle has acceleration in only one direction, the pose changes at all seat positions are identical, so the wearable device cannot position itself in the vehicle from the seat pose data of the plurality of seats. That is, the condition for the wearable device to successfully perform in-vehicle positioning is that the vehicle has acceleration in different directions: either different directions within the same horizontal plane, or directions in two different planes, namely the horizontal plane and the vertical plane. On this basis, the wearable device can define a positioning success condition, so that when the vehicle meets it, the wearable device can successfully position itself in the vehicle using the seat poses of the plurality of seats and its own device pose data.
In one possible implementation, the pose data of the target position includes the acceleration of the target position, and the positioning success condition may include the vehicle being in at least one of a turning, uphill, or downhill driving state, where the accelerations at different positions in the vehicle differ in such a state. Because turning adds acceleration in the horizontal direction and driving up or down a slope adds acceleration in the vertical direction, the pose changes at the various positions of the vehicle then differ, so the wearable device can successfully position itself in the vehicle using the seat poses of the plurality of seats and its own device pose data while the vehicle is turning or on a slope.
In a fifth aspect, the present application provides an in-vehicle apparatus comprising one or more processors and one or more memories. The one or more memories are coupled to the one or more processors, the one or more memories being configured to store computer program code comprising computer instructions that, when executed by the one or more processors, cause the in-vehicle device to perform the device interaction method in any of the possible implementations of the first aspect described above.
In a sixth aspect, the present application provides a wearable device comprising one or more processors and one or more memories. The one or more memories are coupled with the one or more processors, the one or more memories being for storing computer program code comprising computer instructions that, when executed by the one or more processors, cause the wearable device to perform the device interaction method in any of the possible implementations of the second aspect described above.
In a seventh aspect, the present application provides an apparatus that is included in a vehicle or in a vehicle-mounted device and has the function of implementing the behavior of the vehicle-mounted device in the first aspect or any of its possible implementations. The function may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules or units corresponding to the function.
In an eighth aspect, the present application provides an interaction apparatus that is included in a wearable device and has the function of implementing the behavior of the wearable device in the second aspect or any of its possible implementations. The function may be implemented by hardware, or by hardware executing corresponding software. The hardware or software includes one or more modules or units corresponding to the function.
In a ninth aspect, the present application provides a vehicle comprising the vehicle-mounted device of the foregoing third or fifth aspect or their possible implementations, or comprising the apparatus of the foregoing seventh aspect. The vehicle may be used to implement the device interaction method in any of the possible implementations of the first aspect.
In a tenth aspect, the present application provides a computer storage medium comprising computer instructions that, when run on a vehicle-mounted device, cause the vehicle-mounted device to perform the device interaction method in any of the possible implementations of the first aspect described above, or when run on a wearable device, cause the wearable device to perform the device interaction method in any of the possible implementations of the second aspect described above.
In an eleventh aspect, the present application provides a computer program product for causing a computer to perform the method of device interaction in any of the possible implementations of the first aspect or to perform the method of device interaction in any of the possible implementations of the second aspect when the computer program product is run on a computer.
In a twelfth aspect, the present application provides a communication system, including an in-vehicle device as in any one of the possible implementations of the third aspect, and a wearable device as in any one of the possible implementations of the fourth aspect, wherein the in-vehicle device is communicatively connected with the wearable device.
It will be appreciated that, for the benefits achieved by the third aspect and any of its possible implementations, the vehicle-mounted device of the fifth aspect, the apparatus of the seventh aspect, the vehicle of the ninth aspect, the computer storage medium of the tenth aspect, the computer program product of the eleventh aspect, and the communication system of the twelfth aspect, reference may be made to the benefits of the first aspect and any of its possible implementations; for the benefits achieved by the fourth aspect and any of its possible implementations, the wearable device of the sixth aspect, the interaction apparatus of the eighth aspect, the computer storage medium of the tenth aspect, the computer program product of the eleventh aspect, and the communication system of the twelfth aspect, reference may be made to the benefits of the second aspect and any of its possible implementations. Details are not repeated here.
Drawings
Fig. 1 is a schematic view of an application scenario provided in an embodiment of the present application;
Fig. 2 is a schematic diagram of the hardware structure of an electronic device according to an embodiment of the present application;
Fig. 3 is a schematic diagram illustrating a change in the virtual picture displayed by a VR device according to an embodiment of the present application;
Fig. 4 is a schematic diagram illustrating a change in the virtual picture displayed by another VR device according to an embodiment of the present application;
Fig. 5 is a flowchart of a device interaction method according to an embodiment of the present application;
Fig. 6 is a schematic diagram of an example three-dimensional coordinate system according to an embodiment of the present application;
Fig. 7 is a schematic illustration of an example of a vehicle turning according to an embodiment of the present application;
Fig. 8 is a schematic structural diagram of a vehicle and a VR device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the accompanying drawings. The terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined with "first" or "second" may explicitly or implicitly include one or more such features. It should be understood that in the present application, "at least one" means one or more, and "a plurality of" means two or more. "And/or" describes an association relationship between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: only A, only B, or both A and B, where A and B may be singular or plural.
First, an application environment according to an embodiment of the present application will be described. Referring to fig. 1, fig. 1 (a) and fig. 1 (b) are schematic diagrams illustrating an application environment according to an embodiment of the present application. As shown in (a) of fig. 1 and (b) of fig. 1, the application environment may include a vehicle 10 and a wearable device 20 within the vehicle 10.
In an embodiment of the present application, the vehicle 10 may be a conventional vehicle for carrying people or an autonomous vehicle. An autonomous vehicle, also referred to as an unmanned vehicle or an intelligent driving vehicle, may travel in a manual mode, a fully autonomous mode, or a partially autonomous mode. When configured to travel in a fully or partially autonomous mode, an autonomous vehicle may travel through a geographic area with little or no control input from a driver.
Alternatively, the vehicle 10 may be a land vehicle, such as an automobile, bus, motorcycle, locomotive, or subway train. As shown in fig. 1 (a), the vehicle 10 is a passenger car seating four, and the seats in the vehicle 10 include a driver's seat 101-a, a front passenger seat 101-b, a rear seat 101-c, and a rear seat 101-d. Alternatively, the vehicle 10 may be a watercraft, such as a ship, hovercraft, or submarine, or a flying vehicle, such as an aircraft or helicopter. The specific type of the vehicle 10 is not limited in the embodiments of the present application.
In an embodiment of the present application, the wearable device 20 may be a terminal device that can be worn by the user 30. The user 30 is a person carried by the vehicle 10, and may be the driver in the driver's seat 101-a or a passenger in another seat of the vehicle 10. As shown in fig. 1 (a), the user 30 is an occupant of the rear seat 101-d of the vehicle 10.
Alternatively, the wearable device 20 may be a VR/augmented reality (augmented reality, AR)/Mixed Reality (MR) smart device for building a virtual world, such as VR/AR/MR smart glasses, VR/AR/MR smart helmets, or the like. As shown in fig. 1 (a) and fig. 1 (b), the wearable device 20 worn by the user 30 is VR smart glasses. Alternatively, the wearable device 20 may be a portable device with a communication function, such as a smart bracelet, a smart watch, a smart phone, or a wireless earphone. The specific type of the wearable device 20 is not limited in the embodiment of the present application.
It is to be understood that embodiments of the present application are not limited to the number of seats in the vehicle 10 nor the number of wearable devices 20 in the vehicle 10. The number of seats in the vehicle 10 and the number of wearable devices 20 in the vehicle 10 may be the same or different. The number of seats in the vehicle 10 and the number of users 30 in the vehicle 10 may be the same or different, and in addition, the number of wearable devices 20 that each user may carry is not limited by the embodiments of the present application.
In an embodiment of the present application, the vehicle 10 may be mounted with an in-vehicle apparatus 100, and the in-vehicle apparatus 100 may be a central control module of the vehicle 10, and may be used to control at least one module of the vehicle 10, such as a door, an engine, a speaker, an air conditioner, and the like.
The in-vehicle device 100 and the wearable device 20 may establish a communication connection by wired or wireless means. For example, the in-vehicle device 100 may establish a wired connection with the wearable device 20 through a universal serial bus (universal serial bus, USB) interface via a USB data transmission line. As another example, the in-vehicle device 100 may also establish a wireless connection with the wearable device 20 via a wireless fidelity (wireless fidelity, wi-Fi) protocol. The specific communication manner between the wearable device 20 and the in-vehicle device 100 is not limited in the embodiment of the present application.
In the embodiment of the present application, the in-vehicle apparatus 100 may acquire various vehicle data of the vehicle 10 (such as the vehicle speed, steering angle, gear, and turn signal state) through an in-vehicle bus. The bus may be a controller area network (controller area network, CAN) bus, or may be an Ethernet bus, a local interconnect network (local interconnect network, LIN) bus, a media oriented systems transport (media oriented system transport, MOST) bus, FlexRay, or another wired communication line.
In one possible implementation, the in-vehicle device 100 may connect via the bus to the various electronic control units (electronic control unit, ECU) within the vehicle 10, enabling communication between the in-vehicle device 100 and the other ECUs in the vehicle 10. The in-vehicle device 100 can thus read the data of each ECU through the bus, for example vehicle condition reports, driving reports, fuel consumption statistics, and driving behavior data. A sketch of such a read over a CAN bus follows.
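As a loose illustration only (the application does not specify any frame layout), reading a steering-angle frame with the python-can library might look like the sketch below; the arbitration ID, byte layout, and scaling are invented for the example and would in practice come from the vehicle's DBC definition.

```python
import can  # python-can library (assumed available)

STEERING_ID = 0x102  # hypothetical arbitration ID for the steering-angle frame

bus = can.interface.Bus(channel="can0", interface="socketcan")
msg = bus.recv(timeout=1.0)  # wait up to 1 s for a frame
if msg is not None and msg.arbitration_id == STEERING_ID:
    # Assumed encoding: signed 16-bit big-endian value in 0.1-degree units.
    steering_deg = int.from_bytes(msg.data[:2], "big", signed=True) / 10.0
    print(f"steering angle: {steering_deg:.1f} deg")
```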
Alternatively, as shown in fig. 1 (b), the in-vehicle apparatus 100 may be an in-vehicle communication terminal, also called a telematics BOX (T-BOX). The vehicle-mounted T-BOX is a box with a communication function inside the vehicle that provides a remote communication interface for the vehicle 10; it is usually hidden in the vehicle body. The vehicle-mounted T-BOX is mainly used to communicate with a background system (such as a server) or with electronic devices, enabling vehicle information display and vehicle control on the electronic device side. The electronic device may be a mobile phone, a tablet computer, a wearable device, or the like. In one possible implementation, the vehicle-mounted T-BOX may read the data of each ECU in the vehicle via the CAN bus and send the data to the electronic device over a network for viewing by the user.
Alternatively, the vehicle-mounted device 100 may be another in-vehicle device with a communication function, such as a head unit (also referred to as an in-vehicle computer or in-vehicle navigation system), a vehicle-mounted electronic control unit (electronic control unit, ECU), an on-board cruise system, or a driving recorder; the embodiment of the application does not limit the specific type of the vehicle-mounted device 100. The head unit may include a host and a display screen, which may be installed together or separately. The head unit is usually mounted inside the center console, which is the console in front of the driver's and front passenger's seats in the vehicle 10 and typically carries devices such as the dashboard, air conditioner, audio panel, storage boxes, and airbags.
In an embodiment of the present application, the vehicle 10 may also include a sensor system. The sensor system may include several sensors that sense information regarding the state of travel of the vehicle 10.
Alternatively, as shown in fig. 1 (b), the sensor system may include an inertial measurement unit (inertial measurement unit, IMU) sensor 101 for sensing pose changes (including position and orientation changes) of the vehicle 10 based on inertial acceleration.
In some embodiments, the IMU sensor 101 may include one or more of an accelerometer, a gyroscope, a magnetometer, or any other suitable six-degree-of-freedom sensor capable of sensing the motion pose of the vehicle 10, which, alone or in combination, measure motion information such as the angular velocity and acceleration of the vehicle 10. For example, the IMU sensor 101 may be a combination of an acceleration sensor and a gyroscope sensor for measuring the angular velocity and acceleration of the vehicle 10.
Optionally, the sensor system may also include a vehicle sensor 102, which may be used to sense vehicle data while the vehicle 10 is traveling. The vehicle data may include the vehicle speed and steering angle information, and may also include other information such as the gear and turn signal state. In one possible implementation, the vehicle sensors 102 may include a wheel speed sensor and a steering wheel angle sensor; the former measures the speed of the vehicle 10, and the latter measures its steering wheel angle.
In some embodiments, each sensor may be deployed in the vehicle 10 as a separate module or integrated into another in-vehicle device as one of its components; the embodiment of the application does not limit the arrangement of the sensors. For example, the IMU sensor 101 may be built into the ECU of the in-vehicle apparatus 100 or a similar device, or may be deployed in the vehicle 10 as a stand-alone ECU, in which case the IMU sensor 101 can establish communication connections with the in-vehicle device 100 and the other ECUs via the in-vehicle bus.
In an embodiment of the present application, the vehicle 10 may also include an antenna. The antenna is used for transmitting and receiving electromagnetic wave signals. Each antenna in the vehicle 10 may be used to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. It is to be understood that embodiments of the present application are not limited to the number of antennas in the vehicle 10 nor to the location of the placement of the antennas in the vehicle 10. For example, 2 antennas may be disposed within the vehicle 10, and may be disposed near the top left side and the top right side of the vehicle 10, respectively.
In the embodiment of the present application, the wireless communication function of the in-vehicle apparatus 100 may be implemented by an antenna. For example, when the in-vehicle device 100 is in wireless communication with the wearable device 20, the antenna may be used to transmit electromagnetic wave signals to the wearable device 20 and receive electromagnetic wave signals from the wearable device 20.
The in-vehicle apparatus 100 may include a wireless communication module. The wireless communication module may provide solutions for wireless communication applied to the in-vehicle device 100, including wireless local area network (wireless local area networks, WLAN) (e.g., a Wi-Fi network), Bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC), infrared (infrared, IR), and the like. The wireless communication module may be one or more devices integrating at least one communication processing module. The wireless communication module receives electromagnetic waves through the antenna, performs frequency modulation and filtering on the electromagnetic wave signals, and sends the processed signals to other modules. It can also receive a signal to be transmitted, perform frequency modulation and amplification on it, and convert it into electromagnetic waves radiated through the antenna.
In some embodiments, the antenna may be coupled with a wireless communication module of the in-vehicle device 100 such that the in-vehicle device 100 may communicate with a network and other devices through wireless communication techniques. Illustratively, when the antenna includes a Wi-Fi antenna, the in-vehicle device 100 may establish a Wi-Fi connection with the wearable device 20 through the Wi-Fi communication module and the Wi-Fi antenna and perform information interaction through the Wi-Fi connection. As shown in fig. 1 (b), the in-vehicle device 100 may transmit Wi-Fi signals 102 to the wearable device 20 and receive Wi-Fi signals 103 from the wearable device 20.
In an embodiment of the present application, the wearable device 20 may also include an IMU sensor for sensing pose changes of the wearable device 20. Optionally, the IMU sensor on the wearable device 20 may likewise include one or more of an accelerometer, a gyroscope, a magnetometer, or any other suitable six-degree-of-freedom sensor capable of sensing the motion pose of the wearable device 20, which, alone or in combination, measure motion information such as its angular velocity and acceleration. In one possible implementation, the IMU sensor on the wearable device 20 may be a combination of an acceleration sensor and a gyroscope sensor for measuring the angular velocity and acceleration of the wearable device 20.
In some embodiments, when the wearable device 20 is a VR/AR/MR smart device for building a virtual world, the wearable device 20 may also adjust the pose of a character in the virtual world in real time according to its own pose changes.
In the embodiment of the present application, the wearable device 20 may also include an antenna to implement the wireless communication function of the wearable device 20. Wherein the wearable device 20 may comprise a wireless communication module. The wireless communication module may provide a solution for wireless communication, including WLAN, BT, GNSS, FM, NFC, IR, applied on the wearable device 20.
In some embodiments, an antenna within the wearable device 20 may be coupled with a wireless communication module such that the wearable device 20 may communicate with a network and other devices through wireless communication techniques. Illustratively, when the wireless communication module in the wearable device 20 includes a Wi-Fi communication module and the antenna includes a Wi-Fi antenna, the wearable device 20 may establish a Wi-Fi connection with the in-vehicle device 100 through the Wi-Fi communication module and the Wi-Fi antenna and perform information interaction through the Wi-Fi connection. As shown in fig. 1 (b), the wearable device 20 may transmit a Wi-Fi signal 103 to the in-vehicle device 100 and receive a Wi-Fi signal 102 from the in-vehicle device 100.
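For illustration, the information interaction over such a Wi-Fi connection might be sketched as follows in Python. The patent does not specify an application protocol, so this sketch assumes a TCP service on the vehicle's Wi-Fi local area network carrying newline-delimited JSON; the host address, port, and message fields are made-up examples.

import json
import socket

def send_pose_message(host: str, port: int, message: dict) -> None:
    # One short-lived connection per message keeps the sketch simple; a real
    # system would more likely keep a long-lived connection open.
    with socket.create_connection((host, port), timeout=2.0) as sock:
        sock.sendall((json.dumps(message) + "\n").encode("utf-8"))

send_pose_message("192.168.8.1", 9000,
                  {"seat": "Post 4",
                   "omega": [0.0, 0.0, 0.2],
                   "accel": [0.9, 1.2, 9.8]})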
Fig. 2 is a schematic structural diagram of an electronic device 200 according to an embodiment of the present application. The structures of both the in-vehicle device and the wearable device may refer to the electronic device 200 shown in fig. 2. As shown in fig. 2, the electronic device 200 may include a processor 210, an external memory interface 220, an internal memory 221, a USB interface 230, a charge management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a microphone 270B, an earphone interface 270C, a sensor module 280, keys 290, a motor 291, an indicator 292, a camera 293, a display module 294, a subscriber identity module (subscriber identification module, SIM) card interface 295, and the like. The sensor module 280 may include, among other things, a pressure sensor 280A, a gyroscope sensor 280B, a barometric sensor 280C, a magnetic sensor 280D, an acceleration sensor 280E, a distance sensor 280F, a proximity light sensor 280G, a fingerprint sensor 280H, a temperature sensor 280J, a touch sensor 280K, an ambient light sensor 280L, a bone conduction sensor 280M, and the like.
It is to be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic apparatus 200. In other embodiments, the electronic device 200 may include more or fewer components than shown, or certain components may be combined, or certain components may be split, or different arrangements of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 210 may include one or more processing units. For example, the processor 210 may include an application processor (application processor, AP), a modem processor, a graphics processor (graphics processing unit, GPU), an image signal processor (image signal processor, ISP), a controller, a memory, a video codec, a digital signal processor (digital signal processor, DSP), a baseband processor, and/or a neural network processor (neural-network processing unit, NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller may be the nerve center and command center of the electronic device 200. The controller can generate operation control signals according to instruction operation codes and timing signals to complete the control of instruction fetching and instruction execution. When the electronic device 200 is an in-vehicle device, the controller may include a bus controller in some embodiments. The bus controller can be used for receiving vehicle data such as vehicle speed, rudder angle, gear, and turn signals from a vehicle bus and sending the vehicle data to other processing units for subsequent data analysis and processing.
A memory may also be provided in the processor 210 for storing instructions and data. In some embodiments, the memory in the processor 210 is a cache memory. The memory may hold instructions or data that the processor 210 has just used or uses cyclically. If the processor 210 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses, reduces the waiting time of the processor 210, and thereby improves the efficiency of the system.
In some embodiments, processor 210 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a USB interface, among others.
It should be understood that the connection relationship between the modules illustrated in this embodiment is only illustrative, and does not limit the structure of the electronic device 200. In other embodiments, the electronic device 200 may also employ different interfaces in the above embodiments, or a combination of interfaces.
The USB interface 230 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 230 may be used to connect other electronic devices, such as a wearable device. For example, when the electronic device 200 is an in-vehicle device, the electronic device 200 may establish a communication connection with the wearable device through the USB interface 230.
The wireless communication function of the electronic device 200 can be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, a modem processor, a baseband processor, and the like. Wherein the antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals.
The mobile communication module 250 may provide a solution for wireless communication including 2G/3G/4G/5G, etc., applied on the electronic device 200. The mobile communication module 250 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 250 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 250 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 250 may be disposed in the processor 210. In some embodiments, at least some of the functional modules of the mobile communication module 250 may be provided in the same device as at least some of the modules of the processor 210.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low-frequency baseband signal to the baseband processor for processing. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 270A, etc.), or displays images or videos through the display module 294. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be independent of the processor 210 and provided in the same device as the mobile communication module 250 or other functional modules.
The wireless communication module 260 may provide a solution for wireless communication applied on the electronic device 200, including WLAN, BT, GNSS, FM, NFC, IR, and the like. The wireless communication module 260 may be one or more devices integrating at least one communication processing module. The wireless communication module 260 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signals, and transmits the processed signals to the processor 210. The wireless communication module 260 may also receive a signal to be transmitted from the processor 210, frequency-modulate and amplify it, and convert it into electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 250 of electronic device 200 are coupled, and antenna 2 and wireless communication module 260 are coupled, such that electronic device 200 may communicate with a network and other devices via wireless communication techniques. The wireless communication techniques may include the Global System for Mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include a global satellite positioning system (global positioning system, GPS), a global navigation satellite system (global navigation satellite system, GLONASS), a beidou satellite navigation system (beidou navigation satellite system, BDS), a quasi zenith satellite system (quasi-zenith satellite system, QZSS) and/or a satellite based augmentation system (satellite based augmentation systems, SBAS).
The electronic device 200 may implement audio functions, such as music playing and recording, through the audio module 270, the speaker 270A, the microphone 270B, the earphone interface 270C, the application processor, and the like.
The audio module 270 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 270 may also be used to encode and decode audio signals. In some embodiments, the audio module 270 may be disposed in the processor 210, or some functional modules of the audio module 270 may be disposed in the processor 210.
Speaker 270A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The electronic device 200 may listen to music, or to hands-free conversations, through the speaker 270A.
Microphone 270B, also referred to as a "mic" or "mike," is used to convert sound signals into electrical signals. When the user speaks in the vehicle, the user's voice signal is input to the microphone 270B. The electronic device 200 may be provided with at least one microphone 270B. In some embodiments, the electronic device 200 may be provided with three, four, or more microphones 270B to implement functions such as sound signal collection, noise reduction, sound source identification, and directional recording.
The earphone interface 270C is used to connect a wired earphone. The earphone interface 270C may be the USB interface 230, or may be a 3.5 mm open mobile terminal platform (open mobile terminal platform, OMTP) standard interface or a cellular telecommunications industry association of the USA (cellular telecommunications industry association of the USA, CTIA) standard interface.
The gyro sensor 280B may be used to determine the motion posture of the electronic device 200. In some embodiments, the angular velocity of the electronic device 200 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 280B. When the electronic device 200 is an in-vehicle device, the angular velocities of the in-vehicle device about the three axes can be determined by the gyro sensor 280B. When the electronic device 200 is a wearable device, the angular velocity of the wearable device about the three axes may be determined by the gyro sensor 280B. The gyro sensor 280B may also be used for navigation and somatosensory game scenarios.
The air pressure sensor 280C is used to measure air pressure. In some embodiments, the electronic device 200 calculates altitude from barometric pressure values measured by the barometric pressure sensor 280C, aiding in positioning and navigation.
The acceleration sensor 280E may detect the magnitude of acceleration of the electronic device 200 in various directions. When the electronic apparatus 200 is an in-vehicle apparatus, the acceleration sensor 280E may detect the magnitude of acceleration of the in-vehicle apparatus in various directions. When the electronic device 200 is a wearable device, the acceleration sensor 280E may detect the magnitude of acceleration of the wearable device in various directions.
In some embodiments, gyroscope sensor 280B and acceleration sensor 280E may also be combined to form an IMU sensor. The IMU sensor is used to detect acceleration and angular velocity of the electronic device 200 in three axes (i.e., x, y, and z axes). When the electronic device 200 is an in-vehicle device, the IMU sensor may be used to detect acceleration and angular velocity of the in-vehicle device in various directions. When the electronic device 200 is a wearable device, IMU sensors may be used to detect acceleration and angular velocity of the wearable device in various directions.
A distance sensor 280F for measuring distance. The electronic device 200 may measure the distance by infrared or laser.
The electronic device 200 may implement photographing functions through an ISP, a camera 293, a video codec, a GPU, a display module 294, an application processor, and the like. In some embodiments, the electronic device 200 may include 1 or N cameras 293, N being a positive integer greater than 1.
The electronic device 200 implements display functions through a GPU, a display module 294, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display module 294 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or change display information.
When the electronic device 200 is an in-vehicle device, the display module 294 may include a display screen for displaying images, videos, or the like. In some embodiments, the electronic device 200 may include 1 or N displays, N being a positive integer greater than 1.
Wherein the display screen may comprise a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flexible light-emitting diode, FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like.
When the electronic device 200 is a wearable device, the display module 294 may include a display lens or a display mask. The display lens or the display mask can be an optical waveguide, a free-form surface prism, free space, or the like; the display lens or the display mask is the propagation path of the imaging light path and can transmit a virtual image to the human eye. Optionally, the display module 294 may also include a display screen. The display screen may include a display panel. The display panel may be an LCD, OLED, AMOLED, FLED, Mini-LED, Micro-LED, Micro-OLED, QLED, or the like.
It is understood that when the electronic device 200 is a wearable device, the display module 294 may further include other optical elements, and the specific structure of the display module 294 is not limited by the embodiment of the present application. For example, if the wearable device is used to implement the functionality of a VR device, the display module 294 may include a reflective polarizer, a 1/4 wave plate, a transflective film, a polarizer, a lens, a display screen, and the like. If the wearable device is used to implement the functions of an AR device, the display module 294 may include AR optics, such as polarizers, wave plates, lenses, and the like.
The methods in the following embodiments may be implemented in the in-vehicle apparatus and the wearable apparatus having the above-described hardware structures.
The technical solution provided by the embodiment of the present application will be specifically described below by taking the above-mentioned wearable device as a VR device as an example.
At present, because VR technology can simulate a virtual environment and bring a sense of immersion to people, demand for VR is growing across various industries. As the most common mobile space in daily life, vehicles likewise see growing demand for in-vehicle VR. Generally, in-vehicle VR can provide meeting, social, game, video, and other services to people during a ride, which can relieve boredom during the ride and may even alleviate motion sickness symptoms.
It will be appreciated that in building a virtual world, a VR device typically needs to recognize its own pose information in order to calculate the pose information of the virtual world character, so that the VR device can display the virtual world character in the correct position. However, when a user uses a VR device on a running vehicle, even if the user is stationary relative to the vehicle, the posture information of the VR device may change due to the movement of the vehicle. This results in deviations in the posture recognition of the VR device, and thus affects the accuracy with which the VR device builds the virtual world.
For example, when a user engages in a virtual meeting with a VR device on a traveling vehicle, even if the user is stationary with respect to the vehicle, the turning behavior (or the lane change behavior) of the vehicle may cause the VR device within the vehicle to recognize an angular velocity in a horizontal direction, resulting in the VR device correspondingly adjusting the posture of the virtual world character according to the recognized posture change.
Illustratively, (a) in fig. 3 and (b) in fig. 3 show schematic diagrams of changes in the virtual screen displayed by an in-vehicle VR device. As shown in fig. 3 (a) and fig. 3 (b), when a user in a rear seat of a vehicle participates in a virtual conference through VR glasses worn on the head, the user can view the virtual conference screen 300 through the VR glasses, and the virtual conference screen 300 includes a virtual character 301. The displayed posture of the virtual character 301 rotates following the rotation of the user's head.
As shown in fig. 3 (a), when the user's head faces forward, the virtual character 301 displayed by the VR glasses is in an upright, front-facing posture. As shown in fig. 3 (b), after the user turns the head, the VR glasses worn on the head recognize an angular velocity in the horizontal direction, and the VR glasses adjust the posture of the displayed virtual character 301 according to the recognized angular velocity so that the side of the character's body faces the user. That is, rotation of the user's head causes the virtual character 301 displayed by the VR glasses to rotate as well.
However, even when the user does not turn the head, i.e., when the user is stationary relative to the vehicle, turning of the vehicle may also cause the virtual character displayed by the VR glasses to rotate. Illustratively, (a) in fig. 4 and (b) in fig. 4 show schematic diagrams of changes in the virtual screen displayed by another in-vehicle VR device. As shown in fig. 4 (a) and fig. 4 (b), when a user in a rear seat of a vehicle participates in a virtual conference through VR glasses worn on the head, the user can view the virtual conference screen 400 through the VR glasses, and the virtual conference screen 400 includes a virtual character 401.
As shown in fig. 4 (a), before the vehicle turns, the virtual character 401 displayed by the VR glasses faces the user with the front of its body. If the vehicle then turns, even though the user and the worn VR glasses are stationary relative to the vehicle, the VR glasses recognize an angular velocity in the horizontal direction and adjust the posture of the displayed virtual character 401 according to the recognized angular velocity. As shown in fig. 4 (b), when the vehicle turns, even if the user and the worn VR glasses remain stationary relative to the vehicle, the posture of the virtual character 401 displayed by the VR glasses changes so that the side of its body faces the user. That is, turning of the vehicle also causes the virtual character 401 displayed by the VR glasses to rotate.
However, in practical applications, if the user and the worn VR device are stationary relative to the vehicle, the virtual world character should also remain stationary; the virtual world character should not rotate because of the turning of the vehicle. In other words, when the VR device in the vehicle recognizes an angular velocity in the horizontal direction, it needs to distinguish whether that angular velocity originates from the turning of the vehicle or the turning of the user's head.
Therefore, in view of the above-described problems, in an in-vehicle VR scenario, the VR device needs to acquire the relative posture information between the VR device and the vehicle (i.e., the posture information of the VR device with respect to the vehicle) and calculate the posture information of the virtual world character from this relative posture information, rather than relying solely on the pose information of the VR device itself.
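To make this concrete, the following is a minimal sketch, in Python, of the compensation this implies. It reduces pose data to the yaw rate (ω_z) only, and all function and variable names are illustrative rather than taken from the patent.

def virtual_character_yaw_rate(device_omega_z: float, reference_omega_z: float) -> float:
    """Yaw rate that should drive the virtual world character.

    device_omega_z:    angular velocity sensed by the VR device's own IMU.
    reference_omega_z: angular velocity of the reference position in the
                       vehicle, reported by the in-vehicle device.
    """
    # If the user is stationary relative to the vehicle, the two rates
    # cancel and the virtual character stays still even while the vehicle turns.
    return device_omega_z - reference_omega_z

# Vehicle turning at 0.2 rad/s, user's head stationary relative to the seat:
assert virtual_character_yaw_rate(0.2, 0.2) == 0.0
# User turns the head at 0.5 rad/s while the vehicle drives straight:
assert virtual_character_yaw_rate(0.5, 0.0) == 0.5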
Currently, to obtain the relative posture information of a VR device, the usual approach is to provide 4 cameras at the four corners of the VR device. Specifically, the VR device acquires images of the surrounding environment in real time through the 4 cameras, so that it can recognize its relative posture information from changes between successive image frames. However, this method places high requirements on the camera configuration of the VR device, the 4 cameras are costly, and because image recognition depends on ambient light, recognition can be inaccurate or erroneous (for example, an image change caused by a change in ambient light brightness may be misjudged as a posture change of the VR device). In addition, image acquisition raises user privacy concerns, so the user experience is poor.
In view of the above problems, the embodiment of the application provides a device interaction method, which can identify the relative gesture information of a VR device with respect to a specific position in a vehicle using only the sensors carried by the vehicle and the sensor carried by the VR device, with high accuracy of the identified relative gesture information. The technical scheme provided by the embodiment of the application can not only identify the gesture information of different positions in the vehicle, but also identify the position in the vehicle of the user using the VR device, and accurately identify the gesture change of the VR device relative to the position of the user. Therefore, when a user uses the VR device at different positions in the vehicle, the application can accurately acquire the gesture information of the VR device relative to those positions, and can also achieve accurate in-vehicle positioning of the VR device. Compared with the prior art, the application requires no additional hardware, and has low cost and good privacy.
It can be understood that, because the postures of different positions in the vehicle differ during the running of the vehicle, when constructing the posture of the virtual world character, the VR device can accurately calculate the posture information of the virtual world character by using the posture information of the VR device relative to the position of the user, rather than simply using the posture information of the VR device relative to the vehicle. This improves the posture accuracy of the in-vehicle VR virtual world character and ensures the use experience of the VR device.
The device interaction method provided by the embodiment of the application is described below by taking the vehicle-mounted device as a vehicle-mounted communication terminal T-Box as an example and combining the drawings. As shown in fig. 5, the device interaction method may include:
s510, the vehicle-mounted communication terminal acquires vehicle pose data.
In the embodiment of the application, the vehicle pose data may include motion data such as the speed, acceleration, angular velocity, and angular acceleration of the vehicle during the running of the vehicle, where each item of data includes components along the various directions of space. The vehicle pose data may also include data such as geographic coordinate data (e.g., longitude and latitude of the earth) and displacement during the running of the vehicle, so that motion data such as the speed, acceleration, angular velocity, and angular acceleration of the vehicle can be calculated from such data.
In one possible implementation, the vehicle pose data may be the acceleration data and angular velocity data of the vehicle during its travel. Alternatively, in the x, y, z three-dimensional coordinate system, the vehicle pose data of the present application may be represented by the three-axis angular velocities ω_x, ω_y, ω_z and the three-axis accelerations a_x, a_y, a_z. The three-axis angular velocities are understood to be the angular velocities of the vehicle about the three x, y, z axes, and the three-axis accelerations are understood to be the accelerations of the vehicle along the three x, y, z axes.
For example, as shown in fig. 6, the motion of the vehicle in the x, y, z three-dimensional coordinate system may include three translational motions and three rotational motions. The three translational motions include a leftward and rightward translational motion of the vehicle along the x-axis, a forward and backward translational motion along the y-axis, and an upward and downward translational motion along the z-axis. The three rotational motions include a rotational motion of the vehicle about the x-axis (the angle of rotation is also referred to as the Pitch angle, Pitch), a rotational motion about the y-axis (the Roll angle, Roll), and a rotational motion about the z-axis (the Yaw angle, Yaw).
It will be appreciated that the movement of the vehicle includes translational movement along each of the x, y, and z axes, corresponding to the accelerations a_x, a_y, a_z on the three axes. The movement of the vehicle also includes rotational movement about each of the three x, y, z axes, corresponding to the angular velocities ω_x, ω_y, ω_z about the three axes. Movement of the vehicle in three-dimensional space can thus be resolved into translation and/or rotation along the x, y, z axes. Accordingly, the pose change generated during the running of the vehicle can be represented by the three-axis angular velocities ω_x, ω_y, ω_z and the three-axis accelerations a_x, a_y, a_z of the vehicle on the x, y, z axes.
In physics, the three translational movements and three rotational movements in the x, y, z three-dimensional coordinate system are also referred to as the six degrees of freedom of an object. Thus, in some embodiments, the vehicle pose data acquired by the vehicle-mounted communication terminal may also be degree-of-freedom data of the vehicle.
Since a sensor (such as an IMU sensor) for periodically detecting changes in the posture of the vehicle is generally installed in the vehicle, and the data sensed by such a sensor generally includes the three-axis angular velocities ω_x, ω_y, ω_z and the three-axis accelerations a_x, a_y, a_z, the vehicle pose data in the embodiment of the application can be measured by this sensor. The vehicle-mounted communication terminal can acquire the vehicle pose data from the sensor.
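For illustration, the following is a minimal sketch, in Python, of one such vehicle pose sample. The field names and example values are illustrative assumptions, not taken from the patent.

from dataclasses import dataclass

@dataclass
class VehiclePose:
    omega_x: float  # angular velocity about the x-axis, rad/s (pitch rate)
    omega_y: float  # angular velocity about the y-axis, rad/s (roll rate)
    omega_z: float  # angular velocity about the z-axis, rad/s (yaw rate)
    a_x: float      # acceleration along the x-axis, m/s^2 (lateral)
    a_y: float      # acceleration along the y-axis, m/s^2 (longitudinal)
    a_z: float      # acceleration along the z-axis, m/s^2 (vertical)

# A vehicle turning gently while accelerating forward:
sample = VehiclePose(0.0, 0.0, 0.15, 0.8, 1.1, 9.8)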
Because a sensor for periodically detecting a posture change of the VR device is also typically installed in the VR device, for convenience of understanding and distinction, the sensor for detecting a posture change of the vehicle may be referred to as a first sensor, and the sensor for detecting a posture change of the VR device may be referred to as a second sensor in the embodiments of the present application.
Alternatively, the first sensor may be built in the vehicle-mounted communication terminal, so that the vehicle-mounted communication terminal may directly acquire the vehicle pose data detected by the first sensor.
Alternatively, the first sensor may also be connected to the vehicle-mounted network as an independent ECU, so that the vehicle-mounted communication terminal may read the vehicle pose data from the vehicle-mounted network. The arrangement of the first sensor is not limited in the embodiment of the application. The vehicle-mounted network may be CAN, LIN, MOST, FlexRay, Ethernet, or another vehicle-mounted network. Taking a CAN network as an example, each ECU and each vehicle-mounted device in the vehicle are connected to a CAN bus to form the vehicle-mounted network of the vehicle, and the vehicle-mounted devices and the ECUs can communicate through the CAN bus. The vehicle-mounted device can read the vehicle pose data detected by the first sensor through the CAN bus.
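As a sketch of this, reading first-sensor data off a CAN bus might look as follows in Python with the python-can package. The arbitration ID, byte layout, and scaling are hypothetical, since the real signal definitions are vehicle-specific.

import struct
import can

bus = can.interface.Bus(channel="can0", bustype="socketcan")

def read_yaw_rate() -> float:
    # Block until a frame with the (hypothetical) IMU arbitration ID arrives.
    while True:
        msg = bus.recv(timeout=1.0)
        if msg is not None and msg.arbitration_id == 0x123:
            # Assume bytes 0-1 carry the yaw rate as a big-endian signed
            # integer in units of 0.01 deg/s (a made-up layout).
            (raw,) = struct.unpack_from(">h", msg.data, 0)
            return raw * 0.01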
Alternatively, the vehicle-mounted communication terminal may acquire vehicle pose data at a fixed period, or may acquire vehicle travel data in real time. The fixed period may be set according to actual needs, and the embodiment of the present application is not limited.
S520, the vehicle-mounted communication terminal acquires vehicle running data.
In the embodiment of the application, the vehicle running data may refer to vehicle data in the running process of the vehicle. The vehicle data may include rudder angle of the vehicle, vehicle speed, gear, and the like. Since a vehicle sensor for detecting vehicle travel data is generally provided in a vehicle, the vehicle travel data in the embodiment of the application can be measured by the vehicle sensor. The vehicle sensor may include a wheel speed sensor for measuring a vehicle speed, a rudder angle sensor for measuring a rudder angle of the vehicle, and other sensors for measuring other vehicle data, and the specific type of the vehicle sensor is not limited in the embodiment of the present application.
Alternatively, since a steering wheel on a vehicle is generally interlocked with the front wheels of the vehicle, a user can operate the steering direction of the front wheels by turning the steering wheel, so that the traveling direction of the vehicle can be controlled. Accordingly, the rudder angle of the vehicle may also be referred to as a steering wheel angle, and the vehicle sensor may include a steering wheel angle sensor for measuring the steering wheel angle of the vehicle. Alternatively, the rudder angle of the vehicle may be a vehicle front wheel angle, and the vehicle sensor may also include a front wheel angle sensor for measuring the vehicle front wheel angle.
Alternatively, the vehicle sensor may be connected to the vehicle-mounted network as an independent ECU, so that the vehicle-mounted communication terminal may read the vehicle running data from the vehicle-mounted network. Alternatively, the vehicle sensor may be integrated as a component into another device in the vehicle, so that the vehicle-mounted communication terminal may acquire the vehicle running data from that device through the vehicle-mounted network. The arrangement of the vehicle sensor is not limited in the embodiment of the application.
Alternatively, the vehicle-mounted communication terminal may acquire the vehicle running data at a fixed period, or may acquire the updated vehicle running data when the vehicle running data changes. The fixed period may be set according to actual needs, and the embodiment of the present application is not limited.
Note that S510 and S520 may be performed simultaneously, or S520 may be performed before S510. The embodiment of the present application does not limit the specific sequence between S510 and S520.
It will be appreciated that if the vehicle turns (or changes lanes) during its travel, the vehicle will experience an acceleration in the horizontal direction. When the vehicle turns, the centrifugal forces at different seats in the vehicle are different, so the accelerations generated at the different seats are actually different, which means that the posture changes generated at different seats in the vehicle cabin differ during the running of the vehicle. If the VR device directly uses the vehicle pose to identify its own relative posture information, a relatively large deviation in posture recognition of the VR device is easily caused, affecting subsequent applications of the VR device.
Based on the above, in the embodiment of the application, the vehicle-mounted communication terminal can identify the gesture of the target position in the vehicle, so that the VR device can identify the relative gesture information of the VR device relative to the target position by utilizing the gesture of the target position, thereby enabling the VR device to accurately identify the relative gesture information of the VR device and the target position when the VR device is used by a user in the target position in the vehicle, improving the accuracy of gesture identification of the VR device and ensuring the use experience of the VR device. The target position may be any one or more positions in the vehicle, which may include one or more positions of seats in the cabin, or may include a position other than a seat (such as a position of a passage between seats), and the specific target position is not limited in the embodiment of the present application.
It can be understood that when the target position is the positions of a plurality of seats in the seat cabin, the vehicle-mounted communication terminal can identify the gesture information of the plurality of seats in the vehicle, so that when the VR equipment is used by a user on different seats in the seat cabin of the vehicle, the relative gesture information of the VR equipment and the current seat of the user can be accurately identified, the gesture identification accuracy of the VR equipment is improved, and the use experience of the VR equipment is ensured.
Alternatively, the in-vehicle communication terminal may determine the pose data of the target position in the vehicle by two methods. As an embodiment, the vehicle-mounted communication terminal may determine pose data of the target position in the vehicle according to a preset pose conversion relationship. As shown in fig. 5, the device interaction method may include S530A:
S530A, the vehicle-mounted communication terminal determines pose data of a target position in the vehicle according to the vehicle pose data, the vehicle driving data and a preset pose conversion relation.
The preset pose conversion relationship may refer to a mapping relationship between a pose of the vehicle under different turning degrees and poses of different positions in the vehicle.
Alternatively, when the target position is a seat position in the vehicle, the preset pose conversion relationship may also refer to a mapping relationship between the vehicle poses under different turning degrees and the seat poses of different seats in the vehicle. It is understood that the pose conversion relationship corresponding to the vehicles of different models may be different due to the difference in the number and arrangement of seats in the vehicles of different models.
Optionally, the preset pose conversion relationship may include a turning parameter, a seat parameter in the vehicle, and a mapping relationship. The turning parameter indicates the turning degree of the vehicle, and may be a rudder angle (such as 0 °, 10 °) of the vehicle, a steering wheel angle, or a turning radius for indicating the turning degree. The specific turning parameters are not limited in the embodiments of the present application. The seat parameter in the vehicle may be a seat identifier, such as Post 1, post 2, etc., for indicating a particular seat in the vehicle. The mapping relationship may be a mapping parameter involved in the transition of the vehicle pose to the seat pose.
In one possible implementation, the preset pose conversion relationship may be stored in the form of a conversion table T(Pos), so that the seat pose data of each seat in the vehicle can be obtained through the conversion rule: seat pose of the target seat D[Pos] = vehicle pose D × T(Pos). Wherein the target seat is any seat in the vehicle, and Pos is the seat identifier of the target seat in the vehicle. Illustratively, the conversion table T(Pos) may be as shown in Table 1:
TABLE 1

Rudder angle | Seat   | a_x | a_y | a_z | ω_x | ω_y | ω_z
0°           | Post 1 | k0  | k1  | k2  | k3  | k4  | k5
10°          | Post 1 | k6  | k7  | k8  | k9  | k10 | k11
0°           | Post 2 | m0  | m1  | m2  | m3  | m4  | m5
…            | …      | …   | …   | …   | …   | …   | …
Wherein k0, k1, …, k11, …, m0, m1, …, m5, … are the mapping coefficients respectively corresponding to the three-axis angular velocities ω_x, ω_y, ω_z and the three-axis accelerations a_x, a_y, a_z in the above vehicle pose. Taking the seat Post 1 in the vehicle as an example, when the rudder angle of the vehicle is 10°, the three-axis accelerations a_x' = k6 × a_x, a_y' = k7 × a_y, a_z' = k8 × a_z and the three-axis angular velocities ω_x' = k9 × ω_x, ω_y' = k10 × ω_y, ω_z' = k11 × ω_z of the seat Post 1 can be obtained based on the above conversion table T(Pos).
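A minimal sketch, in Python, of applying this conversion rule: each of the six pose components is scaled by the coefficient looked up for the current rudder angle and seat. All coefficient values below are placeholders, and a real implementation would likely interpolate between the tabulated rudder angles.

# (rudder angle in degrees, seat identifier) -> coefficients for
# (a_x, a_y, a_z, omega_x, omega_y, omega_z); every value is a placeholder.
TABLE = {
    (0, "Post 1"): (1.00, 1.00, 1.00, 1.00, 1.00, 1.00),   # k0 .. k5
    (10, "Post 1"): (1.08, 0.97, 1.00, 1.00, 1.00, 1.00),  # k6 .. k11
}

def seat_pose(vehicle_pose, rudder_angle_deg, seat_id):
    # Element-wise scaling implementing D[Pos] = D x T(Pos).
    coeffs = TABLE[(rudder_angle_deg, seat_id)]
    return tuple(d * k for d, k in zip(vehicle_pose, coeffs))

# Vehicle pose (a_x, a_y, a_z, omega_x, omega_y, omega_z) at a 10° rudder angle:
print(seat_pose((0.5, 1.2, 9.8, 0.0, 0.0, 0.2), 10, "Post 1"))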
Alternatively, the above-described pose conversion relationship may be obtained in advance by performing experiments of various driving scenes on the vehicle. As a way, when a vehicle of a target vehicle type is developed, different turning scenes (such as left lane changing, right lane changing, left turning, right turning, turning around and the like) of the vehicle can be tested in advance to measure the pose changes of the vehicle and the pose changes of each seat in the vehicle under different scenes, and then the conversion relation between the pose of the vehicle of the target vehicle type and the pose of each seat in the vehicle under different turning degrees (different rudder angle angles) can be established according to test results. In the practical application scene, after the vehicle of the target vehicle type acquires the vehicle pose data and the vehicle driving data, the seat pose data of each seat in the vehicle can be quickly identified by inquiring the preset pose conversion relation, so that the calculation time is short and the instantaneity is high.
Similarly, when the target position is a non-seat position in the vehicle, the embodiment of the application can also refer to the principle to pre-establish the conversion relation between the vehicle pose and the pose of the non-seat position in the vehicle under different turning degrees (different rudder angle angles) of the vehicle of the target vehicle type. Correspondingly, the preset pose conversion relation can comprise a turning parameter, a position parameter of a non-seat position and a mapping relation. Therefore, in an actual application scene, even if a user wearing the VR equipment is not in a seat, the vehicle of the target vehicle type can quickly identify pose data of a non-seat position in the vehicle by inquiring the preset pose conversion relation.
In some embodiments, the preset pose conversion relationship may be stored in the vehicle in advance. Alternatively, the preset pose conversion relationship may be preconfigured in the vehicle-mounted communication terminal. Alternatively, the preset pose conversion relationship may be configured in other devices in the vehicle, and obtained by the vehicle-mounted communication terminal from the other devices in the vehicle. In other embodiments, the preset pose conversion relationship may also be stored in the server, and the vehicle-mounted communication terminal requests the server to obtain the pose conversion relationship. The storage location and the acquisition mode of the specific pose conversion relationship are not limited in the embodiment of the application.
As an alternative, in some embodiments, as shown in fig. 5, S530A may be replaced with S530B:
S530B, the vehicle-mounted communication terminal determines pose data of a target position in the vehicle according to the vehicle pose data, the vehicle running data and the position of the first sensor in the vehicle.
As another embodiment, when the vehicle pose data is detected by the first sensor in the vehicle, the vehicle-mounted communication terminal may calculate the pose data of the target position in the vehicle in real time according to the position of the first sensor in the vehicle.
It is understood that the difference between the pose data of the vehicle detected by the first sensor and the pose data of the target position is related not only to the degree of turning of the vehicle but also to the difference between the position of the first sensor in the vehicle and the position of the target position in the vehicle. Therefore, the vehicle-mounted communication terminal can identify the position relation between the first sensor and the target position in the vehicle in the current vehicle turning process according to the vehicle running data and the position of the first sensor in the vehicle, and accordingly the vehicle-mounted communication terminal can correspondingly calculate the pose data of the target position in the vehicle according to the position relation.
Alternatively, when the target position is a seat position in the vehicle, the vehicle-mounted communication terminal may identify a positional relationship between the first sensor and each seat in the vehicle during a current turning of the vehicle according to the vehicle running data and the position of the first sensor in the vehicle, so that the vehicle-mounted communication terminal may correspondingly calculate the seat pose data of each seat in the vehicle according to the positional relationship.
In one possible implementation, the first sensor is mounted at a front axle center position of the vehicle, for example. The vehicle-mounted communication terminal can calculate the seat pose data of each seat in the vehicle in real time according to the turning radius of the position of the first sensor and the turning radius of the position of each seat in the vehicle.
Referring to fig. 7, fig. 7 shows a schematic diagram of a vehicle turning according to an embodiment of the present application. Taking the calculation of the seat Post 4 in the vehicle as an example, as shown in fig. 7, an xOy coordinate system is first constructed with the center position of the rear axle of the vehicle as the origin O (0, 0), and then the positions of the first sensor and each seat in the vehicle can be represented by the coordinate points in the coordinate system.
As shown in fig. 7, a coordinate point P (0, h) in the xOy coordinate system indicates an in-vehicle position where the first sensor is located (i.e., a front axle center position of the vehicle). Wherein h is the longitudinal distance between the position of the first sensor and the center position of the rear wheel axle, and r is the turning radius of the position of the first sensor. The coordinate point O' (I, 0) represents the center of a circle corresponding to the turning radius when the vehicle turns. Wherein I is the turning radius of the center position of the rear wheel axle. The coordinate point P' (x, y) indicates the in-vehicle position where the seat Post 4 is located. r' is the turning radius of the seat Post 4 position. θ is the rudder angle of the vehicle.
Since the vehicle is a rigid body, the angular velocity at every position in the vehicle is the same during the turning of the vehicle. Therefore, the angular velocity of the seat Post 4 position is ω_x' = ω_x, ω_y' = ω_y, ω_z' = ω_z, which coincides with the angular velocity in the vehicle pose data.
Since the turning of the vehicle is performed on a plane and does not involve a change in altitude (e.g., going uphill or downhill), the acceleration of the seat Post 4 on the vertical z-axis is unchanged: a_z' = a_z.
For the acceleration of the vehicle in the plane, taking the acceleration in the horizontal direction as an example: since every position of the rigid vehicle body turns about the same center O' with the same angular velocity, the centripetal acceleration at each position is proportional to its turning radius. The acceleration of the seat Post 4 on the horizontal x-axis is therefore:

a_x' = a_x × (r'/r)

Since r = √(I² + h²) and r' = √((x − I)² + y²), it can thus be deduced that:

a_x' = a_x × √((x − I)² + y²) / √(I² + h²)

Since tan θ = h/I, it follows that I = h × cot θ, so that it can be further deduced:

a_x' = a_x × √((x − h × cot θ)² + y²) / √((h × cot θ)² + h²)

As can be seen from the above formula, the vehicle-mounted communication terminal can obtain the acceleration a_x' of the seat Post 4 on the x-axis by substituting the coordinates x and y of the in-vehicle position of the seat Post 4, the longitudinal distance h between the first sensor and the rear-axle center position, and the rudder angle θ of the vehicle into the above formula.
Similarly, since the tangential acceleration at each position of the rigid vehicle body is likewise proportional to its turning radius, the vehicle-mounted communication terminal can obtain the acceleration of the seat Post 4 on the y-axis in the plane as:

a_y' = a_y × √((x − h × cot θ)² + y²) / √((h × cot θ)² + h²)
It can be understood that, in the case where the xOy coordinate system is fixed, the longitudinal distance h between the first sensor and the center position of the rear wheel axle and the coordinates x, y of the in-vehicle position of the seat Post 4 are fixed values that do not change with vehicle running behaviors such as turning, lane changing, or straight driving; the vehicle-mounted communication terminal only needs to acquire the rudder angle θ of the vehicle in real time. Accordingly, the in-vehicle position of the first sensor and the in-vehicle positions of the respective seats may be stored in the vehicle in advance, for example, preconfigured in the vehicle-mounted communication terminal.
It should be noted that the corresponding calculation formulas for the accelerations of the other seats in the vehicle on the plane can be deduced by referring to the above principle. After obtaining the vehicle rudder angle, the vehicle-mounted communication terminal can substitute the in-vehicle positions of the other seats and the in-vehicle position of the first sensor into the corresponding formulas to obtain the pose data of the other seats. The specific derivation process is not described in detail in the embodiments of the present application. Alternatively, the pose calculation formulas of the respective seats may be configured in the vehicle-mounted communication terminal in advance.
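The following is a minimal sketch, in Python, of the geometric calculation above, under the assumption (as in the derivation) that the lateral acceleration at a position scales with the ratio of turning radii; the example values are made up.

import math

def seat_lateral_acceleration(a_x: float, x: float, y: float,
                              h: float, theta_rad: float) -> float:
    # Requires theta_rad != 0: a straight-driving vehicle has no turning center.
    i = h / math.tan(theta_rad)     # I = h × cot(θ): x-offset of the turning center O'
    r = math.hypot(i, h)            # turning radius at the first sensor position P(0, h)
    r_prime = math.hypot(x - i, y)  # turning radius at the seat position P'(x, y)
    return a_x * r_prime / r

# Example: seat Post 4 at (0.4, -1.2) m, sensor 2.7 m ahead of the rear axle,
# rudder angle 10°, lateral acceleration 1.5 m/s^2 measured at the sensor.
print(seat_lateral_acceleration(1.5, 0.4, -1.2, 2.7, math.radians(10.0)))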
Alternatively, the in-vehicle communication terminal may identify only the seat pose data of a certain seat in the vehicle without identifying the seat pose data of all the seats in the vehicle. For example, the vehicle-mounted communication terminal can only identify the pose of the target seat where the VR device is located, so that unnecessary calculation amount can be reduced, and interaction efficiency can be improved.
Similarly, when the target position is a non-seat position in the vehicle, the embodiment of the application can also refer to the principle to identify the position relation of the first sensor and the non-seat position in the vehicle in the current vehicle turning process, so that the vehicle-mounted communication terminal can correspondingly calculate the pose data of the non-seat position in the vehicle according to the position relation. In this way, even if the user wearing the VR device is not in the seat, the in-vehicle communication terminal can recognize the relative pose data between the VR device and the non-seat position.
It can be understood that during the turning process (or lane changing process) of the vehicle, the vehicle generates an acceleration in the horizontal direction (i.e., the x-axis), so that the pose changes at the various positions on the vehicle differ. Therefore, the embodiment of the application can obtain the pose data of the target position in the vehicle by querying the preset pose conversion relationship, or can obtain the pose data of the target position in the vehicle by real-time calculation based on the in-vehicle position of the first sensor.
Similarly, if the vehicle generates an acceleration in the direction perpendicular to the plane (i.e., the z-axis) during its travel so that the pose changes at the various positions on the vehicle differ, for example, when the vehicle runs on an uphill section, a downhill section, or an arched bridge, the embodiment of the application can likewise pre-establish a pose conversion relationship according to the above principle, so that the pose data of the target position in the vehicle can be obtained by querying the preset pose conversion relationship. Alternatively, the embodiment of the application can refer to the above principle to calculate the pose data of the target position in the vehicle in real time from the in-vehicle position of the first sensor. The embodiments of the present application are not limited in this regard.
S540, the vehicle-mounted communication terminal sends pose data of a target position in the vehicle to the VR equipment.
In the embodiment of the application, after the vehicle-mounted communication terminal identifies the pose data of the target position in the vehicle, the pose data of the target position in the vehicle can be sent to the VR equipment through wireless connection or wired connection, so that the VR equipment can realize various applications according to the pose data of the target position in the vehicle.
Alternatively, the vehicle-mounted communication terminal and the VR device may complete transmission through wireless connection established by respective wireless communication modules. The wireless connection may be a communication connection such as bluetooth (including bluetooth low energy, classical bluetooth, etc.), WLAN, NFC, IR, etc., and the embodiment of the present application does not limit the type of wireless connection.
Optionally, the vehicle-mounted communication terminal and the VR device may also complete transmission through respective USB interfaces and a wired connection established by a cable.
Taking a wireless connection as a Wi-Fi connection for example, both the vehicle and VR device may have functionality to conduct Wi-Fi communications. When a user carries a VR device within a vehicle, the VR device may access a Wi-Fi local area network provided by the vehicle. When the VR device is connected to the vehicle Wi-Fi network, a Wi-Fi communication module on the vehicle may establish a Wi-Fi communication connection with a Wi-Fi communication module on the VR device. Optionally, the Wi-Fi communication module on the vehicle may be an independent vehicle-mounted component, or may be integrated as a component in a vehicle-mounted communication terminal or other vehicle-mounted devices.
The establishment of the wireless connection or the wired connection may be performed before S510 and S520, or may be performed after S510 and S520.
Alternatively, when the target location is a seat location within the vehicle, the in-vehicle communication terminal may send pose data for one or more seats to the VR device. In one possible implementation, the in-vehicle communication terminal may send pose data for each seat in the vehicle to the VR device. Alternatively, the vehicle-mounted communication terminal may send the seat pose data of each seat in the vehicle to the VR device in real time, or may send the seat pose data of each seat in the vehicle to the VR device in a fixed period. It will be appreciated that when multiple VR devices or other types of devices access a Wi-Fi local area network provided by the vehicle, the in-vehicle communication terminal may also send the seat pose data for each seat in the vehicle to each device that accesses the same Wi-Fi network.
Optionally, the in-vehicle communication terminal may also send seat pose data for each seat in the vehicle to the VR device upon detecting that the vehicle has acceleration in a horizontal direction (i.e., x-axis). It will be appreciated that the vehicle has acceleration in only the fore-aft direction (i.e., the y-axis) when the vehicle is straight on a flat surface. When the vehicle turns or changes track, the vehicle has acceleration in the horizontal direction (i.e., x-axis) in addition to the front-rear direction (i.e., y-axis), that is, the vehicle has acceleration in both directions of the horizontal plane. Thus, the in-vehicle communication terminal may also send seat pose data for each seat in the vehicle to the VR device upon detecting that the vehicle has acceleration in both directions (i.e., x-axis and y-axis) of the horizontal plane.
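As a sketch of this trigger condition, the check might look as follows in Python; the threshold value is a hypothetical noise floor, not taken from the patent.

LATERAL_THRESHOLD = 0.3  # m/s^2, hypothetical noise floor for detecting a turn

def should_send_seat_poses(a_x: float, a_y: float) -> bool:
    # True when the vehicle has acceleration in both directions of the
    # horizontal plane (x-axis and y-axis), i.e. it is turning or changing
    # lanes rather than driving straight.
    return abs(a_x) > LATERAL_THRESHOLD and abs(a_y) > LATERAL_THRESHOLD

print(should_send_seat_poses(0.05, 1.2))  # straight driving -> False
print(should_send_seat_poses(0.90, 1.2))  # turning          -> True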
It will be appreciated that the specific manner of transmitting the seat pose data of each seat is not limited in the embodiments of the present application. For example, the in-vehicle communication terminal may also send seat pose data for each seat in the vehicle to the VR device upon detecting that the vehicle has acceleration in a vertical direction (i.e., z-axis) or that the vehicle has acceleration in both the y-axis and z-axis.
It can be understood that when the target position is a non-seat position in the vehicle, the possible implementation manners of the seat position are also applicable to the non-seat position, and the embodiments of the present application will not be repeated.
In the embodiment of the application, after the VR device receives the pose data of the target position in the vehicle sent by the vehicle-mounted communication terminal, various applications, such as identifying the seat where the passenger is located or identifying the relative pose of the VR device and the vehicle, can be realized according to the received pose data of the target position.
In one embodiment, when the pose data of the target position identified by the vehicle-mounted communication terminal includes seat pose data of each seat in the vehicle, the VR device may implement in-vehicle positioning of the VR device according to the seat pose data of each seat. As shown in fig. 5, the device interaction method may include S550A:
The S550A, VR device determines the seat in which the VR device is located in the vehicle based on the received seat pose data for each seat.
In the embodiment of the application, after the VR device receives the seat pose data of each seat sent by the vehicle-mounted communication terminal, it can determine the relative pose data between the VR device and each seat according to the received seat pose data of each seat and the device pose data of the VR device. The VR device can then determine the seat in which it is located in the vehicle according to the relative pose data between the VR device and each seat, thereby realizing the in-vehicle positioning of the VR device. The seat in which the VR device is located in the vehicle may be understood as the seat in which the user wearing the VR device is located. Therefore, the in-vehicle positioning of the VR device can be realized using only the sensor in the vehicle and the sensor built into the VR device, without additionally adding devices for positioning, and at low cost.
Optionally, the VR device may match the received seat pose data of each seat against its own device pose data to obtain the relative pose data between the VR device and each seat. In one possible implementation, the VR device may take, for each seat, the difference between that seat's pose data and the VR device's pose data as the relative pose data between the wearable device and that seat. The VR device may then determine, based on the relative pose data corresponding to each seat, the seat corresponding to the minimum relative pose data as the seat in which the VR device is located. The seat corresponding to the minimum relative pose data may be understood as the seat, among all seats in the vehicle, whose pose data is closest to that of the VR device.
In one possible implementation, the VR device may compute the variance of the relative pose data corresponding to each seat, and determine the seat corresponding to the minimum variance as the seat corresponding to the minimum relative pose data, i.e., the seat in which the VR device is located in the vehicle. Alternatively, the seat corresponding to the minimum relative pose data may be the seat corresponding to the minimum standard deviation or minimum mean square error, which is not limited in the embodiments of the present application.
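A minimal sketch of this matching step follows, assuming both sides sample pose data over a short time-aligned window and that each sample carries three angular-velocity and three acceleration components. The array layout and the function name are assumptions for illustration.

```python
import numpy as np

def locate_seat(seat_samples, device_samples):
    """
    seat_samples: {seat_id: np.ndarray of shape (T, 6)}  # [wx wy wz ax ay az]
    device_samples: np.ndarray of shape (T, 6), time-aligned with the seats.
    Returns the seat_id whose relative pose data has the smallest variance.
    """
    best_seat, best_score = None, float("inf")
    for seat_id, seat in seat_samples.items():
        rel = seat - device_samples     # relative pose data, per sample
        score = rel.var(axis=0).sum()   # total variance across all six axes
        if score < best_score:
            best_seat, best_score = seat_id, score
    return best_seat
```

Using the variance over a window rather than a single-sample difference makes the match robust to momentary sensor noise, which is presumably why the variance (or standard deviation) criterion is offered as an option.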
It can be appreciated that when the pose data of the target position identified by the vehicle-mounted communication terminal does not include the pose data of every seat in the cabin, the vehicle-mounted communication terminal may send the poses of the seats it can identify to the VR device, so that the VR device can determine, from this subset of seats, the seat whose pose is closest to its own as an approximation of the seat in which it is located, thereby approximately locating the position of the VR device in the vehicle.
In the embodiment of the application, the VR device can acquire its own device pose data in real time or at a fixed period. The fixed period may be set according to actual needs and is not limited in the embodiments of the present application. The device pose data may include motion data of the VR device, such as velocity, acceleration, angular velocity, and angular acceleration, including their components along each direction in space. The device pose data may also include data such as geographic coordinate data (e.g., longitude and latitude) and displacement of the VR device, from which motion data such as velocity, acceleration, angular velocity, and angular acceleration can be calculated.
Because the VR device generally has a built-in second sensor (such as an IMU) for periodically detecting its pose changes, in the embodiment of the present application the VR device may acquire its own device pose data through this built-in second sensor.
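For illustration, a hedged sketch of how the VR device might poll its built-in second sensor at a fixed period; read_imu() stands in for whatever driver interface the device actually exposes and is a hypothetical name, as is the 100 Hz sampling rate.

```python
import time
from collections import deque

SAMPLE_PERIOD_S = 0.01  # assumed 100 Hz sampling period

def collect_device_pose(read_imu, window: deque, n_samples: int):
    """Append (angular_velocity_xyz, acceleration_xyz) samples to a window."""
    for _ in range(n_samples):
        omega, accel = read_imu()      # hypothetical driver call
        window.append((omega, accel))  # window feeds the matching step above
        time.sleep(SAMPLE_PERIOD_S)
```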
In some embodiments, the VR device may perform step S540A upon detecting that it is currently in an automatic positioning mode, so as to enable the VR device to achieve its own in-vehicle positioning.
Optionally, the VR device may enter the automatic positioning mode when it satisfies a trigger positioning condition. The trigger positioning condition may include at least one of: the VR device has not been positioned, the VR device has been moved, or the user has triggered repositioning.
It will be appreciated that when the vehicle turns, it experiences an additional acceleration in the horizontal direction (x-axis), so the pose changes at different positions of the vehicle differ. If the vehicle travels straight, however, it has acceleration only in the front-rear direction (y-axis), and the pose changes at the respective seat positions are virtually identical; in this case the VR device cannot perform its own in-vehicle positioning from the seat pose data of each seat. In other words, in-vehicle positioning can succeed only if the vehicle has acceleration in different directions of the horizontal plane. Thus, in some embodiments, the VR device may enter the automatic positioning mode when the vehicle satisfies a positioning success condition. The positioning success condition may include the vehicle having acceleration in both directions of the horizontal plane (the x-axis and the y-axis).
Similarly, when the vehicle has an additional acceleration in the vertical direction (z-axis) so that the poses at different positions of the vehicle differ (for example, when the vehicle drives up or down a slope), the positioning success condition may also include the vehicle having acceleration both in the horizontal plane and in the direction perpendicular to it (the z-axis). The embodiments of the present application are not limited in this regard.
Alternatively, since the accelerations at different positions in the vehicle differ when the vehicle turns or drives up or down a slope, the positioning success condition may also include at least one of the vehicle being in a turning state, an uphill driving state, or a downhill driving state.
Optionally, the trigger positioning condition and the positioning success condition may also be used together as the determination condition of the automatic positioning mode; that is, the VR device may enter the automatic positioning mode when the VR device satisfies the trigger positioning condition and the vehicle satisfies the positioning success condition, as sketched below.
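The combined judgment could look like the following sketch, where the flag names and the acceleration threshold are assumptions for illustration rather than details from the patent.

```python
def should_enter_auto_positioning(device_state, accel_xyz, thresh=0.5):
    # Device-side trigger positioning condition: not yet positioned,
    # has been moved, or the user requested repositioning.
    triggered = (not device_state.get("positioned")
                 or device_state.get("moved")
                 or device_state.get("relocation_requested"))
    ax, ay, _ = accel_xyz
    # Vehicle-side positioning success condition: acceleration in two
    # horizontal-plane directions (e.g. while turning or changing lanes).
    success_possible = abs(ax) > thresh and abs(ay) > thresh
    return triggered and success_possible
```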
Alternatively, since the pose changes at the respective seat positions are substantially identical when the vehicle does not satisfy the positioning success condition, recognizing the seat poses has little value at such times. Therefore, the vehicle-mounted communication terminal may recognize the seat poses only when the vehicle satisfies the positioning success condition. Or, the vehicle-mounted communication terminal may send the seat pose data of each seat to the VR device only when the vehicle satisfies the positioning success condition.
In some embodiments, the in-vehicle communication terminal may also send only the seat pose data of the target seat to the VR device after locating the target seat in which the VR device is located in the vehicle.
Optionally, after the VR device locates its own target seat, it may send the positioning result to the vehicle-mounted communication terminal, so that the terminal can subsequently send only the seat pose data of the target seat to the VR device.
Optionally, the VR device may instead send its own device pose data to the vehicle-mounted communication terminal, which then locates the VR device according to that device pose data. The vehicle-mounted communication terminal can then send only the seat pose data of the target seat to the VR device according to its own positioning result.
Alternatively, the target seat in which the VR device is located may be manually specified by the user. When the vehicle-mounted communication terminal receives the user's manual specification, it may send only the seat pose data of that target seat to the VR device. When the VR device receives the user's manual specification, it may extract the seat pose data of the target seat from the seat pose data of each seat sent by the vehicle-mounted communication terminal.
As an alternative, when the pose data of the target position identified by the vehicle-mounted communication terminal includes pose data of a target seat in the vehicle, the VR device may implement relative pose identification of the VR device with respect to the target seat according to the pose data of the target seat. The target seat is a seat where the VR device is located in the vehicle. As shown in fig. 5, S550A may be replaced with S550B:
S550B: The VR device determines relative pose data between the VR device and the target seat from the received pose data of the target seat.
As another embodiment, the VR device may determine the relative pose data between itself and the target seat according to the received seat pose data of the target seat; in this way, the VR device does not merely recognize its pose relative to the vehicle as a whole, but accurately recognizes its pose relative to the seat in which it is located. Thus, using only the vehicle's sensors and the VR device's sensors, the pose information of the VR device relative to its own seat can be recognized, and the accuracy of the recognized relative pose information is high.
Optionally, the VR device may determine the relative pose data between itself and the target seat based on the received seat pose data of the target seat and its own device pose data. This relative pose data can serve as the relative pose data between the VR device and the vehicle for use in various applications. For example, the VR device may subtract the seat pose data (ω′, a′) of the target seat from its own device pose data (ω_vr, a_vr) to obtain the relative pose data between the VR device and the target seat.
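Expressed as code, the subtraction above is component-wise; the two-array layout is an assumption for illustration.

```python
import numpy as np

def relative_pose(device_pose, seat_pose):
    """
    device_pose: (omega_vr, a_vr), two length-3 arrays for the VR device
    seat_pose:   (omega', a'),    two length-3 arrays for the target seat
    Returns (omega_vr - omega', a_vr - a').
    """
    (w_vr, a_vr), (w_s, a_s) = device_pose, seat_pose
    return (np.asarray(w_vr) - np.asarray(w_s),
            np.asarray(a_vr) - np.asarray(a_s))
```

The subtraction removes the cabin's own motion from the device's measurements, so what remains is, to a first approximation, only the motion of the user's head relative to the seat.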
In one possible implementation, the VR device may use the relative pose data between itself and the target seat for pose representation in the virtual world, improving pose accuracy in the virtual world when the user uses the VR device in the vehicle. For example, the VR device may calculate the pose information of a virtual-world character from the relative pose data between the VR device and the target seat and generate the virtual reality picture accordingly, achieving accurate display of the virtual-world character.
It can be understood that the VR device of the present application calculates the pose information of the virtual-world character from the pose of the VR device relative to the seat in which it is located, rather than simply from its pose relative to the vehicle, thereby greatly improving the in-vehicle pose accuracy of the virtual-world character and ensuring the use experience of the VR device.
It can be appreciated that, because the VR device calculates the pose information of the virtual-world character according to the relative pose data between the VR device and the seat in which it is located, when the user uses the VR device in a different seat of the vehicle, the VR device recognizes the change in the relative pose data, so the pose of the virtual-world character displayed by the VR device changes accordingly.
In some embodiments, the relative pose recognition of S550B may be performed after the VR device positioning of S550A. That is, the VR device may first determine its seat according to S550A, and then obtain from the vehicle-mounted communication terminal only the seat pose data of that seat, so as to determine the relative pose data between the VR device and the seat in which it is located. The VR device can then calculate the pose information of the virtual-world character from this relative pose data and generate the virtual reality picture, achieving accurate display of the virtual-world character.
In other embodiments, when S550A has not been performed to obtain the target seat, the VR device may also perform only S550B; that is, the VR device directly acquires the seat pose data of the target seat from the vehicle-mounted communication terminal to determine the relative pose data between itself and the target seat, and uses it for pose representation in the virtual world.
In some embodiments, the VR device may also perform the relative pose recognition of S550B when it is currently in a non-automatic positioning mode. Thus, as shown in fig. 5, before the above S550A and S550B, the device interaction method may further include:
S550: The VR device judges whether it is currently in the automatic positioning mode. If yes, S550A is executed; if no, S550B is executed.
It can be appreciated that when the VR device is in the automatic positioning mode, it may determine, according to the received seat pose data of each seat, the target seat in which it is located, thereby implementing its in-vehicle positioning. When the VR device is in the non-automatic positioning mode, it does not need in-vehicle positioning, and instead determines the relative pose data between itself and the target seat according to the received seat pose data of the target seat, thereby implementing its relative pose recognition.
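Schematically, the S550 branch can be written as the following dispatch, reusing the locate_seat and relative_pose sketches above; the state object and message fields are hypothetical.

```python
def handle_seat_pose_update(vr_state, seat_msg):
    # S550: branch on the positioning mode of the VR device
    if vr_state.auto_positioning:
        # S550A: match sample windows to locate the seat in which the
        # VR device sits (see locate_seat above)
        vr_state.target_seat = locate_seat(seat_msg.sample_windows,
                                           vr_state.device_samples)
    else:
        # S550B: subtract the target seat's pose from the device pose
        # (see relative_pose above)
        vr_state.rel_pose = relative_pose(vr_state.device_pose,
                                          seat_msg.poses[vr_state.target_seat])
```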
In summary, the device interaction method provided by the embodiment of the application can not only recognize the pose information of different seats in the vehicle cabin, but also identify the seat in which a user of a VR device sits, and accurately recognize the pose change of the VR device relative to that seat. Therefore, when users use VR devices in different seats of the cabin, the application can accurately obtain the pose information of each VR device relative to its seat, and can also accurately position each VR device within the vehicle.
It will be appreciated that, in order to implement the above functions, the vehicle and the VR device include corresponding hardware and/or software modules for performing each function. In combination with the example algorithm steps described in the embodiments disclosed herein, the present application can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and the design constraints of the technical solution. Those skilled in the art may use different methods to implement the described functions for each particular application in combination with the embodiments, but such implementations should not be considered to be beyond the scope of the present application.
In this embodiment, the vehicle and the VR device may be divided into functional modules according to the above method examples. For example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware. It should be noted that the division of modules in this embodiment is schematic and is merely a logical function division; other division manners are possible in actual implementation.
Fig. 8 shows a schematic diagram of one possible composition of the vehicle involved in the above embodiments, in the case where each functional module is divided corresponding to each function; the vehicle may be equipped with an in-vehicle communication terminal. As shown in fig. 8, the vehicle may include: a bus controller, a gyroscope sensor, a vehicle sensor, a data acquisition unit, a data calculation unit, a data transmission unit, a Wi-Fi controller, and a Wi-Fi antenna. The bus controller, gyroscope sensor, data acquisition unit, data calculation unit, data transmission unit, and Wi-Fi controller can be integrated in the vehicle-mounted communication terminal. Optionally, the gyroscope sensor may instead be deployed in the vehicle as an independent module, in which case it may establish a communication connection with the vehicle-mounted communication terminal through the in-vehicle bus. Specifically:
The bus controller can be used to receive vehicle signals such as vehicle speed, rudder angle, gear, and turn signals from the vehicle bus, and send them to the data acquisition unit for subsequent data analysis and processing.
The gyroscope sensor (IMU sensor) can be used to periodically collect the vehicle pose data of the current vehicle, such as the angular velocity and acceleration information of the three XYZ axes, and send it to the data acquisition unit for subsequent data analysis and processing.
Vehicle sensors may be used to obtain vehicle speed and rudder angle information.
The Wi-Fi controller may be used to control Wi-Fi communication of the vehicle with other devices.
The data acquisition unit can be used to aggregate and package the vehicle information and the information collected by the gyroscope sensor, and send the aggregated and packaged data to the data calculation unit at a fixed period for data analysis.
The data calculation unit may be configured to calculate the seat pose data of each seat in the vehicle cabin based on the received vehicle information and the data collected by the gyroscope sensor (a kinematic sketch of one possible computation follows the module descriptions).
The data transmission unit may be used to transmit the seat pose data of the respective seats to other devices.
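The patent leaves the concrete computation of the data calculation unit to a preset pose conversion relation; one physically plausible sketch, offered here purely as an assumption and not as the patented computation, treats the cabin as a rigid body, so a seat at lever arm r from the IMU experiences a_seat = a_imu + α × r + ω × (ω × r). On a rigid body the angular velocity is the same at every point, which matches the earlier observation that seats differ mainly in their accelerations.

```python
import numpy as np

def seat_acceleration(a_imu, omega, alpha, r):
    """
    a_imu: linear acceleration measured at the IMU, shape (3,)
    omega: angular velocity of the vehicle, shape (3,)
    alpha: angular acceleration of the vehicle, shape (3,)
    r:     seat position relative to the IMU in the vehicle frame, shape (3,)
    Returns the acceleration at the seat under rigid-body kinematics.
    """
    a_imu, omega, alpha, r = map(np.asarray, (a_imu, omega, alpha, r))
    # tangential term (alpha x r) plus centripetal term (omega x (omega x r))
    return a_imu + np.cross(alpha, r) + np.cross(omega, np.cross(omega, r))
```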
It should be noted that, for all relevant contents of the steps involved in the above method embodiments, reference may be made to the functional descriptions of the corresponding functional modules, which are not repeated here.
Of course, the unit modules in the vehicle include, but are not limited to, the above-mentioned bus controller, gyroscope sensor, vehicle sensor, data acquisition unit, data calculation unit, data transmission unit, Wi-Fi controller, and Wi-Fi antenna. For example, the vehicle may further include a storage unit. In addition, when the functions of the data acquisition unit, the data calculation unit, and the data transmission unit are integrated into one unit, such as a processing unit, the processing unit is one or more processors (such as the processor 210 shown in fig. 2). The one or more processors and the memory may be coupled together, for example through a bus. The memory is used to store computer program code, which includes instructions. When the processor executes the instructions, the vehicle can perform the relevant method steps in the above embodiments to implement the methods therein.
Fig. 8 also shows a schematic diagram of one possible composition of the VR device involved in the above embodiments. As shown in fig. 8, the VR device may include: a gyroscope sensor, a data receiving unit, a data calculation unit, a Wi-Fi controller, and a Wi-Fi antenna. Specifically:
The gyroscope sensor (IMU sensor) can be used to periodically collect the device pose data of the current VR device, such as the angular velocity and acceleration information of the three XYZ axes, and send it to the data calculation unit for subsequent data analysis and processing.
The data receiving unit may be configured to receive seat pose data for each seat in the vehicle.
The data calculation unit may be configured to match the device pose data of the VR device against the seat pose data of each seat, so as to locate the target seat in which the VR device sits based on the matching result, and to compute the relative pose of the VR device from the seat pose data of the target seat and the device pose data of the VR device.
The Wi-Fi controller can be used to control Wi-Fi communication of the VR device with other devices.
It should be noted that, for all relevant contents of the steps involved in the above method embodiments, reference may be made to the functional descriptions of the corresponding functional modules, which are not repeated here.
Of course, the unit modules in the VR device include, but are not limited to, the above-mentioned gyroscope sensor, data receiving unit, data calculation unit, Wi-Fi controller, and Wi-Fi antenna. For example, the VR device may further include a storage unit. In addition, when the functions of the data receiving unit and the data calculation unit are integrated into one unit, such as a processing unit, the processing unit is one or more processors (such as the processor 210 shown in fig. 2). The one or more processors and the memory may be coupled together, for example through a bus. The memory is used to store computer program code, which includes instructions. When the processor executes the instructions, the VR device can perform the relevant method steps in the above embodiments to implement the methods therein.
Those skilled in the art will appreciate that the structures of the vehicle and the VR device shown in fig. 8 do not limit the vehicle and the VR device; the vehicle and VR device provided by embodiments of the present application may include more or fewer components than shown, combine certain components, or arrange components differently.
Still further embodiments of the present application provide a vehicle-mounted device, where the vehicle-mounted device may be applied to the vehicle, and the vehicle-mounted device is configured to perform each function or step performed by the vehicle-mounted communication terminal in the foregoing method embodiment.
Still further embodiments of the present application provide a wearable device configured to perform each function or step performed by the VR device in the foregoing method embodiments.
Further embodiments of the present application provide an in-vehicle apparatus, which may be applied to the above-described vehicle or vehicle-mounted device. The in-vehicle apparatus is configured to perform the functions or steps performed by the vehicle-mounted communication terminal in the above method embodiments.
Still further embodiments of the present application provide an interaction device, where the interaction device may be applied to the wearable apparatus. The interaction means is configured to perform the functions or steps performed by the VR device in the above-described method embodiments.
Still further embodiments of the present application provide a communication system, which may include the above-mentioned vehicle-mounted device and the wearable device. Communication connection is established between the vehicle-mounted equipment and the wearable equipment.
The embodiment of the application also provides a chip system which comprises at least one processor and at least one interface circuit. The processors and interface circuits may be interconnected by wires. The interface circuit may read the instructions stored in the memory and send the instructions to the processor. The instructions, when executed by the processor, may cause the in-vehicle apparatus, in-vehicle device, or vehicle to perform the respective functions or steps performed by the in-vehicle communication terminal in the above-described method embodiments. Or when executed by a processor, may cause the wearable device to perform the various functions or steps performed by the VR device in the method embodiments described above. Of course, the system-on-chip may also include other discrete devices, which are not particularly limited in accordance with embodiments of the present application.
The embodiment of the application also provides a computer storage medium comprising computer instructions which, when run on the above vehicle-mounted apparatus, vehicle-mounted device, or vehicle, cause it to perform the functions or steps performed by the vehicle-mounted communication terminal in the above method embodiments; or which, when run on the above wearable device, cause the wearable device to perform the functions or steps performed by the VR device in the above method embodiments.
The embodiment of the application also provides a computer program product which, when run on a computer, causes the computer to perform the functions or steps performed by the vehicle-mounted communication terminal in the above method embodiments, or the functions or steps performed by the VR device in the above method embodiments.
The vehicle-mounted device, wearable device, vehicle-mounted apparatus, interaction device, computer storage medium, computer program product, and chip provided in this embodiment are all configured to execute the corresponding methods provided above; therefore, for the beneficial effects they can achieve, reference may be made to the beneficial effects of the corresponding methods provided above, which are not repeated here.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the division of the above functional modules is illustrated by way of example; in practical applications, the above functions may be allocated to different functional modules as needed, that is, the internal structure of the apparatus may be divided into different functional modules to complete all or part of the functions described above.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of the modules or units is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple units or components may be combined or integrated into another apparatus, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and the parts displayed as units may be one physical unit or a plurality of physical units, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a readable storage medium. Based on this understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The foregoing is merely illustrative of specific embodiments of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present application should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (19)

1. A device interaction method, which is applied to a vehicle-mounted device, wherein the vehicle-mounted device establishes a communication connection with a wearable device, the method comprising:
acquiring vehicle pose data and vehicle driving data;
determining pose data of a target position in a vehicle according to the vehicle pose data and the vehicle driving data;
and sending pose data of the target position to the wearable device, wherein the pose data of the target position is used for determining relative pose data between the wearable device and the target position by the wearable device.
2. The method of claim 1, wherein the determining pose data of a target position in a vehicle according to the vehicle pose data and the vehicle driving data comprises:
determining pose data of the target position in the vehicle according to the vehicle pose data, the vehicle driving data, and a preset pose conversion relation, wherein the preset pose conversion relation comprises conversion relations between the vehicle pose data under different vehicle driving data and the pose data of different positions in the vehicle.
3. The method of claim 2, wherein the vehicle driving data comprises a rudder angle of the vehicle, and the preset pose conversion relation comprises conversion relations between the vehicle pose data at different rudder angles and the pose data of different positions in the vehicle.
4. The method of claim 1, wherein the vehicle comprises a first sensor for detecting the vehicle pose data of the vehicle, and the determining pose data of a target position in the vehicle according to the vehicle pose data and the vehicle driving data comprises:
determining pose data of the target position in the vehicle according to the vehicle pose data, the vehicle driving data, and the position of the first sensor in the vehicle.
5. The method of claim 4, wherein the vehicle driving data comprises a rudder angle of the vehicle, and the determining pose data of the target position in the vehicle according to the vehicle pose data, the vehicle driving data, and the position of the first sensor in the vehicle comprises:
determining the pose data of the target position according to the vehicle pose data, the rudder angle, the position of the target position in the vehicle, and the position of the first sensor in the vehicle.
6. The method of any of claims 1-5, wherein the pose data of the target position comprises pose data of a plurality of seats, and the method further comprises:
receiving device pose data of the wearable device sent by the wearable device;
and determining the seat in which the wearable device is located in the vehicle according to the device pose data and the pose data of the plurality of seats.
7. The method of any of claims 1-5, wherein the pose data of the target position is pose data of a target seat, and the target seat is a seat in which the wearable device is located in the vehicle.
8. A device interaction method, applied to a wearable device, where the wearable device establishes a communication connection with a vehicle-mounted device, the method comprising:
acquiring device pose data of the wearable device;
receiving pose data of a target position in a vehicle sent by the vehicle-mounted device, wherein the pose data of the target position is determined by the vehicle-mounted device according to vehicle pose data and vehicle driving data;
and determining relative pose data between the wearable device and the target position according to the pose data of the target position and the device pose data.
9. The method of claim 8, wherein the wearable device is a virtual reality device, the pose data of the target position is pose data of a target seat, and the target seat is a seat in which the wearable device is located in the vehicle, the method further comprising:
generating a virtual reality picture according to the relative pose data between the wearable device and the target seat.
10. The method of claim 8, wherein the pose data of the target position comprises pose data of a plurality of seats, the method further comprising:
determining the seat in which the wearable device is located in the vehicle according to the relative pose data between the wearable device and the plurality of seats.
11. The method of claim 10, wherein the determining the seat in which the wearable device is located in the vehicle based on the relative pose data between the wearable device and the plurality of seats comprises:
determining, according to the relative pose data between the wearable device and the plurality of seats, the seat corresponding to the minimum relative pose data as the seat in which the wearable device is located in the vehicle.
12. The method of claim 11, wherein the determining the seat corresponding to the minimum relative pose data as the seat in which the wearable device is located in the vehicle comprises:
determining, according to the variances of the relative pose data corresponding to the plurality of seats, the seat corresponding to the minimum variance as the seat in which the wearable device is located in the vehicle.
13. The method of any of claims 10-12, wherein the wearable device is a virtual reality device, and after the determining the seat in which the wearable device is located in the vehicle according to the relative pose data between the wearable device and the plurality of seats, the method further comprises:
generating a virtual reality picture according to the relative pose data between the wearable device and the seat in which the wearable device is located.
14. The method of any of claims 10-13, wherein prior to the determining the seat in which the wearable device is located in the vehicle from the relative pose data between the wearable device and the plurality of seats, the method further comprises:
detecting that the wearable device is in an automatic positioning mode, wherein the triggering condition of the automatic positioning mode comprises detecting that the wearable device is not positioned, detecting that the wearable device moves, or detecting a repositioning instruction of the wearable device.
15. The method of any of claims 10-14, wherein prior to the determining the seat in which the wearable device is located in the vehicle from the relative pose data between the wearable device and the plurality of seats, the method further comprises:
detecting that the vehicle meets a positioning success condition, wherein the positioning success condition is used for indicating that the vehicle has acceleration in different directions of a horizontal plane, or for indicating that the vehicle has acceleration in the horizontal plane and in a vertical plane at the same time, the vertical plane being perpendicular to the horizontal plane.
16. The method of claim 15, wherein the positioning success condition includes at least one of a turning state, an uphill driving state, or a downhill driving state, and the pose data of the target position includes acceleration of the target position, wherein the accelerations of different positions in the vehicle differ in the at least one driving state.
17. An in-vehicle apparatus, the in-vehicle apparatus comprising a memory and one or more processors; the memory is coupled to the processor; the memory is for storing computer program code comprising computer instructions which, when executed by the processor, cause the in-vehicle apparatus to perform the method of any of claims 1-7.
18. A wearable device, the wearable device comprising a memory and one or more processors; the memory is coupled to the processor; the memory is for storing computer program code comprising computer instructions which, when executed by the processor, cause the wearable device to perform the method of any of claims 8-16.
19. A vehicle, characterized in that the vehicle comprises a vehicle-mounted device that performs the method according to any one of claims 1-7.
CN202210611765.3A 2022-05-31 2022-05-31 Equipment interaction method, equipment and vehicle Pending CN117193515A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210611765.3A CN117193515A (en) 2022-05-31 2022-05-31 Equipment interaction method, equipment and vehicle
PCT/CN2023/096129 WO2023231875A1 (en) 2022-05-31 2023-05-24 Device interaction method, and device and vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210611765.3A CN117193515A (en) 2022-05-31 2022-05-31 Equipment interaction method, equipment and vehicle

Publications (1)

Publication Number Publication Date
CN117193515A (en) 2023-12-08

Family

ID=88994814

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210611765.3A Pending CN117193515A (en) 2022-05-31 2022-05-31 Equipment interaction method, equipment and vehicle

Country Status (2)

Country Link
CN (1) CN117193515A (en)
WO (1) WO2023231875A1 (en)

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101570432B1 (en) * 2014-08-18 2015-11-27 엘지전자 주식회사 Wearable device and method for controlling the same
US11023049B2 (en) * 2015-11-24 2021-06-01 Ford Global Technologies, Llc Methods and systems for enabling gesture control for a vehicle feature
KR101809922B1 (en) * 2016-02-02 2018-01-25 현대자동차주식회사 A vehicle, a method for adjustment of the driver's seat of the vehicle and a system for controlling the vehicle
CN110275618A (en) * 2019-06-21 2019-09-24 北京三快在线科技有限公司 Vehicle part display methods and vehicle part display device

Also Published As

Publication number Publication date
WO2023231875A1 (en) 2023-12-07


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination