CN115447506A - Equipment control method, device, vehicle, medium and chip


Info

Publication number
CN115447506A
Authority
CN
China
Prior art keywords
seat
target
vehicle
determining
ultrasonic signal
Prior art date
Legal status
Pending
Application number
CN202211043582.2A
Other languages
Chinese (zh)
Inventor
周岭松
Current Assignee
Beijing Xiaomi Pinecone Electronic Co Ltd
Xiaomi Automobile Technology Co Ltd
Original Assignee
Beijing Xiaomi Pinecone Electronic Co Ltd
Xiaomi Automobile Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Xiaomi Pinecone Electronic Co Ltd, Xiaomi Automobile Technology Co Ltd filed Critical Beijing Xiaomi Pinecone Electronic Co Ltd
Priority to CN202211043582.2A
Publication of CN115447506A
Legal status: Pending

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • B60R16/023 Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements for transmission of signals between vehicle parts or subsystems
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C23/00 Non-electrical signal transmission systems, e.g. optical systems
    • G08C23/02 Non-electrical signal transmission systems, e.g. optical systems using infrasonic, sonic or ultrasonic waves

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)

Abstract

The disclosure relates to a device control method, apparatus, vehicle, medium, and chip. The method includes: transmitting a preset first ultrasonic signal into the space inside a vehicle; for each seat in the vehicle, determining a second ultrasonic signal, corresponding to the first ultrasonic signal, picked up at that seat; determining, according to the second ultrasonic signal, channel impulse response characterization information corresponding to each seat, wherein the channel impulse response characterization information reflects whether the propagation path of the ultrasonic signal has been disturbed; determining, according to the channel impulse response characterization information, a target seat whose occupant has performed a control action; and controlling at least one of the devices associated with the target seat according to the control action corresponding to the target seat. In this way, the actions of the occupant of each seat can be recognized and responded to individually, and recognition remains accurate even at night because it is unaffected by external factors such as lighting.

Description

Equipment control method, device, vehicle, medium and chip
Technical Field
The present disclosure relates to the field of vehicle technologies, and in particular, to a device control method, apparatus, vehicle, medium, and chip.
Background
Currently, a user can control a vehicle contactlessly through gesture actions made inside the vehicle. For in-vehicle gesture control, when several people are in the vehicle, the actions of people at different positions need to be responded to separately. In the related art, a camera is usually arranged in front of each seat to capture images of the occupants, and the images are subjected to feature extraction and analysis to recognize the occupants' actions. However, this approach requires adding multiple cameras, which is costly. Moreover, cameras are demanding on lighting conditions: when the light is poor, especially when driving at night, recognition performance suffers. In addition, the presence of cameras can detract from the occupants' riding experience.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides an apparatus control method, device, vehicle, medium, and chip.
According to a first aspect of embodiments of the present disclosure, there is provided an apparatus control method, the method including:
transmitting a preset first ultrasonic signal into the space inside a vehicle;
for each seat in the vehicle, determining a second ultrasonic signal, corresponding to the first ultrasonic signal, picked up at that seat;
determining, according to the second ultrasonic signal, channel impulse response characterization information corresponding to each seat, wherein the channel impulse response characterization information reflects whether the propagation path of the ultrasonic signal has been disturbed;
determining, according to the channel impulse response characterization information, a target seat whose occupant has performed a control action;
and controlling at least one of the devices associated with the target seat according to the control action corresponding to the target seat.
Optionally, each seat is provided with a signal acquisition device;
the determining that the seat picks up the second ultrasonic signal corresponding to the first ultrasonic signal comprises:
acquiring an initial ultrasonic signal acquired by signal acquisition equipment of the seat;
performing band-pass filtering processing on the initial ultrasonic signal to obtain a processing signal;
demodulating the processed signal to obtain a baseband signal;
determining a target beam matrix corresponding to the seat;
and determining the second ultrasonic wave signal according to the baseband signal and the target beam matrix.
Optionally, the first ultrasonic signal is emitted by a signal emitting device in the vehicle;
the determining a target beam matrix corresponding to the seat includes:
determining the relative position between the signal acquisition device of the seat and the signal emitting device;
and determining a target beam matrix corresponding to the seat according to the relative position.
Optionally, the determining, according to the second ultrasonic signal, channel impulse response characterization information corresponding to each seat comprises:
for each seat, performing the following operations:
determining a channel impulse response vector corresponding to the seat according to the first ultrasonic signal and a second ultrasonic signal corresponding to the seat;
and determining intensity change information of the channel impulse response vector according to the channel impulse response vector corresponding to the seat, wherein the intensity change information is used as the channel impulse response characterization information corresponding to the seat.
Optionally, the determining, according to the channel impulse response characterization information, a target seat whose occupant has performed a control action comprises:
determining, according to the channel impulse response characterization information, a recognition result corresponding to each seat by using a pre-trained recognition model, wherein the recognition result indicates a control action type corresponding to each seat, and the control action type is one of multiple preset control actions or no control action;
and determining, as the target seat, a seat whose control action type indicated by the recognition result is a preset control action.
Optionally, the preset control action is used to instruct control of a device of the vehicle;
the controlling at least one of the devices associated with the target seat according to the control action corresponding to the target seat includes:
determining a target preset control action corresponding to the target seat according to the recognition result of the target seat;
determining a target device to which the target preset control action is directed and a target control instruction corresponding to the target preset control action;
and sending the target control instruction to the target device.
Optionally, a speaker is disposed inside the vehicle, and the first ultrasonic signal is emitted by the speaker inside the vehicle; and
each seat in the vehicle is provided with a microphone array for picking up ultrasonic signals.
According to a second aspect of embodiments of the present disclosure, there is provided an apparatus for controlling a device, the apparatus including:
a transmitting module configured to transmit a preset first ultrasonic signal in a space in a vehicle;
a first determination module configured to determine, for each seat within the vehicle, a second ultrasonic signal corresponding to the first ultrasonic signal that the seat picked up;
a second determining module, configured to determine, according to the second ultrasonic signal, channel impulse response characterizing information corresponding to each seat, where the channel impulse response characterizing information is used to reflect whether a propagation path of the ultrasonic signal is affected;
a third determination module configured to determine, according to the channel impulse response characterization information, a target seat whose occupant has performed a control action;
and the control module is configured to control at least one of the devices associated with the target seat according to the control action corresponding to the target seat.
Optionally, each seat is provided with a signal acquisition device;
the first determining module includes:
the acquisition sub-module is configured to acquire an initial ultrasonic signal acquired by the signal acquisition equipment of the seat;
the first processing submodule is configured to perform band-pass filtering on the initial ultrasonic signal to obtain a processed signal;
the second processing submodule is configured to demodulate the processed signal to obtain a baseband signal;
a first determination submodule configured to determine a target beam matrix corresponding to the seat;
a second determination sub-module configured to determine the second ultrasonic signal from the baseband signal and the target beam matrix.
Optionally, the first ultrasonic signal is emitted by a signal emitting device within the vehicle;
the first determination submodule is configured to: determining the relative position between the signal acquisition device and the signal emission device of the seat; and determining a target beam matrix corresponding to the seat according to the relative position.
Optionally, the second determination module is configured to, for each of the seats, perform the following:
determining a channel impulse response vector corresponding to the seat according to the first ultrasonic signal and a second ultrasonic signal corresponding to the seat;
and determining intensity change information of the channel impulse response vector according to the channel impulse response vector corresponding to the seat, wherein the intensity change information is used as the channel impulse response characterization information corresponding to the seat.
Optionally, the third determining module includes:
a third determining sub-module, configured to determine, according to the channel impulse response characterization information, a recognition result corresponding to each seat by using a pre-trained recognition model, where the recognition result is used to indicate a control action type corresponding to each seat, and the control action type is one of multiple preset control actions or no control action;
a fourth determination submodule configured to determine, as the target seat, a seat whose control action type indicated by the recognition result is a preset control action.
Optionally, the preset control action is used to instruct control of a device of the vehicle;
the control module includes:
the fifth determining submodule is configured to determine a target preset control action corresponding to the target seat according to the recognition result of the target seat;
a sixth determining sub-module, configured to determine a target device to which the target preset control action is directed and a target control instruction corresponding to the target preset control action;
a transmitting sub-module configured to transmit the target control instruction to the target device.
Optionally, a speaker is disposed inside the vehicle, and the first ultrasonic signal is emitted by the speaker inside the vehicle; and
each seat in the vehicle is provided with a microphone array for picking up ultrasonic signals.
According to a third aspect of the embodiments of the present disclosure, there is provided a vehicle including:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to execute the instructions in the memory to implement the steps of the method of the first aspect of the disclosure.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the steps of the device control method provided by the first aspect of the present disclosure.
According to a fifth aspect of embodiments of the present disclosure, there is provided a chip comprising a processor and an interface; the processor is configured to read instructions to perform the method of the first aspect of the present disclosure.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects:
Through the above scheme, a preset first ultrasonic signal is emitted into the space inside the vehicle; for each seat in the vehicle, a second ultrasonic signal, corresponding to the first ultrasonic signal, picked up at that seat is determined; channel impulse response characterization information corresponding to each seat is determined according to the second ultrasonic signal; a target seat whose occupant has performed a control action is determined according to the channel impulse response characterization information; and at least one of the devices associated with the target seat is controlled according to the control action corresponding to the target seat. The channel impulse response characterization information reflects whether the propagation path of the ultrasonic signal has been disturbed. Therefore, by emitting ultrasonic signals inside the vehicle and identifying, from how each seat picks up those signals, whether the occupant of that seat is performing a control action, the actions of the occupant of each seat can be recognized and responded to individually. Because the recognition of control actions is based on ultrasonic signals, it remains accurate even at night and is unaffected by external factors such as lighting. In addition, the transmission and reception of the ultrasonic signals can be implemented with equipment already present on the vehicle, requiring no additional hardware and effectively saving cost.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart illustrating a device control method according to an exemplary embodiment.
FIG. 2 is a block diagram illustrating a device control apparatus according to an example embodiment.
FIG. 3 is a functional block diagram schematic of a vehicle shown in an exemplary embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below do not represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
It should be noted that all acquisition of signals, information, or data in the present application is performed in compliance with the applicable data protection laws and policies of the relevant jurisdiction and with the authorization of the owner of the corresponding device.
Fig. 1 is a flowchart illustrating a device control method according to an exemplary embodiment. For example, the method provided by the present disclosure may be applied to a vehicle. As shown in fig. 1, the method may include steps 11-15.
In step 11, a first predetermined ultrasonic signal is emitted in the vehicle interior.
The first ultrasonic signal may be emitted by a signal emitting device within the vehicle. For example, the vehicle interior may be provided with a speaker, and the first ultrasonic signal may be emitted by the speaker of the vehicle interior.
Illustratively, the first ultrasonic signal may be a chirp signal.
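For illustration, a chirp probe frame could be generated as follows. This is a minimal sketch: the sample rate, frame length, and sweep band are assumed values chosen for the example, not parameters specified by the disclosure.

```python
# Minimal sketch of generating an ultrasonic chirp probe frame.
# FS, DURATION, F0 and F1 are assumed example values, not from the patent.
import numpy as np
from scipy.signal import chirp

FS = 48_000              # speaker/microphone sample rate (assumed)
DURATION = 0.04          # one probe frame of 40 ms (assumed)
F0, F1 = 18_000, 22_000  # sweep band near the top of the audio range (assumed)

t = np.arange(int(FS * DURATION)) / FS
first_ultrasonic_signal = chirp(t, f0=F0, t1=DURATION, f1=F1, method="linear")
```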
In step 12, for each seat in the vehicle, a second ultrasonic signal corresponding to the first ultrasonic signal picked up by the seat is determined.
Each seat in the vehicle may be provided with a signal acquisition device for picking up an ultrasonic signal. Wherein the signal acquisition device may be a microphone array on a seat in the vehicle. Illustratively, the signal acquisition device may be a 4-way microphone array.
The first ultrasonic signal emitted in step 11 propagates through the vehicle cabin, and the signal acquisition device on each seat in the vehicle can collect the ultrasonic signal propagating in the vehicle.
In a possible embodiment, the step 12 of determining the second ultrasonic signal picked up by the seat corresponding to the first ultrasonic signal may include the following steps:
acquiring an initial ultrasonic signal acquired by signal acquisition equipment of the seat;
performing band-pass filtering on the initial ultrasonic signal to obtain a processed signal;
demodulating the processed signal to obtain a baseband signal;
determining a target beam matrix corresponding to the seat;
and determining a second ultrasonic signal according to the baseband signal and the target beam matrix.
Generally, the signal acquisition device directly collects the ultrasonic signal propagating in the vehicle, that is, the initial ultrasonic signal. For example, if the signal acquisition device is a 4-way microphone array, the initial ultrasonic signal is a 4-channel signal.
The initial ultrasonic signal is generally noisy and not conducive to signal processing; therefore, the acquired initial ultrasonic signal can first be pre-processed. Band-pass filtering may be applied to the initial ultrasonic signal to obtain a processed signal. The band-pass filtering may be performed around the main frequency band of the first ultrasonic signal, thereby removing noise outside the ultrasonic band. The processed signal is then demodulated to obtain a baseband signal. Demodulation converts the processed signal into a low-frequency baseband signal, avoiding the difficulty of processing the high-frequency signal directly.
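As a rough illustration of this pre-processing chain for a single microphone channel, one possible implementation is sketched below. It reuses the assumed FS, F0 and F1 from the earlier chirp example; the filter orders and cutoff are likewise assumptions, not values given by the disclosure.

```python
# Minimal sketch: band-pass filter, mix down to baseband, low-pass filter.
# Sample rate, band edges and filter orders are assumed example values.
import numpy as np
from scipy.signal import butter, sosfiltfilt, firwin

def to_baseband(initial_signal, fs=48_000.0, f0=18_000.0, f1=22_000.0):
    # 1. Band-pass filtering around the main band of the first ultrasonic
    #    signal removes noise outside the ultrasonic band.
    sos = butter(6, [f0, f1], btype="bandpass", fs=fs, output="sos")
    processed = sosfiltfilt(sos, initial_signal)

    # 2. Demodulation: mix down by the band centre frequency so that the
    #    result is a low-frequency (complex) baseband signal ...
    fc = 0.5 * (f0 + f1)
    n = np.arange(len(processed))
    mixed = processed * np.exp(-2j * np.pi * fc * n / fs)

    # ... and low-pass filter to suppress the image at twice the centre frequency.
    taps = firwin(numtaps=129, cutoff=f1 - f0, fs=fs)
    return np.convolve(mixed, taps, mode="same")
```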
After obtaining the baseband signal, a target beam matrix corresponding to the seat may be determined, and the second ultrasonic signal may be determined based on the baseband signal and the target beam matrix.
The target beam matrix corresponding to a seat is used to determine the ultrasonic signal that can be picked up from the direction relevant to that seat.
For a seat in the vehicle, the relative position between the signal acquisition device on it and the signal emitting device in the vehicle is known; therefore, a beam matrix for this fixed direction can be set in advance, and with it the signal that would be obtained if the signal acquisition device picked up only the ultrasonic signal arriving from the direction of the signal emitting device can be determined. For example, in a left-hand-drive vehicle, assuming the signal emitting device is located at the front of the cabin (e.g., near the center console), the relative position between the driver's seat (including the signal acquisition device disposed on it) and the signal emitting device is known and fixed; if the signal emitting device lies X degrees east of north relative to the signal acquisition device, then that bearing is the fixed direction described above.
For example, the target beam matrix corresponding to the seat may be determined by:
determining the relative position between the signal acquisition device of the seat and the signal emitting device;
and determining a target beam matrix corresponding to the seat according to the relative position.
In the present disclosure, a beam matrix can be set empirically for each relative position, forming a correspondence between relative positions and beam matrices. Here, each relative position is the relative position between the signal acquisition device on one seat in the vehicle and the signal emitting device in the vehicle; for example, if the vehicle has 4 seats and each seat is provided with a signal acquisition device, there are 4 such relative positions. Based on this correspondence, given the relative position between a seat's signal acquisition device and the signal emitting device, the beam matrix corresponding to that relative position, that is, the target beam matrix corresponding to the seat, can be determined.
For example, suppose the driver's seat and the front passenger seat each correspond to a beam matrix. If the baseband signal obtained after the above processing is A, the beam matrix corresponding to the driver's seat is M0, and the beam matrix corresponding to the front passenger seat is M1, then the second ultrasonic signal B containing only the driver's-seat direction can be obtained as B = A × M0, and the second ultrasonic signal C containing only the front-passenger-seat direction as C = A × M1.
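A minimal sketch of this per-seat beamforming step is shown below. The beam weights are uniform placeholders and the array dimensions are assumed; in practice each beam matrix would be pre-computed from the relative position between that seat's microphone array and the in-vehicle speaker.

```python
# Minimal sketch of applying per-seat beam matrices to the multi-channel
# baseband signal, following the B = A x M0 / C = A x M1 example above.
# Beam weights and array sizes are placeholders (assumed values).
import numpy as np

num_frames, num_mics = 200, 4   # 4-way microphone array, as described above
A = np.zeros((num_frames, num_mics), dtype=complex)  # baseband signal (placeholder)

beam_matrices = {
    "driver_seat": np.full((num_mics, 1), 1.0 / num_mics),           # M0 (placeholder)
    "front_passenger_seat": np.full((num_mics, 1), 1.0 / num_mics),  # M1 (placeholder)
}

# Second ultrasonic signal per seat: B = A @ M0, C = A @ M1, and so on.
second_ultrasonic = {seat: A @ M for seat, M in beam_matrices.items()}
```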
In step 13, channel impulse response characterizing information corresponding to each seat is determined according to the second ultrasonic wave signal.
The channel response characterizes how the input signal is transformed into the output signal. The channel response in the time domain is called the Channel Impulse Response (CIR). For example, in R(n) = S(n) * h(n), S(n) is the transmitted baseband signal, R(n) is the received baseband signal, h(n) is the channel impulse response vector, and * denotes convolution; h(n) may be referred to as the channel impulse response vector that maps S(n) to R(n).
Motion recognition (e.g., gesture recognition) can be implemented based on the CIR, which reflects the path effects experienced by the ultrasonic signal as it propagates from the signal emitting device to the signal acquisition device, and on the dCIR (differential Channel Impulse Response). When the seat occupant is not moving, the CIR is stable and the dCIR should be close to 0; when the seat occupant moves, the propagation path of the ultrasonic signal is disturbed, and the dCIR fluctuates accordingly. Moreover, different actions produce different dCIR fluctuation patterns, so the action causing the fluctuation can be indirectly inferred.
From the above, information related to the channel impulse response can assist in identifying the actions of the person in a seat. Therefore, the channel impulse response characterization information can reflect whether the propagation path of the ultrasonic signal has been disturbed, and subsequent action recognition can be performed using this information.
In a possible embodiment, the step 13 of determining the channel impulse response characterizing information corresponding to each seat according to the second ultrasonic signal may include the following steps.
For each seat, the following operations are performed:
determining a channel impulse response vector corresponding to the seat according to the first ultrasonic signal and a second ultrasonic signal corresponding to the seat;
and determining intensity change information of the channel impulse response vector according to the channel impulse response vector corresponding to the seat, the intensity change information being used as the channel impulse response characterization information corresponding to the seat.
As described above with respect to the channel impulse response vector, for each seat, the channel impulse response vector corresponding to that seat can be determined from the first ultrasonic signal and the second ultrasonic signal corresponding to that seat. Furthermore, from the channel impulse response vector corresponding to the seat, its intensity change information (i.e., the dCIR described above) can be determined and used as the channel impulse response characterization information corresponding to the seat.
Illustratively, the intensity change information of the channel impulse response vector may be amplitude variance information or energy variance information of the channel impulse response vector.
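As a rough sketch of how these quantities could be computed, the snippet below estimates a CIR per probe frame by regularized frequency-domain deconvolution and then takes the frame-to-frame difference and its amplitude variance. This is one possible estimator for R(n) = S(n) * h(n); the disclosure does not prescribe a specific one, and the tap count and regularization constant are assumptions.

```python
# Minimal sketch: CIR estimation per probe frame, dCIR, and amplitude variance.
# The deconvolution approach, tap count and eps are assumed choices.
import numpy as np

def estimate_cir(s, r, n_taps=64, eps=1e-6):
    """Estimate h such that r is approximately s convolved with h."""
    n_fft = len(s) + len(r)
    S = np.fft.fft(s, n_fft)
    R = np.fft.fft(r, n_fft)
    H = R * np.conj(S) / (np.abs(S) ** 2 + eps)   # regularised deconvolution
    return np.fft.ifft(H)[:n_taps]                # keep the first taps

def dcir_amplitude_variance(cir_frames):
    """cir_frames: array of shape (num_frames, n_taps), one CIR per probe frame."""
    dcir = np.diff(cir_frames, axis=0)            # close to 0 when the occupant is still
    return np.var(np.abs(dcir), axis=0)           # amplitude-variance characterization
```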
In step 14, the target seat whose occupant has performed a control action is determined according to the channel impulse response characterization information.
In one possible embodiment, step 14 may include the steps of:
determining a recognition result corresponding to each seat by utilizing a pre-trained recognition model according to the channel impulse response characterization information;
and determining, as the target seat, a seat whose control action type indicated by the recognition result is a preset control action.
The recognition result indicates the control action type corresponding to each seat. The control action type may be one of a plurality of preset control actions, or no control action. A preset control action may be used to instruct control of a device of the vehicle, and may be any suitable gesture action, for example, a finger drawing a designated figure.
The recognition model may be a Convolutional Neural Network (CNN) model. To train the recognition model, multiple sets of training data may first be obtained, each set including training channel impulse response characterization information and the control action type corresponding to it. During model training, the training channel impulse response characterization information is used as the model input and the corresponding control action type as the expected output, yielding a trained recognition model.
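As one illustrative possibility, a small CNN classifier over per-seat dCIR feature maps could look like the sketch below. The input layout (a 1 x frames x taps map), the number of preset control actions, and all layer sizes are assumptions made for the example.

```python
# Minimal sketch of a CNN recogniser over dCIR characterization maps.
# Input layout, class count and layer sizes are assumed example choices.
import torch.nn as nn

N_ACTIONS = 4  # number of preset control actions (assumed)

class ActionRecognizer(nn.Module):
    def __init__(self, n_classes: int = N_ACTIONS + 1):  # +1 for "no control action"
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.classifier = nn.Linear(32 * 4 * 4, n_classes)

    def forward(self, x):          # x: (batch, 1, frames, taps)
        return self.classifier(self.features(x).flatten(1))

# Training would pair dCIR characterization maps (input) with control action
# type labels (expected output), e.g. using nn.CrossEntropyLoss.
```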
Further, based on the recognition results, a seat whose control action type indicated by the recognition result is a preset control action can be determined as the target seat; a seat whose indicated control action type is no control action is not determined as a target seat.
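For illustration, assuming each seat's recognition result is reduced to a class index where 0 denotes "no control action" (an assumed labelling convention), the target seats could be selected as follows.

```python
# Minimal sketch of selecting target seats from per-seat recognition results.
# The convention that class index 0 means "no control action" is assumed.
NO_ACTION = 0

def select_target_seats(recognition_results):
    """recognition_results: mapping of seat name to recognised action index."""
    return {seat: action for seat, action in recognition_results.items()
            if action != NO_ACTION}

# Example: {"driver_seat": 2, "rear_left_seat": 0} yields {"driver_seat": 2}
```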
In step 15, at least one of the devices associated with the target seat is controlled according to the control action corresponding to the target seat.
In one possible embodiment, step 15 may comprise the steps of:
determining a target preset control action corresponding to the target seat according to the recognition result of the target seat;
determining the target device to which the target preset control action is directed and the target control instruction corresponding to the target preset control action;
and sending the target control instruction to the target device.
As mentioned above, a preset control action may be used to instruct control of a device of the vehicle, and a user in a given seat typically only needs to control devices associated with that seat. Therefore, the target device to which the target preset control action corresponding to the target seat is directed is a device associated with the target seat. Meanwhile, each preset control action generally has a corresponding instruction meaning, so based on the target preset control action corresponding to the target seat, the target device and the target control instruction can be determined. The target device can then be controlled by sending the target control instruction to it. In this way, the control actions of users on different seats can be responded to separately.
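A minimal dispatch sketch is shown below. The mapping from preset control actions to devices and commands, and the send_instruction callback standing in for the vehicle's control path, are illustrative assumptions, not definitions from the disclosure.

```python
# Minimal sketch of dispatching a recognised target preset control action.
# Action meanings, device names and commands are assumed placeholders.
ACTION_TABLE = {
    1: ("window", "open"),           # preset action 1 (assumed meaning)
    2: ("window", "close"),          # preset action 2 (assumed meaning)
    3: ("reading_light", "toggle"),  # preset action 3 (assumed meaning)
    4: ("seat_heater", "toggle"),    # preset action 4 (assumed meaning)
}

def dispatch(target_seat, target_action, send_instruction):
    device, command = ACTION_TABLE[target_action]
    # The target device is the instance of that device associated with the
    # target seat; send_instruction stands in for the vehicle's control bus.
    send_instruction(device=f"{target_seat}:{device}", command=command)
```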
Through the above scheme, a preset first ultrasonic signal is emitted into the space inside the vehicle; for each seat in the vehicle, a second ultrasonic signal, corresponding to the first ultrasonic signal, picked up at that seat is determined; channel impulse response characterization information corresponding to each seat is determined according to the second ultrasonic signal; a target seat whose occupant has performed a control action is determined according to the channel impulse response characterization information; and at least one of the devices associated with the target seat is controlled according to the control action corresponding to the target seat. The channel impulse response characterization information reflects whether the propagation path of the ultrasonic signal has been disturbed. Therefore, by emitting ultrasonic signals inside the vehicle and identifying, from how each seat picks up those signals, whether the occupant of that seat is performing a control action, the actions of the occupant of each seat can be recognized and responded to individually. Because the recognition of control actions is based on ultrasonic signals, it remains accurate even at night and is unaffected by external factors such as lighting. In addition, the transmission and reception of the ultrasonic signals can be implemented with equipment already present on the vehicle, requiring no additional hardware and effectively saving cost.
Fig. 2 is a block diagram illustrating an apparatus control device according to an exemplary embodiment. As shown in fig. 2, the apparatus 20 includes:
a transmission module 21 configured to transmit a preset first ultrasonic signal in a space in a vehicle;
a first determination module 22 configured to determine, for each seat in the vehicle, a second ultrasonic signal corresponding to the first ultrasonic signal that the seat has picked up;
a second determining module 23, configured to determine, according to the second ultrasonic signal, channel impulse response characterizing information corresponding to each seat, where the channel impulse response characterizing information is used to reflect whether a propagation path of the ultrasonic signal is affected;
a third determining module 24 configured to determine, according to the channel impulse response characterization information, a target seat whose occupant has performed a control action;
a control module 25 configured to control at least one of the devices associated with the target seat according to the control action corresponding to the target seat.
Optionally, each seat is provided with a signal acquisition device;
the first determining module 22 includes:
the acquisition sub-module is configured to acquire an initial ultrasonic signal acquired by the signal acquisition equipment of the seat;
the first processing submodule is configured to perform band-pass filtering processing on the initial ultrasonic signal to obtain a processed signal;
the second processing submodule is configured to demodulate the processed signal to obtain a baseband signal;
a first determination submodule configured to determine a target beam matrix corresponding to the seat;
a second determination sub-module configured to determine the second ultrasonic signal according to the baseband signal and the target beam matrix.
Optionally, the first ultrasonic signal is emitted by a signal emitting device within the vehicle;
the first determination submodule is configured to: determining the relative position between the signal acquisition device and the signal emission device of the seat; and determining a target beam matrix corresponding to the seat according to the relative position.
Optionally, the second determining module 23 is configured to, for each seat, perform the following operations:
determining a channel impulse response vector corresponding to the seat according to the first ultrasonic signal and a second ultrasonic signal corresponding to the seat;
and determining intensity change information of the channel impulse response vector according to the channel impulse response vector corresponding to the seat, wherein the intensity change information is used as the channel impulse response characterization information corresponding to the seat.
Optionally, the third determining module 24 includes:
a third determining submodule configured to determine, according to the channel impulse response characterization information, a recognition result corresponding to each seat by using a pre-trained recognition model, where the recognition result is used to indicate a control action type corresponding to each seat, and the control action type is one of multiple preset control actions or no control action;
a fourth determination submodule configured to determine, as the target seat, a seat whose control action type indicated by the recognition result is a preset control action.
Optionally, the preset control action is used for instructing to control a device of the vehicle;
the control module 25 includes:
the fifth determining submodule is configured to determine a target preset control action corresponding to the target seat according to the recognition result of the target seat;
a sixth determining sub-module, configured to determine a target device to which the target preset control action is directed and a target control instruction corresponding to the target preset control action;
a transmitting sub-module configured to transmit the target control instruction to the target device.
Optionally, a speaker is disposed inside the vehicle, and the first ultrasonic signal is emitted by the speaker inside the vehicle; and
each seat in the vehicle is provided with a microphone array for picking up ultrasonic signals.
With regard to the apparatus in the above embodiments, the specific manner in which each module performs its operations has been described in detail in the method embodiments and will not be repeated here.
The present disclosure also provides a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the steps of the device control method provided by the present disclosure.
The above apparatus may be part of a stand-alone electronic device. For example, in an embodiment, the apparatus may be an Integrated Circuit (IC) or a chip, where the IC may be a single IC or a collection of multiple ICs; the chip may include, but is not limited to, the following categories: a GPU (Graphics Processing Unit), a CPU (Central Processing Unit), an FPGA (Field Programmable Gate Array), a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an SOC (System on Chip), and the like. The integrated circuit or chip may execute executable instructions (or code) to implement the device control method. The executable instructions may be stored in the integrated circuit or chip, or may be obtained from another device or apparatus; for example, the integrated circuit or chip may include a processor, a memory, and an interface for communicating with other devices. The executable instructions may be stored in the memory and, when executed by the processor, implement the device control method described above; alternatively, the integrated circuit or chip may receive the executable instructions through the interface and transmit them to the processor for execution to implement the device control method.
Referring to fig. 3, fig. 3 is a functional block diagram of a vehicle 600 according to an exemplary embodiment. The vehicle 600 may be configured in a fully or partially autonomous driving mode. For example, the vehicle 600 may acquire environmental information around the vehicle through the sensing system 620 and derive an automatic driving strategy based on an analysis of the surrounding environmental information to implement fully automatic driving, or present the analysis results to the user to implement partially automatic driving.
Vehicle 600 may include various subsystems such as infotainment system 610, perception system 620, decision control system 630, drive system 640, and computing platform 650. Alternatively, vehicle 600 may include more or fewer subsystems, and each subsystem may include multiple components. In addition, each of the sub-systems and components of the vehicle 600 may be interconnected by wire or wirelessly.
In some embodiments, the infotainment system 610 may include a communication system 611, an entertainment system 612, and a navigation system 613.
The communication system 611 may comprise a wireless communication system that communicates wirelessly with one or more devices, either directly or via a communication network. For example, the wireless communication system may use 3G cellular communication such as CDMA, EVDO, or GSM/GPRS, 4G cellular communication such as LTE, or 5G cellular communication. The wireless communication system may communicate with a Wireless Local Area Network (WLAN) using WiFi. In some embodiments, the wireless communication system may communicate directly with a device using an infrared link, Bluetooth, or ZigBee. Other wireless protocols are also possible, such as various vehicular communication systems; for example, the wireless communication system may include one or more Dedicated Short Range Communications (DSRC) devices, which may carry public and/or private data communications between vehicles and/or roadside stations.
The entertainment system 612 may include a display device, a microphone, and a speaker. Based on the entertainment system, a user may listen to broadcasts or play music in the car; alternatively, a mobile phone may be connected to the vehicle so that its screen is projected onto the display device. The display device may be touch-sensitive, and the user may operate it by touching the screen.
In some cases, the voice signal of the user may be captured by a microphone, and certain control of the vehicle 600 by the user, such as adjusting the temperature in the vehicle, etc., may be implemented according to the analysis of the voice signal of the user. In other cases, music may be played to the user through a stereo.
The navigation system 613 may include a map service provided by a map provider to provide navigation of a route of travel for the vehicle 600, and the navigation system 613 may be used in conjunction with a global positioning system 621 and an inertial measurement unit 622 of the vehicle. The map service provided by the map provider can be a two-dimensional map or a high-precision map.
The sensing system 620 may include several sensors that sense information about the environment surrounding the vehicle 600. For example, the sensing system 620 may include a global positioning system 621 (the global positioning system may be a GPS system, a beidou system or other positioning system), an Inertial Measurement Unit (IMU) 622, a laser radar 623, a millimeter wave radar 624, an ultrasonic radar 625, and a camera 626. The sensing system 620 may also include sensors of internal systems of the monitored vehicle 600 (e.g., an in-vehicle air quality monitor, a fuel gauge, an oil temperature gauge, etc.). Sensor data from one or more of these sensors may be used to detect the object and its corresponding characteristics (position, shape, orientation, velocity, etc.). Such detection and identification is a critical function of the safe operation of the vehicle 600.
Global positioning system 621 is used to estimate the geographic location of vehicle 600.
The inertial measurement unit 622 is used to sense a pose change of the vehicle 600 based on the inertial acceleration. In some embodiments, the inertial measurement unit 622 may be a combination of an accelerometer and a gyroscope.
Lidar 623 utilizes laser light to sense objects in the environment in which vehicle 600 is located. In some embodiments, lidar 623 may include one or more laser sources, laser scanners, and one or more detectors, among other system components.
The millimeter-wave radar 624 utilizes radio signals to sense objects within the surrounding environment of the vehicle 600. In some embodiments, in addition to sensing objects, the millimeter-wave radar 624 may also be used to sense the speed and/or heading of objects.
The ultrasonic radar 625 may sense objects around the vehicle 600 using ultrasonic signals.
The camera 626 is used to capture image information of the surroundings of the vehicle 600. The image capturing device 626 may include a monocular camera, a binocular camera, a structured light camera, a panoramic camera, and the like, and the image information acquired by the image capturing device 626 may include still images or video stream information.
The decision control system 630 includes a computing system 631 that makes analytical decisions based on the information acquired by the sensing system 620. The decision control system 630 further includes a vehicle control unit 632 that controls the powertrain of the vehicle 600, as well as a steering system 633, a throttle 634, and a brake system 635 for controlling the vehicle 600.
The computing system 631 may operate to process and analyze the various information acquired by the perception system 620 to identify objects, and/or features in the environment surrounding the vehicle 600. The targets may include pedestrians or animals, and the objects and/or features may include traffic signals, road boundaries, and obstacles. The computing system 631 may use object recognition algorithms, structure From Motion (SFM) algorithms, video tracking, and the like. In some embodiments, the computing system 631 may be used to map an environment, track objects, estimate the speed of objects, and so on. The computing system 631 may analyze the various information obtained and derive a control strategy for the vehicle.
The vehicle controller 632 may be used to perform coordinated control on the power battery and the engine 641 of the vehicle to improve the power performance of the vehicle 600.
The steering system 633 is operable to adjust the heading of the vehicle 600. For example, in one embodiment, the steering system 633 may be a steering wheel system.
The throttle 634 is used to control the operating speed of the engine 641 and thus the speed of the vehicle 600.
The brake system 635 is used to control the deceleration of the vehicle 600. The braking system 635 may use friction to slow the wheel 644. In some embodiments, the braking system 635 may convert kinetic energy of the wheels 644 to electrical current. The braking system 635 may also take other forms to slow the rotational speed of the wheels 644 to control the speed of the vehicle 600.
The drive system 640 may include components that provide powered motion to the vehicle 600. In one embodiment, the drive system 640 may include an engine 641, an energy source 642, a transmission 643, and wheels 644. The engine 641 may be an internal combustion engine, an electric motor, an air compression engine, or other types of engine combinations, such as a hybrid engine consisting of a gasoline engine and an electric motor, a hybrid engine consisting of an internal combustion engine and an air compression engine. The engine 641 converts the energy source 642 into mechanical energy.
Examples of energy sources 642 include gasoline, diesel, other petroleum-based fuels, propane, other compressed gas-based fuels, ethanol, solar panels, batteries, and other sources of electrical power. The energy source 642 may also provide energy to other systems of the vehicle 600.
The transmission 643 may transmit mechanical power from the engine 641 to the wheels 644. The transmission 643 may include a gearbox, a differential, and a drive shaft. In one embodiment, the transmission 643 may also include other devices, such as clutches. Wherein the drive shaft may include one or more axles that may be coupled to one or more wheels 644.
Some or all of the functionality of the vehicle 600 is controlled by the computing platform 650. Computing platform 650 can include at least one processor 651, and processor 651 can execute instructions 653 stored in a non-transitory computer-readable medium, such as memory 652. In some embodiments, the computing platform 650 may also be a plurality of computing devices that control individual components or subsystems of the vehicle 600 in a distributed manner.
The processor 651 may be any conventional processor, such as a commercially available CPU. Alternatively, the processor 651 may also include a processor such as a Graphics Processor Unit (GPU), a Field Programmable Gate Array (FPGA), a System On Chip (SOC), an Application Specific Integrated Circuit (ASIC), or a combination thereof. Although fig. 3 functionally illustrates a processor, memory, and other elements of a computer in the same block, those skilled in the art will appreciate that the processor, computer, or memory may actually comprise multiple processors, computers, or memories that may or may not be stored within the same physical housing. For example, the memory may be a hard drive or other storage medium located in a different enclosure than the computer. Thus, reference to a processor or computer will be understood to include reference to a collection of processors or computers or memories that may or may not operate in parallel. Rather than using a single processor to perform the steps described herein, some components, such as the steering component and the retarding component, may each have their own processor that performs only computations related to the component-specific functions.
In the disclosed embodiment, the processor 651 may perform the device control method described above.
In various aspects described herein, the processor 651 may be located remotely from the vehicle and in wireless communication with the vehicle. In other aspects, some of the processes described herein are executed on a processor disposed within the vehicle and others are executed by a remote processor, including taking the steps necessary to execute a single maneuver.
In some embodiments, the memory 652 may contain instructions 653 (e.g., program logic), which instructions 653 may be executed by the processor 651 to perform various functions of the vehicle 600. Memory 652 may also contain additional instructions, including instructions to send data to, receive data from, interact with, and/or control one or more of infotainment system 610, perception system 620, decision control system 630, drive system 640.
In addition to instructions 653, memory 652 may store data such as road maps, route information, the location, direction, speed of the vehicle, and other such vehicle data, as well as other information. Such information may be used by the vehicle 600 and the computing platform 650 during operation of the vehicle 600 in autonomous, semi-autonomous, and/or manual modes.
The computing platform 650 may control functions of the vehicle 600 based on inputs received from various subsystems (e.g., the drive system 640, the perception system 620, and the decision control system 630). For example, computing platform 650 may utilize input from decision control system 630 in order to control steering system 633 to avoid obstacles detected by sensing system 620. In some embodiments, the computing platform 650 is operable to provide control over many aspects of the vehicle 600 and its subsystems.
Optionally, one or more of these components described above may be mounted or associated separately from the vehicle 600. For example, the memory 652 may exist partially or completely separate from the vehicle 600. The above components may be communicatively coupled together in a wired and/or wireless manner.
Optionally, the above components are only an example, in an actual application, components in the above modules may be added or deleted according to an actual need, and fig. 3 should not be construed as limiting the embodiment of the present disclosure.
An autonomous automobile traveling on a roadway, such as the vehicle 600 above, may identify objects within its surrounding environment to determine an adjustment to its current speed. The objects may be other vehicles, traffic control devices, or other types of objects. In some examples, each identified object may be considered independently, and its characteristics, such as its current speed, acceleration, and separation from the vehicle, may be used to determine the speed to which the autonomous vehicle should adjust.
Optionally, the vehicle 600 or a sensing and computing device associated with the vehicle 600 (e.g., the computing system 631 or the computing platform 650) may predict the behavior of an identified object based on the characteristics of the object and the state of the surrounding environment (e.g., traffic, rain, ice on the road, etc.). Optionally, since the behaviors of the identified objects depend on one another, all identified objects may also be considered together when predicting the behavior of a single identified object. The vehicle 600 can adjust its speed based on the predicted behavior of the identified objects. In other words, the autonomous vehicle can determine, based on the predicted behavior of the objects, to which state it needs to adjust (e.g., accelerate, decelerate, or stop). In this process, other factors may also be considered to determine the speed of the vehicle 600, such as the lateral position of the vehicle 600 on the road being traveled, the curvature of the road, and the proximity of static and dynamic objects.
In addition to providing instructions to adjust the speed of the autonomous vehicle, the computing device may also provide instructions to modify the steering angle of the vehicle 600 to cause the autonomous vehicle to follow a given trajectory and/or maintain a safe lateral and longitudinal distance from objects in the vicinity of the autonomous vehicle (e.g., vehicles in adjacent lanes on the road).
The vehicle 600 may be any type of vehicle, such as a car, a truck, a motorcycle, a bus, a boat, an airplane, a helicopter, a recreational vehicle, a train, etc., and the embodiment of the present disclosure is not particularly limited.
In another exemplary embodiment, a computer program product is also provided, which comprises a computer program executable by a programmable apparatus, the computer program having code portions for performing the above-mentioned device control method when executed by the programmable apparatus.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (11)

1. An apparatus control method, characterized in that the method comprises:
transmitting a preset first ultrasonic signal in a space in a vehicle;
for each seat within the vehicle, determining a second ultrasonic signal corresponding to the first ultrasonic signal that was picked up by that seat;
determining channel impulse response characterization information corresponding to each seat according to the second ultrasonic signal, wherein the channel impulse response characterization information is used for reflecting whether a propagation path of the ultrasonic signal is influenced or not;
determining, according to the channel impulse response characterization information, a target seat whose occupant has performed a control action;
and controlling at least one of the devices associated with the target seat according to the control action corresponding to the target seat.
2. The method of claim 1, wherein each of said seats is provided with a signal acquisition device;
the determining the second ultrasonic signal picked up by the seat corresponding to the first ultrasonic signal comprises:
acquiring an initial ultrasonic signal acquired by signal acquisition equipment of the seat;
performing band-pass filtering processing on the initial ultrasonic signal to obtain a processing signal;
demodulating the processed signal to obtain a baseband signal;
determining a target beam matrix corresponding to the seat;
and determining the second ultrasonic wave signal according to the baseband signal and the target beam matrix.
3. The method of claim 2, wherein the first ultrasonic signal is emitted by a signal emitting device within a vehicle;
the determining a target beam matrix corresponding to the seat includes:
determining the relative position between the signal acquisition device of the seat and the signal emitting device;
and determining a target beam matrix corresponding to the seat according to the relative position.
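One common way to build such a beam matrix from the relative position, shown here purely as an assumed example, is a narrowband delay-and-sum steering vector pointed from the seat's microphone array toward the signal emitting device; the array geometry, look-direction convention, and speed of sound below are illustrative:

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s, assumed cabin conditions

def steering_matrix(mic_positions, speaker_position, freq_hz):
    """mic_positions: (num_mics, 3) coordinates of the seat's array elements (m).
    speaker_position: (3,) coordinates of the in-cabin signal emitting device (m).
    Returns a (1, num_mics) narrowband delay-and-sum beam matrix."""
    mic_positions = np.asarray(mic_positions, dtype=float)
    speaker_position = np.asarray(speaker_position, dtype=float)
    direction = speaker_position - mic_positions.mean(axis=0)
    direction /= np.linalg.norm(direction)
    # Per-microphone propagation delay along the look direction.
    delays = mic_positions @ direction / SPEED_OF_SOUND
    weights = np.exp(-2j * np.pi * freq_hz * delays) / len(mic_positions)
    return weights[np.newaxis, :]
```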
4. The method of claim 1, wherein determining channel impulse response characterization information corresponding to each seat from the second ultrasonic signal comprises:
for each of the seats, performing the following operations:
determining a channel impulse response vector corresponding to the seat according to the first ultrasonic signal and the second ultrasonic signal corresponding to the seat;
and determining intensity change information of the channel impulse response vector according to the channel impulse response vector corresponding to the seat, wherein the intensity change information is used as the channel impulse response characterization information corresponding to the seat.
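A minimal sketch of one way to realize claim 4, assuming the channel impulse response vector is estimated by cross-correlating the second ultrasonic signal with the known first ultrasonic signal and that the intensity change information is the frame-to-frame difference of the CIR magnitude; both choices are assumptions made for illustration:

```python
import numpy as np

def cir_vector(tx, rx, num_taps=64):
    """Estimate a channel impulse response vector of up to `num_taps` taps by cross-correlation."""
    full = np.correlate(rx, tx, mode="full")
    peak = np.argmax(np.abs(full))           # align to the direct path
    return full[peak:peak + num_taps]

def intensity_change(prev_cir, curr_cir):
    """Channel impulse response characterization information: how much the CIR magnitude
    changed between frames, reflecting whether the ultrasonic propagation path was disturbed."""
    return np.abs(curr_cir) - np.abs(prev_cir)
```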
5. The method of claim 1, wherein determining, according to the channel impulse response characterization information, a target seat occupied by a user performing a control action comprises:
determining a recognition result corresponding to each seat by utilizing a pre-trained recognition model according to the channel impulse response characterization information, wherein the recognition result is used for indicating a control action type corresponding to each seat, and the control action type is one of multiple preset control actions or no control action;
and determining, as the target seat, a seat for which the control action type indicated by the recognition result is a preset control action.
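Claim 5 can be pictured as a per-seat classifier over the channel impulse response characterization features; the sketch below assumes a pre-trained model exposing a scikit-learn-style predict() method and a hypothetical "none" label for the no-control-action case:

```python
import numpy as np

def recognize_target_seats(model, seat_features, no_action_label="none"):
    """seat_features: dict seat id -> CIR characterization feature vector.
    model: a pre-trained recognition model with a predict() method.
    Returns the seats whose recognized control action type is a preset control action."""
    results = {sid: model.predict(np.asarray(feat)[np.newaxis, :])[0]
               for sid, feat in seat_features.items()}
    return {sid: label for sid, label in results.items() if label != no_action_label}
```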
6. The method of claim 5, wherein the preset control action is indicative of controlling a device of the vehicle;
the controlling at least one of the devices associated with the target seat according to the control action corresponding to the target seat includes:
determining a target preset control action corresponding to the target seat according to the recognition result of the target seat;
determining a target device targeted by the target preset control action and a target control instruction corresponding to the target preset control action;
and sending the target control instruction to the target device.
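The dispatch step of claim 6 can be pictured as a lookup from the recognized preset control action to a device and an instruction; the table contents and the send_instruction transport below are hypothetical examples, not taken from the disclosure:

```python
# Hypothetical mapping from a preset control action to (target device, target instruction).
ACTION_TABLE = {
    "raise_window": ("window_lift", "WINDOW_UP"),
    "open_sunroof": ("sunroof", "SUNROOF_OPEN"),
    "adjust_ac":    ("air_conditioner", "AC_TOGGLE"),
}

def dispatch(target_preset_action, send_instruction):
    """target_preset_action: the preset control action recognized for the target seat.
    send_instruction: callable that delivers an instruction to a vehicle device."""
    if target_preset_action not in ACTION_TABLE:
        return  # no preset control action -> nothing to control
    target_device, target_instruction = ACTION_TABLE[target_preset_action]
    send_instruction(target_device, target_instruction)
```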
7. The method according to any one of claims 1-6, wherein a speaker is provided inside the vehicle, and the first ultrasonic signal is emitted by the speaker inside the vehicle; and
each seat in the vehicle is provided with a microphone array for picking up ultrasonic signals.
8. A device control apparatus, characterized in that the apparatus comprises:
a transmitting module configured to transmit a preset first ultrasonic signal in a space in a vehicle;
a first determining module configured to determine, for each seat within the vehicle, a second ultrasonic signal corresponding to the first ultrasonic signal that was picked up by that seat;
a second determining module configured to determine, according to the second ultrasonic signal, channel impulse response characterization information corresponding to each seat, wherein the channel impulse response characterization information is used to reflect whether a propagation path of the ultrasonic signal is affected;
a third determining module configured to determine, according to the channel impulse response characterization information, a target seat occupied by a user performing a control action;
and a control module configured to control at least one of the devices associated with the target seat according to the control action corresponding to the target seat.
9. A vehicle, characterized by comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to execute the instructions in the memory to implement the steps of the method of any one of claims 1-7.
10. A computer-readable storage medium, on which computer program instructions are stored, which program instructions, when executed by a processor, carry out the steps of the method according to any one of claims 1 to 7.
11. A chip comprising a processor and an interface; the processor is configured to read instructions to perform the method of any one of claims 1-7.
CN202211043582.2A 2022-08-29 2022-08-29 Equipment control method, device, vehicle, medium and chip Pending CN115447506A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211043582.2A CN115447506A (en) 2022-08-29 2022-08-29 Equipment control method, device, vehicle, medium and chip

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211043582.2A CN115447506A (en) 2022-08-29 2022-08-29 Equipment control method, device, vehicle, medium and chip

Publications (1)

Publication Number Publication Date
CN115447506A true CN115447506A (en) 2022-12-09

Family

ID=84301601

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211043582.2A Pending CN115447506A (en) 2022-08-29 2022-08-29 Equipment control method, device, vehicle, medium and chip

Country Status (1)

Country Link
CN (1) CN115447506A (en)

Similar Documents

Publication Publication Date Title
CN113596705B (en) Sound production device control method, sound production system and vehicle
CN115042821B (en) Vehicle control method, vehicle control device, vehicle and storage medium
CN115123257B (en) Pavement deceleration strip position identification method and device, vehicle, storage medium and chip
CN115100377A (en) Map construction method and device, vehicle, readable storage medium and chip
CN115205365A (en) Vehicle distance detection method and device, vehicle, readable storage medium and chip
CN114828131B (en) Communication method, medium, vehicle-mounted communication system, chip and vehicle
CN114842440B (en) Automatic driving environment sensing method and device, vehicle and readable storage medium
CN115056784B (en) Vehicle control method, device, vehicle, storage medium and chip
CN115222791B (en) Target association method, device, readable storage medium and chip
CN114782638B (en) Method and device for generating lane line, vehicle, storage medium and chip
CN115051723A (en) Vehicle-mounted antenna device, vehicle-mounted remote communication terminal, vehicle-mounted communication system and vehicle
CN115205848A (en) Target detection method, target detection device, vehicle, storage medium and chip
CN115334111A (en) System architecture, transmission method, vehicle, medium and chip for lane recognition
CN115100630A (en) Obstacle detection method, obstacle detection device, vehicle, medium, and chip
CN115221151A (en) Vehicle data transmission method and device, vehicle, storage medium and chip
CN115042814A (en) Traffic light state identification method and device, vehicle and storage medium
CN115447506A (en) Equipment control method, device, vehicle, medium and chip
CN115407344B (en) Grid map creation method, device, vehicle and readable storage medium
CN115179930B (en) Vehicle control method and device, vehicle and readable storage medium
CN115082886B (en) Target detection method, device, storage medium, chip and vehicle
CN115297434B (en) Service calling method and device, vehicle, readable storage medium and chip
CN115063639B (en) Model generation method, image semantic segmentation device, vehicle and medium
CN114572219B (en) Automatic overtaking method and device, vehicle, storage medium and chip
CN114771514B (en) Vehicle running control method, device, equipment, medium, chip and vehicle
CN114822216B (en) Method and device for generating parking space map, vehicle, storage medium and chip

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination