Detailed Description
The present application will be described in further detail with reference to the following drawings and examples.
The sensing device provided by the embodiment of the application can be arranged in the terminal. Illustratively, the terminal may be a terminal device such as a motor vehicle, a drone, a rail car, or a bicycle. The terminal may comprise a motor vehicle, such as a smart internet vehicle. When the sensing device is arranged in the terminal, the interaction between the passenger and the terminal can be realized, and the intellectualization of the terminal is realized. The following description will be given taking an example in which the terminal includes a vehicle.
The sensing device of the vehicle in the embodiment of the present application may be provided in a vehicle having a sensing function, or in other in-vehicle devices having a sensing function. Such other devices include, but are not limited to: vehicle-mounted terminals, vehicle-mounted controllers, vehicle-mounted modules, vehicle-mounted components, and other sensors such as vehicle-mounted radars or vehicle-mounted cameras. The sensing device in the embodiment of the application can also be arranged, besides in a vehicle, in other intelligent terminals with a sensing function, or in components of such intelligent terminals. The intelligent terminal may be intelligent transportation equipment, intelligent household equipment, a robot, and the like.
It should be noted that the terms "first", "second", and the like in the description and claims of the present application and the accompanying drawings are used for distinguishing different objects, and are not used for limiting a specific order. The following embodiments of the present application may be implemented individually, or in combination with each other, and the embodiments of the present application are not limited specifically.
The sensing device of the vehicle in the embodiment of the present application includes a skin material layer. The layer of skin material is intended to constitute at least a part of the surface of one or more components of the vehicle and is intended to be touched by an occupant of the vehicle; specifically, the skin material layer 110 may be a skin material layer of a vehicle interior. Illustratively, the skin material layer 110 may be a skin material layer in interior trim such as seats, windows, displays, steering wheels, and lights. The material of the skin material layer 110 may be glass, plastic, wood, leather, fabric, metal, and the like. Fig. 1 is a schematic structural diagram of a skin material layer provided in the present application. As shown in fig. 1, fig. 1 exemplarily shows that the skin material layer 110 may be a skin material layer of a steering wheel, a skin material layer of an instrument display screen, a skin material layer of a vehicle head interior, a skin material layer of a central control, and the like.
Fig. 2 is a schematic structural diagram of a sensing device according to an embodiment of the present application. As shown in fig. 2, the sensing device includes a skin material layer 110, a first sensing unit 120, a second sensing unit 130, and a control unit 140; the first sensing unit 120 and the second sensing unit 130 are on the side of the skin material layer 110 facing the interior of one or more components of the vehicle. The first sensing unit 120 and the second sensing unit 130 are electrically connected to each other, and at least one of the first sensing unit 120 and the second sensing unit 130 is connected to the control unit 140.
The first sensing unit 120 is used for sensing a first interaction signal from an occupant;
the second sensing unit 130 is used for sensing a second interaction signal from the occupant;
the control unit 140 is configured to generate a third interactive signal in response to the first interactive signal and the second interactive signal, and send a control signal to a component 150 corresponding to the third interactive signal; the control signal is used to control the component 150 to perform an action corresponding to the control signal.
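The signal flow just described, in which two sensed interaction signals are combined into a third interaction signal that selects a component and a control action, can be sketched as follows. This is a minimal illustration only; the class and signal names (Component, ControlUnit, the string labels) are hypothetical, and the application does not prescribe any particular implementation.

```python
# Minimal sketch of the control unit's role; all names here are hypothetical.

class Component:
    """Stand-in for a vehicle component 150 (e.g. a window actuator)."""

    def __init__(self, name):
        self.name = name
        self.last_action = None

    def perform(self, action):
        # The component executes the action carried by the control signal.
        self.last_action = action


class ControlUnit:
    """Generates a third interaction signal from the first and second signals
    and sends a control signal to the corresponding component."""

    def __init__(self, mapping):
        # mapping: (first_signal, second_signal) -> (component, action)
        self.mapping = mapping

    def on_signals(self, first_signal, second_signal):
        third_signal = (first_signal, second_signal)   # combined signal
        if third_signal not in self.mapping:
            return None                                # no matching interaction
        component, action = self.mapping[third_signal]
        component.perform(action)                      # control signal -> action
        return third_signal
```

For example, a slide sensed by one unit together with a position sensed by the other could map to opening a particular window.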
In some embodiments, the at least one sensing component included in the first sensing unit 120 and the at least one sensing component included in the second sensing unit 130 are different types of sensing components; the sensing component may sense the interaction signal.
When the first sensing unit 120 and the second sensing unit 130 have different types of sensing components, the first sensing unit 120 and the second sensing unit 130 can sense different types of interaction signals.
For example, when the first interaction signal is a touch-type interaction signal, the sensing part in the first sensing unit 120 may sense the touch-type interaction signal. When the second interactive signal is a gesture type interactive signal, the sensing part in the second sensing unit 130 may sense the gesture type interactive signal. The first sensing unit 120 and the second sensing unit 130 are electrically connected, so that signal transmission can be realized between the first sensing unit 120 and the second sensing unit 130.
For example, after the first sensing unit 120 senses the first interaction signal, the first interaction signal may be transmitted to the second sensing unit 130; alternatively, the second sensing unit 130 senses a second interaction signal, and may transmit the second interaction signal to the first sensing unit 120.
In some embodiments, the control unit 140 may be a domain control module of the vehicle (for example, a domain control module of the smart cabin domain, or a domain control module of another domain), a central control unit (CDC), or a control module for controlling the first sensing unit and the second sensing unit, which is not limited herein. The first sensing unit 120 and the second sensing unit 130 may be respectively connected to the control unit 140, and in this case the first sensing unit 120 and the second sensing unit 130 may respectively provide the control unit 140 with the first interaction signal and the second interaction signal.
In a possible implementation manner, the first sensing unit 120 and the second sensing unit 130 are connected in series and then connected to the control unit 140. At this time, the first sensing unit 120 provides the control unit 140 with a first interaction signal through the second sensing unit 130, and the second sensing unit 130 directly provides the control unit 140 with a second interaction signal;
in another possible implementation manner, the second sensing unit 130 provides the second interaction signal to the control unit 140 through the first sensing unit 120, and the first sensing unit 120 directly provides the first interaction signal to the control unit 140.
During interaction in the vehicle interior, the occupant issues an interaction signal as needed, and the first sensing unit 120 activates the second sensing unit 130 according to the interaction signal.
For example, when the occupant sends a first interaction signal, the first sensing unit 120 senses the first interaction signal, and the second sensing unit 130 forms a second interaction signal according to the first interaction signal; the control unit 140 generates a third interaction signal in response to the first interaction signal and the second interaction signal, and sends a control signal to the component 150 corresponding to the third interaction signal; the control signal controls the component 150 to perform an action corresponding to the control signal.
For another example, when the occupant sends a second interaction signal, the second sensing unit 130 senses the second interaction signal, and the first sensing unit 120 forms a first interaction signal according to the second interaction signal; the control unit 140 generates a third interaction signal in response to the first interaction signal and the second interaction signal, and sends a control signal to the component 150 corresponding to the third interaction signal; the control signal controls the component 150 to perform an action corresponding to the control signal.
The different interaction signals are simultaneously sensed by the first sensing unit 120 and the second sensing unit 130, so that the sensing device can sense various interaction signals, and further, the sensing device can support richer human-vehicle interaction modes and application scenes, and the intelligence of the vehicle is improved.
The component 150 may be any component of a vehicle, including one or more. For example, component 150 may be a mechanism that adjusts any of a seat, a window, a display screen, and a light. In addition, the component 150 may also correspond to the skin material layer 110.
Illustratively, when the skin material layer 110 is a skin material layer of a seat, the component 150 may be an actuator that adjusts the seat. When the layer of skin material 110 is that of a vehicle window, the component 150 may be an actuator for adjusting the vehicle window.
In other embodiments, when the control unit 140 forms the control signal, one interaction signal may be matched with the actions of a plurality of components 150, so as to implement multi-modal human-vehicle interaction and improve the multi-modality and intelligence of the vehicle.
For example, when the interaction signal controls an opening and closing member or a hardware function of the vehicle, the interaction may include not only operating the opening and closing member or hardware function itself, but also human-vehicle interaction such as voice prompts and display interaction inside and outside the vehicle, thereby realizing multi-modal human-vehicle interaction.
For example, the opening and closing members of the vehicle may include doors, windows, a sunroof, a tailgate, etc., and the controllable hardware functions in the vehicle may include seat sliding, seat lifting, air-conditioning wind direction, temperature, sound volume, lighting, etc.
It should be noted that fig. 2 exemplarily shows the skin material layer 110 in contact with the first sensing unit 120. In other embodiments, the skin material layer 110 may also be out of contact with the first sensing unit 120; it is sufficient that the first sensing unit 120 is on the side of the skin material layer 110 facing the interior of one or more components of the vehicle.
In one possible implementation, the first interactive signal or the second interactive signal may be any one of:
an action of the occupant performed on the surface of the skin material layer or in the space near it;
the position of such an action.
Specifically, the interaction signal (e.g., the first interaction signal, the second interaction signal, etc.) in the present application may be of a plurality of types, and the first interaction signal and the second interaction signal may be different types of interaction signals. Illustratively, the interaction signal may be obtained from an action of the occupant on the surface of the skin material layer: touching the skin material layer 110 causes a pressure change in the skin material layer 110, and the first sensing unit 120 or the second sensing unit 130 senses this pressure change, so that the first interaction signal or the second interaction signal may be formed.
Among them, the action of the occupant on the surface of the skin material layer may include touching, tapping, sliding, and the like. When the action is a touch, an interaction signal may be formed by a limb or by touching with another object. When the action is tapping, the continuity and number of taps may be set to form different interaction signals. When the action is sliding, it may be set that different directions of sliding (for example, the up-down direction or the left-right direction) form different interaction signals.
The interaction signal may also be obtained from an action of the occupant formed in the space near the skin material layer. For example, the occupant performs a preset action in the space near the skin material layer, forming an electric field change in that space, and the first sensing unit 120 or the second sensing unit 130 may sense this electric field change, so that the first interaction signal or the second interaction signal may be formed. The action formed by the occupant in the space near the skin material layer may include a gesture, and different gestures may form different interaction signals. For example, the gesture may be a slide in a different direction, or may be a fixed gesture such as "OK".
The interaction signal may also be a signal obtained by sensing the position of the action. For example, when the first interaction signal is obtained by the first sensing unit 120 sensing the action of the occupant formed in the space near the skin material layer, the second sensing unit 130 may sense the position of that action to form a second interaction signal.
For another example, when the interaction signal is the second interaction signal and the second interaction signal is obtained by the second sensing unit 130 sensing the action of the occupant formed in the space near the skin material layer, the first sensing unit 120 may sense the action position of the occupant formed in the space near the skin material layer to form the first interaction signal.
The first interaction signal and the second interaction signal can be interaction signals acquired in various forms. When the first sensing unit 120 senses the first interaction signal and the second sensing unit 130 senses the second interaction signal, the sensing device can sense various interaction signals, and can thus support richer human-vehicle interaction modes and application scenes, improving the intelligence of the vehicle.
On the basis of the above technical solution, the first sensing unit and/or the second sensing unit may include an electric field sensor; the electric field sensor is used for inducing an interaction signal according to the electric field change generated by the interaction signal; the interactive signals comprise touch interactive signals and non-touch interactive signals.
In some embodiments, the electric field sensor may sense a change in the electric field in the surrounding area and thereby generate an electrical signal. The interaction signals can include touch interaction signals and non-touch interaction signals; a touch interaction signal may be an interaction signal generated by contact with the skin material layer, while a non-touch interaction signal may be of various types, such as a motion interaction signal. When the interaction instructions issued by the occupant differ, their influence on the electric field in the area around the electric field sensor differs, so the electrical signals output by the electric field sensor differ. The interaction instruction issued by the occupant can therefore be identified from the electrical signal output by the electric field sensor, and in the subsequent process the control unit controls the component to act according to the interaction signal, so that human-vehicle interaction is realized.
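As a rough illustration of the idea that different interactions perturb the field differently and can therefore be told apart from the sensor's electrical output, the sketch below classifies a single field-change amplitude. The thresholds and the one-dimensional reading are assumptions for illustration only, not part of the application.

```python
# Hypothetical classifier over an electric field sensor's output amplitude.
# Real thresholds depend on the sensor and would likely use richer features.

def classify_interaction(amplitude, touch_threshold=0.8, noise_floor=0.1):
    """Return 'touch', 'non-touch', or None for a field-change amplitude."""
    if amplitude >= touch_threshold:
        return "touch"        # contact with the skin material layer
    if amplitude > noise_floor:
        return "non-touch"    # e.g. a gesture in the nearby space
    return None               # no interaction detected
```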
Fig. 3 is a schematic structural diagram of another sensing device according to an embodiment of the present disclosure. As shown in fig. 3, the first sensing unit 120 includes a touch film layer 121; the touch film layer 121 is fixedly connected to the skin material layer 110, and the touch film layer 121 is used for sensing a first interaction signal from the skin material layer 110.
Specifically, when the first interaction signal is generated by an action of the occupant formed on the surface of the skin material layer 110, the touch film layer 121 is fixedly connected to the skin material layer 110, so that a force generated by the first interaction signal on the skin material layer 110 is transmitted to the touch film layer 121, the touch film layer 121 may generate a touch signal according to the force, and the touch signal may include an action content and an action position of the first interaction signal.
In some embodiments, the touch signal may be transmitted to the control unit 140, the control unit 140 performs matching according to the touch signal and a pre-stored interaction signal, generates a third interaction signal after the touch signal and the pre-stored interaction signal are successfully matched, and sends a control signal to a component 150 corresponding to the third interaction signal, and the control signal controls the component 150 to execute an action corresponding to the control signal.
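The matching step above can be pictured as a lookup of the sensed touch signal against pre-stored interaction signals; the stored entries and signal encodings below are invented for illustration.

```python
# Hypothetical pre-stored interaction signals: (action, position) -> third signal.
PRESTORED_SIGNALS = {
    ("double_tap", "armrest"): "adjust_seat",
    ("slide_up", "door_trim"): "open_window",
}

def match_touch_signal(action, position):
    """Return the third interaction signal if the touch signal matches, else None."""
    return PRESTORED_SIGNALS.get((action, position))
```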
In some embodiments, the touch signal may be further transmitted to the second sensing unit 130, the second sensing unit 130 may form a second interaction signal according to the touch signal, the second interaction signal may include information of an action position of the occupant formed on the surface of the skin material layer 110, the second interaction signal is transmitted to the control unit 140, and the control unit 140 may improve accuracy of the touch signal according to the second interaction signal, thereby improving reliability and accuracy of human-vehicle interaction.
Illustratively, the touch film layer may include a touch layer and in-film electronic circuitry; for example, the touch layer is a capacitor film layer, and the capacitor film layer may include a plurality of capacitors arranged in an array.
When the first interaction signal is formed by an action of the occupant on the surface of the skin material layer 110 and the first interaction command acts on the capacitance film layer, the capacitance corresponding to the touch position in the capacitance film layer changes. The capacitance change information is transmitted to the in-film electronic circuit, which can determine the content and position of the first interaction signal from the position and magnitude of the capacitance change in the capacitance film layer, thereby forming the touch signal.
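One way to picture how the in-film circuit could recover position information from the array of capacitances is the sketch below: each cell is compared against its baseline, and the largest change above a threshold is reported. The array shape, baseline comparison, and threshold are assumptions, not details from the application.

```python
# Hypothetical touch localization over a capacitance array.

def locate_touch(baseline, measured, threshold=0.5):
    """Return (row, col, delta) of the strongest capacitance change, or None."""
    best = None
    best_delta = threshold
    for r, (b_row, m_row) in enumerate(zip(baseline, measured)):
        for c, (b, m) in enumerate(zip(b_row, m_row)):
            delta = abs(m - b)
            if delta > best_delta:          # keep the largest above-threshold change
                best = (r, c, delta)
                best_delta = delta
    return best
```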
Fig. 4 is a schematic structural diagram of another sensing device according to an embodiment of the present disclosure. As shown in fig. 4, in other embodiments, the second sensing unit 130 may include a touch film layer 121; the touch film layer 121 is fixedly connected to the skin material layer 110, and the touch film layer 121 is used for sensing a second interaction signal from the skin material layer 110.
Specifically, the touch film layer 121 is disposed in the second sensing unit 130, and the touch film layer 121 is fixedly connected to the skin material layer 110, so that a force generated by the second interaction signal on the skin material layer 110 is transmitted to the touch film layer 121, the touch film layer 121 can form a touch signal according to the force, and the touch signal can include an action content and an action position of the second interaction signal.
In some embodiments, the touch signal may be transmitted to the control unit 140, and the control unit 140 performs matching according to the touch signal and a pre-stored interaction signal, generates a third interaction signal after the matching is successful, and sends a control signal to the component 150 corresponding to the third interaction signal; the control signal controls the component 150 to perform an action corresponding to the control signal.
In some embodiments, the touch signal may be further transmitted to the first sensing unit 120, the first sensing unit 120 may form a first interaction signal according to the touch signal, the first interaction signal may include motion position information of the occupant formed on the surface of the skin material layer 110, the first interaction signal is transmitted to the control unit 140, and the control unit 140 may improve accuracy of the touch signal according to the first interaction signal, thereby improving reliability and accuracy of human-vehicle interaction.
In other embodiments, the first sensing unit and the second sensing unit may each include a touch film layer, so that the first sensing unit and the second sensing unit can simultaneously form touch signals according to the action of the occupant on the surface of the skin material layer 110. On the basis that the control unit 140 forms a third interaction signal according to the touch signals, sends a control signal to the component 150 corresponding to the third interaction signal, and the control signal controls the component 150 to execute the corresponding action, the reliability and accuracy of human-vehicle interaction can be improved.
Fig. 5 is a schematic structural diagram of another sensing device according to an embodiment of the present disclosure. As shown in fig. 5, the material of the skin material layer 110 is a transparent material, at least one of the first sensing unit 120 and the second sensing unit 130 includes a vision sensor 122 as a sensing component, an output end of the vision sensor 122 is electrically connected to the control unit 140, and the actions of the occupant include: a posture action of the occupant;
the vision sensor 122 generates a first interaction signal or a second interaction signal in response to the occupant's gestural actions.
In some embodiments, fig. 5 exemplarily shows that the first sensing unit 120 includes the vision sensor 122. The material of the skin material layer 110 may be a transparent material, for example glass. When the material of the skin material layer 110 is transparent, the posture action of the occupant is visible to the vision sensor 122; that is, the vision sensor 122 may directly acquire the content of the posture action through the skin material layer 110, so that an interaction signal may be formed according to the posture action.
When the sensing part in the first sensing unit 120 includes the vision sensor 122, the vision sensor 122 in the first sensing unit 120 may sense the posture action of the occupant to generate a first interaction signal.
When the sensing part in the second sensing unit 130 includes the vision sensor 122, the vision sensor 122 in the second sensing unit 130 may sense the posture action of the occupant to generate a second interaction signal.
For example, the posture action may be a gesture: a gesture image may be collected by the vision sensor 122, the content of the gesture determined from the image, and the gesture recognized according to that content; in the subsequent process, the control unit controls the component action according to the gesture, so that human-vehicle interaction is achieved.
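A crude version of the "determine the content of the gesture from the image" step, reduced to tracking a hand centroid across two frames, might look like the sketch below. The pixel-shift threshold is an assumption, and a real device would use a trained recognizer rather than this heuristic.

```python
# Hypothetical gesture inference from hand-centroid x positions in two frames.

def infer_swipe(prev_x, curr_x, min_shift=20):
    """Classify a horizontal swipe gesture from the centroid shift in pixels."""
    shift = curr_x - prev_x
    if shift >= min_shift:
        return "swipe_right"
    if shift <= -min_shift:
        return "swipe_left"
    return None   # no clear gesture
```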
In some embodiments, the first sensing unit and/or the second sensing unit may further include an electric field sensor and a visual sensor, so that not only can various interaction signals be sensed, but also the sensing accuracy of the first sensing unit and/or the second sensing unit on the interaction signals can be improved through the electric field sensor and the visual sensor, so that the reliability of the first sensing unit and/or the second sensing unit can be improved, and the reliability of the sensing device can be further improved.
It should be noted that the first sensing unit and the second sensing unit may further include other sensors, for example, a pressure sensor and a light sensor, and the like, and are used to realize that the first sensing unit and the second sensing unit sense more kinds of interaction signals, so that the sensing device can support richer interaction modes and interaction scenes, and further improve the intelligence of the vehicle.
Fig. 6a is a schematic structural diagram of another sensing device according to an embodiment of the present disclosure. As shown in fig. 6a, the sensing device further includes a display module 160, the material of the skin material layer 110 is a transparent material, the display module 160 is disposed between the skin material layer 110 and the first sensing unit 120, or the display module 160 is disposed between the skin material layer 110 and the second sensing unit 130, and the display module 160 is electrically connected to at least one of the first sensing unit 120, the second sensing unit 130, or the control unit 140.
Specifically, fig. 6a exemplarily shows that the display module 160 is disposed between the skin material layer 110 and the first sensing unit 120, and is electrically connected with the first sensing unit 120. The display module 160 has a display function. When the material of the skin material layer 110 is transparent, the information displayed by the display module 160 can be transmitted to the passenger through the skin material layer 110, so as to remind the passenger of the current interaction state.
In a possible implementation manner, the display module 160 is configured to display information corresponding to the first interaction signal; for example, when the display module 160 is electrically connected to the first sensing unit 120, the display module 160 may receive a first interaction signal sensed by the first sensing unit 120 and display information corresponding to the first interaction signal according to the first interaction signal.
In another possible implementation manner, the display module 160 is configured to display information corresponding to the second interaction signal; for example, when the display module 160 is electrically connected to the second sensing unit 130, the display module 160 may receive a second interaction signal sensed by the second sensing unit 130 and display information corresponding to the second interaction signal according to the second interaction signal.
In another possible implementation manner, the display module 160 is configured to display information corresponding to the third interactive signal. For example, when the display module 160 is electrically connected to the control unit 140, the display module 160 may receive a third interactive signal generated by the control unit 140 and display information corresponding to the third interactive signal according to the third interactive signal.
In another possible implementation manner, the display module 160 is electrically connected to at least two of the first sensing unit 120, the second sensing unit 130 and the control unit 140, and can display corresponding information according to at least two of the first interaction signal, the second interaction signal and the third interaction signal.
In another possible implementation manner, when the display module 160 is connected to the control unit 140, the display module 160 may further receive a control signal generated by the control unit 140, and display information corresponding to the control signal according to the control signal.
For example, in the control unit 140, based on the third interaction signal, it is determined that the information to be displayed by the display module 160 is the mail information, as shown in fig. 6b, the control unit 140 sends a control signal of the mail information to the display module 160 according to the third interaction signal, and the display module 160 may display a corresponding mail, or an icon of the mail, and the like according to the received control signal of the mail information.
A possible implementation manner is that the information corresponding to the first interactive signal, the information corresponding to the second interactive signal, or the information corresponding to the third interactive signal includes at least one of the following:
feedback information, vehicle speed information, navigation information, power information, driver status information, forward road condition information, in-vehicle temperature information, in-vehicle humidity information, tire pressure information, full vehicle information, out-vehicle temperature information, or humidity information; the feedback information is information fed back to the passenger by the sensing device according to the first interaction signal, the second interaction signal or the third interaction signal.
Specifically, the information corresponding to the interactive signal may include feedback information, and the feedback information may be preset information used for reminding the passenger of the current interactive state, where the interactive signal may include a first interactive signal, a second interactive signal, and a third interactive signal.
In some embodiments, when the display module is used for displaying the feedback information, the information fed back to the occupant by the sensing device according to the interaction signal can be displayed. Illustratively, the feedback information may be preset interaction success display information or interaction failure display information, which may include display information of the interaction signal and the sensing result for reminding the occupant of the interaction result.
When the interaction signal is used for checking the state of the vehicle, the information corresponding to the interaction signal may also include state information of the vehicle, and the display module may display the state of the vehicle according to the interaction signal. For example, on the basis of obtaining authorization for communication with each component in the vehicle and for out-of-vehicle networking, the display module may display at least one of the vehicle's speed information, navigation information, power information, driver status information, forward road condition information, in-vehicle temperature information, in-vehicle humidity information, tire pressure information, full vehicle information, out-vehicle temperature information, and humidity information according to the interaction signal.
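The dispatch described above, from an interaction signal to the piece of vehicle state to display, can be sketched as a small lookup table. The signal names and state keys below are hypothetical examples, not names from the application.

```python
# Hypothetical mapping from an interaction signal to displayed vehicle state.

INFO_SOURCES = {
    "check_speed": lambda state: f"speed: {state['speed']} km/h",
    "check_tires": lambda state: f"tire pressure: {state['tire_pressure']} kPa",
}

def render_info(interaction_signal, state):
    """Return the display string for an interaction signal, or None if unknown."""
    source = INFO_SOURCES.get(interaction_signal)
    return source(state) if source else None
```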
In addition, the display module can be arranged on any decoration in the vehicle, so that the state of the vehicle can be displayed by any decoration in the vehicle, the human-vehicle interaction scene is increased, and the intelligence of the display information in the vehicle is improved.
In some embodiments, when the display module is a touch display module, the touch display module includes a touch panel, and the touch panel can sense the action of the occupant formed on the surface of the skin material layer to form an interaction signal.
Fig. 7a is a schematic structural diagram of a sensing device according to an embodiment of the present disclosure. As shown in fig. 7a, the sensing device further includes a feedback unit 170, which is disposed between the skin material layer 110 and the first sensing unit 120. In another possible implementation, the feedback unit 170 may also be located between the skin material layer 110 and the second sensing unit 130.
The feedback unit 170 is configured to perform a feedback action in response to the first interaction signal; and/or, the feedback unit 170 is configured to perform a feedback action in response to the second interaction signal; and/or the feedback unit 170 is configured to perform a feedback action in response to the third interaction signal.
The feedback unit 170 is exemplarily shown in fig. 7a between the skin material layer 110 and the first sensing unit 120. In some embodiments, the feedback unit 170 may be connected to the first sensing unit 120 (as shown in fig. 7a), and the feedback unit 170 may respond to the first interaction signal sensed by the first sensing unit 120 and perform a feedback action for reminding the occupant of the interaction state.
In other embodiments, the feedback unit 170 may be connected to the second sensing unit 130, and the feedback unit 170 may respond to the second interaction signal sensed by the second sensing unit 130 and perform a feedback action to alert the occupant of the interaction state.
In other embodiments, the feedback unit 170 may be connected to the control unit 140, and the feedback unit 170 may respond to the third interaction signal generated by the control unit 140 and perform a feedback action for reminding the occupant of the interaction state.
In other embodiments, the feedback unit 170 may be further connected to at least two of the first sensing unit 120, the second sensing unit 130 and the control unit 140, and the feedback unit 170 may respond to at least two of the first interaction signal, the second interaction signal and the third interaction signal and perform a feedback action for reminding the occupant of the interaction state.
In other embodiments, when the feedback unit 170 is connected to the control unit 140, the feedback unit 170 may further respond to a control signal sent by the control unit 140 and perform a feedback action for reminding the occupant of the interaction state.
It should be noted that the feedback unit 170 may be sized as needed; its size may be the same as or different from that of the skin material layer 110. In addition, the feedback unit 170 may be between the skin material layer 110 and the first sensing unit 120, or located to one side of the skin material layer 110 in the same horizontal plane.
As shown in fig. 7b, the feedback unit 170 includes a vibration feedback component 171, the vibration feedback component 171 is electrically connected to the first sensing unit 120, the second sensing unit 130, or the control unit 140, and the vibration feedback component 171 is configured to vibrate according to the first interaction signal, the second interaction signal, or the third interaction signal, so that the feedback action is vibration.
In one possible implementation manner, as shown in fig. 7c, the feedback unit 170 may further include a voice module 172, and after the feedback unit 170 acquires the interaction signal, the voice module 172 may provide voice feedback according to the interaction signal. The feedback unit 170 may be on one side of the skin material layer 110; it is only necessary that the feedback unit 170 can obtain the interaction signal.
For example, after the first interaction signal, the second interaction signal, or the third interaction signal is transmitted to the voice module 172, the voice module 172 may remind the occupant, according to the interaction signal, that the interaction signal has been sensed successfully. When sensing of the interaction signal fails, the voice module 172 may alert the occupant, according to the interaction signal, that sensing of the interaction signal has failed.
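The behavior of the feedback unit described above can be sketched as follows. This is an illustrative model only, under the assumption that the feedback unit receives a simple signal object carrying its source and sensing result; the class and field names are hypothetical and do not appear in the specification.

```python
from dataclasses import dataclass

@dataclass
class InteractionSignal:
    source: str      # "first", "second", or "third" interaction signal
    sensed_ok: bool  # whether sensing of the signal succeeded

class FeedbackUnit:
    """Toy model of feedback unit 170: may vibrate (vibration feedback
    component 171) and/or speak (voice module 172) in response to any
    of the three interaction signals."""

    def __init__(self, vibration: bool = True, voice: bool = True):
        self.vibration = vibration
        self.voice = voice

    def on_signal(self, signal: InteractionSignal) -> list:
        """Return the feedback actions performed for this signal."""
        actions = []
        if self.vibration:
            actions.append("vibrate")
        if self.voice:
            msg = ("interaction sensed successfully"
                   if signal.sensed_ok else "interaction sensing failed")
            actions.append("speak: " + msg)
        return actions
```

A feedback unit with both components enabled would, on a successfully sensed first interaction signal, vibrate and announce success; a voice-only unit receiving a failed signal would announce the failure instead.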
Building on the above technical solutions, the skin material layer includes a tactile sensation adjusting unit; the tactile sensation adjusting unit is connected to the control unit and is used for adjusting, according to the control signal, the tactile sensation of the surface of the component formed by the skin material layer.
Specifically, the tactile sensation adjusting unit may be provided as a component of the vehicle for adjusting the tactile sensation of the skin material layer. For example, the tactile sensation adjusting unit may adjust at least one of the temperature of the skin material layer, the hardness of the skin material layer, and the frosted or smooth state of the skin material layer.
When the interaction information of the first interaction signal or the second interaction signal indicates adjusting the tactile sensation of the skin material layer, the control unit forms a third interaction signal according to the first interaction signal and the second interaction signal, and the third interaction signal corresponds to the tactile sensation adjusting unit. The control unit therefore sends a control signal to the tactile sensation adjusting unit, controlling it to adjust the tactile sensation of the skin material layer. This improves the intelligence of human-vehicle interaction and the occupant experience.
Optionally, the skin material layer may further include a form adjusting unit; the form adjusting unit is connected to the control unit and is used for adjusting, according to the control signal, the form of the surface of the component formed by the skin material layer.
In particular, the form adjusting unit may also be a component of the vehicle for adjusting the form of the skin material layer. Illustratively, the form adjusting unit may adjust the bending state, the color, and the like of the skin material layer.
When the interaction information of the first interaction signal or the second interaction signal indicates adjusting the form of the skin material layer, the control unit forms a third interaction signal according to the first interaction signal and the second interaction signal, and the third interaction signal corresponds to the form adjusting unit. The control unit therefore sends a control signal to the form adjusting unit, controlling it to adjust the form of the skin material layer. This improves the intelligence of human-vehicle interaction and the occupant experience.
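The tactile and form adjustments described above can be illustrated with a minimal sketch, assuming a control signal is a simple mapping of property names to setpoints; the property names below (temperature, hardness, texture) are drawn from the examples in the specification, while the class and key names themselves are hypothetical.

```python
class TactileAdjustingUnit:
    """Toy model of the tactile sensation adjusting unit: applies the
    temperature / hardness / surface-texture setpoints carried by a
    control signal, ignoring any properties it does not manage."""

    def __init__(self):
        # default tactile state of the skin material layer (illustrative)
        self.state = {"temperature_c": 25.0, "hardness": 0.5, "texture": "smooth"}

    def apply(self, control_signal: dict) -> dict:
        # accept only the tactile properties named in the embodiment
        for key in ("temperature_c", "hardness", "texture"):
            if key in control_signal:
                self.state[key] = control_signal[key]
        return self.state
```

Applying a control signal that requests a warmer, frosted surface would update those two properties while leaving the hardness setpoint unchanged, and any property the unit does not manage is simply ignored.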
In one possible implementation manner, the component corresponding to the third interaction signal includes at least one of the following:
a vehicle opening and closing member, an in-vehicle comfort adjusting member, and an in-vehicle and out-of-vehicle display member;
the control unit is used for sending the control signal to at least one of the vehicle opening and closing member, the in-vehicle comfort adjusting member, and the in-vehicle and out-of-vehicle display member.
Specifically, the component corresponding to the third interaction signal is the component in the vehicle that executes the interaction information of the first interaction signal and the second interaction signal.
The vehicle opening and closing member may include a door, a window, a sunroof, a tailgate, and the like of the vehicle. Because the component corresponding to the third interaction signal includes the vehicle opening and closing member, the control unit can form the third interaction signal according to the first and second interaction signals, send a control signal to the vehicle opening and closing member, and control it to execute the action corresponding to the control signal. Human-vehicle interaction with the vehicle opening and closing member can thus be realized, improving the functional intelligence of the vehicle.
The in-vehicle comfort adjusting member may include a seat sliding member, a seat lifting member, an air-conditioning direction adjusting member, a temperature adjusting member, an audio volume adjusting member, a light adjusting member, and the like. Because the component corresponding to the third interaction signal includes the in-vehicle comfort adjusting member, the control unit can form the third interaction signal according to the first and second interaction signals, send a control signal to the in-vehicle comfort adjusting member, and control it to execute the action corresponding to the control signal, thereby realizing human-vehicle interaction with the in-vehicle comfort adjusting member and improving the functional intelligence of the vehicle.
The in-vehicle and out-of-vehicle display member may include an in-vehicle screen display member, an out-of-vehicle information display member, and the like. Because the component corresponding to the third interaction signal includes the in-vehicle and out-of-vehicle display member, the control unit can form the third interaction signal according to the first and second interaction signals, send a control signal to the in-vehicle and out-of-vehicle display member, and control it to execute the action corresponding to the control signal, thereby realizing human-vehicle interaction with the display member and improving the display intelligence of the vehicle.
In other embodiments, the components corresponding to the third interaction signal include at least two of the vehicle opening and closing member, the in-vehicle comfort adjusting member, and the in-vehicle and out-of-vehicle display member, which can further improve the functional intelligence and display intelligence of the vehicle and further improve the occupant experience.
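The routing described in the preceding paragraphs can be sketched as follows. This is a hedged illustration only: the intent names and routing table are invented for the example, and the specification does not prescribe how the control unit merges the two sensed signals.

```python
class ControlUnit:
    """Toy model of control unit 140: combines the interaction
    information of the first and second interaction signals into a
    third interaction signal, then sends a control signal to the
    matching vehicle component. Intent names are illustrative."""

    ROUTES = {
        "open_window":  "vehicle opening and closing member",
        "seat_forward": "in-vehicle comfort adjusting member",
        "show_speed":   "in-vehicle and out-of-vehicle display member",
    }

    def form_third_signal(self, first: dict, second: dict) -> dict:
        # merge the interaction information of both sensed signals;
        # here the first signal's intent takes precedence if present
        intent = first.get("intent") or second.get("intent")
        return {"intent": intent, "target": self.ROUTES.get(intent)}

    def dispatch(self, third: dict) -> str:
        """Send a control signal to the component the third signal targets."""
        if third["target"] is None:
            return "no matching component"
        return "control signal -> " + third["target"]
```

For instance, first and second interaction signals whose merged intent is `"open_window"` would be routed to the vehicle opening and closing member, while an unrecognized intent matches no component.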
Fig. 8 is a schematic structural diagram of another sensing device according to an embodiment of the present disclosure. As shown in fig. 8, the sensing device further includes a power supply unit 180 and a switch unit 190, wherein the power supply unit 180 is electrically connected to at least one of the first sensing unit 120 and the second sensing unit 130 through the switch unit 190; the switch unit 190 is used for controlling the power supply unit 180 to provide the power supply signals to the first sensing unit 120 and the second sensing unit 130.
Specifically, the switching unit 190 may include a plurality of switching devices. The switch unit 190 may be a mechanical switch, and the state of the switch unit 190 may be manually controlled by an occupant. The switching unit 190 may also be a soft switch, such as a transistor, etc., in which case the state of the switching unit 190 may be controlled by an instruction input by the occupant.
In one possible implementation manner, when at least one of the first sensing unit 120 and the second sensing unit 130 is connected to the power supply unit 180 through a switching device in the switching unit 190, the first sensing unit 120 and the second sensing unit 130 may be connected in series, and then one of the first sensing unit 120 and the second sensing unit 130 is connected to the switching unit 190.
In another possible implementation manner, when at least one of the first sensing unit 120 and the second sensing unit 130 is connected to the power supply unit 180 through a switching device in the switching unit 190, the first sensing unit 120 and the second sensing unit 130 may be connected in parallel, and then both the first sensing unit 120 and the second sensing unit 130 are connected to the switching unit 190.
It should be noted that, when the first sensing unit 120 and the second sensing unit 130 are connected in parallel, the switching unit 190 can separately control the conduction state between the power supply unit 180 and the first sensing unit 120 and the conduction state between the power supply unit 180 and the second sensing unit 130 as required, and at this time, the switching unit 190 includes at least two switching devices.
When the switch unit 190 is turned on, the power signal provided by the power unit 180 can be transmitted to the first sensing unit 120 and the second sensing unit 130 through the switch unit 190, so that the first sensing unit 120 and the second sensing unit 130 can normally operate according to the command signal.
When the switch unit 190 is turned off, the power signal provided by the power supply unit 180 cannot be transmitted to the first sensing unit 120 and the second sensing unit 130 through the switch unit 190, so the first sensing unit 120 and the second sensing unit 130 cannot be powered on to operate. An occupant can thereby disable human-vehicle interaction through the first sensing unit 120 and the second sensing unit 130, which provides the occupant with more choices.
In other embodiments, the switch unit 190 controls the conduction state between the power supply unit 180 and the first sensing unit 120 and that between the power supply unit 180 and the second sensing unit 130 at the same time; in this case, the switch unit 190 may include only one switching device, so that the two conduction states are always the same.
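The two wiring options above, independent switches per sensing unit versus one shared switch, can be modeled with a short sketch; the class and method names are hypothetical and only illustrate the gating logic.

```python
class SwitchUnit:
    """Toy model of switch unit 190 gating power from power supply
    unit 180 to the two sensing units. With independent=True it holds
    one switching device per sensing unit (parallel wiring); otherwise
    a single shared switching device controls both conduction states."""

    def __init__(self, independent: bool):
        if independent:
            self.switches = {"first": False, "second": False}
        else:
            self.switches = {"shared": False}

    def set(self, name: str, on: bool) -> None:
        """Turn the named switching device on or off."""
        self.switches[name] = on

    def powered(self, unit: str) -> bool:
        """Whether the given sensing unit ('first' or 'second') receives power."""
        if "shared" in self.switches:
            return self.switches["shared"]
        return self.switches[unit]
```

With independent switches, an occupant could power only the first sensing unit; with the shared switch, both sensing units are necessarily powered or unpowered together, matching the single-switching-device embodiment.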
An embodiment of the present application further provides a vehicle, which includes the sensing device provided by any embodiment of the present application.
It is to be noted that the foregoing is only illustrative of the preferred embodiments of the present application and the technical principles employed. It will be understood by those skilled in the art that the present application is not limited to the particular embodiments described herein, but is capable of various obvious changes, rearrangements and substitutions as will now become apparent to those skilled in the art without departing from the scope of the application. Therefore, although the present application has been described in more detail with reference to the above embodiments, the present application is not limited to the above embodiments, and may include other equivalent embodiments without departing from the spirit of the present application, and the scope of the present application is determined by the scope of the appended claims.