CN113844384A - Vehicle-mounted human-computer interaction system - Google Patents


Info

Publication number
CN113844384A
Authority
CN
China
Prior art keywords
vehicle
person
board
wake
body panel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111031556.3A
Other languages
Chinese (zh)
Inventor
梁晨 (Liang Chen)
Current Assignee
Shanghai Pateo Electronic Equipment Manufacturing Co Ltd
Original Assignee
Shanghai Pateo Electronic Equipment Manufacturing Co Ltd
Priority date
Filing date
Publication date
Application filed by Shanghai Pateo Electronic Equipment Manufacturing Co Ltd
Priority to CN202111031556.3A
Publication of CN113844384A
Legal status: Pending

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60R - VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R16/00 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for
    • B60R16/02 - Electric or fluid circuits specially adapted for vehicles and not otherwise provided for; Arrangement of elements of electric or fluid circuits specially adapted for vehicles and not otherwise provided for electric constitutive elements
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04R - LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R1/00 - Details of transducers, loudspeakers or microphones
    • H04R1/02 - Casings; Cabinets; Supports therefor; Mountings therein
    • H04R1/025 - Arrangements for fixing loudspeaker transducers, e.g. in a box, furniture
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04R - LOUDSPEAKERS, MICROPHONES, GRAMOPHONE PICK-UPS OR LIKE ACOUSTIC ELECTROMECHANICAL TRANSDUCERS; DEAF-AID SETS; PUBLIC ADDRESS SYSTEMS
    • H04R2499/00 - Aspects covered by H04R or H04S not otherwise provided for in their subgroups
    • H04R2499/10 - General applications
    • H04R2499/13 - Acoustic transducers and sound field adaptation in vehicles

Abstract

The invention provides a vehicle-mounted human-computer interaction system, comprising: an external speaker disposed inside a body panel that forms an outer surface of a vehicle; a speaker cover plate disposed on the body panel and fitted closely to it so as to cover the external speaker in a closed state; a wake-up subsystem for waking an on-board controller to turn on a human-computer interaction function based on at least one of a voice, an action, and an image of an off-board object; and the on-board controller, configured to switch the speaker cover plate from the closed state to an open state after being awakened by the wake-up subsystem, and to send voice to the off-board object through the external speaker in the open state.

Description

Vehicle-mounted human-computer interaction system
Technical Field
The present invention relates generally to the field of vehicle control, and more particularly, to an in-vehicle human-computer interaction system.
Background
Currently, the automobile industry is trending toward electrification, connectivity, sharing, and intelligence. These trends impose new technical demands on vehicle driving modes, user interaction experiences, vehicle usage patterns, and the like, and current vehicle systems urgently need technical improvements to meet them.
Human-computer interaction in current vehicles usually focuses on interaction between an on-board person (such as a driver or passenger) and the on-board control terminal; for example, the terminal receives a voice command from the on-board person and performs a corresponding operation (such as playing music or turning on the air conditioner). Interaction between a person outside the vehicle and the vehicle is still limited to a vehicle user, such as the driver, locking or unlocking the vehicle with a smart key, or opening or closing the vehicle or parts of it from outside via a mobile phone client. There is no widely applicable vehicle-mounted human-computer interaction system that supports interaction between an off-board person and the vehicle (or, further, with people inside the vehicle or other remote persons).
As the demand for vehicle intelligence grows, more and more application scenarios require human-computer interaction between people outside the vehicle and the vehicle itself. In many cases, the vehicle or an occupant wants to send a prompt or reminder by voice to a person outside the vehicle. For example, because an electric vehicle is very quiet, it is difficult for a pedestrian or cyclist on the road to notice its approach by sound alone; in such a case, the vehicle needs to actively emit a warning sound to the pedestrian or cyclist. Likewise, when the vehicle yields to pedestrians waiting at a crosswalk, it may want to actively prompt them by voice.
However, these needs remain unmet, because current vehicles lack the software and hardware designed to allow the vehicle to interact with people outside it.
Disclosure of Invention
To solve the above problems, the invention provides a vehicle-mounted human-computer interaction system in which voice interaction between the vehicle (or a person inside it) and a person outside it is realized by installing an external speaker on the vehicle's exterior.
According to one aspect of the invention, a vehicle-mounted human-computer interaction system is provided, comprising: an external speaker disposed inside a body panel that forms an outer surface of a vehicle; a speaker cover plate disposed on the body panel and fitted closely to it so as to cover the external speaker in a closed state; a wake-up subsystem for waking an on-board controller to turn on a human-computer interaction function based on at least one of a voice, an action, and an image of an off-board object; and the on-board controller, configured to switch the speaker cover plate from the closed state to an open state after being awakened by the wake-up subsystem, and to send voice to the off-board object through the external speaker in the open state.
In some embodiments, the system further comprises: a pivot connecting one side of the speaker cover plate to the body panel; and the on-board controller is further configured to drive the pivot by a motor to switch the speaker cover from the closed state to the open state.
In some embodiments, the system further comprises: a first push rod connected to the speaker cover plate; and the on-board controller is further configured to push the first push rod toward the outside of the body panel by a motor to switch the speaker cover from the closed state to the open state.
In some embodiments, the system further comprises: a second push rod connected to the external speaker; and the on-board controller is further configured to push the second push rod toward the outside of the vehicle body panel by a motor to push the external speaker from the inside of the vehicle body panel to the outside of the vehicle body panel.
In some embodiments, when the external speaker is pushed from the inside of the vehicle body panel to the outside of the vehicle body panel by the second push rod, the speaker cover is pushed by the external speaker to be converted from the closed state to the open state.
In some embodiments, the system further comprises: an external microphone disposed outside the body panel and configured to receive a voice input of the off-board object; and the on-board controller is further configured to play the voice input to a person inside the vehicle through an internal speaker of the vehicle, or to perform semantic analysis on the voice input and control the vehicle based on the result of the semantic analysis.
In some embodiments, the wake-up subsystem comprises: a sensing part disposed outside the vehicle and configured to sense at least one of a voice, a motion, and an image of the off-board object; and a wake-up processor configured to determine whether at least one of the sensed voice, motion, and image satisfies a predetermined wake-up condition, and to wake up the on-board controller to turn on the human-computer interaction function when the predetermined wake-up condition is satisfied.
In some embodiments, the sensing component comprises: a shock sensor disposed on the body panel or the vehicle glazing and configured to sense a tap on the body panel or the vehicle glazing by the off-board object; and the wake-up processor is configured to determine whether the tap satisfies a predetermined tap condition, and to wake up the on-board controller upon determining that it does.
In some embodiments, the sensing component comprises: a sensor disposed outside the body panel and configured to sense a distance of the object outside the vehicle from the body panel; and the wake-up processor is configured to determine whether the distance meets a predetermined distance threshold, and wake-up the onboard controller upon determining that the distance meets the predetermined distance threshold.
In some embodiments, the in-vehicle human-computer interaction system further comprises: an external camera disposed outside the body panel and configured to capture an image of an occupant outside the vehicle; and the on-board controller is further configured to determine an identity of the off-board person based on the image of the off-board person after being awakened by the wake-up processor, and to perform a vehicle control operation upon determining that the identity of the off-board person is a predetermined person.
In some embodiments, the on-board controller is configured to perform vehicle control operations, including: establishing a communication connection with a remote vehicle owner through an on-board mobile communication module, so that the remote vehicle owner can view the off-board object or conduct video communication with the off-board person through the communication connection.
In some embodiments, the in-vehicle human-computer interaction system further comprises: an external microphone disposed outside the body panel and configured to receive speech input of an occupant outside the vehicle; and the on-board controller is further configured to perform a vehicle control operation based on the voice input of the off-board person after being awakened by the wake-up processor.
In some embodiments, the on-board controller being configured to perform vehicle control operations comprises: establishing a communication connection with a remote vehicle owner through an on-board mobile communication module, so that the off-board person can conduct a voice call with the remote vehicle owner through the communication connection.
In some embodiments, the sensing component comprises: an external camera disposed outside the body panel and configured to capture an image of an occupant outside the vehicle; and the wake-up processor is configured to determine an identity of the person off-board based on the image of the person off-board, and wake-up the on-board controller upon determining that the identity of the person off-board is a predetermined person.
In some embodiments, the sensing component comprises: an external camera disposed outside the body panel configured to capture one or more images of an occupant outside the vehicle; and the wake-up processor is configured to determine a gesture of the person off-board based on the one or more images of the person off-board and wake-up the on-board controller upon determining that the gesture of the person off-board conforms to a predetermined gesture.
In some embodiments, the sensing component comprises: an external microphone disposed outside the body panel and configured to receive speech input of an occupant outside the vehicle; and the wake-up processor is configured to determine whether the voice input of the off-board person conforms to a predetermined wake-up word or whether the identity of the off-board person conforms to a predetermined person based on the voice input of the off-board person, and wake up the on-board controller upon determining that the voice input of the off-board person conforms to the predetermined wake-up word or determining that the identity of the off-board person conforms to the predetermined person based on the voice input of the off-board person.
Drawings
The invention will be better understood and other objects, details, features and advantages thereof will become more apparent from the following description of specific embodiments of the invention given with reference to the accompanying drawings.
Fig. 1 shows a schematic structural diagram of an exemplary vehicle-mounted human-computer interaction system according to an embodiment of the invention.
Fig. 2 shows a side view of a vehicle in which the in-vehicle human machine interaction system is installed.
Fig. 3A illustrates a front view of a speaker cover plate in a closed state according to some embodiments of the present invention.
Fig. 3B illustrates a rear view of a speaker cover plate in a closed state according to some embodiments of the present invention.
Fig. 4A illustrates a front view of a speaker cover plate in an open state according to some embodiments of the present invention.
Fig. 4B illustrates a rear view of a speaker cover plate in an open state according to some embodiments of the present invention.
Fig. 5A illustrates a schematic diagram of the relationship between an external speaker and a speaker cover plate according to some embodiments of the present invention.
Fig. 5B illustrates a schematic diagram of the relationship between an external speaker and a speaker cover plate according to other embodiments of the present invention.
Fig. 6 illustrates a block diagram of a control device suitable for implementing embodiments of the present disclosure.
Detailed Description
Preferred embodiments of the present invention will be described in more detail below with reference to the accompanying drawings. While the preferred embodiments of the present invention are shown in the drawings, it should be understood that the present invention may be embodied in various forms and should not be limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art.
In the following description, for the purposes of illustrating various inventive embodiments, certain specific details are set forth in order to provide a thorough understanding of the various inventive embodiments. One skilled in the relevant art will recognize, however, that the embodiments may be practiced without one or more of the specific details. In other instances, well-known devices, structures and techniques associated with this application may not be shown or described in detail to avoid unnecessarily obscuring the description of the embodiments.
Throughout the specification and claims, the word "comprise" and variations thereof, such as "comprises" and "comprising," are to be understood as an open, inclusive meaning, i.e., as being interpreted to mean "including, but not limited to," unless the context requires otherwise.
Reference throughout this specification to "one embodiment" or "some embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiments is included in at least one embodiment. Thus, the appearances of the phrases "in one embodiment" or "in some embodiments" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the terms "first," "second," "third," and the like in the description and claims are used only to distinguish between objects for clarity of description, and do not limit the size, order, or other properties of the objects described, unless otherwise specified.
Fig. 1 shows a schematic structural diagram of an exemplary vehicle-mounted human-computer interaction system 100 according to an embodiment of the invention, and fig. 2 shows a side view of a vehicle 200 in which the vehicle-mounted human-computer interaction system 100 is installed.
As shown in fig. 1 and 2, the in-vehicle human-computer interaction system 100 includes one or more external speakers 110 (1 external speaker 110 is schematically shown in fig. 1), and each external speaker 110 may be disposed inside a body panel 210 that is an outer surface of the vehicle 200. As shown in fig. 2, the external speakers 110 may be disposed at any one or more of the front side 212, the front door 214, the rear door 216, the rear side 218, etc. of the body panel 210, and in some cases, the external speakers 110 may be symmetrically disposed at both sides of the body panel 210.
Each external speaker 110 has an associated speaker cover 120, which may be disposed on a body panel 210 of the vehicle 200 and mate with the body panel 210 to cover the corresponding external speaker 110 when in a closed state. That is, the external speaker 110 is not visible when the speaker cover 120 is closed. For example, the speaker cover 120 may be held closely against the external speaker 110 by a magnet on the back of the cover. Further, the speaker cover 120 may be shaped to follow the streamlined contour of the body panel 210 where it is located. In this way, on the one hand, the external speaker 110 is protected (e.g., from water and dust) when it is not in use; on the other hand, the installation of the external speaker 110 and the speaker cover 120 does not disturb the streamlining of the vehicle 200 and thus does not affect its inherent dynamic and static parameters. Moreover, even with the protection of the speaker cover 120, the external speaker 110 is still frequently exposed to the outside; therefore, in some embodiments of the invention, a waterproof speaker is preferably used as the external speaker 110 to withstand rain and dust.
The in-vehicle human machine interaction system 100 further comprises a wake-up subsystem 130 and an in-vehicle controller 140, wherein the wake-up subsystem 130 is configured to wake-up the in-vehicle controller 140 to turn on human machine interaction functionality based on at least one of voice, motion and image of the off-vehicle object. Here, the vehicle-exterior object may include a person outside the vehicle or an object outside the vehicle (such as another vehicle or an obstacle).
The onboard controller 140 is configured to control the speaker cover 120 to transition from the closed state to the open state after being awakened by the wake-up subsystem 130 to turn on the human-machine interaction function, and to transmit voice to the object outside the vehicle through the external speaker 110 when the speaker cover 120 is in the open state.
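As a concrete illustration of this wake-then-announce sequence, the following minimal Python sketch models the controller, cover, and speaker as plain classes. All class and method names (SpeakerCover, OnboardController, announce, etc.) are illustrative assumptions; the patent does not specify an implementation.

```python
class SpeakerCover:
    """Models the speaker cover plate 120 with a closed/open state."""

    def __init__(self):
        self.state = "closed"

    def open(self):
        # In the vehicle this would drive a motor (pivot or push rod).
        self.state = "open"


class ExternalSpeaker:
    """Models the external speaker 110; records what it has played."""

    def __init__(self):
        self.played = []

    def play(self, message):
        self.played.append(message)


class OnboardController:
    """Models the on-board controller 140's wake -> open -> speak flow."""

    def __init__(self, cover, speaker):
        self.cover = cover
        self.speaker = speaker
        self.awake = False

    def wake(self):
        # Called by the wake-up subsystem 130 to enable interaction.
        self.awake = True

    def announce(self, message):
        # Refuse to speak unless the controller has been woken up.
        if not self.awake:
            return False
        # Open the cover first if it is still closed, then emit voice.
        if self.cover.state == "closed":
            self.cover.open()
        self.speaker.play(message)
        return True
```

A usage sketch: `controller.wake()` followed by `controller.announce("Vehicle approaching, please take care")` opens the cover and plays the message; calling `announce` before `wake` does nothing.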
The present invention provides various embodiments for the mounting and driving of the external speaker 110. Fig. 3A illustrates a front view of the speaker cover 120 in a closed state according to some embodiments of the present invention, and fig. 3B illustrates a rear view of the speaker cover 120 in a closed state according to some embodiments of the present invention. As shown in fig. 3A, the speaker cover 120 is almost integrated with the body panel 210 when it is in a closed state, and the external speaker 110 is not visible to the outside. Fig. 4A illustrates a front view of the speaker cover 120 in an open state according to some embodiments of the present invention, and fig. 4B illustrates a rear view of the speaker cover 120 in an open state according to some embodiments of the present invention. Fig. 5A illustrates a relationship diagram of an external speaker 110 and a speaker cover 120 according to some embodiments of the invention. Fig. 5B is a schematic diagram illustrating the relationship between the external speaker 110 and the speaker cover 120 according to other embodiments of the present invention.
In some embodiments, as shown in fig. 3B and 4B, the vehicle human-computer interaction system 100 may further include a pivot 122 connecting one side of the speaker cover 120 to the body panel 210. In this case, the on-board controller 140 is further configured to drive the pivot shaft 122 to rotate by a motor (not shown) to convert the speaker cover 120 from the closed state (shown in fig. 3A and 3B) to the open state (shown in fig. 4A and 4B). At this time, the external speaker 110 may be in an operating state, and may emit a voice to the outside under the control of the in-vehicle controller 140.
In other embodiments, as shown in FIG. 5A, the in-vehicle human machine interaction system 100 may further include a first push rod 124, which is also located inside the body panel 210 and may be connected to the in-vehicle controller 140 via a motor (not shown). In this case, the on-board controller 140 is also configured to push the first push rod 124 toward the outside of the body panel 210 by the motor to convert the speaker cover 120 from the closed state (shown in fig. 3A and 3B) to the open state (shown in fig. 4A and 4B). At this time, the external speaker 110 may be in an operating state, and may emit a voice to the outside under the control of the in-vehicle controller 140.
In some embodiments, as shown in FIG. 5B, the in-vehicle human-computer interaction system 100 may further include a second push rod 126 connected to the external speaker 110 and to the in-vehicle controller 140 via a motor (not shown). In this case, the on-board controller 140 is further configured to push the second push rod 126 toward the outside of the body panel 210 by the motor to push the external speaker 110 from the inside of the body panel 210 to the outside of the body panel 210, i.e., push the external speaker 110 to protrude from the body panel 210. In such an embodiment, when the external speaker 110 is pushed by the second push rod 126 from the inside of the vehicle body panel 210 to the outside of the vehicle body panel 210, the speaker cover 120 is pushed by the external speaker 110 to be converted from the closed state to the open state. That is, in this case, the in-vehicle human machine interaction system 100 may not include the first push rod 124 or need not control the pivot 122 by the in-vehicle controller 140, but push the speaker cover 120 away by means of the outward pushing force of the external speaker 110. At this time, the external speaker 110 may be in an operating state, and may emit a voice to the outside under the control of the in-vehicle controller 140.
When the external speaker 110 is in the working state, the on-board controller 140 may control it to emit various reminder voices, such as a voice announcing that the vehicle 200 is passing, a voice reminding a person who has just left the vehicle not to forget their belongings, or a voice giving weather or safety reminders.
In practical applications, the external speaker 110 need not always be in operation. When an occupant actively wants to send out voice, he or she can do so through the external speaker 110 by operating the on-board controller 140, for example to issue a voice prompt or play music. However, when an off-board person wants to interact with the vehicle or an occupant, or when the off-board object is, for example, another vehicle, the in-vehicle human-computer interaction system 100 may wake up the on-board controller 140 to turn on the human-computer interaction function via the wake-up subsystem 130, based on at least one of the voice, motion, and image of the off-board object. To this end, the wake-up subsystem 130 may include a sensing component 132 and a wake-up processor 134: the sensing component 132 is disposed outside the vehicle 200 to sense at least one of the voice, motion, and image of an off-board object, while the wake-up processor 134 determines whether what is sensed satisfies a predetermined wake-up condition and, if so, wakes up the on-board controller 140 to turn on the human-computer interaction function.
In some embodiments, sensing component 132 may include one or more shock sensors 1322, which may be disposed on the outside of body panel 210 (e.g., near the windshield) or on vehicle glazing 220 (as shown in Fig. 2). Shock sensor 1322 is configured to sense a tap on body panel 210 or vehicle glazing 220 by a person outside the vehicle. In this case, wake-up processor 134 is configured to determine whether the tap sensed by shock sensor 1322 satisfies a predetermined tap condition, and to wake up on-board controller 140 to turn on the human-computer interaction function upon determining that the condition is satisfied. For example, the predetermined tap condition may be a predetermined number of consecutive taps (e.g., 2 or 3): wake-up processor 134 determines that the condition is met once the number of taps sensed by shock sensor 1322 reaches that number. In this way, wake-up subsystem 130 can readily infer, from the off-board person's deliberate tapping, that the person intends to interact with vehicle 200, and accordingly control on-board controller 140 to turn on the human-computer interaction function.
Further, when determining whether the predetermined tap condition is met, wake-up processor 134 may also filter out taps whose intensity is below a predetermined threshold. In this manner, wake-up subsystem 130 can recognize valid taps and discard invalid or unintentional ones.
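The tap condition just described (a required number of consecutive taps within a time window, with weak taps filtered out) can be sketched as follows. The threshold values are assumptions chosen for illustration only:

```python
def taps_satisfy_wake(taps, min_intensity=0.5, required_count=3, window_s=2.0):
    """Return True if the tap sequence satisfies the wake-up condition.

    taps: list of (timestamp_s, intensity) tuples from the shock sensor.
    """
    # Filter out taps that are too weak and likely unintentional.
    valid = [t for t, intensity in taps if intensity >= min_intensity]
    # Look for `required_count` valid taps inside any `window_s` span.
    for start in range(len(valid) - required_count + 1):
        if valid[start + required_count - 1] - valid[start] <= window_s:
            return True
    return False
```

For example, three firm taps within one second would satisfy the default condition, while taps spread several seconds apart, or a sequence in which one tap is too weak, would not.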
In other embodiments, the sensing component 132 may include one or more sensors 1324, which may be disposed outside of the body panel 210 (as shown in Fig. 2). The sensor 1324 is configured to sense the distance between the off-board object and the body panel 210. For example, the sensor 1324 may be a biometric sensor such as an infrared proximity sensor, or a ranging sensor such as an ultrasonic proximity sensor or a UWB (ultra-wideband) sensor, any of which can detect off-board persons or other off-board objects and determine their distance from the vehicle 200. In this case, wake-up processor 134 is configured to determine whether the distance sensed by sensor 1324 meets a predetermined distance threshold, and to wake up on-board controller 140 to turn on the human-computer interaction function upon determining that the threshold is met. The predetermined distance threshold is typically smaller than a conventional safe distance and may take a different value or range depending on the sensed object (i.e., whether it is a person or another object outside the vehicle). For example, when the off-board object is a person, the distance threshold may be a value in the range of 0.2 to 0.5 meters; when the off-board object is another vehicle, the distance threshold may be a value in the range of 5 to 10 meters. The wake-up processor 134 may then determine that the predetermined wake-up condition is satisfied when the distance sensed by the sensor 1324 is less than the applicable threshold. In this manner, wake-up subsystem 130 can readily infer, from the off-board person's act of approaching unusually close to vehicle 200, that the person intends to interact with the vehicle, and accordingly control on-board controller 140 to turn on the human-computer interaction function.
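The per-object distance thresholds described above can be sketched as a simple lookup. The exact cutoff values used here (0.5 m for a person, 10 m for another vehicle) fall within the ranges given in the description but are otherwise assumptions:

```python
# Per-object-type wake-up distance thresholds, in meters (assumed values).
DISTANCE_THRESHOLDS_M = {
    "person": 0.5,    # wake when a person is closer than ~0.5 m
    "vehicle": 10.0,  # wake (or warn) when another vehicle is within ~10 m
}


def distance_satisfies_wake(object_type, distance_m):
    """Return True if the sensed object is close enough to trigger a wake-up."""
    threshold = DISTANCE_THRESHOLDS_M.get(object_type)
    if threshold is None:
        return False  # unknown object types never trigger a wake-up
    return distance_m < threshold
```

So a person sensed at 0.3 m triggers a wake-up, while the same person at 1 m, or an unrecognized object type, does not.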
Alternatively, the wake-up subsystem 130 may actively issue a voice alert to the other vehicle when the other vehicle is too close to the vehicle 200 to alert the other vehicle to the vehicle distance.
In other embodiments, the sensing component 132 may include one or more external cameras 1326, which may be disposed outside the body panel 210, for example near the front window of the vehicle 200 (e.g., above the B-pillar), as shown in Fig. 2. The external camera 1326 is configured to capture images of a person outside the vehicle. In this case, wake-up processor 134 is configured to determine the identity of the off-board person based on the image, and to wake up on-board controller 140 to turn on the human-computer interaction function upon determining that the off-board person is a predetermined person. For example, where the vehicle 200 is a vehicle for home use, the wake-up processor 134 may store images of the vehicle's permitted users (e.g., family members) in advance, and determine whether the off-board person is a permitted user by comparing the image obtained by the external camera 1326 with the pre-stored images. Alternatively, where the vehicle 200 is an unmanned shared or rental vehicle, the wake-up processor 134 may acquire images of the user who has booked the vehicle via the on-board controller 140 or another on-board terminal, and determine whether the off-board person is that user by comparing the image captured by the external camera 1326 with the pre-acquired images. In this way, the wake-up subsystem 130 can easily determine from the image that the off-board person is likely someone who intends to interact with the vehicle 200, and accordingly control the on-board controller 140 to turn on the human-computer interaction function.
In this case, the on-board controller 140 may play welcome music or a welcome voice message to the off-board person through the external speaker 110, or may issue other voice prompts, such as asking the off-board person to provide specific usage credentials (e.g., a verification code sent by the booking platform when the vehicle 200 was reserved) for further interaction.
In other embodiments, the wake-up processor 134 may determine a gesture of the off-board person based on one or more images acquired by the external camera 1326, and wake up the on-board controller 140 to turn on the human-machine interaction function upon determining that the gesture matches a predetermined gesture. For example, the predetermined gesture may be a hand-waving motion. In this case, the wake-up processor 134 may determine from the images whether the off-board person is waving toward the vehicle 200, and wake up the on-board controller 140 upon determining that this is the case. In this way, the wake-up subsystem 130 can readily infer from the images the off-board person's intention to interact with the vehicle 200, and can accordingly wake the on-board controller 140 to turn on the human-machine interaction function. For example, where the vehicle 200 is an unmanned taxi, the wake-up subsystem 130 may infer from a waving motion that the off-board person wants to ride in the vehicle 200, and may turn on the human-machine interaction function to confirm the riding intention by voice or to perform other authentication operations.
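One simple way to detect a wave from a sequence of camera frames is to track the horizontal position of the hand and count direction reversals. The sketch below is an illustrative heuristic under assumed inputs (normalized hand x-coordinates per frame, e.g. from a pose-estimation model); the thresholds are assumptions, and the patent does not specify a detection algorithm.

```python
def is_waving(hand_x_positions, min_direction_changes=3, min_amplitude=0.05):
    """Heuristic wave detector: a wave shows up as repeated left-right
    reversals of the hand's horizontal position across consecutive frames,
    with enough total amplitude to rule out jitter."""
    if len(hand_x_positions) < 3:
        return False
    # Reject near-static hands: too little horizontal movement overall.
    if max(hand_x_positions) - min(hand_x_positions) < min_amplitude:
        return False
    changes = 0
    prev_dir = 0
    for a, b in zip(hand_x_positions, hand_x_positions[1:]):
        d = (b > a) - (b < a)  # +1 moving right, -1 moving left, 0 still
        if d != 0 and prev_dir != 0 and d != prev_dir:
            changes += 1  # direction reversal
        if d != 0:
            prev_dir = d
    return changes >= min_direction_changes
```

A person walking past produces a roughly monotonic trace and is rejected, while an oscillating trace triggers the wake-up.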
In other embodiments, the sensing component 132 may include one or more external microphones 1328 disposed outside the body panel 210 (as shown in Fig. 2). The external microphone 1328 is configured to receive voice input from an off-board person. In this case, the wake-up processor 134 is configured to determine, based on the voice input, whether it matches a predetermined wake-up word or whether the speaker's identity matches a predetermined person, and to wake up the on-board controller 140 to turn on the human-machine interaction function when either condition is met. For example, where the vehicle 200 is a taxi, the wake-up word may include "taxi" or the like; the wake-up processor 134 determines whether the voice input received by the external microphone 1328 matches the wake-up word and, if so, wakes up the on-board controller 140. Alternatively, where the vehicle 200 is a household vehicle, the wake-up processor 134 may store voiceprints of permitted users of the vehicle 200 (e.g., family members) in advance, and determine whether the off-board person belongs to the permitted users by comparing a voiceprint extracted from the received voice input with the pre-stored voiceprints. In this way, the wake-up subsystem 130 can readily determine from the off-board person's voice that the person is likely someone who intends to interact with the vehicle 200, and can accordingly wake the on-board controller 140 to turn on the human-machine interaction function.
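The two voice-based wake-up conditions above (wake word OR voiceprint match) can be sketched as follows. This is a simplified illustration: the wake-word set, the speaker-embedding representation, and the similarity threshold are assumptions; a real system would use a speech recognizer and a speaker-verification model for the transcript and embedding.

```python
import math

WAKE_WORDS = {"taxi"}  # illustrative; the patent gives "taxi" as an example

def voiceprint_match(embedding, stored_voiceprints, threshold=0.85):
    """Compare the speaker's embedding against pre-stored voiceprints."""
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        return dot / (math.sqrt(sum(x * x for x in a)) *
                      math.sqrt(sum(x * x for x in b)))
    return any(cos(embedding, ref) >= threshold
               for ref in stored_voiceprints.values())

def should_wake(transcript, speaker_embedding, stored_voiceprints):
    """Wake the on-board controller if the utterance contains a wake word
    OR the speaker's voiceprint matches a permitted user."""
    text = transcript.lower().strip()
    if any(w in text for w in WAKE_WORDS):
        return True
    return voiceprint_match(speaker_embedding, stored_voiceprints)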
In this case, the on-board controller 140 may play welcome music or a welcome voice message to the off-board person through the external speaker 110, or may issue other voice prompts, such as asking the off-board person to provide specific usage credentials (e.g., a verification code sent by the booking platform when the vehicle 200 was reserved) for further interaction.
In addition, the external microphone 1328 may also actively receive voice input from an off-board person, and the on-board controller 140 may be configured to play the received voice to in-vehicle personnel through a built-in speaker of the vehicle 200 (e.g., the internal speaker 152 connected to the on-board controller 140 shown in Fig. 1). In this way, voice can be relayed between in-vehicle and off-board personnel, enabling voice interaction between them.
Alternatively, in some embodiments, the on-board controller 140 may semantically analyze the voice input received by the external microphone 1328 and control the vehicle 200 based on the result. For example, where the off-board person is a permitted user of the vehicle 200 (which may be determined via the image or voiceprint approaches described above, or otherwise) and the input is "trunk", the on-board controller 140 may interpret the command in light of the current trunk state (i.e., whether the trunk is closed or open): if the trunk is currently closed, the semantic result is to open it; if the trunk is currently open, the semantic result is to close it. The on-board controller 140 may then control the vehicle 200 according to the result of the semantic analysis, such as opening or closing the trunk.
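The state-dependent interpretation of the bare command "trunk" can be sketched as below. The command strings and action names are illustrative assumptions; the point is only how the current trunk state disambiguates the utterance.

```python
def resolve_trunk_command(utterance, trunk_is_open):
    """Map the bare word 'trunk' to open/close depending on current state;
    an explicit 'open'/'close' in the utterance overrides the inference."""
    text = utterance.lower()
    if "trunk" not in text:
        return None  # not a trunk command; ignore here
    if "open" in text:
        return "open_trunk"
    if "close" in text:
        return "close_trunk"
    # Bare "trunk": toggle based on the current state, as described above.
    return "close_trunk" if trunk_is_open else "open_trunk"
```

For example, `resolve_trunk_command("trunk", trunk_is_open=False)` yields the open action, while the same utterance with an open trunk yields the close action.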
In some embodiments, the in-vehicle human-machine interaction system 100 may further include a prompting device (not shown), such as an indicator light, arranged around the external microphone 1328. The prompting device is activated when the on-board controller 140 turns on the human-machine interaction function, indicating the location of the external microphone 1328 so that the off-board person knows where to speak when interacting with the vehicle 200 or with in-vehicle personnel.
Further, in some embodiments, the on-board controller 140 may perform different control operations depending on the identified identity of the off-board person. Specifically, the wake-up processor 134 and/or the on-board controller 140 may determine the authority corresponding to that identity, and the on-board controller 140 may control the vehicle 200 accordingly. For example, where the identified person is the vehicle owner or a driver authorized by the owner, the on-board controller 140 may grant the highest authority: it may control various operations of the vehicle 200, such as opening/closing the doors, trunk, front hatch, charging port, or fuel filler, according to the person's voice input, and may provide various information, such as the vehicle's fuel level, battery level, or weather information, on request. Where the identified person is a driver not yet authorized by the owner, the on-board controller 140 may contact the owner (e.g., through a vehicle customer-service center or a dedicated app) to confirm whether the person is authorized, and grant the highest authority as described above if the owner authorizes them.
On the other hand, where the identified person is a passenger authorized by the owner, the on-board controller 140 may grant general authority: it may control common operations of the vehicle 200, such as opening/closing the doors or trunk, according to the person's voice input, and may provide general information, such as weather information, on request.
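The two authority tiers above can be sketched as a small permission table consulted before executing any voice command. The role names, operation names, and table contents are illustrative assumptions drawn from the examples in the text, not a definitive scheme.

```python
# Operation and information sets for each tier (assumed names).
FULL_CONTROL = {"door", "trunk", "front_hatch", "charging_port", "fuel_filler"}
GENERAL_CONTROL = {"door", "trunk"}
FULL_INFO = {"fuel_level", "battery_level", "weather"}
GENERAL_INFO = {"weather"}

PERMISSIONS = {
    "owner": (FULL_CONTROL, FULL_INFO),
    "authorized_driver": (FULL_CONTROL, FULL_INFO),
    "authorized_passenger": (GENERAL_CONTROL, GENERAL_INFO),
}

def is_allowed(identity_role, action_kind, target):
    """Check whether the identified role may perform a control operation
    ('control') or request information ('query') on the given target."""
    controls, info = PERMISSIONS.get(identity_role, (set(), set()))
    if action_kind == "control":
        return target in controls
    if action_kind == "query":
        return target in info
    return False
```

An unrecognized identity falls through to empty permission sets, so every request is denied until the owner confirms authorization.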
Alternatively, in some embodiments, the on-board controller 140 may conduct a voice-call operation with an off-board person after being awakened by the wake-up subsystem 130, without identifying that person. Specifically, when awakened by an off-board person's tap, the on-board controller 140 may turn on the human-machine interaction function to receive the person's voice input. In some cases, the on-board controller 140 may analyze the voice input directly and perform a corresponding control operation on the vehicle 200. Alternatively or additionally, it may establish a voice connection with the remote vehicle owner so that the off-board person can talk to the owner over that connection.
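The dispatch just described, answering safe requests locally and bridging everything else to the remote owner, can be sketched as below. The local-command set and the call-placement callback are hypothetical; the patent leaves both open.

```python
# Requests safe to answer without identifying the speaker (assumed set).
LOCAL_COMMANDS = {"weather", "what time is it"}

def handle_unidentified_wake(utterance, place_owner_call):
    """After a knock wakes the controller without identifying the person,
    answer safe informational requests locally and route everything else
    to a voice call with the remote owner via `place_owner_call`."""
    text = utterance.lower().strip()
    if text in LOCAL_COMMANDS:
        return ("local", text)
    return ("owner_call", place_owner_call(text))
```

Here `place_owner_call` stands in for the on-board mobile communication module establishing the voice connection.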
In the above description, the on-board controller 140 and the wake-up processor 134 are described separately, however, it will be understood by those skilled in the art that the present invention is not limited thereto, and the wake-up processor 134 may be implemented as a part of the on-board controller 140.
In addition, in the above description, the vibration sensor 1322, the distance sensor 1324, the external camera 1326, and the external microphone 1328 are described as separate implementations of the sensing component 132; however, those skilled in the art will understand that the invention is not limited thereto, and the sensing component 132 may include any one or more of them. In some cases, the sensing component 132 may include only the vibration sensor 1322 and/or the distance sensor 1324 to wake up the on-board controller 140. Even so, the in-vehicle human-machine interaction system 100 may still include an external camera and/or an external microphone as hardware devices to support further human-machine interaction functions; for ease of description, the same reference numerals as above are used for them. Specifically, in some embodiments, the in-vehicle human-machine interaction system 100 may include the external camera 1326 described above, in which case the on-board controller 140 is further configured, after being awakened by the wake-up processor 134, to determine the identity of the off-board person based on an image of that person and to perform a vehicle control operation upon determining that the person is a predetermined person. For example, the on-board controller 140 may establish a communication connection with the remote vehicle owner through an on-board mobile communication module (e.g., an on-board 4G or 5G module) so that the remote owner can view the off-board object or conduct video communication with the off-board person over that connection.
For example, when the vehicle 200 is used for self-service shopping, refueling, shared car rental, or similar applications, the remote owner of the vehicle 200 may view purchased goods, gas-station staff, or rental staff through the communication connection established by the on-board controller 140, or conduct video communication with shop, gas-station, or rental staff to complete the desired service. Alternatively or additionally, in some embodiments, the in-vehicle human-machine interaction system 100 may include the external microphone 1328 described above, in which case the on-board controller 140 is further configured, after being awakened by the wake-up processor 134, to perform vehicle control operations based on the off-board person's voice input received by the external microphone 1328. For example, the on-board controller 140 may establish a communication connection with the remote vehicle owner through the on-board mobile communication module (e.g., an on-board 4G or 5G module) so that the off-board person can conduct a voice call with the remote owner over that connection. Similarly, when the vehicle 200 is used for the self-service shopping, refueling, or shared-rental applications described above, the remote owner may conduct a voice call with shop, gas-station, or rental staff through that connection to complete the desired service.
Except for specific hardware components such as the external speaker 110, the speaker cover plate 120, and the sensing component 132, the rest of the invention may be implemented in software, hardware, or firmware. Where implemented in software as a computer program product, the product may include a computer-readable storage medium carrying computer-readable program instructions for carrying out various aspects of the invention.
Fig. 6 illustrates a block diagram of a control device 600 suitable for implementing embodiments of the present disclosure. The control device 600 may be used to implement the on-board controller 140 and/or the wake-up processor 134 as described above.
As shown, the control device 600 may include a processor 610, which controls the operation and functions of the control device 600. For example, in some embodiments, the processor 610 may perform various operations by means of instructions 630 stored in a memory 620 coupled to it. The memory 620 may be of any type suitable to the local technical environment and may be implemented using any suitable data storage technology, including but not limited to semiconductor-based memory devices, magnetic memory devices and systems, and optical memory devices and systems. Although only one memory 620 is shown in Fig. 6, those skilled in the art will appreciate that the control device 600 may include multiple physically distinct memories 620.
The processor 610 may be of any type suitable to the local technical environment and may include, but is not limited to, one or more of general-purpose computers, special-purpose computers, microprocessors, digital signal processors (DSPs), and processor-based multi-core processor architectures. The control device 600 may also include a plurality of processors 610. The processor 610 is coupled to a transceiver 640, which may enable the reception and transmission of information by means of one or more communication components. All features described above with respect to the on-board controller 140 and/or the wake-up processor 134 with reference to Figs. 1-5 apply to the control device 600 and are not repeated here.
In one or more exemplary designs, the functions described herein may be implemented in hardware, software, firmware, or any combination thereof. For example, if implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium.
The units of the apparatus disclosed herein may be implemented using discrete hardware components, or may be integrally implemented on a single hardware component, such as a processor. For example, the various illustrative logical blocks, modules, and circuits described in connection with the invention may be implemented or performed with a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein.
Those of skill would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both.
The previous description of the invention is provided to enable any person skilled in the art to make or use the invention. Various modifications to the disclosure will be readily apparent to those skilled in the art, and the generic principles defined herein may be applied to other variations without departing from the spirit or scope of the disclosure. Thus, the present invention is not intended to be limited to the examples and designs described herein but is to be accorded the widest scope consistent with the principles and novel features disclosed herein.

Claims (16)

1. An in-vehicle human-computer interaction system, comprising:
an external speaker provided inside a body panel that is an outer surface of a vehicle;
a speaker cover plate disposed on the vehicle body panel and closely fitted with the vehicle body panel to cover the external speaker in a closed state;
a wake-up subsystem for waking up an onboard controller to turn on a human-computer interaction function based on at least one of a voice, an action, and an image of an off-board object; and
an onboard controller configured to control the speaker cover to transition from the closed state to an open state after being awakened by the wake-up subsystem to turn on a human-machine interaction function, and to transmit voice to the off-board object through the external speaker in the open state.
2. The system of claim 1, further comprising:
a pivot connecting one side of the speaker cover plate to the body panel; and is
The onboard controller is further configured to drive the pivot by a motor to transition the speaker cover from the closed state to the open state.
3. The system of claim 1, further comprising:
the first push rod is connected with the loudspeaker cover plate; and is
The on-board controller is further configured to push the first push rod toward the outside of the body panel by a motor to switch the speaker cover from the closed state to the open state.
4. The system of claim 1, further comprising:
the second push rod is connected with the external loudspeaker; and is
The on-vehicle controller is further configured to push the second push rod toward the outside of the vehicle body panel by a motor to push the external speaker from the inside of the vehicle body panel to the outside of the vehicle body panel.
5. The system of claim 4, wherein the speaker cover is pushed by the external speaker to transition from the closed state to the open state when the external speaker is pushed by the second push rod from the inside of the vehicle body panel to the outside of the vehicle body panel.
6. The system of claim 1, further comprising:
an external microphone disposed outside the body panel and configured to receive a voice input of the object outside the vehicle, and
the on-board controller is further configured to play voice input of the off-board object to an in-vehicle person through a built-in speaker of the vehicle or perform semantic analysis on the voice input of the off-board object, and perform control on the vehicle based on a result of the semantic analysis.
7. The system of claim 1, wherein the wake-up subsystem comprises:
a sensing part disposed outside the vehicle and configured to sense at least one of a voice, a motion, and an image of the object outside the vehicle; and
a wake-up processor configured to determine whether at least one of the sensed voice, motion and image satisfies a predetermined wake-up condition, and wake up the on-board controller to turn on a human interaction function upon determining that the predetermined wake-up condition is satisfied.
8. The system of claim 7, wherein the sensing component comprises:
a vibration sensor disposed on the body panel or a window of the vehicle and configured to sense a tap on the body panel or the window by the object outside the vehicle; and is
The wake-up processor is configured to determine whether the tap satisfies a predetermined tap condition, and wake up the on-board controller upon determining that the tap satisfies the predetermined tap condition.
9. The system of claim 7, wherein the sensing component comprises:
a sensor disposed outside the body panel and configured to sense a distance of the object outside the vehicle from the body panel; and is
The wake-up processor is configured to determine whether the distance meets a predetermined distance threshold, and wake-up the onboard controller upon determining that the distance meets the predetermined distance threshold.
10. The system of claim 8 or 9, wherein the in-vehicle human-machine interaction system further comprises:
an external camera disposed outside the body panel and configured to capture an image of a person outside the vehicle; and is
The on-board controller is further configured to determine an identity of the off-board person based on the image of the off-board person after being awakened by the wake-up processor, and to perform a vehicle control operation upon determining that the identity of the off-board person is a predetermined person.
11. The system of claim 10, wherein the on-board controller is configured to perform vehicle control operations comprising:
the communication connection with a remote vehicle owner is established through a vehicle-mounted mobile communication module, so that the remote vehicle owner can view the vehicle-mounted object or perform video communication with the vehicle-mounted person through the communication connection.
12. The system of claim 8 or 9, wherein the in-vehicle human-machine interaction system further comprises:
an external microphone disposed outside the body panel and configured to receive speech input of a person outside the vehicle; and is
The on-board controller is further configured to perform a vehicle control operation based on a voice input of the off-board person after being awakened by the wake-up processor.
13. The system of claim 12, wherein the on-board controller being configured to perform vehicle control operations comprises:
establishing a communication connection with a remote vehicle owner through a vehicle-mounted mobile communication module, so that the person outside the vehicle can conduct a voice call with the remote vehicle owner through the communication connection.
14. The system of claim 7, wherein the sensing component comprises:
an external camera disposed outside the body panel and configured to capture an image of a person outside the vehicle; and is
The wake-up processor is configured to determine an identity of the person off-board based on the image of the person off-board, and wake-up the on-board controller when the identity of the person off-board is determined to be a predetermined person.
15. The system of claim 7, wherein the sensing component comprises:
an external camera disposed outside the body panel and configured to capture one or more images of a person outside the vehicle; and is
The wake-up processor is configured to determine a gesture of the offboard person based on one or more images of the offboard person, and wake-up the onboard controller upon determining that the gesture of the offboard person conforms to a predetermined gesture.
16. The system of claim 7, wherein the sensing component comprises:
an external microphone disposed outside the body panel and configured to receive speech input of a person outside the vehicle; and is
The wake-up processor is configured to determine whether a voice input of the off-board person conforms to a predetermined wake-up word or determine whether an identity of the off-board person conforms to a predetermined person based on the voice input of the off-board person, and wake up the on-board controller when the voice input of the off-board person is determined to conform to the predetermined wake-up word or the identity of the off-board person is determined to conform to the predetermined person based on the voice input of the off-board person.
CN202111031556.3A 2021-09-03 2021-09-03 Vehicle-mounted human-computer interaction system Pending CN113844384A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111031556.3A CN113844384A (en) 2021-09-03 2021-09-03 Vehicle-mounted human-computer interaction system

Publications (1)

Publication Number Publication Date
CN113844384A true CN113844384A (en) 2021-12-28

Family

ID=78973169

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111031556.3A Pending CN113844384A (en) 2021-09-03 2021-09-03 Vehicle-mounted human-computer interaction system

Country Status (1)

Country Link
CN (1) CN113844384A (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050074131A1 (en) * 2003-10-06 2005-04-07 Mc Call Clark E. Vehicular sound processing system
JP2008035472A (en) * 2006-06-28 2008-02-14 National Univ Corp Shizuoka Univ In-vehicle outside-vehicle acoustic transmission system
CN201619516U (en) * 2009-12-30 2010-11-03 比亚迪股份有限公司 Inside and outside conversational system on vehicle
CN109285547A (en) * 2018-12-04 2019-01-29 北京蓦然认知科技有限公司 A kind of voice awakening method, apparatus and system
CN110103815A (en) * 2019-03-05 2019-08-09 北京车和家信息技术有限公司 The method and device of voice reminder
CN212850479U (en) * 2020-07-29 2021-03-30 成璟桓 Multifunctional real-time vehicle-mounted intercom communication system
CN113002421A (en) * 2021-04-25 2021-06-22 广州小鹏汽车科技有限公司 Vehicle exterior safety prompting method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 201821 room 208, building 4, No. 1411, Yecheng Road, Jiading Industrial Zone, Jiading District, Shanghai

Applicant after: Botai vehicle networking technology (Shanghai) Co.,Ltd.

Address before: 201821 room 208, building 4, No. 1411, Yecheng Road, Jiading Industrial Zone, Jiading District, Shanghai

Applicant before: SHANGHAI PATEO ELECTRONIC EQUIPMENT MANUFACTURING Co.,Ltd.