CN112389458A - Motor vehicle interaction system and method - Google Patents


Info

Publication number
CN112389458A
Authority
CN
China
Prior art keywords
vehicle
interaction
motor vehicle
avatar
sensor
Prior art date
Legal status
Pending
Application number
CN201910697269.2A
Other languages
Chinese (zh)
Inventor
丁飞 (Ding Fei)
田晨 (Tian Chen)
Current Assignee
Ford Global Technologies LLC
Original Assignee
Ford Global Technologies LLC
Priority date: 2019-07-30
Filing date: 2019-07-30
Publication date: 2021-02-23
Application filed by Ford Global Technologies LLC
Priority to CN201910697269.2A
Publication of CN112389458A
Legal status: Pending

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system

Landscapes

  • Engineering & Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Transportation (AREA)
  • Mechanical Engineering (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

The invention discloses a motor vehicle interaction system, comprising: an avatar interaction device located within the vehicle, the avatar interaction device having at least one interaction area associated with at least one vehicle component; at least one sensor that senses a state of the vehicle; and a controller communicatively coupled to the avatar interaction device and the sensor, the controller configured to receive an input signal from at least one of the avatar interaction device and the sensor, and to provide an output signal to the avatar interaction device and/or to control at least one component of the vehicle based on the input signal. The invention also discloses a related motor vehicle interaction method. By means of the motor vehicle interaction system and the related method, a vehicle occupant can intuitively and conveniently understand the condition of the vehicle and operate its related components.

Description

Motor vehicle interaction system and method
Technical Field
The present invention relates generally to a motor vehicle interaction system and method, and to a motor vehicle employing the system.
Background
With the development of the automotive industry, vehicles have become increasingly sophisticated, and more and more devices are installed in motor vehicles to provide a better driving and riding experience for vehicle occupants.
However, as the number of devices and components in a vehicle grows, so does the number of controls for operating them, whether physical keys mounted on, for example, the center console, the steering wheel, or elsewhere on the vehicle body, or virtual keys on an in-vehicle center screen. The icons displayed on the instrument panel likewise become more complex in order to show the status and functions of these devices and components. Such complicated controls and displays can hinder the driver or vehicle occupant in operating the corresponding components or understanding the condition of the vehicle, and may also distract the driver while the vehicle is in motion, creating a hazard.
In this context, the present inventors have recognized a need for an improved motor vehicle interaction system and method that provides a concise and clear understanding of vehicle conditions, including but not limited to the status of vehicle components and the vehicle surroundings, and that allows the basic functions of the vehicle to be operated readily, without the burden of overly complex operating and display systems.
Disclosure of Invention
The application is defined by the appended claims. This disclosure summarizes aspects of the embodiments and should not be used to limit the claims. Other embodiments are contemplated in accordance with the techniques described herein, as will be apparent to one of ordinary skill in the art upon study of the following figures and detailed description, and are intended to be included within the scope of the present application.
The present invention is advantageous in that it provides a convenient motor vehicle interaction system and method that enables a vehicle occupant to intuitively and conveniently understand vehicle conditions and operate related components of the vehicle.
According to the present invention, there is provided a motor vehicle interaction system comprising:
an avatar interaction device located within the vehicle, the avatar interaction device having at least one interaction area associated with at least one vehicle component;
at least one sensor that senses a state of the vehicle;
a controller communicatively coupled to the avatar interaction device and the sensor, the controller configured to receive an input signal from at least one of the avatar interaction device and the sensor, and to provide an output signal to the avatar interaction device and/or to control at least one component of the vehicle based on the input signal.
According to one embodiment of the invention, the state of the motor vehicle comprises at least one of a state of an accessory of the motor vehicle and an environment surrounding the motor vehicle, the accessory comprising at least one of a light, a door, a tailgate, a window, and a sunroof, and the sensor comprising at least one of a light sensor, a door sensor, a window sensor, a displacement sensor, and a camera.
According to an embodiment of the invention, the interaction area of the avatar interaction device is arranged to provide a visual, tactile, acoustic, or olfactory response based on the output signal for prompting the state of at least one component of the vehicle, the interaction area comprising at least one of an avatar light, an avatar door, an avatar tailgate, and an avatar window.
According to one embodiment of the invention, the avatar interaction device is a physical model or a holographic projection model of the motor vehicle.
According to one embodiment of the invention, the interaction zone of the physical model is provided with at least one of an optical means and a sound device, the response being displayed by the optical means and/or the sound device.
According to one embodiment of the invention, the inputtable interaction regions of the physical model comprise buttons and/or touch switches configured for controlling at least one vehicle component associated with the interaction region.
According to one embodiment of the present invention, the physical model is rotatably provided on a base which is detachably provided above the instrument panel, and the physical model is driven to rotate by a driving device installed inside the base.
According to one embodiment of the invention, the holographic projection model comprises a holographic projector, a position detection device and a motor vehicle hologram, the position detection device being in communicative connection with the holographic projector.
According to one embodiment of the invention, the holographic projector is communicatively coupled to the controller and displays a real-time status of the motor vehicle based on the received output signal, the real-time status including at least one of a status of an accessory of the motor vehicle and an environment surrounding the motor vehicle.
According to one embodiment of the invention, the motor vehicle hologram has a plurality of inputtable interaction regions, the position detection device detects an operation of the inputtable interaction regions by an occupant of the motor vehicle, and the controller controls the corresponding accessories of the motor vehicle in accordance with the operation.
According to one embodiment of the invention, a holographic projector is connected to a body part in the passenger compartment, the holographic projector projecting a motor vehicle hologram on at least one predefined location in the passenger compartment.
According to the present invention, there is provided a motor vehicle interaction method comprising:
collecting vehicle conditions via at least one sensor;
activating an avatar interaction device within the vehicle, the avatar interaction device having at least one interaction zone associated with at least one vehicle component;
the controller receives input signals from at least one of the sensor and the avatar-interacting device, and provides output signals to the avatar-interacting device and/or controls at least one component of the vehicle based on the input signals.
According to one embodiment of the invention, the interaction zone provides a visual, tactile, acoustic, olfactory response based on the output signal, the response being for prompting a state of at least one component of the vehicle.
According to one embodiment of the invention, controlling the motor vehicle comprises controlling an accessory of the motor vehicle, the accessory comprising at least one of a light, a door, a tailgate, a window, a sunroof.
According to one embodiment of the invention, the avatar interaction device is a physical model or a holographic projection model of the motor vehicle.
According to one embodiment of the invention, controlling the motor vehicle comprises controlling a sound device of the motor vehicle to provide operating advice based on the input signal.
According to one embodiment of the invention, an operation on the interaction area comprises a plurality of operations on the component corresponding to it, the plurality of operations including operating a basic function and operating a secondary function.
According to one embodiment of the invention, the vehicle state is a real-time state of the vehicle, the real-time state comprising at least one of an accessory state of the motor vehicle and an environment surrounding the motor vehicle.
According to one embodiment of the invention, a holographic projection model receives the output signal and projects a holographic representation of the surrounding environment over a predetermined area.
According to the present invention, there is provided a motor vehicle comprising a motor vehicle interaction system, the system comprising:
an avatar interaction device located within the vehicle, the avatar interaction device having at least one interaction area associated with at least one vehicle component;
at least one sensor that senses a state of the vehicle;
a controller communicatively coupled to the avatar interaction device and the sensor, the controller configured to receive an input signal from at least one of the avatar interaction device and the sensor, and to provide an output signal to the avatar interaction device and/or to control at least one component of the vehicle based on the input signal.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the subject matter claimed. The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate exemplary embodiments of the disclosure and together with the description, serve to explain the principles of the disclosure.
Drawings
For a better understanding of the invention, reference may be made to the embodiments illustrated in the following drawings. The components in the figures are not necessarily to scale, and related elements may be omitted, or in some cases the scale may have been exaggerated, in order to emphasize and clearly illustrate the novel features described herein. In addition, the system components may be arranged differently as is known in the art. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views.
FIG. 1 shows a schematic view of a motor vehicle incorporating a motor vehicle interaction system according to the present invention;
FIG. 2 shows a schematic view of a passenger compartment incorporating a motor vehicle interaction system according to one embodiment of the invention;
FIG. 3 shows a schematic structural diagram of a motor vehicle interaction system according to an embodiment of the present invention;
FIG. 4 shows a physical model diagram according to an embodiment of the invention;
FIG. 5 shows a flow diagram of a motor vehicle interaction method according to an embodiment of the invention;
FIG. 6 shows a passenger compartment schematic of a motor vehicle interaction system according to another embodiment of the present invention;
FIG. 7 shows a schematic structural diagram of a motor vehicle interaction system according to another embodiment of the present invention;
FIG. 8 shows a flow diagram of a motor vehicle interaction method according to another embodiment of the invention.
Detailed Description
Embodiments of the present disclosure are described below. However, it is to be understood that the disclosed embodiments are merely examples and that other embodiments may take various and alternative forms. The figures are not necessarily to scale; certain features may be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the present invention. As one of ordinary skill in the art will appreciate, various features illustrated and described with reference to any one of the figures may be combined with features illustrated in one or more other figures to produce embodiments that are not explicitly illustrated or described. The combination of features shown provides a representative embodiment for a typical application. However, various combinations and modifications of the features consistent with the teachings of the present disclosure may be desirable for certain specific applications or implementations.
Referring to FIG. 1, a motor vehicle 100 incorporating the motor vehicle interaction system of the present invention is generally shown. The vehicle may be a standard gasoline-powered vehicle, a hybrid vehicle, an electric vehicle, a fuel cell vehicle, and/or any other type of vehicle, and may also be a bus, a watercraft, or an aircraft. The vehicle includes mobility-related components such as an engine, an electric motor, a transmission, a suspension, drive shafts, and/or wheels, among others. The vehicle may be non-autonomous, semi-autonomous (e.g., some conventional motor functions are controlled by the vehicle), or autonomous (e.g., motor functions are controlled by the vehicle without direct driver input). The vehicle 100 has a door 103, a window 104, a tailgate 105, a rearview mirror 106, a lamp 107, and a sunroof 108. In one embodiment shown in FIG. 2, the motor vehicle 100 has an instrument panel 101 and a center console 102 inside the passenger compartment.
In the present embodiment, the motor vehicle 100 has a motor vehicle interaction system 200 according to the present invention as shown in fig. 3. The motor vehicle interaction system 200 in this embodiment comprises a physical model 201, a controller 202, sensors 203, actuators 204, and vehicle components 205. The sensors 203 are used to sense the status of one or more vehicle components 205 and send the sensed status signals to the controller 202. The vehicle components 205 include, but are not limited to, the vehicle lights 107, the vehicle doors 103, the tailgate 105, the vehicle window 104, the sunroof 108, the rearview mirror 106, and the like. The sensors 203 include, but are not limited to, a vehicle light sensor, a vehicle door sensor, a tailgate sensor, a vehicle window sensor, a sunroof sensor, and the like. The sensors 203 of the above examples are, for example, but not limited to, inductive sensors, optical sensors, temperature sensors, resistive sensors, ultrasonic sensors, lasers, field effect sensors, the like, or combinations thereof. It is to be understood that the components and connection relationships of the interactive system described above are only used for illustration and not for limitation, and the interactive system including other components and connection relationships can also be applied to the technical solution of the present application.
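As a concrete illustration of this signal flow, the sketch below models the controller polling the sensors and driving the interaction-region indicators of the avatar device. It is a minimal sketch only: all class names, the LED convention, and the state enum are hypothetical, since the patent does not specify an implementation.

```python
from dataclasses import dataclass, field
from enum import Enum, auto
from typing import Callable, Dict, List

class ComponentState(Enum):
    OPEN = auto()
    CLOSED = auto()

@dataclass
class Sensor:
    """Senses one vehicle component (e.g., a door switch) and reports its state."""
    component: str
    read: Callable[[], ComponentState]

@dataclass
class AvatarDevice:
    """Stand-in for the physical model: one LED per interaction region."""
    leds: Dict[str, bool] = field(default_factory=dict)

    def show(self, component: str, state: ComponentState) -> None:
        # Light the region's LED while the associated component is open.
        self.leds[component] = state is ComponentState.OPEN

class Controller:
    """Routes sensor input signals to output signals for the avatar device."""
    def __init__(self, sensors: List[Sensor], avatar: AvatarDevice) -> None:
        self.sensors = sensors
        self.avatar = avatar

    def poll(self) -> None:
        for sensor in self.sensors:
            self.avatar.show(sensor.component, sensor.read())

# Usage: a door left open lights its model-door LED.
avatar = AvatarDevice()
Controller([Sensor("door", read=lambda: ComponentState.OPEN)], avatar).poll()
assert avatar.leds["door"] is True
```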
The controller 202 is communicatively coupled to the physical model 201. The controller may be part of a Vehicle Computing System (VCS) in the motor vehicle or part of a mobile device carried by a user. Communicatively coupling may include, but is not limited to, connecting via a vehicle bus or connecting wirelessly, and transmitting data and signals. The physical model 201 is a proportionally scaled-down model of the motor vehicle 100 and may be sized according to the design requirements of the vehicle passenger compartment and the ease of operation by the occupants. The physical model 201 has a plurality of interaction regions, each arranged in association with a vehicle component 205. In the present embodiment, the plurality of interaction regions are embodied as a model car light 207, a model car door 203, a model tailgate 205, a model car window 204, a model sunroof 208, a model rearview mirror 206, and the like, as shown in FIG. 4. It is understood that the interaction areas of the physical model can be arranged according to operational requirements and are not limited to the areas disclosed in the above embodiment.
In one embodiment, a plurality of LED light sets are respectively disposed in the plurality of interaction regions to provide a response to the status of the vehicle component 205 output by the controller 202. Through the illuminated indications of these LED light sets, a vehicle occupant can conveniently learn the current condition of the various components of the vehicle. For example, when any one of the door 103, the window 104, or the sunroof 108 is in an open state, the corresponding sensor senses the open state of that vehicle component 205 and transmits an open-state signal to the controller 202. The controller 202 then sends an output signal to the physical model 201, which prompts the vehicle occupant that these components are open by illuminating the model door 203, the model window 204, and the model sunroof 208.
Occupant input devices are also provided in the plurality of interaction regions for manipulating the state of the vehicle components 205. The input devices in this embodiment may include, but are not limited to, push buttons, knobs, and touch switches. The occupant can operate the vehicle component 205 corresponding to an interaction region by operating the input device in that region. For example, on seeing the light prompt indicating that the door 103 is open, the occupant may press a switch provided on the model door 203; an input signal for this operation is transmitted from the physical model 201 to the controller 202, which converts it into a control signal and controls the door actuator 204 to close the door. After the door is closed, the sensor 203 feeds a closed-state signal back to the controller 202, which sends it to the physical model 201; once this feedback confirming the closing of the door 103 is received, the LED light set on the model door 203 of the physical model 201 is turned off. If the door 103 is not properly closed by the operation, the LED light set remains lit to notify the vehicle occupant. It will be appreciated that other vehicle components 205 may be operated accordingly in accordance with the present example.
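This press-then-confirm behavior can be sketched as follows; the handler, the state strings, and the callback names are assumptions for illustration. The key point is that the LED clears only after the sensor confirms the new state.

```python
from typing import Callable

def handle_region_press(
    component: str,
    actuate: Callable[[], None],      # e.g., drive the door-close motor
    read_state: Callable[[], str],    # re-read the component's sensor
    set_led: Callable[[bool], None],  # the interaction-region LED
) -> None:
    """Hypothetical press handler: actuate, then keep the LED lit only on failure."""
    actuate()
    still_open = read_state() == "open"
    set_led(still_open)  # off once the sensor confirms the component closed

# Usage: a press that successfully closes the door turns its LED off.
led = {"on": True}
handle_region_press(
    "door",
    actuate=lambda: None,
    read_state=lambda: "closed",
    set_led=lambda on: led.update(on=on),
)
assert led["on"] is False
```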
In another embodiment of the present application, other tactile, acoustic, olfactory devices may also be disposed on the plurality of interactive regions to provide a response to the vehicle component condition. For example, in the case of receiving an output signal representing the state of a vehicle component from the controller 202, a warning may be provided to the vehicle occupant by a force feedback device, a buzzer, and an olfactory alarm provided in the interaction region of the physical model 201.
In addition, in the present embodiment, the controller 202 may also be used to control a control screen on the center console 102 of the vehicle 100. Besides operating the interaction regions of the physical model 201 to enable easy operation of multiple vehicle components 205, secondary options for operating a vehicle component 205 may be displayed on the control screen at the same time as its interaction region is operated. For example, when the LED light set on the model door 203 of the physical model 201 is turned on, indicating that the door 103 associated with the model door 203 is not properly closed, the vehicle occupant can close the associated door 103 by touching a switch on the model door 203. At the same time, further operating options for the door, such as "lock door", "unlock door", "raise/lower window", etc., may be displayed on the control screen. Likewise, while touching the model rearview mirror 206 adjusts the rearview mirror 106 from the folded state to the unfolded state, height and angle adjustment buttons for the rearview mirror 106 can be displayed on the control screen to enable convenient adjustment of its height and angle. Additionally, a vehicle occupant profile may be stored in the VCS of the vehicle 100, locally, on a mobile device, or in the cloud. The occupant profile is associated with a particular vehicle occupant; by identifying biological characteristics of the occupant, including but not limited to fingerprints, irises, etc., the vehicle occupant is matched with the associated occupant profile, which is displayed on a secondary menu of the control screen. The occupant can then apply personalized vehicle component settings by simply selecting that option, such as raising the window of a door while closing the door, or deploying the rearview mirror while adjusting its height and angle to the occupant's preferred position. It is understood that other vehicle components 205 may be operated in the manner described above, which will not be described in detail here.
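A minimal sketch of how such secondary options and an occupant profile might be combined into the control-screen menu is given below; the option strings, the profile format, and the function names are all invented for illustration and are not taken from the patent.

```python
from dataclasses import dataclass
from typing import Dict, List

# Hypothetical secondary options per component, mirroring the control-screen
# menu described above; the option strings are illustrative only.
SECONDARY_OPTIONS: Dict[str, List[str]] = {
    "door": ["lock door", "unlock door", "raise/lower window"],
    "mirror": ["adjust height", "adjust angle"],
}

@dataclass
class OccupantProfile:
    """Personalized settings applied via the secondary menu."""
    occupant_id: str
    preferences: Dict[str, str]  # e.g., {"mirror": "height=3;angle=12"}

def on_region_operated(component: str, profile: OccupantProfile) -> List[str]:
    # Show the secondary options for this component, plus a one-click entry
    # applying the matched occupant's stored preference, if any.
    options = list(SECONDARY_OPTIONS.get(component, []))
    if component in profile.preferences:
        options.append(f"apply profile: {profile.preferences[component]}")
    return options

# Usage: a recognized occupant sees their stored mirror preference as an option.
print(on_region_operated("mirror", OccupantProfile("p1", {"mirror": "height=3;angle=12"})))
```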
Further, in the present embodiment, the occupant can set the response behavior of the interaction regions of the physical model 201 according to personal preference, including the conditions under which a response is activated. For example, the physical model 201 may be set so that the model door 203 issues a warning signal indicating that the door is not closed correctly only when the vehicle 100 is started, and issues no warning signal when the vehicle is switched off. It will be appreciated that the warning signals of the other vehicle components 205 may be configured in the same manner.
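Such occupant-configurable response conditions might be represented as simple predicates over the vehicle state, as in this sketch; the rule table and the vehicle_started flag are assumptions for illustration, not details from the patent.

```python
from typing import Callable, Dict

# An alert for a component fires only when its predicate over the vehicle
# state holds; components without a rule warn unconditionally.
AlertRule = Callable[[Dict[str, bool]], bool]

rules: Dict[str, AlertRule] = {
    "door": lambda state: state["vehicle_started"],  # warn only after start
}

def should_warn(component: str, state: Dict[str, bool]) -> bool:
    rule = rules.get(component)
    return rule(state) if rule else True

assert should_warn("door", {"vehicle_started": False}) is False
assert should_warn("door", {"vehicle_started": True}) is True
```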
According to one embodiment of the invention, the physical model 201 shown in FIG. 4 is rotatably mounted on a base 210 by a pivot 209, and the pivot 209 is connected to and driven by a motor (not shown) housed in the base 210. It is understood that the physical model 201 can be driven in other ways and is not limited to the method of the above embodiment. The base 210 is attached to a suitable part of the vehicle 100 by a detachable connection, including but not limited to a bayonet structure, an electrostatic patch, etc., so that the physical model 201 sits in a position easily reached by the vehicle occupant.
In this embodiment, the physical model 201 also includes a wireless communication module that can communicate with various vehicle components, such as the controller 202, and auxiliary components over a vehicle network such as, but not limited to, a Controller Area Network (CAN) bus, a Local Interconnect Network (LIN) bus, a Media Oriented Systems Transport (MOST) bus, or an Ethernet bus. The communication module may communicate with other devices or information sources using various wireless communication standards or protocols to exchange information between the various components and information sources. These standards or protocols may include cellular communication standards (including Global System for Mobile Communications (GSM), Code Division Multiple Access (CDMA), Time Division Multiple Access (TDMA), Universal Mobile Telecommunications System (UMTS), 3G or 4G systems, and 5G systems meeting International Telecommunication Union (ITU) specification requirements for speeds of up to 20 Gbit/s), the IEEE 802.11 standards (802.11b, 802.11g, 802.11n, etc.), the Bluetooth standards (Bluetooth and/or Bluetooth Low Energy), Dedicated Short Range Communications (DSRC), and the like.
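For example, a frame carrying a model status could be placed on a CAN bus with the python-can library, as sketched below under the assumption of a Linux SocketCAN channel named can0; the arbitration ID and payload layout are purely illustrative and do not correspond to any real vehicle message set.

```python
import can  # python-can; assumes a SocketCAN channel named "can0" exists

def send_door_state(bus: can.BusABC, door_open: bool) -> None:
    """Broadcast a hypothetical door-state frame from the avatar device."""
    msg = can.Message(
        arbitration_id=0x123,                # assumed ID, illustrative only
        data=[0x01 if door_open else 0x00],  # assumed 1-byte payload
        is_extended_id=False,
    )
    bus.send(msg)

if __name__ == "__main__":
    bus = can.interface.Bus(channel="can0", bustype="socketcan")
    try:
        send_door_state(bus, door_open=True)
    finally:
        bus.shutdown()
```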
FIG. 5 shows a flow chart for the interactive system of the above-described embodiment of the present application. At 300, the interactive system 200 of the vehicle 100 is activated. Activation may be performed in various ways: for example, when a vehicle occupant authorized to use the vehicle 100 approaches to within a preset distance of the vehicle 100, a proximity sensor sends a sensing signal and the interactive system 200 is activated by the controller. It may also be activated by the opening of a door 103, the occupancy of a seat, the insertion of a seat-belt buckle, or a manual or voice command from the vehicle occupant. It should be understood that the above examples are not intended to be limiting, and that other vehicle component activation methods may be used to activate the interactive system 200 of the present application.
After the interactive system 200 is activated, the LED light sets of the interaction regions of the physical model 201 are illuminated to indicate the locations of the operable interaction regions. Meanwhile, the interactive system 200 obtains from the controller 202 an output signal based on the sensed signal from the sensor 203 representing the state of the vehicle component 205. After acquiring the output signal, the process proceeds to 320, where the physical model 201 displays the status of the vehicle component 205 via the LED light sets: for example, a flashing light set indicates that one or more of the vehicle door 103, the vehicle window 104, the tailgate 105, and the sunroof 108 are in an open state, or that the vehicle rearview mirror 106 is still in a folded state.
Subsequently, at step 330, if the occupant wishes to control the state of a vehicle component 205, it may be controlled by operating the corresponding interaction region, for example closing the door 103, closing the window 104, or unfolding the mirror 106. After such a basic-function operation is performed, the LED light set of the interaction region stops flashing and either continues to indicate the position of the interaction region by remaining lit or goes out. Further, while the occupant operates an interaction region, the control screen of the vehicle 100 displays secondary operating options for the vehicle component 205 associated with that region, thereby enabling further operations on the vehicle component 205, which may include, but are not limited to, locking or unlocking the door 103, adjusting the angle of the rearview mirror 106, applying an occupant profile, and the like.
Further, it is understood that once the interactive system 200 is activated, the interaction regions may be used to operate the vehicle components 205 regardless of whether they are signaling a warning about the vehicle's status. That is, the physical model 201 may serve as a normally functioning component for human-machine interaction between the vehicle occupant and the vehicle 100. For example, with the physical model 201 activated while the vehicle 100 is being driven, the vehicle occupant can control the functions of the vehicle components 205 through the interaction regions even if their LED light sets are not lit or flashing due to a warning signal: clicking the model window 204 raises or lowers the window 104, and clicking the model rearview mirror 206 extends or folds the rearview mirror 106, whose angle may then be adjusted via a secondary option on the control screen.
Further, upon completion of the above operations, the interactive system 200 may receive a deactivation signal from the controller 202 to cease use of the interactive system 200. The controller 202 may issue the deactivation signal based on a variety of conditions including, but not limited to, the vehicle having started traveling and the speed exceeding a preset threshold, various vehicle components being in a normal state, the vehicle occupant being away from the vehicle and exceeding a preset distance, receiving a vehicle occupant's turn-off command, etc. It is understood that other similar situations may be equally applicable to deactivating the vehicle interaction system 200.
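The activation conditions at 300 and the deactivation conditions just described might be expressed as simple threshold checks over a small vehicle context, as in this sketch; the threshold values, flag names, and dataclass are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class VehicleContext:
    """Assumed state flags feeding activation/deactivation decisions."""
    occupant_distance_m: float
    speed_kmh: float
    all_components_normal: bool
    off_command: bool = False

ACTIVATION_DISTANCE_M = 2.0    # preset proximity threshold (assumed value)
DEACTIVATION_SPEED_KMH = 20.0  # preset speed threshold (assumed value)

def should_activate(ctx: VehicleContext) -> bool:
    # Proximity-based activation; door/seat/buckle triggers would OR in here.
    return ctx.occupant_distance_m <= ACTIVATION_DISTANCE_M

def should_deactivate(ctx: VehicleContext) -> bool:
    return (
        ctx.off_command
        or ctx.speed_kmh > DEACTIVATION_SPEED_KMH
        or ctx.all_components_normal
    )

assert should_activate(VehicleContext(1.5, 0.0, False))
assert should_deactivate(VehicleContext(0.0, 35.0, False))
```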
According to another embodiment of the invention, as shown in FIGS. 6 and 7, the motor vehicle 100 comprises an interaction system 400 comprising an avatar interaction device 401, a controller 402, a sensor 403, an actuator 404, and a vehicle component 405. The avatar interaction device 401 in this embodiment is a holographic projection interaction device comprising a holographic projection model 4010, a holographic projector 4011, and a position sensing device 4012. The holographic projector 4011 has a processing module and a memory unit and can be activated based on occupant input signals to project the holographic projection model 4010 of the vehicle at a preset position within the passenger cabin of the vehicle 100, the holographic projection model 4010 being a three-dimensional (3D) model. The processing module may create, around a target object (e.g., a person, an animal, or a moving object), a dynamic 3D model by means of a Graphics Processing Unit (GPU), a 3D pre-processing module, and a 3D reconstruction module, together with background 3D graphical content and 360-degree virtual reality or video content; the dynamic 3D model created by the 3D reconstruction module is the holographic projection model 4010. In one embodiment, the 3D pre-processing module and the 3D reconstruction module are graphics processing software executing on a scalable number of GPUs. In another embodiment, these modules may be hard-coded special-purpose semiconductor chipsets or other hardware that operates with the GPU to provide 3D processing and reconstruction. The user input signal in the present embodiment may include an action of an occupant approaching the vehicle, entering the vehicle, starting the vehicle, or the like, or an occupant's sound, motion, or similar action sensed by the sensor.
When the holographic projector 4011 is activated and projects the holographic projection model 4010 at a preset position, for example above the dashboard as shown in FIG. 6, the holographic projection model 4010 is a proportionally scaled-down holographic model obtained by image acquisition and 3D reconstruction of the vehicle 100. Typically, creation of a digitized holographic model involves eight or more time-synchronized multi-angle image captures of the target object. It will be appreciated that the number of time-synchronized multi-angle image captures required may be determined according to the structure of the captured target and the degree of detail required, and is not limited to the specific examples given in the above embodiments. The image capture conversion system then performs a volume conversion of the eight or more time-synchronized multi-angle image captures to create the digitized holographic model content and construct a hologram of the object. The holographic projection model 4010 has a plurality of interaction regions associated with the vehicle 100, which may be located at various positions of the holographic projection model 4010 such as the doors, windows, lights, sunroof, tailgate, rearview mirrors, etc.; it is understood that other locations of the holographic projection model 4010 may have interaction regions as well.
The sensors 403 of the vehicle 100 sense the states of the plurality of vehicle components 405 and transmit the state parameters to the controller 402; the controller 402 converts the state parameters into output signals and transmits them to the holographic projector 4011, which projects the holographic projection model 4010 of the vehicle 100 at a preset position, in this embodiment above the vehicle center console, based on the output signals from the controller 402. The plurality of interaction regions of the holographic projection model 4010 may display the current state of the vehicle in real time. For example, if the door 103 of the vehicle 100 is not closed, the corresponding model door in the interaction region of the holographic projection model 4010 is likewise shown in an open state, and the vehicle occupant is alerted to this state by a blinking light.
In one embodiment of the invention, the driver or passenger can interact with the holographic interaction system 400 with specific in-air gesture commands near or inside the holographic projection model 4010. Preferably, the holographic projection model 4010 generated by the holographic interaction system 400 does not distract the driver from driving, and may be particularly useful in autonomous or semi-autonomous driving environments where the driver is not required to be fully concerned with the current driving environment.
The occupant can operate, for example, a model door in an interaction region by hand. When the occupant's hand approaches the model door, the position sensing device 4012 of the avatar interaction device 401 detects the occupant's motion; when the occupant's finger approaches or enters the operable region, the position sensing device 4012 feeds the operation instruction obtained from the holographic projection model 4010 to the holographic projector 4011, which sends it to the controller 402. After the controller 402 receives the instruction, it controls the door, for example closing it, by driving the actuator 404 of the vehicle 100. It is understood that other components 405 of the vehicle 100 may likewise feed their real-time status back to the vehicle occupant through the holographic projection model 4010 in the above-described manner, and that control of the basic functions of the components 405 may be implemented by operating the interaction regions of the holographic projection model 4010. In addition, the position sensing device 4012 may also recognize different occupant gestures; for example, it may be configured to recognize the driver's finger pinch (i.e., fingers squeezing together) and finger spread (i.e., fingers moving apart) gestures. If the user's current hand/finger position or gesture correctly matches a command of the pre-configured holographic model, one or more commands associated with the user's current gesture are executed and reflected in the holographic projection model 4010 or a related multimedia interaction system. In one example, a finger pinch gesture of the driver may be interpreted as zooming out of the current holographic image and a finger spread gesture as zooming in, and rotation of the holographic projection model 4010 may be controlled by clockwise and counterclockwise rotation of a finger. It is understood that different meanings and interpretations of gestures are equally possible without going beyond the scope of the claims of the present application.
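A dispatcher mapping recognized gestures to hologram commands, following the pinch/spread and rotation conventions just described, might look like the sketch below; the zoom factors, the rotation step, and the Hologram stand-in are invented for illustration.

```python
from enum import Enum, auto
from typing import Callable, Dict

class Gesture(Enum):
    PINCH = auto()       # fingers squeeze together
    SPREAD = auto()      # fingers move apart
    ROTATE_CW = auto()
    ROTATE_CCW = auto()

class Hologram:
    """Stand-in for the projected model's render state."""
    def __init__(self) -> None:
        self.scale, self.angle = 1.0, 0.0
    def zoom(self, factor: float) -> None:
        self.scale *= factor
    def rotate(self, degrees: float) -> None:
        self.angle = (self.angle + degrees) % 360

def make_dispatcher(hologram: Hologram) -> Dict[Gesture, Callable[[], None]]:
    # Hypothetical gesture-to-command table: pinch zooms out, spread zooms in,
    # finger rotation turns the model; all step sizes are assumed values.
    return {
        Gesture.PINCH: lambda: hologram.zoom(0.8),
        Gesture.SPREAD: lambda: hologram.zoom(1.25),
        Gesture.ROTATE_CW: lambda: hologram.rotate(+15),
        Gesture.ROTATE_CCW: lambda: hologram.rotate(-15),
    }

# Usage: a spread gesture enlarges the projected model.
h = Hologram()
make_dispatcher(h)[Gesture.SPREAD]()
assert h.scale > 1.0
```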
In one embodiment of the present application, when an occupant operates an interaction region, for example closing the model door with a gesture, the holographic projector 4011 may project an optional secondary menu at a position adjacent to that interaction region. For example, when the vehicle occupant closes the model door by gesture, the occupant may wish to operate further functions associated with the door; the holographic projector 4011 then projects other function items associated with the model door adjacent to it, including but not limited to "lock door", "unlock door", "raise/lower window", etc. The occupant may, for instance, choose to lock the door, in which case the door 103 is locked as it closes. It will be appreciated that the functions of unlocking the door and/or raising/lowering the window can be operated in the same manner. Likewise, if the rearview mirror of the vehicle 100 is in the folded state, then while a gesture on the model rearview mirror unfolds the mirror, height and angle adjustment buttons for the mirror may be projected at a position close to the model rearview mirror, allowing convenient adjustment of the mirror's height and angle. Additionally, a vehicle occupant profile may be stored in the VCS of the vehicle 100, locally, on a mobile device, or in the cloud. The occupant profile is associated with a particular vehicle occupant; by identifying biological features of the occupant, including but not limited to fingerprints, irises, etc., the vehicle occupant is matched with the associated occupant profile, which is displayed on the secondary menu projected by the holographic projector 4011. The occupant can then apply personalized vehicle component settings by simply selecting that option, e.g., raising the window of a door while closing it, or deploying the rearview mirror while adjusting its height and angle to the occupant's preferred position. It is understood that other vehicle components 405 may be operated in the manner described above, which will not be described in detail here.
In one embodiment of the present application, as shown in FIG. 7, the sensors 403 are used to obtain not only the status of the vehicle components 405, but also information about the surrounding environment 406 of the vehicle 100. The surrounding environment 406 is captured, for example, by an onboard camera and sent to the controller 402. The controller 402 converts the information about the surrounding environment 406 into an output signal and transmits it to the holographic projector 4011, whose image processing projects a holographic image at the preset position, so that the vehicle occupant can visually see the surroundings of the vehicle 100 through the holographic projection model 4010. As the driver drives the vehicle, the sensors 403 continuously acquire parameters of the surrounding environment 406, which are transmitted to the controller 402 in real time and projected at the predetermined position by the holographic projector 4011 to provide guidance in the current environment.
Furthermore, in one embodiment, the holographic interactive system 400 may be communicatively connected to a navigation module including, but not limited to, a car navigation module or other mobile devices, the car navigation module transmits navigation information to the holographic interactive system 400, and the holographic projector 4011 projects a holographic image of a navigation route to a preset area based on the navigation information and parameters of the surrounding environment 406 sent by the controller 402, so as to provide the navigation information to the vehicle occupant in the form of a holographic reality. It will be appreciated that other possible navigation devices or functions may likewise be used in conjunction with the holographic interaction system 400 to provide holographic image navigation to the occupant.
In addition, the holographic projector 4011 has an adjustable projection angle to project the holographic projection model 4010 to different preset positions, so as to meet the requirement of a vehicle passenger for operating the holographic projection model 4010 according to personal preference and habit.
A flow chart of one embodiment of the present application is shown in FIG. 8. At 500, the interaction system 400 of the vehicle 100 is activated. Activation of the interactive system 400 of the vehicle 100 may be performed in a variety of ways, for example by sensing the proximity of a vehicle occupant with a proximity sensor, the interactive system 400 being activated in advance when the vehicle occupant is within a preset distance of the vehicle 100. It may also be activated by a door being opened, a seat being occupied, a seat-belt buckle being inserted, or a manual or voice command from the vehicle occupant. It should be understood that the above examples are not intended to be limiting, and that other vehicle component activation methods may be used to activate the interactive system 400 of the present application.
After the interactive system 400 is activated, the flow proceeds to 510, where the holographic projector 4011 projects the holographic projection model 4010 of the vehicle 100 at a preset location. The holographic projection model 4010 has a plurality of interaction regions associated with the vehicle 100. The flow then proceeds to 520, where the interactive system 400 obtains from the controller 402 an output signal based on the sensed signals from the sensors 403 for the status of the vehicle components 405 and the surroundings of the vehicle 100. After obtaining the output signals, the process proceeds to 530, where the holographic projection model 4010 displays the state of each vehicle component through the interaction region associated with that vehicle component 405: for example, one or more of the model door, model window, model tailgate, and model sunroof is shown open, possibly accompanied by a flashing visual indication, to alert the occupant that the corresponding door, window, tailgate, or sunroof is in an open state, or the model rearview mirror is shown folded to indicate that the vehicle rearview mirror is still in a folded state. The holographic projector 4011 may further project a holographic image of the surroundings at a predetermined position according to the received output information related to the surroundings of the vehicle.
The flow then proceeds to 540. If the occupant wishes to change the state of a vehicle component 405, it may be controlled by a gesture on the corresponding interaction region. Upon the occupant's gesture, the position sensing device 4012 senses it; when the occupant's hand approaches an interaction region, the position sensing device 4012 determines that the occupant intends to operate that region and transmits an instruction signal to the holographic projector 4011 and on to the controller 402, which then operates the actuator 404 to complete the operation of the related vehicle component, such as closing the door, closing the window, unfolding the rearview mirror, and the like. After such a basic-function operation is performed, the sensor 403 feeds a completion signal back to the controller 402, which accordingly controls the holographic projector 4011 to project the holographic projection model 4010 in conformity with the current vehicle state; that is, the model door, model window, model rearview mirror, and other interaction regions of the projected holographic projection model 4010 are shown respectively closed or unfolded. In this way, the current state of the vehicle is communicated unambiguously to the vehicle occupants. Upon completion of the operation, the visual indicator of the interaction region may stop flashing. Further, while the occupant operates one or more interaction regions of the holographic projection model 4010, the holographic projector 4011 projects, adjacent to the operated region or regions, a secondary option menu for the associated component, displaying secondary operating options for the vehicle component 405 and thereby enabling further operations, which may include, for example, locking or unlocking a door, adjusting the rearview mirror angle, loading an occupant profile, and the like.
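Condensing this flow, a single sense-operate-reproject pass might be sketched as follows, with the four callbacks standing in for the sensors 403, the position sensing device 4012, the actuators 404, and the holographic projector 4011; all names and the loop structure are assumptions rather than the patent's implementation.

```python
from typing import Callable, Dict, Optional

def interaction_step(
    read_states: Callable[[], Dict[str, str]],   # sensors 403 (sketch)
    detect_gesture: Callable[[], Optional[str]], # position sensing 4012 (sketch)
    actuate: Callable[[str], None],              # actuators 404 (sketch)
    project: Callable[[Dict[str, str]], None],   # projector 4011 (sketch)
) -> None:
    """One pass of a hypothetical sense -> operate -> re-project loop: a
    detected gesture drives the actuator, then the hologram is re-projected
    so it always mirrors the confirmed vehicle state."""
    target = detect_gesture()
    if target is not None:
        actuate(target)
    project(read_states())

# Usage: a gesture on the model door closes the door; the next projection
# shows it closed.
states = {"door": "open"}
interaction_step(
    read_states=lambda: states,
    detect_gesture=lambda: "door",
    actuate=lambda c: states.update({c: "closed"}),
    project=lambda s: print("projecting", s),
)
```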
Further, upon completion of the above operations, the interactive system 400 may receive a deactivation signal from the controller 402 to cease use of the interactive system 400. The controller 402 may issue the deactivation signal based on a variety of conditions including, but not limited to, the vehicle having started traveling and the speed exceeding a preset threshold, various vehicle components being in a normal state, the vehicle occupant being away from the vehicle and exceeding a preset distance, receiving a vehicle occupant's turn-off command, etc. It is understood that other similar situations may be equally applicable to disabling the vehicle interaction system 400.
By means of the interactive system in the above embodiments, the vehicle driver or passenger can understand the vehicle state more intuitively and operate the basic functions of the vehicle easily and conveniently, without being encumbered by complicated operating buttons on the control panel and the various indicator symbols on the instrument panel. Driving safety is improved while the convenience of vehicle operation is enhanced.
The technical features listed above for the different embodiments may be combined with each other, and several steps in the methods listed above may also be changed, omitted, combined with each other, where technically feasible, to form further embodiments within the scope of the invention.
In this application, the use of the disjunctive is intended to include the conjunctive. The use of definite or indefinite articles is not intended to indicate cardinality. In particular, a reference to "the" object or "a" and "an" object is intended to denote also one of a possible plurality of such objects. Furthermore, the conjunction "or" may be used to convey features that are simultaneously present rather than mutually exclusive alternatives. In other words, the conjunction "or" should be understood to include "and/or". The terms "including" and "includes" are inclusive and have the same scope as "comprising" and "comprises", respectively.
The above-described embodiments, particularly any "preferred" embodiments, are possible examples of implementations, and are presented merely for a clear understanding of the principles of the invention. Many variations and modifications may be made to the above-described embodiments without departing substantially from the spirit and principles of the technology described herein. All such modifications are intended to be included within the scope of this disclosure and protected by the following claims.

Claims (20)

1. A motor vehicle interaction system, comprising:
an avatar interaction device located within the vehicle, the avatar interaction device having at least one interaction area associated with at least one vehicle component;
at least one sensor that senses a state of the vehicle;
a controller communicatively coupled to the avatar interaction device and the sensor, the controller configured to receive an input signal from at least one of the avatar interaction device and the sensor, and to provide an output signal to the avatar interaction device and/or to control the at least one component of the vehicle based on the input signal.
2. The interactive system of claim 1, wherein the state of the motor vehicle includes at least one of an accessory state and an environment surrounding the motor vehicle;
the accessories comprise at least one of a car lamp, a car door, a tail door, a car window, a skylight and a rearview mirror;
the sensor includes at least one of a vehicle light sensor, a vehicle door sensor, a vehicle window sensor, a displacement sensor, and a camera.
3. The interaction system of claim 1, wherein said interaction zone of said avatar interaction device is configured to provide visual, tactile, acoustic, olfactory responses for prompting a state of said at least one component of said vehicle based on said output signal, said interaction zone comprising at least one of an avatar light, an avatar door, an avatar tailgate, an avatar window, and an avatar rearview mirror.
4. The interaction system according to claim 3, wherein the avatar interaction device is a physical model or a holographic projection model of the motor vehicle.
5. The interaction system according to claim 4, wherein said interaction area of said physical model is provided with at least one of an optical means and a sound device, said response being displayed by said optical means and/or sound device.
6. The interaction system as claimed in claim 4, wherein the inputtable interaction regions of the physical model comprise buttons and/or touch switches configured for controlling the at least one vehicle component with which the interaction regions are associated.
7. The interactive system as claimed in claim 4, wherein the physical model is rotatably disposed on a base detachably disposed above the instrument panel, the physical model being rotated by a driving means installed inside the base.
8. The interactive system according to claim 4, wherein the holographic projection model comprises a holographic projector, a position detection device and a motor vehicle hologram, the position detection device being communicatively connected to the holographic projector.
9. The interactive system as claimed in claim 8, wherein the holographic projector is communicatively coupled to the controller and displays a real-time status of the motor vehicle based on the received output signal, the real-time status including at least one of an accessory status of the motor vehicle and an environment surrounding the motor vehicle.
10. The interaction system according to claim 8, wherein the motor vehicle hologram has a plurality of inputtable interaction regions, the position detection device detects operation of the inputtable interaction regions by a motor vehicle occupant, and the controller controls the respective accessories of the motor vehicle in accordance with the operation.
11. The interactive system according to claim 8, wherein the holographic projector is connected to a body part in a passenger compartment, the holographic projector projecting the motor vehicle hologram on at least one predefined location in the passenger compartment.
12. A motor vehicle interaction method, comprising:
collecting the vehicle state by at least one sensor;
activating an avatar interaction device within the vehicle, the avatar interaction device having at least one interaction region associated with at least one vehicle component;
a controller receives input signals from at least one of the sensor and the avatar-interacting device, and provides output signals to the avatar-interacting device and/or controls the at least one component of the vehicle based on the input signals.
13. The interaction method of claim 12, wherein the interaction region provides a visual, tactile, acoustic, olfactory response based on the output signal, the response being indicative of a state of the at least one component of the vehicle.
14. The interaction method of claim 12, wherein controlling the motor vehicle comprises controlling accessories of the motor vehicle, the accessories including at least one of a vehicle light, a vehicle door, a tailgate, a vehicle window, a sunroof, a rearview mirror.
15. The interaction method according to claim 12,
the avatar interaction device is a physical model or a holographic projection model of the motor vehicle.
16. The method of claim 15, wherein the vehicle state is a real-time state of the vehicle, the real-time state including at least one of an accessory state of the motor vehicle and an environment surrounding the motor vehicle.
17. The method of claim 16, wherein the holographic projection model receives the output signal and projects a holographic representation of the ambient environment over a predetermined area.
18. The interaction method of claim 12, wherein controlling the motor vehicle comprises controlling a sound device of the motor vehicle to provide operating advice based on the input signal.
19. The method of claim 12, wherein an operation on the interaction zone comprises a plurality of operations on the component corresponding thereto, the plurality of operations including operating a basic function and operating a secondary function.
20. A motor vehicle comprising a motor vehicle interaction system, the system comprising:
an avatar interaction device located within the vehicle, the avatar interaction device having at least one interaction area associated with at least one vehicle component;
at least one sensor that senses a state of the vehicle;
a controller communicatively coupled to the avatar interaction device and the sensor, the controller configured to receive an input signal from at least one of the avatar interaction device and the sensor, and to provide an output signal to the avatar interaction device and/or to control the at least one component of the vehicle based on the input signal.
CN201910697269.2A · Priority 2019-07-30 · Filed 2019-07-30 · Motor vehicle interaction system and method · Pending · CN112389458A

Priority Applications (1)

Application Number: CN201910697269.2A · Priority/Filing Date: 2019-07-30 · Title: Motor vehicle interaction system and method


Publications (1)

Publication Number Publication Date
Publication Number: CN112389458A · Publication Date: 2021-02-23

Family

ID=74601246

Family Applications (1)

Application Number: CN201910697269.2A · Priority/Filing Date: 2019-07-30 · Title: Motor vehicle interaction system and method · Status: Pending

Country Status (1)

Country Link
CN (1) CN112389458A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination