WO2014181146A1 - System and method for controlling a vehicle - Google Patents


Publication number
WO2014181146A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
vehicle
control system
function
recognition unit
Application number
PCT/IB2013/001225
Other languages
English (en)
Inventor
Raphael Ribero
Original Assignee
Renault Trucks
Application filed by Renault Trucks filed Critical Renault Trucks
Priority to PCT/IB2013/001225 priority Critical patent/WO2014181146A1/fr
Publication of WO2014181146A1 publication Critical patent/WO2014181146A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0016Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means

Definitions

  • the present invention relates to a system and a method for controlling vehicle functions by body postures and/or body movements.
  • the human machine interface relies on devices such as the steering wheel, joystick, switches, etc.
  • the user interacts with the machine and the machine accordingly responds by a machine output.
  • the physical interaction with the vehicle can be a problem.
  • a further example of unsatisfactory vehicle interactions can be found for example in the field of construction vehicles.
  • a number of operations have to be performed such as dump tilting, crane elevation etc.
  • Due to the general size of the vehicle it can prove to be difficult to operate the vehicle through a user interface physically connected to the vehicle while simultaneously taking into account the position of the vehicle or the position of some mobile vehicle equipment, such as a crane or a tipper, in the vehicle environment.
  • the invention concerns a vehicle control system to control one or more vehicle functions of a vehicle by recognition of different postures or movements of a user.
  • the vehicle control system comprises :
  • a motion recognition unit positioned and oriented so that it can capture images of postures or movements assumed or performed by the user in a surrounding area of the vehicle ;
  • an electronic control unit in communication with the motion recognition unit and configured to recognize user postures or movements by comparison with predetermined postures or movements stored in a memory of the vehicle control system.
  • the electronic control unit of the vehicle control system is configured to generate, depending on the result of the comparison, a control signal in order to control one or more vehicle functions in response to the user postures or movements.
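The chain just described, from a recognized posture or movement to a control signal, can be sketched in a few lines. This is a minimal illustration of the principle only; the gesture labels, operation names and lookup-table design are assumptions, not taken from the patent:

```python
# Minimal sketch: a recognized user posture/movement is looked up in a
# table of preregistered gestures and, on a match, a control signal
# naming the corresponding vehicle operation is generated.
# All gesture labels and operation names below are hypothetical.
GESTURE_TO_OPERATION = {
    "arms_raised": "tailgate_lift_up",
    "arms_lowered": "tailgate_lift_down",
    "left_arm_out": "suspension_raise_rear",
}

def generate_control_signal(recognized_gesture):
    """Return a control-signal dict for a recognized gesture, or None
    when no preregistered gesture matches the captured posture."""
    operation = GESTURE_TO_OPERATION.get(recognized_gesture)
    if operation is None:
        return None
    return {"type": "control_signal", "operation": operation}
```

In the patent the comparison is performed on digital image data rather than on symbolic labels; the table here merely stands in for the sets of comparison data stored in the memory of the vehicle control system.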
  • the invention makes it possible to control at least one vehicle function by recognition of user posture(s) or user movement(s). This proves to be a great improvement over traditional control interfaces which can be a push button or a rotary knob fixed in the vehicle and which oblige the user to be positioned in order to reach the control interface. In contrast, with the invention this user has a degree of freedom - within the field of detection of the motion recognition unit - to control a vehicle from the outside.
  • the motion recognition unit is positioned on an outside surface of the vehicle and is in wireless communication or in wire connection with the electronic control unit.
  • the vehicle control system comprises an activation interface (30, 31, 32) that includes one or a combination of several elements among the following: a switch located on the vehicle dashboard, a remote interface such as a smart phone, or a microphone unit.
  • the motion recognition unit comprises at least one camera that is preferably a three-dimensional camera ("3D camera").
  • the motion recognition unit is located in a side rear view mirror unit of the vehicle.
  • said side rear view mirror unit is fixed to the vehicle via at least one articulation and the orientation of the motion recognition unit can be adjusted by modifying the orientation of said rear view mirror unit via said articulation.
  • said articulation is motorized.
  • the vehicle function controlled by the system includes at least one vehicle function amongst the following :
  • the invention concerns a controlling method for controlling one or more functions of a vehicle that is equipped with a vehicle control system that comprises at least one motion recognition unit and an electronic control unit that is in communication with the motion recognition unit.
  • the method comprises the steps of :
  • each set of comparison data represents a preregistered user posture or user movement or represents a preregistered series of user postures or user movements corresponding to at least one operation that can be executed by a vehicle function ;
  • the terms “the corresponding operation” and “the identified operation” refer to the at least one operation that corresponds to the preregistered user posture or user movement or to the preregistered series of user postures or user movements represented by the set of comparison data identified by the electronic control unit as matching with the digital signal.
  • a user can control one or several functions of a vehicle from the outside by assuming different body postures and/or by performing different body movements. Another advantage is that the user can visually check the execution of the operation corresponding to his/her body posture(s) or body movement(s) without being obliged to get in and get out of the vehicle between each execution.
  • the method comprises the further step of :
  • the operational zone is provided in a location where the user is able to visually check the execution of the corresponding operation.
  • the user can visually check the execution of the operation corresponding to his/her body posture(s) or body movement(s) without being obliged to move from the location where he/she assumes body postures or where he/she performs body movements.
  • the vehicle control system is configured to control several functions of the vehicle and the method comprises a further step of receiving the user selection of the vehicle function that the user intends to control via the vehicle control system.
  • the selection of the function is carried out by the user via a remote interface.
  • the selection of the function is carried out by assuming an initial user posture or by performing an initial user movement that is detected and recognized by the vehicle control system.
  • the method comprises the further step of informing the user, by generating an alarm or a visual or oral message, that his/her said posture or movement doesn't correspond to an available operation of the vehicle control system if no set of comparison data matches the digital signal.
  • the method comprises a step of user identification wherein the user is identified by an identification code that is recognized by the control system in order to authorize the user to control at least one function of the vehicle.
  • said identification code is generated by a CID located in the surrounding area of the vehicle or by a CID carried by the user when he or she is located in the surrounding area of the vehicle.
  • the method comprises, before the step of executing the corresponding operation, further steps of requesting and receiving a user confirmation in order to confirm that the corresponding operation is the correct one.
  • said user confirmation is given by a new user posture or a new user movement that the vehicle control system recognizes.
  • the method may comprise a further step of checking safety conditions wherein, before and/or during the step of executing the corresponding operation, it is checked that the execution of the corresponding operation will not interfere with another operation that is currently executed or will not interfere with an outside obstacle detected by the vehicle control system.
  • the method may also comprise a further step of preventing or stopping the execution of the corresponding operation if it can interfere with another operation or with an outside obstacle.
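The two safety steps above can be sketched as a simple predicate. The interference pairs and the obstacle rule below are illustrative assumptions; the patent does not specify which operations conflict:

```python
def safe_to_execute(operation, running_operations, obstacle_detected):
    """Check, before and/or during execution, that the requested operation
    does not interfere with a currently running operation or with an
    outside obstacle detected by the control system."""
    # Hypothetical pairs of mutually interfering operations.
    interfering = {
        ("tailgate_lift_down", "tailgate_lift_up"),
        ("vehicle_move_backward", "tailgate_lift_down"),
    }
    for running in running_operations:
        if (operation, running) in interfering or (running, operation) in interfering:
            return False
    # Hypothetical rule: any vehicle-motion operation is blocked by an obstacle.
    if obstacle_detected and operation.startswith("vehicle_move"):
        return False
    return True
```

A supervising loop would call this check again during execution and stop the operation as soon as it returns False, matching the "preventing or stopping" step above.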
  • the method further comprises the initial step of activating the controlling system in order to activate at least the motion recognition unit.
  • the step of activating the vehicle control system further comprises the activation of the parking brake, and the controlling method further comprises the step of releasing the parking brake if the execution of the corresponding operation shall result in a forward or backward motion of the vehicle.
  • the step of releasing the parking brake is conditional upon a user confirmation as previously described.
  • the method comprises, before the step of capturing at least one image of at least one user posture or one user movement, the step of checking if the user is located in the operational zone.
  • if the recognition unit doesn't detect the presence of the user in the operational zone after a first determined time-out following the activation of the recognition unit, the user is informed by an alarm or a visual or oral message that the vehicle control unit can't detect his or her presence in the operational zone.
  • the controlling method may also comprise a learning function in order to allow the user to determine which body postures or movements can be used to control vehicle function(s).
  • the method may comprise the further steps of :
  • capturing, with the motion recognition unit, at least one reference image of a user posture or user movement assumed or performed by the user in the operational zone; converting said reference image(s) into a set of reference digital data;
  • if the method includes a learning function, it may otherwise comprise the steps of:
  • capturing, with the motion recognition unit, at least one first reference image of a first user posture or user movement assumed or performed by the user in the operational zone; converting said first reference image(s) into a first set of reference digital data;
  • the user selection is performed via a remote interface that is in wireless communication with the control system.
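The learning steps above (capture a reference image, convert it into reference digital data, assign it to a selected operation) can be sketched as follows; the data representation and all names are illustrative assumptions:

```python
def learn_gesture(comparison_store, reference_digital_data, operation):
    """Store a converted reference image as a new set of comparison data
    assigned to the vehicle operation selected by the user.

    comparison_store maps reference data (kept as a tuple so it can be a
    dict key) to the name of a vehicle operation."""
    comparison_store[tuple(reference_digital_data)] = operation
    return comparison_store
```

After this learning step the new set of comparison data takes part in the normal recognition comparison, so the user-defined posture or movement controls the assigned operation.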
  • Figure 1 is a schematic side view of a vehicle equipped with a system according to the invention
  • Figure 2 is a schematic top view of the vehicle equipped with a system according to the invention having one motion recognition unit ;
  • Figure 3 is a schematic top view of the vehicle equipped with the system of figure 2, equipped with a second motion recognition unit ;
  • FIGS 4-8 are different flowcharts representing different implementations of the method according to the invention.
  • FIGS 9 and 10 are two flowcharts representing two different learning processes of the method according to the invention.
  • Fig. 1 illustrates a vehicle 1, more precisely a truck having a cabin 2 and a box shaped container 3.
  • the cabin 2 and the container 3 are supported by a chassis frame 4.
  • the vehicle 1 also comprises front and rear wheels 5 connected to the chassis frame 4 via, in particular, axles and a suspension system that can include adjustable pneumatic suspensions 6 in order to make possible height adjustment H of the vehicle.
  • wheels 5 are represented according to two different positions of the suspension system. A first position corresponding to a normal driving position where the wheels 5, represented in continuous lines, are relatively close to the chassis frame.
  • a second position corresponding to an operational position of the vehicle that can be, for instance, a unique position to unload the container part when the vehicle is against a loading platform.
  • the user 13 can get into the cabin 2 or the container 3 via access doors 7.
  • the vehicle also comprises a lighting system having front lights 8 and rear lights 9.
  • the vehicle comprises rear view mirrors 10a and 10b disposed on each side of the cabin and which are in a normal position oriented rearward.
  • the vehicle 1 can be equipped with a tailgate 11 arranged at the rear of the container 3 and which is able to rotate R and lift up or lift down L thanks to hydraulic or pneumatic actuators (not shown). It can also be equipped with a front air deflector 12 arranged on the roof of the cabin 2 that can be adjusted A between at least two positions: a folded position represented in continuous lines and a deployed position represented in dashed lines on figure 1.
  • the vehicle can be equipped with a tipping trailer, with a crane, with a concrete mixer or with any other auxiliary equipment (not shown).
  • the vehicle according to the invention can be a construction vehicle, a bus or any other road vehicle.
  • the vehicle 1 represented on figures 1 to 3 is equipped with a vehicle control system according to the invention.
  • the vehicle control system is able to control one or more vehicle functions of the vehicle 1 by recognition of different postures or movements of a user. Each function of the vehicle 1 is controlled according to different operations that the function is able to execute.
  • the vehicle control system comprises at least one motion recognition unit 15, which can be positioned on the outside surface of the vehicle 1, and at least one electronic control unit 16 (ECU, see figures 2 and 3) that can be a dedicated ECU, as represented hereunder, or that can be embedded within a multi-function ECU performing other control functions.
  • the vehicle control system can also comprise an activation interface 30, 31, 32 (see figure 3).
  • the motion recognition unit 15 can be, for instance, positioned on the top surface of the container 3 or the motion recognition unit 15' can be fixed to the cabin such as represented in dashed lines on figure 1 .
  • the motion recognition unit 15 is able to capture images of body postures assumed by the user 19 or images of body movements performed by the user 19 in the surrounding area of the vehicle 1 .
  • the motion recognition unit 15 or a converting unit of the system (not shown), which can be arranged between the motion recognition unit 15 and the ECU 16, converts the images into a digital signal that is transmitted to the ECU 16.
  • the ECU 16 is in communication with the motion recognition unit 15 and is configured to process data received from the motion recognition unit 15 or from a converting unit.
  • the ECU 16 is configured to recognize user postures or movements by comparison with predetermined body postures or body movements stored in a memory 22 of the vehicle control system or in a memory of the vehicle and which are assigned to at least one operation that can be executed by a function of the vehicle 1 .
  • the ECU 16 recognizes user posture(s) or movement(s), it is also configured to generate and transmit a control signal in order to control a function of the vehicle by executing the operation that corresponds to the user posture(s) or movement(s).
  • the vehicle functions that can be controlled by the vehicle control system are for instance :
  • the lighting system including, for instance, front lights 8 and rear lights 9,
  • Each function is designed to execute some operations. For instance :
  • the access door function allows the execution of a locking operation or the execution of an unlocking operation of at least one door of the vehicle
  • the tailgate function is able to execute a deployment operation or a folding operation R of the tailgate and can also execute a lift up operation or a lift down operation L of the tailgate,
  • the front air deflector function can be used to adjust A the position of the deflector between at least two positions : a folded position and a deployed position,
  • the lighting operations that can be implemented can be considered on their own or in combination with the following :
  • vehicle functions that can be controlled by the ECU 16 are not limited to the above described functions. If the vehicle is equipped with a crane or with a concrete mixer, corresponding function and operations can be controlled by the vehicle control system.
  • the motion recognition unit 15, 15' can typically comprise one or more cameras which can be fixed on a side or top surface of the cabin 2 or on a side or top surface of the box shaped container 3 or trailer.
  • the camera(s) can be a 2D camera.
  • the camera is preferably a 3D camera capable of generating data including depth information of a user posture or movement, and thus of determining the distance, and variation of distance, between the 3D camera and the position of the user or of the user's members, such as the user's hands.
  • when the motion recognition unit 15 is fixed on a surface of the vehicle, it is preferably folded into a rest position when the vehicle is driven, to prevent aerodynamic drag and thus limit fuel consumption.
  • the motion recognition unit 15 is fixed on a side rear view mirror unit 10a, 10b of the vehicle.
  • the motion recognition unit 15 can be fitted in the side rear view mirror unit 10a, 10b without protruding to the outside of the side rear view mirror unit 10a, 10b.
  • both side rear view mirrors of the vehicle are equipped with a motion recognition unit 15.
  • the motion recognition unit is able to capture images of said user posture or user movement that falls in a determined operational zone 18.
  • the operational zone 18 may be determined by the field of detection of one or several cameras or can be a more limited area included in said field of detection.
  • the field of detection can be defined by the orientation and the field of view of each camera.
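As a geometric illustration of the operational zone just described, the following sketch tests whether a user position falls within a camera's angular field of view and detection range. All numeric parameters are assumptions; the patent gives no dimensions:

```python
import math

def in_field_of_view(camera_pos, camera_heading_deg, fov_deg, max_range, user_pos):
    """Return True when user_pos lies inside the camera's detection field,
    defined by the camera orientation (heading), its angular field of view
    and a maximum detection range. Positions are (x, y) ground coordinates."""
    dx = user_pos[0] - camera_pos[0]
    dy = user_pos[1] - camera_pos[1]
    distance = math.hypot(dx, dy)
    if distance > max_range:
        return False
    bearing = math.degrees(math.atan2(dy, dx))
    # Smallest signed angular difference between bearing and camera heading.
    diff = (bearing - camera_heading_deg + 180.0) % 360.0 - 180.0
    return abs(diff) <= fov_deg / 2.0
```

A more limited operational zone inside the field of detection, as the text allows, would simply add tighter distance or angle bounds to this test.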
  • the vehicle control system can be equipped with more than one or two cameras 15 in order to increase the size of the operational area 18, or in order to provide different operational areas 18a, 18b optimally located so that the user can more easily see and visually check the execution of the different operations in response to his/her body postures or body movements.
  • in a first operational area 18a located at the rear side of the vehicle, the user can visually check, in response to his/her body postures or body movements and without being forced to move from the location where he/she has assumed his/her body postures or performed his/her body movements, the execution of the lighting operations corresponding to rear lights ON, rear flashing lights ON and rear fog lights ON.
  • in a second operational area 18b located at the front of the vehicle, the user can visually check the execution of the lighting operations corresponding to daytime running lights ON, position lights ON, low beams ON, high beams ON, front flashing lights ON and front fog lights ON.
  • the motion recognition unit 15 can be in wireless communication or in wire connection with the electronic control unit 16.
  • the motion recognition unit 15 is suitably connected to a power source such as the vehicle batteries (not shown).
  • the motion recognition unit 15 can also be fitted with a motor (not shown) so that it can be rotated in order to adjust its orientation and to increase the operational zone that can be detected.
  • the motion recognition unit 15 can be located in a side rear view mirror unit 10a, 10b of the vehicle; in this case the side rear view mirror unit can be fixed to the vehicle 1 through at least one articulation 20a, 20b so that the orientation of the camera 15 can be adjusted by modifying the orientation of the rear view mirror unit.
  • the articulation 20a, 20b can also be motorized so that the vehicle control system can automatically modify the orientation of the rear view mirror unit 10a, 10b. Thanks to that the detection field of the motion recognition unit 15 and so the operational zone 18 can easily be modified.
  • the user can adjust the orientation of the rear view mirror unit 10a, 10b via a remote control interface.
  • the orientation of the motion recognition unit 15 or the orientation of the rear view mirror unit 10a, 10b supporting the camera can be automatically modified from a rest position to an active position as soon as the system is activated. For instance, as soon as the vehicle control system is activated, the right rear view mirror unit 10b can be rotated by an angle α from a position, such as depicted on figure 2, where it is oriented rearward, to a position, such as depicted on figure 3, where it is oriented frontward.
  • the vehicle control system can modify the orientation of the motion recognition unit 15, or the orientation of the rear view mirror unit 10a, 10b that supports the motion recognition unit, depending on the vehicle function that the user has previously selected as the function he or she intends to control by his or her body posture(s) or body movement(s). For instance, if the user selects the vehicle function that corresponds to the suspension system 6, 26 in order to adjust the height of the vehicle, the motion recognition unit 15 can be oriented by the vehicle control system in order to delimit, such as represented on figure 2, an operational area 18a located on a side of the vehicle. If, on the contrary, the user selects the front air deflector vehicle function in order to adjust the position of the front air deflector 12, the vehicle control system modifies the orientation α of the motion recognition unit 15.
  • the operational area 18b is then preferably delimited, such as represented on figure 3, in front of the vehicle.
  • the vehicle control system can modify the orientation of one or of several motion recognition units 15 depending on the vehicle function selected by the user.
  • the ECU 16 can include a microprocessor 21, a non-volatile memory 22 such as a Read Only Memory (ROM), and a volatile memory 23 such as a Random Access Memory (RAM).
  • the ECU 16 can also include other conventional components such as an input interface circuit, an output interface circuit, and the like.
  • the ECU 16 is programmed to analyze images of a user captured by the motion recognition unit 15. More precisely, the ECU 16 receives, in the form of a digital signal, images from the motion recognition unit 15 or from a converting unit, and compares the images with a series of images stored in the non-volatile memory in the form of a set of digital data.
  • each vehicle function is related to a specific ECU (Electronic Control Unit) 25, 26, 28, 29 of the vehicle that is in charge of managing the vehicle function.
  • one common ECU 25, 26, 28, 29 can be in charge of managing several vehicle functions.
  • the specific ECUs 25, 26, 28, 29 are preferably connected to the main ECU 16 via a CAN bus 24 commonly used in the field of truck technology.
  • a specific ECU 25, 26, 28, 29 can manage the corresponding vehicle function via at least one actuator.
  • Such an actuator can be pneumatic suspensions 6, hydraulic cylinder and so on.
  • a specific ECU 25 to manage the function corresponding to the low speed motions of the vehicle, which is able to control the engine speed, the gearbox and the clutch of the driveline (not shown) in order to execute a low speed motion of the vehicle in the frontward or rearward direction ;
  • a specific ECU 26 to manage the function corresponding to the suspension system, which is able to control electro-pneumatic valves connected to at least the rear pneumatic suspensions of the vehicle in order to raise or lower the rear of the vehicle ;
  • the activation interface 30, 31, 32 makes it possible to set the vehicle in a state where the vehicle control system is activated.
  • in case of activation of the vehicle control system, at least the motion recognition unit 15 is activated.
  • the ECU 16 of the vehicle control system can be activated at the same time or after, for instance, as soon as the user is detected by the motion recognition unit 15.
  • the activation interface 30, 31, 32 can have several embodiments and can include one or a combination of several elements among the following:
  • a remote interface such as a smart phone 31, which communicates with the vehicle control system using a wireless technology standard for exchanging data over short distances such as Bluetooth or Wi-Fi,
  • a microphone unit 32 capable of detecting and converting an oral instruction of the user into an electric or electronic control signal that controls the activation of the vehicle control system.
  • the vehicle control system can also include a user identification device 40, 41 such as an image identification system, a voice identification system or a customer identification device 40 (CID) capable of generating an identification code that is recognized by the vehicle control system to ensure that only an authorized user can use the control system.
  • when the identification device is, for instance, a CID 40, it can be part of the ignition key unit or key fob that the user carries with him/her.
  • the identification device can use the motion recognition unit 15 or can use a specific image identification unit that is, for instance, capable of reading a bar code fixed to a user's clothing.
  • the bar code can be printed or sewn to a user's cap.
  • the vehicle control system can give different rights to different users in terms of the vehicle functions that can be controlled. For instance, one user can be authorized to control all the vehicle functions accessible through the vehicle control system whereas a second user can be authorized to control only the lighting system function.
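The per-user rights just described can be sketched as a lookup from identification code to the set of controllable functions; the codes and function names below are hypothetical:

```python
# Hypothetical rights table: identification code -> controllable functions.
USER_RIGHTS = {
    "CID-001": {"lighting", "tailgate", "suspension", "low_speed_motion"},
    "CID-002": {"lighting"},
}

def is_authorized(identification_code, vehicle_function):
    """Return True when the identified user may control the given
    vehicle function; unknown codes are authorized for nothing."""
    return vehicle_function in USER_RIGHTS.get(identification_code, set())
```

The identification code itself would come from the CID, the image identification unit or the voice identification system described above.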
  • the vehicle control system is activated.
  • this step of activation at least the motion recognition unit 15 is activated.
  • the ECU 16 of the vehicle control system can be activated during this step 79 or in a different way before or after this step.
  • the ECU 16 of the vehicle control system can be activated as soon as the ignition key is turned on to make contact with different systems and equipment of the vehicle with the service battery or after the activation of the motion recognition unit 15, for instance, when the user is detected by the motion recognition unit 15.
  • the user can activate the vehicle control system by using one of the previously described activation means 30, 31 , 32.
  • the activation of the motion recognition unit 15 can also be performed automatically, for a determined period of time.
  • the motion recognition unit 15 is automatically activated when the ignition key is turned on and as soon as the system detects that the driver has got out of the vehicle 1.
  • A door sensor and/or a seat sensor can be used to detect that the driver is getting out of or has got out of the vehicle.
  • the step of activating the vehicle control system 79 can also comprise the activation of the parking brake of the vehicle (not shown).
  • the method can also comprise a step of releasing the parking brake (not shown) if the execution of the corresponding operation requires a forward or backward motion of the vehicle.
  • the step of releasing the parking brake is conditional upon a user confirmation wherein the user has to confirm that the operation identified by the control system as the one that the concerned function is about to execute is the correct one. Even if it is not recommended because of energy consumption, the invention doesn't exclude that the vehicle control system remains activated or in a sleeping mode, that is to say wherein energy consumed by the system is reduced, when the ignition key is removed.
  • the vehicle control system provides in the surrounding area of the vehicle 1 an operational zone 18, 18a, 18b where the user can assume different postures or perform different movements.
  • the operational zone can correspond to the zone that the motion recognition unit 15 is able to detect depending on its inherent field of view and on its orientation.
  • the operational zone 18, 18a, 18b can be provided as soon as the motion recognition unit 15 is activated.
  • the operational zone is preferably provided in a location where the user is able to see the operations that are executed by the vehicle function in response to his body postures or motions. In this aim several operational zones 18a and 18b can be provided.
  • the several operational zones 18a and 18b can be provided by using several motion recognition units 15 that can be arranged at different locations of the vehicle 2, 3, 10a, 10b.
  • the position of the operational zone can also be modified.
  • a camera 15 (see figure 3) used as a motion recognition unit can be designed so that its orientation can be adjusted depending on the user location in the surrounding area of the vehicle 1 and/or depending on the vehicle function that the user selects according to a step 104 hereinafter described in connection with flowchart of figure 5.
  • in a next step 110, the motion recognition unit 15 captures at least one image of a user posture or at least one user movement performed in the operational zone 18, 18a, 18b.
  • This step 110 is preferably preceded by a step of user detection 108 in order to detect the presence of the user in the operational zone 18, 18a, 18b.
  • the step of user detection 108 can be performed by the motion recognition unit 15 itself or by a specific sensor (not shown).
  • According to step 108, if the presence of the user cannot be detected in the operational zone 18, 18a, 18b after a first elapsed time T1 following the activation of the vehicle control system, for instance 30 seconds, the user is informed, according to a step 109 of the method, by a visual or oral message or alarm, that the control system cannot detect him or her in the operational zone 18. After several attempts to detect the user in the operational zone 18, or after a second elapsed time T2 following the activation of the vehicle control system, for instance 90 seconds, the vehicle control system is automatically deactivated according to step 103.
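The detection step 108, with its warning after T1 (step 109) and automatic deactivation after T2 (step 103), can be sketched as a simple polling loop. This is only an illustrative sketch: the callbacks `detect_user`, `inform_user` and `deactivate` are hypothetical stand-ins for the sensor, speaker/display and system shutdown described above.

```python
import time

def wait_for_user(detect_user, inform_user, deactivate,
                  t1=30.0, t2=90.0, poll=0.5):
    """Sketch of detection step 108: warn after T1 (step 109),
    deactivate after T2 (step 103)."""
    start = time.monotonic()
    warned = False
    while True:
        if detect_user():
            return True                       # user found in the operational zone
        elapsed = time.monotonic() - start
        if elapsed >= t2:
            deactivate()                      # step 103: automatic deactivation
            return False
        if elapsed >= t1 and not warned:
            inform_user("User not detected in the operational zone")  # step 109
            warned = True
        time.sleep(poll)
```

In the patent's example T1 is 30 seconds and T2 is 90 seconds; the short values below are only for demonstration.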
  • the detection of the user in the operational zone 18 can be used to trigger the capture of user postures or user movements by the motion recognition unit.
  • different solutions can be used to trigger the capture of user postures or user movements.
  • it can be a particular voice message emitted by the user and recognized by the vehicle control system.
  • the vehicle control system can use a microphone unit 32 and features (not represented) able to process an audio signal.
  • the user voice message can be a single "top" to trigger the capture of one image of a user posture, or a "start top" to trigger the capture of several images of successive user postures or a user movement, wherein the "start top" is followed by an "end top" to stop the capture.
  • the vehicle control system converts said image(s) into a digital signal that is sent to the ECU 16.
  • the image can be converted by the motion recognition unit 15 itself or by a specific converting unit (not shown) connected between the motion recognition unit and the ECU 16.
  • In a step 120, the digital signal, representative of at least one image of the user posture(s) or user movement(s) captured by the motion recognition unit 15, is compared, by the ECU 16, with several sets of comparison data stored in the non-volatile memory.
  • Each set of comparison data represents a preregistered user posture or user movement, or a preregistered series of user postures or user movements, to which at least one operation of a vehicle function is assigned.
  • the user postures or user movements that the system is able to recognize can be body postures or body movements, for instance a C, U or L shape performed with the arms, or a hand position, for instance a distinction can be made between a closed fist and an open hand. The system can be designed to recognize different finger configurations and can also be designed to recognize a combination of body, hand and/or finger postures or movements (in the present application they are called "user posture(s)" / "body posture(s)" or "user movement(s)" / "body movement(s)").
  • the ECU 16 identifies which set of comparison data matches the digital signal and thereby identifies the corresponding operation assigned to that set of comparison data. If, according to step 130, no set of comparison data matches the digital signal, the controlling method preferably returns 131 to the previous step 110, or earlier, for instance to step 108, to capture a new user posture or movement. In this case, the user is preferably informed 140 by the vehicle control system that his/her posture or movement does not correspond to an available operation of the control system. To inform the user, the vehicle control system can generate a visual or an oral alarm or message, by using a colour code displayed to the outside of the vehicle or an outside speaker 33 (figures 2 and 3). After several unsuccessful attempts by the vehicle control system to recognize user postures or movements, for instance after three attempts, the vehicle control system can be automatically deactivated.
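The comparison of steps 120 and 130 can be sketched as a best-match search over the stored comparison sets, returning no operation when nothing matches (the 131 return path). The feature representation and the similarity measure below are assumptions made for illustration; the patent does not specify how the digital signal is encoded or compared.

```python
def match_posture(signal, comparison_sets, threshold=0.8):
    """Steps 120/130 sketch: compare the captured digital signal with each
    stored set of comparison data; return the assigned operation of the best
    match above `threshold`, or None if no set matches (return path 131)."""
    def similarity(a, b):
        # toy similarity: fraction of equal feature values (assumption)
        hits = sum(1 for x, y in zip(a, b) if x == y)
        return hits / max(len(a), len(b))

    best_op, best_score = None, 0.0
    for posture_data, operation in comparison_sets:
        score = similarity(signal, posture_data)
        if score >= threshold and score > best_score:
            best_op, best_score = operation, score
    return best_op
```

A real implementation would use a gesture-recognition model rather than this toy feature comparison; the structure (compare against every preregistered set, fall through to a retry when none matches) is what the flowchart describes.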
  • If, according to step 130, the ECU 16 has identified a set of comparison data that matches the digital signal, in a next step the corresponding operation, which corresponds to the identified set of comparison data, is executed by the corresponding vehicle function.
  • In a step 150, the vehicle control system can generate a control signal which contains information for the execution of the corresponding operation.
  • the vehicle control system can transmit the control signal to the specific ECU 25, 26, 28, 29 in charge of managing the vehicle function concerned by the execution of the corresponding operation.
  • the control signal is transmitted to a specific ECU 25, 26, 28, 29 of the vehicle 1, which can be specifically configured to manage only the identified vehicle function, or a general one that is configured to manage the identified vehicle function and some others.
  • In a step 170, the specific ECU controls the corresponding vehicle function according to said control signal. More precisely, the vehicle function, concerned by the execution of the corresponding operation, is controlled according to the control signal that contains information requesting the execution of the corresponding operation. The corresponding operation is then executed by the corresponding vehicle function.
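The routing of steps 150 to 170, from the central ECU 16 to the specific ECU in charge of the concerned function, can be sketched as below. The mapping of functions to ECUs 25, 26, 28, 29 and the dictionary-based "control signal" are illustrative assumptions; the patent only states that the signal is transmitted to the specific ECU managing the identified function.

```python
# Hypothetical assignment of vehicle functions to the specific ECUs 25-29.
ECU_BY_FUNCTION = {
    "lighting": "ECU_25",
    "tailgate": "ECU_26",
    "suspension": "ECU_28",
    "engine": "ECU_29",
}

def dispatch(operation, function, bus):
    """Steps 150-160 sketch: build a control signal containing the information
    for the execution of the operation and transmit it to the specific ECU."""
    target = ECU_BY_FUNCTION.get(function)
    if target is None:
        raise ValueError(f"no ECU manages function {function!r}")
    signal = {"target": target, "operation": operation}
    bus.append(signal)      # stand-in for the vehicle network transmission
    return signal
```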
  • each operation that the system can control is assigned by the system to at least one user posture or user movement that has to be recognized by the ECU 16 in order to request its execution by the corresponding function.
  • After having controlled the vehicle function in response to one or several user postures or user movements, in a step 180 the vehicle control system ideally checks whether a new user posture or movement can be detected in the operational zone 18a, 18b.
  • If yes, the method returns to the previous step 110 in order to capture at least one image of the new user posture or movement and to process it according to step 120 and the following steps.
  • If not, the vehicle control system is preferably deactivated 190.
  • the vehicle control system is preferably deactivated, according to a step 190, after an elapsed time of 30 seconds without detecting a new user posture or a new user movement in the operational zone 18, 18a, 18b.
  • the method such as depicted in figure 4 can be implemented to control one vehicle function or can be implemented to control several functions of the vehicle 1.
  • the user has to assume at least one specific posture or has to perform at least one specific movement for each operation that the user can control.
  • a different set of comparison data is stored in the non-volatile memory 22 of the ECU 16 for each operation of each function that the vehicle control system is capable of controlling.
  • Figure 5 depicts an improved implementation of the method according to the invention. Compared to the implementation of figure 4, in the improved implementation of figure 5 the step 104 is added and the optional steps 105, 106 and 107 can be added.
  • the vehicle control system receives the user's selection of the function that he or she intends to control via the vehicle control system.
  • the method implemented according to figure 5 is particularly suitable when the vehicle control system is configured to control several functions of the vehicle.
  • the user selects, before assuming a posture or performing a movement in order to control a function, which function he or she wants to control.
  • the vehicle control system receives, according to step 104, which function the user has selected. Thanks to that, the digital signal that can be received by the ECU from the motion recognition unit can be compared with a limited number of sets of comparison data, corresponding to the function that the user intends to control. Therefore, the risk of executing the wrong operation or controlling the wrong function is reduced.
  • the number of postures or movements that the user has to remember can be limited because the same posture or movement can be used for different vehicle functions.
  • the selection of the function can be performed by the user via a remote interface 31.
  • This selection of the function can also be performed by an initial body posture or an initial body movement of the user that is detected and recognized by the vehicle control system which identifies the function actually selected by the user.
  • In an optional step 105, it can be requested that the user confirm the vehicle function that he/she has selected.
  • In a step 106, the controlling method checks that the user confirms his/her selection; if not, the method returns 1061 to step 104, where the user is once again requested to select the function that he or she intends to control.
  • the controlling method informs the user, in step 107, that the confirmation is not valid or does not match his/her initial selection.
  • the user can, at any time during the implementation of the method according to the invention, modify the selected function. In particular, when an identified operation has been executed, the user has the opportunity to change the selected function for another.
  • Figure 6 depicts a further implementation of the method according to the invention.
  • user identification is performed in a new step 100.
  • the user has to be recognized by the control system as being an authorized user, that is to say a user that is authorized to control at least one function of the vehicle.
  • This step 100 is performed before the execution of any operation, and preferably before step 110 where the motion recognition unit captures at least one image of at least one user posture or at least one user movement.
  • the user is identified by an identification code that is recognized by the control system in order to authorize him/her to control at least one function of the vehicle depending on the vehicle functions that the system can manage and that the user is authorized to control.
  • the identification code can authorize the user to control only a part of the available functions, that is to say part of the functions that the vehicle control system is capable of controlling.
  • the user identification can be a passive identification.
  • the identification code can be generated by a CID 40 (customer identification device) carried by the user when he or she is located in the surrounding area of the vehicle 1.
  • the CID can be an electronic chip that is part of the ignition key unit.
  • the identification code can be a readable bar code fixed to a user's clothing, for instance, printed or sewn to a user's cap 45.
  • When an electronic CID 40 is used to generate the identification code, the electronic CID 40 is provided with a transmitter capable of transmitting the identification code via wireless communication. Ideally, the electronic CID is also equipped with a signal receiver in order to receive a challenge message sent via wireless communication by a signal transmitter 41 of the control system. If it has received and recognized the challenge message, the electronic CID 40 replies to it by transmitting to the control system the identification code, which in its turn has to be recognized by the control system.
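A common way to realize such a challenge-and-reply exchange is a keyed-hash scheme over a shared secret, sketched below. This is an assumption: the patent only specifies that the CID answers a challenge message with an identification code, not how that code is derived, and the HMAC construction and shared secret are illustrative.

```python
import hmac
import hashlib
import os

SHARED_SECRET = b"cid-40-secret"   # provisioned in the CID (assumption)

def cid_reply(challenge: bytes, secret: bytes = SHARED_SECRET) -> bytes:
    """The electronic CID 40 answers the challenge with a code derived
    from its secret (hypothetical keyed-hash scheme)."""
    return hmac.new(secret, challenge, hashlib.sha256).digest()

def control_system_verify(secret: bytes = SHARED_SECRET) -> bool:
    """The control system (transmitter 41) emits a fresh challenge and
    checks that the reply matches the expected identification code."""
    challenge = os.urandom(16)
    reply = cid_reply(challenge)          # received via the wireless link
    expected = hmac.new(secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(reply, expected)
```

Using a fresh random challenge each time prevents a recorded reply from being replayed, which is why a challenge-response exchange is preferable to transmitting a fixed code.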
  • the user identification can be an active identification where the user has, for instance, to enter an alphanumeric code in a remote interface 31, such as a smart phone, that is then transmitted via wireless communication to the control system.
  • the controlling method can perform several attempts via the return line 101 to try to identify the user.
  • the fact that the controlling method performs several attempts to try to identify the user is particularly useful when the user identification depends on the location of the user or when the user identification is an active identification.
  • the control system can inform the user, in step 102, via for instance an outside speaker 33, that the identification code he or she has entered is incorrect.
  • When the user identification uses an electronic CID and the control system cannot receive an identification code from the electronic CID, the control system will also perform several attempts 101 and inform 102 the user between each attempt that the electronic CID is not in the vicinity of the vehicle, or that the user carrying the electronic CID is not located in the surrounding area of the vehicle.
  • The control system can then be automatically deactivated in step 103.
  • Thanks to the steps of user identification according to the invention, an unauthorized person is prevented from using the control system, and the user is prevented from controlling a vehicle function when he/she is not qualified or not authorized to control it. Therefore the security of the controlling method and of the control system is greatly increased. This high level of security is especially recommended when the controlling method can be used to control access to the vehicle, or when it can be used to control motions of the vehicle or of some vehicle equipment.
  • Figure 7 depicts a particular implementation of the method according to the invention.
  • In a further step 141, and for instance via the outside speaker 33, the controlling method now requests the user to confirm, before executing it, that the operation identified by the ECU 16, that is to say the corresponding operation that corresponds to the set of comparison data identified by the ECU 16 in step 130, is really the one that the user wants the vehicle to execute.
  • the user confirmation can be performed by a new user posture or a new user movement that the recognition unit detects.
  • the user confirmation can be an oral confirmation that the system recognizes via the microphone unit 32, for instance, "YES” to confirm the identified operation or "NO” to indicate that the identified operation is not the correct one.
  • If, according to step 142, the identified operation is confirmed by the user, the control system requests, according to step 150, the execution of the identified operation, by generating, for instance, a control signal which contains information for the execution of the identified operation.
  • If not, the method returns 143 to step 110 in order to capture at least one new image representative of at least one new user posture or at least one new user movement.
  • the confirmation can be requested only before the execution of some specific operations.
  • the user confirmation will be requested only when the execution of the operation can affect safety.
  • the user confirmation will be requested when the execution of the operation will result in a motion of the vehicle or in a motion of a vehicle equipment, such as the motion of a crane or a tailgate.
  • If the function concerns the lighting system and the selected operation consists of switching on the front lights of the vehicle, a user confirmation is not necessarily requested.
  • If the controlling method also comprises the step 105 of confirmation by the user of the selected vehicle function, such as previously described in connection with figure 5, the confirmation according to steps 141 and 142 offers a safety redundancy that can be useful, especially when the execution of the operation consists in a vehicle motion.
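The selective confirmation of steps 141 and 142, requested only for operations whose execution can affect safety, can be sketched as below. The set of safety-critical operations and the operation names are illustrative assumptions; the oral "YES"/"NO" answer via the microphone unit 32 is the mechanism described above.

```python
# Hypothetical set of operations whose execution results in a motion of the
# vehicle or of a piece of vehicle equipment, and therefore needs confirmation.
SAFETY_CRITICAL = {"move_backward", "move_forward",
                   "lift_down_tailgate", "move_crane"}

def confirmed(operation, ask):
    """Steps 141-142 sketch: request confirmation only for safety-relevant
    operations; `ask` stands in for the oral exchange via microphone 32."""
    if operation not in SAFETY_CRITICAL:
        return True                           # e.g. switching on front lights
    answer = ask(f"Execute '{operation}'?")   # expects "YES" or "NO"
    return answer.strip().upper() == "YES"
```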
  • the method comprises the further step 145 of checking safety conditions wherein, before or during the step 170 of executing the operation identified by the control system, it is checked that the execution of the identified operation will not interfere with another operation currently being executed or will not interfere with an outside obstacle detected by the control system.
  • This check is particularly useful if the execution of the identified operation can result in a vehicle motion, such as a forward or a backward motion of the vehicle, or in an adjustment of the vehicle suspensions, or if the execution of the identified operation can result in a motion of a piece of vehicle equipment such as the motion of a crane or a tailgate.
  • If the control system detects, for instance, the presence of an outside obstacle, such as an electric pole or a person located behind the vehicle, the control system will refuse, in step 146, the execution of the identified operation if it consists in a backward motion of the vehicle or in lowering the rear tailgate of the vehicle.
  • If, during the step 170 of executing the identified operation, the control system detects, for instance, the presence of an outside obstacle that can interfere with the identified operation, the control system will stop the execution of the identified operation.
  • the controlling method returns to step 110, where the user can assume at least one new posture or perform at least one new movement whose image(s) will be captured by the motion recognition unit. In this case, not represented on figure 8, the user can also decide to select a new function.
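The safety check of step 145, leading to the refusal of step 146, can be sketched as a rule lookup that compares the identified operation against currently executing operations and detected obstacles. The interference rules, zone labels and operation names below are illustrative assumptions; the patent only requires that interference with another operation or with a detected outside obstacle blocks execution.

```python
def safe_to_execute(operation, active_operations, obstacles):
    """Step 145 sketch: refuse (step 146) an operation that would interfere
    with an operation currently being executed or with a detected obstacle.
    `obstacles` is a list of dicts with a 'zone' key (assumption)."""
    interference_zone = {            # zone swept by each motion (assumption)
        "move_backward": "rear",
        "lift_down_tailgate": "rear",
        "move_forward": "front",
    }
    if operation in active_operations:
        return False                 # would interfere with itself mid-execution
    zone = interference_zone.get(operation)
    if zone is not None and zone in {o["zone"] for o in obstacles}:
        return False                 # e.g. a pole or person behind the vehicle
    return True
```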
  • Figures 9 and 10 depict learning functions that can be selected by the user.
  • the user can select, according to step 104, which vehicle function he wants to control with the control system.
  • the control system can be configured to receive during this same step 104, the selection by the user of a learning function instead of the selection of a vehicle function.
  • the learning function can be used by the user in order to assign one or several body postures or body movements to an available operation that the vehicle control system can control.
  • the available operation can be an operation to which a body posture or a body movement is already assigned, or it can be an available operation of a vehicle function to which no body posture or body movement is assigned yet.
  • the learning function can be implemented in different ways according to the invention.
  • the learning function can comprise the following steps:
  • the user can select the operation to be learnt by the control system via a remote interface 31 that is in wireless communication with the control system.
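The learning function described above can be sketched as capturing a gesture and recording its assignment to the operation the user selected via the remote interface 31. The registry structure and the `capture_gesture`/`confirm` callbacks are illustrative assumptions standing in for the motion recognition unit 15 and the user dialogue.

```python
def learn_gesture(registry, function, operation, capture_gesture, confirm):
    """Learning-function sketch (figures 9 and 10): record a new posture or
    movement and assign it to an available operation selected by the user."""
    gesture = capture_gesture()   # e.g. feature data from the recognition unit
    if not confirm(f"Assign captured gesture to '{operation}'?"):
        return False              # user rejected the assignment
    # Re-assignment is allowed: the operation may already have a gesture.
    registry[(function, gesture)] = operation
    return True
```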

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Fittings On The Vehicle Exterior For Carrying Loads, And Devices For Holding Or Mounting Articles (AREA)

Abstract

The invention concerns a system and a method for controlling one or several functions of a vehicle (1) by recognition of different postures or movements of a user. The vehicle control system (13) comprises a motion recognition unit (15) positioned and oriented so that it can capture images of postures assumed or movements performed by the user (19) in a surrounding area (18, 18a, 18b) of the vehicle (1); and an electronic control unit (16) which is in communication with the motion recognition unit and which is designed to recognize the user's postures or movements by comparison with predefined postures or movements stored in a memory (22) of the vehicle control system. The electronic control unit (16) is designed to generate, depending on the result of the comparison, a control signal in order to control one or several functions of the vehicle in response to the user's postures or movements.
PCT/IB2013/001225 2013-05-06 2013-05-06 Système et procédé de commande d'un véhicule WO2014181146A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2013/001225 WO2014181146A1 (fr) 2013-05-06 2013-05-06 Système et procédé de commande d'un véhicule

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2013/001225 WO2014181146A1 (fr) 2013-05-06 2013-05-06 Système et procédé de commande d'un véhicule

Publications (1)

Publication Number Publication Date
WO2014181146A1 true WO2014181146A1 (fr) 2014-11-13

Family

ID=48782548

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2013/001225 WO2014181146A1 (fr) 2013-05-06 2013-05-06 Système et procédé de commande d'un véhicule

Country Status (1)

Country Link
WO (1) WO2014181146A1 (fr)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111688712A (zh) * 2019-03-14 2020-09-22 本田技研工业株式会社 车辆控制装置、车辆控制方法及存储介质
CN113772599A (zh) * 2021-09-15 2021-12-10 湖南星邦智能装备股份有限公司 一种剪叉式高空作业平台及其控制系统及方法
US11835008B1 (en) * 2023-01-12 2023-12-05 Ford Global Technologies, Llc Engine and engine exhaust control system
EP4246267A4 (fr) * 2020-12-25 2024-01-17 Huawei Technologies Co., Ltd. Procédé d'appel de véhicule, véhicule intelligent et dispositif

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020152010A1 (en) * 2001-04-17 2002-10-17 Philips Electronics North America Corporation Automatic access to an automobile via biometrics
US20070177011A1 (en) * 2004-03-05 2007-08-02 Lewin Andrew C Movement control system
US20080085048A1 (en) * 2006-10-05 2008-04-10 Department Of The Navy Robotic gesture recognition system
US20090222149A1 (en) * 2008-02-28 2009-09-03 The Boeing Company System and method for controlling swarm of remote unmanned vehicles through human gestures
US20100235034A1 (en) * 2009-03-16 2010-09-16 The Boeing Company Method, Apparatus And Computer Program Product For Recognizing A Gesture



Similar Documents

Publication Publication Date Title
EP3153373B1 (fr) Procédé de fourniture de mode de déplacement de véhicule arrêté, appareil d'assistance au conducteur pour fournir celui-ci et véhicule comprenant l'appareil d'assistance au conducteur
KR100863106B1 (ko) 차량 제어 장치
US10744838B2 (en) Pet mode door and suspension control system and method
CN108068737B (zh) 车辆驾驶员定位系统
JP6027505B2 (ja) 車両用照明システム
JP7102922B2 (ja) 車両盗難防止装置
US11511756B2 (en) Passenger authentication system for a vehicle
KR102128048B1 (ko) 어라운드 뷰 모니터링을 이용한 차량용 웰컴 시스템과 그 동작 방법
CN109138691A (zh) 具有主动距离控制的自动外罩系统
US20200090437A1 (en) Vehicle entry system and onboard device
WO2014181146A1 (fr) Système et procédé de commande d'un véhicule
KR102551126B1 (ko) 차량용 조정 시스템
US10803686B2 (en) Vehicle and vehicle system
US11247635B1 (en) System for providing access to a vehicle
KR101664294B1 (ko) 적응적 램프 제어 장치 및 그 방법
JP2000280863A (ja) 座席位置移動制御装置
JP2003237504A (ja) 車両機能の制御装置
US11878654B2 (en) System for sensing a living being proximate to a vehicle
US20220325569A1 (en) System for a vehicle having closure panels
US20220324309A1 (en) System for controlling a closure panel of a vehicle
CN114633739A (zh) 机动车的控制装置和用于运行停车位中的机动车的方法
US20230089000A1 (en) Vehicular power door sensing and operating system
CN116279429A (zh) 基于uwb数字钥匙的智能泊车方法、装置、车辆及介质
CN215436603U (zh) 一种车辆的停车校正装置及车辆
CN109747755A (zh) 倾斜车辆及其控制方法

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13735405

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13735405

Country of ref document: EP

Kind code of ref document: A1