WO2021048582A1 - Vehicle remote control method and vehicle remote control device - Google Patents
Vehicle remote control method and vehicle remote control device
- Publication number
- WO2021048582A1 (PCT/IB2019/001097)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- gesture
- vehicle
- input
- remote control
- determination area
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 67
- 238000001514 detection method Methods 0.000 description 30
- 238000004891 communication Methods 0.000 description 19
- 230000006870 function Effects 0.000 description 19
- 238000010586 diagram Methods 0.000 description 4
- 230000005540 biological transmission Effects 0.000 description 3
- 230000003213 activating effect Effects 0.000 description 2
- 230000004913 activation Effects 0.000 description 2
- 238000013459 approach Methods 0.000 description 2
- 238000007796 conventional method Methods 0.000 description 2
- 238000012545 processing Methods 0.000 description 2
- 238000002485 combustion reaction Methods 0.000 description 1
Images
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
- G08C17/02—Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
- G05D1/0016—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/06—Automatic manoeuvring for parking
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
Definitions
- the present invention relates to a vehicle remote control method and a vehicle remote control device for causing a vehicle having an autonomous travel control function to travel autonomously by remote control.
- a known remote control method for a vehicle inputs a gesture on the touch panel of the vehicle's remote control device and, when the input gesture matches a predetermined gesture, causes the vehicle to perform the predetermined function assigned to that gesture (see Patent Document 1).
- in this conventional method, however, the input position of the gesture on the touch panel is fixed in advance. If the gesture deviates from the specified input position, it is not recognized as a valid gesture, and the vehicle cannot perform the predetermined function.
- the problem to be solved by the present invention is to provide a vehicle remote control method and a vehicle remote control device that allow a gesture to be input at an arbitrary position on a touch panel.
- the present invention solves the above problem by detecting the input position of a touch operation on the touch panel of the remote controller and varying the gesture determination area that accepts the gesture according to that input position.
- according to the present invention, since the gesture can be input at an arbitrary position on the touch panel, the vehicle can be made to perform the predetermined function regardless of where the gesture is input.
- FIG. 5 is an explanatory diagram showing a state in which a gesture determination area is set from a gesture input at an arbitrary position on the touch panel. Another explanatory diagram shows a state in which the gesture protrudes from the touch panel. A flowchart shows an example of the control procedure executed in the remote parking system of FIG. 1, and a further flowchart shows the procedure of the remote operation.
- FIG. 1 is a block diagram showing a remote parking system 1 to which the vehicle remote control method and the remote control device of the present invention are applied.
- the "autonomous driving control” means that the vehicle is driven by the automatic control of the on-board driving control device without depending on the driving operation of the driver.
- “Autonomous parking control” is a kind of autonomous driving control, and means that a vehicle is parked (entered or exited) by automatic control of an on-board driving control device without depending on a driver's driving operation.
- in the present specification, "parking" means keeping the vehicle parked in a parking space, while "travel route" includes not only the entry route into the parking space but also the exit route out of it.
- accordingly, the "vehicle travel control method and vehicle travel control device during parking" cover both the travel control for putting the vehicle into the parking space and the travel control for taking it out.
- putting the vehicle into the parking space is also referred to as entry (warehousing), and taking it out of the parking space is also referred to as exit (retrieval).
- the remote parking system 1 of the present embodiment performs autonomous driving control in an assist mode in which an operator such as a driver can get on the vehicle and perform an intervention operation of the operator. After that, the operator gets off the vehicle and performs autonomous driving control from the outside of the vehicle in a remote control mode using a remote controller.
- the remote parking system 1 of the present embodiment is a system that performs entry into or exit from a parking space by autonomous driving control. More specifically, the driver disembarks partway through the entry maneuver and, while confirming safety, keeps the autonomous parking control running by continuously transmitting an execution command signal to the vehicle with the remote controller. When the vehicle might collide with an obstacle, the autonomous parking control is stopped by stopping transmission of the execution command signal from the remote controller.
- the autonomous driving control mode in which an operator such as the driver rides in the vehicle and can intervene is called the assist mode, and the autonomous driving control mode in which the operator gets off and the vehicle enters or exits the parking space under remote control is called the remote control mode.
- in such a case, the remote control mode combined with remote operation can be used.
- when entering the parking space in the remote control mode, the driver carries the remote controller and disembarks; when the remote control mode is activated, the entry route to the selected parking space is calculated and the autonomous parking control is started. The driver who has gotten off completes the entry by continuously transmitting the execution command signal to the vehicle with the remote controller.
- the remote parking system 1 of the present embodiment is a system including a remote control mode in which such remote control is used in combination.
- although the reverse autonomous parking control shown in FIG. 2 is illustrated as an example of the autonomous parking control, the present invention can also be applied to garage parking, parallel parking, and other forms of autonomous parking.
- the remote parking system 1 of the present embodiment includes a target parking space setting device 11, a vehicle position detector 12, an object detector 13, a parking route generation unit 14, an object deceleration calculation unit 15, a route tracking control unit 16, a target vehicle speed generation unit 17, a steering angle control unit 18, a vehicle speed control unit 19, a master unit 20, a remote controller 21, and a slave unit 22.
- of these, the components from the target parking space setting device 11 through the vehicle speed control unit 19 and the master unit 20 are mounted on the vehicle.
- the remote controller 21 and the slave unit 22 are possessed by an operator such as a driver. Each configuration will be described below.
- the target parking space setting device 11 searches for parking spaces existing around the own vehicle in the remote control mode, and causes the operator to select a desired parking space from the parking spaces that can be parked. Further, the target parking space setting device 11 outputs the position information of the selected parking space (relative position coordinates from the current position of the own vehicle, latitude / longitude, etc.) to the parking route generation unit 14.
- the target parking space setting device 11 is provided with an input switch, a plurality of cameras, a parking space detection unit, and a touch panel type display (none of which are shown) in order to exert the above-mentioned functions.
- the input switch is used to select either the remote control mode or the assist mode. The plurality of cameras capture images of the surroundings of the own vehicle.
- the camera of the target parking space setting device 11 may also be used as the camera of the object detector 13 described later.
- the parking space detection unit is a computer on which a software program for detecting a parking space that can be parked from image data taken by a plurality of cameras is installed.
- the touch panel type display is used for displaying the detected parking space and selecting the parking space.
- the target parking space setting device 11 acquires image data around the own vehicle with the plurality of cameras, analyzes the image data, and detects parking spaces in which the vehicle can be parked. Further, the target parking space setting device 11 displays an image including the parkable spaces on the touch panel display and prompts the operator to select the parking space in which the vehicle is to be parked. When the operator selects a desired parking space from those displayed, the target parking space setting device 11 outputs the position information of that parking space to the parking route generation unit 14. When searching for parkable spaces, if the map information of the navigation device includes parking lot information with detailed position information, that parking lot information may also be used.
- the vehicle position detector 12 is composed of a GPS unit, a gyro sensor, a vehicle speed sensor, and the like.
- the GPS unit receives radio waves transmitted from a plurality of satellites and periodically acquires the position information of the own vehicle.
- the vehicle position detector 12 detects the current position of the own vehicle based on the position information of the own vehicle acquired by the GPS unit, the angle change information acquired from the gyro sensor, and the vehicle speed acquired from the vehicle speed sensor.
- the position information of the own vehicle detected by the vehicle position detector 12 is output to the parking route generation unit 14 and the route tracking control unit 16 at predetermined time intervals.
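The sensor fusion in the vehicle position detector 12 can be sketched as a simple dead-reckoning step between GPS fixes. This is an illustrative sketch, not code from the patent; the function name and pose layout are assumptions:

```python
import math

def dead_reckon(x, y, heading_rad, speed_mps, yaw_rate_rps, dt_s):
    """Advance the vehicle pose one step using the vehicle-speed sensor and
    the gyro yaw rate; periodic GPS fixes would correct (x, y) drift."""
    heading_rad += yaw_rate_rps * dt_s
    x += speed_mps * dt_s * math.cos(heading_rad)
    y += speed_mps * dt_s * math.sin(heading_rad)
    return x, y, heading_rad
```

For example, driving straight along the X axis at 10 m/s for one second advances the pose by 10 m with an unchanged heading.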
- the object detector 13 searches for an object such as an obstacle in the vicinity of the own vehicle, and includes a camera, a radar (millimeter wave radar, laser radar, ultrasonic radar, etc.), a sonar, or the like. Alternatively, a combination of these is provided. These cameras, radars or sonars, or a combination thereof, are mounted on the outer skin around the vehicle.
- the mounting position of the object detector 13 is not particularly limited; it can be attached, for example, to all or some of the center and both sides of the front bumper, the center and both sides of the rear bumper, and the sill outers under the left and right center pillars.
- the object detector 13 includes a computer in which a software program for identifying the position of an object detected by a camera, radar, or the like is installed.
- this computer outputs the identified object information (target information) and its position information (relative position coordinates from the current position of the own vehicle, latitude/longitude, etc.) to the parking route generation unit 14 and the object deceleration calculation unit 15.
- These object information and position information are used by the parking route generation unit 14 to generate a parking route before the start of autonomous parking control.
- the object deceleration calculation unit 15 uses the object information and the position information for the control of decelerating or stopping the own vehicle.
- the parking route generation unit 14 calculates a parking route from the current position of the own vehicle to the target parking position (in the remote control mode, this means the entry route; the same applies hereinafter) that does not collide with or interfere with any object.
- in this calculation, the size of the own vehicle (vehicle width, vehicle length, minimum turning radius, etc.) and the position information of the target parking space are used.
- FIG. 2 is a plan view showing an example of the remote control mode.
- the target parking space setting device 11 searches for one parking space TPS and displays an image containing it on the display.
- the parking route generation unit 14 calculates the parking route R1 from the current position P1 to the turning position P3 and the parking route R2 from the turning position P3 to the target parking space TPS. The series of parking routes R1 and R2 is then output to the route tracking control unit 16 and the target vehicle speed generation unit 17.
- the object deceleration calculation unit 15 receives the position information of obstacles and other objects from the object detector 13, calculates the time until the own vehicle collides with the object (TTC: Time To Collision) based on the distance to the object and the vehicle speed, and from this calculates the deceleration start timing of the own vehicle.
- in the autonomous parking control, the vehicle speed of the own vehicle V starts from the initial set value, and the own vehicle V is decelerated at the timing when the time TTC until it collides with an obstacle becomes equal to or less than a predetermined value. Likewise, when an unexpected obstacle is detected on the parking routes R1 and R2 during the series of autonomous parking control shown in FIG. 2, the vehicle speed of the own vehicle V is reduced, or the vehicle is stopped, at the timing when the TTC with respect to that obstacle becomes equal to or less than the predetermined value. This deceleration start timing is output to the target vehicle speed generation unit 17.
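The TTC-based deceleration trigger described above can be sketched as follows. The 3-second threshold is an illustrative assumption, not a value given in the patent:

```python
import math

def time_to_collision(distance_m: float, speed_mps: float) -> float:
    """TTC = distance to the object / current vehicle speed;
    infinite when the vehicle is stationary."""
    if speed_mps <= 0.0:
        return math.inf
    return distance_m / speed_mps

def should_decelerate(distance_m: float, speed_mps: float,
                      ttc_threshold_s: float = 3.0) -> bool:
    """Trigger deceleration once TTC falls to or below the threshold."""
    return time_to_collision(distance_m, speed_mps) <= ttc_threshold_s
```

At 5 m/s with an obstacle 10 m ahead, TTC is 2 s, so deceleration would start; at 30 m, TTC is 6 s and the vehicle continues at the target speed.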
- the route tracking control unit 16 tracks the own vehicle along the parking route at predetermined time intervals based on the parking route from the parking route generation unit 14 and the current position of the own vehicle from the vehicle position detector 12. Calculate the target steering angle of. Regarding the parking paths R1 and R2 in FIG. 2, the route tracking control unit 16 sets the target steering angle of the parking path R1 that goes straight and turns right from the current position P1 to the turning position P3 for each current position of the own vehicle V. Calculate at predetermined time intervals. Similarly, the path tracking control unit 16 calculates the target steering angle of the parking path R2 that turns left and goes straight from the turning position P3 to the target parking space TPS at predetermined time intervals for each current position of the own vehicle V. The path tracking control unit 16 outputs the calculated target steering angle to the steering angle control unit 18.
- the target vehicle speed generation unit 17 calculates, at predetermined time intervals, the target vehicle speed for making the own vehicle follow the parking route, based on the parking route from the parking route generation unit 14 and the deceleration start timing from the object deceleration calculation unit 15. For the parking routes R1 and R2 in FIG. 2, the target vehicle speed for starting from the current position P1, going straight, turning right, and stopping at the turning position P3 is calculated at predetermined time intervals for each current position of the own vehicle V and output to the vehicle speed control unit 19. Similarly, the target vehicle speed for starting again (in reverse) from the turning position P3, turning left into the target parking space TPS, and decelerating and stopping within the target parking space TPS is calculated for each current position of the own vehicle V at predetermined time intervals and output to the vehicle speed control unit 19. Further, when an unexpected obstacle is detected on the parking routes R1 and R2 while the series of autonomous parking control shown in FIG. 2 is being executed, the object deceleration calculation unit 15 outputs a deceleration or stop timing, so the target vehicle speed generation unit 17 outputs the target vehicle speed corresponding to that timing to the vehicle speed control unit 19.
- the steering angle control unit 18 generates a control signal for operating the steering actuator provided in the steering system of the own vehicle V based on the target steering angle from the path tracking control unit 16. Further, the vehicle speed control unit 19 generates a control signal for operating the accelerator actuator provided in the drive system of the own vehicle V based on the target vehicle speed from the target vehicle speed generation unit 17. By simultaneously controlling the steering angle control unit 18 and the vehicle speed control unit 19, autonomous parking control is executed.
- the international standard for autonomous driving control of a vehicle stipulates, as a condition for allowing remote control of the vehicle, that the distance between the vehicle and the operator be within a predetermined remote control distance (for example, within 6 m). Therefore, the remote parking system 1 of the present embodiment uses the slave unit 22 possessed by the operator U and the master unit 20 mounted on the own vehicle V to detect the relative position of the slave unit 22 with respect to the own vehicle V, that is, the relative position of the operator U who carries the slave unit 22.
- the slave unit 22 and the master unit 20 form a so-called keyless entry system.
- antennas 202a to 202d connected to the master unit 20 are installed at predetermined locations around the own vehicle V.
- the master unit 20 transmits a slave unit search signal from the antennas 202a to 202d.
- the slave unit 22 receives the slave unit search signal transmitted from each of the antennas 202a to 202d and measures the radio field strength of the search signal from each of the antennas 202a to 202d.
- the radio field strength of the slave unit search signal changes depending on the distance between the slave unit 22 and the antennas 202a to 202d.
- for example, the radio wave strength of the slave unit search signal received from the antenna 202b may be the strongest, while the strength of the search signal received from the antenna 202c near the right side of the rear bumper is the weakest.
- the slave unit 22 transmits the radio wave strength of the slave unit search signal of each of the measured antennas 202a to 202d to the master unit 20.
- the position detector 201 of the master unit 20 is, for example, a computer on which a software program is installed for calculating the position of the slave unit 22 by triangulation or the like from the radio wave strengths of the antennas 202a to 202d received from the slave unit 22.
- based on these radio wave strengths, the position detector 201 detects the relative position of the slave unit 22 with respect to the own vehicle V, that is, the position of the operator U who carries the slave unit 22 relative to the own vehicle V.
- the position detector 201 outputs the detected relative position of the slave unit 22 to the route tracking control unit 16 and the target vehicle speed generation unit 17 (or instead, to the steering angle control unit 18 and the vehicle speed control unit 19), and transmits it to the remote controller 21.
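As a rough sketch of how a position detector like 201 might estimate the slave unit position from per-antenna signal strengths, the following converts RSSI to range with a log-distance path-loss model and then trilaterates from three antennas. The model constants, the use of exactly three anchors, and the function names are illustrative assumptions, not details from the patent:

```python
def rssi_to_distance(rssi_dbm: float, tx_power_dbm: float = -40.0,
                     path_loss_exp: float = 2.0) -> float:
    """Log-distance path-loss model: estimated range (in the model's
    reference units) grows as the received signal weakens."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exp))

def trilaterate(anchors, distances):
    """Solve a 2D position from three (x, y) anchors and their ranges by
    subtracting circle equations to get a linear system, then applying
    Cramer's rule."""
    (x1, y1), (x2, y2), (x3, y3) = anchors
    d1, d2, d3 = distances
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a21 * b1) / det)
```

With anchors at (0, 0), (4, 0), and (0, 4) and exact ranges to the point (1, 2), the solver recovers that point.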
- the remote controller 21 is a device with which the operator U commands, from outside the vehicle, whether to continue or stop execution of the autonomous parking control set by the target parking space setting device 11. Therefore, the remote controller 21 has a wireless communication function for transmitting an execution command signal to the route tracking control unit 16 and the target vehicle speed generation unit 17 (or instead, to the steering angle control unit 18 and the vehicle speed control unit 19), and communicates with the wireless communication function provided in the own vehicle V.
- the remote controller 21 includes, for example, a mobile information terminal such as a smartphone on which application software for remote control (hereinafter referred to as an application) is installed.
- the smartphone on which the application is installed functions as the remote controller 21 of the remote parking system 1 by activating the application.
- the execution command signal is transmitted from the remote controller 21 to the own vehicle V only while a predetermined command gesture is continuously input on the touch panel 211 of the remote controller 21. The own vehicle V executes the autonomous parking control only while receiving the execution command signal transmitted from the remote controller 21; that is, when input of the command gesture to the remote controller 21 stops, the execution command signal is no longer transmitted and execution of the autonomous parking control of the vehicle is interrupted or stopped. The remote controller 21 also has a function of remotely starting a drive source such as the vehicle's engine or motor so that a vehicle parked in a narrow parking space can be remotely controlled from outside the vehicle.
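The dead-man's-switch behaviour described above (the remote transmits only while the gesture continues; the vehicle moves only while the signal arrives) can be sketched per update frame. The frame-loop structure and state names are illustrative assumptions:

```python
def simulate(gesture_stream):
    """For each frame: the remote controller transmits the execution command
    only while the command gesture is held, and the vehicle pauses the
    autonomous parking control as soon as the signal stops arriving."""
    vehicle_states = []
    for gesture_held in gesture_stream:
        command_sent = gesture_held            # remote controller side
        vehicle_states.append("moving" if command_sent else "paused")
    return vehicle_states
```

Releasing the gesture for even one frame pauses the vehicle; resuming the gesture resumes the parking maneuver.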
- the remote controller 21 includes a touch panel 211, a gesture determination unit 212, a storage unit 213, a command unit 214, and a communication unit 215.
- the touch panel 211 detects the touch operation of the operator U.
- the gesture determination unit 212 sets a gesture determination area for accepting gesture input on the touch panel 211. Further, the gesture determination unit 212 detects a gesture in the gesture determination area, and determines whether or not the detected gesture is a preset command gesture and whether or not the input speed of the gesture is within a predetermined range.
- the storage unit 213 stores various information related to the gesture determination by the gesture determination unit 212.
- when it is determined that the detected gesture is the command gesture and that the input speed of the gesture is within the predetermined range, the command unit 214 generates an execution command signal for causing the own vehicle V to execute autonomous parking control by the autonomous driving control function.
- the communication unit 215 transmits the execution command signal generated by the command unit 214 to the own vehicle V.
- the gesture input in the remote control mode is a predetermined command gesture.
- the form and size of the specified command gesture are stored in the storage unit 213 of the remote controller 21 in association with the application.
- the command gesture is, for example, a touch operation in which, taking the horizontal direction of the touch panel 211 as the X axis and the vertical direction as the Y axis, the coordinates of the input position change continuously along at least one of the X axis and the Y axis.
- for example, a gesture G1 in which a finger is slid vertically on the touch panel 211, a gesture G2 in which a finger is slid horizontally, or a gesture G3 in which a finger is slid diagonally may be used as the command gesture.
- however, with a monotonous gesture in which the finger is simply slid in a straight line, an input operation other than the command gesture may be erroneously determined to be the command gesture.
- therefore, a touch operation that draws on the touch panel the trajectory of a closed figure, in which the start point where one input of the gesture begins and the end point where that input ends overlap, may be used as the command gesture.
- as the closed figure, a gesture G4 composed of a ring-shaped figure as shown in FIG. 5B(A) may be used, or polygonal gestures G5 and G6 such as triangles or quadrangles as shown in FIGS. 5B(B) and 5B(C). Further, a gesture G7 composed of a figure such as a figure of eight, as shown in FIG. 5B(D), may be used.
- the hatched portion shows the start and end of the gesture as an example.
- the gesture G4 composed of the ring-shaped figure shown in FIG. 5B (A) is used as the specified command gesture.
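A closed-figure trajectory of the kind described can be detected by checking that the start and end of one stroke overlap. The pixel tolerance below is an illustrative assumption, not a value from the patent:

```python
import math

def is_closed_figure(points, tol: float = 20.0) -> bool:
    """A trajectory counts as a closed figure when its start and end
    points overlap within a tolerance (in touch-panel pixels)."""
    if len(points) < 3:
        return False
    (x0, y0), (xn, yn) = points[0], points[-1]
    return math.hypot(xn - x0, yn - y0) <= tol
```

A full circle of touch samples is accepted, while a straight swipe whose endpoints are far apart is rejected.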
- the gesture determination unit 212 functions when the CPU (Central Processing Unit) of the smartphone that functions as the remote controller 21 operates according to the application.
- the gesture determination unit 212 sets the gesture determination area according to the input position of the touch operation performed by the operator U on the touch panel 211, and also changes the size of the gesture determination area with respect to the touch panel 211. That is, since the position and size of the gesture determination area set on the touch panel 211 are variable according to the input position of the touch operation, the specified command gesture can be input at an arbitrary size at an arbitrary position on the touch panel 211. As a result, compared with the conventional technique in which the input position and input size of the gesture are always fixed, it is possible to prevent the autonomous parking control of the own vehicle V from being stopped due to a gesture input error.
- at the start of gesture input, the gesture determination unit 212 displays an input guide 212a showing the shape of the command gesture with reference to a preset position of the touch panel 211, for example the center position Ip0 (coordinates x0, y0).
- a message such as "Please touch the displayed input guide along the direction of the arrow" is displayed in the vicinity of the input guide 212a.
- the gesture determination unit 212 sets the first gesture determination area Ga0 at a preset position of the touch panel 211 regardless of the input position of the touch operation with respect to the touch panel 211 at the start of the gesture input. Specifically, as shown in FIG. 6B, the first gesture determination area Ga0 corresponding to the input guide 212a is set with reference to the center position Ip0 of the input guide 212a.
- the first gesture determination area Ga0 is defined, taking the width direction of the touch panel 211 as the X axis and the vertical direction as the Y axis, by the center coordinates (x0, y0) of the predetermined center position Ip0 and a radius R, as shown by the broken line in the figure.
- the radius R is the minimum input size rmin obtained by reducing the radius r of the command gesture G4 of the specified size by a predetermined ratio (for example, 30%) and the radius r by a predetermined ratio (for example, 30%). For example, it has a range of maximum input size rmax expanded by 150%).
- the specified size of the command gesture G4 a value corresponding to the size and resolution of the touch panel 211 is stored in advance in the storage unit 213.
- the gesture determination unit 212 sets the specified size of the command gesture G4, the minimum input size, and the maximum input size based on the size and resolution of the touch panel 211 of the smartphone on which the application is installed.
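As a concrete illustration of the sizing just described, the following Python sketch derives rmin and rmax from a specified radius r. This is an assumption-laden sketch, not the patent's implementation: the function name, the rule tying r to the panel dimensions, and the default ratios (which echo the "30%" and "150%" examples above) are all illustrative.

```python
def first_gesture_area(panel_width_px, panel_height_px,
                       shrink_ratio=0.30, expand_ratio=1.50):
    """Sketch of setting the first gesture determination area Ga0.

    Hypothetical sizing rule: the specified command-gesture radius r
    is assumed to scale with the shorter side of the touch panel.
    """
    r = 0.25 * min(panel_width_px, panel_height_px)
    r_min = r * (1.0 - shrink_ratio)   # minimum input size rmin
    r_max = r * expand_ratio           # maximum input size rmax
    # Ga0 is referenced to the preset centre position Ip0 (x0, y0),
    # taken here as the panel centre.
    x0, y0 = panel_width_px / 2.0, panel_height_px / 2.0
    return {"center": (x0, y0), "r": r, "r_min": r_min, "r_max": r_max}
```

Under these assumed ratios, a 1080x1920 panel would give r = 270 px, rmin = 189 px, and rmax = 405 px.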
- The input guide 212a is displayed and the first gesture determination area Ga0 is temporarily fixed in order to notify the operator U of the form of the command gesture G4 and to increase the determination rate of the gesture immediately after input is started. Further, at the start of remote control of the own vehicle V, the operator U is likely to be facing the remote controller 21 in order to operate it. Therefore, by displaying the input guide 212a at a preset position and setting the first gesture determination area Ga0 there, the operation required of the operator U can be suggested, and the operator U's sense of discomfort can be suppressed.
- The first gesture determination area Ga0 is defined by the minimum input size rmin and the maximum input size rmax so that a command gesture is recognized not only when the input gesture is the same size as the command gesture G4 of the specified size, but also when it is somewhat smaller or larger than the specified size, as long as it falls within the predetermined range.
- The gesture determination unit 212 determines whether or not the input gesture is a ring-shaped gesture like the specified command gesture G4 and whether or not its size fits within the first gesture determination area Ga0. In addition, the gesture determination unit 212 determines whether or not the gesture input speed is within a preset range. When the input gesture is a ring-shaped gesture like the command gesture G4, its size fits within the first gesture determination area Ga0, and the input speed is within the predetermined range, the gesture determination unit 212 determines that the command gesture G4 has been input. Therefore, as shown in FIG., the gesture determination unit 212 determines that the input is the command gesture G4 even when the input gesture G4a meanders within the first gesture determination area Ga0 and has a deformed elliptical shape. The input speed of the gesture is used for the command gesture determination in order to distinguish an input caused by some object coming into contact with the touch panel 211 from a gesture of the operator U.
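The three checks described above (ring shape, size within the determination area, and input speed within a preset range) could be sketched as follows. This is a hedged illustration only: the sampling format, the closing tolerance, and the speed bounds are assumed values, not figures from the patent.

```python
import math

def is_command_gesture(points, timestamps, area,
                       speed_range=(50.0, 3000.0), close_tol=40.0):
    """Sketch of the ring-gesture check by the gesture determination unit
    212. `points` is a list of (x, y) touch samples, `timestamps` their
    times in seconds, `area` a dict with "center", "r_min" and "r_max".
    The speed bounds and closing tolerance are illustrative values."""
    if len(points) < 3:
        return False
    cx, cy = area["center"]
    # Every sample must lie inside the annulus between rmin and rmax, so
    # a meandering, deformed ellipse (like G4a) is still accepted.
    for x, y in points:
        d = math.hypot(x - cx, y - cy)
        if not (area["r_min"] <= d <= area["r_max"]):
            return False
    # The trajectory must return near its start (ring shape).
    if math.hypot(points[-1][0] - points[0][0],
                  points[-1][1] - points[0][1]) > close_tol:
        return False
    # The input speed must fall within a preset range, to reject contact
    # by some object other than the operator U's finger.
    path = sum(math.hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(points, points[1:]))
    duration = timestamps[-1] - timestamps[0]
    if duration <= 0:
        return False
    speed = path / duration  # px per second
    return speed_range[0] <= speed <= speed_range[1]
```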
- The gesture determination unit 212 hides the input guide 212a when it determines that the input gesture is the command gesture G4, or when a new gesture determination area is set according to a change in the gesture input position. When the gesture determination unit 212 detects that the command gesture continues to be input along the input guide 212a, it may keep the input guide 212a displayed. Further, when a new gesture determination area is set, the gesture determination unit 212 may display the input guide 212a at the position of the new gesture determination area.
- The gesture determination unit 212 sets a new gesture determination area according to the movement of the gesture input position on the touch panel 211. A new gesture determination area is likewise set according to the input position even when an operator U who understands in advance that the input position and input size of the gesture are variable inputs the gesture while ignoring the input guide 212a. For setting the new gesture determination area, the detected values xraw and yraw of the touch operation detected by the touch panel 211 and the moving average values xmean and ymean of these detected values are used.
- The graphs in FIG. 7 show an example of the detected values of the touch operation detected by the touch panel 211 when the operator U inputs a gesture on the touch panel 211. In graph (A), the horizontal axis indicates time, and the vertical axis indicates the detected value xraw of the touch operation in the X-axis direction and the moving average value xmean of that detected value. The positive side of the vertical axis of graph (A) indicates the right side of the X-axis center line of the touch panel 211, and the negative side indicates the left side of that center line. In graph (B), the horizontal axis indicates time, and the vertical axis indicates the detected value yraw of the touch operation in the Y-axis direction and the moving average value ymean of that detected value. The positive side of the vertical axis of graph (B) indicates the area above the Y-axis center line of the touch panel 211, and the negative side indicates the area below that center line.
- The detected values xraw and yraw of the touch panel 211 are stored in the storage unit 213 as needed. Further, the moving average values xmean and ymean are calculated by the gesture determination unit 212 and stored in the storage unit 213.
- The gesture determination unit 212 acquires the coordinates (xb, yb) of the current touch position Tp1 of the operator U on the touch panel 211 from the storage unit 213. Further, the gesture determination unit 212 acquires from the storage unit 213 the moving average values (xmean, ymean) over the past several seconds (hereinafter referred to as the radius determination time T) until the touch position reached Tp1.
- The gesture determination unit 212 calculates the radius rb of the gesture G4b up to the point where the touch position moved to Tp1 and the angle θ between the touch position Tp1 and the X axis of the touch panel 211 using the following Equations 1 and 2. Then, the gesture determination unit 212 calculates the coordinates (x1, y1) of the center position Ip1 of the gesture G4b based on the radius rb and the angle θ, and sets a new gesture determination area Ga1 based on the center position Ip1. In this way, the gesture determination unit 212 continually sets a new gesture determination area according to the movement of the gesture input position.
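Equations 1 and 2 are referenced but not reproduced in this excerpt, so the sketch below takes one plausible reading as an assumption: the moving average (xmean, ymean) over the radius determination time T approximates the center of the ring being drawn, rb is the distance from that center to the current touch position Tp1 (xb, yb), and θ is the angle of Tp1 seen from that center; the new center Ip1 then follows from Tp1, rb and θ. All names, window sizes and ratios are illustrative.

```python
import math
from collections import deque

def update_area(history, new_point, r_window=40,
                r_min_ratio=0.7, r_max_ratio=1.5):
    """Sketch of re-centring the gesture determination area as the input
    position moves. `history` is a deque of recent (x, y) samples covering
    the radius determination time T; `new_point` is the current touch
    position Tp1 = (xb, yb)."""
    history.append(new_point)
    while len(history) > r_window:
        history.popleft()
    xb, yb = new_point
    xmean = sum(p[0] for p in history) / len(history)
    ymean = sum(p[1] for p in history) / len(history)
    rb = math.hypot(xb - xmean, yb - ymean)      # gesture radius rb
    theta = math.atan2(yb - ymean, xb - xmean)   # angle theta to the X axis
    # New centre Ip1 = (x1, y1) derived from Tp1, rb and theta.
    x1 = xb - rb * math.cos(theta)
    y1 = yb - rb * math.sin(theta)
    return {"center": (x1, y1),
            "r_min": rb * r_min_ratio, "r_max": rb * r_max_ratio}
```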
- In some cases, the gesture temporarily cannot be detected by the gesture determination unit 212. Cases in which the gesture cannot be detected by the gesture determination unit 212 include when the finger of the operator U is released from the touch panel 211, when the input gesture is not determined to be a command gesture, and when the operator U ends the gesture input partway through. The case where the finger of the operator U is released from the touch panel 211 also includes the case where the finger of the operator U moves off the edge of the touch panel 211.
- In such cases, the gesture determination unit 212 switches the setting process of the gesture determination area based on the time during which the gesture cannot be detected (the non-detection time) and the above-mentioned radius determination time T.
- The gesture determination unit 212 compares the non-detection time with the radius determination time T, and when it determines that the non-detection time is less than the radius determination time T, it calculates the moving average value using the coordinates of the touch position detected immediately before the gesture became undetectable, and sets a new gesture determination area using this calculation result. In other words, a new gesture determination area is set without using the touch positions within the non-detection time.
- Similarly, when the finger of the operator U moves off the edge of the touch panel 211, the gesture determination unit 212 calculates the moving average value using the coordinates of the touch position detected immediately before the finger left the touch panel 211, and sets a new gesture determination area Ga2 using this calculation result.
- On the other hand, when the input gesture is not determined to be a command gesture due to an input error of the operator U, the gesture non-detection time is expected to be relatively long compared with the case where the finger is simply released from the touch panel 211. Therefore, when the gesture determination unit 212 determines that the gesture non-detection time is equal to or longer than the radius determination time T, it stores the gesture determination area set immediately before the gesture became undetectable in the storage unit 213 as the second gesture determination area. Then, when a gesture input is detected again, the second gesture determination area is read from the storage unit 213 and used to determine the gesture. For example, in the example shown in FIG., the gesture determination unit 212 stores the gesture determination area Ga2 set immediately before the gesture G4c ceased to be detected in the storage unit 213 as the second gesture determination area. Then, when the gesture determination unit 212 detects a gesture input again, it reads the gesture determination area Ga2 from the storage unit 213 and uses it to determine the gesture.
- In this way, the gesture determination rate can be increased by using the gesture determination area set immediately before the gesture became undetectable.
- When the gesture determination unit 212 determines that the gesture non-detection time is equal to or longer than a preset input restart standby time, it displays the input guide 212a on the touch panel 211 and sets the first gesture determination area Ga0 based on the preset center position Ip0. The input restart standby time is longer than the radius determination time T, and is preferably set to, for example, a time of several seconds.
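The three-way switching described above (non-detection time shorter than T; at least T but shorter than the input restart standby time; at least the standby time) can be summarized in a small sketch. The function and its string labels are illustrative only, not an API from the patent.

```python
def on_detection_resumed(non_detect_time, T, restart_wait,
                         last_area, stored_second_area):
    """Sketch of how the gesture determination unit 212 switches the
    determination-area setting after a detection gap (times in seconds).
    Returns (action, area) describing what to use when input resumes."""
    if non_detect_time < T:
        # Short gap (finger briefly lifted or off the panel edge):
        # keep averaging from the coordinates detected just before the gap.
        return ("recompute_from_last_touch", last_area)
    if non_detect_time < restart_wait:
        # Medium gap (input possibly not judged to be the command gesture):
        # reuse the area stored just before detection was lost.
        return ("use_second_area", stored_second_area)
    # Long gap: show the input guide again and reset to the first gesture
    # determination area Ga0 at the preset centre Ip0.
    return ("reset_to_first_area", None)
```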
- When the gesture detected by the touch panel 211 is determined by the gesture determination unit 212 to be the command gesture G4, the command unit 214 generates an execution command signal for causing the own vehicle V to execute the autonomous parking control by the autonomous driving control function.
- the command unit 214 inputs the generated execution command signal to the communication unit 215.
- The command unit 214 is realized by the CPU of the smartphone that functions as the remote controller 21 operating according to the application.
- the communication unit 215 uses the communication function provided in advance by the smartphone that functions as the remote controller 21.
- the communication unit 215 is, for example, a wireless communication unit such as Bluetooth (registered trademark).
- The communication unit 215 is connected to a wireless communication unit (not shown) mounted on the own vehicle V, and transmits the execution command signal to the own vehicle V.
- In addition, a wireless LAN (Local Area Network) such as Wi-Fi (registered trademark), a mobile phone line, or the like may be used as the communication unit 215.
- the execution command signal transmitted to the own vehicle V is input to the route tracking control unit 16 and the target vehicle speed generation unit 17. Further, as described above, the relative positions of the own vehicle V and the slave unit 22 are input from the position detector 201 to the route tracking control unit 16 and the target vehicle speed generation unit 17.
- The route tracking control unit 16 outputs the target steering angle to the steering angle control unit 18 when the distance between the own vehicle V and the slave unit 22 is within the remote control distance and the execution command signal from the remote controller 21 is input. Similarly, the target vehicle speed generation unit 17 outputs the target vehicle speed to the vehicle speed control unit 19 when the distance between the own vehicle V and the slave unit 22 is within the remote control distance and the execution command signal from the remote controller 21 is input.
- the steering angle control unit 18 generates a control signal for operating the steering actuator provided in the steering system of the own vehicle V based on the target steering angle from the path tracking control unit 16. Further, the vehicle speed control unit 19 generates a control signal for operating the accelerator actuator provided in the drive system of the own vehicle V based on the target vehicle speed from the target vehicle speed generation unit 17.
- On the other hand, when the distance between the own vehicle V and the slave unit 22 is greater than the remote control distance, the route tracking control unit 16 does not output the target steering angle to the steering angle control unit 18 even when the execution command signal from the remote controller 21 is input. Likewise, the target vehicle speed generation unit 17 does not output the target vehicle speed to the vehicle speed control unit 19 when the distance between the own vehicle V and the slave unit 22 is greater than the remote control distance, even when the execution command signal from the remote controller 21 is input. That is, when the distance between the own vehicle V and the slave unit 22 is greater than the remote control distance, the autonomous parking control is not executed even if the command gesture is input on the remote controller 21.
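The gating applied identically by the route tracking control unit 16 and the target vehicle speed generation unit 17 reduces to a single condition; the sketch below is illustrative, with assumed names and units (meters).

```python
def should_output_targets(distance_m, remote_control_distance_m,
                          execution_command_received):
    """Sketch of the gate on the target steering angle / target vehicle
    speed: both are output only while the operator (slave unit 22) is
    within the remote control distance of the own vehicle V AND the
    execution command signal from the remote controller 21 is being
    received."""
    return (execution_command_received
            and distance_m <= remote_control_distance_m)
```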
- FIG. 10 is a flowchart showing a control procedure executed by the remote parking system 1 of the present embodiment.
- FIG. 11 is a flowchart showing a procedure of detecting and determining a gesture in the remote controller 21 and transmitting an execution command signal.
- FIG. 12 is a flowchart showing a procedure for setting a gesture determination area when a gesture input is detected after the gesture is temporarily stopped being detected.
- When the own vehicle V arrives at the position P1 near the target parking space TPS, in step S1 shown in FIG. 10, the operator U such as the driver turns on the remote parking start switch of the on-board target parking space setting device 11 and selects the remote entry mode.
- In step S2, the target parking space setting device 11 searches for parking spaces in which the own vehicle V can park using a plurality of on-board cameras, and in step S3 determines whether or not an available parking space exists. If there is an available parking space, the process proceeds to step S4; if not, the process returns to step S1. If no available parking space is detected in step S2, the operator may be notified by a text display such as "There is no available parking space" or by voice, and this process may be terminated.
- step S4 the target parking space setting device 11 displays the parking space that can be parked on the in-vehicle display, prompts the operator U to select the desired parking space, and the operator U selects a specific parking space TPS. Then, the target parking position information is output to the parking route generation unit 14.
- step S5 the parking route generation unit 14 generates the parking routes R1 and R2 shown in FIG. 2 from the current position P1 of the own vehicle V and the parking space TPS which is the target parking position.
- the object deceleration calculation unit 15 calculates the deceleration start timing at the time of autonomous parking control based on the object information detected by the object detector 13.
- the parking routes R1 and R2 generated by the parking route generation unit 14 are output to the route tracking control unit 16, and the deceleration start timing calculated by the object deceleration calculation unit 15 is output to the target vehicle speed generation unit 17.
- step S6 the operator is urged to consent to the start of the autonomous parking control, and when the operator approves the start, the autonomous driving control by the assist mode is started.
- the vehicle moves forward by turning right once from the current position P1 shown in FIG. 2, and when it reaches the turning position P3, it reverses by turning left to the intermediate stop position P4.
- In step S7, since the own vehicle V has reached the intermediate stop position P4, the own vehicle V is stopped and the operator U is prompted to get out.
- the operator U activates the remote controller 21 in step S8.
- Examples of the start input of remote control by the remote controller 21 include activating the application installed on the remote controller 21, a door unlocking operation, a door locking and unlocking operation, and combinations of these with activating the application.
- In step S9, while the own vehicle V is in a stopped state, the pairing process between the remote controller 21 and the own vehicle V is performed. When the own vehicle V has authenticated the remote controller 21 through the pairing process in step S9 and has become able to accept commands, remote operation is started in step S10.
- When remote control by the remote controller 21 is started, the gesture determination unit 212 performs the initial setting in step S101, as shown in FIG. 11. In this initial setting, as shown in FIG. 6A, the input guide 212a and the message "Please touch the displayed input guide along the direction of the arrow" are displayed on the touch panel 211. Further, as shown in FIG. 6B, the gesture determination unit 212 sets the first gesture determination area Ga0 based on the center position Ip0 of the input guide 212a.
- The operator U performs a touch operation along the input guide 212a to input a ring-shaped gesture on the touch panel 211. Further, in the remote parking system 1 of the present embodiment, since the position and size of the gesture determination area are variable, the operator U can also ignore the input guide 212a and input a gesture of arbitrary size at an arbitrary position on the touch panel 211.
- The detected values xraw and yraw of the touch operation detected by the touch panel 211 are stored in the storage unit 213 as needed. Further, the moving average values xmean and ymean are calculated by the gesture determination unit 212 and stored in the storage unit 213.
- step S102 the touch panel 211 detects the gesture input by the touch operation of the operator U.
- the gesture determination unit 212 determines in step S103 whether or not the detected gesture is input to the first gesture determination area Ga0.
- When the gesture is input to the first gesture determination area Ga0, the gesture determination unit 212 determines in step S104 whether or not the detected gesture is a ring-shaped gesture like the command gesture G4 and whether or not the input speed is within the preset range.
- If the detected gesture is not the command gesture G4, or if the input speed is out of the predetermined range, the process proceeds to step S106 to detect the next gesture. Conversely, when the detected gesture is the command gesture G4 and the input speed is within the predetermined range, the process proceeds to step S105, where the command unit 214 generates an execution command signal and the communication unit 215 transmits the execution command signal to the own vehicle V. In the next step S106, if gesture detection continues, the process returns to step S103 to repeat the gesture determination and the transmission of the execution command signal.
- If the gesture is not input to the first gesture determination area Ga0, the gesture determination unit 212 proceeds to step S107 and sets a new gesture determination area based on the current gesture input position. As described above, the gesture determination unit 212 sets the new gesture determination area based on the coordinates of the current touch position of the operator U on the touch panel 211 and the moving average values up to the point where the touch position reached the current touch position.
- The gesture determination unit 212 then uses the newly set gesture determination area to determine whether or not the detected gesture is a ring-shaped gesture like the command gesture G4 and whether or not the input speed is within the preset range. If the detected gesture is not the command gesture G4, or if the input speed is out of the predetermined range, the process proceeds to step S110 to detect the next gesture. Conversely, when the detected gesture is the command gesture G4 and the input speed is within the predetermined range, the process proceeds to step S109, where the command unit 214 generates an execution command signal and the communication unit 215 transmits the execution command signal to the own vehicle V. In the next step S110, if gesture detection continues, the process returns to step S107, and the setting of a new gesture determination area, the gesture determination, and the transmission of the execution command signal are repeated.
- In step S11, the slave unit 22 and the master unit 20 are used to detect the relative position of the slave unit 22 with respect to the own vehicle V, that is, the relative position between the operator U carrying the slave unit 22 and the own vehicle V.
- the master unit 20 outputs the detected relative position to the route tracking control unit 16 and the target vehicle speed generation unit 17.
- The route tracking control unit 16 outputs the target steering angle to the steering angle control unit 18 when the distance between the own vehicle V and the slave unit 22 is within the remote control distance and the execution command signal from the remote controller 21 is input.
- Similarly, the target vehicle speed generation unit 17 outputs the target vehicle speed to the vehicle speed control unit 19 when the distance between the own vehicle V and the slave unit 22 is within the remote control distance and the execution command signal from the remote controller 21 is input.
- the steering angle control unit 18 generates a control signal for operating the steering actuator provided in the steering system of the own vehicle V based on the target steering angle from the path tracking control unit 16.
- the vehicle speed control unit 19 generates a control signal for operating the accelerator actuator provided in the drive system of the own vehicle V based on the target vehicle speed from the target vehicle speed generation unit 17.
- autonomous parking control is executed in the next step S12.
- The processes from step S10 to step S13, described later, are executed at predetermined time intervals until the own vehicle V arrives at the target parking space TPS in step S13.
- In step S13, it is determined whether or not the own vehicle V has arrived at the target parking space TPS. If it has not arrived, the process returns to step S10; when the own vehicle V arrives at the target parking space TPS, the own vehicle V is stopped and the process ends.
- In this way, autonomous travel control in the assist mode is executed on the travel route from the current position P1 of the own vehicle V to the intermediate stop position P4, and autonomous travel control in the remote control mode is executed on the travel route from the intermediate stop position P4 to the target parking space TPS.
- When the gesture input temporarily cannot be detected, the gesture determination unit 212 measures the non-detection time in step S20 shown in FIG. 12, and in the next step S21 compares it with the radius determination time T used for detecting the gesture input position. When the non-detection time is less than the radius determination time T, it is conceivable that the finger of the operator U has temporarily separated from the touch panel 211 or has temporarily moved off the edge of the touch panel 211. Therefore, the gesture determination unit 212 proceeds to the next step S22, calculates the moving average value using the coordinates of the touch position detected immediately before the gesture became undetectable, and sets a new gesture determination area using this calculation result.
- When the gesture non-detection time is less than the radius determination time T, the set position of the gesture determination area does not deviate significantly even if a new gesture determination area is set without using the touch positions within the non-detection time. Therefore, the gesture determination result is not significantly affected.
- If the gesture non-detection time is equal to or longer than the radius determination time T in step S21, the process proceeds to step S23, and the gesture non-detection time is compared with the preset input restart standby time. If the non-detection time is less than the input restart standby time, it is possible that the input gesture was not determined to be the command gesture. In that case, if the gesture determination area were set based on a gesture not determined to be the command gesture, the setting position of the gesture determination area could shift. Therefore, the gesture determination unit 212 proceeds to the next step S24 and stores the gesture determination area set immediately before the gesture became undetectable in the storage unit 213 as the second gesture determination area. Then, when a gesture input is detected, the second gesture determination area is read from the storage unit 213 and used to determine the gesture.
- On the other hand, if the gesture non-detection time is equal to or longer than the input restart standby time, the process proceeds to step S25. In step S25, the gesture determination unit 212 displays the input guide 212a on the touch panel 211 and sets the first gesture determination area Ga0 based on the preset center position Ip0. As a result, the determination rate of the resumed gesture can be increased.
- the touch panel 211 of the remote controller 21 detects the input position where the touch operation of the operator U is performed.
- the position of the gesture determination area set by the gesture determination unit 212 is variable according to the input position.
- The gesture is detected in the gesture determination area, and the gesture determination unit 212 determines whether or not the detected gesture is the preset command gesture. When the gesture is the command gesture, the own vehicle V, which has the autonomous travel control function, is caused to execute the autonomous parking control as autonomous travel control.
- As a result, the gesture can be input at an arbitrary position on the touch panel 211, so the own vehicle V can be made to execute the autonomous parking control regardless of the input position of the gesture. Further, since the gesture can be input without worrying about the input position, the operability of the remote controller 21 is improved. Moreover, compared with using the entire touch panel 211 as the gesture determination area, the gesture can be determined within a small gesture determination area, so the processing load required for the gesture determination can be reduced.
- the size of the gesture judgment area for the touch panel 211 is variable. As a result, a gesture of any size can be input to the touch panel 211, and the own vehicle V can execute autonomous parking control regardless of the size of the gesture. Further, since the size of the gesture determination area is variable between the preset minimum input size and the maximum input size, it is possible to suppress the input of gestures having extremely different sizes.
- Since the first gesture determination area Ga0 is set at a preset position at the start of gesture input, the determination rate of the gesture at the start of input can be increased, and the own vehicle V can quickly start the autonomous parking control.
- Since a new gesture determination area is set according to changes in the input position of the touch operation, the own vehicle V can continue the autonomous parking control even if the gesture input position deviates from the first gesture determination area. In particular, at the start of remote control of the own vehicle V, the operator U is likely to be looking at the remote controller 21, but thereafter the operator U is likely to take his or her eyes off the remote controller 21 in order to monitor the own vehicle V, so there is a high possibility that the touch operation deviates from the first gesture determination area Ga0. Therefore, by setting a new gesture determination area according to the input position of the touch operation, the autonomous travel control of the own vehicle V can be continued.
- When the gesture becomes undetectable, the gesture determination area set immediately before the gesture became undetectable is stored as the second gesture determination area, and when a gesture input is detected again, the gesture is determined using the second gesture determination area. When a gesture input is temporarily stopped and then restarted, it is highly likely that the input is restarted at the input position before the stop. Therefore, by using the second gesture determination area for the determination when the gesture input is restarted, the gesture determination rate can be increased.
- When the gesture non-detection time is equal to or longer than the input restart standby time, the gesture can be determined using the first gesture determination area. For example, when gesture input is temporarily stopped and then restarted after a relatively long time, the gesture determination rate can be increased by setting the first gesture determination area.
- Cases in which the gesture cannot be detected include when the finger of the operator U is released from the touch panel 211, when the gesture is not determined to be the command gesture, and when the operator U ends the gesture input partway through. In these cases, the gesture determination area set immediately before may be stored and used, or the first gesture determination area may be used.
- Since the coordinates of the input position of the operator's touch operation change continuously on at least one of the X axis and the Y axis, even an operator who is unfamiliar with touch operations on the touch panel 211 can easily input the command gesture.
- As the command gesture, a touch operation that draws the trajectory of a figure on the touch panel 211 such that the start point, at which one gesture input starts, and the end point, at which that input ends, overlap can be used. For example, a touch operation that draws the locus of a ring-shaped figure on the touch panel 211 can be applied. As a result, the command gesture can be distinguished from other monotonous and simple gestures, so erroneous determination of the gesture can be suppressed.
- When the command gesture is a ring-shaped figure, the gesture determination area can be defined by the center coordinates and the radius of the ring-shaped figure. Therefore, compared with a gesture consisting of a complicated figure, the gesture determination area can be set by relatively simple processing.
- When the gesture is no longer detected, the own vehicle V stops the autonomous parking control. Therefore, no separate operation is required to make the own vehicle V stop the autonomous parking control, and remote control of the own vehicle V becomes easy.
- Since the autonomous parking control for parking the own vehicle V is executed as the autonomous travel control, the own vehicle V can be parked by remote control from a distant position.
- Second Embodiment: Next, a second embodiment of the remote parking system to which the vehicle remote control method and the vehicle remote control device of the present invention are applied will be described. The same reference numerals as in the first embodiment are used for the same configurations as in the first embodiment, and detailed description thereof is omitted.
- the first gesture determination area Ga0 is set at a preset position of the touch panel 211 regardless of the input position of the touch operation with respect to the touch panel 211. Then, when the input position of the touch operation is not within the first gesture determination area Ga0, a new gesture determination area is set according to the input position.
- the first gesture determination area Ga0 is set at a preset position of the touch panel 211 before the autonomous travel control of the own vehicle V is started. Then, after the autonomous travel control of the own vehicle V is started, a new gesture determination area is set according to the input position of the touch operation. That is, in the present embodiment, the setting of the gesture determination area is switched before and after the start of the autonomous travel control of the own vehicle V.
- in step S10 relating to the remote operation of the present embodiment, in step S101a before the autonomous travel control of the own vehicle V is started, the gesture determination unit 212 sets the first gesture determination area Ga0 at a preset position on the touch panel 211, regardless of the input position of the touch operation on the touch panel 211. The input guide 212a may be displayed on the touch panel 211 corresponding to the first gesture determination area Ga0.
- in the next step S103a, the gesture determination unit 212 determines whether the detected gesture is the command gesture G4 and whether the input speed is within a predetermined range set in advance.
- if the detected gesture is not the command gesture G4, or if the input speed is out of the predetermined range, the gesture determination unit 212 returns to step S102a to detect the next gesture. Conversely, when the detected gesture is the command gesture G4 and the input speed is within the predetermined range, the process proceeds to step S104a, where the command unit 214 generates an execution command signal and the communication unit 215 sends the execution command signal to the own vehicle V. As a result, autonomous travel control is started in the own vehicle V. In the next step S105a, if the gesture detection continues, the process proceeds to step S106a, and the gesture determination unit 212 sets a new gesture determination area according to the input position of the touch operation on the touch panel 211. After setting the new gesture determination area, the process returns to step S103a and the gesture determination is repeated.
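The S101a–S106a flow can be sketched as a small loop. This is a hypothetical illustration only: the `TouchEvent` type, the numeric speed range, and the way an "area" is represented are all assumptions, and a real system would additionally transmit the execution command signal to the vehicle.

```python
from dataclasses import dataclass

@dataclass
class TouchEvent:
    x: float
    y: float
    speed: float      # input speed of the gesture (assumed units)
    is_ring: bool     # stand-in for "the trajectory matches command gesture G4"

SPEED_RANGE = (0.1, 2.0)  # predetermined input-speed range (assumed values)

def gesture_loop(events):
    """Returns (control_started, final_area)."""
    area = ("preset", None)            # S101a: Ga0 at a preset position
    started = False
    for ev in events:                  # S102a: detect the next gesture
        # S103a: command gesture G4 with an in-range input speed?
        if not ev.is_ring or not (SPEED_RANGE[0] <= ev.speed <= SPEED_RANGE[1]):
            continue                   # otherwise go back and detect again
        if not started:
            started = True             # S104a: execution command -> vehicle
        # S105a/S106a: while gestures continue, re-set the determination
        # area according to the current input position.
        area = ("dynamic", (ev.x, ev.y))
    return started, area

ok, area = gesture_loop([
    TouchEvent(100, 100, speed=5.0, is_ring=True),   # too fast: ignored
    TouchEvent(100, 100, speed=1.0, is_ring=True),   # starts control
    TouchEvent(140, 220, speed=1.0, is_ring=True),   # area follows the finger
])
print(ok, area)   # True ('dynamic', (140, 220))
```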
- the operator U is likely to look at the remote controller 21 in order to start operating the remote controller 21. Therefore, by displaying the input guide 212a at a preset position on the touch panel 211 and setting the first gesture determination area Ga0 there, the operation required of the operator U can be suggested, and any feeling of strangeness can be suppressed.
- the operator U is likely to take his or her eyes off the remote controller 21 to monitor the own vehicle V, so the touch operation often deviates from the first gesture determination area Ga0. Therefore, by setting a new gesture determination area according to the input position of the touch operation, the autonomous travel control of the own vehicle V can be continued.
- in the second embodiment, the first gesture determination area Ga0 is set at a preset position on the touch panel 211 before the autonomous travel control of the own vehicle V is started, and after the autonomous travel control is started, a new gesture determination area is set according to the input position of the touch operation. In the present embodiment, by contrast, a new gesture determination area is set according to the input position of the touch operation before the autonomous travel control of the own vehicle V is started, and after the autonomous travel control is started, the first gesture determination area Ga0 is set at a preset position on the touch panel 211. That is, in this embodiment the switching of the gesture determination area before and after the start of the autonomous travel control is reversed from that in the second embodiment.
- in step S10 relating to the remote operation of the present embodiment, in step S101b before the autonomous travel control of the own vehicle V is started, the gesture determination unit 212 sets a new gesture determination area according to the input position of the touch operation on the touch panel 211.
- in the next step S103b, the gesture determination unit 212 determines whether the detected gesture is the command gesture G4 and whether the input speed is within a predetermined range set in advance.
- in step S104b, the command unit 214 generates an execution command signal and the communication unit 215 sends the execution command signal to the own vehicle V. As a result, autonomous travel control is started in the own vehicle V.
- in step S105b, if the gesture detection continues, the process proceeds to step S106b.
- in step S106b, the gesture determination unit 212 sets the first gesture determination area Ga0 at a preset position on the touch panel 211, regardless of the input position of the touch operation on the touch panel 211.
- the input guide 212a may be displayed on the touch panel 211 corresponding to the first gesture determination area Ga0. After setting the first gesture determination area Ga0, the process returns to step S103b and the gesture determination is repeated.
- the operator U may turn his or her eyes to the own vehicle V to confirm that the own vehicle V starts the autonomous driving control. Therefore, by setting a new gesture determination area according to the input position of the touch operation, the autonomous travel control of the own vehicle V can be started easily. Further, after the autonomous parking control of the own vehicle V is started, the operator U is likely to look at the remote controller 21 in order to operate it accurately. Therefore, by displaying the input guide 212a at a preset position and setting the first gesture determination area Ga0, the operator U can be prompted to operate the remote controller 21 accurately.
- in the embodiments above, the gesture determination area is switched using a change in the input position of the touch operation, or the start of the autonomous driving control of the own vehicle V, as a trigger.
- in the present embodiment, when the autonomous driving control of the own vehicle V is executed, it is determined whether or not the own vehicle V is traveling straight under the autonomous driving control. If the vehicle is traveling straight, a new gesture determination area is set according to the input position on the touch panel 211, and if it is not traveling straight, the first gesture determination area Ga0 is set at a preset position.
- step S10 relating to the remote operation of the present embodiment, the gesture determination unit 212 determines in step S101c whether or not the own vehicle V is executing autonomous travel control.
- the gesture determination unit 212 determines in step S102c whether or not the own vehicle V is traveling straight.
- the gesture determination unit 212 proceeds to the next step S103c and sets a new gesture determination area according to the input position of the touch operation on the touch panel 211.
- in step S104c, the gesture determination unit 212 sets the first gesture determination area Ga0 at a preset position on the touch panel 211, regardless of the input position of the touch operation on the touch panel 211.
- the input guide 212a may be displayed on the touch panel 211 corresponding to the first gesture determination area Ga0.
- when the own vehicle V goes straight under autonomous driving control, especially when it goes straight in a direction away from the operator U, the operator U is likely to turn his or her eyes to the own vehicle V so that it does not collide with an obstacle or the like. Therefore, by setting a new gesture determination area according to the input position of the touch operation, the execution of the autonomous travel control of the own vehicle V can be facilitated. Further, when the own vehicle V is turning without going straight, the autonomous travel control of the own vehicle V may be performed relatively close to the operator U. In such a case, the operator U is likely to look at the remote controller 21 in order to operate it accurately. Therefore, by displaying the input guide 212a at a preset position and setting the first gesture determination area Ga0, the operator U can be prompted to operate the remote controller 21 accurately.
- in the fourth embodiment, when the own vehicle V is traveling straight under autonomous driving control, the gesture determination area is set according to the input position on the touch panel 211, and when the vehicle is not traveling straight, the first gesture determination area Ga0 is set at a preset position.
- in the present embodiment, conversely, when the own vehicle V is traveling straight, the first gesture determination area Ga0 is set at a preset position, and when the own vehicle V is not traveling straight, the gesture determination area is set according to the input position on the touch panel 211.
- the gesture determination unit 212 determines in step S101d whether or not the own vehicle V is executing the autonomous travel control.
- the gesture determination unit 212 determines in step S102d whether or not the own vehicle V is traveling straight.
- the gesture determination unit 212 proceeds to the next step S103d and sets the first gesture determination area Ga0 at a preset position on the touch panel 211, regardless of the input position of the touch operation on the touch panel 211.
- the input guide 212a may be displayed on the touch panel 211 corresponding to the first gesture determination area Ga0. If the own vehicle V is not traveling straight in step S102d, the gesture determination unit 212 sets a new gesture determination area according to the input position on the touch panel 211 in the next step S104d.
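The fourth and fifth embodiments select the area based on whether the vehicle is traveling straight, in opposite directions. A minimal sketch of that selection rule follows; the function name, the boolean flag, and the fallback to the preset area outside autonomous control are illustrative assumptions, not details given by the patent.

```python
def select_area(executing_control, traveling_straight, input_pos,
                dynamic_when_straight=True):
    """Return which gesture determination area to use.

    dynamic_when_straight=True  -> fourth embodiment (S101c-S104c):
        straight travel -> area follows the touch input position.
    dynamic_when_straight=False -> fifth embodiment (S101d-S104d):
        straight travel -> fixed area Ga0 with the input guide shown.
    """
    if not executing_control:
        return ("preset", None)  # assumed default outside autonomous control
    if traveling_straight == dynamic_when_straight:
        return ("dynamic", input_pos)
    return ("preset", None)

# Fourth embodiment: vehicle going straight, operator watching the vehicle.
print(select_area(True, True, (50, 80)))
# ('dynamic', (50, 80))

# Fifth embodiment: same situation, but the fixed area Ga0 is used instead.
print(select_area(True, True, (50, 80), dynamic_when_straight=False))
# ('preset', None)
```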
- the operator U is highly likely to turn his or her eyes to the remote controller 21, because he or she starts operating the remote controller 21 after confirming the traveling direction of the own vehicle V in advance. Therefore, by displaying the input guide 212a at a preset position on the touch panel 211 and setting the first gesture determination area Ga0, the operation required of the operator U can be suggested, and any feeling of strangeness can be suppressed. Conversely, when the own vehicle V makes a turn without going straight, the operator U is likely to look at the own vehicle V to check whether there are any obstacles in the surroundings. Therefore, by setting a new gesture determination area according to the input position of the touch operation, the execution of the autonomous travel control of the own vehicle V can be facilitated.
- in the above embodiments, the remote controller 21 is provided with the gesture determination unit 212 and the command unit 214, but the own vehicle V may instead be provided with the gesture determination unit 212 and the command unit 214.
- in that case, the remote controller 21 transmits the detection value of the touch panel 211 to the own vehicle V, the gesture determination unit 212 of the own vehicle V determines whether or not the input gesture is a command gesture, and the execution command signal may be output from the command unit 214 of the own vehicle V to the route tracking control unit 16 and the target vehicle speed generation unit 17.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Automation & Control Theory (AREA)
- General Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Mechanical Engineering (AREA)
- Transportation (AREA)
- Human Computer Interaction (AREA)
- Mathematical Physics (AREA)
- Computing Systems (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
Description
Hereinafter, embodiments of the present invention will be described with reference to the drawings. FIG. 1 is a block diagram showing a remote parking system 1 to which the vehicle remote control method and remote control device of the present invention are applied. In this specification, "autonomous travel control" means causing a vehicle to travel by the automatic control of an on-board travel control device, without depending on a driver's driving operation. "Autonomous parking control" is a type of autonomous travel control, and means parking a vehicle (entering or exiting a parking space) by the automatic control of an on-board travel control device, without depending on a driver's driving operation. "Parking" means keeping a vehicle continuously stopped in a parking space, but a "travel route" includes not only the parking route used when entering a parking space, but also the exit route from the parking space. In that sense, the "vehicle travel control method and vehicle travel control device at the time of parking" include both the travel control of the vehicle when entering a parking space and the travel control of the vehicle when exiting a parking space. Entering a parking space is also referred to as entry, and leaving a parking space is also referred to as exit. In the following embodiments, a specific example of the present invention will be described using an example in which the remote control method and remote control device according to the present invention are applied to a remote parking system that parks a remotely controlled vehicle by autonomous travel control. In the remote parking system 1 of the present embodiment, an operator such as a driver rides in the vehicle and autonomous travel control is performed in an assist mode in which the operator's intervention operation is possible. After that, the operator gets out of the vehicle, and autonomous travel control is performed from outside the vehicle in a remote control mode using a remote controller.
Next, a second embodiment of the remote parking system to which the vehicle remote control method and the vehicle remote control device of the present invention are applied will be described. The same reference numerals as in the first embodiment are used for configurations identical to those of the first embodiment, and detailed description thereof is omitted.
Next, a third embodiment of the remote parking system to which the vehicle remote control method and the vehicle remote control device of the present invention are applied will be described. The same reference numerals as in the first embodiment are used for configurations identical to those of the first embodiment, and detailed description thereof is omitted.
Next, a fourth embodiment of the remote parking system to which the vehicle remote control method and the vehicle remote control device of the present invention are applied will be described. The same reference numerals as in the first embodiment are used for configurations identical to those of the first embodiment, and detailed description thereof is omitted.
Next, a fifth embodiment of the remote parking system to which the vehicle remote control method and the vehicle remote control device of the present invention are applied will be described. The same reference numerals as in the first embodiment are used for configurations identical to those of the first embodiment, and detailed description thereof is omitted.
11…target parking space setting device
12…vehicle position detector
13…object detector
14…parking route generation unit
15…object deceleration calculation unit
16…route tracking control unit
17…target vehicle speed generation unit
18…steering angle control unit
19…vehicle speed control unit
20…base unit
22…sub-unit
21…remote controller
211…touch panel
212…gesture determination unit
212a…input guide
213…storage unit
214…command unit
215…communication unit
G1–G7…command gestures
Ip0…center position
Ga0, Ga1, Ga2…gesture determination areas
V…own vehicle
TPS…target parking space
R1, R2…parking routes
W…wall (obstacle)
H1, H2…houses (obstacles)
WD…shrub (obstacle)
Claims (17)
- A vehicle remote control method comprising: detecting, on a touch panel of a remote controller, an input position at which a touch operation by an operator is performed; making variable, according to the input position, a gesture determination area for accepting a gesture for remotely controlling a vehicle; detecting the gesture by means of the gesture determination area; determining whether or not the detected gesture is a preset command gesture; and, when the gesture is the command gesture, causing the vehicle, which has an autonomous travel control function, to execute autonomous travel control.
- The vehicle remote control method according to claim 1, wherein the size of the gesture determination area with respect to the touch panel is variable.
- The vehicle remote control method according to claim 2, wherein the size of the gesture determination area is variable between a preset minimum input size and a preset maximum input size.
- The vehicle remote control method according to any one of claims 1 to 3, wherein a first gesture determination area is set at a preset position on the touch panel, and a new gesture determination area is set according to the input position.
- The vehicle remote control method according to any one of claims 1 to 4, wherein the first gesture determination area is set at a preset position before the autonomous travel control is started, and a new gesture determination area is set according to the input position after the autonomous travel control is started.
- The vehicle remote control method according to any one of claims 1 to 4, wherein the gesture determination area is variably set according to the input position before the autonomous travel control is started, and the gesture determination area is set at a preset position after the autonomous travel control is started.
- The vehicle remote control method according to any one of claims 1 to 6, wherein, when the autonomous travel control is being executed, it is determined whether or not the vehicle is traveling straight under the autonomous travel control; the gesture determination area is set according to the input position when the vehicle is traveling straight, and the gesture determination area is set at a preset position when the vehicle is not traveling straight.
- The vehicle remote control method according to claim 4 or 5, wherein, when the gesture is no longer detected, the gesture determination area set immediately before the gesture became undetected is stored as a second gesture determination area, and when input of the gesture is resumed, whether or not the gesture is the command gesture is determined by means of the second gesture determination area.
- The vehicle remote control method according to claim 4 or 5, wherein, when input of the gesture is resumed after the gesture became undetected, whether or not the gesture is the command gesture is determined by means of the first gesture determination area.
- The vehicle remote control method according to claim 8 or 9, wherein the gesture becomes undetected when the operator's finger leaves the touch panel, when the gesture is not determined to be the command gesture, or when the operator ends input of the gesture midway.
- The vehicle remote control method according to any one of claims 1 to 10, wherein the command gesture is a touch operation in which, with the horizontal direction of the touch panel as the X axis and the vertical direction as the Y axis, the coordinates of the input position of the operator's touch operation change continuously along at least one of the X axis and the Y axis.
- The vehicle remote control method according to claim 11, wherein the command gesture is a touch operation that draws on the touch panel a figure trajectory in which the start point at which one input of the gesture begins overlaps the end point at which that input ends.
- The vehicle remote control method according to claim 12, wherein the command gesture is a touch operation that draws a ring-shaped figure trajectory on the touch panel.
- The vehicle remote control method according to claim 13, wherein the gesture determination area is defined by the center coordinates and the radius of the ring-shaped figure.
- The vehicle remote control method according to any one of claims 1 to 14, wherein the vehicle is caused to stop the autonomous travel control when the gesture is no longer detected.
- The vehicle remote control method according to any one of claims 1 to 15, wherein the autonomous travel control is parking control for parking the vehicle.
- A vehicle remote control device comprising: a remote controller for a vehicle, provided with a touch panel that detects an input position of a touch operation by an operator; a gesture determination unit that makes variable, according to the input position, a gesture determination area for accepting a gesture for remotely controlling the vehicle, and determines whether or not the gesture detected by means of the gesture determination area is a preset command gesture; and a command unit that, when the gesture is the command gesture, causes the vehicle, which has an autonomous travel control function, to execute autonomous travel control.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201980100173.1A CN114616835A (zh) | 2019-09-09 | 2019-09-09 | 车辆远程控制方法及车辆远程控制装置 |
JP2021544947A JP7298699B2 (ja) | 2019-09-09 | 2019-09-09 | 車両遠隔制御方法及び車両遠隔制御装置 |
PCT/IB2019/001097 WO2021048582A1 (ja) | 2019-09-09 | 2019-09-09 | 車両遠隔制御方法及び車両遠隔制御装置 |
US17/640,990 US20220342415A1 (en) | 2019-09-09 | 2019-09-09 | Vehicle Remote Control Method and Vehicle Remote Control Device |
EP19944794.7A EP4030773B1 (en) | 2019-09-09 | 2019-09-09 | Vehicle remote control method and vehicle remote control device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/IB2019/001097 WO2021048582A1 (ja) | 2019-09-09 | 2019-09-09 | 車両遠隔制御方法及び車両遠隔制御装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021048582A1 true WO2021048582A1 (ja) | 2021-03-18 |
Family
ID=74865676
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/IB2019/001097 WO2021048582A1 (ja) | 2019-09-09 | 2019-09-09 | 車両遠隔制御方法及び車両遠隔制御装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US20220342415A1 (ja) |
EP (1) | EP4030773B1 (ja) |
JP (1) | JP7298699B2 (ja) |
CN (1) | CN114616835A (ja) |
WO (1) | WO2021048582A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2023107662A (ja) * | 2022-01-24 | 2023-08-03 | 本田技研工業株式会社 | 情報端末、制御システム、及び制御方法 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021048580A1 (ja) * | 2019-09-09 | 2021-03-18 | 日産自動車株式会社 | 車両遠隔制御方法及び車両遠隔制御装置 |
US20220258965A1 (en) * | 2021-02-17 | 2022-08-18 | Oshkosh Corporation | Large cab innovations |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013012021A (ja) * | 2011-06-29 | 2013-01-17 | Sony Corp | 情報処理装置、情報処理方法、プログラム、および遠隔操作システム |
WO2013171898A1 (ja) * | 2012-05-18 | 2013-11-21 | トヨタ自動車株式会社 | 車両の情報表示装置 |
JP2014006883A (ja) * | 2012-05-31 | 2014-01-16 | Canon Inc | 電子機器及び情報処理装置及びその制御方法 |
US20160170494A1 (en) | 2013-07-26 | 2016-06-16 | Daimler Ag | Method and device for remote control of a function of a vehicle |
WO2019163165A1 (ja) * | 2018-02-23 | 2019-08-29 | アルプスアルパイン株式会社 | 車載器、携帯機、及び車両遠隔制御システム |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9731714B2 (en) * | 2014-06-25 | 2017-08-15 | Fujitsu Ten Limited | Vehicle apparatus |
US9809231B2 (en) * | 2015-10-28 | 2017-11-07 | Honda Motor Co., Ltd. | System and method for executing gesture based control of a vehicle system |
CA3072034C (en) * | 2017-08-10 | 2023-06-20 | Nissan Motor Co., Ltd. | Parking control method and parking control device |
KR102201757B1 (ko) * | 2017-10-12 | 2021-01-12 | 엘지전자 주식회사 | 자율주행 차량 및 그 제어 방법 |
KR102037324B1 (ko) * | 2017-11-30 | 2019-10-28 | 엘지전자 주식회사 | 자율주행 차량 및 그 제어 방법 |
DE102019210383A1 (de) * | 2019-07-15 | 2021-01-21 | Audi Ag | Verfahren zum Betreiben eines mobilen Endgeräts mittels einer Gestenerkennungs- und Steuereinrichtung, Gestenerkennungs- und Steuereinrichtung, Kraftfahrzeug, und am Kopf tragbare Ausgabevorrichtung |
-
2019
- 2019-09-09 US US17/640,990 patent/US20220342415A1/en active Pending
- 2019-09-09 JP JP2021544947A patent/JP7298699B2/ja active Active
- 2019-09-09 EP EP19944794.7A patent/EP4030773B1/en active Active
- 2019-09-09 WO PCT/IB2019/001097 patent/WO2021048582A1/ja active Application Filing
- 2019-09-09 CN CN201980100173.1A patent/CN114616835A/zh active Pending
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2013012021A (ja) * | 2011-06-29 | 2013-01-17 | Sony Corp | 情報処理装置、情報処理方法、プログラム、および遠隔操作システム |
WO2013171898A1 (ja) * | 2012-05-18 | 2013-11-21 | トヨタ自動車株式会社 | 車両の情報表示装置 |
JP2014006883A (ja) * | 2012-05-31 | 2014-01-16 | Canon Inc | 電子機器及び情報処理装置及びその制御方法 |
US20160170494A1 (en) | 2013-07-26 | 2016-06-16 | Daimler Ag | Method and device for remote control of a function of a vehicle |
WO2019163165A1 (ja) * | 2018-02-23 | 2019-08-29 | アルプスアルパイン株式会社 | 車載器、携帯機、及び車両遠隔制御システム |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2023107662A (ja) * | 2022-01-24 | 2023-08-03 | 本田技研工業株式会社 | 情報端末、制御システム、及び制御方法 |
JP7360483B2 (ja) | 2022-01-24 | 2023-10-12 | 本田技研工業株式会社 | 情報端末、制御システム、及び制御方法 |
Also Published As
Publication number | Publication date |
---|---|
EP4030773B1 (en) | 2023-11-01 |
EP4030773A1 (en) | 2022-07-20 |
US20220342415A1 (en) | 2022-10-27 |
JPWO2021048582A1 (ja) | 2021-03-18 |
JP7298699B2 (ja) | 2023-06-27 |
EP4030773A4 (en) | 2022-10-19 |
CN114616835A (zh) | 2022-06-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6835219B2 (ja) | 駐車制御方法及び駐車制御装置 | |
WO2021048582A1 (ja) | 車両遠隔制御方法及び車両遠隔制御装置 | |
JP6809364B2 (ja) | 自動バレー駐車システム、自動バレー車両、及び自動バレー駐車方法 | |
JP7032568B2 (ja) | 車両走行制御方法及び車両走行制御装置 | |
WO2019123585A1 (ja) | 駐車制御方法及び駐車制御装置 | |
CN111527010A (zh) | 驻车控制方法以及驻车控制装置 | |
JP7276471B2 (ja) | 車両遠隔制御方法及び車両遠隔制御装置 | |
WO2020115517A1 (ja) | 駐車時の車両走行制御方法及び車両走行制御装置 | |
EP4029749B1 (en) | Vehicle remote control method and vehicle remote control device | |
JP7206103B2 (ja) | 車両走行制御方法及び車両走行制御装置 | |
RU2795911C1 (ru) | Способ дистанционного управления транспортным средством и устройство дистанционного управления транспортным средством | |
WO2021048891A1 (ja) | 移動体および移動体制御方法 | |
JP7278760B2 (ja) | 駐車時の車両走行制御方法及び車両走行制御装置 | |
RU2795181C1 (ru) | Способ дистанционного управления транспортным средством и устройство дистанционного управления транспортным средством | |
RU2795171C1 (ru) | Способ и устройство дистанционного управления транспортным средством |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 19944794 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2021544947 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2022105803 Country of ref document: RU |
|
ENP | Entry into the national phase |
Ref document number: 2019944794 Country of ref document: EP Effective date: 20220411 |