WO2024014182A1 - Vehicle gesture detection device and vehicle gesture detection method - Google Patents

Vehicle gesture detection device and vehicle gesture detection method

Info

Publication number
WO2024014182A1
WO2024014182A1 PCT/JP2023/020909 JP2023020909W
Authority
WO
WIPO (PCT)
Prior art keywords
gesture
user
determination
vehicle
body part
Prior art date
Application number
PCT/JP2023/020909
Other languages
English (en)
Japanese (ja)
Inventor
拓也 田村
慶一 梁井
Original Assignee
株式会社アイシン
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社アイシン filed Critical 株式会社アイシン
Publication of WO2024014182A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/01Fittings or systems for preventing or indicating unauthorised use or theft of vehicles operating on vehicle systems or fittings, e.g. on doors, seats or windscreens
    • EFIXED CONSTRUCTIONS
    • E05LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05BLOCKS; ACCESSORIES THEREFOR; HANDCUFFS
    • E05B49/00Electric permutation locks; Circuits therefor ; Mechanical aspects of electronic locks; Mechanical keys therefor
    • EFIXED CONSTRUCTIONS
    • E05LOCKS; KEYS; WINDOW OR DOOR FITTINGS; SAFES
    • E05FDEVICES FOR MOVING WINGS INTO OPEN OR CLOSED POSITION; CHECKS FOR WINGS; WING FITTINGS NOT OTHERWISE PROVIDED FOR, CONCERNED WITH THE FUNCTIONING OF THE WING
    • E05F15/00Power-operated mechanisms for wings
    • E05F15/70Power-operated mechanisms for wings with automatic actuation
    • E05F15/73Power-operated mechanisms for wings with automatic actuation responsive to movement or presence of persons or objects

Definitions

  • the present disclosure relates to a vehicle gesture detection device and a vehicle gesture detection method.
  • Patent Document 1 describes a vehicle that includes a vehicle body provided with an opening, a vehicle gate that opens and closes the opening, a gate actuator that drives the vehicle gate, a camera that photographs the surroundings of the vehicle, and a vehicle gesture detection device that controls the gate actuator. When the vehicle gesture detection device determines, based on the image taken by the camera, that the user has performed a predetermined gesture, the vehicle gate is opened by the gate actuator.
  • the above-mentioned vehicle gesture detection device still has room for improvement in terms of accurately detecting user gestures.
  • According to one aspect of the present disclosure, a gesture detection device for a vehicle is provided.
  • The vehicle gesture detection device is applied to a vehicle that includes a door drive unit that opens and closes a door opening of a vehicle body, and a camera that photographs the area around the door opening, and detects a user's gesture for operating the door.
  • The gesture is an action in which the user moves a body part in a first direction and then moves it in a second direction opposite to the first direction.
  • The vehicle gesture detection device includes a determination unit that performs a gesture determination process of determining whether the user has performed the gesture based on a change in the position of the body part in a plurality of images taken by the camera at different times.
  • In the gesture determination process, the determination unit determines that the user has performed the gesture when, after a state in which the position of the body part moves in the first direction has continued, a state in which the position of the body part moves in the second direction continues.
  • According to another aspect of the present disclosure, a gesture detection method for a vehicle is provided.
  • The gesture detection method for a vehicle is applied to a vehicle that includes a door drive unit that opens and closes a door opening of a vehicle body, and a camera that photographs the vicinity of the door opening, and detects a user's gesture for operating the door.
  • The gesture is an action in which the user moves a body part in a first direction and then moves it in a second direction opposite to the first direction.
  • the gesture detection method for a vehicle includes a determination step of determining whether the user has performed the gesture based on a change in the position of the body part in a plurality of images taken by the camera.
  • The determination step includes determining that the user has performed the gesture when, after a state in which the position of the body part moves in the first direction has continued, a state in which the position of the body part moves in the second direction continues.
  • FIG. 1 is a schematic diagram of a vehicle.
  • FIG. 2 is an example of an image taken by the camera.
  • FIG. 3 is an example of the quantized movement direction of the user's foot.
  • FIG. 4 is a graph showing how the foot moves when different users perform kick gestures.
  • FIG. 5 is an example of a gesture pattern.
  • FIG. 6 is a flowchart illustrating the flow of processing performed by the gesture detection device to detect a user's kick gesture.
  • FIG. 7 is a table showing changes in the displacement vector of the foot position when the user performs a large kick gesture.
  • FIG. 8 is a table showing changes in the displacement vector of the foot position when the user performs a small kick gesture.
  • Hereinafter, an embodiment of a vehicle gesture detection device and a vehicle gesture detection method will be described.
  • The vehicle 10 includes a vehicle body 20, a front door 30, a sliding door 40, a door driving section 50, a camera 60, a wireless communication device 70, a door control device 80, and a gesture detection device 90. Further, a portable device 100 is associated with the vehicle 10.
  • the vehicle body 20 has a front opening 21 that is opened and closed by a front door 30, and a rear opening 22 that is opened and closed by a sliding door 40.
  • the front opening 21 and the rear opening 22 are parts that the user passes through when going back and forth between the inside and outside of the vehicle 10.
  • the front opening 21 and the rear opening 22 are partitioned by the B pillar of the vehicle body 20.
  • the front opening 21 and the rear opening 22 may be integrated as one opening.
  • the rear opening 22 corresponds to a "door opening".
  • the front door 30 includes a door main body 31, a door knob 32 provided near the rear end of the door main body 31, and a side mirror 33 provided near the front end of the door main body 31.
  • the front door 30 is displaced between a fully closed position where the front opening 21 is fully closed and a fully open position where the front opening 21 is fully opened by swinging about an axis extending in the vertical direction with respect to the vehicle body 20.
  • the sliding door 40 includes a door body 41 and a door knob 42 provided near the front end of the door body 41.
  • the sliding door 40 is displaced between a fully closed position where the rear opening 22 is fully closed and a fully open position where the rear opening 22 is fully opened by sliding in the longitudinal direction with respect to the vehicle body 20.
  • the opening direction of the sliding door 40 is backward, and the closing direction of the sliding door 40 is forward.
  • the sliding door 40 is opened and closed by a door drive unit 50 between a fully closed position and a fully open position.
  • the sliding door 40 can also be called a rear door in that it is located at the rear of the front door 30.
  • the camera 60 is installed on the side mirror 33 so as to face downward and rearward. As shown in FIG. 1, the photographing area AP of the camera 60 includes the area around the rear opening 22.
  • the camera 60 outputs the captured image to the gesture detection device 90 frame by frame.
  • the camera 60 may be, for example, a surroundings monitoring device for automatic driving or a camera that captures an original image for synthesizing a bird's-eye view of the surroundings of the vehicle.
  • the frame rate of the camera 60 may be about 30 fps, for example.
  • the portable device 100 includes a switch that is operated to open, close, or stop the sliding door 40.
  • the portable device 100 may be a so-called electronic key, a smartphone, or another communication terminal.
  • the wireless communication device 70 determines whether or not the portable device 100 is associated with the vehicle 10 by performing wireless communication with the portable device 100 located around the vehicle 10. In this respect, the wireless communication device 70 can determine whether or not a user with the portable device 100 exists within the communication area AC set around the vehicle 10.
  • Communication area AC is a larger area than photographing area AP.
  • When the switch for operating the sliding door 40 is operated on the portable device 100, the wireless communication device 70 outputs one of an opening operation command signal, a closing operation command signal, and a stop command signal to the door control device 80, depending on the operated switch.
  • the opening operation command signal is a command signal for opening the sliding door 40.
  • the closing operation command signal is a command signal for closing the sliding door 40.
  • the stop command signal is a command signal for stopping the sliding door 40 during opening/closing operation.
  • When the wireless communication device 70 determines that a user carrying the portable device 100 is present within the communication area AC, the wireless communication device 70 outputs a signal indicating this to the gesture detection device 90.
  • the door control device 80 includes, for example, a processing circuit including a CPU and a memory.
  • Door control device 80 controls door drive unit 50 according to a program stored in memory.
  • the door control device 80 controls the door drive unit 50 based on the contents of the input command signal. Specifically, the door control device 80 operates the sliding door 40 to open when the opening operation command signal is input.
  • the door control device 80 closes the sliding door 40 when the closing operation command signal is input.
  • the door control device 80 stops the sliding door 40 in operation when a stop command signal is input.
  • the gesture detection device 90 includes, for example, a processing circuit including a CPU and a memory.
  • According to a program stored in memory, the gesture detection device 90 detects a gesture in which the user moves a body part back and forth, and outputs a command signal to the door control device 80.
  • The gesture in this embodiment is a kick gesture in which the user moves the foot Ft toward the vehicle body 20 and then moves the foot Ft away from the vehicle body 20.
  • the direction in which the user's foot Ft approaches the vehicle body 20 corresponds to a "first direction”
  • the direction in which the user's foot Ft moves away from the vehicle body 20 corresponds to a "second direction.”
  • the gesture detection device 90 includes a storage section 91 and a determination section 92.
  • the storage unit 91 stores a learned model that has been subjected to machine learning using teacher data in which images photographed in advance are associated with the user's foot position Pf.
  • the learned model is a model that inputs an image captured by the camera 60 and outputs the user's foot position Pf within the image.
  • the learned model is created, for example, when designing the vehicle 10 and written into the storage unit 91 when the gesture detection device 90 is manufactured.
  • the foot position Pf means the position of the user's foot Ft, more specifically, the position of the user's toe.
  • the method for generating a trained model includes a preparation step of preparing teacher data, and a learning step of performing machine learning based on the teacher data.
  • The preparation step includes an acquisition step of acquiring images taken with the user standing in the photographing area AP under various conditions, and a specification step of specifying the user's foot position Pf in the plurality of images acquired in the acquisition step.
  • the acquisition step is performed using, for example, the actual vehicle 10.
  • In the acquisition step, it is preferable to acquire many images taken while changing the conditions related to the user and the conditions related to the environment around the vehicle 10. This makes it possible to obtain a trained model that is adaptable to various situations, in other words, a highly versatile trained model.
  • In the specification step, the user's foot position Pf is designated in each acquired image. The position may be specified, for example, as pixel coordinates within the image.
  • In the learning step, a model is generated by machine learning using the multiple sets of teacher data as learning data.
  • Various machine learning methods can be selected, and an example is a convolutional neural network (CNN).
  • the learned model outputs the foot position Pf of a person in the image when a photographed image is input.
  • the trained model does not output the foot position Pf when an image that does not include the person's foot Ft is input. Note that even if the trained model can output the foot position Pf, the outputted foot position Pf may not be the position of the user's toe depending on the accuracy of the trained model. However, in the following description, it is assumed that the output foot position Pf is the tip of the user's foot.
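  • As an illustration only (not part of the published application), a minimal sketch of how such a trained model might be wrapped is shown below; the class name, the assumed (x, y, confidence) model output, and the confidence threshold are assumptions made for this example.

```python
from typing import Callable, Optional, Tuple

import numpy as np

# Assumed model interface: image -> (x, y, confidence) in image pixel coordinates.
Model = Callable[[np.ndarray], Tuple[float, float, float]]


class FootPositionEstimator:
    """Wraps a trained model that maps a camera image to the foot position Pf."""

    def __init__(self, model: Model, confidence_threshold: float = 0.5):
        self.model = model
        self.confidence_threshold = confidence_threshold

    def estimate(self, image: np.ndarray) -> Optional[Tuple[float, float]]:
        """Returns the foot position Pf (toe position) in pixels, or None if no foot is visible."""
        x, y, confidence = self.model(image)
        if confidence < self.confidence_threshold:
            return None
        return (x, y)
```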
  • The determination unit 92 acquires the foot position Pf within the image by inputting the image captured by the camera 60 into the learned model. Then, based on the acquired foot position Pf, the determination unit 92 performs a start determination process of determining whether the user is present in the vicinity of the vehicle 10, and a gesture determination process of determining whether the user has performed a kick gesture.
  • In the start determination process, the determination unit 92 determines whether the user's foot Ft exists within the determination area AG set in the image captured by the camera 60.
  • the determination unit 92 performs gesture determination processing when the user's foot position Pf exists within the determination area AG.
  • the determination unit 92 does not perform the gesture determination process when the user's foot position Pf cannot be acquired or when the user's foot position Pf does not exist within the determination area AG.
  • the determination area AG is set within the photographing area AP of the camera 60.
  • the determination area AG includes a first determination area AG1 that is close to the camera 60 and a second determination area AG2 that is far from the camera 60.
  • the first determination area AG1 can also be said to be an area close to the vehicle 10
  • the second determination area AG2 can also be said to be an area distant from the vehicle 10.
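  • A minimal sketch of the start determination follows (illustration only; the application does not state the shape of the determination areas, so the rectangular areas in image coordinates and the placeholder coordinates below are assumptions).

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class Rect:
    """Axis-aligned rectangle in image pixel coordinates (origin at the top-left)."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, point: Tuple[float, float]) -> bool:
        x, y = point
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


# Hypothetical layout: AG1 is the part of the determination area AG near the
# camera 60, AG2 the part farther away; their union is AG.
AG1 = Rect(0, 240, 640, 480)   # placeholder coordinates
AG2 = Rect(0, 0, 640, 240)     # placeholder coordinates


def start_determination(foot_position: Optional[Tuple[float, float]]) -> Optional[str]:
    """Returns "AG1" or "AG2" when the foot position Pf lies in the determination area, else None."""
    if foot_position is None:
        return None
    if AG1.contains(foot_position):
        return "AG1"
    if AG2.contains(foot_position):
        return "AG2"
    return None
```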
  • the determination unit 92 calculates a displacement vector Vd of the user's foot Ft between the plurality of images based on the plurality of images taken at different times. Specifically, the determination unit 92 calculates a displacement vector Vd from the foot position Pf in the N-th image to the foot position Pf in the N+1-th image.
  • the Nth image is the Nth image taken by the camera 60
  • The N+1-th image is the N+1-th image taken by the camera 60, that is, the image of the frame following the N-th image.
  • The determination unit 92 obtains the foot position Pf in the N+1-th image corresponding to the foot position Pf in the N-th image by matching the feature amount of the area including the foot position Pf in the N-th image against the N+1-th image.
  • Alternatively, the determination unit 92 may obtain the foot position Pf in the N+1-th image by inputting the N+1-th image into the trained model. Then, the determination unit 92 calculates the displacement vector Vd of the foot position Pf based on the foot positions Pf in both images. Subsequently, the determination unit 92 calculates a displacement vector Vd from the foot position Pf in the N+1-th image to the foot position Pf in the N+2-th image. Furthermore, the determination unit 92 calculates a displacement vector Vd from the foot position Pf in the N+2-th image to the foot position Pf in the N+3-th image.
  • the determination unit 92 calculates the displacement vector Vd of the foot position Pf every time a new image is taken.
  • the determination unit 92 may calculate the displacement vector Vd of the foot position Pf for each frame, or may calculate the displacement vector Vd of the foot position Pf for each plurality of frames.
  • the direction of the displacement vector Vd of the foot position Pf indicates the direction of movement of the user's foot Ft.
  • the magnitude of the displacement vector Vd of the foot position Pf indicates the amount of displacement of the foot Ft over one frame interval, that is, the speed of the user's foot Ft.
  • the determination unit 92 determines that a kick gesture has been performed when the direction of the displacement vector Vd of the foot position Pf changes according to the gesture pattern corresponding to the kick gesture.
  • As shown in FIG. 3, in this embodiment the direction of the displacement vector Vd of the foot position Pf is quantized into eight directions. In other embodiments, the direction of the displacement vector Vd of the foot position Pf may be quantized into 4 directions, 16 directions, or any other number of directions.
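  • The displacement vector Vd and its eight-direction quantization could be sketched as follows (illustration only; which quantized index corresponds to the "7" or "3" direction depends on FIG. 3, which is not reproduced here, so the bin layout below is an assumption).

```python
import math
from typing import Tuple

Vector = Tuple[float, float]


def displacement_vector(pf_prev: Vector, pf_curr: Vector) -> Vector:
    """Vd: from the foot position Pf in the N-th image to that in the (N+1)-th image."""
    return (pf_curr[0] - pf_prev[0], pf_curr[1] - pf_prev[1])


def magnitude(v: Vector) -> float:
    """|Vd| per check interval, a proxy for the speed of the foot Ft."""
    return math.hypot(v[0], v[1])


def quantize_direction(v: Vector, num_bins: int = 8) -> int:
    """Quantizes the direction of Vd into num_bins sectors (8 here, 4 or 16 in other embodiments).

    Bin 0 is centred on the +X (horizontal) image axis and the index increases
    counter-clockwise; note that in image coordinates the Y axis usually points
    downward, so the visual meaning of each bin depends on that convention.
    """
    angle = math.atan2(v[1], v[0]) % (2 * math.pi)   # 0 .. 2*pi
    sector_width = 2 * math.pi / num_bins
    return int((angle + sector_width / 2) // sector_width) % num_bins
```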
  • the X direction is the horizontal direction of the image taken by the camera 60
  • the Y direction is the vertical direction of the image taken by the camera 60.
  • the gesture pattern defines the order in which the direction of the displacement vector Vd of the foot position Pf changes.
  • In this embodiment, the determination unit 92 alternately performs the calculation of the displacement vector Vd of the foot position Pf and the matching of the displacement vector Vd against the gesture pattern. In other embodiments, the determination unit 92 may perform the calculation of the displacement vector Vd of the foot position Pf and the matching against the gesture pattern at the same time. Then, when the direction of the displacement vector Vd of the user's foot position Pf changes according to the gesture pattern, that is, when it is determined that the user has performed a kick gesture, the determination unit 92 outputs an opening operation command signal to the door control device 80.
  • As described above, the determination unit 92 determines that the user has performed a kick gesture if the direction of the displacement vector Vd of the foot position Pf changes according to a preset gesture pattern while the user performs the kick gesture. Here, if different users perform the kick gesture, there may be differences in the manner in which the kick gesture is performed. Furthermore, even if the same user performs the kick gesture, there may be differences in the manner in which the kick gesture is performed. Therefore, the determination unit 92 needs to accurately determine whether the user has performed a kick gesture, regardless of the manner in which the user performs the kick gesture.
  • In FIG. 4, the horizontal axis indicates the coordinate of the foot position Pf in the X direction within the image, and the vertical axis indicates the coordinate of the foot position Pf in the Y direction within the image.
  • Three types of markers indicate three users. Furthermore, two markers connected by a line segment indicate the foot position Pf in the N-th image and the foot position Pf in the N+1-th image. Therefore, when two markers are connected in the order in which the images are taken, the above-mentioned displacement vector Vd is obtained.
  • As shown in FIG. 4, when the user performs the kick gesture, the user's foot position Pf continues to move toward the upper right, and then continues to move toward the lower left.
  • When the direction of the displacement vector Vd of the user's foot position Pf is quantized as shown in FIG. 3, the state of movement in the "7" direction continues, and then the state of movement in the "3" direction continues.
  • In other words, the direction of movement of the foot position Pf of the three users tends to change in the same way before and after the position at which the direction of movement of the user's foot position Pf is reversed (hereinafter also referred to as the "turn-back position").
  • Before reaching the turn-back position, the users' feet Ft tend to move in the same direction, and after reaching the turn-back position, the users' feet Ft likewise tend to move in the same direction.
  • Therefore, the determination unit 92 determines that the user has performed a kick gesture when the foot position Pf, which is the detection target position, continues to move in the "7" direction and then continues to move in the "3" direction. In this embodiment, the gesture pattern set in advance is therefore as shown in FIG. 5. As a result, when the direction of the displacement vector Vd of the foot position Pf becomes the "7" direction four times and then becomes the "3" direction three times, it is determined that the user has performed a kick gesture.
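  • A sketch of this sequential matching is shown below (illustration only; the pattern of four "7" checks followed by three "3" checks is taken from this embodiment, while the class and method names and the policy of restarting on a mismatch are assumptions, since the application does not spell out the mismatch case).

```python
from typing import Sequence

# Gesture pattern of this embodiment: the quantized direction of Vd must be
# observed as "7" four times in a row, then as "3" three times in a row.
KICK_GESTURE_PATTERN: Sequence[int] = (7, 7, 7, 7, 3, 3, 3)


class GesturePatternMatcher:
    """Checks incoming quantized directions against the gesture pattern one at a time."""

    def __init__(self, pattern: Sequence[int] = KICK_GESTURE_PATTERN):
        self.pattern = pattern
        self.index = 0          # number of pattern entries matched so far

    def feed(self, direction: int) -> bool:
        """Feeds one quantized direction; returns True once the whole pattern has matched."""
        if direction == self.pattern[self.index]:
            self.index += 1
        else:
            self.index = 0      # restart matching (one possible policy on a mismatch)
        if self.index == len(self.pattern):
            self.index = 0      # reset so the matcher can be reused
            return True
        return False
```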
  • The gesture pattern preferably corresponds to the period just before and after the turn-back position when the user performs the kick gesture. In that case, even if the user performs a kick gesture with a small movement, the gesture pattern can be matched against part of the kick gesture. For this reason, it is preferable that the gesture pattern is determined based on the results of recording kick gestures of a plurality of users in advance.
  • The duration for which the gesture pattern requires the foot position Pf to move in the "7" direction (hereinafter referred to as the "first determination time") is longer than the duration for which it requires the foot position Pf to move in the "3" direction (hereinafter referred to as the "second determination time"). Specifically, the displacement vector Vd must be verified four times in the "7" direction, whereas it only needs to be verified three times in the "3" direction, so the second determination time is shorter than the first determination time. In this respect, the determination unit 92 determines that the user has performed the kick gesture when, after the state in which the foot position Pf moves in the "7" direction continues for the first determination time, the state in which the foot position Pf moves in the "3" direction continues for the second determination time, which is shorter than the first determination time.
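  • As a rough worked example (assuming the displacement vector is evaluated once per frame at the 30 fps mentioned earlier, which the application leaves open), the two determination times would be on the order of:

```python
frame_rate_fps = 30            # example frame rate of the camera 60
checks_in_direction_7 = 4      # checks required in the "7" direction
checks_in_direction_3 = 3      # checks required in the "3" direction

first_determination_time_s = checks_in_direction_7 / frame_rate_fps    # ~0.133 s
second_determination_time_s = checks_in_direction_3 / frame_rate_fps   # 0.100 s
assert second_determination_time_s < first_determination_time_s
```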
  • In other words, unless the direction of movement of the user's foot position Pf changes according to the gesture pattern described above, the determination unit 92 does not determine that the user has performed a kick gesture.
  • When the user performs the kick gesture, the time for which the user's foot position Pf stays at the same position increases when the moving direction of the foot position Pf changes; on the other hand, while the foot Ft is moving in the first direction or the second direction, the time for which the foot position Pf remains at the same position is less likely to increase.
  • If the user's foot Ft continues to remain stopped during the gesture determination process, the determination unit 92 cancels the gesture determination process. Specifically, the determination unit 92 stops the gesture determination process when the state in which the magnitude of the displacement vector Vd is less than the lower limit determination value Vlth continues for a predetermined stop determination time. In other words, the determination unit 92 determines that the user is not performing a kick gesture. Conversely, in the present embodiment, the determination unit 92 continues the gesture determination process as long as the time for which the magnitude of the displacement vector Vd remains less than the lower limit determination value Vlth is shorter than the stop determination time.
  • the determination unit 92 allows the foot Ft to stop for a short time when the user performs the kick gesture.
  • the lower limit determination value Vlth is a speed determination value for determining whether the user's foot position Pf is stopped.
  • the determination unit 92 also stops the gesture determination process when the magnitude of the displacement vector Vd of the foot position Pf is greater than or equal to the upper limit determination value Vuth. In other words, the determination unit 92 determines that the user is not performing a kick gesture.
  • the upper limit determination value Vuth is a speed determination value for determining whether the user's foot position Pf is moving at a very high speed.
  • The displacement vector Vd is calculated based on the user's foot position Pf in the image taken by the camera 60. Therefore, even if the user performs the kick gesture in the same way, there will be a difference in the magnitude of the displacement vector Vd between the case where the user performs the kick gesture at a position near the camera 60 and the case where the user performs it at a position far from the camera 60. Therefore, the determination unit 92 corrects the upper limit determination value Vuth and the lower limit determination value Vlth according to the distance from the camera 60 to the user's foot position Pf. Specifically, when the user's foot position Pf is within the first determination area AG1 close to the camera 60, the determination unit 92 does not correct the upper limit determination value Vuth and the lower limit determination value Vlth.
  • On the other hand, when the user's foot position Pf is within the second determination area AG2 far from the camera 60, the determination unit 92 corrects the upper limit determination value Vuth and the lower limit determination value Vlth to be smaller than their default values.
  • the amount of correction is preferably set according to the number of pixels and focal length of the camera 60.
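  • A sketch of this area-dependent correction is shown below (illustration only; the default threshold values and the correction factor are placeholders, since the application only states that the correction amount is preferably set according to the pixel count and focal length of the camera 60).

```python
from typing import Tuple

# Default speed thresholds, expressed as pixels of displacement per check.
VUTH_DEFAULT = 40.0    # upper limit determination value Vuth (placeholder)
VLTH_DEFAULT = 2.0     # lower limit determination value Vlth (placeholder)

# Correction factor applied when the foot is in AG2, far from the camera,
# where the same physical motion maps to fewer pixels.
FAR_AREA_SCALE = 0.5   # placeholder


def thresholds_for_area(area: str) -> Tuple[float, float]:
    """Returns (Vuth, Vlth) for "AG1" (near the camera 60) or "AG2" (far from it)."""
    if area == "AG2":
        return VUTH_DEFAULT * FAR_AREA_SCALE, VLTH_DEFAULT * FAR_AREA_SCALE
    return VUTH_DEFAULT, VLTH_DEFAULT
```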
  • Next, the flow of processing by which the gesture detection device 90 detects the user's kick gesture will be described with reference to FIG. 6. This process is executed in a predetermined control cycle when the user carrying the portable device 100 enters the communication area AC and the sliding door 40 is located in the fully closed position.
  • the gesture detection device 90 acquires an image taken by the camera 60 (S11). Subsequently, the gesture detection device 90 acquires the foot position Pf by inputting the image captured by the camera 60 into the learned model stored in the storage unit 91 (S12). After that, the gesture detection device 90 determines whether the foot position Pf in the image exists within the determination area AG (S13). If the foot position Pf does not exist within the determination area AG (S13: NO), the gesture detection device 90 ends this process. Even when the foot position Pf cannot be acquired in step S12, the gesture detection device 90 ends this process.
  • On the other hand, if the foot position Pf exists within the determination area AG (S13: YES), the gesture detection device 90 acquires a new image captured by the camera 60 (S14).
  • the image acquired in step S14 is an image taken after the previously acquired image. Subsequently, the gesture detection device 90 acquires the foot position Pf in the new image (S15).
  • the gesture detection device 90 calculates the displacement vector Vd of the foot position Pf at the image capturing interval of the camera 60 (S16).
  • the displacement vector Vd of the foot position Pf is a vector directed from the previously specified foot position Pf to the currently specified foot position Pf.
  • the gesture detection device 90 determines whether the magnitude of the displacement vector Vd of the foot position Pf is greater than or equal to the upper limit determination value Vuth (S17).
  • The upper limit determination value Vuth is set depending on whether the foot position Pf acquired in the most recent step S15 is located in the first determination area AG1 or the second determination area AG2.
  • If the magnitude of the displacement vector Vd is greater than or equal to the upper limit determination value Vuth (S17: YES), the gesture detection device 90 ends this process.
  • On the other hand, if the magnitude of the displacement vector Vd is less than the upper limit determination value Vuth (S17: NO), the gesture detection device 90 determines whether the magnitude of the displacement vector Vd of the foot position Pf is less than the lower limit determination value Vlth (S18).
  • The lower limit determination value Vlth is likewise set depending on whether the foot position Pf acquired in the most recent step S15 is located in the first determination area AG1 or the second determination area AG2.
  • If the magnitude of the displacement vector Vd is less than the lower limit determination value Vlth (S18: YES), the gesture detection device 90 increments the stop counter Cnt by "1" (S19).
  • the stop counter Cnt is a variable for counting the number of times the magnitude of the displacement vector Vd becomes less than the lower limit determination value Vlth.
  • the stop counter Cnt is initialized to "0" at the timing of starting this process and at step S22, which will be described later.
  • the gesture detection device 90 determines whether the stop counter Cnt is equal to or greater than the stop determination number Cntth (S20). When the stop counter Cnt is equal to or greater than the stop determination number Cntth (S20: YES), for example, when the user stops or the user stops performing the kick gesture, the gesture detection device 90 ends this process. On the other hand, if the stop counter Cnt is less than the stop determination number Cntth (S20: NO), the gesture detection device 90 moves the process to step S14.
  • If the stop determination number Cntth is set to a small number, an affirmative determination is made in the process of step S20 when the state in which the foot position Pf does not move continues for only a short period of time.
  • the stop determination count Cntth is a number corresponding to the stop determination time described above.
  • In step S18, if the magnitude of the displacement vector Vd is greater than or equal to the lower limit determination value Vlth (S18: NO), the gesture detection device 90 determines whether the direction of the displacement vector Vd matches the gesture pattern (S21). For example, when matching up to the "N+2"-th entry of the gesture pattern has been completed, it is determined whether the direction of the displacement vector Vd matches the "7" direction corresponding to the "N+3"-th entry of the gesture pattern.
  • If the direction of the displacement vector Vd matches the gesture pattern (S21: YES), the gesture detection device 90 initializes the stop counter Cnt to "0" (S22). Subsequently, the gesture detection device 90 determines whether matching of the entire gesture pattern has been completed (S23). If the gesture pattern matching is not completed (S23: NO), the gesture detection device 90 moves the process to step S14. On the other hand, if the gesture pattern matching is completed (S23: YES), the gesture detection device 90 outputs an opening operation command signal to the door control device 80 (S24). That is, the sliding door 40 is opened.
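  • Combining the sketches above, one possible reading of the flow of FIG. 6 is shown below (illustration only; `capture_image`, `estimate_foot_position`, and `output_open_command` are assumed callables standing in for the camera 60, the learned model, and the interface to the door control device 80; the stop determination number is a placeholder; and the handling of a direction that does not match the pattern is a simplification, since the flowchart does not spell that case out).

```python
def detect_kick_gesture(capture_image, estimate_foot_position, output_open_command,
                        matcher, stop_determination_count=5):
    """One run of the gesture determination process, loosely following S11-S24 of FIG. 6."""
    image = capture_image()                                   # S11
    pf = estimate_foot_position(image)                        # S12
    area = start_determination(pf)                            # S13: is the foot Ft inside AG?
    if area is None:
        return                                                # S13: NO -> end this process
    stop_counter = 0                                          # stop counter Cnt
    pf_prev = pf
    while True:
        image = capture_image()                               # S14
        pf = estimate_foot_position(image)                    # S15
        if pf is None:
            return
        vd = displacement_vector(pf_prev, pf)                 # S16
        pf_prev = pf
        area = start_determination(pf) or area                # thresholds follow the most recent Pf
        vuth, vlth = thresholds_for_area(area)
        speed = magnitude(vd)
        if speed >= vuth:                                     # S17: implausibly fast -> end
            return
        if speed < vlth:                                      # S18: foot (almost) stopped
            stop_counter += 1                                 # S19
            if stop_counter >= stop_determination_count:      # S20: stopped too long -> cancel
                return
            continue                                          # back to S14
        if matcher.feed(quantize_direction(vd)):              # S21/S23: whole pattern matched?
            output_open_command()                             # S24: open the sliding door 40
            return
        stop_counter = 0                                      # S22 (simplified: reset whenever the foot is moving)
```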
  • In this embodiment, steps S14 to S23 correspond to the "determination step".
  • FIG. 7 shows a change in the direction of the displacement vector Vd of the foot position Pf when the user performs a kick gesture with a large movement.
  • In this case, the direction of the displacement vector Vd of the user's foot position Pf becomes the "7" direction from the first timing, and this state continues for a relatively long period. Thereafter, when the user's foot Ft reaches the turn-back position at a timing between the 10th and 11th, the direction of the displacement vector Vd of the user's foot position Pf becomes the "3" direction.
  • Thus, the direction of the displacement vector Vd of the foot position Pf changes according to the gesture pattern shown in FIG. 5, and it is determined that the user has performed the kick gesture.
  • In this case, the directions of the displacement vectors Vd of the fourteenth and subsequent foot positions Pf are not used for determining whether the user has performed a kick gesture. That is, the opening operation of the sliding door 40 is started while the user is still performing the kick gesture.
  • FIG. 8 shows a change in the direction of the displacement vector Vd of the foot position Pf when the user performs a kick gesture with a small movement.
  • In this case, the direction of the displacement vector Vd of the user's foot position Pf becomes the "7" direction from the first timing, and this state continues for a relatively short period. Thereafter, when the user's foot Ft reaches the turn-back position between the seventh and eighth timings, the direction of the displacement vector Vd of the user's foot position Pf becomes the "3" direction.
  • Even in this case, the direction of the displacement vector Vd of the foot position Pf changes according to the gesture pattern shown in FIG. 5, and it is determined that the user has performed the kick gesture.
  • In this case, the directions of the displacement vectors Vd of the 11th and subsequent foot positions Pf are not used for determining whether the user has performed a kick gesture. That is, the opening operation of the sliding door 40 is started while the user is still performing the kick gesture.
  • the gesture detection device 90 can determine whether or not the user has performed a kick gesture, regardless of the magnitude of the movement of the foot Ft when the user performs the kick gesture. In other words, the gesture detection device 90 can accurately detect the user's kick gesture.
  • Furthermore, because the number of checks required in the "3" direction is smaller than that in the "7" direction, the gesture detection device 90 can determine whether the user has performed the kick gesture at an earlier timing than a comparative example in which the numbers of checks are reversed. Specifically, when the user performs the kick gesture, the gesture detection device 90 can determine that the user has performed the kick gesture shortly after the user's foot Ft begins to move away from the vehicle 10.
  • the gesture detection device 90 cancels the gesture determination process if the user's foot Ft continues to be stopped during the gesture determination process. Therefore, the gesture detection device 90 can cancel the gesture determination process when the user cancels the kick gesture or when the user stops on the side of the vehicle 10. Therefore, the gesture detection device 90 can improve the accuracy of determining the user's kick gesture.
  • the gesture detection device 90 sets the upper limit determination value Vuth and the lower limit determination value Vlth to different values depending on the distance from the camera 60 to the foot Ft. Therefore, the gesture detection device 90 can improve the accuracy of determining the user's kick gesture regardless of the position of the foot Ft in the image.
  • The stop determination process using the stop counter Cnt may be performed only when the moving direction of the user's foot position Pf is reversed. Specifically, in the gesture pattern shown in FIG. 5, the stop determination process using the stop counter Cnt may be performed only during the period from when the verification of the direction of the N+4th displacement vector Vd is completed until when the verification of the direction of the N+5th displacement vector Vd is completed.
  • the stop determination number Cntth may be set to a different value depending on whether the moving direction of the user's foot position Pf is reversed or not.
  • the gesture detection device 90 does not need to perform the stoppage determination process using the stop counter Cnt. Specifically, in the flowchart shown in FIG. 6, the processes of steps S18 to S20 may be omitted.
  • The kick gesture may be a gesture in which the user swings the toe of the foot Ft back and forth about the heel.
  • the gesture performed by the user does not have to be a kick gesture as long as it is a gesture in which the user moves a body part back and forth.
  • the gesture may be a gesture of raising and lowering a hand, or a gesture of changing the direction of the face.
  • the gesture pattern can be changed as appropriate.
  • the number of movements in the "7" direction may be the same as the number of movements in the "3" direction, or may be less than the number of movements in the "3" direction.
  • the camera 60 does not have to be installed on the side mirror 33.
  • the camera 60 may be installed at the upper end of the rear opening 22 or may be installed at the sliding door 40.
  • the gesture detection device 90 may output a closing operation command signal for closing the sliding door 40 to the door control device 80 based on the user's kick gesture.
  • the gesture detection device 90 may output one of the opening operation command signal and the closing operation command signal to the door control device 80 depending on the position of the sliding door 40 when the user performs the kick gesture.
  • the determination unit 92 may calculate the distance from the camera 60 to the user's foot position Pf. In this case, the determination unit 92 may correct the upper limit determination value Vuth and the lower limit determination value Vlth according to the distance from the camera 60 to the user's foot position Pf. In this case, the amount of correction is preferably proportional to the distance from the camera 60 to the user's foot position Pf.
  • the determination unit 92 does not need to correct the upper limit determination value Vuth and the lower limit determination value Vlth according to the distance from the camera 60 to the user's foot position Pf. In this case, the determination unit 92 may correct the magnitude of the displacement vector Vd according to the distance from the camera 60 to the user's foot position Pf.
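  • The two alternatives described here might be sketched as follows (illustration only; scaling linearly with the distance is just one plausible reading of "proportional to the distance", and the reference distance is a placeholder).

```python
from typing import Tuple

REFERENCE_DISTANCE_M = 1.0   # distance at which the default thresholds apply (placeholder)


def thresholds_for_distance(vuth_default: float, vlth_default: float,
                            distance_m: float) -> Tuple[float, float]:
    """Alternative 1: shrink Vuth and Vlth as the foot Ft gets farther from the camera 60."""
    scale = REFERENCE_DISTANCE_M / max(distance_m, 1e-6)
    return vuth_default * scale, vlth_default * scale


def normalized_magnitude(vd_magnitude_px: float, distance_m: float) -> float:
    """Alternative 2: keep the thresholds fixed and rescale |Vd| according to the distance instead."""
    return vd_magnitude_px * distance_m / REFERENCE_DISTANCE_M
```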
  • the vehicle 10 may include a front door drive unit that drives the front door 30.
  • the gesture detection device 90 may detect a kick gesture for opening the front door 30, similar to the case for opening the sliding door 40.
  • the vehicle 10 may include a back door that opens and closes an opening opening to the rear of the vehicle body 20, a back door drive unit that drives the back door, and a back camera that photographs the rear of the vehicle 10.
  • the gesture detection device 90 may detect a kick gesture for opening the back door, based on an image taken by the back camera, as in the case of opening the sliding door 40.
  • the door control device 80 and the gesture detection device 90 are not limited to processing circuits that include a CPU and a ROM and execute software processing.
  • the door control device 80 and the gesture detection device 90 may include a dedicated hardware circuit that executes at least a portion of the various processes executed in each of the above embodiments.
  • An example of the dedicated hardware circuit is an ASIC.
  • ASIC is an abbreviation for "Application Specific Integrated Circuit.” That is, the door control device 80 and the gesture detection device 90 may have any of the following configurations (a) to (c).
  • (a) A processing circuit that includes a processing device that executes all of the above processing according to a program, and a program storage device such as a ROM that stores the program.
  • (b) A processing circuit that includes a processing device and a program storage device that execute part of the above processing according to a program, and a dedicated hardware circuit that executes the remaining processing.
  • (c) A processing circuit that includes a dedicated hardware circuit that executes all of the above processing.
  • The vehicle gesture detection device (90) of this embodiment is applied to a vehicle (10) that includes a door drive unit (50) that opens and closes a door (40), the door (40) opening and closing a door opening (22) of a vehicle body (20), and a camera (60) that photographs the area around the door opening (22), and detects a user's gesture for operating the door (40).
  • the gesture is an action in which the user moves the body part (Ft) in a first direction and then moves it in a second direction opposite to the first direction.
  • The vehicle gesture detection device (90) includes a determination unit (92) that performs a gesture determination process of determining whether the user has performed the gesture based on a change in the position of the body part (Ft) in a plurality of images taken by the camera (60) at different times.
  • In the gesture determination process, the determination unit (92) determines that the user has performed the gesture when, after a state in which the position of the body part (Ft) moves in the first direction has continued, a state in which the position of the body part (Ft) moves in the second direction continues.
  • the vehicular gesture detection device can determine whether the user has performed the gesture, regardless of the magnitude of the movement of the body part when the user performs the gesture. In other words, the vehicular gesture detection device can detect the user's gestures with high accuracy.
  • It is preferable that the determination unit (92) determines that the user has performed the gesture when, after the state in which the position of the body part (Ft) moves in the first direction continues for a first determination time, the state in which the position of the body part (Ft) moves in the second direction continues for a second determination time. It is preferable that the second determination time is shorter than the first determination time.
  • The vehicle gesture detection device configured as described above makes the second determination time shorter than the first determination time. Therefore, the vehicle gesture detection device can determine whether the user has performed the gesture once the relatively short second determination time has elapsed after the moving direction of the user's body part switches to the second direction.
  • The determination unit (92) preferably calculates a displacement vector indicating the amount and direction of movement of the body part (Ft) between two images taken at different timings.
  • It is preferable that the determination unit (92) determines that the user has performed the gesture when, after a state in which the direction of the displacement vector is the first direction has continued, a state in which the direction of the displacement vector is the second direction continues. It is also preferable that the determination unit (92) cancels the gesture determination process if the state in which the magnitude of the displacement vector is less than a lower limit determination value (Vlth) continues during the gesture determination process.
  • With the above configuration, the vehicle gesture detection device can cancel the gesture determination process when, for example, the user stops the gesture partway through or simply stands beside the vehicle. Therefore, the vehicular gesture detection device can improve the accuracy of determining the user's gesture.
  • It is preferable that the determination unit (92) makes the lower limit determination value (Vlth) smaller when the distance from the camera (60) to the body part (Ft) is long than when the distance from the camera (60) to the body part (Ft) is short. For example, the lower limit determination value (Vlth) when the distance from the camera (60) to the body part (Ft) is a first distance can be made smaller than when the distance is a second distance shorter than the first distance.
  • the vehicle gesture detection device changes the magnitude of the lower limit determination value depending on the distance from the camera to the body part. Therefore, the vehicle gesture detection device can improve the accuracy of determining the user's gesture regardless of the position of the body part in the image.
  • The vehicle gesture detection method of the present embodiment is applied to a vehicle (10) that includes a door drive unit (50) that opens and closes a door (40), the door (40) opening and closing a door opening (22) of a vehicle body (20), and a camera (60) that photographs the area around the door opening (22), and detects a user's gesture for operating the door (40).
  • the gesture is an action in which the user moves the body part (Ft) in a first direction and then moves it in a second direction opposite to the first direction.
  • The vehicle gesture detection method includes a determination step of determining whether the user has performed the gesture based on a change in the position of the body part (Ft) in a plurality of images taken by the camera (60). The determination step includes determining that the user has performed the gesture when, after a state in which the position of the body part (Ft) moves in the first direction has continued, a state in which the position of the body part (Ft) moves in the second direction continues.
  • the vehicular gesture detection method can obtain the same effects as the above-mentioned vehicular gesture detection device.

Abstract

According to the invention, a gesture detection device (90) comprises a determination unit (92) that determines whether or not a user has performed a gesture for operating a vehicle door (40), based on a change in the position of a body part of the user in each of a plurality of images photographed by a camera (60) at different times. The determination unit (92) determines that the user has performed the gesture if a state in which the position of the body part moves in a first direction has continued and then a state in which the position of the body part moves in a second direction opposite to the first direction has continued.
PCT/JP2023/020909 2022-07-13 2023-06-06 Vehicle gesture detection device and vehicle gesture detection method WO2024014182A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2022112564 2022-07-13
JP2022-112564 2022-07-13

Publications (1)

Publication Number Publication Date
WO2024014182A1 true WO2024014182A1 (fr) 2024-01-18

Family

ID=89536550

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2023/020909 WO2024014182A1 (fr) 2022-07-13 2023-06-06 Vehicle gesture detection device and vehicle gesture detection method

Country Status (1)

Country Link
WO (1) WO2024014182A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013037467A (ja) * 2011-08-05 2013-02-21 Toshiba Corp Command issuing device, command issuing method, and program
JP2015510197A (ja) * 2012-02-13 2015-04-02 Qualcomm, Incorporated Engagement-dependent gesture recognition
US20160096509A1 (en) * 2014-10-02 2016-04-07 Volkswagen Aktiengesellschaft Vehicle access system
JP2020513491A (ja) * 2016-12-13 2020-05-14 Brose Fahrzeugteile SE & Co. KG, Bamberg Method for controlling a motor-driven closure element assembly of a motor vehicle
JP2021084566A (ja) * 2019-11-29 2021-06-03 Toyota Auto Body Co., Ltd. Vehicle


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23839358

Country of ref document: EP

Kind code of ref document: A1