WO2021037052A1 - Control apparatus and method - Google Patents

Control apparatus and method

Info

Publication number
WO2021037052A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
sensing device
response
target object
control apparatus
Prior art date
Application number
PCT/CN2020/111329
Other languages
French (fr)
Inventor
Jifen FU
Tao Zhang
Ning Guo
Original Assignee
Beijing Asu Tech Co.Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Asu Tech Co.Ltd. filed Critical Beijing Asu Tech Co.Ltd.
Priority to CN202080060963.4A priority Critical patent/CN114616140A/en
Priority to EP20859091.9A priority patent/EP4021767A4/en
Publication of WO2021037052A1 publication Critical patent/WO2021037052A1/en

Classifications

    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00: Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/20: Means to switch the anti-theft system on or off
    • B60R25/2054: Means to switch the anti-theft system on or off by foot gestures
    • G: PHYSICS
    • G07: CHECKING-DEVICES
    • G07C: TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00: Individual registration on entry or exit
    • G07C9/00174: Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys
    • B: PERFORMING OPERATIONS; TRANSPORTING
    • B60: VEHICLES IN GENERAL
    • B60R: VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R25/00: Fittings or systems for preventing or indicating unauthorised use or theft of vehicles
    • B60R25/20: Means to switch the anti-theft system on or off
    • B60R25/25: Means to switch the anti-theft system on or off using biometry
    • G: PHYSICS
    • G07: CHECKING-DEVICES
    • G07C: TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C2209/00: Indexing scheme relating to groups G07C9/00 - G07C9/38
    • G07C2209/60: Indexing scheme relating to groups G07C9/00174 - G07C9/00944
    • G07C2209/63: Comprising locating means for detecting the position of the data carrier, i.e. within the vehicle or within a certain distance from the vehicle
    • G: PHYSICS
    • G07: CHECKING-DEVICES
    • G07C: TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
    • G07C9/00: Individual registration on entry or exit
    • G07C9/00174: Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys
    • G07C9/00309: Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys operated with bidirectional data transmission between data carrier and locks

Definitions

  • This disclosure relates generally to control technologies, specifically to a control apparatus and method which can be applied in various fields including the technical field of intelligent vehicles, and more particularly to a vehicle gate control apparatus and method.
  • several main approaches for opening a trunk gate of a vehicle include: pressing a “trunk open” button arranged on a central control console within the vehicle, pressing a “trunk open” button on a vehicle key, or pressing a switch button arranged on the trunk gate.
  • the above three approaches belong essentially to a traditional mode for opening the trunk gate of a vehicle.
  • the trunk gate of the vehicle cannot be opened automatically.
  • the present disclosure provides a control apparatus.
  • the control apparatus includes a controller, a first sensing device, an optical device, and a second sensing device. Each of the first sensing device, the optical device, and the second sensing device is communicatively connected with the controller.
  • the first sensing device is configured to detect whether a user is in a proximity of a target object, and if so, to send object detection data of the user to the controller.
  • the controller is configured to determine whether the user is within a preset distance to the target object based on the object detection data, and if so, to send a first control command to the optical device.
  • the optical device is configured, upon receiving the first control command, to project a preset image onto a preset area of a surface for presentation to the user.
  • the second sensing device is configured to detect a response of the user to the preset image, and then to send user response data to the controller.
  • the controller is further configured to determine whether the response from the user meets a predetermined criterion based on the user response data, and if so, to send a second control command to the target object for executing a task.
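The overall apparatus flow just described (proximity check, image projection, response check, command) can be sketched in Python; every class, method, and parameter name below is an illustrative assumption, not part of the disclosure:

```python
class Controller:
    """Minimal sketch of the disclosed control flow; names are assumptions."""

    def __init__(self, project_image, execute_task, preset_distance_m=1.0):
        self.project_image = project_image    # models the first control command
        self.execute_task = execute_task      # models the second control command
        self.preset_distance_m = preset_distance_m

    def on_object_detection(self, distance_m):
        # First sensing device detected a user near the target object:
        # check whether the user is within the preset distance.
        if distance_m <= self.preset_distance_m:
            self.project_image()  # optical device projects the preset image

    def on_user_response(self, criterion_met):
        # Second sensing device reported the user's response to the image.
        if criterion_met:
            self.execute_task()   # e.g. open the vehicle gate
```

In a real apparatus the two callbacks would be driven by the first and second sensing devices, respectively.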
  • the target object is a gate, and the task is to open the gate.
  • the target object can be another object, such as an elevator, in which case the task is to bring the elevator to a stop at the same level as the user, ready for the user to board.
  • the target object may also be a droid assistant, in which case the task is for the droid to move closer to the user to provide services.
  • the target object is a vehicle gate
  • the preset area of the surface is an area of ground in a proximity of the vehicle gate.
  • the vehicle gate can be any gate, such as the rear cargo trunk door, the front engine hood lid, or any of the power doors providing access to the interior of the vehicle.
  • the second sensing device optionally comprises an obstruction detector configured to detect whether an obstruction occurs between the target object and the image projected on the surface by the optical device as a result of the response of the user to the image.
  • the obstruction detector may comprise a camera, a radar sensor, a noncontact capacitive sensor, an infrared sensing device, or a TOF detection device.
  • the second sensing device may optionally comprise a user action detector configured to detect an action of the user in response to the image.
  • the user action detector can be a camera, and the response detected thereby comprises at least one of a motion, a gesture, or a facial expression, of the user.
  • the user action detector can also be a microphone, and the response detected thereby comprises a voice of the user.
  • the predetermined criterion may comprise detection of any action performed by the user, which may include at least one of a motion, a gesture, a facial expression, or a voice.
  • the controller may optionally be further configured to perform a feature recognition based on the user response data, and the predetermined criterion comprises a substantial match between a result of the feature recognition and a pre-stored record.
  • the term “substantial”, “substantially”, or the like, is considered to be exchangeable with the phrase “in most details, even if not completely”, i.e. defined as more than an 80% level of match.
  • the first sensing device can optionally comprise at least one of an ultrasonic sensing device, a radar sensor, a camera, a wireless-signal detector, an infrared sensing device, a pyroelectric sensor, a ToF (time of flight) camera, or a VCSEL (vertical-cavity surface-emitting laser) sensor.
  • the first sensing device comprises a wireless-signal detector, which is configured to detect wireless signals transmitted from a carry-on device carried by the user, wherein the wireless signals comprise at least one of radio frequency (RF) signals, WiFi signals, Bluetooth signals, 4G signals, or 5G signals.
  • the carry-on device carried by the user comprises at least one of a vehicle key, a mobile phone, or a wireless-signal transmitter.
  • the preset distance can be in a range of approximately 0.01-10 meters, and preferably in a range of approximately 0.1-5 meters, and further preferably in a range of approximately 0.2-1 meters.
  • the first sensing device is substantially a functional module embedded in the controller.
  • the optical device is further configured, upon starting to project the preset image, to send time stamp information to the controller; and the controller is further configured to count a working time of the optical device based on the time stamp information, to determine whether the working time is longer than a preset threshold, and if so and if no response from the user is received, to further send a stop command to the optical device to stop projection.
  • the preset threshold can be in a range of approximately 2 seconds to 30 minutes, preferably in a range of approximately 5 seconds to 5 minutes, and more preferably in a range of approximately 10 seconds to 1 minute.
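The projection-timeout logic above can be sketched as follows; the 30-second default is merely one assumed value inside the disclosed range, and the function name is illustrative:

```python
import time

def should_stop_projection(start_timestamp_s, response_received,
                           preset_threshold_s=30.0, now_s=None):
    """Return True when the optical device has been projecting longer than
    the preset threshold with no user response, per the logic above."""
    now_s = time.time() if now_s is None else now_s
    working_time = now_s - start_timestamp_s  # counted from the time stamp
    return working_time > preset_threshold_s and not response_received
```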
  • the present disclosure further provides a control method.
  • the control method comprises the following steps:
  • the target object can optionally be a gate, and the task can be to open the gate.
  • the target object is a vehicle gate
  • the preset area of the surface is an area of ground in a proximity of the vehicle gate.
  • step (1) of determining whether a user is within a preset distance to a target object comprises the following sub-steps:
  • optionally, sub-step (a) of acquiring object detection data of the user can be carried out by means of a first sensing device.
  • the first sensing device can comprise at least one of an ultrasonic sensing device, a radar sensor, a camera, a wireless-signal detector, an infrared sensing device, a pyroelectric sensor, a ToF (time of flight) camera, or a VCSEL (vertical-cavity surface-emitting laser) sensor.
  • the first sensing device comprises an ultrasonic sensing device
  • a distance of the user to the target object is estimated by at least one of a strength of, or a time period of receiving, an echo signal received by the ultrasonic sensing device.
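As a rough illustration of the echo-time variant, the one-way distance can be derived from the round-trip time of the ultrasonic pulse; this is a textbook time-of-flight calculation, not the patented implementation:

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # in air at roughly 20 °C

def distance_from_echo(round_trip_time_s):
    """Time-of-flight estimate: the ultrasonic pulse travels to the user
    and back, so the one-way distance is half the round-trip path."""
    return SPEED_OF_SOUND_M_PER_S * round_trip_time_s / 2.0
```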
  • the first sensing device comprises a camera, and in the sub-step (b) of determining whether the user is within the preset distance to the target object based on the object detection data, a distance of the user to the target object is estimated by analyzing images of the user.
  • the first sensing device comprises a wireless-signal detector configured to detect wireless signals transmitted from a carry-on device carried by the user, and in the sub-step (b) of determining whether the user is within the preset distance to the target object based on the object detection data, a distance of the user to the target object is estimated by a strength of the wireless signals detected by the wireless-signal detector.
  • the carry-on device carried by the user comprises at least one of a vehicle key, a mobile phone, or a wireless-signal transmitter.
  • the preset distance is in a range of approximately 0.01-10 meters, and preferably in a range of approximately 0.1-5 meters, and further preferably in a range of approximately 0.2-1 meters.
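For the signal-strength variant, a common way to map strength to distance is the log-distance path-loss model; the reference RSSI at 1 m and the path-loss exponent below are assumptions that a real system would calibrate for the specific carry-on device:

```python
def distance_from_rssi(rssi_dbm, rssi_at_1m_dbm=-50.0, path_loss_exponent=2.0):
    """Estimate user distance (in meters) from received signal strength
    using the standard log-distance path-loss model; stronger signal
    implies a closer user."""
    return 10.0 ** ((rssi_at_1m_dbm - rssi_dbm) / (10.0 * path_loss_exponent))
```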
  • the step (3) of detecting a response of the user to the preset image is carried out by a second sensing device.
  • the second sensing device may comprise at least one of an obstruction detector or a user action detector.
  • the obstruction detector is configured to detect whether an obstruction occurs between the target object and the image projected on the surface by the optical device as a result of the response of the user to the preset image.
  • the user action detector is configured to detect an action of the user in response to the preset image.
  • the second sensing device comprises an obstruction detector, which may comprise at least one of a camera, a radar sensor, a noncontact capacitive sensor, an infrared sensing device, or a TOF detection device.
  • the second sensing device comprises a TOF detection device.
  • the second sensing device comprises a user action detector.
  • the user action detector can include at least one of a camera or a microphone.
  • the response detected thereby comprises at least one of a motion, a gesture, or a facial expression, of the user.
  • the response detected thereby comprises a voice of the user.
  • the predetermined criterion may optionally comprise detection of any action performed by the user, and the any action may comprise at least one of a motion, a gesture, a facial expression, or a voice.
  • the step (4) of determining, based on the response, whether a predetermined criterion is met comprises a sub-step of performing a feature recognition based on the user response data.
  • the predetermined criterion comprises a substantial match between a result of the feature recognition and a pre-stored record.
  • control method further comprises the following steps:
  • controlling the optical device to stop projection if the working time is longer than a preset threshold and no response from the user is detected.
  • the preset threshold can be in a range of approximately 2 seconds to 30 minutes, preferably in a range of approximately 5 seconds to 5 minutes, and more preferably in a range of approximately 10 seconds to 1 minute.
  • control method can be carried out by the control apparatus as described in the first aspect.
  • FIG. 1A and FIG. 1B respectively show a structural diagram of a control apparatus and a control method utilizing the control apparatus provided by some embodiments of the disclosure;
  • FIG. 2 illustrates a vehicle gate control apparatus and control method based on the control apparatus and control method as illustrated in FIGS. 1A and 1B;
  • FIGS. 3A and 3B respectively show a specific application scenario of a vehicle gate control apparatus and a working flow of a vehicle gate control method utilizing the vehicle gate control apparatus;
  • FIG. 4 shows a block diagram of a controller according to certain embodiments of the disclosure.
  • FIG. 5 shows a structural diagram of a controller provided by some embodiments of the disclosure.
  • the present disclosure provides a control apparatus and a control method.
  • the control method is substantially carried out by means of the control apparatus.
  • FIG. 1A shows a structural diagram of a control apparatus provided by some embodiments of the disclosure.
  • the control apparatus 001 comprises a controller 100, a first sensing device 200, an optical device 300, and a second sensing device 400.
  • Each of the first sensing device 200, the optical device 300 and the second sensing device 400 is communicatively connected with the controller 100.
  • the first sensing device 200 is substantially an object detector, configured to detect whether there is a user U in a proximity of a target object T, and then to send object detection data to the controller 100.
  • the controller 100 is configured, based on the object detection data from the first sensing device 200, to determine whether the user U is within a preset distance to the target object T, and is further configured, if it determines that the user U is within the preset distance, to send a first control command (i.e. “1st Control Command” in FIG. 1A) to the optical device 300.
  • the optical device 300 is configured, upon receiving the first control command, to project a preset image onto a preset area of a surface for presentation to the user U.
  • the user U may, under a prompt by the image projected by the optical device 300, exhibit a response thereto.
  • the second sensing device 400 is configured to detect the response of the user U to the projected image, and then to transmit user response data to the controller 100.
  • the controller 100 is further configured, based on the user response data from the second sensing device 400, to determine whether the response from the user U meets a predetermined criterion, and is further configured, if so, to send a second control command (i.e. “2nd Control Command” in FIG. 1A) to the target object T for executing a task corresponding to the second control command.
  • FIG. 1B illustrates a flow chart of a control method using the control apparatus 001 according to certain embodiments of the disclosure.
  • the control method includes the following steps:
  • S10 Determining whether a user is within a preset distance to a target object;
  • S20 Projecting, if the user is within the preset distance, a preset image onto a preset area of a surface;
  • S30 Detecting a response of the user to the preset image;
  • S40 Determining, based on the response of the user, whether a predetermined criterion is met;
  • S50 Controlling, if the predetermined criterion is met, the target object to execute a task.
  • the term “user” generally refers to a person, but can also be expanded to refer to an animal, a robot, a machine, or anything that can respond to the image projected by the optical device.
  • the control apparatus 001 is arranged in a vehicle (e.g. a passenger car) and configured to control the vehicle to execute a vehicle-related task, such as opening a vehicle gate (e.g. a trunk lid or a trunk door) .
  • the target object is a vehicle
  • the user can be a driver or a passenger approaching the vehicle and intended to open a vehicle gate of interest (e.g. trunk door or tailgate)
  • the task to be executed by the target object is to open the vehicle gate.
  • the control apparatus can, upon detecting that the user is close enough to the trunk, project an image (e.g. an optical spot, or an optical pattern such as a light ring or a specific logo) onto an area of the ground right in front of the vehicle trunk (i.e. the “preset area of a surface”).
  • the user can kick his or her foot to swipe over the optical pattern.
  • the control apparatus can control the trunk door of the vehicle to open.
  • the user can conveniently open the trunk door of a car without using his or her hand, which is especially useful when the user is holding a lot of stuff (e.g. a big box) using both hands and intends to put them into the trunk.
  • FIG. 2 illustrates a vehicle gate control apparatus and a vehicle gate control method, which are substantially based on the control apparatus and method as set forth above and illustrated in FIGS. 1A and 1B.
  • the vehicle gate control apparatus 001A comprises a vehicle controller 110, an object detector 210, an optical device 310, and a user response detector 410, which correspond to the controller 100, the first sensing device 200, the optical device 300, and the second sensing device 400 of the control apparatus 001 as illustrated in FIG. 1A.
  • the object detector 210 detects whether a user U is approaching the vehicle gate T of interest (e.g. the trunk gate, see S100) , and then sends object detection data to the vehicle controller 110 (S200) . Then based on the object detection data, the vehicle controller 110 determines whether the user U is within a preset distance to the vehicle gate T, and if so, sends a first control command to the optical device 310 (S300) to thereby control the optical device 310 to project a preset image onto the ground (S400) .
  • the user U may exhibit a response, which is then detected by the user response detector 410 (S500) , and the user response data is further sent by the user response detector 410 to the vehicle controller 110 (S600) . Then based on the user response data, the vehicle controller 110 further determines whether the response from user U meets a predetermined criterion, and if so, controls the vehicle gate T to open (S700) .
  • vehicle generally refers to a transportation machine that transports people or cargo, which can include motor vehicles, such as passenger automobiles, cars, sports utility vehicles (SUV) , motorcycles, buses, trucks, or various commercial vehicles, and can also include hybrid vehicles, electric vehicles, plug-in hybrid electric vehicles, hydrogen-powered vehicles, and other alternative fuel vehicles (e.g., fuels derived from resources other than petroleum) .
  • the term can also include other types of transportation tools such as wagons, bicycles, railed vehicles (e.g. trains, trams, etc. ) , watercrafts (e.g. ship or boats) , aircrafts, and spacecrafts, etc.
  • the term “gate” is considered to be equivalent to “door”, “lid”, or the like, and refers to a movable portion of a vehicle that, when opened, provides access to, and when shut, closes an opening of, a certain compartment, such as a cargo trunk (or “trunk” for short hereinafter), an interior driver/passenger space, or an engine trunk, of a vehicle.
  • the vehicle gate is a trunk gate arranged at the rear of a passenger car, yet it is to be noted that the “vehicle gate” can alternatively be a power door to an interior driver/passenger space of the vehicle, or a lid/gate to other compartments (e.g. the engine trunk) of the vehicle.
  • a vehicle gate can be a hinged gate or a sliding gate, but can also be a gate of other types.
  • the “user” can be a driver or a passenger intended to ride on a vehicle, or can be a guest or a third-party person not intended to ride on the vehicle.
  • the “preset distance” to the vehicle gate can be set in advance by technicians according to practical needs. For example, in a typical passenger car, only when the user approaches within about 1 m (meter) to the vehicle gate, shall the user be considered to have the need to open the vehicle gate.
  • the preset distance is set as a range with a radius of 1.0 meter. Optionally, the preset distance can be set as 0.5 m, 1.5 m, 2.0 m, or any other distance. In the present disclosure, the preset distance can typically be set in a range of 0.1-5 m, preferably 0.5-2 m, and more preferably approximately 1 m.
  • the object detector 210 comprises an ultrasonic sensing device, which can periodically transmit ultrasonic signals. When a user is approaching the vehicle gate, an echo signal reflected by the user can be received by the ultrasonic sensing device. Then based on a strength of, and a time period of receiving, the echo signal, the distance of the user to the vehicle gate can be measured or estimated based on certain predetermined relationship therebetween.
  • the object detector 210 comprises a radar sensing device (i.e. radar sensor) , such as a millimeter wave (i.e. mmWave) radar sensor, which can be used for relatively short-range (e.g. approximately 30 m) or long-range (e.g. approximately 200 m) object detection, depending on practical needs.
  • the object detector 210 relies on images that are taken thereby to measure or estimate the distance of the user to the vehicle gate.
  • the object detector 210 comprises a camera (e.g. a video camera) .
  • the camera can periodically take pictures of a scene near the vehicle gate.
  • a series of pictures of the user can be periodically taken by the camera.
  • the distance of the user to the vehicle gate can be measured or estimated. It can be realized, for example, by feature recognition and by measuring the size of the user, or part thereof (e.g.
  • the series of pictures may optionally include images that contain certain features, such as facial features, gestures, and/or motions, etc., of the user, and through feature comparison and deep learning, the identity of the user may be further determined for other purposes, such as authentication.
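One plausible way to estimate distance from the apparent size of the user in the captured pictures, as mentioned above, is the pinhole-camera relation; the height and focal-length values below are illustrative assumptions, not calibration data from the disclosure:

```python
def distance_from_apparent_size(real_height_m, image_height_px,
                                focal_length_px):
    """Pinhole-camera estimate: a user of known (or assumed average)
    real-world height who spans image_height_px pixels in the picture
    stands at roughly this distance from the camera."""
    return real_height_m * focal_length_px / image_height_px
```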
  • the object detector 210 relies on wireless signals that are captured thereby to measure or estimate the distance of the user to the vehicle gate, and thus the object detector 210 may optionally comprise a wireless-signal detector configured to detect wireless signals transmitted from a carry-on device carried by the approaching user.
  • the “wireless signals” can be radio frequency (RF) signals, WiFi signals, Bluetooth signals, 4G/LTE signals, 5G signals, etc., which can be transmitted by the carry-on device that is substantially a wireless signal-emitting device, such as a vehicle key, a mobile phone, or a specialized wireless-signal transmitter.
  • Such a wireless signal-transmitting device is typically carried by a vehicle user, and thus can be utilized to estimate the distance of the user when he or she is approaching the vehicle gate.
  • the object detector 210 comprises an RF detector, such as a wireless sensing module which is sometimes embedded in the vehicle controller 110, configured to sense or detect the RF signals transmitted from a vehicle key.
  • the wireless sensing module can estimate the distance of the user to the vehicle gate based on the strength of the RF signals.
  • the vehicle key may transmit other types of wireless signals (e.g. Bluetooth signals) , and a corresponding object detector may be configured to receive these types of wireless signals and work in a similar manner.
  • the object detector 210 comprises a wireless signal detector configured to detect one or more of RF signals, Bluetooth signals, WiFi signals, 4G/LTE signals, 5G signals, etc., which are transmitted from a mobile phone.
  • the signal detector can estimate the distance of the user to the vehicle gate based on the strength of the wireless signals received.
  • the object detector may comprise other types of sensing devices, such as an infrared sensing device, a pyroelectric sensor, a ToF (time of flight) camera, a VCSEL (vertical-cavity surface-emitting laser) sensor, etc., which may be utilized to measure or estimate the distance of the approaching user to the vehicle gate.
  • the object detector 210 and the vehicle controller 110 are physically separated but are communicatively connected.
  • the vehicle controller 110 can judge or determine whether a user appears in the preset distance to the vehicle gate according to the object detection signals transmitted from the object detector 210.
  • the object detector 210 may be functionally and/or physically embedded in the vehicle controller 110.
  • the vehicle controller 110 itself may have a functionality for detecting a user.
  • a wireless sensing module configured in the vehicle controller 110 can substantially serve as an object detector 210 to sense or acquire information of the user.
  • the object detector 210 may comprise any combination of the above different embodiments thereof.
  • the object detector 210 of the vehicle gate control apparatus 001A may comprise a camera and an RF signal detector (e.g. a wireless sensing module embedded in the vehicle controller 110).
  • the camera is configured to capture images of an approaching user
  • the RF signal-detector is configured to detect the RF signals transmitted from the vehicle key carried by the user.
  • a combined use may realize better security, robustness (i.e., if one fails, the other still works), higher effectiveness (e.g. a first approach serves as a trigger to activate a second approach), or combined functions of distance estimation and user authentication.
  • the object detection data obtained by the object detector 210 and transmitted to the vehicle controller 110 can be of any data type, such as an electrical signal, a data packet, or a binary code, and can have different embodiments.
  • the object detection data may comprise the original data (e.g. images, wireless signal strength, echo signal strength, echo reflection time, etc. ) periodically captured by the object detector 210 when a user is approaching the vehicle gate.
  • Such captured original data is thus periodically transmitted to the vehicle controller 110, which, by means of a calculation module in a processor, can calculate the distance, and further, by means of a determination module in the processor, can determine whether the user has come within the preset distance to the vehicle gate.
  • the object detector 210 does not process the object detection data, but instead transmits all of the object detection data to the vehicle controller 110 for processing and determination.
  • the object detection data may only comprise a determination result.
  • the object detector 210 itself can, in addition to periodically capturing distance-related data (e.g. images, wireless signal strength, echo signal strength, echo reflection time, etc.), also determine whether the captured data meets a certain criterion (e.g. the strength of the wireless signal is higher than a predetermined threshold, or the time period of the echo signal is shorter than a predetermined threshold), which is substantially equivalent to determining whether the approaching user is within the preset distance to the vehicle gate.
  • the object detection data may be in a form of protocol data.
  • the object detection data may only comprise a binary code “1” , which indicates that the user is within the preset distance.
  • when a user is approaching the vehicle gate, the object detector 210 periodically captures related data of the user and determines whether the approaching user is within the preset distance to the vehicle gate, and only when it determines that the user is within the preset distance does it send the binary code “1” to the vehicle controller 110. Upon receiving the binary code “1”, the vehicle controller 110 accordingly determines that the user is within the preset distance, and can further send the first control command to the optical device 310 so as to control it to project the preset image onto the ground.
  • the object detector 210 takes a maximal effort to process the data detected regarding the user and make a determination based thereupon, and only transmits the determination result, which is substantially the object detection data, to the vehicle controller 110.
  • the object detector 210 partially processes the data detected regarding the user, and then transmits such processed data as object detection data to the vehicle controller 110, based on which the vehicle controller 110 further makes a determination whether the approaching user is within the preset distance to the vehicle gate.
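The two data-handling modes above (controller-side determination on raw detection data versus detector-side determination reported as protocol data) can be illustrated with a minimal sketch. The ultrasonic echo-time formula, the speed of sound, and the 0.5-meter preset distance are illustrative assumptions, not values prescribed by this disclosure.

```python
PRESET_DISTANCE_M = 0.5  # hypothetical preset distance to the vehicle gate

def controller_determines(echo_time_s, speed_of_sound_m_s=343.0):
    # Mode 1: the detector forwards raw data (here, an ultrasonic echo
    # time) and the vehicle controller computes the distance and decides.
    distance_m = speed_of_sound_m_s * echo_time_s / 2  # round trip halved
    return distance_m <= PRESET_DISTANCE_M

def detector_determines(echo_time_s, speed_of_sound_m_s=343.0):
    # Mode 2: the detector itself decides and only sends protocol data:
    # "1" means the user is within the preset distance, "0" otherwise.
    distance_m = speed_of_sound_m_s * echo_time_s / 2
    return "1" if distance_m <= PRESET_DISTANCE_M else "0"
```

Either way, the determination reached by the vehicle controller 110 is the same; the modes differ only in where the computation happens and how much data crosses the link.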
  • the optical device 310 can be an illuminating device installed at an appropriate position on the vehicle and with an appropriate direction pointing to a preset area of the ground.
  • there can be different embodiments for the optical device 310 and for the preset image projected on the ground thereby (S400).
  • the optical device 310 comprises a projector (e.g. a laser projector, an LED projector, etc. ) .
  • the image projected by the projector can be of any shape, any color, any form, or for any purpose.
  • the image may be a spot, a ring, an arrow, of a special pattern (such as a logo) , or comprise a text.
  • the image may be a monocolored image or a colored image.
  • the image may be a still image, a blinking image, an alternating series of images, a dynamic image, or a video (such as a commercial advertisement) .
  • the image may serve certain purposes.
  • the image may comprise a prompt text such as “Please swipe your foot over or step onto the image”, which substantially prompts the user to perform a certain action or motion in order to open the vehicle gate.
  • the image may comprise a prompt, such as “Please perform a motion as a pass to open the trunk door” , and the user may perform a motion that has been previously agreed upon (e.g. a counterclockwise movement of the foot) as a pass. Then the motion may be captured by a camera (i.e. the user response detector 410) , and the vehicle controller 110 can further determine whether the motion performed by the user matches with a pre-stored key, and if so, the vehicle controller 110 can then control the vehicle gate to open, and if not, the vehicle controller 110 may further control the optical device 310 to project another preset image as a prompt asking the user to try again. As such, the motion substantially serves as a level of authentication.
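The motion-as-pass authentication flow described above can be sketched as follows. The motion label, the command strings, and the retry behavior are hypothetical stand-ins for whatever the camera-based recognizer and the vehicle controller 110 actually exchange.

```python
PRE_STORED_KEY = ["foot_counterclockwise"]  # hypothetical agreed-upon motion

def motion_matches_key(observed_motions):
    # The captured motion serves as a level of authentication: the gate
    # opens only if the observed sequence equals the pre-stored key.
    return list(observed_motions) == PRE_STORED_KEY

def handle_user_motion(observed_motions):
    if motion_matches_key(observed_motions):
        return "open_vehicle_gate"      # second control command
    return "project_retry_prompt"       # ask the user to try again
```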
  • the optical device 310 comprises a spotlight (e.g. regular spotlight, LED light, etc. ) configured to simply shed a light beam onto the ground, and the preset image thus formed may just comprise an optical spot formed on the ground by the light beam.
  • the image projected by the optical device 310 on the ground may comprise a still image (e.g. an optical spot, an optical ring, a logo, and/or a prompt text such as “please swipe your foot over or step onto the image”).
  • the user can thus perform an action or motion specified by the image, such as swiping his/her foot over, or stepping onto, the image.
  • the user response detector 410 may comprise an obstruction detection device (i.e. obstruction detector) which, alongside the vehicle controller 110, is configured to detect whether there is any obstruction between the image on the ground that is projected by the optical device 310 and the vehicle gate.
  • the obstruction detector comprises a camera.
  • the vehicle controller 110 can, based on the one or more pictures received, detect whether there is any pixel change in the projected image to thereby determine whether there is an obstruction between the projected image and the vehicle gate, or more specifically whether an obstruction occurs in an optical path for the projected image.
  • the obstruction detector may comprise an infrared sensing device or a TOF detection device.
  • an obstruction is considered to occur.
  • the obstruction detector may alternatively comprise a radar sensor (e.g. an mmWave radar sensor, a short-range radar sensor, or an obstacle detection radar) , or a noncontact capacitive sensor.
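For the camera-based obstruction detector, the pixel-change check can be sketched as below. The flat grayscale-frame representation and both thresholds are illustrative assumptions, not parameters taken from this disclosure.

```python
def obstruction_detected(reference_frame, current_frame,
                         pixel_delta=30, changed_fraction=0.2):
    # Compare a reference picture of the projected image with the current
    # one (both as flat lists of 0-255 grayscale values); if enough pixels
    # changed, an obstruction occurs in the optical path of the image.
    changed = sum(1 for ref, cur in zip(reference_frame, current_frame)
                  if abs(ref - cur) > pixel_delta)
    return changed / len(reference_frame) >= changed_fraction
```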
  • the response of the user to the projected image may be an action (e.g. a motion, a gesture, a facial expression, a voice, etc.) performed or exhibited by the user upon seeing the image projected by the optical device.
  • the user response detector 410 may correspondingly comprise a user action detecting device or a user action detector (e.g. an imaging device such as a camera, or a voice recording device such as a microphone) that can detect the response performed by the user. Then after receiving the user response data sent from the user response detector 410, the vehicle controller 110 determines whether a predetermined criterion is met, and if so, further controls the vehicle gate to open.
  • the projected image may prompt the user to make a motion (e.g. move a leg, or move the foot counterclockwise), which can be detected by a motion detector (e.g. a camera) serving as the user response detector 410; upon detection, the vehicle controller 110 can further control the vehicle gate to open.
  • the projected image may prompt the user to direct his/her face towards certain part of the vehicle gate (e.g. the camera installed on the trunk door) , and the camera may serve as the user response detector 410 to capture the user’s face. Then the vehicle controller 110 determines whether the user’s face is detected or even whether the user’s face can be recognized (i.e. the predetermined criterion) , and if so, further controls the vehicle gate to open.
  • the projected image may prompt the user to talk or speak certain words, and a microphone installed in the vehicle may serve as the user response detector 410 to capture the user’s voice. Then the vehicle controller 110 determines whether the user’s voice is detected or even whether the user’s voice can be recognized (i.e. the predetermined criterion) , and if so, further controls the vehicle gate to open.
  • the predetermined criterion by which the vehicle controller 110 determines may vary depending on specific needs.
  • the predetermined criterion may be a simple one (i.e. “yes” vs “no” , depending on whether a user action is detected) without any feature recognition functionality. That is, the vehicle controller 110 may determine that the predetermined criterion is met once an action of the user, such as a motion, a voice, a face, or a gesture, etc., is detected.
  • the predetermined criterion applied by the vehicle controller 110 may be complicated, involving feature recognition. That is, after a motion, a voice, a face, a gesture, etc. is detected, the vehicle controller 110 further carries out a feature recognition (e.g. a facial recognition, a motion/gesture recognition, or a voice recognition), and determines that the predetermined criterion is met only after the result of the feature recognition indicates that the user’s feature matches a pre-stored record (e.g. the identity of the user can be recognized).
  • the user response data can be periodically transmitted from the user response detector 410 to the vehicle controller 110.
  • the user response data may comprise raw data recording the user response captured by the user response detector 410, such as image data captured by a camera or voice data captured by a microphone.
  • the vehicle controller 110 is equipped with the functionality to analyze the raw data to make a determination whether the predetermined criterion is met.
  • Such an analysis and determination by the vehicle controller 110 may be a simple one (e.g. a yes-vs-no determination) or a complicated one (e.g. feature recognition) .
  • the user response detector 410 itself may have the capability of analysis and making a determination based on the raw response data without resorting to the vehicle controller 110.
  • the user response data transmitted from the user response detector 410 to the vehicle controller 110 may only include the determination result, which can be in a form of protocol data. For example, a code “1” means that the predetermined criterion is met, whereas a code “0” means that the predetermined criterion is not met.
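The two forms of user response data above (raw data that the controller must analyze itself, and a determination result sent in protocol form) might be handled as in this sketch; the payload encoding follows the "1"/"0" protocol example given above, while everything else is an illustrative assumption.

```python
def analyze_raw(raw_response):
    # Simple yes-vs-no determination: the predetermined criterion is met
    # once any user action (motion, voice, face, gesture) is detected.
    return bool(raw_response)

def criterion_met(payload):
    # Protocol form: "1" means the criterion is met, "0" means not met;
    # anything else is treated as raw data the controller analyzes itself.
    if payload == "1":
        return True
    if payload == "0":
        return False
    return analyze_raw(payload)
```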
  • when the optical device 310 receives the first control command from the vehicle controller 110, the optical device 310 can also send time stamp information to the vehicle controller 110, and the time stamp information records the time when the optical device 310 starts to project the preset image. After receiving the time stamp information, the vehicle controller 110 can count a working time of the optical device 310 based on the time stamp information.
  • the vehicle controller 110 can further send a stop command to the optical device 310 to thereby control the optical device 310 to stop projection.
  • the preset threshold can be in a range of approximately 2 seconds to 30 minutes, preferably 5 seconds to 5 minutes, and further preferably approximately 10 seconds to 1 minute.
  • the vehicle controller 110 can send a stop-projection command to the optical device 310 to stop the projection. As such, this feature can save power and prolong the working life of the optical device 310.
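The working-time bookkeeping based on the time stamp information can be sketched as follows; the 60-second default is only one example within the preferred ranges quoted above.

```python
def should_stop_projection(start_timestamp_s, now_s,
                           response_received, preset_threshold_s=60.0):
    # Count the working time of the optical device from its time stamp;
    # issue the stop-projection command only when the preset threshold is
    # exceeded while no response from the user has been received.
    working_time_s = now_s - start_timestamp_s
    return working_time_s > preset_threshold_s and not response_received
```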
  • there can be different manners in which the vehicle controller 110 controls the vehicle gate to open, depending on the mechanism by which the vehicle gate is opened.
  • the vehicle gate is opened by means of a hydraulic rod that is operably connected with, and driven by, a driving motor.
  • the driving motor can drive the hydraulic rod to move in a certain direction, thereby opening the vehicle gate.
  • in some other embodiments, the vehicle gate is locked by means of a latch; upon receiving the second control command from the vehicle controller 110, an actuator (e.g. a motor, or an electromagnetic device) can release the latch, and a spring then pushes or pulls the vehicle gate to open.
  • the vehicle controller 110 can be further configured to record the time period in which the vehicle gate remains open. If the time period is longer than a preset threshold, the vehicle controller can control the vehicle gate to close or shut off. In certain embodiments where the vehicle gate is opened or closed by the hydraulic rod, this can be realized by means of the driving motor which, upon receiving a reverse driving signal from the vehicle controller, drives the hydraulic rod to move in an opposite direction, thereby closing the vehicle gate.
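For the hydraulic-rod embodiment, the auto-close logic can be reduced to a small sketch; the 120-second open-duration threshold and the signal names are illustrative assumptions.

```python
def gate_drive_signal(open_duration_s, open_threshold_s=120.0):
    # If the gate has remained open longer than the preset threshold, the
    # controller sends a reverse driving signal so the driving motor moves
    # the hydraulic rod in the opposite direction and closes the gate.
    return "reverse" if open_duration_s > open_threshold_s else "hold"
```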
  • the vehicle gate control apparatus and method disclosed herein can be used to conveniently realize a hands-free opening of a vehicle gate.
  • An object detector 210 and a vehicle controller 110 can detect and determine whether there is a user approaching and appearing within the preset distance to the vehicle gate. If so, the vehicle controller 110 controls the optical device 310 to project a preset image onto the ground.
  • a user response detector 410 can record the user’s response. Based on the user response received from the user response detector 410, the vehicle controller 110 further determines whether a predetermined criterion is met, and if so, controls the vehicle gate to open.
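The hands-free flow summarized above can be condensed into one pass of a control cycle; the distance value, the preset distance, and the command strings are hypothetical illustrations of the first and second control commands.

```python
def control_cycle(user_distance_m, response_meets_criterion,
                  preset_distance_m=0.5):
    # One pass of the flow: if the user is within the preset distance,
    # project the preset image (first control command); if the user's
    # response then meets the predetermined criterion, open the vehicle
    # gate (second control command).
    commands = []
    if user_distance_m <= preset_distance_m:
        commands.append("project_preset_image")
        if response_meets_criterion:
            commands.append("open_vehicle_gate")
    return commands
```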
  • For a better understanding of the vehicle gate control apparatus and method as described above, a specific application scenario is provided in the following, which is illustrated in FIG. 3A and FIG. 3B.
  • an ultrasonic sensing device 211 (i.e. a type of the object detector 210 shown in FIG. 2)
  • an optical device 311 is triggered to project light beams B to form a projected image C on the ground.
  • a motion (e.g. swiping foot or stepping)
  • the vehicle controller (i.e. the vehicle controller 110 shown in FIG. 2, not shown in FIG. 3A) controls the trunk door T to open.
  • A more detailed description of the vehicle gate control method utilizing the vehicle gate control apparatus illustrated in FIG. 3A is further provided in FIG. 3B.
  • an object detector 211 (e.g. the ultrasonic sensing device in FIG. 3A)
  • the obstruction detector 411 (e.g. the TOF detection device in FIG. 3A)
  • the obstruction detector 411 constantly and periodically detects whether an obstruction occurs between the projected image and the vehicle gate, and sends the obstruction detection data (i.e. a type of user response data shown in FIG. 1A) to the vehicle controller 111. If it determines that there is an obstruction (e.g. the user steps onto the projected image, as illustrated in FIG. 3A), the vehicle controller 111 sends a second control command to the trunk door T to thereby open it.
  • the vehicle controller 111 can timely determine the presence of obstruction between the projected image and the vehicle gate, thereby having an improved responsiveness in opening the vehicle gate.
  • FIG. 4 shows a block diagram of a controller with reference to the controlling apparatus 001.
  • the controller comprises a receiving module 401, a determination module 402, a control module 403, and a transmission module 404, and optionally may further comprise a feature recognition module 405.
  • the receiving module 401 is configured to receive detection data from one or more detecting devices in the controlling apparatus 001, including receiving object detection data from a first sensing device 200, and receiving user response data from a second sensing device 400.
  • the receiving module 401 comprises a first receiving sub-module 4011 and a second receiving sub-module 4012, which are respectively configured to receive the object detection data from the first sensing device 200 and to receive the user response data from the second sensing device 400.
  • the determination module 402 is configured to make a determination based on the one or more detection data received from the one or more detecting devices, including making a determination whether a user appears within the preset distance to the target object based on the object detection data, and making a determination whether a response from the user meets a predetermined criterion based on the user response data.
  • the determination module 402 comprises a first determination sub-module 4021 and a second determination sub-module 4022, which are configured to make the above two determinations based on the object detection data and the user response data, respectively.
  • the control module 403 is configured to generate a first control command configured to control an optical device for projecting a preset image if the determination module 402 determines that a user appears within the preset distance to the target object, and to generate a second control command configured to control the target object to execute a corresponding task if the determination module 402 determines that the response from the user meets a predetermined criterion.
  • the control module 403 comprises a first control sub-module 4031 and a second control sub-module 4032, which are configured to generate the first control command and the second control command, respectively.
  • the transmission module 404 is configured to transmit the first control command to the optical device, and to transmit the second control command to the target object.
  • the transmission module 404 comprises a first transmission sub-module 4041 and a second transmission sub-module 4042, which are configured to transmit the first control command and the second control command, respectively.
  • the controller further comprises a feature recognition module 405, configured to perform feature recognition based on the user response data.
  • the feature recognition module 405 may include any of a voice recognition, a motion recognition, or a facial recognition that corresponds to the user response data, which may additionally facilitate a user authentication process.
  • in order to save power and prolong the working life of the optical device, the controller is further configured to count the working time of the optical device, such that if the working time of the optical device is longer than a preset time period (e.g. 1 minute) while no response from the user is received, the controller controls the optical device to stop projection.
  • the receiving module 401 is configured to receive time stamp information from the optical device to thereby record the moment when the optical device starts projection.
  • the controller further comprises a counting module 406, configured to count a working time of the optical device based on the time stamp information.
  • the determination module 402 is further configured to determine whether the working time of the optical device is longer than a preset threshold (e.g. 1 minute) .
  • the control module 403 is further configured to generate a stop command configured to control the optical device to stop projection if the working time of the optical device is longer than the preset threshold, and if no response from the user is received.
  • the transmission module 404 is further configured to transmit the stop command generated by the control module 403 to the optical device.
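The modular structure of FIG. 4 might be mirrored in code as the sketch below, where a list of sent commands stands in for the transmission module 404; the class and method names are assumptions for illustration, not part of the disclosure.

```python
class Controller:
    """Sketch of the controller of FIG. 4: receiving, determination,
    control, and transmission duties, plus working-time counting."""

    def __init__(self, preset_distance_m=0.5, preset_working_time_s=60.0):
        self.preset_distance_m = preset_distance_m
        self.preset_working_time_s = preset_working_time_s
        self.sent = []  # stands in for the transmission module 404

    def on_object_detection(self, user_distance_m):
        # receiving sub-module 4011 + determination sub-module 4021
        if user_distance_m <= self.preset_distance_m:
            self.sent.append(("optical_device", "first_control_command"))

    def on_user_response(self, criterion_met):
        # receiving sub-module 4012 + determination sub-module 4022
        if criterion_met:
            self.sent.append(("target_object", "second_control_command"))

    def on_timestamp(self, start_s, now_s, response_received):
        # counting module 406: stop projection past the preset time period
        if now_s - start_s > self.preset_working_time_s and not response_received:
            self.sent.append(("optical_device", "stop_command"))
```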
  • the controller as described above can be customized as a vehicle controller for the vehicle gate control apparatus described above; the detailed description of each such customized functional module can reference the vehicle gate control apparatus and method described above and is omitted herein.
  • each of the terms “module,” “sub-module,” or alike refers to a computer-implemented functional entity, which can include both hardware components (e.g. processor(s) or memory) and software components.
  • the combined working of certain hardware component(s) and software components allows a prescribed functionality corresponding to a certain functional module to be carried out in the controller.
  • FIG. 5 shows a structural diagram of a controller provided by some embodiments of the disclosure.
  • the controller comprises a storage 501, and a processor 502.
  • the storage 501 is configured to store a computer program comprising executable instructions that, when executed by a processor, carry out one or more steps of the controlling method, such as the vehicle gate control method, as provided in the disclosure.
  • the processor 502 is configured to execute the computer program stored in the storage 501.
  • examples of the storage 501 can include a random access memory (RAM) and/or a non-volatile memory (NVM, e.g. a disc storage) .
  • the storage 501 can be remote from the processor 502.
  • the processor 502 can be a general processor, such as a central processing unit (CPU), a network processor (NP), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a programmable logic device (e.g. a field programmable gate array (FPGA)), a discrete gate or transistor logic device, or a discrete hardware component, etc.
  • the present disclosure further provides a computer-readable and non-volatile storage medium.
  • the storage medium is configured to store computer-executable instructions which, when executed by a processor, cause the processor to execute the various steps of the control method according to any of the embodiments as described above.
  • the machine-readable and non-volatile storage medium can be a portable hard disk (HDD), a flash drive, a solid-state disk (SSD), an optical disc (e.g. CD or DVD), or a magnetic tape, etc.
  • the control apparatus and the control method as provided herein are not limited to the application of controlling the vehicle gate to open, and can be applied to many other application scenarios as well.
  • the target object is a gate of a building (not a vehicle)
  • the object can be someone intending to get into the building through the gate.
  • with the control apparatus, a user can conveniently open the gate and get access into the building without using his or her hands.
  • the terms “about,” “approximately,” “around” or alike refer to a quantity, level, value, number, frequency, percentage, dimension, size, amount, weight or length that varies by as much as 30, 25, 20, 15, 10, 9, 8, 7, 6, 5, 4, 3, 2 or 1% to a reference quantity, level, value, number, frequency, percentage, dimension, size, amount, weight or length.
  • the terms “about” or “approximately”, when preceding a numerical value, indicate the value plus or minus a range of 15%, 10%, 5%, or 1%.

Abstract

A control apparatus and a control method are provided. The control apparatus (001) includes a first sensing device (200), an optical device (300), and a second sensing device (400), each communicatively connected with a controller (100). The first sensing device (200) detects whether a user (U) is in a proximity of a target object (T), and if so, sends object detection data to the controller (100), which further determines whether the user (U) is within a preset distance to the target object (T), and if so, controls the optical device (300) to project a preset image. The second sensing device (400) then detects a response of the user to the preset image, and sends user response data to the controller (100). The controller (100) further determines whether the response from the user meets a predetermined criterion, and if so, controls the target object (T) for executing a task. The control apparatus can be utilized for realizing a hands-free opening of a vehicle gate.

Description

CONTROL APPARATUS AND METHOD
CROSS-REFERENCE TO RELATED APPLICATION
The present application claims priority to Chinese Patent Application No. 201910789846.0 filed on August 26, 2019, whose disclosure is hereby incorporated by reference in its entirety.
TECHNOLOGY FIELD
The present disclosure relates generally to control technologies, specifically to a control apparatus and method which can be applied in various fields including the technical field of intelligent vehicles, and more particularly, to a vehicle gate control apparatus and method.
BACKGROUND
In the field of intelligent vehicle technologies, how to intelligently open a vehicle gate (i.e. tailgate), or more specifically a power trunk gate, is an important research topic. At present, several main approaches for opening a trunk gate of a vehicle include: pressing a “trunk open” button arranged on a central control console within the vehicle, pressing a “trunk open” button on a vehicle key, or pressing a switch button arranged on the trunk gate.
The above three approaches belong essentially to a traditional mode of opening the trunk gate of a vehicle. When a user’s hands are occupied, making it inconvenient to operate the vehicle key or press a button, the trunk gate of the vehicle cannot be opened automatically.
SUMMARY OF THE INVENTION
In a first aspect, the present disclosure provides a control apparatus.
The control apparatus includes a controller, a first sensing device, an optical device, and a second sensing device. Each of the first sensing device, the optical device, and the second sensing device is communicatively connected with the controller.
The first sensing device is configured to detect whether a user is in a proximity of a target object, and if so, to send object detection data of the user to the controller. The controller is configured to determine whether the user is within a preset distance to the target object based on the object detection data, and if so, to send a first control command to the optical device. The optical device is configured, upon receiving the first control command, to project a preset image onto a preset area of a surface for presentation to the user. The second sensing device is configured to detect a response of the user to the preset image, and then to send user response data to the controller. The controller is further configured to determine whether the response from the user meets a predetermined criterion based on the user response data, and if so, to send a second control command to the target object for executing a task.
Herein, according to some preferred embodiments, the target object is a gate, and the task is to open the gate. Yet optionally the target object can be another object, such as an elevator, and the task is to get the elevator to stop at the same level as the user, thus being ready for the user to take. The target object may also be a droid assistant, and the task is to move closer to the user for providing services.
Further preferably, the target object is a vehicle gate, and the preset area of the surface is an area of ground in a proximity of the vehicle gate. Herein, the vehicle gate can be any gate, such as the rear cargo trunk door, the front engine hood lid, or any of the power doors providing access to the interior of the vehicle.
In the control apparatus, the second sensing device optionally comprises an obstruction detector configured to detect whether an obstruction occurs between the target object and the image projected on the surface by the optical device as a result of the response of the user to the image.
Herein, the obstruction detector may comprise a camera, a radar sensor, a noncontact capacitive sensor, an infrared sensing device, or a TOF detection device.
The second sensing device may optionally comprise a user action detector configured to detect an action of the user in response to the image. The user action detector can be a camera, and the response detected thereby comprises at least one of a motion, a gesture, or a facial expression, of the user. The user action detector can also be a microphone, and the response detected thereby comprises a voice of the user.
In the control apparatus, the predetermined criterion may comprise detection of any action performed by the user, which may include at least one of a motion, a gesture, a facial expression, or a voice.
In the control apparatus, the controller may optionally be further configured to perform a feature recognition based on the user response data, and the predetermined criterion comprises a substantial match between a result of the feature recognition and a pre-stored record. Herein, the term “substantial”, “substantially”, or alike, is considered to be exchangeable with the phrase “in most details, even if not completely”, i.e. defined as more than 80% in the level of match.
In the control apparatus, the first sensing device can optionally comprise at least one of an ultrasonic sensing device, a radar sensor, a camera, a wireless-signal detector, an infrared sensing device, a pyroelectric sensor, a ToF (time of flight) camera, or a VCSEL (vertical-cavity surface-emitting laser) sensor.
According to certain embodiments, the first sensing device comprises a wireless-signal detector, which is configured to detect wireless signals transmitted from a carry-on device carried by the user, wherein the wireless signals comprise at least one of radio frequency (RF) signals, WiFi signals, Bluetooth signals, 4G signals, or 5G signals.
Herein, the carry-on device carried by the user comprises at least one of a vehicle key, a mobile phone, or a wireless-signal transmitter.
In the control apparatus as described above, the preset distance can be in a range of approximately 0.01-10 meters, and preferably in a range of approximately 0.1-5 meters, and further preferably in a range of approximately 0.2-1 meters.
According to some embodiments of the control apparatus, the first sensing device is substantially a functional module embedded in the controller.
According to some embodiments of the control apparatus, the optical device is further configured, upon starting to project the preset image, to send time stamp information to the controller; and the controller is further configured to count a working time of the optical device based on the time stamp information, to determine whether the working time is longer than a preset threshold, and if so and if no response from the user is received, to further send a stop command to the optical device to stop projection.
Herein, the preset threshold can be in a range of approximately 2 seconds to 30 minutes, preferably in a range of approximately 5 seconds to 5 minutes, and further preferably in a range of approximately 10 seconds to 1 minute.
In a second aspect, the present disclosure further provides a control method.
The control method comprises the following steps:
(1) determining whether a user is within a preset distance to a target object;
(2) if so, controlling an optical device to project a preset image onto a preset area of a surface for presentation to the user;
(3) detecting a response of the user to the preset image;
(4) determining, based on the response, whether a predetermined criterion is met; and
(5) if so, controlling the target object to execute a task.
Herein, the target object can optionally be a gate, and the task can be to open the gate.
According to some embodiments of the control method, the target object is a vehicle gate, and the preset area of the surface is an area of ground in a proximity of the vehicle gate.
According to some embodiments of the control method, step (1) of determining whether a user is within a preset distance to a target object comprises the following sub-steps:
(a) acquiring object detection data of the user; and
(b) determining whether the user is within the preset distance to the target object based on the object detection data.
Herein, optionally sub-step (a) of acquiring object detection data of the user can be by means of a first sensing device. The first sensing device can comprise at least one of an ultrasonic sensing device, a radar sensor, a camera, a wireless-signal detector, an infrared sensing device, a pyroelectric sensor, a ToF (time of flight) camera, or a VCSEL (vertical-cavity surface-emitting laser) sensor.
In certain embodiments of the control method, the first sensing device comprises an ultrasonic sensing device, and in the sub-step (b) of determining whether the user is within the preset distance to the target object based on the object detection data, a distance of the user to the target object is estimated by at least one of a strength of, or a time period of receiving, an echo signal received by the ultrasonic sensing device.
Optionally, the first sensing device comprises a camera, and in the sub-step (b) of determining whether the user is within the preset distance to the target object based on the object detection data, a distance of the user to the target object is estimated by analyzing images of the user.
Optionally, the first sensing device comprises a wireless-signal detector configured to detect wireless signals transmitted from a carry-on device carried by the user, and in the sub-step (b) of determining whether the user is within the preset distance to the target object based on the object detection data, a distance of the user to the target object is estimated by a strength of the wireless signals detected by the wireless-signal detector. Herein, the carry-on device carried by the user comprises at least one of a vehicle key, a mobile phone, or a wireless-signal transmitter.
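One common way to estimate distance from wireless signal strength is a log-distance path-loss model, sketched below; the reference RSSI at 1 meter and the path-loss exponent are illustrative calibration values, not parameters taken from this disclosure.

```python
def distance_from_rssi(rssi_dbm, rssi_at_1m_dbm=-50.0, path_loss_exponent=2.0):
    # Log-distance path-loss model: every 10*n dB of attenuation beyond
    # the 1-meter reference corresponds to a tenfold increase in distance.
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10 * path_loss_exponent))

def user_within_preset_distance(rssi_dbm, preset_distance_m=1.0):
    # Sub-step (b): compare the estimated distance to the preset distance.
    return distance_from_rssi(rssi_dbm) <= preset_distance_m
```

In practice the calibration values depend on the carry-on device (vehicle key, mobile phone, or wireless-signal transmitter) and the environment, so a deployed system would calibrate them per installation.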
In the control method as described above, the preset distance is in a range of approximately 0.01-10 meters, and preferably in a range of approximately 0.1-5 meters, and further preferably in a range of approximately 0.2-1 meters.
In the control method as described above, the step (3) of detecting a response of the user to the preset image is carried out by a second sensing device. The second sensing device may comprise at least one of an obstruction detector or a user action detector. The obstruction detector is configured to detect whether an obstruction occurs between the target object and the image projected on the surface by the optical device as a result of the response of the user to the preset image. The user action detector is configured to detect an action of the user in response to the preset image.
According to certain embodiments of the control method, the second sensing device  comprises an obstruction detector, which may comprise at least one of a camera, a radar sensor, a noncontact capacitive sensor, an infrared sensing device, or a TOF detection device. Preferably, the second sensing device comprises a TOF detection device.
According to certain embodiments of the control method, the second sensing device comprises a user action detector. Herein, the user action detector can include at least one of a camera or a microphone.
In embodiments of the control method where the user action detector comprises a camera, the response detected thereby comprises at least one of a motion, a gesture, or a facial expression, of the user.
In embodiments of the control method where the user action detector comprises a microphone, the response detected thereby comprises a voice of the user.
Herein, the predetermined criterion may optionally comprise detection of any action performed by the user, and that action may comprise at least one of a motion, a gesture, a facial expression, or a voice.
According to some embodiments of the control method, the step (4) of determining, based on the response, whether a predetermined criterion is met comprises a sub-step of performing a feature recognition based on the user response data. Herein, the predetermined criterion comprises a substantial match between a result of the feature recognition and a pre-stored record.
According to some embodiments, after step (2) of controlling an optical device to project a preset image onto a preset area of a surface for presentation to the user, the control method further comprises the following steps:
counting a working time of the optical device; and
controlling the optical device to stop projection if the working time is longer than a preset threshold and no response from the user is detected.
Herein the preset threshold can be in a range of approximately 2 seconds to 30 minutes, preferably in a range of approximately 5 seconds to 5 minutes, and further preferably in a range of approximately 10 seconds to 1 minute.
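As an illustration only, the working-time counting and stop-projection logic described above can be sketched as follows. The function names (start_projection, stop_projection, poll_response) are hypothetical stand-ins for the optical device and second sensing device interfaces, and the default threshold is merely one value within the disclosed range.

```python
import time

def project_with_timeout(start_projection, stop_projection,
                         poll_response, preset_threshold_s=10.0,
                         poll_interval_s=0.1):
    """Project the preset image, then stop projection if the working
    time exceeds the preset threshold with no response detected."""
    start_projection()
    started = time.monotonic()  # count the working time of the optical device
    while time.monotonic() - started < preset_threshold_s:
        response = poll_response()
        if response is not None:
            return response      # a user response arrived; hand it to step (3)
        time.sleep(poll_interval_s)
    stop_projection()            # working time exceeded and no response detected
    return None
```

Polling against a monotonic clock (rather than wall-clock time) avoids miscounting the working time if the system clock is adjusted.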
Any embodiment of the above control method can be carried out by the control apparatus as described in the first aspect.
BRIEF DESCRIPTION OF THE DRAWINGS
In the following, in order to provide a clearer description of the various embodiments of the inventions provided in the present disclosure, drawings for some embodiments of the present disclosure are briefly provided. These drawings are to be construed to cover only some, but not all, of the embodiments of the inventions disclosed herein, and people of ordinary skill in the art can, based on the drawings provided herein, readily obtain other drawings without the need to take any inventive efforts.
FIG. 1A and FIG. 1B respectively show a structural diagram of a control apparatus and a control method utilizing the control apparatus provided by some embodiments of the disclosure;
FIG. 2 illustrates a vehicle gate control apparatus and control method based on the control apparatus and control method as illustrated in FIGS. 1A and 1B;
FIGS. 3A and 3B respectively show a specific application scenario of a vehicle gate control apparatus and a working flow of a vehicle gate control method utilizing the vehicle gate control apparatus;
FIG. 4 shows a block diagram of a controller according to certain embodiments of the disclosure; and
FIG. 5 shows a structural diagram of a controller provided by some embodiments of the disclosure.
DETAILED DESCRIPTION OF THE INVENTION
In the following, with reference to the drawings that accompany this disclosure, the technical solutions provided in the various embodiments of the invention are described in greater detail. It should be noted that the embodiments provided in the disclosure represent only part, but not all, of the embodiments that the present disclosure covers, and thus shall not be considered to impose any limitation on the protection scope of the disclosure. Based on the embodiments provided herein, other embodiments with slight variations in design that follow the gist of the invention disclosed herein, and that can be easily obtained by people of ordinary skill in the art without involving any creative work, shall be considered to be covered by the scope of the disclosure.
In certain aspects, the present disclosure provides a control apparatus and a control method. The control method is substantially carried out by means of the control apparatus.
FIG. 1A shows a structural diagram of a control apparatus provided by some embodiments of the disclosure. As shown in the figure, the control apparatus 001 comprises a controller 100, a  first sensing device 200, an optical device 300, and a second sensing device 400. Each of the first sensing device 200, the optical device 300 and the second sensing device 400 is communicatively connected with the controller 100.
The first sensing device 200 is substantially an object detector, configured to detect whether there is a user U in a proximity of a target object T, and then to send object detection data to the controller 100.
The controller 100 is configured, based on the object detection data from the first sensing device 200, to determine whether the user U is within a preset distance to the target object T, and is further configured, if it determines that the user U is within the preset distance, to send a first control command (i.e. “1st Control Command” in FIG. 1A) to the optical device 300.
The optical device 300 is configured, upon receiving the first control command, to project a preset image onto a preset area of a surface for presentation to the user U.
The user U may, under a prompt by the image projected by the optical device 300, exhibit a response thereto.
The second sensing device 400 is configured to detect the response of the user U to the projected image, and then to transmit user response data to the controller 100.
The controller 100 is further configured, based on the user response data from the second sensing device 400, to determine whether the response from the user U meets a predetermined criterion, and is further configured, if so, to send a second control command (i.e. “2nd Control Command” in FIG. 1A) to the target object T for executing a task corresponding to the second control command.
Correspondingly, FIG. 1B illustrates a flow chart of a control method using the control apparatus 001 according to certain embodiments of the disclosure. As shown in FIG. 1B, the control method includes the following steps:
S10: Determining whether a user is within a preset distance to a target object;
S20: If so, controlling an optical device to project a preset image onto a preset area of a surface for presentation to the user;
S30: Detecting a response of the user to the preset image;
S40: Determining, based on the response of the user, whether a predetermined criterion is met;
S50: If so, controlling the target object to execute a corresponding task.
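Purely as an illustrative sketch, the five steps S10 through S50 can be expressed as a single control cycle. Every callback name below (project_image, detect_response, criterion_met, execute_task) is a hypothetical stand-in for the first sensing device, optical device, second sensing device, and target object interfaces, not part of the disclosure.

```python
def run_control_cycle(distance_to_user, preset_distance,
                      project_image, detect_response,
                      criterion_met, execute_task):
    """One pass through steps S10-S50 of the control method."""
    # S10: determine whether the user is within the preset distance
    if distance_to_user > preset_distance:
        return False
    # S20: if so, control the optical device to project the preset image
    project_image()
    # S30: detect the response of the user to the preset image
    response = detect_response()
    # S40: determine, based on the response, whether the criterion is met
    if not criterion_met(response):
        return False
    # S50: if so, control the target object to execute the corresponding task
    execute_task()
    return True
```

In the vehicle scenario described later, execute_task would correspond to opening the vehicle gate.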
As used throughout the disclosure, the term “user” generally refers to a person, but can also be expanded to refer to an animal, a robot, a machine, or anything that can respond to the image projected by the optical device.
In one such scenario, which also covers preferred embodiments of the present disclosure, the control apparatus 001 is arranged in a vehicle (e.g. a passenger car) and configured to control the vehicle to execute a vehicle-related task, such as opening a vehicle gate (e.g. a trunk lid or a trunk door). Under this specific scenario, the target object is a vehicle, the user can be a driver or a passenger approaching the vehicle and intending to open a vehicle gate of interest (e.g. trunk door or tailgate), and the task to be executed by the target object is to open the vehicle gate.
Thus, by means of the control apparatus 001 and the control method as described above, when a user approaches the trunk of a vehicle (e.g. a passenger car) to open the trunk door, an optical device (e.g. a projector) arranged on the vehicle can, upon the control apparatus detecting that the user is close enough to the trunk, project an image (e.g. an optical spot, or an optical pattern such as a light ring or a specific logo) onto an area of the ground right in front of the vehicle trunk (i.e. the “preset area of a surface”). Prompted by the optical pattern projected on the ground, the user can swipe his or her foot over the optical pattern. After detecting this user response, the control apparatus can control the trunk door of the vehicle to open. As such, the user can conveniently open the trunk door of a car without using his or her hands, which is especially useful when the user is holding many items (e.g. a big box) with both hands and intends to put them into the trunk.
A more detailed description regarding this above specific embodiment, as well as other related embodiments, of the control apparatus and the control method for controlling the vehicle gate to open will be provided in the following.
FIG. 2 illustrates a vehicle gate control apparatus and a vehicle gate control method, which are substantially based on the control apparatus and method as set forth above and illustrated in FIGS. 1A and 1B.
As shown in FIG. 2, the vehicle gate control apparatus 001A comprises a vehicle controller 110, an object detector 210, an optical device 310, and a user response detector 410, which correspond to the controller 100, the first sensing device 200, the optical device 300, and the second sensing device 400 of the control apparatus 001 as illustrated in FIG. 1A.
The various steps of the vehicle gate control method are also indicated in FIG. 2. Briefly, the object detector 210 detects whether a user U is approaching the vehicle gate T of interest (e.g. the trunk gate, see S100) , and then sends object detection data to the vehicle controller 110 (S200) . Then based on the object detection data, the vehicle controller 110 determines whether the user U is within a preset distance to the vehicle gate T, and if so, sends a first control command to the optical device 310 (S300) to thereby control the optical device 310 to project a preset image onto the ground (S400) . Prompted by the projected image, the user U may exhibit a response, which is then detected by the user response detector 410 (S500) , and the user response data is further sent by the  user response detector 410 to the vehicle controller 110 (S600) . Then based on the user response data, the vehicle controller 110 further determines whether the response from user U meets a predetermined criterion, and if so, controls the vehicle gate T to open (S700) .
As used herein, the term “vehicle” or “vehicles” generally refers to a transportation machine that transports people or cargo, which can include motor vehicles, such as passenger automobiles, cars, sports utility vehicles (SUV) , motorcycles, buses, trucks, or various commercial vehicles, and can also include hybrid vehicles, electric vehicles, plug-in hybrid electric vehicles, hydrogen-powered vehicles, and other alternative fuel vehicles (e.g., fuels derived from resources other than petroleum) . In a broader scope that is also covered by the present disclosure, the term can also include other types of transportation tools such as wagons, bicycles, railed vehicles (e.g. trains, trams, etc. ) , watercrafts (e.g. ship or boats) , aircrafts, and spacecrafts, etc.
As used herein, the term “gate” is considered to be equivalent to “door”, “lid” or the like, and refers to a movable portion of a vehicle that, if opened, provides access to, and if shut, closes an opening of, a certain compartment, such as a cargo trunk (or “trunk” for short hereinafter), an interior driver/passenger space, or an engine compartment, of a vehicle. In the embodiment shown in FIG. 2, the vehicle gate is a trunk gate arranged at the rear of a passenger car, yet it is to be noted that the “vehicle gate” can alternatively be a power door to an interior driver/passenger space of the vehicle, or a lid/gate to other compartments (e.g. the engine compartment) of the vehicle. Such a vehicle gate can be a hinged gate or a sliding gate, but can also be a gate of other types.
As used herein, the “user” can be a driver or a passenger intending to ride in a vehicle, or can be a guest or a third-party person not intending to ride in the vehicle.
As used herein, the “preset distance” to the vehicle gate can be set in advance by technicians according to practical needs. For example, in a typical passenger car, only when the user approaches within about 1 m (meter) to the vehicle gate, shall the user be considered to have the need to open the vehicle gate. As such, according to certain embodiments, the preset distance is set as a range with a radius of 1.0 meter. Yet optionally, the preset distance can be set as 0.5 m, 1.5 m, 2.0 m, or any other distances. In the present disclosure, the preset distance can be typically set to have a range of 0.1 -5 m, preferably of 0.5-2 m, and more preferably of approximately 1 m.
In both the general control apparatus as illustrated in FIG. 1A and the specific embodiments of the vehicle gate control apparatus as illustrated in FIG. 2, there can be different embodiments for the first sensing device 200 or for the object detector 210, depending on the specific manners or working mechanisms. For brevity, only a description for the object detector 210 in the vehicle gate control apparatus as illustrated in FIG. 2 is provided, but it is to be noted that such a description shall also be applied, without any limitation, to the first sensing device 200 in the  general control apparatus as illustrated in FIG. 1A.
In certain embodiments, the object detector 210 comprises an ultrasonic sensing device, which can periodically transmit ultrasonic signals. When a user is approaching the vehicle gate, an echo signal reflected by the user can be received by the ultrasonic sensing device. Then, based on a strength of, and a time period of receiving, the echo signal, the distance of the user to the vehicle gate can be measured or estimated according to a certain predetermined relationship therebetween. Similarly, in other embodiments, the object detector 210 comprises a radar sensing device (i.e. radar sensor), such as a millimeter wave (mmWave) radar sensor, which can be used for relatively short-range (e.g. approximately 30 m) or long-range (e.g. approximately 200 m) object detection, depending on practical needs.
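For the echo-time side of this estimation, a minimal sketch of the standard time-of-flight relationship is given below: the ultrasonic pulse travels to the user and back, so the distance is half the round-trip path at the speed of sound. The preset-distance default is only an example value within the disclosed range; strength-based estimation would use a separate, device-specific calibration not shown here.

```python
SPEED_OF_SOUND_M_PER_S = 343.0  # speed of sound in air at roughly 20 degrees C

def distance_from_echo_time(echo_round_trip_s):
    """Estimate the user's distance from the echo's round-trip time.
    The pulse covers the distance twice (out and back), so halve it."""
    return SPEED_OF_SOUND_M_PER_S * echo_round_trip_s / 2.0

def user_within_preset_distance(echo_round_trip_s, preset_distance_m=1.0):
    """True when the estimated distance is within the preset distance."""
    return distance_from_echo_time(echo_round_trip_s) <= preset_distance_m
```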
In certain embodiments, the object detector 210 relies on images that are taken thereby to measure or estimate the distance of the user to the vehicle gate. In one such embodiment, for example, the object detector 210 comprises a camera (e.g. a video camera). The camera can periodically take pictures of a scene near the vehicle gate. When a user is approaching the vehicle gate, a series of pictures of the user can be periodically taken by the camera. Then, based on each of the series of pictures, the distance of the user to the vehicle gate can be measured or estimated. This can be realized, for example, by feature recognition and by measuring the size of the user, or a part thereof (e.g. the head, or certain feature points on the body), in each picture; then, based on a predetermined size-distance relationship, the distance of the user to the vehicle gate can be calculated or estimated. Besides the above approach, other distance evaluation approaches are also possible. Furthermore, the series of pictures may optionally include images that contain certain features of the user, such as facial features, gestures, and/or motions, and through feature comparison and deep learning, the identity of the user may be further determined for other purposes, such as authentication.
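One possible form of the size-distance relationship mentioned above is the standard pinhole-camera approximation, sketched below. The focal length (in pixels) and the assumed real-world height of the measured feature are hypothetical calibration values, not values given in the disclosure.

```python
def distance_from_pixel_size(known_height_m, focal_length_px, pixel_height):
    """Pinhole-camera estimate: a feature of known real-world height
    that spans `pixel_height` pixels lies at roughly f * H / h."""
    return focal_length_px * known_height_m / pixel_height
```

As the user approaches, the feature grows in the image, so the estimated distance shrinks; comparing estimates across the series of pictures also yields the direction of motion.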
In certain embodiments, the object detector 210 relies on wireless signals that are captured thereby to measure or estimate the distance of the user to the vehicle gate, and thus the object detector 210 may optionally comprise a wireless-signal detector, configured to detect wireless signals transmitted from a carry-on device carried by the approaching user. Herein the “wireless signals” can be radio frequency (RF) signals, WiFi signals, Bluetooth signals, 4G/LTE signals, 5G signals, etc., which can be transmitted by the carry-on device, which is substantially a wireless signal-emitting device, such as a vehicle key, a mobile phone, or a specialized wireless-signal transmitter. Such a wireless signal-transmitting device is typically carried by a vehicle user, and can thus be utilized to estimate the distance of the user when he or she is approaching the vehicle gate.
In one such specific embodiment, the object detector 210 comprises an RF detector, such as a wireless sensing module which is sometimes embedded in the vehicle controller 110, configured to sense or detect the RF signals transmitted from a vehicle key. When a user carrying such a vehicle key is approaching the vehicle gate, the wireless sensing module can estimate the distance of the user to the vehicle gate based on the strength of the RF signals. Optionally, the vehicle key may transmit other types of wireless signals (e.g. Bluetooth signals) , and a corresponding object detector may be configured to receive these types of wireless signals and work in a similar manner.
In another such embodiment, the object detector 210 comprises a wireless-signal detector configured to detect one or more of RF signals, Bluetooth signals, WiFi signals, 4G/LTE signals, 5G signals, etc., which are transmitted from a mobile phone. When a user carrying a mobile phone is approaching the vehicle gate, the signal detector can estimate the distance of the user to the vehicle gate based on the strength of the wireless signals received.
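One common way to map received signal strength to distance, offered here only as an illustrative model, is the log-distance path-loss formula: every 10·n dB of additional loss corresponds to a tenfold increase in distance. The reference RSSI at 1 m and the path-loss exponent n are assumed calibration constants that would depend on the transmitter and environment.

```python
def distance_from_rssi(rssi_dbm, rssi_at_1m_dbm=-50.0, path_loss_exponent=2.0):
    """Log-distance path-loss estimate of distance in meters from a
    received signal strength in dBm (calibration values are assumed)."""
    return 10 ** ((rssi_at_1m_dbm - rssi_dbm) / (10.0 * path_loss_exponent))
```

In practice RSSI is noisy, so such an estimate is typically smoothed over several readings before being compared against the preset distance.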
In other embodiments, the object detector may comprise other types of sensing devices, such as an infrared sensing device, a pyroelectric sensor, a ToF (time of flight) camera, a VCSEL (vertical-cavity surface-emitting laser) sensor, etc., which may be utilized to measure or estimate the distance of the approaching user to the vehicle gate.
Herein, optionally there can be different manners for arranging the object detector 210 and the vehicle controller 110 in the vehicle.
According to some embodiments, the object detector 210 and the vehicle controller 110 are physically separated but are communicatively connected. The vehicle controller 110 can judge or determine whether a user appears in the preset distance to the vehicle gate according to the object detection signals transmitted from the object detector 210.
According to some other embodiments, the object detector 210 may be functionally and/or physically embedded in the vehicle controller 110. For example, the vehicle controller 110 itself may have a functionality for detecting a user. For example, a wireless sensing module configured in the vehicle controller 110 can substantially serve as an object detector 210 to sense or acquire information of the user.
Optionally, the object detector 210 may comprise any combination of the above different embodiments thereof. In one such embodiment, the object detector 210 of the vehicle gate control apparatus 001A may comprise a camera and an RF signal detector (e.g. a wireless sensing module embedded in the vehicle controller 110). The camera is configured to capture images of an approaching user, and the RF signal detector is configured to detect the RF signals transmitted from the vehicle key carried by the user. Such a combined use may realize better security, robustness (i.e., in case one fails, the other still works), higher effectiveness (e.g. a first approach serves as a trigger to activate a second approach), or combined functions of distance estimation and user authentication. To be more specific, if only a user is detected within the preset distance to the vehicle gate, the user could be someone who does not carry the vehicle key, and there is a risk of opening the vehicle gate by mistake. As such, in order to ensure better security, the apparatus can simultaneously determine whether an object is detected within the preset distance to the vehicle gate and whether an RF signal from the vehicle key is detected. Only if both are detected is the first control command sent to the optical device.
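A minimal sketch of this combined gating follows, assuming (purely for illustration) a distance estimate from the camera and a signal-strength threshold for the key's RF signal; both thresholds are hypothetical tuning parameters.

```python
def should_send_first_command(distance_m, preset_distance_m,
                              key_signal_strength_dbm,
                              key_threshold_dbm=-70.0):
    """Combined check: trigger the optical device only when an object
    is within the preset distance AND the vehicle key's RF signal is
    detected with sufficient strength (reducing false activations)."""
    near_enough = distance_m <= preset_distance_m
    key_present = key_signal_strength_dbm >= key_threshold_dbm
    return near_enough and key_present
```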
Herein, in S200 of the vehicle gate control method shown in FIG. 2, the object detection data obtained by the object detector 210 and transmitted to the vehicle controller 110 can be of any data type, such as an electrical signal, a data packet, or a binary code, and can have different embodiments.
According to some embodiments, the object detection data may comprise the original data (e.g. images, wireless signal strength, echo signal strength, echo reflection time, etc.) periodically captured by the object detector 210 when a user is approaching the vehicle gate. Such captured original data is thus periodically transmitted to the vehicle controller 110 which, by means of a calculation module in a processor, can calculate the distance, and further, by means of a determination module in the processor, can determine whether the user reaches within the preset distance to the vehicle gate. Thus in these embodiments, the object detector 210 does not process the object detection data, but instead transmits all of the object detection data to the vehicle controller 110 for processing and determination.
According to some other embodiments, the object detection data may only comprise a determination result. Herein, the object detector 210 itself can, in addition to periodically capturing distance-related data (e.g. images, wireless signal strength, echo signal strength, echo reflection time, etc.), also determine whether the captured data meets a certain criterion (e.g. the strength of the wireless signal is higher than a predetermined threshold, or the time period of the echo signal is shorter than a predetermined threshold), which is substantially equivalent to determining whether the approaching user is within the preset distance to the vehicle gate. Accordingly, the object detection data may be in a form of protocol data. For example, the object detection data may only comprise a binary code “1”, which indicates that the user is within the preset distance. As such, when a user is approaching the vehicle gate, the object detector 210 periodically captures related data of the user and determines whether the approaching user is within the preset distance to the vehicle gate, and only when it determines that the user is within the preset distance does it send the binary code “1” to the vehicle controller 110. Upon receiving the binary code “1”, the vehicle controller 110 accordingly determines that the user is within the preset distance, and can further send the first control command to the optical device 310 so as to control it to project the preset image onto the ground. Thus in these embodiments, the object detector 210 takes maximal effort to process the data detected regarding the user and make a determination based thereupon, and only transmits the determination result, which is substantially the object detection data, to the vehicle controller 110.
It is to be noted that, in addition to the above two different embodiments, there can be a middle-type of the object detection data, i.e., the object detector 210 partially processes the data detected regarding the user, and then transmits such processed data as object detection data to the vehicle controller 110, based on which the vehicle controller 110 further makes a determination whether the approaching user is within the preset distance to the vehicle gate.
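The determination-result form of the object detection data could be modeled as sketched below; the one-byte encoding is purely illustrative and not a protocol specified by the disclosure.

```python
def encode_detection_result(within_preset_distance):
    """Object-detector side: emit a single protocol byte, b'\x01' when
    the user is determined to be within the preset distance."""
    return b"\x01" if within_preset_distance else b"\x00"

def controller_should_send_first_command(packet):
    """Vehicle-controller side: treat the '1' code as the trigger to
    send the first control command to the optical device."""
    return packet == b"\x01"
```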
In the vehicle gate control apparatus and method shown in FIG. 2, the optical device 310 can be an illuminating device installed at an appropriate position on the vehicle and with an appropriate direction pointing to a preset area of the ground. Herein, there can be different embodiments for the optical device 310 and for the preset image projected on the ground thereby (S400) .
In certain embodiments, the optical device 310 comprises a projector (e.g. a laser projector, an LED projector, etc.). Herein, the image projected by the projector can be of any shape, any color, any form, or for any purpose. Optionally, the image may be a spot, a ring, an arrow, or a special pattern (such as a logo), or may comprise text. Further optionally, the image may be a monochrome image or a colored image. Further optionally, the image may be a still image, a blinking image, an alternating series of images, a dynamic image, or a video (such as a commercial advertisement).
Further optionally, the image may serve certain purposes. In one example, the image may comprise a prompt text such as “Please swipe your foot over or step onto the image”, which substantially prompts the user to perform a certain action or motion following the prompt in order to open the vehicle gate.
In another example, the image may comprise a prompt, such as “Please perform a motion as a pass to open the trunk door” , and the user may perform a motion that has been previously agreed upon (e.g. a counterclockwise movement of the foot) as a pass. Then the motion may be captured by a camera (i.e. the user response detector 410) , and the vehicle controller 110 can further determine whether the motion performed by the user matches with a pre-stored key, and if so, the vehicle controller 110 can then control the vehicle gate to open, and if not, the vehicle controller 110 may further control the optical device 310 to project another preset image as a prompt asking the user to try again. As such, the motion substantially serves as a level of authentication.
In certain other embodiments, the optical device 310 comprises a spotlight (e.g. regular spotlight, LED light, etc. ) configured to simply shed a light beam onto the ground, and the preset image thus formed may just comprise an optical spot formed on the ground by the light beam.
In S500 of the vehicle gate control method shown in FIG. 2, after observing the image projected by the optical device 310 on the ground, the user exhibits a response, which is detected by the user response detector 410. Further in S600, the user response detector 410 transmits the user response data to the vehicle controller 110. Then based on the user response data, the vehicle controller 110 further determines whether the response from user U meets a predetermined criterion, and if so, controls the vehicle gate T to open (S700) .
In the present disclosure, there can be various different embodiments for the response of the user to the projected image that can be detected, which may depend on the type of the image projected by the optical device 310.
In certain embodiments, the image projected by the optical device 310 on the ground may comprise a still image (e.g. an optical spot, an optical ring, a logo, and/or a prompt text such as “please swipe your foot over or step onto the image”). Prompted by the projected image, the user can thus perform an action or motion specified by the image, such as swiping his/her foot over, or stepping onto, the image. The user response detector 410 may then comprise an obstruction detection device (i.e. obstruction detector) which, alongside the vehicle controller 110, is configured to detect whether there is any obstruction between the image on the ground that is projected by the optical device 310 and the vehicle gate. Herein, whether such an obstruction occurs substantially constitutes the predetermined criterion for controlling the vehicle gate to open.
In one embodiment, the obstruction detector comprises a camera. When a shadow appears in the area of the projected image, which is caused by the swiping or stepping of the user’s foot over or onto the image, the camera (i.e. the user response detector 410) can take one or more pictures (substantially the user response data) , which are then transmitted to the vehicle controller 110 for analysis and determination. Herein, more specifically, the vehicle controller 110 can, based on the one or more pictures received, detect whether there is any pixel change in the projected image to thereby determine whether there is an obstruction between the projected image and the vehicle gate, or more specifically whether an obstruction occurs in an optical path for the projected image.
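The pixel-change determination described above might be sketched as a simple frame-differencing check: a shadow cast by the user's foot appears as a large fraction of strongly changed pixels in the projected-image area. The thresholds are assumed tuning parameters, and a real implementation would operate on image arrays (e.g. via OpenCV) rather than flat lists of grayscale values.

```python
def obstruction_detected(reference_frame, current_frame,
                         diff_threshold=30, changed_fraction=0.05):
    """Compare the current camera frame of the projected-image area
    against an unobstructed reference frame; report an obstruction
    when enough pixels change strongly (e.g. a foot's shadow).
    Frames here are equal-length lists of grayscale pixel values."""
    changed = sum(1 for a, b in zip(reference_frame, current_frame)
                  if abs(a - b) > diff_threshold)
    return changed / len(reference_frame) >= changed_fraction
```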
In other embodiments, the obstruction detector may comprise an infrared sensing device or a TOF detection device. When a blockage of the infrared beams or a fluctuation of the TOF detection signal in the area of the projected preset image is detected, an obstruction is considered to occur.
In yet other embodiments, the obstruction detector may alternatively comprise a radar sensor (e.g. an mmWave radar sensor, a short-range radar sensor, or an obstacle detection radar) , or a noncontact capacitive sensor.
In other embodiments, the response of the user to the projected image may be an action (e.g. a motion, a gesture, a facial expression, a voice, etc.) performed or exhibited by the user upon seeing the image projected by the optical device. The user response detector 410 may correspondingly comprise a user action detecting device or user action detector (e.g. an imaging device such as a camera, or a voice recording device such as a microphone) that can detect the response performed by the user. Then, after receiving the user response data sent from the user response detector 410, the vehicle controller 110 determines whether a predetermined criterion is met, and if so, further controls the vehicle gate to open.
In one example, the projected image may prompt the user to make a motion (e.g. moving a leg, or moving the foot counterclockwise), and a motion detector (e.g. a camera) may serve as the user response detector 410 to detect whether a movement of the user’s leg occurs, or whether the motion performed by the user can be recognized (i.e. the predetermined criterion). If so, the vehicle controller 110 can further control the vehicle gate to open.
In another example, the projected image may prompt the user to direct his/her face toward a certain part of the vehicle gate (e.g. the camera installed on the trunk door), and the camera may serve as the user response detector 410 to capture the user’s face. Then the vehicle controller 110 determines whether the user’s face is detected, or even whether the user’s face can be recognized (i.e. the predetermined criterion), and if so, further controls the vehicle gate to open.
In yet another example, the projected image may prompt the user to talk or speak certain words, and a microphone installed in the vehicle may serve as the user response detector 410 to capture the user’s voice. Then the vehicle controller 110 determines whether the user’s voice is detected or even whether the user’s voice can be recognized (i.e. the predetermined criterion) , and if so, further controls the vehicle gate to open.
In any of the embodiments above, the predetermined criterion applied by the vehicle controller 110 may vary depending on specific needs. In one example, the predetermined criterion may be a simple one (i.e. "yes" vs. "no", depending on whether a user action is detected) without any feature recognition functionality. That is, the vehicle controller 110 may determine that the predetermined criterion is met once an action of the user, such as a motion, a voice, a face, or a gesture, is detected. In another example, the predetermined criterion applied by the vehicle controller 110 may be more sophisticated, involving feature recognition. That is, after a motion, a voice, a face, a gesture, etc. of the user is detected by the user response detector 410, the vehicle controller 110 further carries out a feature recognition (e.g. a facial recognition, a motion/gesture recognition, or a voice recognition), and determines that the predetermined criterion is met only after the result of the feature recognition indicates that the user's feature matches a pre-stored record (e.g. the identity of the user can be recognized). These latter embodiments thus substantially amount to a user authentication before opening the vehicle gate.
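The two styles of criterion described above can be sketched as plain Python functions. This is an illustrative sketch under assumed interfaces, not the patented implementation; the names (`criterion_met_simple`, `matcher`, `stored_templates`) are assumptions for illustration only.

```python
def criterion_met_simple(action_detected: bool) -> bool:
    # Simple "yes vs. no" criterion: any detected user action
    # (motion, voice, face, or gesture) suffices.
    return action_detected

def criterion_met_with_recognition(features, stored_templates, matcher) -> bool:
    # Stricter criterion involving feature recognition: the detected
    # features must match at least one pre-stored record (e.g. a face,
    # voice, or gesture template), effectively authenticating the user.
    return any(matcher(features, template) for template in stored_templates)
```

In practice `matcher` would wrap a facial-, motion-, or voice-recognition model; here it can be any callable returning a boolean.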
In any of the above embodiments, the user response data may optionally be sent periodically from the user response detector 410 to the vehicle controller 110. In this case, the user response data may comprise raw data recording the user response captured by the user response detector 410, such as image data captured by a camera or voice data captured by a microphone. The vehicle controller 110 is equipped with the functionality to analyze the raw data and determine whether the predetermined criterion is met. Such an analysis and determination by the vehicle controller 110 may be simple (e.g. a yes-vs-no determination) or sophisticated (e.g. feature recognition).
Alternatively, the user response detector 410 itself may have the capability of analyzing the raw response data and making a determination without resorting to the vehicle controller 110. In this case, the user response data transmitted from the user response detector 410 to the vehicle controller 110 may only include the determination result, which can be in the form of protocol data. For example, a code "1" means that the predetermined criterion is met, whereas a code "0" means that it is not.
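The two reporting styles can be sketched as follows. This is an illustrative sketch, not the patented protocol; the message field names (`"type"`, `"payload"`) and function names are assumptions.

```python
def encode_result(criterion_met: bool) -> str:
    # Detector-side encoding of the determination result as protocol data.
    return "1" if criterion_met else "0"

def controller_handle(message: dict, analyze) -> bool:
    # Controller-side handling of user response data: either trust the
    # detector's pre-computed result code, or analyze the raw data
    # (e.g. camera image or microphone recording) itself.
    if message["type"] == "result":
        return message["payload"] == "1"
    return analyze(message["payload"])
```

Here `analyze` stands in for whatever raw-data analysis (simple detection or feature recognition) the vehicle controller performs.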
In certain embodiments, when the optical device 310 receives the first control command from the vehicle controller 110, the optical device 310 can also send time stamp information to the vehicle controller 110, the time stamp information recording the time when the optical device 310 starts to project the preset image. After receiving the time stamp information, the vehicle controller 110 can count a working time of the optical device 310 based on the time stamp information.
If the working time of the optical device 310 is longer than a preset threshold (e.g. approximately 2 seconds to 30 minutes, preferably 5 seconds to 5 minutes, and further preferably approximately 10 seconds to 1 minute) while the user response detector 410 receives no response from the user, the vehicle controller 110 can further send a stop command to the optical device 310 to control it to stop projection. When a user approaches a vehicle gate, it does not necessarily mean that the user wants to open the vehicle gate. In such a situation, after the optical device 310 projects the preset image, responses from the user may not be detected for a period of time. By counting the working time of the optical device 310, the vehicle controller 110 can send a stop-projection command to the optical device 310 to stop the projection. This feature can save power and extend the working life of the optical device 310.
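The timeout behavior above can be sketched minimally: the controller derives the working time from the time stamp sent when projection starts, and stops projection once that time exceeds the threshold with no user response. The 60-second default and function name are illustrative assumptions.

```python
def should_stop_projection(start_time_s: float, now_s: float,
                           response_received: bool,
                           threshold_s: float = 60.0) -> bool:
    # Working time counted from the time stamp the optical device sent
    # when it started projecting the preset image.
    working_time = now_s - start_time_s
    # Stop only if the threshold is exceeded AND no user response arrived.
    return working_time > threshold_s and not response_received
```

The controller would evaluate this on each polling cycle and, when it returns true, transmit the stop-projection command to the optical device.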
There can be various manners by which the vehicle controller 110 controls the vehicle gate to open, depending on the mechanism by which the vehicle gate is opened.
In certain embodiments, the vehicle gate is opened by means of a hydraulic rod that is operably connected with, and driven by, a driving motor. As such, after the vehicle controller 110 sends a second control command to the driving motor, the driving motor can drive the hydraulic rod to move in a certain direction, thereby opening the vehicle gate.
In another embodiment, the vehicle gate is locked by means of a latch. When the vehicle controller 110 sends a second control command to an actuator (e.g. a motor, or an electromagnetic device) , the actuator can then release the latch, and a spring then pushes or pulls the vehicle gate to open.
In certain embodiments, the vehicle controller 110 can be further configured to record the time period during which the vehicle gate remains open. If the time period is longer than a preset threshold, the vehicle controller can control the vehicle gate to close. In certain embodiments where the vehicle gate is opened or closed by the hydraulic rod, this can be realized by means of the driving motor which, upon receiving a reverse driving signal from the vehicle controller, drives the hydraulic rod to move in the opposite direction, thereby closing the vehicle gate.
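The auto-close behavior can be sketched as a simple timer check; the signal names (`"REVERSE_DRIVE"`, `"HOLD_OPEN"`) and function name are assumptions for illustration, not the patented signaling.

```python
def gate_drive_signal(opened_at_s: float, now_s: float,
                      close_after_s: float) -> str:
    # Once the gate has been open longer than the preset threshold,
    # issue the reverse driving signal so the motor drives the
    # hydraulic rod in the opposite direction and closes the gate.
    if now_s - opened_at_s > close_after_s:
        return "REVERSE_DRIVE"
    return "HOLD_OPEN"
```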
The vehicle gate control apparatus and method disclosed herein can be used to conveniently realize a hands-free opening of a vehicle gate. An object detector 210 and a vehicle controller 110 can detect and determine whether there is a user approaching and appearing within the preset distance to the vehicle gate. If so, the vehicle controller 110 controls the optical device 310 to project a preset image onto the ground. A user response detector 410 can record the user’s response. Based on the user response received from the user response detector 410, the vehicle controller 110 further determines whether a predetermined criterion is met, and if so, controls the vehicle gate to open.
For a better understanding of the vehicle gate control apparatus and method as described above, a specific application scenario is provided in the following, which is illustrated in FIG. 3A and FIG. 3B.
As shown in FIG. 3A, upon a user U entering a sensing range A of an ultrasonic sensing device 211 (i.e. a type of the object detector 210 shown in FIG. 2), an optical device 311 is triggered to project light beams B to form a projected image C on the ground. After the user observes the projected image on the ground, he or she can perform a motion (e.g. swiping a foot or stepping) over or onto the projected image C. After a TOF detection device (i.e. an obstruction detector, which is one type of the user response detector 410 shown in FIG. 2, not shown in FIG. 3A) detects the motion, the vehicle controller (i.e. the vehicle controller 110 shown in FIG. 2, not shown in FIG. 3A) controls the trunk door T to open.
A more detailed description of the vehicle gate control method utilizing the vehicle gate control apparatus illustrated in FIG. 3A is further illustrated in FIG. 3B. As shown in FIG. 3B, an object detector 211 (e.g. the ultrasonic sensing device in FIG. 3A) constantly and periodically detects whether there is an object (e.g. a user) approaching and appearing within a preset distance to the trunk door T, and the object detection data is sent to the vehicle controller 111. If the vehicle controller 111 determines that a user appears within the preset distance, it sends a first control command to the optical device 311 to project a preset image onto the ground. The obstruction detector 411 (e.g. the TOF detection device in FIG. 3A) constantly and periodically detects whether an obstruction occurs between the projected image and the vehicle gate, and sends the obstruction detection data (i.e. a type of user response data shown in FIG. 1A) to the vehicle controller 111. If the vehicle controller 111 determines that there is an obstruction (e.g. the user steps onto the projected image, as illustrated in FIG. 3A), it sends a second control command to the trunk door T to open it. Because the vehicle controller 111 can timely determine the presence of an obstruction between the projected image and the vehicle gate, it has an improved responsiveness in opening the vehicle gate.
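One polling cycle of the FIG. 3B flow can be sketched with stand-in device objects. The interfaces (`project()`, `open()`) and the 0.5 m preset distance are illustrative assumptions, not the patented interfaces.

```python
class OpticalDevice:
    def __init__(self):
        self.projecting = False

    def project(self):  # responds to the first control command
        self.projecting = True

class TrunkDoor:
    def __init__(self):
        self.is_open = False

    def open(self):  # responds to the second control command
        self.is_open = True

def control_step(distance_m, obstruction_detected, optical, trunk,
                 preset_distance_m=0.5):
    """One periodic detection cycle: project the preset image if the user
    is within the preset distance; open the trunk if an obstruction
    (e.g. the user's foot over the image) is detected by the TOF device."""
    if distance_m is not None and distance_m <= preset_distance_m:
        optical.project()            # first control command
        if obstruction_detected:
            trunk.open()             # second control command
```

A user approaching (first cycle) triggers projection only; stepping onto the image (a later cycle) then opens the trunk door.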
In the following, a more detailed description is provided of a controller in the controlling apparatus 001 as described above and illustrated in FIG. 1A and 1B.
FIG. 4 shows a block diagram of a controller with reference to the controlling apparatus 001. As shown in the figure, the controller comprises a receiving module 401, a determination module 402, a control module 403, and a transmission module 404, and optionally may further comprise a feature recognition module 405.
The receiving module 401 is configured to receive detection data from one or more detecting devices in the controlling apparatus 001, including receiving object detection data from a first sensing device 200, and receiving user response data from a second sensing device 400. Optionally, the receiving module 401 comprises a first receiving sub-module 4011 and a second receiving sub-module 4012, which are respectively configured to receive the object detection data from the first sensing device 200 and to receive the user response data from the second sensing device 400.
The determination module 402 is configured to make a determination based on the one or more detection data received from the one or more detecting devices, including making a determination whether a user appears within the preset distance to the target object based on the object detection data, and making a determination whether a response from the user meets a predetermined criterion based on the user response data. Optionally, the determination module 402 comprises a first determination sub-module 4021 and a second determination sub-module 4022, which are configured to make the above two determinations based on the object detection data and the user response data, respectively.
The control module 403 is configured to generate a first control command configured to control an optical device for projecting a preset image if the determination module 402 determines that a user appears within the preset distance to the target object, and to generate a second control command configured to control the target object to execute a corresponding task if the determination module 402 determines that the response from the user meets a predetermined criterion. Optionally, the control module 403 comprises a first control sub-module 4031 and a second control sub-module 4032, which are configured to generate the first control command and the second control command, respectively.
The transmission module 404 is configured to transmit the first control command to the optical device, and to transmit the second control command to the target object. Optionally, the transmission module 404 comprises a first transmission sub-module 4041 and a second transmission sub-module 4042, which are configured to transmit the first control command and the second control command, respectively.
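The cooperation of the receiving, determination, control, and transmission modules above can be sketched as one Python class. The wiring, method names, and data shapes are assumptions for illustration; they mirror the FIG. 4 decomposition but are not the patented implementation.

```python
class Controller:
    def __init__(self, preset_distance_m, criterion):
        self.preset_distance_m = preset_distance_m
        self.criterion = criterion   # callable implementing the predetermined criterion
        self.sent = []               # records what the transmission module sends out

    def receive(self, source, data):         # receiving module 401
        if source == "first_sensing" and self._user_near(data):
            self._transmit("optical_device", "FIRST_CONTROL_COMMAND")
        elif source == "second_sensing" and self._response_ok(data):
            self._transmit("target_object", "SECOND_CONTROL_COMMAND")

    def _user_near(self, distance_m):        # first determination sub-module 4021
        return distance_m <= self.preset_distance_m

    def _response_ok(self, response):        # second determination sub-module 4022
        return self.criterion(response)

    def _transmit(self, dest, command):      # transmission module 404
        self.sent.append((dest, command))
```

Object detection data drives the first control command (project the image); user response data meeting the criterion drives the second control command (execute the task).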
Optionally, the controller further comprises a feature recognition module 405, configured to perform feature recognition based on the user response data. Herein, the feature recognition module 405 may include any of a voice recognition, a motion recognition, or a facial recognition, etc. that corresponds to the user response data, which may additionally facilitate a user authentication process.
In certain embodiments, in order to save power and extend the working life of the optical device, the controller is further configured to count the working time of the optical device, such that if the working time of the optical device is longer than a preset time period (e.g. 1 minute) while no response from the user is received, the controller controls the optical device to stop projection. To this end, the receiving module 401 is configured to receive time stamp information from the optical device to record the moment when the optical device starts projection. The controller further comprises a counting module 406, configured to count a working time of the optical device based on the time stamp information. The determination module 402 is further configured to determine whether the working time of the optical device is longer than a preset threshold (e.g. 1 minute). The control module 403 is further configured to generate a stop command configured to control the optical device to stop projection if the working time of the optical device is longer than the preset threshold and no response from the user is received. The transmission module 404 is further configured to transmit the stop command generated by the control module 403 to the optical device.
In certain embodiments, the controller as described above can be customized as a vehicle controller for the vehicle gate control apparatus described above; the detailed description of each such customized functional module can reference the vehicle gate control apparatus and method described above, and is not repeated herein.
As used herein, each of the terms "module," "sub-module," and the like refers to a computer-implemented functional entity, which can include both hardware components (e.g. processor(s) or memory) and software components. The combined working of certain hardware components and software components allows a prescribed functionality corresponding to a certain functional module to be carried out in the controller.
FIG. 5 shows a structural diagram of a controller provided by some embodiments of the disclosure. As shown, the controller comprises a storage 501 and a processor 502. The storage 501 is configured to store a computer program comprising executable instructions that, when executed by a processor, carry out one or more steps of the controlling method, such as the vehicle gate control method, as provided in the disclosure. The processor 502 is configured to execute the computer program stored in the storage 501.
Herein, examples of the storage 501 can include a random access memory (RAM) and/or a non-volatile memory (NVM, e.g. a disc storage) . Optionally, the storage 501 can be remote from the processor 502.
The processor 502 can be a general-purpose processor, such as a central processing unit (CPU), a network processor (NP), a digital signal processor (DSP), an application specific integrated circuit (ASIC), a programmable logic device (e.g. a field programmable gate array (FPGA)), a discrete gate or transistor logic device, or a discrete hardware component, etc.
In a certain aspect, the present disclosure further provides a computer-readable and non-volatile storage medium. The storage medium is configured to store computer-executable instructions which, when executed by a processor, cause the processor to execute the various steps of the control method according to any of the embodiments described above.
Herein, the computer-readable and non-volatile storage medium can be a portable hard disk (HDD), a flash drive, a solid-state disk (SSD), an optical disc (e.g. a CD or DVD), or a magnetic tape, etc.
It is to be noted, however, that the control apparatus and the control method as provided herein are not limited to the application in controlling the vehicle gate to open, and can be applied to many other different application scenarios as well.
In one such scenario that is similar to the above, the target object is a gate of a building (not a vehicle), and the object can be someone intending to get into the building through the gate. By means of the control apparatus, a user can conveniently open the gate and gain access into the building without using his or her hands.
It should be noted that throughout the disclosure, relational terms such as "first," "second," and the like are only meant to distinguish one entity or operation from another, and do not necessarily require or imply any such actual relationship or order between these entities or operations. As used herein, the terms "comprise," "include," "contain," and the like are synonymous and are used inclusively, in an open-ended fashion, and do not exclude additional elements, features, steps, acts, operations, and so forth.
As used herein, the terms "about," "approximately," "around," or the like refer to a quantity, level, value, number, frequency, percentage, dimension, size, amount, weight or length that varies by as much as 30, 25, 20, 15, 10, 9, 8, 7, 6, 5, 4, 3, 2 or 1% from a reference quantity, level, value, number, frequency, percentage, dimension, size, amount, weight or length. In particular embodiments, the term "about" or "approximately" when preceding a numerical value indicates the value plus or minus a range of 15%, 10%, 5%, or 1%.
Throughout the disclosure, all of the embodiments are described in a related manner. Descriptions of the same or similar parts, such as components, members, steps, or processes, of each embodiment can reference one another. Each embodiment focuses on its differences from other embodiments. In particular, for the device, system, vehicle controller, and machine-readable storage medium, because they are substantially similar to the method, their description is relatively simple; for the relevant parts, please refer to the relevant description of the method.
All of the embodiments as provided and described above are construed to only represent relatively preferred embodiments of the inventions disclosed herein, and are not intended to limit the scope of the present disclosure. Any modifications, equivalent replacements, improvements, etc., as long as they are made under the spirit and within the principle of the inventions disclosed herein, shall be deemed included in the scope of protection of the present disclosure.

Claims (35)

  1. A control apparatus, comprising:
    a controller; and
    a first sensing device, an optical device, and a second sensing device, each communicatively connected with the controller;
    wherein:
    the first sensing device is configured to detect whether a user is in a proximity of a target object, and if so, to send object detection data of the user to the controller;
    the controller is configured to determine whether the user is within a preset distance to the target object based on the object detection data, and if so, to send a first control command to the optical device;
    the optical device is configured, upon receiving the first control command, to project a preset image onto a preset area of a surface for presentation to the user;
    the second sensing device is configured to detect a response of the user to the preset image, and then to send user response data to the controller;
    the controller is further configured to determine whether the response from the user meets a predetermined criterion based on the user response data, and if so, to send a second control command to the target object for executing a task.
  2. The control apparatus of claim 1, wherein the target object is a gate, and the task is to open the gate.
  3. The control apparatus of claim 2, wherein the target object is a vehicle gate, and the preset area of the surface is an area of ground in a proximity of the vehicle gate.
  4. The control apparatus of claim 3, wherein the vehicle gate is at least one of a trunk door or a power door.
  5. The control apparatus of any of claims 1-4, wherein the second sensing device comprises an obstruction detector configured to detect whether an obstruction occurs between the target object and the image projected on the surface by the optical device as a result of the response of the user to the image.
  6. The control apparatus of claim 5, wherein the obstruction detector comprises at least one of a  camera, a radar sensor, a noncontact capacitive sensor, an infrared sensing device, or a TOF detection device.
  7. The control apparatus of claim 6, wherein the obstruction detector comprises a TOF detection device.
  8. The control apparatus of any one of claims 1-4, wherein the second sensing device comprises a user action detector configured to detect an action of the user in response to the image, wherein the user action detector comprises at least one of:
    a camera, wherein the response detected thereby comprises at least one of a motion, a gesture, or a facial expression, of the user; or
    a microphone, wherein the response detected thereby comprises a voice of the user.
  9. The control apparatus of claim 8, wherein the predetermined criterion comprises detection of any action performed by the user, wherein the any action comprises at least one of a motion, a gesture, a facial expression, or a voice.
  10. The control apparatus of claim 8, wherein the controller is further configured to perform a feature recognition based on the user response data, and the predetermined criterion comprises a substantial match between a result of the feature recognition and a pre-stored record.
  11. The control apparatus of any one of claims 1-10, wherein the first sensing device comprises at least one of an ultrasonic sensing device, a radar sensor, a camera, a wireless-signal detector, an infrared sensing device, a pyroelectric sensor, a ToF (time of flight) camera, or a VCSEL (vertical-cavity surface-emitting laser) sensor.
  12. The control apparatus of claim 11, wherein the first sensing device comprises a wireless-signal detector, configured to detect wireless signals transmitted from a carry-on device carried by the user, wherein the wireless signals comprise at least one of radio frequency (RF) signals, WiFi signals, Bluetooth signals, 4G signals, or 5G signals.
  13. The control apparatus of claim 12, wherein the carry-on device carried by the user comprises at least one of a vehicle key, a mobile phone, or a wireless-signal transmitter.
  14. The control apparatus of any one of claims 1-13, wherein the preset distance is in a range of  approximately 0.01-10 meters, and preferably in a range of approximately 0.1-5 meters, and further preferably in a range of approximately 0.2-1 meters.
  15. The control apparatus of any one of claims 1-14, wherein the first sensing device is substantially a functional module embedded in the controller.
  16. The control apparatus of any one of claims 1-15, wherein:
    the optical device is further configured, upon starting to project the preset image, to send time stamp information to the controller; and
    the controller is further configured to count a working time of the optical device based on the time stamp information, to determine whether the working time is longer than a preset threshold, and if so and if no response from the user is received, to further send a stop command to the optical device to stop projection.
  17. The control apparatus of claim 16, wherein the preset threshold is in a range of approximately 2 seconds to 30 minutes, and preferably in a range of approximately 5 seconds to 5 minutes, and further preferably in a range of approximately 10 seconds to 1 minute.
  18. A control method, comprising:
    determining whether a user is within a preset distance to a target object;
    if so, controlling an optical device to project a preset image onto a preset area of a surface for presentation to the user;
    detecting a response of the user to the preset image;
    determining, based on the response, whether a predetermined criterion is met; and
    if so, controlling the target object to execute a task.
  19. The control method of claim 18, wherein the target object is a gate, and the task is to open the gate.
  20. The control method of claim 19, wherein the target object is a vehicle gate, and the preset area of the surface is an area of ground in a proximity of the vehicle gate.
  21. The control method of any one of claims 18-20, wherein the determining whether a user is within a preset distance to a target object comprises:
    acquiring object detection data of the user; and
    determining whether the user is within the preset distance to the target object based on the object detection data.
  22. The control method of claim 21, wherein the acquiring object detection data of the user is by means of a first sensing device, the first sensing device comprising at least one of an ultrasonic sensing device, a radar sensor, a camera, a wireless-signal detector, an infrared sensing device, a pyroelectric sensor, a ToF (time of flight) camera, or a VCSEL (vertical-cavity surface-emitting laser) sensor.
  23. The control method of claim 22, wherein the first sensing device comprises an ultrasonic sensing device, and in the determining whether the user is within the preset distance to the target object based on the object detection data, a distance of the user to the target object is estimated by at least one of a strength of, or a time period of receiving, an echo signal received by the ultrasonic sensing device.
  24. The control method of claim 22, wherein the first sensing device comprises a camera, and in the determining whether the user is within the preset distance to the target object based on the object detection data, a distance of the user to the target object is estimated by analyzing images of the user.
  25. The control method of claim 22, wherein the first sensing device comprises a wireless-signal detector configured to detect wireless signals transmitted from a carry-on device carried by the user, and in the determining whether the user is within the preset distance to the target object based on the object detection data, a distance of the user to the target object is estimated by a strength of the wireless signals detected by the wireless-signal detector.
  26. The control method of claim 25, wherein the carry-on device carried by the user comprises at least one of a vehicle key, a mobile phone, or a wireless-signal transmitter.
  27. The control method of any one of claims 18-26, wherein the preset distance is in a range of approximately 0.01-10 meters, and preferably in a range of approximately 0.1-5 meters, and further preferably in a range of approximately 0.2-1 meters.
  28. The control method of any one of claims 18-27, wherein the detecting a response of the user to the preset image is carried out by a second sensing device, the second sensing device  comprising at least one of:
    an obstruction detector configured to detect whether an obstruction occurs between the target object and the image projected on the surface by the optical device as a result of the response of the user to the preset image; or
    a user action detector configured to detect an action of the user in response to the preset image.
  29. The control method of claim 28, wherein the second sensing device comprises an obstruction detector, the obstruction detector comprising at least one of a camera, a radar sensor, a noncontact capacitive sensor, an infrared sensing device, or a TOF detection device.
  30. The control method of claim 29, wherein the second sensing device comprises a TOF detection device.
  31. The control method of claim 28, wherein the second sensing device comprises a user action detector, the user action detector comprising at least one of:
    a camera, wherein the response detected thereby comprises at least one of a motion, a gesture, or a facial expression, of the user; or
    a microphone, wherein the response detected thereby comprises a voice of the user.
  32. The control method of claim 31, wherein the predetermined criterion comprises detection of any action performed by the user, wherein the any action comprises at least one of a motion, a gesture, a facial expression, or a voice.
  33. The control method of claim 32, wherein the determining, based on the response, whether a predetermined criterion is met comprises:
    performing a feature recognition based on the user response data, wherein the predetermined criterion comprises a substantial match between a result of the feature recognition and a pre-stored record.
  34. The control method of any one of claims 18-33, further comprising, after the step of if so, controlling an optical device to project a preset image onto a preset area of a surface for presentation to the user:
    counting a working time of the optical device; and
    controlling the optical device to stop projection if the working time is longer than a preset  threshold and no response from the user is detected.
  35. The control method of claim 34, wherein the preset threshold is in a range of approximately 2 seconds to 30 minutes, and preferably in a range of approximately 5 seconds to 5 minutes, and further preferably in a range of approximately 10 seconds to 1 minute.
PCT/CN2020/111329 2019-08-26 2020-08-26 Control apparatus and method WO2021037052A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202080060963.4A CN114616140A (en) 2019-08-26 2020-08-26 Control apparatus and method
EP20859091.9A EP4021767A4 (en) 2019-08-26 2020-08-26 Control apparatus and method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910789846.0 2019-08-26
CN201910789846.0A CN110525377A (en) 2019-08-26 2019-08-26 A kind of automobile trunk door control method and device

Publications (1)

Publication Number Publication Date
WO2021037052A1 true WO2021037052A1 (en) 2021-03-04

Family

ID=68662819

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/111329 WO2021037052A1 (en) 2019-08-26 2020-08-26 Control apparatus and method

Country Status (3)

Country Link
EP (1) EP4021767A4 (en)
CN (2) CN110525377A (en)
WO (1) WO2021037052A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023123697A1 (en) * 2021-12-29 2023-07-06 博泰车联网(南京)有限公司 Control method and control system for vehicle trunk, and vehicle
CN116605176A (en) * 2023-07-20 2023-08-18 江西欧迈斯微电子有限公司 Unlocking and locking control method and device and vehicle

Families Citing this family (6)

Publication number Priority date Publication date Assignee Title
CN110525377A (en) * 2019-08-26 2019-12-03 北京一数科技有限公司 A kind of automobile trunk door control method and device
CN111497737A (en) * 2020-04-28 2020-08-07 一汽奔腾轿车有限公司 Automobile door control device and method
CN111691786A (en) * 2020-05-11 2020-09-22 富晟(广东)汽车电子有限公司 Tail gate light and shadow assembly control method and device
CN114103871B (en) * 2021-11-03 2024-02-20 长春富晟汽车电子有限公司 Light and shadow one-foot kick interaction control method for vehicle tail door
CN114291034B (en) * 2021-12-31 2023-08-08 佛山市安驾科技有限公司 Skirting control method and control system for electric tail door of automobile
CN115126353A (en) * 2022-05-30 2022-09-30 北京一数科技有限公司 Vehicle door control method, vehicle controller, vehicle door control system, and storage medium

Citations (8)

Publication number Priority date Publication date Assignee Title
CN103998299A (en) * 2011-09-12 2014-08-20 法雷奥安全座舱公司 Method for opening a movable panel of a motor vehicle
DE102014101661A1 (en) * 2014-02-11 2015-08-13 Brose Fahrzeugteile Gmbh & Co. Kommanditgesellschaft, Hallstadt Method for controlling a closure element arrangement of a motor vehicle
CN105335144A (en) * 2014-07-31 2016-02-17 比亚迪股份有限公司 Vehicle trunk automatic opening system and control method therefor
CN105644465A (en) * 2014-09-17 2016-06-08 戴姆勒大中华区投资有限公司 Automatic opening control system for vehicle trunk
CN105781278A (en) * 2016-03-01 2016-07-20 福建省汽车工业集团云度新能源汽车股份有限公司 Car trunk opening control method and system
CN108204187A (en) * 2016-12-19 2018-06-26 大众汽车(中国)投资有限公司 Method and apparatus for unlocking the boot of a vehicle
CN109505482A (en) * 2018-11-21 2019-03-22 北京长城华冠汽车科技股份有限公司 Control system for automatically opening a vehicle trunk, and vehicle
CN110525377A (en) * 2019-08-26 2019-12-03 北京一数科技有限公司 Automobile trunk door control method and device

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4569305B2 (en) * 2005-01-31 2010-10-27 マツダ株式会社 Smart entry system for vehicles
KR101316873B1 (en) * 2012-07-04 2013-10-08 현대자동차주식회사 System and method for operating gate
KR101962728B1 (en) * 2012-12-26 2019-03-27 현대모비스 주식회사 Apparatus for Controlling automobile Trunk and Door
EP2860704B1 (en) * 2013-10-10 2016-04-27 U-Shin France SAS Method for opening a movable panel of the motor vehicle and corresponding opening control device
DE102014101208A1 (en) * 2014-01-31 2015-08-06 Huf Hülsbeck & Fürst Gmbh & Co. Kg mounting module
DE112015001401T5 (en) * 2014-03-26 2017-02-16 Magna Mirrors Of America, Inc. Vehicle function control system using projected characters
EP2930071B1 (en) * 2014-04-10 2018-11-14 U-Shin France Method for opening a movable panel of the motor vehicle and corresponding opening control device
DE102014116171A1 (en) * 2014-11-06 2016-05-12 Valeo Schalter Und Sensoren Gmbh Device with external motion sensor and illuminated marking for a motor vehicle
JP6649036B2 (en) * 2015-10-22 2020-02-19 株式会社ユーシン Door opening and closing device
US10563448B2 (en) * 2015-11-10 2020-02-18 Ford Global Technologies, Llc Approach activated closure entry system for a motor vehicle
JP6634345B2 (en) * 2016-05-31 2020-01-22 株式会社ミツバ Touch sensor unit
WO2019043769A1 (en) * 2017-08-29 2019-03-07 河西工業株式会社 Tailgate opening and closing device
CN107719481B (en) * 2017-09-02 2019-06-07 浙江吉润汽车有限公司 Induction-triggered automobile trunk opening method and device
CN107905676B (en) * 2017-10-10 2019-06-25 吉利汽车研究院(宁波)有限公司 Vehicle trunk automatic opening control system and method, and vehicle
CN109747587A (en) * 2019-03-18 2019-05-14 上海科世达-华阳汽车电器有限公司 Method, apparatus and system for intelligently opening an automobile trunk


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP4021767A4 *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023123697A1 (en) * 2021-12-29 2023-07-06 博泰车联网(南京)有限公司 Control method and control system for vehicle trunk, and vehicle
CN116605176A (en) * 2023-07-20 2023-08-18 江西欧迈斯微电子有限公司 Unlocking and locking control method and device and vehicle
CN116605176B (en) * 2023-07-20 2023-11-07 江西欧迈斯微电子有限公司 Unlocking and locking control method and device and vehicle

Also Published As

Publication number Publication date
EP4021767A1 (en) 2022-07-06
CN114616140A (en) 2022-06-10
CN110525377A (en) 2019-12-03
EP4021767A4 (en) 2024-01-24

Similar Documents

Publication Publication Date Title
WO2021037052A1 (en) Control apparatus and method
CN107128282B (en) Moving device control of electric vehicle door
US11225822B2 (en) System and method for opening and closing vehicle door
CN106960486B (en) System and method for functional feature activation through gesture recognition and voice commands
US10407968B2 (en) System and method for operating vehicle door
US11518341B2 (en) Method for controlling a locking element of a vehicle
US10829978B2 (en) System and method for operating vehicle door
US10465429B2 (en) Controller, control method, and computer-readable recording medium
GB2498833A (en) Ultrasonic gesture recognition for vehicle
US11760360B2 (en) System and method for identifying a type of vehicle occupant based on locations of a portable device
US11247635B1 (en) System for providing access to a vehicle
US20170114583A1 (en) Intelligent vehicle access point opening system
CN114233120B (en) Hidden door handle control method, hidden door handle control device, hidden door handle control equipment and storage medium
JP2014214472A (en) Drive control device for opening/closing body for vehicle
KR102126021B1 (en) Automatic Car Door Opening-and-Closing System Using AVM and Method thereof
US20220324309A1 (en) System for controlling a closure panel of a vehicle
US20220325569A1 (en) System for a vehicle having closure panels
JPWO2013146919A1 (en) Vehicle control structure
JP2019196096A (en) Tail gate device
US11878654B2 (en) System for sensing a living being proximate to a vehicle
US20220327873A1 (en) System for a vehicle operable to enter a reverse mode
US20220324308A1 (en) System for a vehicle with a trailer coupled thereto
KR102429499B1 (en) Appartus and method for preventing clash slidong door of vehicle
CN106600721B (en) Intelligent parking management system based on virtual projection keyboard
WO2021093934A1 (en) Automatic vehicle closure operating

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 20859091

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020859091

Country of ref document: EP

Effective date: 20220328