CN116620312A - Human-vehicle interaction control method and device, equipment, system, vehicle and storage medium - Google Patents

Human-vehicle interaction control method and device, equipment, system, vehicle and storage medium

Info

Publication number
CN116620312A
Authority
CN
China
Prior art keywords
vehicle
current
qualified
human
state information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210125550.0A
Other languages
Chinese (zh)
Inventor
孔德盛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
WM Smart Mobility Shanghai Co Ltd
Original Assignee
WM Smart Mobility Shanghai Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by WM Smart Mobility Shanghai Co Ltd filed Critical WM Smart Mobility Shanghai Co Ltd
Priority to CN202210125550.0A priority Critical patent/CN116620312A/en
Publication of CN116620312A publication Critical patent/CN116620312A/en
Pending legal-status Critical Current

Links

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 - Interaction between the driver and the control system
    • Y - GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 - TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P - CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 - Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/02 - Total factory control, e.g. smart factories, flexible manufacturing systems [FMS] or integrated manufacturing systems [IMS]

Abstract

The embodiments of the present application provide a human-vehicle interaction control method, device, equipment, system, vehicle and storage medium. The human-vehicle interaction control method comprises the following steps: confirming whether the authority of a current operator located outside the vehicle is qualified; if the authority of the current operator is qualified, acquiring current vehicle state information and confirming whether the current vehicle state information is qualified; and if the current vehicle state information is qualified, acquiring a first action image of the current operator, and generating and sending an interaction execution instruction executable by the vehicle according to the first action image. During human-vehicle interaction outside the vehicle, the method and device offer higher control convenience, an unrestricted and easily expandable range of control types, and better operation safety.

Description

Human-vehicle interaction control method and device, equipment, system, vehicle and storage medium
Technical Field
The present application relates to the technical field of vehicle control, and in particular to a human-vehicle interaction control method, device, equipment, system, vehicle and storage medium.
Background
The automobile industry no longer pursues the mechanical performance of the vehicle alone. To meet the diverse needs of owners, automobiles have begun to develop toward intelligence, and in particular toward personalized control of the vehicle, for example human-vehicle interaction outside the vehicle.
Existing interaction between a person and a vehicle from outside the vehicle is usually realized with a remote-control vehicle key: the operator presses the corresponding control key on the remote-control key to unlock or lock the vehicle, raise or lower a window, start the engine, start the vehicle-mounted air conditioner, and so on.
However, human-vehicle interaction outside the vehicle based on this approach suffers from low control convenience, limited control types, poor safety, and other drawbacks.
Disclosure of Invention
In view of the drawbacks of the existing approach, the present application provides a human-vehicle interaction control method, device, equipment, system, vehicle and storage medium, which are used to solve the technical problems of low control convenience, limited control types, poor safety and the like of human-vehicle interaction outside the vehicle in the prior art.
In a first aspect, an embodiment of the present application provides a method for controlling interaction between a person and a vehicle, including:
confirming whether the authority of the current operator positioned outside the vehicle is qualified or not;
if the authority of the current operator is qualified, acquiring current vehicle state information, and confirming whether the current vehicle state information is qualified or not;
and if the current vehicle state information is qualified, acquiring first action image information of the current operator, and generating and sending an interactive execution instruction which can be executed by the vehicle according to the first action image information.
Optionally, determining whether the authority of the current operator located outside the vehicle is qualified includes:
confirming whether a legal car key exists in a detection area outside the car;
if a legal car key exists in the detection area, acquiring the facial image information of the current operator, and confirming whether the facial image information of the current operator is matched with the prerecorded facial information;
and if the facial image information of the current operator is matched with the prerecorded facial information, confirming that the authority of the current operator is qualified.
Optionally, determining whether the authority of the current operator located outside the vehicle is qualified includes:
confirming whether a legal car key exists in a detection area outside the car;
if a legal car key exists in the detection area, acquiring second action image information of the current operator, and confirming whether the second action image information of the current operator is matched with the prerecorded unlocking image information;
and if the second action image information of the current operator is matched with the prerecorded unlocking image information, confirming that the authority of the current operator is qualified.
Optionally, acquiring the current vehicle state information and confirming whether the current vehicle state information is qualified includes any one of the following:
acquiring engine state information, and if the engine state information indicates that the engine is off, confirming that the current vehicle state information is qualified;
acquiring gear state information, and if the gear state information indicates neutral or parking gear, confirming that the current vehicle state information is qualified;
and acquiring in-vehicle personnel information, and if the in-vehicle personnel information indicates that at least the main driving area and the auxiliary driving area are unoccupied, confirming that the current vehicle state information is qualified.
Optionally, acquiring the in-vehicle personnel information and, if the in-vehicle personnel information indicates that at least the main driving area and the auxiliary driving area are unoccupied, confirming that the current vehicle state information is qualified includes:
acquiring in-vehicle life information, and if at least the main driving area and the auxiliary driving area show no sign of life, confirming that the current vehicle state information is qualified;
or acquiring in-vehicle seat load information, and if at least the main driver seat and the auxiliary driver seat are empty, confirming that the current vehicle state information is qualified.
Optionally, the interactive execution instruction includes: at least one of door lock control, tail door control, car lamp control, car window control, hidden door handle control, front cabin cover control, windshield wiper control, atmosphere lamp control in the car, vehicle-mounted air conditioner control and electric seat control.
In a second aspect, an embodiment of the present application provides a human-vehicle interaction control device, including:
the permission confirmation module is used for confirming whether the authority of the current operator located outside the vehicle is qualified;
the vehicle state confirmation module is used for acquiring the current vehicle state information and confirming whether the current vehicle state information is qualified or not if the authority of the current operator is qualified;
the instruction generation module is used for acquiring first action image information of the current operator if the current vehicle state information is qualified, and generating and sending an interactive execution instruction which can be executed by the vehicle according to the first action image information.
In a third aspect, an embodiment of the present application provides a man-vehicle interaction control apparatus, including:
a processor;
a memory electrically connected to the processor;
at least one program stored in the memory and configured to be executed by the processor, the at least one program being configured to perform the human-vehicle interaction control method provided in the first aspect.
In a fourth aspect, an embodiment of the present application provides a human-vehicle interaction system, including: the human-vehicle interaction control equipment according to the third aspect, and a vehicle-mounted camera, a keyless sensing assembly and a vehicle body controller each in signal connection with the processor of the human-vehicle interaction control equipment.
In a fifth aspect, an embodiment of the present application provides a vehicle including: the human-vehicle interaction system provided in the fourth aspect.
In a sixth aspect, an embodiment of the present application provides a computer readable storage medium, where the computer readable storage medium is configured to store computer instructions, and when the computer instructions are executed on a computer, implement the method for controlling interaction between a person and a vehicle provided in the first aspect.
The technical scheme provided by the embodiment of the application has the beneficial technical effects that:
1. During human-vehicle interaction outside the vehicle, the authority of the current operator is confirmed and the current vehicle state is self-checked, and an interaction execution instruction executable by the vehicle is generated and sent only when both the authority of the current operator and the current vehicle state are qualified, which helps ensure the safety of the operation authority and of the operation timing and provides a double safety guarantee for the human-vehicle interaction;
2. During human-vehicle interaction outside the vehicle, the interaction execution instruction executable by the vehicle is generated by recognizing the acquired first action image information of the current operator. Compared with human-vehicle interaction realized with a remote-control vehicle key, there is no need to take out the key, so operation is more convenient; the interaction instruction no longer depends on the limited physical keys of a remote-control key, so the operation types are no longer restricted, which facilitates expanding the operation types and realizing personalized operation modes such as user-defined operation actions;
3. During human-vehicle interaction outside the vehicle, the self-check of the current vehicle state is performed only after the authority of the current operator is confirmed to be qualified, which helps save vehicle resources and reduce vehicle energy consumption.
Additional aspects and advantages of the application will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application.
Drawings
The foregoing and/or additional aspects and advantages of the application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
FIG. 1 is a schematic diagram of a structural framework of a human-vehicle interaction system according to an embodiment of the present application;
FIG. 2 is a schematic diagram of a structural framework of a man-vehicle interaction control device according to an embodiment of the present application;
FIG. 3 is a schematic flow chart of a method for controlling interaction between a person and a vehicle according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of a first embodiment of confirming whether the authority of the current operator located outside the vehicle is qualified in the human-vehicle interaction control method according to an embodiment of the present application;
FIG. 5 is a schematic flow chart of a second embodiment of confirming whether the authority of the current operator located outside the vehicle is qualified in the human-vehicle interaction control method according to an embodiment of the present application;
FIG. 6 is a schematic diagram of a structural framework of a human-vehicle interaction control device according to an embodiment of the present application.
In the figure:
100-a human-vehicle interaction system;
110 - human-vehicle interaction control equipment; 111 - processor; 112 - memory; 113 - bus; 114 - transceiver; 115 - input unit; 116 - output unit;
120 - vehicle-mounted camera; 130 - keyless sensing assembly; 140 - vehicle body controller;
200 - human-vehicle interaction control device; 210 - permission confirmation module; 220 - vehicle state confirmation module; 230 - instruction generation module.
Detailed Description
The present application is described in detail below, examples of embodiments of the application are illustrated in the accompanying drawings, wherein the same or similar reference numerals refer to the same or similar components or components having the same or similar functions throughout. Further, if detailed description of the known technology is not necessary for the illustrated features of the present application, it will be omitted. The embodiments described below by referring to the drawings are illustrative only and are not to be construed as limiting the application.
It will be understood by those skilled in the art that all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs unless defined otherwise. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the prior art and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless expressly stated otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. It will be understood that when an element is referred to as being "connected" or "coupled" to another element, it can be directly connected or coupled to the other element, or intervening elements may also be present. Further, "connected" or "coupled" as used herein may include wirelessly connected or wirelessly coupled. The term "and/or" as used herein includes any and all combinations of one or more of the associated listed items.
The inventor of the present application found through research that human-vehicle interaction outside the vehicle based on a remote-control vehicle key requires holding the key and pressing the corresponding control key, so control convenience is low. As operation types increase, the control keys on the remote-control key multiply, which places demands on the size of the key; since the key cannot grow without limit, the controllable types are evidently limited and difficult to expand. Moreover, with key-based interaction the vehicle usually executes the remote-control signal immediately, regardless of the state the vehicle is in; for example, a mis-operation could shut off the engine while the vehicle is being driven, which may cause a safety accident, so safety is evidently poor.
The inventor also found that human-vehicle interaction outside the vehicle can be realized by adding an extra sensor to recognize the operator's instruction and trigger the corresponding interaction. For example, a power lift gate (PLG) system and a kick sensor can be added to a vehicle so that the tail gate opens when the operator makes a kicking motion at the sensor, but the additional kick sensor increases the cost of the whole vehicle.
The present application provides a human-vehicle interaction control method, device, equipment, system, vehicle and storage medium, aiming to solve the above technical problems in the prior art.
The following describes the technical scheme of the present application and how the technical scheme of the present application solves the above technical problems in detail with specific embodiments.
An embodiment of the present application provides a vehicle including: any of the human-vehicle interaction systems 100 provided below.
In this embodiment, the human-vehicle interaction system 100 in the vehicle can execute any of the human-vehicle interaction control methods provided below, and can achieve the beneficial effects of higher control convenience, unrestricted control variety, easy expansion, better operation safety and the like in the human-vehicle interaction process outside the vehicle.
The human-vehicle interaction system 100 and the human-vehicle interaction control method will be described in detail below, and are not described in detail herein.
Alternatively, the vehicle may be a fuel vehicle or a new energy vehicle.
Based on the same inventive concept, an embodiment of the present application provides a human-vehicle interaction system 100, and a schematic structural frame of the human-vehicle interaction system 100 is shown in fig. 1, including: any of the human-vehicle interactive control devices 110 provided below, as well as an onboard camera 120, a keyless sensing assembly 130, and a body controller 140, each in signal communication with a processor in the human-vehicle interactive control device 110.
In this embodiment, the human-vehicle interaction control device 110 in the human-vehicle interaction system 100 can execute any of the human-vehicle interaction control methods provided below, and can achieve the beneficial effects of higher control convenience, unrestricted control variety, easy expansion, better operation safety, and the like in the human-vehicle interaction process outside the vehicle.
The man-vehicle interaction control apparatus 110 and the man-vehicle interaction control method will be described in detail below, and are not described in detail herein.
In some possible embodiments, the keyless sensing assembly 130 may be configured to sense whether a legal vehicle key is present in a certain area outside the vehicle; if a legal key is sensed, the human-vehicle interaction control device 110 wakes the vehicle-mounted camera 120. The vehicle-mounted camera 120 may acquire facial image information or action image information of the current operator outside the vehicle and send it to the human-vehicle interaction control device 110 for authority confirmation. After the human-vehicle interaction control device 110 confirms that the authority is qualified, the vehicle-mounted camera 120 continues to acquire the first action image information of the current operator, the human-vehicle interaction control device 110 generates and sends an interaction execution instruction executable by the vehicle to the vehicle body controller 140 according to the first action image information, and finally the vehicle body controller 140 drives the corresponding actuators of the vehicle according to the interaction execution instruction to complete the human-vehicle interaction, for example opening or closing a door lock, opening or closing the tail gate, switching the lamps, raising or lowering a window, or extending or retracting a door handle.
In other possible embodiments, the keyless sensing assembly 130 may be configured to sense whether a legal vehicle key is present in a certain area outside the vehicle; if a legal key is sensed, the human-vehicle interaction control device 110 performs the authority confirmation and wakes the vehicle-mounted camera 120. The vehicle-mounted camera 120 then acquires the first action image information of the current operator, the human-vehicle interaction control device 110 generates and sends an interaction execution instruction executable by the vehicle to the vehicle body controller 140 according to the first action image information, and finally the vehicle body controller 140 drives the corresponding actuators of the vehicle according to the interaction execution instruction to complete the human-vehicle interaction.
Optionally, the vehicle-mounted camera 120 may include at least one of a driving recorder camera, a reversing camera and a 360-degree surround-view camera.
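To make the component interplay described above concrete, the following is a minimal, hypothetical Python sketch of the sensing-camera-controller pipeline. The class and function names (KeylessSensingAssembly, OnboardCamera, BodyController, face_matches_enrolled, recognize_action) only loosely mirror components 130, 120 and 140, and the stubbed recognition steps stand in for real face matching and action recognition; none of this is an API defined by the patent.

```python
class KeylessSensingAssembly:
    """Stands in for the low-frequency key-search function of component 130."""
    def legal_key_in_detection_area(self) -> bool:
        return True  # stub: assume a paired key was found in the detection area


class OnboardCamera:
    """Stands in for the vehicle-mounted camera 120."""
    def wake(self) -> None:
        print("camera awake")

    def capture_frame(self) -> bytes:
        return b"frame"  # stub image data


class BodyController:
    """Stands in for the vehicle body controller 140."""
    def execute(self, instruction: str) -> None:
        print(f"driving actuators for: {instruction}")


def face_matches_enrolled(frame: bytes) -> bool:
    return True  # stub for comparing the captured face with pre-recorded face data


def recognize_action(frame: bytes) -> str:
    return "open_tailgate"  # stub for recognizing the first action image


def run_interaction(sensing: KeylessSensingAssembly,
                    camera: OnboardCamera,
                    body: BodyController) -> None:
    """One pass of: key sensing -> wake camera -> permission check -> first action -> command."""
    if not sensing.legal_key_in_detection_area():
        return  # no legal key nearby, stay idle
    camera.wake()
    if not face_matches_enrolled(camera.capture_frame()):
        return  # permission not qualified
    body.execute(recognize_action(camera.capture_frame()))


run_interaction(KeylessSensingAssembly(), OnboardCamera(), BodyController())
```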
Based on the same inventive concept, an embodiment of the present application provides a human-vehicle interaction control apparatus 110, where a schematic diagram of a structural frame of the human-vehicle interaction control apparatus 110 is shown in fig. 2, including: a processor 111 and a memory 112, and at least one program.
The memory 112 is electrically connected to the processor 111;
at least one program is stored in the memory 112 and configured to be executed by the processor 111, the at least one program configured to: any one of the human-vehicle interaction control methods provided in the following embodiments is performed.
The embodiment of the present application provides a human-vehicle interaction control device 110, which is suitable for executing any one of the human-vehicle interaction control methods provided below, and the implementation principle is similar and will not be described herein.
Those skilled in the art will appreciate that the human-vehicle interaction control apparatus 110 provided by the embodiments of the present application may be specially designed and manufactured for the intended purpose, or may comprise known devices in a general-purpose computer. These devices store computer programs that are selectively activated or reconfigured. Such a computer program may be stored in a device-readable (e.g., computer-readable) medium, or in any type of medium suitable for storing electronic instructions and coupled to a bus.
The present application provides, in an alternative embodiment, a human-vehicle interaction control apparatus 110, as shown in fig. 2, where the human-vehicle interaction control apparatus 110 shown in fig. 2 includes: a processor 111 and a memory 112. Wherein the processor 111 and the memory 112 are electrically connected, such as via a bus 113.
The processor 111 may be a CPU (Central Processing Unit), a general-purpose processor, a DSP (Digital Signal Processor), an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array) or another programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various exemplary logic blocks, modules and circuits described in connection with this disclosure. The processor 111 may also be a combination that implements computing functionality, for example a combination of one or more microprocessors, or a combination of a DSP and a microprocessor.
Bus 113 may include a path that conveys information between the above components. Bus 113 may be a PCI (Peripheral Component Interconnect) bus, an EISA (Extended Industry Standard Architecture) bus, or the like. The bus 113 may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in FIG. 2, but this does not mean that there is only one bus or one type of bus.
The memory 112 may be, but is not limited to, a ROM (Read-Only Memory) or another type of static storage device that can store static information and instructions, a RAM (Random Access Memory) or another type of dynamic storage device that can store information and instructions, an EEPROM (Electrically Erasable Programmable Read-Only Memory), a CD-ROM (Compact Disc Read-Only Memory) or other optical disc storage (including compact discs, laser discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
Optionally, the human-vehicle interactive control device 110 may also include a transceiver 114. The transceiver 114 may be used for both reception and transmission of signals. The transceiver 114 may allow the human-vehicle interactive control device 110 to communicate wirelessly or by wire with other devices to exchange data. It should be noted that, in practical application, the transceiver 114 is not limited to one.
Optionally, the human-vehicle interaction control apparatus 110 may further include an input unit 115. The input unit 115 may be used to receive input digital, character, image, and/or sound information or to generate key signal inputs related to user settings and function controls of the human-vehicle interaction control device 110. The input unit 115 may include, but is not limited to, one or more of a touch screen, a physical keyboard, function keys (such as volume control keys, switch keys, etc.), a track ball, a mouse, a joystick, a camera, a microphone, etc.
Optionally, the human-vehicle interaction control device 110 may further include an output unit 116. The output unit 116 may be used to output or present information processed by the processor 111. The output unit 116 may include, but is not limited to, one or more of a display device, a speaker, a vibration device, and the like.
While fig. 2 illustrates a human-vehicle interactive control apparatus 110 having various devices, it should be understood that not all illustrated devices are required to be implemented or provided. More or fewer devices may be implemented or provided instead.
Optionally, the memory 112 is used for storing application program codes for executing the inventive arrangements, and is controlled by the processor 111 for execution. The processor 111 is configured to execute application program codes stored in the memory 112, so as to implement any one of the human-vehicle interaction control methods provided in the embodiments of the present application.
Based on the same inventive concept, the embodiment of the application provides a human-vehicle interaction control method, a flow diagram of which is shown in fig. 3, comprising steps S101-S103:
s101: and confirming whether the authority of the current operator positioned outside the vehicle is qualified or not.
S102: and if the authority of the current operator is qualified, acquiring the current vehicle state information, and confirming whether the current vehicle state information is qualified or not.
S103: and if the current vehicle state information is qualified, acquiring first action image information of the current operator, and generating and sending an interactive execution instruction which can be executed by the vehicle according to the first action image information.
Through steps S101 to S103, the following can be achieved:
During human-vehicle interaction outside the vehicle, the authority of the current operator is confirmed and the current vehicle state is self-checked, and an interaction execution instruction executable by the vehicle is generated and sent only when both the authority of the current operator and the current vehicle state are qualified, which helps ensure the safety of the operation authority and of the operation timing and provides a double safety guarantee for the human-vehicle interaction.
During human-vehicle interaction outside the vehicle, the interaction execution instruction executable by the vehicle is generated by recognizing the acquired first action image information of the current operator. Compared with human-vehicle interaction realized with a remote-control vehicle key, there is no need to take out the key, so operation is more convenient; the interaction instruction no longer depends on the limited physical keys of a remote-control key, so the operation types are no longer restricted, which facilitates expanding the operation types and realizing personalized operation modes such as user-defined operation actions.
During human-vehicle interaction outside the vehicle, the self-check of the current vehicle state is performed only after the authority of the current operator is confirmed to be qualified, which helps save vehicle resources and reduce vehicle energy consumption.
During human-vehicle interaction outside the vehicle, the vehicle's existing equipment, such as the vehicle-mounted camera 120, the keyless sensing assembly 130 and the vehicle body controller 140, can be fully utilized without adding extra hardware, which reduces cost, allows existing vehicles to be upgraded and retrofitted, and gives the solution good prospects for adoption.
Alternatively, steps S101 to S103 may be performed by the human-vehicle interaction control apparatus 110 provided in the foregoing embodiment.
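Purely as an illustration, the three steps could be organized as in the Python sketch below. The helper callables (permission_qualified, vehicle_state_qualified, recognize_first_action, send_instruction) are hypothetical stand-ins for the checks described in the embodiments that follow, not interfaces defined by the patent.

```python
from typing import Callable, Optional


def human_vehicle_interaction(permission_qualified: Callable[[], bool],
                              vehicle_state_qualified: Callable[[], bool],
                              recognize_first_action: Callable[[], str],
                              send_instruction: Callable[[str], None]) -> Optional[str]:
    # S101: confirm the authority of the operator outside the vehicle
    if not permission_qualified():
        return None
    # S102: only then self-check the current vehicle state
    if not vehicle_state_qualified():
        return None
    # S103: acquire the first action image and turn it into an executable instruction
    instruction = recognize_first_action()
    send_instruction(instruction)
    return instruction


# Example run with stubbed checks:
human_vehicle_interaction(
    permission_qualified=lambda: True,
    vehicle_state_qualified=lambda: True,
    recognize_first_action=lambda: "unlock_doors",
    send_instruction=lambda cmd: print("sent:", cmd),
)
```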
In some possible embodiments, step S101 of confirming whether the authority of the current operator located outside the vehicle is qualified may include steps S201 to S203, as shown in FIG. 4:
S201: Confirm whether a legal vehicle key is present in a detection area outside the vehicle.
In this step S201, the detection area may be a spatial area surrounded by a set of points having a predetermined distance from the center of the vehicle, or may be a spatial area surrounded by a set of points having a predetermined distance from a certain portion (for example, a tail gate, a front and back door, or the like) or a certain portion of the vehicle.
As for whether a vehicle key is legal, the keyless sensing assembly 130 may periodically drive a low-frequency antenna to transmit a key-search low-frequency signal and detect whether a paired vehicle key is present in the detection area. The human-vehicle interaction control apparatus 110 then decides according to the search result of the keyless sensing assembly 130: if a paired key is found, it determines that a legal vehicle key is present in the detection area; otherwise, it determines that no legal vehicle key is present.
S202: if a legal car key exists in the detection area, acquiring the facial image information of the current operator, and confirming whether the facial image information of the current operator is matched with the prerecorded facial information.
In step S202, the facial image information may be obtained by having the vehicle-mounted camera 120 capture the face of the current operator after the human-vehicle interaction control apparatus 110 determines that a legal vehicle key is present in the detection area.
S203: and if the facial image information of the current operator is matched with the prerecorded facial information, confirming that the authority of the current operator is qualified.
In this embodiment, through steps S201 to S203, the appearance of a legal vehicle key in the detection area outside the vehicle wakes the vehicle-mounted camera 120, and whether the authority of the current operator located outside the vehicle is qualified is determined from the facial image information of the operator acquired by the vehicle-mounted camera 120.
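A minimal sketch of this face-based permission check, under the assumption that key search, image capture and face comparison are available as simple helpers; all helper names and return values below are stubs for illustration only.

```python
def legal_key_in_detection_area() -> bool:
    return True  # stub: keyless sensing assembly found a paired key (S201)


def capture_face_image() -> bytes:
    return b"face-image"  # stub: frame captured by the woken vehicle-mounted camera


def matches_enrolled_face(face_image: bytes) -> bool:
    return face_image == b"face-image"  # stub: a real system compares face features


def permission_qualified_by_face() -> bool:
    # S201: a legal key must be present before the camera is woken
    if not legal_key_in_detection_area():
        return False
    # S202: capture the operator's face and compare it with the pre-recorded face data
    face = capture_face_image()
    # S203: the authority is qualified only if the captured face matches
    return matches_enrolled_face(face)


print(permission_qualified_by_face())
```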
In some possible embodiments, step S101 of confirming whether the authority of the current operator located outside the vehicle is qualified may include steps S301 to S303, as shown in FIG. 5:
S301: Confirm whether a legal vehicle key is present in a detection area outside the vehicle.
S302: if the legal car key exists in the detection area, second action image information of the current operator is obtained, and whether the second action image information of the current operator is matched with the prerecorded unlocking image information is confirmed.
In step S302, the second action image information may be obtained by having the vehicle-mounted camera 120 capture the second action of the current operator after the human-vehicle interaction control apparatus 110 determines that a legal vehicle key is present in the detection area.
Alternatively, the second action may be a dynamic gesture, such as: the single-hand action track of the current operator is L-shaped.
Alternatively, the second action may be a static gesture, such as: the current hand posture of the operator is "X" shaped, or "inverted V" shaped, etc.
S303: and if the second action image information of the current operator is matched with the prerecorded unlocking image information, confirming that the authority of the current operator is qualified.
In this embodiment, through steps S301 to S303, the appearance of a legal vehicle key in the detection area outside the vehicle wakes the vehicle-mounted camera 120, and whether the authority of the current operator located outside the vehicle is qualified is determined from the second action image information of the operator acquired by the vehicle-mounted camera 120.
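A similar sketch for the gesture-based variant, assuming the recognizer reduces the second action image to a gesture label such as an "L"-shaped trajectory or an "X"-shaped posture; the label set and matching rule are illustrative assumptions.

```python
PRERECORDED_UNLOCK_GESTURES = {"L_trajectory", "X_posture", "inverted_V_posture"}  # illustrative


def recognize_second_action() -> str:
    return "L_trajectory"  # stub: e.g. a one-hand "L"-shaped motion track


def permission_qualified_by_gesture(legal_key_present: bool) -> bool:
    # S301: a legal vehicle key must be present in the detection area
    if not legal_key_present:
        return False
    # S302/S303: compare the recognized second action with the pre-recorded unlock gestures
    return recognize_second_action() in PRERECORDED_UNLOCK_GESTURES


print(permission_qualified_by_gesture(legal_key_present=True))
```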
In some possible embodiments, step S102 acquires the current vehicle state information and confirms whether it is qualified as follows: acquire engine state information, and if the engine state information indicates that the engine is off, confirm that the current vehicle state information is qualified.
In some possible embodiments, step S102 acquires the current vehicle state information and confirms whether it is qualified as follows: acquire gear state information, and if the gear state information indicates neutral or parking gear, confirm that the current vehicle state information is qualified.
In some possible embodiments, step S102 acquires the current vehicle state information and confirms whether it is qualified as follows: acquire in-vehicle personnel information, and if the in-vehicle personnel information indicates that at least the main driving area and the auxiliary driving area are unoccupied, confirm that the current vehicle state information is qualified.
Optionally, acquiring the in-vehicle personnel information and confirming that the current vehicle state information is qualified if at least the main driving area and the auxiliary driving area are unoccupied includes: acquiring in-vehicle life information, and if at least the main driving area and the auxiliary driving area show no sign of life, confirming that the current vehicle state information is qualified. That is, the in-vehicle personnel information can be acquired by a life detector.
Optionally, acquiring the in-vehicle personnel information and confirming that the current vehicle state information is qualified if at least the main driving area and the auxiliary driving area are unoccupied includes: acquiring in-vehicle seat load information, and if at least the main driver seat and the auxiliary driver seat are empty, confirming that the current vehicle state information is qualified. That is, the in-vehicle personnel information can be acquired by the pressure sensors of the in-vehicle seats.
It will be appreciated that the above specific ways of determining whether the current vehicle state information is qualified may be used individually or combined arbitrarily according to the usage requirements.
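For illustration, the combinable state checks might be expressed as below; the state fields and the idea of toggling individual checks with flags are assumptions made for the sketch, not requirements of the method.

```python
from dataclasses import dataclass


@dataclass
class VehicleState:
    engine_running: bool
    gear: str                   # e.g. "P", "N", "D", "R"
    front_seats_occupied: bool  # from life detection or seat pressure sensors


def vehicle_state_qualified(state: VehicleState,
                            require_engine_off: bool = True,
                            require_neutral_or_park: bool = True,
                            require_front_seats_empty: bool = True) -> bool:
    """The individual checks can be enabled alone or in any combination."""
    if require_engine_off and state.engine_running:
        return False
    if require_neutral_or_park and state.gear not in ("N", "P"):
        return False
    if require_front_seats_empty and state.front_seats_occupied:
        return False
    return True


print(vehicle_state_qualified(VehicleState(engine_running=False, gear="P",
                                           front_seats_occupied=False)))
```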
In some possible embodiments, the interactive execution instruction in step S103 includes: at least one of door lock control, tail door control, car lamp control, car window control, hidden door handle control, front cabin cover control, windshield wiper control, atmosphere lamp control in the car, vehicle-mounted air conditioner control and electric seat control.
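Because the instruction is derived from a recognized action rather than a physical key, extending the operation types can be as simple as extending a lookup table, as in the sketch below. The gesture names and instruction identifiers are purely illustrative assumptions.

```python
from typing import Optional

ACTION_TO_INSTRUCTION = {
    "wave_left_hand": "unlock_doors",
    "raise_both_arms": "open_tailgate",
    "circle_gesture": "turn_on_lights",
    "palm_down": "close_windows",
}


def to_execution_instruction(action_label: str) -> Optional[str]:
    # None means the recognized action maps to no executable instruction
    return ACTION_TO_INSTRUCTION.get(action_label)


# Adding an operation type is a table update rather than a new physical key on a fob:
ACTION_TO_INSTRUCTION["kick_gesture"] = "open_front_hood"
print(to_execution_instruction("raise_both_arms"))
```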
Based on the same inventive concept, an embodiment of the present application provides a man-vehicle interaction control device 200, a schematic diagram of a structural frame of which is shown in fig. 6, including: a rights validation module 210, a vehicle status validation module 220, and an instruction generation module 230.
The permission confirming module 210 is used for confirming whether the permission of the current operator located outside the vehicle is qualified.
The vehicle state confirmation module 220 is configured to obtain the current vehicle state information if the authority of the current operator is qualified, and confirm whether the current vehicle state information is qualified.
The instruction generating module 230 is configured to obtain first motion image information of a current operator if the current vehicle state information is qualified, and generate and issue an interactive execution instruction executable by the vehicle according to the first motion image information.
In this embodiment, during human-vehicle interaction outside the vehicle, the permission confirmation module 210 confirms the authority of the current operator, the vehicle state confirmation module 220 self-checks the current vehicle state, and the instruction generation module 230 generates and sends an interaction execution instruction executable by the vehicle only when both the authority of the current operator and the current vehicle state are qualified, which helps ensure the safety of the operation authority and of the operation timing and provides a double safety guarantee for the human-vehicle interaction.
The interaction execution instruction executable by the vehicle is generated by recognizing the acquired first action image information of the current operator. Compared with human-vehicle interaction realized with a remote-control vehicle key, operation is more convenient, the interaction instruction no longer depends on the limited physical keys of a remote-control key, the operation types are no longer restricted, and it becomes easier to expand the operation types and to realize personalized operation modes such as user-defined operation actions.
The vehicle state confirmation module 220 performs the self-check of the current vehicle state only after the permission confirmation module 210 confirms that the authority of the current operator is qualified, which helps save vehicle resources and reduce vehicle energy consumption.
In some possible embodiments, the permission confirmation module 210 is configured to confirm whether the permission of the current operator located outside the vehicle is acceptable, specifically: confirming whether a legal car key exists in a detection area outside the car; if a legal car key exists in the detection area, acquiring the facial image information of the current operator, and confirming whether the facial image information of the current operator is matched with the prerecorded facial information; and if the facial image information of the current operator is matched with the prerecorded facial information, confirming that the authority of the current operator is qualified.
In some possible embodiments, the permission confirmation module 210 is configured to confirm whether the permission of the current operator located outside the vehicle is acceptable, specifically: confirming whether a legal car key exists in a detection area outside the car; if a legal car key exists in the detection area, acquiring second action image information of the current operator, and confirming whether the second action image information of the current operator is matched with the prerecorded unlocking image information; and if the second action image information of the current operator is matched with the prerecorded unlocking image information, confirming that the authority of the current operator is qualified.
In some possible embodiments, the vehicle state confirmation module 220 is configured to acquire the current vehicle state information and confirm whether it is qualified, and is specifically configured to: acquire engine state information, and if the engine state information indicates that the engine is off, confirm that the current vehicle state information is qualified; or acquire gear state information, and if the gear state information indicates neutral or parking gear, confirm that the current vehicle state information is qualified; or acquire in-vehicle personnel information, and if the in-vehicle personnel information indicates that at least the main driving area and the auxiliary driving area are unoccupied, confirm that the current vehicle state information is qualified.
In some possible embodiments, the vehicle state confirmation module 220 is configured to acquire the in-vehicle personnel information and, if it indicates that at least the main driving area and the auxiliary driving area are unoccupied, confirm that the current vehicle state information is qualified, and is specifically configured to: acquire in-vehicle life information, and if at least the main driving area and the auxiliary driving area show no sign of life, confirm that the current vehicle state information is qualified; or acquire in-vehicle seat load information, and if at least the main driver seat and the auxiliary driver seat are empty, confirm that the current vehicle state information is qualified.
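A compact sketch of how the three modules of device 200 might be composed, with all recognition and state checks stubbed out; the class and method names are assumptions that merely echo the module names above, not an implementation prescribed by the patent.

```python
from typing import Optional


class PermissionConfirmationModule:          # module 210
    def qualified(self) -> bool:
        return True  # stub for the key presence + face/gesture checks


class VehicleStateConfirmationModule:        # module 220
    def qualified(self) -> bool:
        return True  # stub for the engine/gear/occupancy checks


class InstructionGenerationModule:           # module 230
    def generate_and_send(self) -> str:
        print("sending: unlock_doors")
        return "unlock_doors"


class HumanVehicleInteractionControlDevice:  # device 200
    def __init__(self) -> None:
        self.permission = PermissionConfirmationModule()
        self.vehicle_state = VehicleStateConfirmationModule()
        self.instruction = InstructionGenerationModule()

    def run(self) -> Optional[str]:
        # The state self-check runs only after the permission check passes.
        if self.permission.qualified() and self.vehicle_state.qualified():
            return self.instruction.generate_and_send()
        return None


HumanVehicleInteractionControlDevice().run()
```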
Based on the same inventive concept, the embodiments of the present application provide a computer readable storage medium for storing computer instructions, which when executed on a computer, implement any one of the human-vehicle interaction control methods provided in the embodiments above.
The embodiment of the application provides various optional implementation manners of a computer readable storage medium suitable for the above-mentioned man-vehicle interaction control method, and its implementation principles are similar and will not be repeated here.
The computer-readable storage medium may be, but is not limited to, a ROM (Read-Only Memory) or another type of static storage device that can store static information and instructions, a RAM (Random Access Memory) or another type of dynamic storage device that can store information and instructions, an EEPROM (Electrically Erasable Programmable Read-Only Memory), a CD-ROM (Compact Disc Read-Only Memory) or other optical disc storage (including compact discs, laser discs, digital versatile discs, Blu-ray discs, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer.
By applying the embodiment of the application, at least the following beneficial effects can be realized:
1. During human-vehicle interaction outside the vehicle, the authority of the current operator is confirmed and the current vehicle state is self-checked, and an interaction execution instruction executable by the vehicle is generated and sent only when both the authority of the current operator and the current vehicle state are qualified, which helps ensure the safety of the operation authority and of the operation timing and provides a double safety guarantee for the human-vehicle interaction.
2. During human-vehicle interaction outside the vehicle, the interaction execution instruction executable by the vehicle is generated by recognizing the acquired first action image information of the current operator. Compared with human-vehicle interaction realized with a remote-control vehicle key, there is no need to take out the key, so operation is more convenient; the interaction instruction no longer depends on the limited physical keys of a remote-control key, so the operation types are no longer restricted, which facilitates expanding the operation types and realizing personalized operation modes such as user-defined operation actions.
3. During human-vehicle interaction outside the vehicle, the self-check of the current vehicle state is performed only after the authority of the current operator is confirmed to be qualified, which helps save vehicle resources and reduce vehicle energy consumption.
4. During human-vehicle interaction outside the vehicle, the vehicle's existing equipment can be fully utilized without adding hardware, which reduces cost, allows existing vehicles to be upgraded and retrofitted, and gives the solution good prospects for adoption.
Those skilled in the art will appreciate that the operations, methods, steps, measures and schemes in the various flows discussed in the present application may be alternated, changed, combined or deleted. Further, other steps, measures and schemes in the various operations, methods and flows discussed in the present application may also be alternated, changed, rearranged, decomposed, combined or deleted. Further, steps, measures and schemes in the prior art that correspond to the operations, methods and flows disclosed in the present application may also be alternated, changed, rearranged, decomposed, combined or deleted.
In the description of the present application, it should be understood that the terms "center," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," and the like indicate orientations or positional relationships based on the orientation or positional relationships shown in the drawings, merely to facilitate describing the present application and simplify the description, and do not indicate or imply that the devices or elements referred to must have a specific orientation, be configured and operated in a specific orientation, and thus should not be construed as limiting the present application.
The terms "first," "second," and the like, are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defining "a first" or "a second" may explicitly or implicitly include one or more such feature. In the description of the present application, unless otherwise indicated, the meaning of "a plurality" is two or more.
In the description of the present application, it should be noted that, unless explicitly specified and limited otherwise, the terms "mounted," "connected," and "connected" are to be construed broadly, and may be either fixedly connected, detachably connected, or integrally connected, for example; can be directly connected or indirectly connected through an intermediate medium, and can be communication between two elements. The specific meaning of the above terms in the present application will be understood in specific cases by those of ordinary skill in the art.
In the description of the present specification, a particular feature, structure, material, or characteristic may be combined in any suitable manner in one or more embodiments or examples.
It should be understood that, although the steps in the flowcharts of the figures are shown in order as indicated by the arrows, these steps are not necessarily performed in order as indicated by the arrows. The steps are not strictly limited in order and may be performed in other orders, unless explicitly stated herein. Moreover, at least some of the steps in the flowcharts of the figures may include a plurality of sub-steps or stages that are not necessarily performed at the same time, but may be performed at different times, the order of their execution not necessarily being sequential, but may be performed in turn or alternately with other steps or at least a portion of the other steps or stages.
The foregoing is only a partial embodiment of the present application, and it should be noted that it will be apparent to those skilled in the art that modifications and adaptations can be made without departing from the principles of the present application, and such modifications and adaptations are intended to be comprehended within the scope of the present application.

Claims (11)

1. The human-vehicle interaction control method is characterized by comprising the following steps of:
confirming whether the authority of the current operator positioned outside the vehicle is qualified or not;
if the authority of the current operator is qualified, acquiring current vehicle state information, and confirming whether the current vehicle state information is qualified or not;
and if the current vehicle state information is qualified, acquiring first action image information of the current operator, and generating and sending an interactive execution instruction which can be executed by the vehicle according to the first action image information.
2. The human-vehicle interaction control method according to claim 1, wherein the confirming whether the authority of the current operator located outside the vehicle is qualified comprises:
confirming whether a legal car key exists in a detection area outside the car;
if a legal car key exists in the detection area, acquiring the facial image information of the current operator, and confirming whether the facial image information of the current operator is matched with the prerecorded facial information;
and if the facial image information of the current operator is matched with the prerecorded facial information, confirming that the authority of the current operator is qualified.
3. The human-vehicle interaction control method according to claim 1, wherein the confirming whether the authority of the current operator located outside the vehicle is qualified comprises:
confirming whether a legal car key exists in a detection area outside the car;
if a legal car key exists in the detection area, acquiring second action image information of the current operator, and confirming whether the second action image information of the current operator is matched with prerecorded unlocking image information or not;
and if the second action image information of the current operator is matched with the prerecorded unlocking image information, confirming that the authority of the current operator is qualified.
4. The human-vehicle interaction control method according to claim 1, wherein the acquiring of current vehicle state information and the confirming of whether the current vehicle state information is qualified comprise any one of:
acquiring engine state information, and if the engine state information indicates that the engine is off, confirming that the current vehicle state information is qualified;
acquiring gear state information, and if the gear state information indicates neutral or parking gear, confirming that the current vehicle state information is qualified;
and acquiring in-vehicle personnel information, and if the in-vehicle personnel information indicates that at least a main driving area and a secondary driving area are unoccupied, confirming that the current vehicle state information is qualified.
5. The human-vehicle interaction control method according to claim 4, wherein the acquiring of the in-vehicle personnel information and, if the in-vehicle personnel information indicates that at least a main driving area and a secondary driving area are unoccupied, the confirming that the current vehicle state information is qualified comprise:
acquiring in-vehicle life information, and if at least the main driving area and the secondary driving area show no sign of life, confirming that the current vehicle state information is qualified;
or acquiring in-vehicle seat load information, and if at least a main driver seat and a secondary driver seat are empty, confirming that the current vehicle state information is qualified.
6. The human-vehicle interaction control method according to claim 1, wherein the interaction execution instruction includes: at least one of door lock control, tail door control, car lamp control, car window control, hidden door handle control, front cabin cover control, windshield wiper control, atmosphere lamp control in the car, vehicle-mounted air conditioner control and electric seat control.
7. A human-vehicle interaction control device, characterized by comprising:
the permission confirmation module is used for confirming whether the authority of the current operator located outside the vehicle is qualified;
the vehicle state confirmation module is used for acquiring current vehicle state information and confirming whether the current vehicle state information is qualified or not if the authority of the current operator is qualified;
and the instruction generation module is used for acquiring first action image information of the current operator if the current vehicle state information is qualified, and generating and sending an interactive execution instruction which can be executed by the vehicle according to the first action image information.
8. A human-vehicle interactive control apparatus, characterized by comprising:
a processor;
a memory electrically connected to the processor;
at least one program stored in the memory and configured to be executed by the processor, the at least one program being configured to perform the human-vehicle interaction control method according to any one of claims 1-6.
9. A human-vehicle interaction system, comprising: the human-vehicle interactive control device according to claim 8, and an on-vehicle camera, a keyless sensing assembly and a vehicle body controller which are respectively in signal connection with a processor in the human-vehicle interactive control device.
10. A vehicle, characterized by comprising: the human-vehicle interaction system of claim 9.
11. A computer readable storage medium for storing computer instructions which, when run on a computer, implement the human-vehicle interaction control method of any one of the preceding claims 1-6.
CN202210125550.0A 2022-02-10 2022-02-10 Human-vehicle interaction control method and device, equipment, system, vehicle and storage medium Pending CN116620312A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210125550.0A CN116620312A (en) 2022-02-10 2022-02-10 Human-vehicle interaction control method and device, equipment, system, vehicle and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210125550.0A CN116620312A (en) 2022-02-10 2022-02-10 Human-vehicle interaction control method and device, equipment, system, vehicle and storage medium

Publications (1)

Publication Number Publication Date
CN116620312A true CN116620312A (en) 2023-08-22

Family

ID=87640475

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210125550.0A Pending CN116620312A (en) 2022-02-10 2022-02-10 Human-vehicle interaction control method and device, equipment, system, vehicle and storage medium

Country Status (1)

Country Link
CN (1) CN116620312A (en)


Legal Events

Date Code Title Description
PB01 Publication