WO2021147897A1 - Apparatus and method for controlling a vehicle-mounted device, and vehicle-mounted device - Google Patents

Apparatus and method for controlling a vehicle-mounted device, and vehicle-mounted device

Info

Publication number
WO2021147897A1
Authority
WO
WIPO (PCT)
Prior art keywords
driver
target
vehicle
sightline
display screen
Application number
PCT/CN2021/072867
Other languages
French (fr)
Inventor
Kazuhiko Sakai
Weicheng Zhou
Original Assignee
Faurecia Clarion Electronics (Xiamen) Co., Ltd.
Application filed by Faurecia Clarion Electronics (Xiamen) Co., Ltd.
Publication of WO2021147897A1

Classifications

    • B60K35/10
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60KARRANGEMENT OR MOUNTING OF PROPULSION UNITS OR OF TRANSMISSIONS IN VEHICLES; ARRANGEMENT OR MOUNTING OF PLURAL DIVERSE PRIME-MOVERS IN VEHICLES; AUXILIARY DRIVES FOR VEHICLES; INSTRUMENTATION OR DASHBOARDS FOR VEHICLES; ARRANGEMENTS IN CONNECTION WITH COOLING, AIR INTAKE, GAS EXHAUST OR FUEL SUPPLY OF PROPULSION UNITS IN VEHICLES
    • B60K35/00Arrangement of adaptations of instruments
    • B60K35/22
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • B60K2360/146
    • B60K2360/1464
    • B60K2360/149
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/038Indexing scheme relating to G06F3/038
    • G06F2203/0381Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer

Definitions

  • the present application relates to the field of vehicle-mounted devices, and in particular, to an apparatus and a method for controlling a vehicle-mounted device, and a vehicle-mounted device.
  • a vehicle is usually equipped with a vehicle-mounted device.
  • the present application provides an apparatus and a method for controlling a vehicle-mounted device, and a vehicle-mounted device, which may improve an operational convenience of the vehicle-mounted device.
  • embodiments of the present application provide an apparatus for controlling a vehicle-mounted device.
  • the apparatus for controlling the vehicle-mounted device includes: a sightline detection unit used to detect a sightline of a driver; a first display unit used to display a selection cursor over a target control in response to detecting that the sightline of the driver falls into a target region of a first display screen, the target control being a control displayed in the target region; a gesture detection unit used to detect a hand motion of the driver after the selection cursor is displayed over the target control.
  • the first display unit is further used to move the selection cursor according to the hand motion of the driver.
  • embodiments of the present application provide a method for controlling a vehicle-mounted device, which includes: detecting a sightline of a driver; displaying a selection cursor over a target control in response to detecting that the sightline of the driver falls into a target region of a first display screen, the target control being a control displayed in the target region; detecting a hand motion of the driver after displaying the selection cursor over the target control; and moving the selection cursor according to the hand motion of the driver.
  • embodiments of the present application provide an apparatus for controlling a vehicle-mounted device, which includes: a processor and a memory that is used for storing instructions that are capable of being executed by the processor; the processor is configured to execute the instructions to implement the method for controlling the vehicle-mounted device provided in the second aspect.
  • embodiments of the present application provide a computer-readable storage medium, which stores instructions that, when executed by a computer, cause the computer to perform the method for controlling the vehicle-mounted device provided in the second aspect.
  • the computer-readable storage medium may be a non-transitory computer-readable storage medium.
  • embodiments of the present application provide a computer program product that, when run on a computer, causes the computer to perform the method for controlling the vehicle-mounted device provided in the second aspect.
  • embodiments of the present application provide a vehicle-mounted device.
  • the vehicle-mounted device includes the apparatus for controlling the vehicle-mounted device provided in the first aspect and the first display screen.
  • the present application takes into account that, in a case where the area of the vehicle-mounted display screen is too large or the vehicle-mounted display screen is far away from the driver, the driver may need to make larger movements to operate a part of the vehicle-mounted display screen that is away from the driver.
  • the present application proposes to combine sightline recognition and gesture recognition: the sightline of the driver is detected, and thus the position range of the control that the driver wants to operate on the vehicle-mounted display screen is determined. After the sightline of the driver is detected to fall into a target region of the vehicle-mounted display screen, a selection cursor is displayed over a target control in the target region.
  • the vehicle-mounted device may be controlled by triggering the control.
  • the solution provided by the present application can not only avoid a problem of large control errors caused by the inability to accurately recognize the position corresponding to the sightline when sightline recognition alone is used, but also avoid a problem of complicated operations caused by using gesture recognition alone for control.
  • FIG. 1 is a first schematic diagram showing a structure of a vehicle-mounted device, in accordance with embodiments of the present application;
  • FIG. 2 is a first schematic diagram showing a structure of an apparatus for controlling a vehicle-mounted device, in accordance with embodiments of the present application;
  • FIG. 3 is a diagram showing a working process of an apparatus for controlling a vehicle-mounted device, in accordance with embodiments of the present application;
  • FIG. 4 is a first schematic diagram of a display interface, in accordance with embodiments of the present application.
  • FIG. 5 is a second schematic diagram of a display interface, in accordance with embodiments of the present application.
  • FIG. 6 is a schematic diagram showing a correspondence between gestures and control actions, in accordance with embodiments of the present application.
  • FIG. 7 is a third schematic diagram showing a structure of an apparatus for controlling a vehicle-mounted device, in accordance with embodiments of the present application.
  • FIG. 8 is a fourth schematic diagram showing a structure of an apparatus for controlling a vehicle-mounted device, in accordance with embodiments of the present application.
  • FIG. 9 is a second schematic diagram showing a structure of a vehicle-mounted device, in accordance with embodiments of the present application.
  • FIG. 10 is a schematic flow diagram of a detection of a sightline, in accordance with embodiments of the present application.
  • FIG. 11 is a schematic diagram of a detection of a sightline of a driver, in accordance with embodiments of the present application.
  • FIG. 12 is a fifth schematic diagram showing a structure of an apparatus for controlling a vehicle-mounted device, in accordance with embodiments of the present application.
  • FIG. 13 is a sixth schematic diagram showing a structure of an apparatus for controlling a vehicle-mounted device, in accordance with embodiments of the present application.
  • the term "for example” or “such as” is used to give an example, an illustration, or an explanation. Any embodiment or design described as “for example” or “such as” in the embodiments of the present application will not be construed as more preferred or advantageous over other embodiments or designs. Rather, use of the term “for example” or “such as” is intended to present relevant concepts in a specific manner.
  • the term "a plurality of" means two or more unless otherwise specified.
  • without departing from the scope of the embodiments of the present application, a first target marker may also be referred to as a second target marker, or the like.
  • the word “if” used herein may be construed as “in a case where” or “when” or “in response to determining” or “in response to detecting” .
  • the phrase "if (a stated condition or event) is determined" or "if (a stated condition or event) is detected" may be construed as "in a case where the stated condition or event is determined" or "in response to determining the stated condition or event" or "in a case where the stated condition or event is detected" or "in response to detecting the stated condition or event".
  • at present, a function of a vehicle-mounted device may be realized by performing a series of touch operations on a vehicle-mounted display screen. However, as the vehicle-mounted display screen becomes wider and longer, it may be difficult for a user (e.g., the driver) to touch an edge region of the vehicle-mounted display screen so as to control the vehicle-mounted device to perform some of its functions, which may cost the user more time and larger movements. That is to say, as the area of the vehicle-mounted display screen increases, the operational convenience of the vehicle-mounted device may be reduced.
  • the present application provides an apparatus for controlling a vehicle-mounted device and a method for controlling a vehicle-mounted device.
  • a sightline of the driver is detected to determine a position range of a control on the vehicle-mounted display screen that the driver wants to operate.
  • a selection cursor is displayed over a target control in the target region.
  • a hand motion of the driver is detected to move a position of the selection cursor, so that the selection cursor can be accurately positioned on the control that the driver wants to operate.
  • the control of the vehicle-mounted device may be realized by triggering the control.
  • FIG. 1 is a schematic diagram showing a structure of a vehicle-mounted device provided by the present application.
  • the vehicle-mounted device 10 includes a first camera 102, a second camera 103, and a first display screen 104.
  • the first camera 102 is used for capturing facial images of the driver, so as to detect the sightline of the driver according to the facial images of the driver.
  • the second camera 103 is used for capturing hand images of the driver, so as to detect the hand motion of the driver according to the hand images of the driver.
  • the first display screen 104 is used for displaying various types of information.
  • the first display screen 104 may be a display screen of a center console of a vehicle.
  • the first camera 102 and the second camera 103 are two different cameras.
  • the facial image and the hand image of the driver may be captured by a same camera. That is, the first camera 102 and the second camera 103 are the same camera.
  • the vehicle-mounted device 10 further includes the apparatus 101 for controlling the vehicle-mounted device which is used to apply methods for controlling the vehicle-mounted device provided by the present application.
  • the apparatus 101 for controlling the vehicle-mounted device may include a sightline detection unit 1011, a first display unit 1012, and a gesture detection unit 1013.
  • the sightline detection unit 1011 is used for detecting the sightline of the driver. For example, after the first camera 102 in FIG. 1 transmits a captured facial image of the driver to the apparatus 101 for controlling the vehicle-mounted device, the sightline detection unit 1011 detects the sightline of the driver by using the obtained facial image of the driver.
  • the first display unit 1012 is used for controlling contents displayed on the vehicle-mounted display screen. For example, the first display unit 1012 is used for controlling contents displayed on the first display screen 104 in FIG. 1.
  • the gesture detection unit 1013 is used for detecting the hand motion of the driver. For example, after the second camera 103 in FIG. 1 transmits a captured hand image of the driver to the apparatus 101 for controlling the vehicle-mounted device, the gesture detection unit 1013 detects the hand motion of the driver by using the obtained hand image of the driver.
  • the above functional units may be integrated in one processing unit, or each unit may physically exist separately, or two or more of the units may be integrated in one unit.
  • the present application does not limit this.
  • the working process of the apparatus 101 for controlling the vehicle-mounted device may include the following steps, i.e., S201 to S204.
  • in S201, the sightline detection unit 1011 detects the sightline of the driver.
  • in S202, in response to detecting that the sightline of the driver falls into a target region of the first display screen, the first display unit 1012 displays a selection cursor over a target control.
  • the target control may include a control displayed in the target region.
  • FIG. 4 is a schematic diagram of a display interface of the first display screen, in accordance with embodiments of the present application.
  • the first display screen is a long and narrow rectangular display screen.
  • a display region where the driver wants to perform operations is determined by detecting the sightline of the driver. For example, when the sightline of the driver is determined to fall into the target region A in FIG. 4, the selection cursor is displayed over a target control, i.e., "button A", in the target region A. For example, the button A is highlighted.
  • the contents displayed on the first display screen are as shown in FIG. 5, in which the "button A" is highlighted (the shaded portion in the figure is used for indicating the highlighting) .
  • the "control" referred to in the present application indicates a pattern that is displayed on the display interface and that performs a corresponding operation or function after being triggered.
  • a "control" may also be referred to as a control component, a button, or other names. It will be understood that these names may each be considered as the "control" referred to in the present application.
  • the "selection cursor" refers to a cursor used for indicating the position of the control that is to be triggered currently on the display interface.
  • the selection cursor may be a pattern mark with a specific shape (e.g., an arrow-shaped pattern, or a finger-shaped image).
  • the selection cursor may also be represented in a manner of changing a display form of the control to be triggered (e.g., changing it to a display form different from other controls' display forms, such as highlighting the control to be triggered).
  • in S203, after the selection cursor is displayed over the target control, the gesture detection unit 1013 detects the hand motion of the driver.
  • in S204, the first display unit 1012 moves the selection cursor according to the hand motion of the driver.
  • FIG. 6 shows a correspondence between hand motions and methods of controlling the selection cursor, in accordance with embodiments of the present application.
  • in (a) of FIG. 6, waving the hand leftwards will move the selection cursor to the left; waving the hand rightwards will move the selection cursor to the right; moving the hand upwards will move the selection cursor upwards; moving the hand downwards will move the selection cursor downwards; and making a fist means a determination, i.e., triggering a function corresponding to the selection cursor.
  • in (b) of FIG. 6, pointing upwards, downwards, leftwards and rightwards with a finger correspond to moving the selection cursor upwards, downwards, leftwards and rightwards, respectively; and making an "OK" motion by hand means a determination, i.e., triggering a function corresponding to the selection cursor.
  • the vehicle-mounted device may be controlled by triggering the control.
  • the solution provided by the present application can not only avoid a problem of large control error caused by the inability to accurately recognize a position corresponding to the sightline when sightline recognition alone is used, but also avoid a problem of complicated operations caused by using gesture recognition alone for control.
  • the apparatus 101 for controlling the vehicle-mounted device provided by the present application further includes a function trigger unit 1014.
  • the function trigger unit 1014 is used for triggering functions corresponding to various controls displayed on the display screen.
  • the working process of the apparatus 101 for controlling the vehicle-mounted device may further include: S205, in response to the gesture detection unit 1013 detecting a first target gesture of the driver, triggering, by the function trigger unit 1014, the function of a control currently corresponding to the selection cursor.
  • the first target gesture may be the motion of making a fist shown in (a) of FIG. 6, or the "OK" motion shown in (b) of FIG. 6. When it is detected that the driver makes a fist by hand, the function of the control currently corresponding to the selection cursor is triggered. Alternatively, when it is detected that the driver makes the "OK" motion by hand, the function of the control currently corresponding to the selection cursor is triggered.
  • the first display unit 1012 is further used to control the first display screen 104 to display a display interface corresponding to the control.
  • the function triggering unit 1014 is further used to trigger a function of a control displayed on the display screen 104, according to a gesture of the driver detected by the gesture detection unit 1013.
  • after a control for playing audio files is triggered, the vehicle-mounted device plays an audio file (e.g., the audio file of a song), and the display interface corresponding to the control (called a music playing interface) may be displayed on the first display screen 104 under the control of the first display unit 1012.
  • the function triggering unit 1014 may trigger the function of a control displayed on the music playing interface, such as a function of playing the next song, playing the previous song, increasing the volume, or decreasing the volume.
  • when it is unsuitable to perform operations by using gestures, the target region determined according to the sightline of the driver may be displayed on another screen closer to the driver, so that the driver can perform touch operations more easily.
  • two display screens may be provided in the vehicle: one farther from the driver and the other closer to the driver.
  • the display screen farther from the driver may have a larger display area for displaying richer contents, and the display screen closer to the driver may be used to display the contents in the target region that are determined according to the sightline of the driver.
  • the vehicle-mounted device 10 may further include the second display screen 105 used to display contents displayed in the target region of the first display screen 104.
  • the first display screen 104 may be the display screen closer to the driver, and the second display screen 105 may be the display screen farther from the driver.
  • the second display screen 105 is disposed at a distance from the driver greater than a distance from the first display screen 104 to the driver.
  • the apparatus 101 for controlling the vehicle-mounted device may further include a second display unit 1015.
  • the second display unit 1015 is used to control the second display screen 105 to display contents.
  • the working process of the apparatus 101 for controlling the vehicle-mounted device may further include:
  • the second target gesture may be a gesture of stretching five fingers from a fist-making state.
  • the second display screen may be a display screen of an electronic device in wireless or wired connection with the vehicle-mounted device, such as a mobile phone or a tablet computer.
  • the driver may perform relevant touch operations after placing the electronic device at a place convenient to operate for himself/herself.
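  • A minimal sketch of this hand-off, under the assumption of hypothetical display-unit objects (the real second display screen 105 could be another vehicle screen, a mobile phone, or a tablet computer):

```python
class Screen:
    """Hypothetical stand-in for a display unit; swap in the real interface
    of the second display screen 105 (vehicle screen, phone, or tablet)."""

    def __init__(self, name):
        self.name = name
        self.contents = None

    def show(self, contents):
        self.contents = contents
        print(f"{self.name} now shows: {contents}")


def on_second_target_gesture(target_region_contents, second_screen):
    # Second target gesture (e.g., stretching five fingers from a fist):
    # mirror the target-region contents onto the screen on which touch
    # operations are convenient for the driver.
    second_screen.show(target_region_contents)


near_screen = Screen("second display screen 105")
on_second_target_gesture(["button A", "button B"], near_screen)
```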
  • the target region may be set as a rectangular region, and the present application further provides a method for determining whether the sightline of the driver falls into the target region.
  • S201 in the present application may include the following steps.
  • in S2011, the position of the eye of the driver and the direction of the sightline of the driver are detected; for example, they may be determined from the facial image of the driver captured by the camera by using an image recognition technology.
  • in S2012, the coordinate values of the four vertices of the target region in the target coordinate system are calculated according to the position of the eye of the driver, wherein the target coordinate system includes a space rectangular coordinate system which takes the position of the eye of the driver as an origin and takes a vertical direction, a horizontal direction and a direction right ahead of the vehicle as coordinate axes.
  • FIG. 11 is a schematic diagram of the detection of the sightline of the driver, in accordance with embodiments of the present application. It is assumed that the position of the eye of the driver is at point O, a position of the camera for detecting the sightline of the driver is at point O', and the target region is a region shown by the rectangle abcd.
  • the target coordinate system may be the space rectangular coordinate system which takes the position of the eye of the driver as the origin and takes the vertical direction, the horizontal direction and the direction right ahead of the vehicle as the coordinate axes.
  • the coordinates of the four vertices a, b, c, and d relative to the target coordinate system are (x1, y1, z1), (x2, y2, z2), (x3, y3, z3), and (x4, y4, z4), and the coordinates of the point O are (0, 0, 0).
  • a coordinate axis in the vertical direction is referred to as a z-axis
  • a coordinate axis pointing to the front of the vehicle is referred to as an x-axis
  • a coordinate axis in the horizontal direction is referred to as a y-axis.
  • in S2013, it is determined whether the sightline of the driver falls into the target region, according to the coordinate values of the four vertices of the target region in the target coordinate system and the direction of the sightline of the driver.
  • for example, by determining whether the point of intersection, at which the sightline of the driver meets the plane where the display screen is located, falls into the rectangle formed by a, b, c, and d, it may be determined whether the sightline of the driver falls into the target region; a minimal sketch of this intersection test is given below.
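```python
import numpy as np

def gaze_hits_rectangle(direction, a, b, c, d):
    """Hedged illustration of the intersection test (not the patent's own
    formulas): intersect the sightline, a ray from the eye at the origin O
    along `direction`, with the screen plane, then check the hit point
    against the rectangle. Assumes a, b, c, d are NumPy arrays in the
    eye-origin target coordinate system, with a-b and a-c two
    perpendicular edges of the rectangle."""
    n = np.cross(b - a, c - a)          # normal of the screen plane
    denom = np.dot(n, direction)
    if abs(denom) < 1e-9:               # sightline parallel to the screen
        return False
    t = np.dot(n, a) / denom            # ray parameter of the intersection
    if t <= 0:                          # screen is behind the eye
        return False
    p = t * np.asarray(direction)       # intersection point
    u, v = b - a, c - a                 # perpendicular edge vectors
    s = np.dot(p - a, u) / np.dot(u, u)
    r = np.dot(p - a, v) / np.dot(v, v)
    return 0.0 <= s <= 1.0 and 0.0 <= r <= 1.0
```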
  • alternatively, S2013 may include: determining that the sightline of the driver falls into the target region if the following conditions 1 to 4 are all met.
  • Condition 1: the included angle between the sightline E of the driver and the Oxy plane is greater than the included angle between the Oab plane and the Oxy plane.
  • Condition 2: the included angle between the sightline E of the driver and the Oxy plane is less than the included angle between the Ocd plane and the Oxy plane.
  • Condition 3: the included angle between the sightline E of the driver and the Oyz plane is greater than the included angle between the Oac plane and the Oyz plane.
  • Condition 4: the included angle between the sightline E of the driver and the Oyz plane is less than the included angle between the Obd plane and the Oyz plane.
  • the Oxy plane is a plane formed by the x-axis and the y-axis in the target coordinate system.
  • the Oab plane is a plane formed by the O point, the a point, and the b point.
  • the Oyz plane is a plane formed by the y-axis and the z-axis in the target coordinate system.
  • the Ocd plane is a plane formed by the O point, the c point, and the d point.
  • the Oac plane is a plane formed by the O point, the a point, and the c point.
  • the Obd plane is a plane formed by the O point, the b point, and the d point.
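  • The formulas defining θ1 to θ4 appear as equation images in the published document. A plausible reconstruction (an assumption, not the patent's own notation) takes each θ as the unsigned dihedral angle between the corresponding plane through O and its reference plane, with ẑ the unit normal of the Oxy plane, x̂ the unit normal of the Oyz plane, and e the direction of the sightline E:

```latex
\[
\theta_1 = \arccos\frac{\lvert(\vec{a}\times\vec{b})\cdot\hat{z}\rvert}{\lVert\vec{a}\times\vec{b}\rVert},\qquad
\theta_2 = \arccos\frac{\lvert(\vec{c}\times\vec{d})\cdot\hat{z}\rvert}{\lVert\vec{c}\times\vec{d}\rVert},
\]
\[
\theta_3 = \arccos\frac{\lvert(\vec{a}\times\vec{c})\cdot\hat{x}\rvert}{\lVert\vec{a}\times\vec{c}\rVert},\qquad
\theta_4 = \arccos\frac{\lvert(\vec{b}\times\vec{d})\cdot\hat{x}\rvert}{\lVert\vec{b}\times\vec{d}\rVert},
\]
\[
\angle(E,\,Oxy) = \arcsin\frac{\lvert\vec{e}\cdot\hat{z}\rvert}{\lVert\vec{e}\rVert},\qquad
\angle(E,\,Oyz) = \arcsin\frac{\lvert\vec{e}\cdot\hat{x}\rvert}{\lVert\vec{e}\rVert}.
\]
```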
  • condition 1 may be that the included angle between the sightline E of the driver and the Oxy plane is greater than θ1;
  • x1, y1, and z1 are coordinates of the vertex a of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively; and x3, y3, and z3 are coordinates of the vertex c of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively.
  • θ1 is greater than or equal to 0 degrees and is less than or equal to 180 degrees (i.e., 0° ≤ θ1 ≤ 180°).
  • condition 2 may be that the included angle between the sightline E of the driver and the Oxy plane is less than θ2.
  • x1, y1, and z1 are the coordinates of the vertex a of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively;
  • x2, y2, and z2 are coordinates of the vertex b of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively;
  • x3, y3, and z3 are the coordinates of the vertex c of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively.
  • θ2 is greater than or equal to 0 degrees and is less than or equal to 180 degrees (i.e., 0° ≤ θ2 ≤ 180°).
  • condition 3 may be that the included angle between the sightline E of the driver and the Oyz plane is greater than θ3.
  • x1, y1, and z1 are the coordinates of the vertex a of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively;
  • x3, y3, and z3 are the coordinates of the vertex c of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively;
  • x4, y4, and z4 are coordinates of the vertex d of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively.
  • θ3 is greater than or equal to 0 degrees and is less than or equal to 180 degrees (i.e., 0° ≤ θ3 ≤ 180°).
  • condition 4 may be that the included angle between the sightline E of the driver and the Oyz plane is less than θ4.
  • x2, y2, and z2 are the coordinates of the vertex b of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively;
  • x3, y3, and z3 are the coordinates of the vertex c of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively;
  • x4, y4, and z4 are the coordinates of the vertex d of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively.
  • θ4 is greater than or equal to 0 degrees and is less than or equal to 180 degrees (i.e., 0° ≤ θ4 ≤ 180°).
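  • Putting conditions 1 to 4 together, a minimal sketch of the check follows. It is a hedged reconstruction, not the patent's own code, and it assumes unsigned angles, i.e., that the screen lies entirely to one side of each reference plane:

```python
import numpy as np

Z = np.array([0.0, 0.0, 1.0])  # unit normal of the Oxy plane (vertical z-axis)
X = np.array([1.0, 0.0, 0.0])  # unit normal of the Oyz plane (x-axis, straight ahead)

def angle_line_plane(e, n):
    """Unsigned angle between sightline direction e and the plane with unit normal n."""
    return np.degrees(np.arcsin(abs(np.dot(e, n)) / np.linalg.norm(e)))

def angle_plane_plane(p, q, n):
    """Unsigned dihedral angle between plane O-p-q (through the eye at O) and
    the reference plane with unit normal n."""
    m = np.cross(p, q)
    return np.degrees(np.arccos(abs(np.dot(m, n)) / np.linalg.norm(m)))

def sightline_in_target_region(e, a, b, c, d):
    # a, b, c, d: vertex coordinates in the eye-origin target coordinate
    # system, e.g., a = vertex_a_in_vehicle_frame - eye_position.
    return (angle_line_plane(e, Z) > angle_plane_plane(a, b, Z) and   # condition 1
            angle_line_plane(e, Z) < angle_plane_plane(c, d, Z) and   # condition 2
            angle_line_plane(e, X) > angle_plane_plane(a, c, X) and   # condition 3
            angle_line_plane(e, X) < angle_plane_plane(b, d, X))      # condition 4
```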
  • the apparatus and the method for controlling the vehicle-mounted device provided by the present application take into account that, in the case where the area of the vehicle-mounted display screen is excessively large or the vehicle-mounted display screen is located far away from the driver, the driver needs to make a large movement to perform operations on the part of the vehicle-mounted display screen away from the driver.
  • the sightline recognition and the gesture recognition are combined, and by detecting the sightline of the driver, the position range of the control that the driver wants to operate on the vehicle-mounted display screen is determined. After it is detected that the sightline of the driver falls into the target region of the vehicle-mounted display screen, the selection cursor is displayed over the target control in the target region.
  • the vehicle-mounted device may be controlled by triggering the control.
  • the present application further provides a method for controlling the vehicle-mounted device.
  • the method includes the following steps: detecting the sightline of the driver; in response to detecting that the sightline of the driver falls into the target region of the first display screen, displaying the selection cursor over the target control, which is a control displayed in the target region; detecting the hand motion of the driver after the selection cursor is displayed over the target control; and moving the selection cursor according to the hand motion of the driver.
  • in some embodiments, the method further includes: in response to detecting the first target gesture of the driver, triggering the function of the control currently corresponding to the selection cursor.
  • in some embodiments, the target region is a rectangular region.
  • the step of detecting the sightline of the driver includes: detecting the position of the eye of the driver and the direction of the sightline of the driver; calculating the coordinate values of the four vertices of the target region in the target coordinate system according to the position of the eye of the driver, wherein the target coordinate system includes the space rectangular coordinate system which takes the position of the eye of the driver as the origin and takes the vertical direction, the horizontal direction and the direction right ahead of the vehicle as the coordinate axes; and determining whether the sightline of the driver falls into the target region according to the coordinate values of the four vertices of the target region in the target coordinate system and the direction of the sightline of the driver.
  • in some embodiments, the method further includes: in response to detecting the second target gesture of the driver, displaying, on a second display screen, the contents displayed in the target region of the first display screen, wherein the second display screen includes the vehicle-mounted display screen at the distance from the driver greater than the distance from the first display screen to the driver.
  • FIG. 12 is a schematic diagram showing another possible structure of the apparatus for controlling the vehicle-mounted device involved in the above embodiments.
  • the apparatus 40 for controlling the vehicle-mounted device includes a processing module 401, a communication module 402, and a storage module 403.
  • the processing module 401 is used for controlling and managing actions of the apparatus 40 for controlling the vehicle-mounted device.
  • the processing module 401 is used for supporting the apparatus 40 for controlling the vehicle-mounted device to perform the steps in the above embodiments.
  • the communication module 402 is used for supporting communication between the apparatus 40 for controlling the vehicle-mounted device and other entities.
  • the storage module 403 is used for storing program codes and data of the apparatus 40 for controlling the vehicle-mounted device.
  • the processing module 401 may be a processor or a controller, such as a central processing unit (CPU) , a general-purpose processor, a digital signal processor (DSP) , an application-specific integrated circuit (ASIC) , a field programmable gate array (FPGA) or other programmable logic devices, transistor logic devices, hardware components or any combination thereof. It may implement or execute various illustrative logical blocks, modules and circuits described in the disclosure of the present application.
  • the processor may also be a combination implementing computing functions, for example, a combination including one or more microprocessors, or a combination of a DSP and a microprocessor.
  • the communication module 402 may be a transceiver, a transceiver circuit, a communication interface, or the like.
  • the storage module 403 may be a memory.
  • the apparatus for controlling the vehicle-mounted device involved in the embodiments of the present application may be the apparatus 50 for controlling a vehicle-mounted device as follows.
  • the apparatus 50 for controlling the vehicle-mounted device includes a processor 501 and a memory 503.
  • the apparatus 50 for controlling the vehicle-mounted device may further include a transceiver 502 and a bus 504.
  • the processor 501, the transceiver 502, and the memory 503 are connected to each other through the bus 504.
  • the bus 504 may be a peripheral component interconnect (PCI) bus, an extended industry standard architecture (EISA) bus, or the like.
  • the bus may be divided into an address bus, a data bus, a control bus, and the like. For ease of representation, in the figure, only one thick line is used to represent the bus, but this does not mean that there is only one bus or one type of bus.
  • the processor 501 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling execution of programs of the solutions in the present application.
  • the memory 503 may be a read-only memory (ROM) or other types of static storage devices that may store static information and instructions, a random access memory (RAM) or other types of dynamic storage devices that may store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other compact disc storages (including a compact disc, a laser disc, an optical disc, a digital versatile disc, or a Blu-ray disc), a magnetic disk storage medium or other magnetic storage devices, or any other medium that can be used for carrying or storing desired program codes in a form of instructions or data structures and that can be accessed by a computer, which is not limited thereto.
  • the memory may exist separately and is connected to the processor through the bus. The memory may also be integrated with the processor.
  • the memory 503 is used for storing application program codes for performing the solutions in the present application, and the solutions in the present application are controlled and performed by the processor 501.
  • the transceiver 502 is used for receiving contents input by an external device, and the processor 501 is used for executing the application program codes stored in the memory 503, thereby implementing functions of various virtual units in the apparatus for controlling the vehicle-mounted device in the embodiments of the present application.
  • the functional units in the embodiments of the present application may be integrated in one processing unit, or each unit may physically exist separately, or two or more of the units may be integrated in one unit.
  • the functions may be implemented in whole or in part through software, hardware, firmware, or any combination thereof.
  • the functions can be implemented in a form of a computer program product in whole or in part.
  • the computer program product includes one or more computer instructions.
  • when the computer program instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are generated in whole or in part.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium, or may be transmitted from one computer-readable storage medium to another computer-readable storage medium.
  • the computer instructions may be transmitted from one website, computer, server or data center to another website, computer, server or data center through wired means (e.g., coaxial cable, optical fiber, or digital subscriber line (DSL)) or wireless means (e.g., infrared, radio, or microwave).
  • the computer-readable storage medium may be any available medium that can be accessed by the computer, or a data storage device, such as a server or a data center, integrating one or more available media.
  • the available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape) , an optical medium (e.g., DVD) , a semiconductor medium (e.g., a solid state disk (SSD) ) , or the like.

Abstract

An apparatus for controlling a vehicle-mounted device is provided. The apparatus includes: a sightline detection unit used to detect a sightline of a driver; a first display unit used to display a selection cursor over a target control in response to detecting that the sightline of the driver falls into a target region of a first display screen, the target control being a control displayed in the target region; and a gesture detection unit used to detect a hand motion of the driver after the selection cursor is displayed over the target control. The first display unit is further used to move the selection cursor according to the hand motion of the driver.

Description

APPARATUS AND METHOD FOR CONTROLLING A VEHICLE-MOUNTED DEVICE, AND VEHICLE-MOUNTED DEVICE
This application claims priority to Chinese Patent Application No. 202010072018.8, filed on January 21, 2020, which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
The present application relates to the field of vehicle-mounted devices, and in particular, to an apparatus and a method for controlling a vehicle-mounted device, and a vehicle-mounted device.
BACKGROUND
A vehicle is usually equipped with a vehicle-mounted device.
SUMMARY
The present application provides an apparatus and a method for controlling a vehicle-mounted device, and a vehicle-mounted device, which may improve an operational convenience of the vehicle-mounted device.
In order to achieve the above purpose, in embodiments of the present application, the following technical solutions are adopted.
In a first aspect, embodiments of the present application provide an apparatus for controlling a vehicle-mounted device. The apparatus for controlling the vehicle-mounted device includes: a sightline detection unit used to detect a sightline of a driver; a first display unit used to display a selection cursor over a target control in response to detecting that the sightline of the driver falls into a target region of a first display screen, the target control being a control displayed in the target region; a gesture detection unit used to detect a hand motion of the driver after the selection cursor is displayed over the target control. The first display unit is further used to move the selection cursor according to the hand motion of the driver.
In a second aspect, embodiments of the present application provide a method for controlling a vehicle-mounted device, which includes: detecting a sightline of a driver; displaying a selection cursor over a target control in response to detecting that the sightline of the driver falls into a target region of a first display screen, the target control being a control displayed in the target region; detecting a hand motion of the driver after  displaying the selection cursor over the target control; and moving the selection cursor according to the hand motion of the driver.
In a third aspect, embodiments of the present application provide an apparatus for controlling a vehicle-mounted device, which includes: a processor and a memory that is used for storing instructions that are capable of being executed by the processor; the processor is configured to execute the instructions to implement the method for controlling the vehicle-mounted device provided in the second aspect.
In a fourth aspect, embodiments of the present application provide a computer-readable storage medium, which stores instructions that, when executed by a computer, cause the computer to perform the method for controlling the vehicle-mounted device provided in the second aspect. The computer-readable storage medium may be a non-transitory computer-readable storage medium.
In a fifth aspect, embodiments of the present application provide a computer program product that, when run on a computer, causes the computer to perform the method for controlling the vehicle-mounted device provided in the second aspect.
In a sixth aspect, embodiments of the present application provide a vehicle-mounted device. The vehicle-mounted device includes the apparatus for controlling the vehicle-mounted device provided in the first aspect and the first display screen.
The present application takes into account that, in a case where the area of the vehicle-mounted display screen is too large or the vehicle-mounted display screen is far away from the driver, the driver may need to make larger movements to operate a part of the vehicle-mounted display screen that is away from the driver. For this problem, the present application proposes to combine sightline recognition and gesture recognition: the sightline of the driver is detected, and thus the position range of the control that the driver wants to operate on the vehicle-mounted display screen is determined. After the sightline of the driver is detected to fall into a target region of the vehicle-mounted display screen, a selection cursor is displayed over a target control in the target region. Then, by detecting a hand motion of the driver, the position of the selection cursor is moved, so that the selection cursor can be accurately positioned on the control that the driver wants to operate. In this way, after the selection cursor is moved over the control that the driver wants to operate, the vehicle-mounted device may be controlled by triggering the control. The solution provided by the present application can not only avoid a problem of large control errors caused by the inability to accurately recognize the position corresponding to the sightline when sightline recognition alone is used, but also avoid a problem of complicated operations caused by using gesture recognition alone for control.
BRIEF DESCRIPTION OF THE DRAWINGS
In order to describe technical solutions in embodiments of the present application or in the prior art more clearly, the accompanying drawings to be used in the description of the embodiments or the prior art will be introduced briefly below.
FIG. 1 is a first schematic diagram showing a structure of a vehicle-mounted device, in accordance with embodiments of the present application;
FIG. 2 is a first schematic diagram showing a structure of an apparatus for controlling a vehicle-mounted device, in accordance with embodiments of the present application;
FIG. 3 is a diagram showing a working process of an apparatus for controlling a vehicle-mounted device, in accordance with embodiments of the present application;
FIG. 4 is a first schematic diagram of a display interface, in accordance with embodiments of the present application;
FIG. 5 is a second schematic diagram of a display interface, in accordance with embodiments of the present application;
FIG. 6 is a schematic diagram showing a correspondence between gestures and control actions, in accordance with embodiments of the present application;
FIG. 7 is a third schematic diagram showing a structure of an apparatus for controlling a vehicle-mounted device, in accordance with embodiments of the present application;
FIG. 8 is a fourth schematic diagram showing a structure of an apparatus for controlling a vehicle-mounted device, in accordance with embodiments of the present application;
FIG. 9 is a second schematic diagram showing a structure of a vehicle-mounted device, in accordance with embodiments of the present application;
FIG. 10 is a schematic flow diagram of a detection of a sightline, in accordance with embodiments of the present application;
FIG. 11 is a schematic diagram of a detection of a sightline of a driver, in  accordance with embodiments of the present application;
FIG. 12 is a fifth schematic diagram showing a structure of an apparatus for controlling a vehicle-mounted device, in accordance with embodiments of the present application; and
FIG. 13 is a sixth schematic diagram showing a structure of an apparatus for controlling a vehicle-mounted device, in accordance with embodiments of the present application.
DETAILED DESCRIPTION
The embodiments of the present application will be described below with reference to the accompanying drawings.
In the embodiments of the present application, the term "for example" or "such as" is used to give an example, an illustration, or an explanation. Any embodiment or design described as "for example" or "such as" in the embodiments of the present application will not be construed as more preferred or advantageous over other embodiments or designs. Rather, use of the term "for example" or "such as" is intended to present relevant concepts in a specific manner. In addition, in the description of the embodiments of the present application, the term "a plurality of" means two or more unless otherwise specified.
Terms used in the embodiments of the present application are only for a purpose of describing specific embodiments, and are not intended to limit the present application. Singular forms such as "a" , "an" and "the" used in the embodiments of the present application and the appended claims are also intended to include plural forms, unless the context clearly indicates other meanings.
It will be understood that, although the terms such as "first" , "second" and "third" may be used in the embodiments of the present application to describe various markers, thresholds, signals and instructions, these markers, thresholds, signals and instructions will not be limited to these terms. These terms are only used to distinguish the markers, thresholds, signals, and instructions from one another. For example, without departing from the scope of the embodiments of the present application, a first target marker may also be referred to as a second target marker or the like.
Depending on the context, the word "if" used herein may be construed as "in a case where" or "when" or "in response to determining" or "in response to detecting". Similarly, depending on the context, the phrase "if (a stated condition or event) is determined" or "if (a stated condition or event) is detected" may be construed as "in a case where the stated condition or event is determined" or "in response to determining the stated condition or event" or "in a case where the stated condition or event is detected" or "in response to detecting the stated condition or event".
At present, a function of a vehicle-mounted device may be realized by performing a series of touch operations on a vehicle-mounted display screen. However, as the vehicle-mounted display screen becomes wider and longer, it may be difficult for a user (e.g., the driver) to touch an edge region of the vehicle-mounted display screen so as to control the vehicle-mounted device to perform some of its functions, and it may cost the user more time and larger movements to touch the edge region of the vehicle-mounted display screen. That is to say, as the area of the vehicle-mounted display screen increases, the operational convenience of the vehicle-mounted device may be reduced.
Considering that, in a case where the vehicle-mounted display screen is located far away from the driver, it becomes inconvenient for the driver to control the vehicle-mounted device through direct touch operations on the vehicle-mounted display screen, there is a need to control the vehicle-mounted device in an indirect manner rather than by direct touch. However, in existing control methods, whether the control is realized by using gesture recognition or voice recognition, there are problems such as low recognition precision and complicated operations.
Therefore, the present application provides an apparatus for controlling a vehicle-mounted device and a method for controlling a vehicle-mounted device. In the present application, a sightline of the driver is detected to determine a position range of a control on the vehicle-mounted display screen that the driver wants to operate. Then, after it is detected that the sightline of the driver falls into a target region of the vehicle-mounted display screen, a selection cursor is displayed over a target control in the target region. Then, a hand motion of the driver is detected to move a position of the selection cursor, so that the selection cursor can be accurately positioned on the control that the driver wants to operate. In this way, after the selection cursor is moved over the control that the driver wants to operate, the control of the vehicle-mounted device may  be realized by triggering the control.
Based on the above inventive concept, embodiments of the present application provide a method for controlling the vehicle-mounted device, which is applied to the apparatus for controlling the vehicle-mounted device. For example, FIG. 1 is a schematic diagram showing a structure of a vehicle-mounted device provided by the present application. The vehicle-mounted device 10 includes a first camera 102, a second camera 103, and a first display screen 104. The first camera 102 is used for capturing facial images of the driver, so as to detect the sightline of the driver according to the facial images of the driver. The second camera 103 is used for capturing hand images of the driver, so as to detect the hand motion of the driver according to the hand images of the driver. The first display screen 104 is used for displaying various types of information. For example, the first display screen 104 may be a display screen of a center console of a vehicle.
In some examples, the first camera 102 and the second camera 103 are two different cameras. Of course, in some other scenarios, the facial image and the hand image of the driver may be captured by a same camera. That is, the first camera 102 and the second camera 103 are the same camera.
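As a minimal sketch of this capture setup (assuming OpenCV and hypothetical device indices 0 and 1 for the two cameras; with a single camera, both frames would come from the same capture):

```python
import cv2

face_cam = cv2.VideoCapture(0)  # first camera 102: facial images of the driver
hand_cam = cv2.VideoCapture(1)  # second camera 103: hand images of the driver

ok_face, face_frame = face_cam.read()
ok_hand, hand_frame = hand_cam.read()
if ok_face and ok_hand:
    # face_frame would be passed to the sightline detection unit and
    # hand_frame to the gesture detection unit; the detection models
    # themselves are outside the scope of this sketch.
    pass

face_cam.release()
hand_cam.release()
```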
In addition, as shown in FIG. 1, the vehicle-mounted device 10 further includes the apparatus 101 for controlling the vehicle-mounted device which is used to apply methods for controlling the vehicle-mounted device provided by the present application.
As shown in FIG. 2, the apparatus 101 for controlling the vehicle-mounted device may include a sightline detection unit 1011, a first display unit 1012, and a gesture detection unit 1013.
The sightline detection unit 1011 is used for detecting the sightline of the driver. For example, after the first camera 102 in FIG. 1 transmits a captured facial image of the driver to the apparatus 101 for controlling the vehicle-mounted device, the sightline detection unit 1011 detects the sightline of the driver by using the obtained facial image of the driver. The first display unit 1012 is used for controlling contents displayed on the vehicle-mounted display screen. For example, the first display unit 1012 is used for controlling contents displayed on the first display screen 104 in FIG. 1. The gesture detection unit 1013 is used for detecting the hand motion of the driver. For example, after the second camera 103 in FIG. 1 transmits a captured hand image of the driver to the apparatus 101 for controlling the vehicle-mounted device, the gesture detection unit  1013 detects the hand motion of the driver by using the obtained hand image of the driver.
In some implementations, the above functional units may be integrated in one processing unit, or each unit may physically exist separately, or two or more of the units may be integrated in one unit. The present application does not limit this.
A working process of the apparatus 101 for controlling the vehicle-mounted device provided by the embodiments of the present application will be described below. As shown in FIG. 3, the working process may include the following steps, i.e., S201 to S204.
In S201, the sightline detection unit 1011 detects the sightline of the driver.
In S202, in response to detecting that the sightline of the driver falls into a target region of the first display screen, the first display unit 1012 displays a selection cursor over a target control.
The target control may include a control displayed in the target region.
For example, FIG. 4 is a schematic diagram of a display interface of the first display screen, in accordance with embodiments of the present application. As can be seen, the first display screen is a long and narrow rectangular display screen. When the driver directly touches the display screen, it is difficult for the driver to perform touch operations on a part of the display screen away from the driver. Therefore, in the present application, a display region where the driver wants to perform operations is determined by detecting the sightline of the driver. For example, when the sightline of the driver is determined to fall into the target region A in FIG. 4, the selection cursor is displayed over a target control, i.e., "button A", in the target region A. For example, the button A is highlighted. In this case, the contents displayed on the first display screen are as shown in FIG. 5, in which the "button A" is highlighted (the shaded portion in the figure indicates the highlighting).
It will be noted that, the "control" referred to in the present application is used for indicating a pattern that is displayed on the display interface and that performs a corresponding operation or function after being triggered. In some scenarios, the "control" may also be referred to as a control component, a button, or other names. It will be understood that, these names may each be considered as the "control" referred to in the present application.
In addition, the "selection cursor" referred to in the present application refers to a cursor used for displaying a position of a control that is to be triggered currently on the display interface. For example, the selection cursor may be a pattern mark with a specific shape (e.g., an arrow-shaped pattern, or a finger-shaped image) . For another example, the selection cursor may also be represented in a manner of changing a display form of the control to be triggered (e.g., changing it to a display form different from other controls' display forms, such as highlighting the control to be triggered) .
In S203, after the selection cursor is displayed over the target control, the gesture detection unit 1013 detects the hand motion of the driver.
In S204, the first display unit 1012 moves the selection cursor according to the hand motion of the driver.
For example, FIG. 6 shows a correspondence between hand motions and methods of controlling the selection cursor, in accordance with embodiments of the present application. In (a) of FIG. 6, waving the hand leftwards will move the selection cursor to the left; waving the hand rightwards will move the selection cursor to the right; moving the hand upwards will move the selection cursor upwards; moving the hand downwards will move the selection cursor downwards; and making a fist means a determination, i.e., triggering a function corresponding to the selection cursor. In (b) of FIG. 6, pointing upwards, downwards, leftwards and rightwards with a finger correspond to moving the selection cursor upwards, downwards, leftwards and rightwards, respectively; and making an "OK" motion by hand means a determination, i.e., triggering a function corresponding to the selection cursor.
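A minimal sketch of the correspondence in (a) of FIG. 6 as a dispatch table (the display-unit object `ui` and its `move_cursor` and `trigger_selected_control` methods are hypothetical stand-ins for the first display unit 1012 and the function trigger unit 1014):

```python
from enum import Enum, auto

class Gesture(Enum):
    WAVE_LEFT = auto()
    WAVE_RIGHT = auto()
    MOVE_UP = auto()
    MOVE_DOWN = auto()
    FIST = auto()  # the first target gesture: confirm/trigger

CURSOR_ACTIONS = {
    Gesture.WAVE_LEFT:  lambda ui: ui.move_cursor(-1, 0),   # cursor left
    Gesture.WAVE_RIGHT: lambda ui: ui.move_cursor(+1, 0),   # cursor right
    Gesture.MOVE_UP:    lambda ui: ui.move_cursor(0, -1),   # cursor up
    Gesture.MOVE_DOWN:  lambda ui: ui.move_cursor(0, +1),   # cursor down
    Gesture.FIST:       lambda ui: ui.trigger_selected_control(),
}

def on_gesture(ui, gesture):
    # Called by the gesture detection unit each time a gesture is recognized.
    action = CURSOR_ACTIONS.get(gesture)
    if action is not None:
        action(ui)
```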
In this way, after the selection cursor is moved over the control that the driver wants to operate, the vehicle-mounted device may be controlled by triggering the control. The solution provided by the present application can not only avoid a problem of large control error caused by the inability to accurately recognize a position corresponding to the sightline when sightline recognition alone is used, but also avoid a problem of complicated operations caused by using gesture recognition alone for control.
In an implementation, as shown in FIG. 7, the apparatus 101 for controlling the vehicle-mounted device provided by the present application further includes a function trigger unit 1014. The function trigger unit 1014 is used for triggering functions corresponding to various controls displayed on the display screen.
As shown in FIG. 3, the working process of the apparatus 101 for controlling the vehicle-mounted device may further include:
S205, in response to the gesture detection unit 1013 detecting a first target gesture of the driver, triggering, by the function triggering unit 1014, the function of a control currently corresponding to the selection cursor.
For example, the first target gesture may be the motion of making a fist shown in (a) of FIG. 6, or the "OK" motion shown in (b) of FIG. 6. When it is detected that the driver makes the motion of making a fist by hand, the function of the control currently corresponding to the selection cursor is triggered. Or, when it is detected that the driver makes the "OK" motion by hand, the function of the control currently corresponding to the selection cursor is triggered.
In some embodiments, after the function of the control currently corresponding to the selection cursor is triggered, the first display unit 1012 is further used to control the first display screen 104 to display a display interface corresponding to the control. The function triggering unit 1014 is further used to trigger a function of a control displayed on the display interface, according to a gesture of the driver detected by the gesture detection unit 1013.
For example, after a control for playing audio files is triggered by the function triggering unit 1014, the vehicle-mounted device plays an audio file (e.g., an audio file of a song), and the display interface corresponding to the control (referred to as the music playing interface) may be displayed on the first display screen 104 under the control of the first display unit 1012. According to the gesture of the driver detected by the gesture detection unit 1013, the function triggering unit 1014 may trigger the function of a control displayed on the music playing interface, such as a function of playing the next song, playing the previous song, increasing the volume, or decreasing the volume.
In one implementation, when it is unsuitable to perform operations by using gestures, the contents in the target region determined according to the sightline of the driver may be displayed on another screen closer to the driver, so as to facilitate the driver's touch operations.
In a possible design, in the present application, two display screens may be provided in the vehicle: one display screen is farther from the driver, and the other display screen is closer to the driver. The display screen farther from the driver may have a larger display area for displaying richer contents, and the display screen closer to the driver may be used to display the contents in the target region that is determined according to the sightline of the driver.
For example, as shown in FIG. 9, besides the first display screen 104, the vehicle-mounted device 10 may further include the second display screen 105 used to display the contents displayed in the target region of the first display screen 104. The first display screen 104 may be the display screen closer to the driver, and the second display screen 105 may be the display screen farther from the driver; that is, the second display screen 105 is disposed at a distance from the driver greater than the distance from the first display screen 104 to the driver.
Further, as shown in FIG. 8, the apparatus 101 for controlling the vehicle-mounted device may further include a second display unit 1015. The second display unit 1015 is used to control the second display screen 105 to display contents. As shown in FIG. 3, the working process of the apparatus 101 for controlling the vehicle-mounted device may further include:
S206, in response to the gesture detection unit 1013 detecting a second target gesture of the driver, displaying, by the second display unit 1015, the contents currently displayed in the target region on the second display screen 105.
For example, the second target gesture may be a gesture of stretching five fingers out from a fist. The second display screen may be a display screen of an electronic device in wireless or wired connection with the vehicle-mounted device, such as a mobile phone or a tablet computer. As such, after the driver makes the gesture of stretching five fingers out from the fist, the contents displayed in the target region are displayed on the display screen of the electronic device, and the driver may then perform relevant touch operations after placing the electronic device at a place convenient for him/her to operate.
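As an illustration of S206 only, the mirroring step can be sketched as follows. The screen classes, the gesture label, and the capture/show methods are hypothetical names introduced for this sketch, not part of the published embodiments.

# Illustrative sketch: when the second target gesture (stretching five fingers
# out from a fist) is detected, the contents of the target region are shown on
# the second display screen (S206). All names here are assumptions.
class FirstScreen:
    def capture(self, region):
        return f"contents of region {region}"

class SecondScreen:
    def show(self, contents):
        print("second screen:", contents)

def on_gesture(label, first_screen, second_screen, target_region):
    if label == "open_palm_from_fist":   # second target gesture
        second_screen.show(first_screen.capture(target_region))

# Example: mirror target region A after the second target gesture is detected.
on_gesture("open_palm_from_fist", FirstScreen(), SecondScreen(), "A")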
In addition, in an implementation, the target region may be set as a rectangular region, and the present application further provides a method for determining whether the sightline of the driver falls into the target region. For example, as shown in FIG. 10, S201 in the present application may include the following steps.
In S2011, a position of an eye of the driver and a direction of the sightline of the driver are detected.
For example, as described above, the position of the eye of the driver and the direction of the sightline of the driver may be determined by applying an image recognition technology to the facial image of the driver captured by the camera.
In S2012, coordinate values of four vertices of the target region in a target coordinate system are calculated according to the position of the eye of the driver.
The target coordinate system includes a space rectangular coordinate system which takes the position of the eye of the driver as an origin and takes a vertical direction, a horizontal direction and a direction right ahead of the vehicle as coordinate axes.
For example, FIG. 11 is a schematic diagram of the detection of the sightline of the driver, in accordance with embodiments of the present application. It is assumed that the position of the eye of the driver is at point O, a position of the camera for detecting the sightline of the driver is at point O', and the target region is a region shown by the rectangle abcd.
Since the relative position between the camera and the target region is fixed, the coordinates of the four vertices a, b, c, and d of the target region in a space coordinate system with the point O' as an origin may be known. In addition, according to the position of the eye of the driver, the coordinates of a, b, c, and d in the space coordinate system with the point O' as the origin may be converted to coordinates in the target coordinate system with the point O as the origin. For example, the target coordinate system may be the space rectangular coordinate system which takes the position of the eye of the driver as the origin and takes the vertical direction, the horizontal direction and the direction right ahead of the vehicle as the coordinate axes.
It is assumed that the coordinates of the four vertices a, b, c, and d relative to the target coordinate system are (x1, y1, z1), (x2, y2, z2), (x3, y3, z3), and (x4, y4, z4), and the coordinates of the point O are (0, 0, 0).
For convenience of description, in the following description of the present application, in the target coordinate system, a coordinate axis in the vertical direction is referred to as a z-axis, a coordinate axis pointing to the front of the vehicle is referred to as an x-axis, and a coordinate axis in the horizontal direction is referred to as a y-axis.
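By way of illustration, if the camera frame and the target coordinate system are assumed to share axis orientations, the conversion described above reduces to subtracting the eye position measured in the camera frame. That alignment is an assumption of this sketch (a rotation would otherwise also be needed), and the function name and values are hypothetical.

# Sketch: convert a vertex from the camera frame (origin O') to the eye-origin
# target frame (origin O), assuming both frames share axis orientations so
# that only a translation is needed -- an assumption of this sketch.
def to_eye_frame(vertex_cam, eye_cam):
    """vertex_cam, eye_cam: (x, y, z) coordinates in the frame with O' as origin."""
    return tuple(v - e for v, e in zip(vertex_cam, eye_cam))

# Example: vertex a measured in the camera frame, eye position in the same frame.
eye_cam = (0.1, -0.4, 0.2)
a_cam = (0.8, -0.2, 0.1)
a = to_eye_frame(a_cam, eye_cam)   # (x1, y1, z1) in the target coordinate system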
In S2013, it is determined whether the sightline of the driver falls into the target region, according to the coordinate values of the four vertices of the target region in the target coordinate system and the direction of the sightline of the driver.
For example, after the coordinate values of the four vertices of the target region in the target coordinate system and the direction of the sightline of the driver are known, it may be determined whether the sightline of the driver falls into the target region. For example, by determining whether the point of intersection, at which the sightline of the driver meets the plane where the display screen is located, falls into the rectangle formed by a, b, c, and d, it may be determined whether the sightline of the driver falls into the target region.
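A minimal sketch of this intersection test follows, assuming the vertices are given in the target coordinate system with the eye at the origin, that b and c are the two vertices adjacent to a (so the edges ab and ac are perpendicular), and that the sightline is supplied as a direction vector; all names are hypothetical.

import numpy as np

# Sketch: cast the sightline from the eye (origin O) and check whether it
# crosses the plane of rectangle abcd inside the rectangle. Assumes b and c
# are the vertices adjacent to a, so the edges ab and ac are perpendicular.
def sightline_in_region(direction, a, b, c):
    a, b, c, d_vec = map(np.asarray, (a, b, c, direction))
    u, v = b - a, c - a                  # perpendicular edges from vertex a
    n = np.cross(u, v)                   # normal of the screen plane
    denom = n.dot(d_vec)
    if abs(denom) < 1e-9:                # sightline parallel to the plane
        return False
    t = n.dot(a) / denom                 # O + t * d_vec lies on the plane
    if t <= 0:                           # the plane is behind the driver
        return False
    p = t * d_vec - a                    # intersection point, relative to a
    s, r = p.dot(u) / u.dot(u), p.dot(v) / v.dot(v)
    return 0.0 <= s <= 1.0 and 0.0 <= r <= 1.0

# Example: a sightline straight ahead hits a screen rectangle 2 m in front.
print(sightline_in_region((1.0, 0.0, 0.0),
                          a=(2.0, -0.5, 0.5), b=(2.0, 0.5, 0.5), c=(2.0, -0.5, -0.5)))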
In an implementation, S2013 may include: determining that the sightline of the driver falls into the target region if the following conditions 1 to 4 are all met.
Condition 1: an included angle between the sightline E of the driver and an Oxy plane is greater than an included angle between an Oab plane and the Oxy plane.
Condition 2: the included angle between the sightline E of the driver and the Oxy plane is less than an included angle between an Ocd plane and the Oxy plane.
Condition 3: an included angle between the sightline E of the driver and an Oyz plane is greater than an included angle between an Oac plane and the Oyz plane.
Condition 4: the included angle between the sightline E of the driver and the Oyz plane is less than an included angle between an Obd plane and the Oyz plane.
The Oxy plane is a plane formed by the x-axis and the y-axis in the target coordinate system. The Oab plane is a plane formed by the O point, the a point, and the b point. The Oyz plane is a plane formed by the y-axis and the z-axis in the target coordinate system. The Ocd plane is a plane formed by the O point, the c point, and the d point. The Oac plane is a plane formed by the O point, the a point, and the c point. The Obd plane is a plane formed by the O point, the b point, and the d point.
Further, in a possible design, the condition 1 may be that the included angle between the sightline E of the driver and the Oxy plane is greater than θ1;
θ1 = arccos [ (x1y3 - y1x3) / √((y1z3 - z1y3)² + (z1x3 - z3x1)² + (x1y3 - y1x3)²) ]
where x1, y1, and z1 are coordinates of the vertex a of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively; and x3, y3, and z3 are coordinates of the vertex c of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively. In addition, θ1 is greater than or equal to 0 degrees and is less than or equal to 180 degrees (i.e., 0≤θ1≤180°) .
For example, in a case where the coordinates of the point a and the point c relative to the target coordinate system are (x1, y1, z1) and (x3, y3, z3), respectively, according to the formula for the cross product, it will be known that a normal vector of the plane Oac is Oac = Oa × Oc. That is, Oac = (y1z3-z1y3, z1x3-z3x1, x1y3-y1x3).
Further, according to the cosine formula, it will be known that:
cos θ1 = (Oac · k) / (|Oac| |k|) = (x1y3 - y1x3) / √((y1z3 - z1y3)² + (z1x3 - z3x1)² + (x1y3 - y1x3)²), where k = (0, 0, 1) is a normal vector of the Oxy plane.
Further, it will be known that:
θ1 = arccos [ (x1y3 - y1x3) / √((y1z3 - z1y3)² + (z1x3 - z3x1)² + (x1y3 - y1x3)²) ]
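The derivation above can be checked numerically. The sketch below follows the reconstruction of θ1 as the angle between the plane Oac and the Oxy plane; since the published formula images are not reproduced here, that reading is an assumption of this sketch.

import math

# Sketch: theta1 as the angle between the plane Oac (normal Oa x Oc) and the
# Oxy plane (normal (0, 0, 1)), following the reconstruction in the text.
def theta1_degrees(a, c):
    x1, y1, z1 = a
    x3, y3, z3 = c
    # Normal vector of the plane Oac, from the cross product Oa x Oc
    n = (y1 * z3 - z1 * y3, z1 * x3 - z3 * x1, x1 * y3 - y1 * x3)
    norm = math.sqrt(n[0] ** 2 + n[1] ** 2 + n[2] ** 2)
    # Cosine formula against the Oxy-plane normal (0, 0, 1)
    return math.degrees(math.acos(n[2] / norm))

# Example with two assumed vertex positions in the target coordinate system.
print(theta1_degrees((0.8, -0.3, 0.1), (0.8, -0.3, -0.2)))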
Similarly, in a possible design, the condition 2 may be that the included angle between the sightline E of the driver and the Oxy plane is less than θ2.
[formula for θ2, published as image PCTCN2021072867-appb-000004 in the original document]
where x1, y1, and z1 are the coordinates of the vertex a of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively; x2, y2, and z2 are coordinates of the vertex b of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively; and x3, y3, and z3 are the coordinates of the vertex c of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively. In addition, θ2 is greater than or equal to 0 degrees and is less than or equal to 180 degrees (i.e., 0≤θ2≤180°) .
The condition 3 may be that the included angle between the sightline E of the driver and the Oyz plane is greater than θ3.
[formula for θ3, published as image PCTCN2021072867-appb-000005 in the original document]
where x1, y1, and z1 are the coordinates of the vertex a of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively; x3, y3, and z3 are the coordinates of the vertex c of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively; and x4, y4, and z4 are coordinates of the vertex d of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively. In addition, θ3 is greater than or equal to 0 degrees and is less than or equal to 180 degrees (i.e., 0≤θ3≤180°) .
The condition 4 may be that the included angle between the sightline E of the driver and the Oyz plane is less than θ4.
[formula for θ4, published as image PCTCN2021072867-appb-000006 in the original document]
where x2, y2, and z2 are the coordinates of the vertex b of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively; x3, y3, and z3 are the coordinates of the vertex c of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively; and x4, y4, and z4 are the coordinates of the vertex d of the target region on the x-axis, y-axis, and z-axis in the target coordinate system, respectively. In addition, θ4 is greater than or equal to 0 degrees and is less than or equal to 180 degrees (i.e., 0≤θ4≤180°).
As for the apparatus and the method for controlling the vehicle-mounted device provided by the present application, the following is considered: in a case where the area of the vehicle-mounted display screen is excessively large, or the vehicle-mounted display screen is located far away from the driver, the driver needs to make a large movement to perform operations on the part of the vehicle-mounted display screen away from the driver. To solve this problem, in the present application, sightline recognition and gesture recognition are combined, and by detecting the sightline of the driver, the position range, on the vehicle-mounted display screen, of the control that the driver wants to operate is determined. After it is detected that the sightline of the driver falls into the target region of the vehicle-mounted display screen, the selection cursor is displayed over the target control in the target region. Then, by detecting the hand motion of the driver, the position of the selection cursor is moved, so that the selection cursor can be accurately positioned on the control that the driver wants to operate. In this way, after the selection cursor is moved over the control that the driver wants to operate, the vehicle-mounted device may be controlled by triggering the control. The solutions provided by the present application can not only avoid the problem of large control errors caused by the inability to accurately recognize the position corresponding to the sightline when only sightline recognition is used, but also avoid the problem of complicated operations caused by using only gesture recognition for control.
In another embodiment, the present application further provides a method for controlling the vehicle-mounted device. The method includes the following steps.
In S301, the sightline of the driver is detected.
In S302, in response to detecting that the sightline of the driver falls into the target region of the first display screen, the selection cursor is displayed over the target control which is a control displayed in the target region.
In S303, after the selection cursor is displayed over the target control, the hand motion of the driver is detected.
In S304, the selection cursor is moved according to the hand motion of the driver.
Optionally, after the selection cursor is displayed over the target control and the hand motion of the driver is detected, the method further includes:
S305, in response to detecting the first target gesture of the driver, triggering the function of the control currently corresponding to the selection cursor.
Optionally, the target region is a rectangular region.
The step of detecting the sightline of the driver includes: detecting the position of the eye of the driver and the direction of the sightline of the driver; calculating the coordinate values of the four vertices of the target region in the target coordinate system according to the position of the eye of the driver, wherein the target coordinate system includes the space rectangular coordinate system which takes the position of the eye of the driver as the origin and takes the vertical direction, the horizontal direction and the direction right ahead of the vehicle as the coordinate axes; and determining whether the sightline of the driver falls into the target region according to the coordinate values of the four vertices of the target region in the target coordinate system and the direction of the sightline of the driver.
Optionally, after the selection cursor is displayed over the target control and the hand motion of the driver is detected, the method further includes:
S306, in response to detecting the second target gesture of the driver, displaying the contents currently displayed in the target region on the second display screen.
Optionally, the second display screen includes a vehicle-mounted display screen disposed at a distance from the driver greater than the distance from the first display screen to the driver.
Based on the same inventive concept, for the specific implementation process, the technical problems solved, and the technical effects achieved by the method for controlling the vehicle-mounted device provided by the present application, reference may be made to the relevant description of the apparatus for controlling the vehicle-mounted device, and details will not be repeated here.
In a case where an integrated unit is applied, FIG. 12 is a schematic diagram showing another possible structure of the apparatus for controlling the vehicle-mounted device involved in the above embodiments. The apparatus 40 for controlling the vehicle-mounted device includes a processing module 401, a communication module  402, and a storage module 403. The processing module 401 is used for controlling and managing actions of the apparatus 40 for controlling the vehicle-mounted device. For example, the processing module 401 is used for supporting the apparatus 40 for controlling the vehicle-mounted device to perform the steps in the above embodiments. The communication module 402 is used for supporting communication between the apparatus 40 for controlling the vehicle-mounted device and other entities. The storage module 403 is used for storing program codes and data of the apparatus 40 for controlling the vehicle-mounted device.
The processing module 401 may be a processor or a controller, such as a central processing unit (CPU), a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic devices, transistor logic devices, hardware components, or any combination thereof. It may implement or execute various illustrative logical blocks, modules and circuits described in the disclosure of the present application. The processor may also be a combination that implements computing functions, such as a combination including one or more microprocessors, or a combination of a DSP and a microprocessor. The communication module 402 may be a transceiver, a transceiver circuit, a communication interface, or the like. The storage module 403 may be a memory.
In a case where the processing module 401 is the processor 501 shown in FIG. 13, the communication module 402 is the transceiver 502 in FIG. 13, and the storage module 403 is the memory 503 in FIG. 13, the apparatus for controlling the vehicle-mounted device involved in the embodiments of the present application may be the apparatus 50 for controlling a vehicle-mounted device as follows.
Referring to FIG. 13, the apparatus 50 for controlling the vehicle-mounted device includes a processor 501 and a memory 503. The apparatus 50 for controlling the vehicle-mounted device may further include a transceiver 502 and a bus 504.
The processor 501, the transceiver 502, and the memory 503 are connected to each other through the bus 504. The bus 504 may be a peripheral component interconnect (PCI) bus, an extended industry standard architecture (EISA) bus, or the like. The bus may be divided into an address bus, a data bus, a control bus, and the like. For ease of representation, in the figure, only one thick line is used to represent the bus, but this does not mean that there is only one bus or one type of bus.
The processor 501 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits for controlling the execution of programs of the solutions in the present application.
The memory 503 may be a read-only memory (ROM) or other types of static storage devices that may store static information and instructions, a random access memory (RAM) or other types of dynamic storage devices that may store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storages (including a compact disc, a laser disc, an optical disc, a digital versatile disc, a Blu-ray disc), a magnetic disk storage medium or other magnetic storage devices, or any other medium that can be used for carrying or storing desired program codes in a form of instructions or data structures and that can be accessed by a computer, which is not limited thereto. The memory may exist separately and be connected to the processor through the bus. The memory may also be integrated with the processor.
The memory 503 is used for storing application program codes for performing the solutions in the present application, and the solutions in the present application are controlled and performed by the processor 501. The transceiver 502 is used for receiving contents input by an external device, and the processor 501 is used for executing the application program codes stored in the memory 503, thereby implementing functions of various virtual units in the apparatus for controlling the vehicle-mounted device in the embodiments of the present application.
It will be understood that, in the embodiments of the present application, the magnitude of the serial numbers of the above processes does not imply an order of execution; the order of execution of the processes is determined by their functions and internal logic, and does not constitute any limitation on the implementation process of the embodiments of the present application.
A person of ordinary skill in the art will appreciate that, various illustrative units and algorithm steps described in the embodiments disclosed by the present disclosure can be implemented through electronic hardware or a combination of computer software and electronic hardware. Whether these functions are implemented through hardware or software depends on specific applications and design constraints imposed on the technical solutions. For each specific application, the described functions may be implemented in different ways by a skilled person, but such implementation will not be  considered to exceed the scope of the present application.
A person skilled in the art will clearly understand that, for convenience and conciseness of description, for the specific working processes of the system, apparatus, and units described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details will not be repeated herein.
In addition, the functional units in the embodiments of the present application may be integrated in one processing unit, or each unit may physically exist separately, or two or more of the units may be integrated in one unit.
In the above embodiments, the functions may be implemented in whole or in part through software, hardware, firmware, or any combination thereof. When implemented by using a software program, the functions may be implemented in a form of a computer program product in whole or in part. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are generated in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices. The computer instructions may be stored in a computer-readable storage medium, or may be transmitted from one computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from one website, computer, server or data center to another website, computer, server or data center through wired means (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless means (e.g., infrared, wireless, or microwave). The computer-readable storage medium may be any available medium that can be accessed by the computer, or a data storage device, such as a server or a data center, integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, or a magnetic tape), an optical medium (e.g., a DVD), a semiconductor medium (e.g., a solid state disk (SSD)), or the like.
The foregoing descriptions are merely specific implementations of the present application, but the protection scope of the present application is not limited thereto. Any changes or replacements that a person skilled in the art can easily conceive of within the technical scope disclosed by the present application shall be included in the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (16)

  1. An apparatus for controlling a vehicle-mounted device comprising a first display screen, the apparatus comprising:
    a sightline detection unit used to detect a sightline of a driver;
    a first display unit used to display a selection cursor over a target control in response to detecting that the sightline of the driver falls into a target region of the first display screen, the target control being a control displayed in the target region; and
    a gesture detection unit used to detect a hand motion of the driver after the selection cursor is displayed over the target control, wherein
    the first display unit is further used to move the selection cursor according to the hand motion of the driver.
  2. The apparatus according to claim 1, further comprising:
    a function triggering unit used to trigger a function of a control currently corresponding to the selection cursor in response to the gesture detection unit detecting a first target gesture of the driver.
  3. The apparatus according to claim 2, wherein after the function of the control is triggered,
    the first display unit is further used to control the first display screen to display a display interface corresponding to the control; and
    the function triggering unit is further used to trigger a function of a control displayed on the display interface, according to a gesture of the driver detected by the gesture detection unit.
  4. The apparatus according to claim 1, wherein the target region is a rectangular region; and
    the sightline detection unit is used to:
    detect a position of an eye of the driver and a direction of the sightline of the driver;
    calculate coordinate values of four vertices of the target region in a target coordinate  system according to the position of the eye of the driver, the target coordinate system including a space rectangular coordinate system which takes the position of the eye of the driver as an origin and takes a vertical direction, a horizontal direction and a direction right ahead of a vehicle as coordinate axes; and
    determine whether the sightline of the driver falls into the target region, according to the coordinate values of the four vertices of the target region in the target coordinate system and the direction of the sightline of the driver.
  5. The apparatus according to claim 1, wherein the vehicle-mounted device further comprises a second display screen; and
    the apparatus further comprises:
    a second display unit used to control the second display screen to display contents currently displayed in the target region in response to the gesture detection unit detecting a second target gesture of the driver.
  6. The apparatus according to claim 1, wherein the vehicle-mounted device further comprises a first camera used to capture facial images of the driver and a second camera used to capture hand images of the driver;
    the sightline detection unit is used to detect the sightline of the driver, according to the facial images of the driver; and
    the gesture detection unit is used to detect the hand motion of the driver, according to the hand images of the driver.
  7. A vehicle-mounted device comprising:
    the apparatus according to any one of claims 1 to 6;
    the first display screen;
    a first camera used to capture facial images of the driver; and
    a second camera used to capture hand images of the driver.
  8. The vehicle-mounted device according to claim 7, wherein the first camera and the second camera are a same camera or different cameras.
  9. The vehicle-mounted device according to claim 7, further comprising:
    a second display screen at a distance from the driver greater than a distance between the first display screen and the driver; and
    the second display screen is used to display contents displayed in the target region.
  10. A method for controlling a vehicle-mounted device comprising a first display screen, the method comprising:
    detecting a sightline of a driver;
    displaying a selection cursor over a target control in response to detecting that the sightline of the driver falls into a target region of the first display screen, the target control being a control displayed in the target region;
    detecting a hand motion of the driver after the selection cursor is displayed over the target control; and
    moving the selection cursor according to the hand motion of the driver.
  11. The method according to claim 10, wherein after displaying the selection cursor over the target control and detecting the hand motion of the driver, the method further comprises:
    triggering a function of a control currently corresponding to the selection cursor in response to detecting a first target gesture of the driver.
  12. The method according to claim 10, wherein the target region is a rectangular region; and
    detecting the sightline of the driver includes:
    detecting a position of an eye of the driver and a direction of the sightline of the driver;
    calculating coordinate values of four vertices of the target region in a target coordinate system according to the position of the eye of the driver, the target coordinate system including a space rectangular coordinate system which takes the position of the eye of the driver as an origin and takes a vertical direction, a horizontal direction and a  direction right ahead of a vehicle as coordinate axes; and
    determining whether the sightline of the driver falls into the target region, according to the coordinate values of the four vertices of the target region in the target coordinate system and the direction of the sightline of the driver.
  13. The method according to claim 10, wherein after displaying the selection cursor over the target control and detecting the hand motion of the driver, the method further comprises:
    controlling a second display screen to display contents currently displayed in the target region in response to detecting a second target gesture of the driver.
  14. The method according to claim 13, wherein
    the second display screen is disposed at a distance from the driver greater than a distance between the first display screen and the driver.
  15. An apparatus for controlling the vehicle-mounted device, the apparatus comprising a processor and a memory for storing instructions executable by the processor; wherein
    the processor is configured to execute the instructions to implement the method for controlling the vehicle-mounted device according to any one of claims 10 to 14.
  16. A computer-readable storage medium storing instructions that, when executed by a computer, cause the computer to perform the method for controlling the vehicle-mounted device according to any one of claims 10 to 14.
PCT/CN2021/072867 2020-01-21 2021-01-20 Apparatus and method for controlling a vehicle-mounted device, and vehicle-mounted device WO2021147897A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010072018.8A CN113220111A (en) 2020-01-21 2020-01-21 Vehicle-mounted equipment control device and method
CN202010072018.8 2020-01-21

Publications (1)

Publication Number Publication Date
WO2021147897A1 true WO2021147897A1 (en) 2021-07-29

Family

ID=74550388

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/072867 WO2021147897A1 (en) 2020-01-21 2021-01-20 Apparatus and method for controlling a vehicle-mounted device, and vehicle-mounted device

Country Status (2)

Country Link
CN (1) CN113220111A (en)
WO (1) WO2021147897A1 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114610432A (en) * 2022-03-17 2022-06-10 芜湖汽车前瞻技术研究院有限公司 Graphic display control method, device, equipment and storage medium for vehicle-mounted display screen

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2801009A1 (en) * 2012-01-04 2014-11-12 Tobii Technology AB System for gaze interaction
US20170329411A1 (en) * 2016-05-13 2017-11-16 Alexander van Laack Method for the Contactless Shifting of Visual Information
EP3361352A1 (en) * 2017-02-08 2018-08-15 Alpine Electronics, Inc. Graphical user interface system and method, particularly for use in a vehicle
EP3505384A1 (en) * 2017-12-29 2019-07-03 Seat, S.A. Method and associated device for controlling at least one parameter of a vehicle

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2801009A1 (en) * 2012-01-04 2014-11-12 Tobii Technology AB System for gaze interaction
US20170329411A1 (en) * 2016-05-13 2017-11-16 Alexander van Laack Method for the Contactless Shifting of Visual Information
EP3361352A1 (en) * 2017-02-08 2018-08-15 Alpine Electronics, Inc. Graphical user interface system and method, particularly for use in a vehicle
EP3505384A1 (en) * 2017-12-29 2019-07-03 Seat, S.A. Method and associated device for controlling at least one parameter of a vehicle

Also Published As

Publication number Publication date
CN113220111A (en) 2021-08-06

Similar Documents

Publication Publication Date Title
US8941587B2 (en) Method and device for gesture recognition diagnostics for device orientation
US8531410B2 (en) Finger occlusion avoidance on touch display devices
US9069386B2 (en) Gesture recognition device, method, program, and computer-readable medium upon which program is stored
JP6292673B2 (en) Portable terminal device, erroneous operation prevention method, and program
US20150268789A1 (en) Method for preventing accidentally triggering edge swipe gesture and gesture triggering
EP2365426B1 (en) Display device and screen display method
US9047001B2 (en) Information processing apparatus, information processing method, and program
JP5802247B2 (en) Information processing device
JP6004716B2 (en) Information processing apparatus, control method therefor, and computer program
US9389766B2 (en) Image display device, image display method, image display program, and computer-readable recording medium for providing zoom functionality
US20150286283A1 (en) Method, system, mobile terminal, and storage medium for processing sliding event
JP2014130450A (en) Information processor and control method therefor
US20110043453A1 (en) Finger occlusion avoidance on touch display devices
WO2021147897A1 (en) Apparatus and method for controlling a vehicle-mounted device, and vehicle-mounted device
CN112214156B (en) Touch screen magnifier calling method and device, electronic equipment and storage medium
US8952934B2 (en) Optical touch systems and methods for determining positions of objects using the same
US9665232B2 (en) Information-processing device, storage medium, information-processing method, and information-processing system for enlarging or reducing an image displayed on a display device
US10126856B2 (en) Information processing apparatus, control method for information processing apparatus, and storage medium
JP2012221358A (en) Electronic apparatus, handwriting input method and handwriting input program
US10564762B2 (en) Electronic apparatus and control method thereof
US8106881B2 (en) System, computer program product and method of manipulating windows on portable computing devices through motion
US20140078058A1 (en) Graph display control device, graph display control method and storage medium storing graph display control program
US10101905B1 (en) Proximity-based input device
US20160124602A1 (en) Electronic device and mouse simulation method
TW201428562A (en) A gesture recognition method of a touchpad

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21703137

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21703137

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 21703137

Country of ref document: EP

Kind code of ref document: A1