CN111630474A - Input control device, input device, and input control method - Google Patents

Input control device, input device, and input control method

Info

Publication number
CN111630474A
Authority
CN
China
Prior art keywords
sensor
detection
unit
detection information
input control
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201880087256.7A
Other languages
Chinese (zh)
Inventor
芹泽新仁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Mitsubishi Electric Corp
Original Assignee
Mitsubishi Electric Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corp filed Critical Mitsubishi Electric Corp
Publication of CN111630474A

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An input control device (1) according to the present invention includes: a 1 st receiving unit (11) that receives 1 st detection information from a 1 st sensor (21) that detects that an object is present in a predetermined area; a 2 nd receiving unit (12) that receives 2 nd detection information from a 2 nd sensor (22) that detects that an object approaches an operation unit (23); and a process determination unit (13) that determines whether or not to perform processing corresponding to the detection of the object by the 1 st sensor (21), based on the 1 st detection information received by the 1 st receiving unit (11) and the 2 nd detection information received by the 2 nd receiving unit (12).

Description

Input control device, input device, and input control method
Technical Field
The invention relates to an input control device, an input device, and an input control method.
Background
Techniques have been studied for preventing a device from executing processing that the user does not intend when the user performs an operation input to the device. For example, patent document 1 discloses a process control apparatus including: a detection unit that detects the presence of a driver's hand; a processing unit that performs predetermined processing corresponding to the detection; an acquisition unit that acquires operation information indicating that an operation unit of an external device has been operated; and a control unit that controls the processing unit so as to restrict execution of the processing corresponding to the detection of the hand when the operation information is acquired.
Documents of the prior art
Patent document
Patent document 1:
Japanese Patent Laid-Open No. 2014-106845
Disclosure of Invention
Technical problem to be solved by the invention
The process control device disclosed in patent document 1 restricts execution of the process corresponding to detection of the presence of a hand only after operation information indicating that the external device has been operated is acquired. Therefore, there is a problem in that the process control device cannot restrict execution of processing not intended by the user during the period after the presence of the hand is detected and before the operation information of the external device is acquired.
The present invention has been made to solve the above-described problems, and an object thereof is to provide an input control device capable of preventing execution of a process which a user does not intend.
Technical scheme for solving technical problem
An input control device according to the present invention includes: a 1 st receiving unit that receives 1 st detection information from a 1 st sensor that detects that an object is present in a predetermined area; a 2 nd receiving unit that receives 2 nd detection information from a 2 nd sensor that detects that an object approaches an operation unit; and a process determination unit that determines whether or not to perform a process corresponding to the detection of the object by the 1 st sensor, based on the 1 st detection information received by the 1 st receiving unit and the 2 nd detection information received by the 2 nd receiving unit.
Effects of the invention
According to the present invention, it is possible to prevent a situation where processing not intended by the user is executed.
Drawings
Fig. 1 is a block diagram showing a configuration of a display device to which an input control device according to embodiment 1 is applied.
Fig. 2A and 2B are diagrams showing an example of a hardware configuration of the input control device according to embodiment 1.
Fig. 3A is a front view showing a configuration example of a display device to which the input control device according to embodiment 1 is applied, as viewed from the front, and fig. 3B is a side sectional view showing a configuration example of the display device shown in fig. 3A, as viewed from the right side, in cross section X-X'.
Fig. 4 is a front view of a modification of the display device to which the input control device according to embodiment 1 is applied, as viewed from the front.
Fig. 5 is a flowchart for explaining the operation of the input control device according to embodiment 1.
Fig. 6 is a flowchart illustrating an operation of the input control device according to the modification of embodiment 1.
Fig. 7 is a side sectional view showing a modification of the display device shown in fig. 3A, which is seen from the right side, and which is taken along the X-X' section.
Fig. 8 is a side sectional view showing a modification of the display device shown in fig. 3A, which is seen from the right side, and which is taken along the X-X' section.
Fig. 9 is a block diagram showing a configuration of a display device to which the input control device according to embodiment 2 is applied.
Fig. 10 is a front view showing a configuration example of a display device to which the input control device according to embodiment 2 is applied, as viewed from the front.
Fig. 11 is a flowchart for explaining the operation of the input control device according to embodiment 2.
Detailed Description
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
Embodiment 1.
Hereinafter, the input control device 1 according to embodiment 1 will be described using an example in which it is applied to a display device 3 mounted on a vehicle.
Fig. 1 is a block diagram showing a configuration of a display device 3 to which an input control device 1 according to embodiment 1 is applied.
The input control device 1 includes a 1 st receiving unit 11, a 2 nd receiving unit 12, and a process determination unit 13.
The 1 st receiving unit 11 receives 1 st detection information from the 1 st sensor 21 that detects that an object is present in a predetermined area. The 1 st receiving unit 11 transmits the received 1 st detection information to the process determination unit 13. The 1 st sensor 21 and the process determination unit 13 will be described later.
The 2 nd receiving unit 12 receives 2 nd detection information from the 2 nd sensor 22 that detects that an object approaches the operation unit 23. The 2 nd receiving unit 12 transmits the received 2 nd detection information to the process determination unit 13. The operation unit 23 and the 2 nd sensor 22 will be described later.
The process determination unit 13 determines whether or not to perform a process corresponding to the detection of the object by the 1 st sensor 21 based on the 1 st detection information received by the 1 st receiving unit 11 and the 2 nd detection information received by the 2 nd receiving unit 12. The process determination unit 13 generates determination result information indicating the determination result, and transmits the generated determination result information to the output processing unit 31 described later.
The input device 2 includes the input control device 1, the 1 st sensor 21, the 2 nd sensor 22, and an operation unit 23.
The 1 st sensor 21 is a sensor for detecting that an object is present in a predetermined region (hereinafter referred to as "1 st detection range") in a space around the display device 3, and is a non-contact sensor such as an infrared sensor, an ultrasonic sensor, or a non-contact capacitance sensor. Information on the object detected by the 1 st sensor 21 is transmitted to the 1 st receiving unit 11 as 1 st detection information.
The 1 st detection information is information indicating the detection intensity of the object detected by the 1 st sensor 21. The detection intensity is the intensity of the received light when the 1 st sensor 21 is an infrared sensor, the intensity of the received sound wave when it is an ultrasonic sensor, and the amount of change in electrostatic capacitance when it is a non-contact capacitance sensor.
The 2 nd sensor 22 is a sensor for detecting that an object approaches the operation unit 23, and is a non-contact sensor such as an infrared sensor, an ultrasonic sensor, or a non-contact capacitance sensor. Information on the object detected by the 2 nd sensor 22 is transmitted to the 2 nd receiving unit 12 as the 2 nd detection information.
The 2 nd detection information is information indicating the detection intensity of the object detected by the 2 nd sensor 22. The detection intensity is the intensity of the received light when the 2 nd sensor 22 is an infrared sensor, the intensity of the received sound wave when it is an ultrasonic sensor, and the amount of change in electrostatic capacitance when it is a non-contact capacitance sensor.
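Although the patent does not specify any data format, the following minimal Python sketch (all names illustrative) shows one way the detection information described above might be represented; the later sketches in this description reuse it.

```python
from dataclasses import dataclass
from enum import Enum, auto


class SensorKind(Enum):
    INFRARED = auto()    # detection intensity = intensity of received light
    ULTRASONIC = auto()  # detection intensity = intensity of received sound wave
    CAPACITIVE = auto()  # detection intensity = change in electrostatic capacitance


@dataclass
class DetectionInfo:
    """One report sent from the 1 st or 2 nd sensor to its receiving unit."""
    sensor_kind: SensorKind
    intensity: float   # detection intensity; its physical unit depends on sensor_kind
    timestamp: float   # time of the detection, e.g. seconds since start-up
```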
The operation unit 23 is a touch sensor, a button, or the like with which a user, such as a driver or a passenger, performs an input operation. The operation unit 23 is not limited to a touch sensor, a button, or the like, and may be, for example, an operation unit that accepts non-contact operation input from the user, such as a non-contact capacitance sensor. When the operation unit 23 is operated, the display device 3 described later executes processing corresponding to the operation.
The display device 3 includes an input device 2, an output processing unit 31, and a display unit 32.
The output processing unit 31 generates output information to be output by the display unit 32 based on the determination result information transmitted from the process determination unit 13, and transmits the generated output information to the display unit 32.
The display unit 32 is a display output device such as a display. The display unit 32 displays the output information transmitted from the output processing unit 31.
For example, when the display device 3 functions as a navigation device, the display device 3 performs route guidance to a destination by using a GPS (Global Positioning System) or the like, and displays a map and an image of the route guidance on the display unit 32. Here, the processing corresponding to the detection of the object by the 1 st sensor 21 refers to processing for displaying an image such as a pop-up menu superimposed on an image of a map and route guidance displayed on the display unit 32, for example, when the 1 st sensor 21 detects an object such as a hand of a user.
Fig. 2 is a diagram showing an example of the hardware configuration of the input control device 1 according to embodiment 1.
In embodiment 1, the functions of the 1 st receiving unit 11, the 2 nd receiving unit 12, and the process determination unit 13 are realized by a processing circuit 201. That is, the input control device 1 includes the processing circuit 201, and the processing circuit 201 determines whether or not to perform processing corresponding to the detection of the object by the 1 st sensor 21 based on the 1 st detection information and the 2 nd detection information.
The processing circuit 201 may be dedicated hardware as shown in fig. 2A, or may be a CPU (Central Processing Unit) 206 that executes a program stored in the memory 205 as shown in fig. 2B.
In the case where the processing circuit 201 is dedicated hardware, the processing circuit 201 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
When the processing circuit 201 is the CPU 206, the functions of the 1 st receiving unit 11, the 2 nd receiving unit 12, and the process determination unit 13 are realized by software, firmware, or a combination of software and firmware. That is, the 1 st receiving unit 11, the 2 nd receiving unit 12, and the process determination unit 13 are realized by a processing circuit such as the CPU 206 or a system LSI (Large-Scale Integration) that executes a program stored in the HDD (Hard Disk Drive) 202, the memory 205, or the like. The program stored in the HDD 202, the memory 205, or the like can be said to cause a computer to execute the procedures or methods of the 1 st receiving unit 11, the 2 nd receiving unit 12, and the process determination unit 13. Here, the memory 205 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), or to a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, or a DVD (Digital Versatile Disc).
The functions of the 1 st receiving unit 11, the 2 nd receiving unit 12, and the process determination unit 13 may be partly realized by dedicated hardware and partly realized by software or firmware. For example, the 1 st receiving unit 11 and the 2 nd receiving unit 12 may be realized by the processing circuit 201 as dedicated hardware, and the process determination unit 13 may be realized by a processing circuit reading and executing a program stored in the memory 205.
The input control device 1 includes an input interface device 203 and an output interface device 204 for communicating with the 1 st sensor 21, the 2 nd sensor 22, the operation unit 23, and the output processing unit 31.
In the above description, the hardware configuration of the input control device 1 uses the HDD 202 as shown in fig. 2B, but an SSD (Solid State Drive) may be used instead of the HDD 202.
Fig. 3A is a front view showing a configuration example of the display device 3 to which the input control device 1 according to embodiment 1 is applied, as viewed from the front. Fig. 3B is a side sectional view showing a configuration example of the display device 3 shown in fig. 3A when viewed from the right side, in cross section X-X'.
As shown in fig. 3A, the operation unit 23, the display unit 32, and the external panel 53 are disposed on the display device 3.
As shown in fig. 3B, the 1 st sensor 21 and the 2 nd sensor 22 are disposed inside the display device 3. The 1 st sensor 21 is a sensor that, like an infrared sensor, transmits a signal that is reflected by an object and receives the reflected signal; it includes a 1 st transmitting unit 21a that transmits the signal and a 1 st receiving unit 21b that receives the reflected signal. Likewise, the 2 nd sensor 22 includes a 2 nd transmitting unit 22a that transmits a signal and a 2 nd receiving unit 22b that receives the reflected signal. In embodiment 1, a case where both the 1 st sensor 21 and the 2 nd sensor 22 are infrared sensors will be described as an example. In this case, the 1 st transmitting unit 21a and the 2 nd transmitting unit 22a are both infrared light-emitting elements, and the 1 st receiving unit 21b and the 2 nd receiving unit 22b are both infrared light-receiving elements. The 1 st transmitting unit 21a and the 2 nd transmitting unit 22a irradiate infrared light from the inside of the display device 3 toward the front surface, and the 1 st receiving unit 21b and the 2 nd receiving unit 22b receive the reflected light of the infrared light emitted from the 1 st transmitting unit 21a and the 2 nd transmitting unit 22a, respectively.
As shown in fig. 3A and 3B, the 2 nd sensor 22 is disposed on the back surface side of the operation surface of the operation unit 23. The operation unit 23 is formed of an infrared-transmitting member that transmits infrared light. In the external panel 53, the portion located in the direction in which the 1 st sensor 21 irradiates infrared light is likewise formed of an infrared-transmitting member.
In embodiment 1, the 1 st sensor 21 and the 2 nd sensor 22 are each an infrared sensor, but the present invention is not limited to this; any non-contact sensor, such as an ultrasonic sensor or a non-contact capacitance sensor, may be used instead.
In embodiment 1, the 1 st sensor 21 and the 2 nd sensor 22 are both infrared sensors and are the same type of sensor, but the present invention is not limited thereto, and when the 1 st sensor 21 and the 2 nd sensor 22 are each a non-contact type sensor, they may be different types of sensors. In this case, the operation unit 23 and the external panel 53 are not necessarily formed of the infrared-transmitting member, and may be formed of a conductive member, a resin member, or the like according to the type of the sensor.
As shown in fig. 3A and 3B, the 1 st detection range 51 is set as a range in which the 1 st sensor 21 detects an object. The 1 st detection range 51 is a range for detecting whether or not an object is present in the periphery of the display device 3, and is set in advance in a space in the periphery of the display device 3. In addition, the 2 nd detection range 52 is set as a range in which the 2 nd sensor 22 detects an object. The 2 nd detection range 52 is a range for detecting whether or not an object is approaching the operation unit 23, and is set in advance in a space around the operation unit 23.
In embodiment 1, the user can operate the display device 3 without touching it, by a motion such as waving a hand around the display device 3, that is, by a so-called gesture operation. The 1 st sensor 21 is used for the input of the gesture operation. For example, when the user starts a gesture operation, the display device 3 superimposes, as a pop-up image, guidance on the operation input available by the next gesture operation on the map and route guidance image displayed on the display unit 32. When the user performs a gesture operation, the user's hand, wrist, or the like may fall within both the 1 st detection range 51 and the 2 nd detection range 52. Similarly, when the user operates the operation unit 23, the user's hand, wrist, or the like may be present in both the 1 st detection range 51 and the 2 nd detection range 52.
For example, when the user's hand, wrist, or the like approaches the display device 3 and enters the 1 st detection range 51, the 1 st sensor 21 detects that an object is present within the range and transmits the detection information to the 1 st receiving unit 11 as the 1 st detection information. Similarly, when the user's hand, wrist, or the like enters the 2 nd detection range 52, the 2 nd sensor 22 detects that an object approaches the operation unit 23 and transmits the detection information to the 2 nd receiving unit 12 as the 2 nd detection information.
Fig. 4 is a front view of a modification of the display device 3 to which the input control device 1 according to embodiment 1 is applied, as viewed from the front.
Fig. 3A and 3B show a case where the 1 st detection range 51 and the 2 nd detection range 52 overlap, whereas fig. 4 shows a case where they do not overlap. As shown in fig. 4, even when the 1 st detection range 51 and the 2 nd detection range 52 do not overlap, the user's wrist may enter the 1 st detection range 51 as, for example, the hand approaches the operation unit 23, behind which the 2 nd sensor 22 is disposed. Therefore, embodiment 1 places no limitation on whether the 1 st detection range 51 and the 2 nd detection range 52 overlap.
The operation will be described.
In embodiment 1, the 1 st sensor 21 transmits the 1 st detection information whenever an object is present in the range of the 1 st detection range 51. In addition, the 2 nd sensor 22 transmits the 2 nd detection information whenever an object is present in the range of the 2 nd detection range 52.
Fig. 5 is a flowchart for explaining the operation of the input control device 1 according to embodiment 1.
The input control device 1 repeatedly executes the processing shown in the flowchart, and while the input control device 1 operates, the display device 3 generates and displays output information based on the determination result information from the input control device 1.
When the operation of the input control device 1 is started, the 1 st receiving unit 11 receives the 1 st detection information from the 1 st sensor 21 (step ST1).
The process determination unit 13 determines whether or not an object is present within the 1 st detection range 51, that is, in the predetermined area in the space around the display device 3, based on the received 1 st detection information (step ST2).
In step ST2, when the process determination unit 13 determines that no object is present in the predetermined area in the space around the display device 3 (no in step ST2), the input control device 1 ends the operation.
In step ST2, when the process determination unit 13 determines that an object is present in the predetermined area in the space around the display device 3 (yes in step ST2), the 2 nd receiving unit 12 receives the 2 nd detection information from the 2 nd sensor 22 (step ST3). Next, the process determination unit 13 determines whether or not the object is approaching the operation unit 23 based on the received 2 nd detection information (step ST4).
In step ST4, when the process determination unit 13 determines that the object is approaching the operation unit 23 (yes in step ST4), the input control device 1 ends the operation.
In step ST4, when the process determination unit 13 determines that the object is not approaching the operation unit 23 (no in step ST4), the process determination unit 13 generates determination result information instructing the output processing unit 31 to perform the processing corresponding to the detection of the object by the 1 st sensor 21 (step ST11). Thereafter, the process determination unit 13 transmits the determination result information to the output processing unit 31 (step ST12), and the input control device 1 ends the operation.
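As an illustration only, the flow of fig. 5 (steps ST1 to ST12) could be sketched as follows in Python; the reader functions, thresholds, and message format are assumptions, not part of the patent.

```python
PRESENCE_THRESHOLD = 0.5   # assumed: intensity meaning "object in the 1 st detection range"
PROXIMITY_THRESHOLD = 0.5  # assumed: intensity meaning "object near the operation unit"


def determine(read_sensor1, read_sensor2, send_to_output):
    """One pass of the fig. 5 flow; the input control device calls this repeatedly.

    read_sensor1 / read_sensor2 return a DetectionInfo from the 1 st / 2 nd sensor.
    """
    info1 = read_sensor1()                      # ST1: receive 1 st detection information
    if info1.intensity < PRESENCE_THRESHOLD:    # ST2: object in the predetermined area?
        return                                  # no -> end the operation
    info2 = read_sensor2()                      # ST3: receive 2 nd detection information
    if info2.intensity >= PROXIMITY_THRESHOLD:  # ST4: object approaching the operation unit?
        return                                  # yes -> do not trigger gesture processing
    # ST11/ST12: instruct the output processing unit to perform the processing
    # corresponding to the detection of the object by the 1 st sensor.
    send_to_output({"action": "object_detected_by_sensor1"})
```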
As described above, the input control device 1 includes: the 1 st receiving unit 11, which receives 1 st detection information from the 1 st sensor 21 that detects that an object is present in a predetermined area; the 2 nd receiving unit 12, which receives 2 nd detection information from the 2 nd sensor 22 that detects that an object approaches the operation unit 23; and the process determination unit 13, which determines whether or not to perform processing corresponding to the detection of the object by the 1 st sensor 21 based on the 1 st detection information received by the 1 st receiving unit 11 and the 2 nd detection information received by the 2 nd receiving unit 12. With such a configuration, the input control device 1 can make determinations that prevent execution of processing not intended by the user.
In embodiment 1, the process determination unit 13 determines whether or not to perform the process corresponding to the detection of the object by the 1 st sensor 21 based on whether or not the object is present in a predetermined region in the space around the display device 3 and whether or not the object is close to the operation unit 23.
Next, a modification of embodiment 1 will be described.
In the modification, in addition to the processing of embodiment 1 described so far, processing for comparing the detection intensity of the object by the 1 st sensor 21 with the detection intensity of the object by the 2 nd sensor 22 is added.
Fig. 6 is a flowchart for explaining the operation of the input control device 1 according to the modification of embodiment 1. The difference between fig. 5 and fig. 6 is that the process of step ST5 in fig. 6 is added in the case where the determination of step ST4 in fig. 5 is yes. Here, only the processing of the portion different from fig. 5 will be described with reference to fig. 6.
In step ST4, when the process determination unit 13 determines that the object is approaching the operation unit 23 (yes in step ST4), the process determination unit 13 compares the detection intensity of the object by the 1 st sensor 21 with the detection intensity of the object by the 2 nd sensor 22, and determines whether or not the detection intensity of the 2 nd sensor 22 is stronger than that of the 1 st sensor 21 (step ST5).
In step ST5, when the process determination unit 13 determines that the detection intensity of the 2 nd sensor 22 is stronger than that of the 1 st sensor 21 (yes in step ST5), the process determination unit 13 determines that the user is about to operate the operation unit 23, and the input control device 1 ends the operation.
In step ST5, when the process determination unit 13 determines that the detection intensity of the 2 nd sensor 22 is not stronger than that of the 1 st sensor 21 (no in step ST5), the process determination unit 13 executes the processing from step ST11 described above.
Thus, by comparing the detection intensity of the object by the 1 st sensor 21 with the detection intensity of the object by the 2 nd sensor 22, the process determination unit 13 can determine more accurately whether or not to perform the processing corresponding to the detection of the object by the 1 st sensor 21.
In addition, in a different modification of embodiment 1, processing is added to the processing of embodiment 1 described so far, in which the process determination unit 13 compares the timing at which the object is detected by the 1 st sensor 21 with the timing at which it is detected by the 2 nd sensor 22.
This modification is implemented by changing the determination process of step ST5 so that the process determination unit 13 determines whether or not the timing at which the object is detected by the 2 nd sensor 22 is earlier than the timing at which the object is detected by the 1 st sensor 21. In this modification, the detection timings of the object by the 1 st sensor 21 and the 2 nd sensor 22 are compared, but the lengths of the detection times of the object by the 1 st sensor 21 and the 2 nd sensor 22 may be compared instead.
In addition, in another modification of embodiment 1, processing is added in which the process determination unit 13 compares the period during which the detection intensity of the object by the 1 st sensor 21 is at or above a threshold value with the period during which the detection intensity of the object by the 2 nd sensor 22 is at or above a threshold value.
This modification is implemented by changing the determination process of step ST5 so that the process determination unit 13 determines whether or not the period during which the detection intensity of the object by the 2 nd sensor 22 is at or above the threshold value is longer than the period during which the detection intensity of the object by the 1 st sensor 21 is at or above the threshold value.
In addition, in yet another modification of embodiment 1, processing is added in which the process determination unit 13 compares the detection intensity of the object by the 2 nd sensor 22 with a predetermined threshold value.
This modification is implemented by changing the determination process of step ST5 so that the process determination unit 13 determines whether or not the detection intensity of the object by the 2 nd sensor 22 is greater than a preset threshold value.
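The four step-ST5 variants above differ only in the predicate evaluated when both sensors report detections. Continuing the earlier sketch (thresholds and data shapes are assumptions, not taken from the patent), they might look like this; a True result means the user is judged to be about to operate the operation unit, so the operation ends:

```python
INTENSITY_THRESHOLD = 0.5  # assumed threshold for the duration-based variant
ABSOLUTE_THRESHOLD = 0.7   # assumed preset threshold for the absolute variant


def st5_intensity(info1, info2):
    # Variant 1: is the detection intensity of the 2 nd sensor stronger?
    return info2.intensity > info1.intensity


def st5_timing(info1, info2):
    # Variant 2: was the object detected by the 2 nd sensor earlier?
    return info2.timestamp < info1.timestamp


def st5_duration(history1, history2):
    # Variant 3: has the 2 nd sensor's intensity been at or above the threshold
    # for a longer period? history1/history2 are recent DetectionInfo samples.
    above1 = sum(1 for i in history1 if i.intensity >= INTENSITY_THRESHOLD)
    above2 = sum(1 for i in history2 if i.intensity >= INTENSITY_THRESHOLD)
    return above2 > above1


def st5_absolute(_info1, info2):
    # Variant 4: is the 2 nd sensor's detection intensity above a preset threshold?
    return info2.intensity > ABSOLUTE_THRESHOLD
```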
Fig. 7 is a side sectional view showing a modified example of the structure of the display device 3 shown in fig. 3A, which is seen from the right side surface, in cross section X-X'.
The display device 3 shown in fig. 7 differs from the display device 3 shown in fig. 3B in the structure of the 1 st sensor 21 and the 2 nd sensor 22. In fig. 7, the 1 st receiving unit 21b of the 1 st sensor 21 and the 2 nd receiving unit 22b of the 2 nd sensor 22 are realized by a single common receiving unit 24. When both the 1 st sensor 21 and the 2 nd sensor 22 are infrared sensors, the receiving unit 24 is a light-receiving element. In fig. 7, both the reflected light of the infrared light irradiated from the 1 st transmitting unit 21a of the 1 st sensor 21 and the reflected light of the infrared light irradiated from the 2 nd transmitting unit 22a of the 2 nd sensor 22 are received by the receiving unit 24. With this configuration, the component cost can be reduced.
Fig. 8 is a side sectional view showing a modification example different from fig. 7 in the configuration of the display device 3 shown in fig. 3A, which is viewed from the right side, in cross section X-X'.
The display device 3 shown in fig. 8 differs from the display device 3 shown in fig. 7 in the following configuration. In fig. 8, the display device 3 further includes: a design light-emitting unit 33 that emits illumination light from the inside of the display device 3 toward the front side; and a design illumination unit 34 that emits the illumination light from the design light-emitting unit 33 to the outside of the display device 3 as design illumination. The design illumination unit 34 is disposed in the operation unit 23.
With this configuration, when the 2 nd transmitting unit 22a of the 2 nd sensor 22 is disposed on the back surface side of the operation unit 23, the infrared light emitted from the 2 nd transmitting unit 22a and the infrared light reflected by the object pass through the design illumination unit 34 disposed in the operation unit 23. This eliminates the need to form the operation unit 23 itself of an infrared-transmitting member. In addition, even with the 2 nd transmitting unit 22a of the 2 nd sensor 22 disposed on the back surface side of the operation unit 23, the light emission of the 2 nd transmitting unit 22a is made inconspicuous by the irradiation light of the design illumination, so the design is not impaired. Therefore, the degree of freedom in the arrangement of the 2 nd transmitting unit 22a can be improved.
In fig. 8, a light guide 35 is also added; the light guide 35 guides the reflected infrared light incident on the operation unit 23 in a predetermined direction. The light guide 35 is disposed so as to connect the back surface of the operation unit 23 to the receiving unit 24, and guides at least a part of the reflected infrared light incident from the front surface of the operation unit 23 to the receiving unit 24.
With this configuration, the degree of freedom in the arrangement of the receiving unit 24 can be improved.
Further, the light guide 35 may guide at least a part of the infrared light emitted from the 2 nd transmitting unit 22a to the front surface of the operation unit 23. A part of the operation unit 23 may also serve as the light guide 35.
With this configuration, the degree of freedom in the arrangement of the 2 nd transmission unit 22a can be improved.
Embodiment 2.
In embodiment 1, the input device 2 has a single 1 st sensor 21.
In embodiment 2, an input device 2a having a plurality of 1 st sensors will be described.
Fig. 9 is a block diagram showing a configuration of a display device 3a to which the input control device 1 according to embodiment 2 is applied.
While embodiment 1 shows the input device 2 having only one 1 st sensor 21, embodiment 2 shows the input device 2a having a plurality of 1 st sensors 211 and 212, as shown in fig. 9. While the input device 2 has only one 2 nd sensor 22 as shown in fig. 1 in embodiment 1, the input device 2a has a plurality of 2 nd sensors 221 and 222 as shown in fig. 9 in embodiment 2. While the 1 st receiving unit 11 receives the 1 st detection information transmitted from one 1 st sensor 21 as shown in fig. 1 in embodiment 1, the 1 st receiving unit 11 receives all the 1 st detection information transmitted from each of the plurality of 1 st sensors 211 and 212 as shown in fig. 9 in embodiment 2. While embodiment 1 shows that the 2 nd receiving unit 12 receives the 2 nd detection information transmitted from one 2 nd sensor 22, embodiment 2 shows that the 2 nd receiving unit 12 receives all the 2 nd detection information transmitted from each of the plurality of 2 nd sensors 221 and 222, as shown in fig. 9.
The same reference numerals are given to the other structures that are the same as those in embodiment 1, and redundant description is omitted.
Fig. 10 is a front view showing a configuration example of a display device 3a to which the input control device 1 according to embodiment 2 is applied, as viewed from the front.
As shown in fig. 10, a plurality of operation units 231 and 232, the display unit 32, and the external panel 53 are arranged on the display device 3a.
The display device 3a has a plurality of 1 st sensors 211 and 212 and a plurality of 2 nd sensors 221 and 222 disposed inside it. All the sensors irradiate infrared light from the inside of the display device 3a toward the front surface, and each sensor detects the reflected light of the infrared light it irradiated. The 2 nd sensor 221 is disposed on the back surface side of the operation surface of the operation unit 231, and the 2 nd sensor 222 is disposed on the back surface side of the operation surface of the operation unit 232.
As shown in fig. 10, a 1 st detection range 511 is set as the range in which the 1 st sensor 211 detects an object. The 1 st detection range 511 is a range for detecting whether or not an object is present around the display device 3a, and is set in advance in the space around the display device 3a. Similarly, a 1 st detection range 512 is set as the range in which the 1 st sensor 212 detects an object.
The 2 nd detection range 521 is set as a range in which the 2 nd sensor 221 detects an object. The 2 nd detection range 521 is a range for detecting whether or not an object is in proximity to the operation unit 231, and is set in advance in a space around the operation unit 231. Similarly, the 2 nd detection range 522 is set as a range in which the 2 nd sensor 222 detects an object.
In embodiment 2, a user such as a driver or a passenger can operate the display device 3a without touching it, by a motion such as waving a hand around the display device 3a, that is, by a so-called gesture operation. The 1 st sensors 211 and 212 are used for the input of the gesture operation. For example, when the user's hand moves from the left side to the right side in the space around the display device 3a within a predetermined time, the 1 st sensor 211 detects that an object is present in the 1 st detection range 511, and then the 1 st sensor 212 detects that an object is present in the 1 st detection range 512. When a predetermined detection pattern is established in such a detection order, the input control device 1 determines that a gesture operation has been performed.
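As a purely illustrative sketch of how the detection order could be mapped to a movement direction (the time window, sensor layout, and function name are assumptions; the patent relies on a known technique for this, as noted below):

```python
GESTURE_WINDOW = 0.8  # assumed maximum time in seconds between the two detections


def gesture_direction(t_sensor211, t_sensor212):
    """Infer a swipe direction from when each 1 st sensor first detected the object.

    t_sensor211 / t_sensor212: first-detection timestamps, or None if no detection.
    Sensor 211 is assumed to sit on the left, sensor 212 on the right.
    """
    if t_sensor211 is None or t_sensor212 is None:
        return None
    if abs(t_sensor212 - t_sensor211) > GESTURE_WINDOW:
        return None  # detections too far apart to belong to one gesture
    return "left_to_right" if t_sensor211 < t_sensor212 else "right_to_left"
```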
The operation will be described.
Fig. 11 is a flowchart for explaining the operation of the input control device 1 according to embodiment 2.
In embodiment 2, the 1 st sensors 211 and 212 transmit the 1 st detection information whenever an object is present in the range of the 1 st detection ranges 511 and 512, respectively. Similarly, the 2 nd sensors 221 and 222 transmit the 2 nd detection information whenever an object is present in the range of the 2 nd detection ranges 521 and 522.
The input control device 1 repeatedly executes this flow, and while the input control device 1 operates, the display device 3a generates and displays output information based on the determination result information from the input control device 1.
In embodiment 2, a known technique is used in which, when the input control device 1 determines whether or not a pattern corresponding to a gesture operation is established, the direction in which the object moves is determined from the order in which the plurality of sensors detect the object. Therefore, the method of determining whether or not a pattern corresponding to a gesture operation is established is not described here.
When the operation of the input control device 1 is started, the 1 st receiving unit 11 receives the 1 st detection information from each of the 1 st sensors 211 and 212 (step ST101).
The process determination unit 13 determines whether or not an object is present in at least one of the 1 st detection range 511 and the 1 st detection range 512, that is, in the predetermined area in the space around the display device 3a, based on the plurality of received pieces of 1 st detection information (step ST102).
In step ST102, when the process determination unit 13 determines that no object is present in the predetermined area in the space around the display device 3a (no in step ST102), the input control device 1 ends the operation.
In step ST102, when the process determination unit 13 determines that an object is present in the predetermined area in the space around the display device 3a (yes in step ST102), the 2 nd receiving unit 12 receives, from each of the 2 nd sensors 221 and 222, the 2 nd detection information related to detection of an object approaching the operation units 231 and 232 (step ST103). Next, the process determination unit 13 determines whether or not the object is approaching at least one of the operation unit 231 and the operation unit 232 based on the received 2 nd detection information (step ST104).
In step ST104, when the process determination unit 13 determines that the object is approaching at least one of the operation unit 231 and the operation unit 232 (yes in step ST104), the process determination unit 13 determines whether or not a predetermined detection pattern is established in the detection order of the object in the 1 st detection ranges 511 and 512 (step ST105).
In step ST105, when the process determination unit 13 determines that the predetermined detection pattern is established in the detection order of the object in the 1 st detection ranges 511 and 512 (yes in step ST105), the process determination unit 13 generates determination result information instructing the output processing unit 31 to perform the output processing corresponding to the predetermined detection pattern (step ST111). Thereafter, the process determination unit 13 transmits the determination result information to the output processing unit 31 (step ST112), and the input control device 1 ends the operation.
In step ST105, when the process determination unit 13 determines that the predetermined detection pattern is not established in the detection order of the object in the 1 st detection ranges 511 and 512 (no in step ST105), the input control device 1 ends the operation.
In step ST104, when the process determination unit 13 determines that the object is approaching neither the operation unit 231 nor the operation unit 232 (no in step ST104), the process determination unit 13 determines whether or not a predetermined detection pattern is established in the detection order of the object in the 1 st detection ranges 511 and 512 (step ST121).
In step ST121, when the process determination unit 13 determines that the predetermined detection pattern is established in the detection order of the object in the 1 st detection ranges 511 and 512 (yes in step ST121), the process determination unit 13 generates determination result information instructing the output processing unit 31 to perform the output processing corresponding to the predetermined detection pattern (step ST111). Thereafter, the process determination unit 13 transmits the determination result information to the output processing unit 31 (step ST112), and the input control device 1 ends the operation.
In step ST121, when the process determination unit 13 determines that the predetermined detection pattern is not established in the detection order of the object in the 1 st detection ranges 511 and 512 (no in step ST121), the process determination unit 13 generates determination result information instructing the output processing unit 31 to perform the output processing corresponding to the detection of the object by the 1 st sensors 211 and 212 (step ST122). Thereafter, the process determination unit 13 transmits the determination result information to the output processing unit 31 (step ST123), and the input control device 1 ends the operation.
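Putting the fig. 11 steps together, a hedged sketch of the embodiment 2 flow, reusing the assumed thresholds from the earlier embodiment 1 sketch (the pattern_established callback stands in for the known pattern-matching technique mentioned above):

```python
def determine_multi(infos1, infos2, pattern_established, send_to_output):
    """One pass of the fig. 11 flow.

    infos1 / infos2: latest DetectionInfo from each 1 st / 2 nd sensor.
    pattern_established() returns True if the detection order of the object in
    the 1 st detection ranges matches a predetermined detection pattern.
    """
    # ST101/ST102: object present in at least one 1 st detection range?
    if not any(i.intensity >= PRESENCE_THRESHOLD for i in infos1):
        return                                             # no -> end the operation
    # ST103/ST104: object approaching at least one operation unit?
    if any(i.intensity >= PROXIMITY_THRESHOLD for i in infos2):
        if pattern_established():                          # ST105
            send_to_output({"action": "gesture_pattern"})  # ST111/ST112
        return                                             # otherwise end the operation
    if pattern_established():                              # ST121
        send_to_output({"action": "gesture_pattern"})      # ST111/ST112
    else:
        send_to_output({"action": "object_detected"})      # ST122/ST123
```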
In embodiment 2, an example in which two 1 st sensors and two 2 nd sensors are applied is shown, but the number of sensors is not limited to this, and at least the number of 1 st sensors may be plural.
In embodiment 2, the process determination unit 13 determines whether or not a predetermined detection pattern is established in the detection order of the object in the 1 st detection ranges 511 and 512, and generates determination result information for instructing the output processing unit 31 based on that determination.
Next, a modification of embodiment 2 will be described.
In the modification, the process determination unit 13 determines whether or not a predetermined detection pattern is established in the detection order of the object in the 2 nd detection ranges 521 and 522.
The modification is, for example, as follows.
Before performing the process of step ST105 shown in fig. 11, the process determination unit 13 determines whether or not a predetermined detection pattern is established in the detection order of the object in the 2 nd detection ranges 521 and 522. When the process determination unit 13 determines that the predetermined detection pattern is established in the detection order of the object in the 2 nd detection ranges 521 and 522, the input control device 1 ends the operation. When the process determination unit 13 determines that the predetermined detection pattern is not established, the process of step ST105 is performed, as in the sketch below.
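A sketch of this modification, again reusing the assumed helpers and thresholds from the earlier sketches (all names illustrative):

```python
def determine_multi_with_precheck(infos1, infos2, pattern1_established,
                                  pattern2_established, send_to_output):
    """determine_multi with the modification's extra check before step ST105.

    pattern2_established() returns True if the detection order of the object in
    the 2 nd detection ranges 521 and 522 matches a predetermined pattern.
    """
    if not any(i.intensity >= PRESENCE_THRESHOLD for i in infos1):   # ST102
        return
    if any(i.intensity >= PROXIMITY_THRESHOLD for i in infos2):      # ST104
        if pattern2_established():
            return  # movement was aimed at the operation units -> end the operation
        if pattern1_established():                                   # ST105
            send_to_output({"action": "gesture_pattern"})            # ST111/ST112
        return
    if pattern1_established():                                       # ST121
        send_to_output({"action": "gesture_pattern"})
    else:
        send_to_output({"action": "object_detected"})                # ST122/ST123
```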
In addition, in the modification, an example in which two 1 st sensors and two 2 nd sensors are applied respectively is shown, but the number of sensors is not limited thereto, and at least the number of 2 nd sensors may be plural.
With the above configuration, the input control device 1 can not only make determinations that prevent execution of processing not intended by the user, but can also determine more accurately that processing intended by the user should be executed.
In embodiments 1 and 2, the 2 nd sensor itself may be configured to receive an operation by the user, so that the 2 nd sensor also functions as the operation unit. For example, a non-contact capacitance sensor may be disposed on the front surface of the operation unit, and both the detection of an approaching object and the detection of an operation input may be performed by that sensor.
In addition, although embodiments 1 and 2 show examples in which the operation unit is provided in the display device and the display device executes processing corresponding to an operation, the operation unit need not be provided in the display device. That is, in embodiments 1 and 2, the operation unit may be provided in an external device separate from the display device, and when the operation unit is operated, the external device may execute the processing corresponding to the operation. In this case, since the 2 nd sensor detects that an object approaches an operation unit provided in an external device separate from the display device, the 2 nd sensor need not be provided in the input device and may be provided outside it.
In embodiments 1 and 2, when the 1 st sensor for detecting an operation input by a gesture operation is mounted on a so-called remote controller for operating a display device located at a remote location, for example, the 1 st sensor and the 1 st receiving unit may be connected via a network.
In embodiments 1 and 2, depending on the arrangement of the operation unit and the 1 st sensor, the hand of the driver or the like may, for example, pass through the 1 st detection range before reaching the 2 nd detection range when the user operates the operation unit. If such a situation is expected from the arrangement, an appropriate period may be provided between the time when the process determination unit determines that an object is present in the predetermined area in the space around the display device and the time when the 2 nd detection information is received from the 2 nd sensor.
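One illustrative way to provide such a period, building on the earlier embodiment 1 sketch (the length of the period is an assumption and would depend on the actual arrangement):

```python
import time

GRACE_PERIOD = 0.2  # assumed waiting period in seconds


def determine_with_grace(read_sensor1, read_sensor2, send_to_output):
    """Fig. 5 flow with a waiting period before sampling the 2 nd sensor,
    for arrangements where a hand crosses the 1 st detection range first."""
    info1 = read_sensor1()
    if info1.intensity < PRESENCE_THRESHOLD:
        return
    time.sleep(GRACE_PERIOD)   # give the hand time to reach the 2 nd detection range
    info2 = read_sensor2()
    if info2.intensity >= PROXIMITY_THRESHOLD:
        return                 # the object is heading for the operation unit
    send_to_output({"action": "object_detected_by_sensor1"})
```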
In addition, embodiments 1 and 2 may be realized as a display system in which the display unit of the display device is a separate display output device, with the output processing unit and the display output device connected via a network.
In embodiments 1 and 2, examples in which the input control device is applied to a vehicle-mounted display device are shown, but the display device is not limited to a vehicle-mounted one.
In addition, although embodiments 1 and 2 show examples in which the input control device is applied to a display device, the input control device may be applied to an audio device that outputs audio, a mechanical device that mechanically operates according to electric control, or the like.
In the present invention, the embodiments may be freely combined, or any component of the embodiments may be modified, or any component may be omitted in the embodiments within the scope of the invention.
Industrial applicability of the invention
The input control device according to the present invention can be applied to a device that is input by a user, such as a display device.
Description of the reference symbols
1 input control device,
2, 2a input device,
3, 3a display device,
11 1 st receiving unit,
12 2 nd receiving unit,
13 process determination unit,
21, 211, 212 1 st sensor,
21a 1 st transmitting unit,
21b 1 st receiving unit,
22, 221, 222 2 nd sensor,
22a 2 nd transmitting unit,
22b 2 nd receiving unit,
23 operation unit,
24 receiving unit,
31 output processing unit,
32 display unit,
33 design light-emitting unit,
34 design illumination unit,
35 light guide,
51, 511, 512 1 st detection range,
52, 521, 522 2 nd detection range,
53 external panel,
201 processing circuit,
202 HDD,
203 input interface device,
204 output interface device,
205 memory,
206 CPU.

Claims (16)

1. An input control device, comprising:
a 1 st receiving unit that receives 1 st detection information from a 1 st sensor that detects that an object is present in a predetermined area;
a 2 nd receiving unit that receives 2 nd detection information from a 2 nd sensor that detects that an object approaches an operation unit; and
a process determination unit that determines whether or not to perform a process corresponding to the detection of the object by the 1 st sensor, based on the 1 st detection information received by the 1 st receiving unit and the 2 nd detection information received by the 2 nd receiving unit.
2. The input control device of claim 1,
wherein the process determination unit determines whether or not to perform a process corresponding to the detection of the object by the 1 st sensor based on information on the detection intensity in the 1 st detection information received by the 1 st receiving unit and the 2 nd detection information received by the 2 nd receiving unit.
3. The input control device of claim 1,
wherein the process determination unit determines whether or not to perform a process corresponding to the detection of the object by the 1 st sensor based on the length of the detection time in the 1 st detection information received by the 1 st receiving unit and the 2 nd detection information received by the 2 nd receiving unit.
4. The input control device of claim 1,
wherein the process determination unit determines whether or not to perform a process corresponding to the detection of the object by the 1 st sensor based on the detection order of the 1 st detection information received by the 1 st receiving unit and the 2 nd detection information received by the 2 nd receiving unit.
5. The input control device of claim 1,
wherein the 1 st receiving unit receives the 1 st detection information from a plurality of the 1 st sensors, respectively, and
the process determination unit determines whether or not to perform the process corresponding to the detection of the object by the 1 st sensors based on the 1 st detection information received by the 1 st receiving unit and corresponding to each of the 1 st sensors, the detection order of the object by the 1 st sensors, and the 2 nd detection information received by the 2 nd receiving unit.
6. The input control device of claim 1,
wherein the 2 nd receiving unit receives the 2 nd detection information from a plurality of the 2 nd sensors, respectively, and
the process determination unit determines whether or not to perform the process corresponding to the detection of the object by the 1 st sensor based on the 1 st detection information received by the 1 st receiving unit, the 2 nd detection information received by the 2 nd receiving unit and corresponding to each of the 2 nd sensors, and the detection order of the object by the 2 nd sensors.
7. An input device, comprising:
the 1 st sensor, the 1 st sensor detects that an object exists in a predetermined area;
the 2 nd sensor that detects that an object approaches the operation unit; and
the input control device of claim 1.
8. The input device of claim 7,
the 1 st sensor and the 2 nd sensor are non-contact sensors.
9. The input device of claim 8,
the 1 st sensor and the 2 nd sensor are each one of an infrared sensor, an ultrasonic sensor, and a capacitance sensor.
10. The input device of claim 8,
wherein the 1 st sensor and the 2 nd sensor receive their respective signals with a single common receiving unit.
11. The input device of claim 7,
includes the operation portion.
12. The input device of claim 11,
wherein the 2 nd sensor is an infrared sensor,
at least a part of the operation unit is formed of an infrared-transmitting member, and
the 2 nd sensor emits infrared light toward the infrared-transmitting member.
13. The input device of claim 12,
wherein a light guide is provided between the front surface of the light-receiving element of the 2 nd sensor and the back surface of the operation unit, and
the light guide guides at least a part of the infrared light emitted from the 2 nd sensor to the light-receiving element of the 2 nd sensor.
14. The input device of claim 11,
wherein the 2 nd sensor is a non-contact capacitance sensor disposed on the front surface of the operation unit.
15. An in-vehicle input device, characterized by comprising the input control device of claim 1.
16. An input control method, characterized in that,
a 1 st receiving unit receives 1 st detection information from a 1 st sensor that detects that an object is present in a predetermined area,
a 2 nd receiving unit receives 2 nd detection information from a 2 nd sensor that detects that an object approaches an operation unit, and
a process determination unit determines whether or not to perform a process corresponding to the detection of the object by the 1 st sensor based on the 1 st detection information received by the 1 st receiving unit and the 2 nd detection information received by the 2 nd receiving unit.
CN201880087256.7A 2018-01-26 2018-01-26 Input control device, input device, and input control method Pending CN111630474A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/002477 WO2019146072A1 (en) 2018-01-26 2018-01-26 Input control device, input device, and input control method

Publications (1)

Publication Number Publication Date
CN111630474A (en) 2020-09-04

Family

ID=67394577

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201880087256.7A Pending CN111630474A (en) 2018-01-26 2018-01-26 Input control device, input device, and input control method

Country Status (3)

Country Link
CN (1) CN111630474A (en)
DE (1) DE112018006942T5 (en)
WO (1) WO2019146072A1 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102958756A (en) * 2011-04-22 2013-03-06 松下电器产业株式会社 Input device for vehicle and input method for vehicle
CN104115118A (en) * 2012-03-01 2014-10-22 高通股份有限公司 Gesture detection based on information from multiple types of sensors
JP2015225423A (en) * 2014-05-27 2015-12-14 京セラディスプレイ株式会社 Display device
JP2015225424A (en) * 2014-05-27 2015-12-14 三菱電機株式会社 Software device
JP2016179723A (en) * 2015-03-24 2016-10-13 アルパイン株式会社 On-vehicle apparatus
CN106839290A (en) * 2017-01-16 2017-06-13 广东美的制冷设备有限公司 The control method and control device and air-conditioner of gesture identification
WO2017130344A1 (en) * 2016-01-28 2017-08-03 三菱電機株式会社 Interface device, contact detection device, and contact detection method

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2013058117A (en) * 2011-09-09 2013-03-28 Alps Electric Co Ltd Input device
JP2015201054A (en) * 2014-04-08 2015-11-12 アルプス電気株式会社 Proximity detection type input device
JP6169298B1 (en) * 2017-02-16 2017-07-26 京セラ株式会社 Electronic device and control method

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102958756A (en) * 2011-04-22 2013-03-06 松下电器产业株式会社 Input device for vehicle and input method for vehicle
US20130090807A1 (en) * 2011-04-22 2013-04-11 Yoshihiro Kojima Vehicular input device and vehicular input method
CN104115118A (en) * 2012-03-01 2014-10-22 高通股份有限公司 Gesture detection based on information from multiple types of sensors
JP2015225423A (en) * 2014-05-27 2015-12-14 京セラディスプレイ株式会社 Display device
JP2015225424A (en) * 2014-05-27 2015-12-14 三菱電機株式会社 Software device
JP2016179723A (en) * 2015-03-24 2016-10-13 アルパイン株式会社 On-vehicle apparatus
WO2017130344A1 (en) * 2016-01-28 2017-08-03 三菱電機株式会社 Interface device, contact detection device, and contact detection method
CN106839290A (en) * 2017-01-16 2017-06-13 广东美的制冷设备有限公司 The control method and control device and air-conditioner of gesture identification

Also Published As

Publication number Publication date
DE112018006942T5 (en) 2020-11-19
WO2019146072A1 (en) 2019-08-01

Similar Documents

Publication Publication Date Title
US9440537B2 (en) Method and device for operating functions displayed on a display unit of a vehicle using gestures which are carried out in a three-dimensional space, and corresponding computer program product
KR102441082B1 (en) Apparatus for remote parking control, system having the same and method thereof
KR101367593B1 (en) Interactive operating device and method for operating the interactive operating device
US9475390B2 (en) Method and device for providing a user interface in a vehicle
CN110419211A (en) Information processing unit, information processing method and message handling program
CN103085734A (en) Proximity switch having wrong touch feedback
US10175695B2 (en) Apparatus and method for controlling parking-out of vehicle
US20190286123A1 (en) Remote vehicle control device and remote vehicle control method
US11181909B2 (en) Remote vehicle control device, remote vehicle control system, and remote vehicle control method
CN114026462A (en) Electronic device and control method thereof
EP3008977A1 (en) Identification device, method, and computer program product
CN108628433A (en) Gesture identification method and its system
CN102662533A (en) Method for eliminating non-normal contact interference in infrared touch system
CN111630474A (en) Input control device, input device, and input control method
JP2013195326A (en) Object detection apparatus, object detection method, object detection program, and information storage medium
US11334243B2 (en) Input control device
US20200380726A1 (en) Dynamic three-dimensional imaging distance safeguard
CN103809819A (en) Optical touch device and touch method thereof
JP7345684B2 (en) Autonomous driving system, server, and dynamic map generation method
JP7258260B2 (en) Occupant determination device and occupant determination method
EP3977168B1 (en) Electronic apparatus and control method thereof
US20140184491A1 (en) System and method for providing user interface using an optical scanning
KR101783419B1 (en) User interface apparatus and method for controlling the same
CN104050442A (en) Gesture recognition system operability verification
CN110568922A (en) Method for recognizing input

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200904