WO2019146072A1 - Input control device, input device, and input control method - Google Patents


Info

Publication number
WO2019146072A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
detection
unit
detection information
reception unit
Prior art date
Application number
PCT/JP2018/002477
Other languages
French (fr)
Japanese (ja)
Inventor
新仁 芹澤
Original Assignee
Mitsubishi Electric Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Mitsubishi Electric Corporation
Priority to DE112018006942.7T priority Critical patent/DE112018006942T5/en
Priority to CN201880087256.7A priority patent/CN111630474A/en
Priority to PCT/JP2018/002477 priority patent/WO2019146072A1/en
Publication of WO2019146072A1 publication Critical patent/WO2019146072A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures

Definitions

  • the present invention relates to an input control device, an input device, and an input control method.
  • when a user performs an operation input to various devices, techniques are being studied to prevent the devices from executing processing not intended by the user.
  • Patent Document 1 discloses a processing control apparatus that includes a detection unit that detects the presence of a driver's hand, a processing unit that performs predetermined processing corresponding to the detection, an acquisition unit that acquires operation information indicating that an operation unit of an external device has been operated, and a control unit that controls the processing unit so that execution of the processing corresponding to hand detection is restricted when the operation information is acquired.
  • however, the processing control device disclosed in Patent Document 1 restricts execution of the processing corresponding to detection of the presence of a hand only when the operation information indicating that the external device has been operated is acquired. The processing control device therefore has a problem in that it cannot limit execution of processing unintended by the user during the period from detection of the presence of a hand to acquisition of the operation information on the external device.
  • the present invention is intended to solve the above problem, and has as its object to provide an input control device capable of preventing processing not intended by a user from being executed.
  • an input control device according to the present invention includes a first reception unit that receives first detection information from a first sensor that detects that an object is present in a predetermined area, a second reception unit that receives second detection information from a second sensor that detects that an object is approaching an operation unit, and a processing determination unit that determines, based on the first detection information received by the first reception unit and the second detection information received by the second reception unit, whether to perform processing corresponding to detection of an object by the first sensor.
  • FIG. 1 is a block diagram showing a configuration of a display device to which an input control device according to Embodiment 1 is applied.
  • FIGS. 2A and 2B are diagrams showing an example of the hardware configuration of the input control device according to Embodiment 1.
  • FIG. 3A is a front view showing a configuration example of the display device to which the input control device according to Embodiment 1 is applied, viewed from the front.
  • FIG. 3B is a side sectional view showing a configuration example of the X-X′ cross section of the display device shown in FIG. 3A, viewed from the right side.
  • FIG. 4 is a front view showing a modification of the display device to which the input control device according to Embodiment 1 is applied, viewed from the front.
  • FIG. 5 is a flowchart for explaining the operation of the input control device according to Embodiment 1.
  • FIG. 6 is a flowchart for explaining the operation of the input control device according to a modification of Embodiment 1.
  • FIG. 7 is a side sectional view showing a modification of the configuration in which the X-X′ cross section of the display device shown in FIG. 3A is viewed from the right side.
  • FIG. 8 is a side sectional view showing another modification of the configuration in which the X-X′ cross section of the display device shown in FIG. 3A is viewed from the right side.
  • FIG. 9 is a block diagram showing a configuration of a display device to which the input control device according to Embodiment 2 is applied.
  • FIG. 10 is a front view showing a configuration example of the display device to which the input control device according to Embodiment 2 is applied, viewed from the front.
  • FIG. 11 is a flowchart for explaining the operation of the input control device according to Embodiment 2.
  • Embodiment 1.
  • the input control device 1 according to the first embodiment will be described below as an example applied to the display device 3 mounted in a vehicle.
  • FIG. 1 is a block diagram showing a configuration of a display device 3 to which an input control device 1 according to Embodiment 1 is applied.
  • the input control device 1 includes a first reception unit 11, a second reception unit 12, and a process determination unit 13.
  • the first reception unit 11 receives first detection information from a first sensor 21 that detects that an object is present in a predetermined area.
  • the first reception unit 11 transmits the received first detection information to the processing determination unit 13.
  • the first sensor 21 and the processing determination unit 13 will be described later.
  • the second reception unit 12 receives second detection information from a second sensor 22 that detects that an object is approaching the operation unit 23.
  • the second accepting unit 12 transmits the accepted second detection information to the process determining unit 13.
  • the operation unit 23 and the second sensor 22 will be described later.
  • the processing determination unit 13 determines whether to perform processing corresponding to detection of an object by the first sensor 21, based on the first detection information received by the first reception unit 11 and the second detection information received by the second reception unit 12. The processing determination unit 13 generates determination result information indicating the determination result, and transmits the generated determination result information to the output processing unit 31 described later.
  • the input device 2 includes an input control device 1, a first sensor 21, a second sensor 22, and an operation unit 23.
  • the first sensor 21 is a sensor that detects that an object is present in a predetermined area (hereinafter referred to as the "first detection range") in the space around the display device 3, and is a noncontact sensor such as an infrared sensor, an ultrasonic sensor, or a noncontact capacitance sensor.
  • Information of the object detected by the first sensor 21 is transmitted to the first reception unit 11 as first detection information.
  • the first detection information is information indicating the detection intensity of the object detected by the first sensor 21.
  • the detection intensity is the intensity of the received light if the first sensor 21 is an infrared sensor, the intensity of the received sound wave if it is an ultrasonic sensor, or the amount of change in capacitance if it is a noncontact capacitance sensor.
  • the second sensor 22 is a sensor that detects that an object approaches the operation unit 23, and is a noncontact sensor such as an infrared sensor, an ultrasonic sensor, or a noncontact capacitance sensor.
  • Information of the object detected by the second sensor 22 is transmitted to the second accepting unit 12 as second detection information.
  • the second detection information is information indicating the detection intensity of the object detected by the second sensor 22.
  • the detection intensity is the intensity of the received light if the second sensor 22 is an infrared sensor, the intensity of the received sound wave if it is an ultrasonic sensor, or the amount of change in capacitance if it is a noncontact capacitance sensor.
  • the operation unit 23 is a touch sensor, a push button or the like for a user such as a driver or a passenger to input an operation.
  • the form of the operation unit 23 is not limited to a touch sensor, a push button, or the like; the user may input an operation in a non-contact manner.
  • when an operation is input to the operation unit 23, the display device 3 described later executes processing corresponding to the operation.
  • the display device 3 includes an input device 2, an output processing unit 31, and a display unit 32.
  • the output processing unit 31 generates output information to be output by the display unit 32 based on the determination result information transmitted from the process determination unit 13, and transmits the generated output information to the display unit 32.
  • the display unit 32 is a display output device such as a display.
  • the display unit 32 displays the output information transmitted from the output processing unit 31.
  • for example, when the display device 3 functions as a navigation device, the display device 3 performs route guidance to a destination using a GPS (Global Positioning System) or the like, and displays a map and route guidance images on the display unit 32.
  • the processing corresponding to detection of an object by the first sensor 21 is, for example, processing that, when the first sensor 21 detects an object such as a user's hand, superimposes an image such as a pop-up menu on the map and route guidance images displayed on the display unit 32.
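The component relationships described so far (sensors, reception units, processing determination unit) can be sketched as follows. This sketch is illustrative only and not part of the disclosure: all class and method names, and the rule that a positive intensity means "detected", are assumptions made for the example.

```python
# Illustrative sketch of the Embodiment 1 components; all names are hypothetical.

class FirstReceptionUnit:
    """Receives first detection information from the first sensor (unit 11)."""
    def receive(self, first_sensor):
        # first detection information = detection intensity of the object
        return first_sensor.detection_intensity()

class SecondReceptionUnit:
    """Receives second detection information from the second sensor (unit 12)."""
    def receive(self, second_sensor):
        return second_sensor.detection_intensity()

class ProcessingDeterminationUnit:
    """Decides whether to run the process tied to first-sensor detection (unit 13)."""
    def determine(self, first_info, second_info):
        object_in_area = first_info > 0    # object inside the first detection range
        approaching_op = second_info > 0   # object approaching the operation unit
        # Perform the gesture-related process only when an object is in the
        # surrounding area AND is not approaching the operation unit.
        return object_in_area and not approaching_op
```

The key design point mirrors the text: the determination uses both pieces of detection information, so the pop-up process is suppressed as soon as the second sensor reports an approach, without waiting for the operation unit to actually be operated.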
  • FIG. 2 is a diagram illustrating an example of a hardware configuration of the input control device 1 according to the first embodiment.
  • each function of the first reception unit 11, the second reception unit 12, and the processing determination unit 13 is realized by a processing circuit 201. That is, the input control device 1 includes the processing circuit 201 for determining, based on the first detection information and the second detection information, whether to perform the processing corresponding to detection of an object by the first sensor 21.
  • the processing circuit 201 may be dedicated hardware as shown in FIG. 2A or a CPU (Central Processing Unit) 206 that executes a program stored in the memory 205 as shown in FIG. 2B.
  • when the processing circuit 201 is dedicated hardware, the processing circuit 201 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination thereof.
  • when the processing circuit is the CPU 206, each function of the first reception unit 11, the second reception unit 12, and the processing determination unit 13 is realized by software, firmware, or a combination of software and firmware. That is, the functions of the first reception unit 11, the second reception unit 12, and the processing determination unit 13 are realized by the CPU 206 or a system LSI (Large-Scale Integration) reading and executing a program stored in the HDD (Hard Disk Drive) 202, the memory 205, or the like. It can also be said that the program stored in the HDD 202, the memory 205, or the like causes a computer to execute the procedures or methods of the first reception unit 11, the second reception unit 12, and the processing determination unit 13.
  • the memory 205 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read Only Memory), or a magnetic disk, a flexible disk, an optical disk, a compact disc, a mini disc, a DVD (Digital Versatile Disc), or the like.
  • the functions of the first reception unit 11, the second reception unit 12, and the processing determination unit 13 may be partially realized by dedicated hardware and partially realized by software or firmware.
  • for example, the functions of the first reception unit 11 and the second reception unit 12 can be realized by the processing circuit 201 as dedicated hardware, and the function of the processing determination unit 13 can be realized by the processing circuit reading and executing the program stored in the memory 205.
  • the input control device 1 includes an input interface device 203 and an output interface device 204 for communicating with the first sensor 21, the second sensor 22, the operation unit 23, and the output processing unit 31.
  • the hardware configuration of the input control device 1 has been described as using the HDD 202 as shown in FIG. 2B, but a solid state drive (SSD) may be used instead of the HDD 202.
  • FIG. 3A is a front view showing a configuration example as viewed from the front of the display device 3 to which the input control device 1 according to the first embodiment is applied.
  • FIG. 3B is a side sectional view showing a configuration example of the X-X′ cross section of the display device 3 shown in FIG. 3A, viewed from the right side.
  • in the display device 3, an operation unit 23, a display unit 32, and an exterior panel 53 are arranged.
  • the first sensor 21 and the second sensor 22 are arranged inside the display device 3.
  • when the first sensor 21 is a sensor, such as an infrared sensor, in which a signal transmitted from the sensor is reflected by an object and the sensor receives the reflected signal, the first sensor 21 includes a first transmitting unit 21a that transmits a signal and a first receiving unit 21b that receives the reflected signal.
  • likewise, when the second sensor 22 is a sensor, such as an infrared sensor, in which a signal transmitted from the sensor is reflected by an object and the sensor receives the reflected signal, the second sensor 22 includes a second transmitting unit 22a that transmits a signal and a second receiving unit 22b that receives the reflected signal.
  • the case where both the first sensor 21 and the second sensor 22 are infrared sensors will be described as an example.
  • in this case, the first transmitting unit 21a of the first sensor 21 and the second transmitting unit 22a of the second sensor 22 are both infrared light emitting elements.
  • the first receiving unit 21b of the first sensor 21 and the second receiving unit 22b of the second sensor 22 are both infrared light receiving elements.
  • the first transmitting unit 21a and the second transmitting unit 22a each emit infrared light from the inside of the display device 3 toward the front.
  • the first receiving unit 21b and the second receiving unit 22b receive the reflected light of the infrared light emitted by the first transmitting unit 21a and the second transmitting unit 22a, respectively. That is, the first sensor 21 and the second sensor 22 each emit infrared light from the inside of the display device 3 toward the front, and each receive the reflected light of the infrared light it emitted. As shown in FIGS. 3A and 3B, the second sensor 22 is disposed behind the operation surface of the operation unit 23.
  • the operation unit 23 is formed of an infrared transmitting member that transmits infrared light.
  • a part of the members of the display device 3 positioned in the direction in which the first sensor 21 emits infrared light is formed of an infrared transmitting member that transmits infrared light.
  • although the case where the first sensor 21 and the second sensor 22 are each constituted by an infrared sensor has been described, the sensors are not limited to this; as long as the first sensor 21 and the second sensor 22 are each noncontact sensors, they may be, for example, ultrasonic sensors or noncontact capacitance sensors.
  • although the case where the first sensor 21 and the second sensor 22 are both infrared sensors, that is, sensors of the same type, has been described, the present invention is not limited to this.
  • as long as the first sensor 21 and the second sensor 22 are each noncontact sensors, they may be sensors of different types.
  • the operation unit 23 and the exterior panel 53 do not necessarily have to be formed of an infrared transmitting member, and may be formed of a conductive member, a resin member, or the like according to the type of sensor.
  • a first detection range 51 is set as a range in which the first sensor 21 detects an object.
  • the first detection range 51 is a range for detecting whether or not an object is present around the display device 3, and is set in advance in a space around the display device 3.
  • a second detection range 52 is set as a range in which the second sensor 22 detects an object.
  • the second detection range 52 is a range for detecting whether or not an object approaches the operation unit 23, and is set in advance in a space around the operation unit 23.
  • in the display device 3, a so-called gesture operation is also possible, in which the user operates the display device 3 by a motion such as waving a hand around the display device 3 without touching it.
  • the first sensor 21 is used to input the gesture operation.
  • for example, when the user starts a gesture operation, the display device 3 superimposes, as a pop-up image, guidance on the next gesture operation input on the map and route guidance images displayed on the display unit 32.
  • when the user performs a gesture operation or operates the operation unit 23, the user's hand, arm, or the like may be present in both the first detection range 51 and the second detection range 52.
  • when the user's hand approaches the display device 3 and the user's hand, arm, or the like enters the first detection range 51, the first sensor 21 detects that an object is present within the range and transmits this to the first reception unit 11 as first detection information.
  • likewise, when the user's hand, arm, or the like enters the second detection range 52, the second sensor 22 detects that the object is approaching the operation unit 23 and transmits this to the second reception unit 12 as second detection information.
  • FIG. 4 is a front view of a modification of the display device 3 to which the input control device 1 according to the first embodiment is applied, viewed from the front.
  • FIGS. 3A and 3B show the case where the first detection range 51 and the second detection range 52 have an overlapping range, whereas FIG. 4 shows the case where the first detection range 51 and the second detection range 52 have no overlapping range. Even when there is no overlapping range, as shown in FIG. 4, when the user operates the operation unit 23, that is, when the user's hand approaches the second sensor 22, the user's arm may fall within the first detection range 51. Therefore, Embodiment 1 does not limit whether or not the first detection range 51 and the second detection range 52 overlap.
  • the first sensor 21 transmits the first detection information as needed regardless of whether an object is present in the range of the first detection range 51.
  • the second sensor 22 transmits the second detection information as needed regardless of whether or not an object is present in the range of the second detection range 52.
  • FIG. 5 is a flowchart for explaining the operation of the input control device 1 according to the first embodiment.
  • while the input control device 1 is in operation, the input control device 1 repeatedly executes the processing shown in this flowchart, and the display device 3 generates and displays output information based on the determination result information from the input control device 1.
  • first, the first reception unit 11 receives the first detection information from the first sensor 21 (step ST1). Based on the received first detection information, the processing determination unit 13 determines whether an object is present within the first detection range 51, that is, in the predetermined area in the space around the display device 3 (step ST2).
  • in step ST2, when the processing determination unit 13 determines that no object is present in the predetermined area in the space around the display device 3 (step ST2 "NO"), the input control device 1 ends the operation.
  • in step ST2, when the processing determination unit 13 determines that an object is present in the predetermined area in the space around the display device 3 (step ST2 "YES"), the second reception unit 12 receives the second detection information from the second sensor 22 (step ST3). Next, the processing determination unit 13 determines whether the object is approaching the operation unit 23 based on the received second detection information (step ST4).
  • in step ST4, when the processing determination unit 13 determines that the object is approaching the operation unit 23 (step ST4 "YES"), the input control device 1 ends the operation.
  • in step ST4, when the processing determination unit 13 determines that the object is not approaching the operation unit 23 (step ST4 "NO"), the processing determination unit 13 generates determination result information instructing the output processing unit 31 to generate output information for the processing corresponding to detection of the object by the first sensor 21 (step ST11). Thereafter, the processing determination unit 13 transmits the determination result information to the output processing unit 31 (step ST12), and the input control device 1 ends the operation.
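One pass of the flow of steps ST1 through ST12 can be sketched as follows. This is an illustrative sketch only, not part of the disclosure: the function name, the threshold-based presence test, and the dictionary shape of the determination result information are all assumptions.

```python
# One pass of the Embodiment 1 flow (FIG. 5); names and threshold are assumed.

PRESENCE_THRESHOLD = 0.5  # assumed: intensity at or above this means "detected"

def run_once(first_sensor_intensity, second_sensor_intensity, output_processing):
    # ST1/ST2: receive first detection info; is an object in the first detection range?
    if first_sensor_intensity < PRESENCE_THRESHOLD:
        return None  # ST2 "NO": end the operation
    # ST3/ST4: receive second detection info; is the object approaching the operation unit?
    if second_sensor_intensity >= PRESENCE_THRESHOLD:
        return None  # ST4 "YES": user is reaching for the operation unit; end
    # ST11: generate determination result information instructing the pop-up process
    result = {"perform_first_sensor_process": True}
    # ST12: transmit the determination result information to the output processing unit
    output_processing(result)
    return result
```

In use, `run_once` would be called repeatedly while the device operates, with `output_processing` standing in for the output processing unit 31.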
  • as described above, the input control device 1 according to Embodiment 1 includes the first reception unit 11 that receives the first detection information from the first sensor 21 that detects that an object is present in the predetermined area, the second reception unit 12 that receives the second detection information from the second sensor 22 that detects that an object is approaching the operation unit 23, and the processing determination unit 13 that determines whether to perform processing corresponding to detection of an object by the first sensor 21 based on the first detection information received by the first reception unit 11 and the second detection information received by the second reception unit 12.
  • with this configuration, the input control device 1 can prevent processing not intended by the user from being executed.
  • specifically, the processing determination unit 13 determines whether to perform the processing corresponding to detection of an object by the first sensor 21 based on whether an object is present in the predetermined area in the space around the display device 3 and whether the object is approaching the operation unit 23.
  • a modification of the first embodiment will be described. In the modification, in addition to the processing of the first embodiment described above, processing of comparing the detected intensity of the object in the first sensor 21 with the detected intensity of the object in the second sensor 22 is added.
  • FIG. 6 is a flow chart for explaining the operation of the input control device 1 according to the modification of the first embodiment.
  • the difference between FIG. 5 and FIG. 6 is that, when the determination in step ST4 in FIG. 5 is "YES", the process in step ST5 in FIG. 6 is added.
  • in the following, only the processing that differs from FIG. 5 will be described with reference to FIG. 6.
  • in step ST4, when the processing determination unit 13 determines that the object is approaching the operation unit 23 (step ST4 "YES"), the processing determination unit 13 compares the detection intensity of the object at the first sensor 21 with the detection intensity of the object at the second sensor 22, and determines whether the detection intensity of the second sensor 22 is stronger than the detection intensity of the first sensor 21 (step ST5).
  • in step ST5, when the processing determination unit 13 determines that the detection intensity of the second sensor 22 is stronger than the detection intensity of the first sensor 21 (step ST5 "YES"), the processing determination unit 13 determines that the user is about to operate the operation unit 23, and the input control device 1 ends the operation.
  • in step ST5, when the processing determination unit 13 determines that the detection intensity of the second sensor 22 is not stronger than the detection intensity of the first sensor 21 (step ST5 "NO"), the processing determination unit 13 performs the processing from step ST11 onward.
  • in this way, by having the processing determination unit 13 compare the detection intensity of the object at the first sensor 21 with the detection intensity of the object at the second sensor 22, the input control device 1 can more accurately determine whether to perform the processing corresponding to detection of an object by the first sensor 21.
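The modified flow (FIG. 6) changes only what happens when step ST4 is "YES": the intensities are compared before the operation is abandoned. A minimal sketch of the modified pass follows; it is illustrative only, and the function name and threshold are assumptions, not part of the disclosure.

```python
def run_once_with_st5(first_intensity, second_intensity, threshold=0.5):
    """One pass of the modified flow (FIG. 6). Returns True when the process
    corresponding to first-sensor detection should run. Names/threshold assumed."""
    if first_intensity < threshold:
        return False                 # ST2 "NO": no object in the first detection range
    if second_intensity >= threshold:
        # ST5: object is near the operation unit; compare the two intensities
        if second_intensity > first_intensity:
            return False             # ST5 "YES": user is about to operate the unit
        # ST5 "NO": first sensor still dominates; fall through to ST11
    return True                      # ST11/ST12: perform the gesture-related process
```

Note how a hand that merely grazes the second detection range while gesturing (second intensity weaker than first) no longer suppresses the gesture process.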
  • in another modification, in addition to the processing of Embodiment 1 described above, the processing determination unit 13 compares the detection time of the object at the first sensor 21 with the detection time of the object at the second sensor 22.
  • this modification is realized by changing the determination process of step ST5 performed by the processing determination unit 13 to a process of determining whether the time at which the second sensor 22 detected the object is earlier than the time at which the first sensor 21 detected the object.
  • note that, although the detection times of the object at the first sensor 21 and the second sensor 22 are compared here, the lengths of the periods during which the first sensor 21 and the second sensor 22 detect the object may be compared instead.
  • in yet another modification, the processing determination unit 13 compares the period during which the detection intensity of the object at the first sensor 21 is equal to or greater than a threshold with the period during which the detection intensity of the object at the second sensor 22 is equal to or greater than a threshold.
  • this modification is realized by changing the determination process of step ST5 performed by the processing determination unit 13 to a process of determining whether the period during which the detection intensity of the object at the second sensor 22 is equal to or greater than the threshold is longer than the period during which the detection intensity of the object at the first sensor 21 is equal to or greater than the threshold.
  • in a further modification, the processing determination unit 13 compares the detection intensity of the object at the second sensor 22 with a preset threshold. This modification is realized by changing the determination process of step ST5 performed by the processing determination unit 13 to a process of determining whether the detection intensity of the object at the second sensor 22 is greater than the preset threshold.
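Each of the variants above replaces only the step-ST5 predicate, leaving the rest of the flow unchanged. The interchangeable predicates can be sketched as follows; the sketch is illustrative only, and the function names and the sample-list data shape are assumptions, not part of the disclosure.

```python
# Hypothetical ST5 predicates: each returns True when the user is judged to be
# operating the operation unit (so the gesture-related process is suppressed).

def st5_by_time(first_detect_time, second_detect_time):
    # Variant: the second sensor detected the object earlier than the first
    return second_detect_time < first_detect_time

def st5_by_period(first_samples, second_samples, threshold):
    # Variant: the second sensor's intensity stayed at/above the threshold
    # for more samples than the first sensor's did
    first_period = sum(1 for s in first_samples if s >= threshold)
    second_period = sum(1 for s in second_samples if s >= threshold)
    return second_period > first_period

def st5_by_threshold(second_intensity, threshold):
    # Variant: the second sensor's intensity alone exceeds a preset threshold
    return second_intensity > threshold
```

Because all three share the same boolean contract, swapping one for another changes only step ST5, exactly as the text describes.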
  • FIG. 7 is a side sectional view showing a modified example of the configuration in which the XX ′ cross section of the display device 3 shown in FIG. 3A is viewed from the right side surface.
  • the configurations of the first sensor 21 and the second sensor 22 are different between the display device 3 shown in FIG. 3B and the display device 3 shown in FIG. 7.
  • in FIG. 7, the first receiving unit 21b of the first sensor 21 and the second receiving unit 22b of the second sensor 22 are realized by a single shared receiving unit 24.
  • the receiving unit 24 is a light receiving element.
  • both the reflected light of the infrared light emitted from the first transmitting unit 21a of the first sensor 21 and the reflected light of the infrared light emitted from the second transmitting unit 22a of the second sensor 22 are received by the receiving unit 24.
  • FIG. 8 is a side sectional view, different from FIG. 7, showing a modification of the configuration in which the X-X′ cross section of the display device 3 shown in FIG. 3A is viewed from the right side.
  • the following configuration is different between the display device 3 shown in FIG. 7 and the display device 3 shown in FIG.
  • in FIG. 8, a design light emitting unit 33 that emits light for design from the inside of the display device 3 toward the front, and a design illumination unit 34 that presents the light emitted by the design light emitting unit 33 to the outside of the display device 3 as decorative illumination, are added.
  • the design illumination unit 34 is disposed in the operation unit 23.
  • by configuring in this manner, the degree of freedom in the arrangement of the second transmitting unit 22a can be increased.
  • in FIG. 8, a light guide 35 that guides, in a predetermined direction, the reflected light of the infrared light that has entered the operation unit 23 is added.
  • the light guide 35 is disposed so as to connect between the back surface of the operation unit 23 and the reception unit 24.
  • the light guide 35 guides at least a part of the reflected light of the infrared light incident from the front surface of the operation unit 23 to the receiving unit 24.
  • the degree of freedom in the arrangement of the receiving unit 24 can be increased.
  • the light guide 35 may guide at least a part of the infrared light emitted from the second transmitting unit 22a to the front surface of the operation unit 23.
  • a part of the operation unit 23 may be used as the light guide 35. By configuring in this manner, the degree of freedom in the arrangement of the second transmitting unit 22a can be increased.
  • FIG. 9 is a block diagram showing a configuration of a display device 3a to which the input control device 1 according to the second embodiment is applied.
  • in Embodiment 1, the input device 2 has only one first sensor 21, whereas in Embodiment 2, as shown in FIG. 9, a plurality of first sensors 211 and 212 are provided.
  • likewise, in Embodiment 1 the input device 2 has only one second sensor 22, whereas in Embodiment 2, as shown in FIG. 9, a plurality of second sensors 221 and 222 are provided.
  • in Embodiment 1, the first reception unit 11 receives the first detection information transmitted by the single first sensor 21, whereas in Embodiment 2, the first reception unit 11 receives all of the first detection information transmitted by each of the plurality of first sensors 211 and 212.
  • the second reception unit 12 receives the second detection information transmitted by one second sensor 22, whereas in the second embodiment, the second reception unit 12 receives the second detection information.
  • the second receiving unit 12 receives all the second detection information transmitted by each of the plurality of second sensors 221 and 222.
  • the other parts of the configuration that are the same as in the first embodiment are given the same reference numerals, and duplicate explanations are omitted.
  • FIG. 10 is a front view showing a configuration example as viewed from the front of the display device 3a to which the input control device 1 according to the second embodiment is applied.
  • a plurality of operation units 231 and 232, a display unit 32, and an exterior panel 53 are disposed.
  • inside the display device 3a, a plurality of first sensors 211 and 212 and a plurality of second sensors 221 and 222 are disposed. All the sensors emit infrared light from the inside of the display device 3a toward the front, and each sensor detects the reflected light of the infrared light it emitted.
  • the second sensor 221 is disposed behind the operation surface of the operation unit 231, and the second sensor 222 is disposed behind the operation surface of the operation unit 232.
  • a first detection range 511 is set as a range in which the first sensor 211 detects an object.
  • the first detection range 511 is a range for detecting whether or not an object is present around the display device 3a, and is set in advance in a space around the display device 3a.
  • a first detection range 512 is set as a range in which the first sensor 212 detects an object.
  • a second detection range 521 is set as a range in which the second sensor 221 detects an object.
  • the second detection range 521 is a range for detecting whether or not an object approaches the operation unit 231, and is preset in a space around the operation unit 231.
  • a second detection range 522 is set as a range in which the second sensor 222 detects an object.
  • a so-called gesture operation is also possible, in which a user such as the driver or a passenger operates the display device 3a without touching it, for example by waving a hand in the space around the display device 3a.
  • the first sensors 211 and 212 are used for input of the gesture operation. For example, when the user's hand moves from left to right through the space around the display device 3a within a predetermined time, the first sensor 211 first detects that an object is present in the first detection range 511, and the first sensor 212 then detects that an object is present in the first detection range 512. When such an order of detection establishes a predetermined detection pattern, the input control device 1 determines that a gesture operation has been performed.
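The ordered-detection check described above can be sketched in code. The following Python fragment is an illustration only, not part of the patent disclosure; the sensor identifiers, the time window, and the left-to-right pattern are assumptions made for the example.

```python
from typing import List, Tuple

# A detection event reported through the first reception unit:
# (sensor_id, timestamp in seconds).
DetectionEvent = Tuple[str, float]

# Assumed pattern for a left-to-right swipe: first detection range 511
# (sensor 211) fires before first detection range 512 (sensor 212).
SWIPE_PATTERN = ["sensor_211", "sensor_212"]

def pattern_established(events: List[DetectionEvent],
                        pattern: List[str] = SWIPE_PATTERN,
                        window_s: float = 0.5) -> bool:
    """Return True if the sensors fired in the pattern's order within the window."""
    if len(events) < len(pattern):
        return False
    events = sorted(events, key=lambda e: e[1])   # order events by time
    ids = iter(sensor_id for sensor_id, _ in events)
    # Every pattern step must appear, in order, in the detection sequence.
    if not all(step in ids for step in pattern):
        return False
    return events[-1][1] - events[0][1] <= window_s

# Hand moves left to right: sensor 211 detects first, then sensor 212.
print(pattern_established([("sensor_211", 0.00), ("sensor_212", 0.20)]))  # True
# The reverse order does not match the left-to-right pattern.
print(pattern_established([("sensor_212", 0.00), ("sensor_211", 0.20)]))  # False
```

A real implementation would feed this function the timestamps at which the first reception unit 11 received each sensor's first detection information.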
  • FIG. 11 is a flowchart for explaining the operation of the input control device 1 according to the second embodiment.
  • the first sensors 211 and 212 transmit the first detection information as needed, regardless of whether or not an object is present in the range of the first detection ranges 511 and 512.
  • the second sensors 221 and 222 transmit the second detection information as needed, regardless of whether or not an object is present in the range of the second detection ranges 521 and 522.
  • the input control device 1 repeatedly executes this flowchart while in operation, and the display device 3a generates and displays output information based on the determination result information from the input control device 1.
  • determining the direction in which an object moves from the order in which a plurality of sensors detect it can be done using known techniques; therefore, the description of how to determine whether the pattern corresponding to a gesture operation is established is omitted.
  • the first reception unit 11 receives first detection information from the first sensors 211 and 212 (step ST101). Based on the plurality of pieces of received first detection information, the processing determination unit 13 determines whether an object is present in a predetermined area in the space around the display device 3a, that is, in at least one of the first detection range 511 and the first detection range 512 (step ST102).
  • in step ST102, when the processing determination unit 13 determines that no object is present in the predetermined area in the space around the display device 3a (step ST102 "NO"), the input control device 1 ends its operation.
  • in step ST102, when the processing determination unit 13 determines that an object is present in the predetermined area in the space around the display device 3a (step ST102 "YES"), the second reception unit 12 receives, from the second sensors 221 and 222 respectively, second detection information relating to detection of an object approaching the operation units 231 and 232 (step ST103). Next, based on the received second detection information, the processing determination unit 13 determines whether an object is approaching at least one of the operation unit 231 and the operation unit 232 (step ST104).
  • in step ST104, when the processing determination unit 13 determines that an object is approaching at least one of the operation unit 231 and the operation unit 232 (step ST104 "YES"), the processing determination unit 13 determines whether a predetermined detection pattern is established by the order of detection of objects in the first detection ranges 511 and 512 (step ST105). In step ST105, when the processing determination unit 13 determines that the predetermined detection pattern is established (step ST105 "YES"), the processing determination unit 13 generates determination result information that instructs the output processing unit 31 to perform output processing corresponding to the predetermined detection pattern (step ST111).
  • in step ST105, when the processing determination unit 13 determines that the predetermined detection pattern is not established by the order of object detection in the first detection ranges 511 and 512 (step ST105 "NO"), the input control device 1 ends its operation.
  • in step ST104, when the processing determination unit 13 determines that an object is not approaching either the operation unit 231 or the operation unit 232 (step ST104 "NO"), the processing determination unit 13 determines whether a predetermined detection pattern is established by the order of detection of objects in the first detection ranges 511 and 512 (step ST121).
  • in step ST121, when the processing determination unit 13 determines that the predetermined detection pattern is established by the order of detection of objects in the first detection ranges 511 and 512 (step ST121 "YES"), the processing determination unit 13 generates determination result information that instructs the output processing unit 31 to perform output processing corresponding to the predetermined detection pattern (step ST111).
  • the processing determination unit 13 transmits the determination result information to the output processing unit 31 (step ST112), and the input control device 1 ends its operation.
  • in step ST121, when the processing determination unit 13 determines that the predetermined detection pattern is not established by the order of detection of objects in the first detection ranges 511 and 512 (step ST121 "NO"), the processing determination unit 13 generates determination result information that instructs the output processing unit 31 to perform output processing corresponding to the detection of an object by the first sensor 21 (step ST122).
  • the processing determination unit 13 then transmits the determination result information to the output processing unit 31 (step ST123), and the input control device 1 ends its operation.
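The branch structure of steps ST101 to ST123 above can be summarized in code. The following Python sketch is an illustration only; the function name and the boolean inputs are assumptions, and the pattern check stands in for the known ordered-detection technique whose details the description omits.

```python
def decide(object_in_first_ranges: bool,
           approaching_operation_unit: bool,
           pattern_established: bool) -> str:
    """Mirror the flowchart of FIG. 11 and return the resulting action.

    Returns one of:
      'none'            - no determination result information is generated
      'gesture_output'  - output processing for the predetermined pattern (ST111)
      'presence_output' - output processing for object detection by the
                          first sensor (ST122)
    """
    # ST102: is an object present in first detection range 511 or 512?
    if not object_in_first_ranges:
        return "none"
    # ST104: is an object approaching operation unit 231 or 232?
    if approaching_operation_unit:
        # ST105: pattern check while a hand is near an operation unit.
        return "gesture_output" if pattern_established else "none"
    # ST121: pattern check with no hand near an operation unit.
    if pattern_established:
        return "gesture_output"   # ST111/ST112
    return "presence_output"      # ST122/ST123

print(decide(True, False, True))   # gesture_output
print(decide(True, False, False))  # presence_output
```

The two terminal actions correspond to the two kinds of determination result information the processing determination unit 13 can send to the output processing unit 31.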
  • although Embodiment 2 shows an example in which two first sensors and two second sensors are used, the number of sensors is not limited to this; it is sufficient if at least the first sensors are plural.
  • in Embodiment 2, the processing determination unit 13 determines whether a predetermined detection pattern is established by the order of detection of objects in the first detection ranges 511 and 512, and generates, based on that determination, determination result information for instructing the output processing unit 31.
  • as a modification, the processing determination unit 13 may also determine whether a predetermined detection pattern is established by the order of detection of objects in the second detection ranges 521 and 522.
  • this modification performs, for example, the following processing. Before performing the process of step ST105 shown in FIG. 11, the processing determination unit 13 determines whether a predetermined detection pattern is established by the order of detection of objects in the second detection ranges 521 and 522.
  • if the processing determination unit 13 determines that the predetermined detection pattern is established by the order of detection of objects in the second detection ranges 521 and 522, the input control device 1 ends its operation; if the processing determination unit 13 determines that the predetermined detection pattern is not established, the process of step ST105 is performed.
  • in this modification, the number of sensors is not limited to this; it is sufficient if at least the second sensors are plural.
  • by configuring in this manner, the input control device 1 can not only prevent execution of processing not intended by the user, but also determine more accurately that processing intended by the user should be performed.
  • the second sensor itself may be configured to receive the user's operation; that is, the second sensor may have the function of the operation unit.
  • a noncontact capacitance sensor may be disposed on the front surface of the operation unit, and detection of an approaching object and detection of an operation input may be performed by the noncontact capacitance sensor.
  • in Embodiment 1 and Embodiment 2, the operation unit is provided in the display device, and the display device executes the processing corresponding to the operation.
  • however, the operation unit is not an essential component of the display device. That is, in Embodiment 1 and Embodiment 2, the operation unit may be provided in an external device different from the display device, and when the operation unit is operated, the external device may execute the processing corresponding to the operation.
  • in that case, since the second sensor detects that an object approaches the operation unit provided in the external device different from the display device, the second sensor is not an essential component of the input device and may be outside the input device.
  • in Embodiment 1 and Embodiment 2, for example, when the first sensor for detecting an operation input by a gesture operation is mounted on a so-called remote controller that operates the display device from a remote location, the first sensor and the first reception unit may be connected via a network.
  • when the operation unit is operated, the driver's hand or the like may pass through the first detection range before reaching the second detection range.
  • therefore, the processing determination unit may allow a suitable period after determining that an object is present in the predetermined area in the space around the display device before receiving the second detection information from the second sensor.
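The grace period described above could be realized as follows. This Python sketch is purely illustrative: the delay value and the `read_second_sensor` callable are assumptions, since the description only calls for "a suitable period".

```python
import time

GRACE_PERIOD_S = 0.1  # assumed value; the description only says "a suitable period"

def second_detection_after_grace(read_second_sensor, grace_s: float = GRACE_PERIOD_S):
    """After the first sensor reports an object in its detection range, wait a
    grace period before sampling the second sensor, giving a hand moving
    toward the operation unit time to pass from the first detection range
    into the second."""
    time.sleep(grace_s)
    return read_second_sensor()

# Usage with a stub read function standing in for the second reception unit:
second_info = second_detection_after_grace(lambda: {"intensity": 0.8}, grace_s=0.01)
print(second_info)
```

In practice the delay would be tuned to the geometry of the two detection ranges and typical hand speed.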
  • in Embodiment 1 and Embodiment 2, the input control device is applied to a vehicle-mounted display device.
  • however, the display device is not limited to a vehicle-mounted display device.
  • furthermore, in Embodiment 1 and Embodiment 2, the input control device is applied to a display device, but the input control device may also be applied to an audio device with audio output, an apparatus mechanically operated by electrical control, or the like.
  • within the scope of the present invention, the embodiments may be freely combined, and any component of each embodiment may be modified or omitted.
  • the input control device according to the present invention can be applied to a device, such as a display device, on which a user performs input operations.


Abstract

An input control device (1) is provided with: a first reception unit (11) that receives first detection information from a first sensor (21) that detects presence of an object in a predetermined region; a second reception unit (12) that receives second detection information from a second sensor (22) that detects approaching of the object to an operation unit (23); and a processing determining unit (13) that determines, on the basis of the first detection information received by the first reception unit (11), and the second detection information received by the second reception unit (12), whether processing corresponding to the object detection performed by the first sensor (21) is to be performed.

Description

INPUT CONTROL DEVICE, INPUT DEVICE, AND INPUT CONTROL METHOD

The present invention relates to an input control device, an input device, and an input control method.

When a user performs an operation input to various devices, techniques are being studied for preventing the devices from executing processing not intended by the user. For example, Patent Document 1 discloses a processing control apparatus including: a detection unit that detects the presence of a driver's hand; a processing unit that performs a predetermined process corresponding to the detection; an acquisition unit that acquires operation information indicating that an operation unit of an external device has been operated; and a control unit that, when the operation information is acquired, controls the processing unit so that execution of the process corresponding to the hand detection is restricted.

JP 2014-106845 A

The processing control apparatus disclosed in Patent Document 1 restricts execution of the process corresponding to detection of the presence of a hand only when it has acquired operation information indicating that the external device has been operated. Therefore, the processing control apparatus has a problem in that it cannot restrict execution of processing not intended by the user during the period from when the presence of a hand is detected until the operation information of the external device is acquired.

The present invention has been made to solve the above problem, and an object of the present invention is to provide an input control device capable of preventing processing not intended by the user from being executed.

An input control device according to the present invention includes: a first reception unit that receives first detection information from a first sensor that detects that an object is present in a predetermined area; a second reception unit that receives second detection information from a second sensor that detects that an object is approaching an operation unit; and a processing determination unit that determines, based on the first detection information received by the first reception unit and the second detection information received by the second reception unit, whether to perform processing corresponding to the detection of the object by the first sensor.

According to the present invention, it is possible to prevent processing not intended by the user from being executed.
FIG. 1 is a block diagram showing the configuration of a display device to which the input control device according to Embodiment 1 is applied.
FIGS. 2A and 2B are diagrams showing an example of the hardware configuration of the input control device according to Embodiment 1.
FIG. 3A is a front view showing a configuration example of the display device to which the input control device according to Embodiment 1 is applied, viewed from the front; FIG. 3B is a side sectional view showing a configuration example of the X-X′ cross section of the display device shown in FIG. 3A, viewed from the right side.
FIG. 4 is a front view of a modification of the display device to which the input control device according to Embodiment 1 is applied, viewed from the front.
FIG. 5 is a flowchart for explaining the operation of the input control device according to Embodiment 1.
FIG. 6 is a flowchart for explaining the operation of an input control device according to a modification of Embodiment 1.
FIG. 7 is a side sectional view showing a modification of the configuration of the X-X′ cross section of the display device shown in FIG. 3A, viewed from the right side.
FIG. 8 is a side sectional view showing another modification of the configuration of the X-X′ cross section of the display device shown in FIG. 3A, viewed from the right side.
FIG. 9 is a block diagram showing the configuration of a display device to which the input control device according to Embodiment 2 is applied.
FIG. 10 is a front view showing a configuration example of the display device to which the input control device according to Embodiment 2 is applied, viewed from the front.
FIG. 11 is a flowchart for explaining the operation of the input control device according to Embodiment 2.
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.

Embodiment 1.

The input control device 1 according to Embodiment 1 will be described below as applied, by way of example, to a display device 3 mounted in a vehicle.
FIG. 1 is a block diagram showing the configuration of a display device 3 to which the input control device 1 according to Embodiment 1 is applied.
The input control device 1 includes a first reception unit 11, a second reception unit 12, and a processing determination unit 13.
The first reception unit 11 receives first detection information from a first sensor 21 that detects that an object is present in a predetermined area. The first reception unit 11 transmits the received first detection information to the processing determination unit 13. The first sensor 21 and the processing determination unit 13 will be described later.
The second reception unit 12 receives second detection information from a second sensor 22 that detects that an object is approaching an operation unit 23. The second reception unit 12 transmits the received second detection information to the processing determination unit 13. The operation unit 23 and the second sensor 22 will be described later.
The processing determination unit 13 determines, based on the first detection information received by the first reception unit 11 and the second detection information received by the second reception unit 12, whether to perform processing corresponding to the detection of an object by the first sensor 21. The processing determination unit 13 generates determination result information indicating the result of the determination, and transmits the generated determination result information to an output processing unit 31 described later.
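The relationship among the two reception units and the processing determination unit can be sketched as follows. This Python rendering is illustrative only: the class and method names are assumptions, and the suppression policy shown (skip the first-sensor processing while an object approaches the operation unit) is one plausible reading of the determination, not the patent's stated algorithm.

```python
class ProcessingDeterminationUnit:
    """Decides whether to perform the processing that corresponds to object
    detection by the first sensor (e.g. showing a pop-up menu)."""

    def decide(self, first_detected: bool, approaching_operation_unit: bool) -> bool:
        # Perform the first-sensor processing only when an object is in the
        # first detection range and is NOT merely reaching for the operation
        # unit (illustrative suppression policy, an assumption here).
        return first_detected and not approaching_operation_unit

class InputControlDevice:
    def __init__(self):
        self.determiner = ProcessingDeterminationUnit()
        self.first_info = False   # latest info via the first reception unit
        self.second_info = False  # latest info via the second reception unit

    def on_first_detection(self, detected: bool):   # first reception unit role
        self.first_info = detected

    def on_second_detection(self, detected: bool):  # second reception unit role
        self.second_info = detected

    def determination_result(self) -> bool:
        return self.determiner.decide(self.first_info, self.second_info)

dev = InputControlDevice()
dev.on_first_detection(True)
dev.on_second_detection(True)   # the hand is heading for the operation unit
print(dev.determination_result())  # False: suppress the first-sensor processing
```

In the real device the determination result would be sent to the output processing unit 31 as determination result information rather than returned as a boolean.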
The input device 2 includes the input control device 1, the first sensor 21, the second sensor 22, and the operation unit 23.
The first sensor 21 is a sensor that detects that an object is present in a predetermined area in the space around the display device 3 (hereinafter referred to as the "first detection range"), and is a noncontact sensor such as an infrared sensor, an ultrasonic sensor, or a noncontact capacitance sensor. Information on an object detected by the first sensor 21 is transmitted to the first reception unit 11 as first detection information.
The first detection information is information indicating the detection intensity of the object detected by the first sensor 21. The detection intensity is the intensity of the received light if the first sensor 21 is an infrared sensor, the intensity of the received sound wave if it is an ultrasonic sensor, or the amount of change in capacitance if it is a noncontact capacitance sensor.
The second sensor 22 is a sensor that detects that an object is approaching the operation unit 23, and is a noncontact sensor such as an infrared sensor, an ultrasonic sensor, or a noncontact capacitance sensor. Information on an object detected by the second sensor 22 is transmitted to the second reception unit 12 as second detection information.
The second detection information is information indicating the detection intensity of the object detected by the second sensor 22. The detection intensity is the intensity of the received light if the second sensor 22 is an infrared sensor, the intensity of the received sound wave if it is an ultrasonic sensor, or the amount of change in capacitance if it is a noncontact capacitance sensor.
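Both kinds of detection information carry a detection intensity, so an object-present or object-approaching decision can be derived from them, for example, by thresholding. The sketch below is an illustration with assumed values: the thresholds and the hysteresis are not from the description, which only states that the detection information indicates detection intensity.

```python
class PresenceDetector:
    """Turn a stream of detection intensities (received light or sound
    strength, or capacitance change) into a presence decision. Hysteresis
    keeps intensity noise near the threshold from making the decision
    flicker. Threshold values here are assumptions for illustration."""

    def __init__(self, on_threshold: float = 0.6, off_threshold: float = 0.4):
        self.on_threshold = on_threshold
        self.off_threshold = off_threshold
        self.present = False

    def update(self, intensity: float) -> bool:
        if self.present:
            # Already present: stay present until intensity drops well below.
            self.present = intensity >= self.off_threshold
        else:
            # Not present: require intensity to rise above the on-threshold.
            self.present = intensity >= self.on_threshold
        return self.present

d = PresenceDetector()
print(d.update(0.7))  # True  - rises above the on-threshold
print(d.update(0.5))  # True  - stays on until below the off-threshold
print(d.update(0.3))  # False
```

The same scheme works for either sensor; only the thresholds would differ with the sensor type and the size of the detection range.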
The operation unit 23 is a touch sensor, a push button, or the like with which a user such as the driver or a passenger inputs an operation. The form of the operation unit 23 is not limited to a touch sensor or a push button; for example, it may be one, such as a noncontact capacitance sensor, with which the user can input an operation without contact. When the operation unit 23 is operated, the display device 3 described later executes processing corresponding to the operation.
The display device 3 includes the input device 2, the output processing unit 31, and a display unit 32.
The output processing unit 31 generates, based on the determination result information transmitted from the processing determination unit 13, output information to be output by the display unit 32, and transmits the generated output information to the display unit 32.
The display unit 32 is a display output device such as a display. The display unit 32 displays the output information transmitted from the output processing unit 31.
For example, when the display device 3 functions as a navigation device, the display device 3 performs route guidance to a destination using GPS (Global Positioning System) or the like, and displays a map and route guidance images on the display unit 32. Here, the processing corresponding to the detection of an object by the first sensor 21 is, for example, processing that, when the first sensor 21 detects an object such as the user's hand, superimposes an image such as a pop-up menu on the map and route guidance images displayed on the display unit 32.
FIG. 2 is a diagram showing an example of the hardware configuration of the input control device 1 according to Embodiment 1.
In Embodiment 1, the functions of the first reception unit 11, the second reception unit 12, and the processing determination unit 13 are realized by a processing circuit 201. That is, the input control device 1 includes the processing circuit 201 for determining, based on the first detection information and the second detection information, whether to perform processing corresponding to the detection of an object by the first sensor 21.
The processing circuit 201 may be dedicated hardware as shown in FIG. 2A, or may be a CPU (Central Processing Unit) 206 that executes a program stored in a memory 205 as shown in FIG. 2B.
When the processing circuit 201 is dedicated hardware, the processing circuit 201 corresponds to, for example, a single circuit, a composite circuit, a programmed processor, a parallel-programmed processor, an ASIC (Application Specific Integrated Circuit), an FPGA (Field-Programmable Gate Array), or a combination of these.
When the processing circuit 201 is the CPU 206, the functions of the first reception unit 11, the second reception unit 12, and the processing determination unit 13 are realized by software, firmware, or a combination of software and firmware. That is, the first reception unit 11, the second reception unit 12, and the processing determination unit 13 are realized by the CPU 206 executing a program stored in an HDD (Hard Disk Drive) 202, the memory 205, or the like, or by a processing circuit such as a system LSI (Large-Scale Integration). It can also be said that the program stored in the HDD 202, the memory 205, or the like causes a computer to execute the procedures or methods of the first reception unit 11, the second reception unit 12, and the processing determination unit 13. Here, the memory 205 corresponds to, for example, a nonvolatile or volatile semiconductor memory such as a RAM (Random Access Memory), a ROM (Read Only Memory), a flash memory, an EPROM (Erasable Programmable Read Only Memory), or an EEPROM (Electrically Erasable Programmable Read-Only Memory), a magnetic disk, a flexible disk, an optical disc, a compact disc, a mini disc, a DVD (Digital Versatile Disc), or the like.
The functions of the first reception unit 11, the second reception unit 12, and the processing determination unit 13 may be realized partly by dedicated hardware and partly by software or firmware. For example, the functions of the first reception unit 11 and the second reception unit 12 may be realized by the processing circuit 201 as dedicated hardware, while the function of the processing determination unit 13 may be realized by a processing circuit reading out and executing a program stored in the memory 205.
The input control device 1 also includes an input interface device 203 and an output interface device 204 for communicating with the first sensor 21, the second sensor 22, the operation unit 23, and the output processing unit 31.
In the above description, the hardware configuration of the input control device 1 uses the HDD 202 as shown in FIG. 2B, but an SSD (Solid State Drive) may be used instead of the HDD 202.
 図3Aは、実施の形態1に係る入力制御装置1が適用された表示装置3を正面から見た構成例を示す正面図である。図3Bは、図3Aに示す表示装置3のX-X´断面を、右側面から見た構成例を示す側方断面図である。
 図3Aに示すように、表示装置3には、操作部23、表示部32、及び外装用のパネル53が配置されている。
 図3Bに示すように、表示装置3の内部には、第1センサ21及び第2センサ22が配置されている。第1センサ21が、赤外線センサのように、センサから発信した信号が物体で反射し、物体で反射した信号をセンサが受信するようなセンサである場合、第1センサ21は、信号を発信する第1発信部21aと、反射した信号を受信する第1受信部21bとを備える。第2センサ22が、赤外線センサのように、センサから発信した信号が物体で反射し、物体で反射した信号をセンサが受信するようなセンサである場合、第2センサ22は、信号を発信する第2発信部22aと、反射した信号を受信する第2受信部22bとを備える。実施の形態1では、第1センサ21及び第2センサ22がいずれも赤外線センサである場合を例にとって説明する。第1センサ21及び第2センサ22がいずれも赤外線センサである場合、第1センサ21の第1発信部21a、及び第2センサ22の第2発信部22aは、いずれも赤外光の発光素子である。第1センサ21及び第2センサ22がいずれも赤外線センサである場合、第1センサ21の第1受信部21b、及び第2センサ22の第2受信部22bは、いずれも赤外光の受光素子である。第1発信部21a及び第2発信部22aは、それぞれ表示装置3の内部から正面に向かって赤外光を照射する。第1受信部21b及び第2受信部22bは、第1発信部21a及び第2発信部22aがそれぞれ照射した赤外光の反射光をそれぞれ受光する。すなわち、第1センサ21及び第2センサ22は、表示装置3の内部から正面に向かって赤外光をそれぞれ照射し、第1センサ21及び第2センサ22は、第1センサ21及び第2センサ22がそれぞれ照射した赤外光の反射光をそれぞれ受光する。
 図3A及び図3Bに示すように、操作部23の操作面の裏側の後方には、第2センサ22が配置されている。操作部23は、赤外光を透過する赤外線透過部材で形成されている。外装用のパネル53は、第1センサ21が赤外光を照射する方向に位置する一部の部材が赤外光を透過する赤外線透過部材で形成されている。
 実施の形態1では、第1センサ21及び第2センサ22がそれぞれ赤外線センサで構成されているが、この限りではなく、第1センサ21及び第2センサ22は、それぞれ非接触型センサであれば、超音波センサ、非接触型静電容量センサ等であってもよい。
 実施の形態1では、第1センサ21及び第2センサ22がいずれも赤外線センサであり、同種のセンサであるが、この限りではなく、第1センサ21及び第2センサ22は、それぞれ非接触型センサであれば、異なる種のセンサであってもよい。その際、操作部23及び外装用のパネル53は、必ずしも赤外線透過部材で形成されている必要はなく、センサの種類に応じて導電部材、樹脂部材等で形成されていても良い。
FIG. 3A is a front view showing a configuration example as viewed from the front of the display device 3 to which the input control device 1 according to the first embodiment is applied. FIG. 3B is a side cross-sectional view showing a configuration example of the display device 3 shown in FIG. 3A when the XX ′ cross-section is viewed from the right side.
As shown in FIG. 3A, an operation unit 23, a display unit 32, and an exterior panel 53 are arranged on the display device 3.
As shown in FIG. 3B, the first sensor 21 and the second sensor 22 are arranged inside the display device 3. When the first sensor 21 is a sensor, such as an infrared sensor, in which a signal transmitted from the sensor is reflected by an object and the reflected signal is received by the sensor, the first sensor 21 includes a first transmitting unit 21a that transmits a signal and a first receiving unit 21b that receives the reflected signal. Similarly, when the second sensor 22 is such a sensor, the second sensor 22 includes a second transmitting unit 22a that transmits a signal and a second receiving unit 22b that receives the reflected signal. In the first embodiment, the case where both the first sensor 21 and the second sensor 22 are infrared sensors will be described as an example. In that case, the first transmitting unit 21a of the first sensor 21 and the second transmitting unit 22a of the second sensor 22 are both infrared light emitting elements, and the first receiving unit 21b of the first sensor 21 and the second receiving unit 22b of the second sensor 22 are both infrared light receiving elements. The first transmitting unit 21a and the second transmitting unit 22a each emit infrared light from the inside of the display device 3 toward the front. The first receiving unit 21b and the second receiving unit 22b receive the reflected light of the infrared light emitted by the first transmitting unit 21a and the second transmitting unit 22a, respectively.
That is, the first sensor 21 and the second sensor 22 each emit infrared light from the inside of the display device 3 toward the front, and each receives the reflected light of the infrared light it emitted.
As shown in FIGS. 3A and 3B, the second sensor 22 is disposed behind the back side of the operation surface of the operation unit 23. The operation unit 23 is formed of an infrared transmitting member that transmits infrared light. As for the exterior panel 53, the part positioned in the direction in which the first sensor 21 emits infrared light is formed of an infrared transmitting member that transmits infrared light.
In the first embodiment, the first sensor 21 and the second sensor 22 are each constituted by an infrared sensor; however, the sensors are not limited to this, and each may be any non-contact sensor, such as an ultrasonic sensor or a non-contact capacitance sensor.
In the first embodiment, the first sensor 21 and the second sensor 22 are both infrared sensors, that is, sensors of the same type; however, the sensors are not limited to this, and may be sensors of different types as long as each is a non-contact sensor. In that case, the operation unit 23 and the exterior panel 53 do not necessarily have to be formed of an infrared transmitting member, and may be formed of a conductive member, a resin member, or the like, according to the type of sensor.
As shown in FIGS. 3A and 3B, a first detection range 51 is set as the range in which the first sensor 21 detects an object. The first detection range 51 is a range for detecting whether or not an object is present around the display device 3, and is set in advance in the space around the display device 3. In addition, a second detection range 52 is set as the range in which the second sensor 22 detects an object. The second detection range 52 is a range for detecting whether or not an object is approaching the operation unit 23, and is set in advance in the space around the operation unit 23.
In the first embodiment, a so-called gesture operation is also possible, in which the user operates the display device 3 without touching it, by an action such as waving a hand around the display device 3. The first sensor 21 is used for inputting the gesture operation. For example, when the user starts a gesture operation, the display device 3 superimposes, as a pop-up image, the operation input method for the next gesture operation on the map and route guidance images displayed on the display unit 32. When the user performs a gesture operation, the user's hand, arm, or the like may cross both the first detection range 51 and the second detection range 52. Similarly, when the user operates the operation unit 23, the user's hand, arm, or the like may be present in both the first detection range 51 and the second detection range 52.
For example, when the user's hand approaches the display device 3 and the user's hand, arm, or the like enters the first detection range 51, the first sensor 21 detects that an object is present within that range and transmits that fact to the first reception unit 11 as first detection information. Similarly, when the user's hand, arm, or the like enters the second detection range 52, the second sensor 22 detects that an object is approaching the operation unit 23 and transmits that fact to the second reception unit 12 as second detection information.
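The signal flow described above, in which each sensor pushes its detection information to a dedicated reception unit, can be sketched as follows. This is a minimal illustration and not the patented implementation; the names `DetectionInfo`, `detected`, and `strength` are assumptions introduced only for this example.

```python
from dataclasses import dataclass

@dataclass
class DetectionInfo:
    # One report from a sensor: whether an object is in its detection
    # range, and how strongly it was sensed (illustrative fields).
    detected: bool
    strength: float  # e.g. amplitude of the received reflected infrared light

class ReceptionUnit:
    """Buffers the latest detection information sent by one sensor."""
    def __init__(self) -> None:
        self.latest = DetectionInfo(detected=False, strength=0.0)

    def receive(self, info: DetectionInfo) -> None:
        self.latest = info

# The first sensor reports into the first reception unit and the
# second sensor into the second, mirroring the embodiment.
first_reception = ReceptionUnit()
second_reception = ReceptionUnit()
first_reception.receive(DetectionInfo(detected=True, strength=0.4))
second_reception.receive(DetectionInfo(detected=True, strength=0.7))
```

The processing determination unit would then read `first_reception.latest` and `second_reception.latest` to make the decisions described below.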
FIG. 4 is a front view of a modification of the display device 3 to which the input control device 1 according to the first embodiment is applied, viewed from the front.
FIGS. 3A and 3B show a case where there is a range in which the first detection range 51 and the second detection range 52 overlap, whereas FIG. 4 shows a case where no such overlapping range exists. Even when there is no range in which the first detection range 51 and the second detection range 52 overlap, as shown in FIG. 4, when the user operates the operation unit 23, for example, the user's arm may enter the first detection range 51 as the user's hand approaches the operation unit 23, that is, the second sensor 22. Therefore, the first embodiment does not restrict whether or not a range in which the first detection range 51 and the second detection range 52 overlap exists.
The operation will be described.
In the first embodiment, the first sensor 21 transmits the first detection information as needed regardless of whether an object is present in the first detection range 51. Similarly, the second sensor 22 transmits the second detection information as needed regardless of whether an object is present in the second detection range 52.
FIG. 5 is a flowchart for explaining the operation of the input control device 1 according to the first embodiment.
The input control device 1 repeatedly executes the processing shown in this flowchart; thus, while the input control device 1 is in operation, the display device 3 generates and displays output information based on the determination result information from the input control device 1.
When the operation of the input control device 1 is started, the first reception unit 11 receives the first detection information from the first sensor 21 (step ST1).
Based on the received first detection information, the processing determination unit 13 determines whether an object is present within the first detection range 51, that is, in a predetermined area in the space around the display device 3 (step ST2).
In step ST2, when the processing determination unit 13 determines that no object is present in the predetermined area in the space around the display device 3 (step ST2 "NO"), the input control device 1 ends the operation.
In step ST2, when the processing determination unit 13 determines that an object is present in the predetermined area in the space around the display device 3 (step ST2 "YES"), the second reception unit 12 receives the second detection information from the second sensor 22 (step ST3). Next, the processing determination unit 13 determines whether an object is approaching the operation unit 23 based on the received second detection information (step ST4).
In step ST4, when the processing determination unit 13 determines that an object is approaching the operation unit 23 (step ST4 "YES"), the input control device 1 ends the operation.
In step ST4, when the processing determination unit 13 determines that no object is approaching the operation unit 23 (step ST4 "NO"), the processing determination unit 13 generates determination result information that instructs the output processing unit 31 to perform processing corresponding to the detection of the object by the first sensor 21 (step ST11). Thereafter, the processing determination unit 13 transmits the determination result information to the output processing unit 31 (step ST12), and the input control device 1 ends the operation.
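The decision flow of FIG. 5 can be summarized in a short sketch. This is an illustrative reading of steps ST1-ST12 only, not the claimed implementation; the function name `run_cycle` and the boolean form of the detection information are assumptions.

```python
def run_cycle(first_detection: bool, second_detection_fn) -> bool:
    """One pass of the FIG. 5 flow.

    first_detection: whether the first sensor reports an object in the
        first detection range (steps ST1-ST2).
    second_detection_fn: called only when needed; returns whether the
        second sensor reports an object approaching the operation unit
        (steps ST3-ST4).
    Returns True when determination result information would be sent to
    the output processing unit (steps ST11-ST12), i.e. when the first
    sensor's detection is treated as an intended operation.
    """
    if not first_detection:       # ST2 "NO": nothing nearby
        return False
    if second_detection_fn():     # ST4 "YES": the user is reaching for the
        return False              # operation unit, so suppress the response
    return True                   # ST11-ST12: respond to the detection

# A hand waving near the display but away from the operation unit:
assert run_cycle(True, lambda: False) is True
# A hand reaching for the operation unit: the response is suppressed.
assert run_cycle(True, lambda: True) is False
```

Note that the second sensor is consulted only after the first sensor has detected something, matching the order of steps ST2 and ST3 in the flowchart.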
As described above, the input control device 1 includes the first reception unit 11 that receives the first detection information from the first sensor 21, which detects that an object is present in a predetermined area; the second reception unit 12 that receives the second detection information from the second sensor 22, which detects that an object is approaching the operation unit 23; and the processing determination unit 13 that determines, based on the first detection information received by the first reception unit 11 and the second detection information received by the second reception unit 12, whether or not to perform processing corresponding to the detection of the object by the first sensor 21. With this configuration, the input control device 1 can make determinations that prevent processing not intended by the user from being executed.
In the first embodiment so far, the processing determination unit 13 determines whether or not to perform processing corresponding to the detection of the object by the first sensor 21 based on whether an object is present in a predetermined area in the space around the display device 3 and on whether an object is approaching the operation unit 23.
Next, a modification of the first embodiment will be described.
In this modification, in addition to the processing of the first embodiment described above, processing is added for comparing the detection intensity of the object at the first sensor 21 with the detection intensity of the object at the second sensor 22.
FIG. 6 is a flowchart for explaining the operation of the input control device 1 according to the modification of the first embodiment. The difference between FIG. 5 and FIG. 6 is that when the determination in step ST4 in FIG. 5 is "YES", the process of step ST5 in FIG. 6 is added. Here, only the processing that differs from FIG. 5 will be described with reference to FIG. 6.
In step ST4, when the processing determination unit 13 determines that an object is approaching the operation unit 23 (step ST4 "YES"), the processing determination unit 13 compares the detection intensity of the object at the first sensor 21 with the detection intensity of the object at the second sensor 22, and determines whether the detection intensity of the second sensor 22 is stronger than the detection intensity of the first sensor 21 (step ST5).
In step ST5, when the processing determination unit 13 determines that the detection intensity of the second sensor 22 is stronger than the detection intensity of the first sensor 21 (step ST5 "YES"), the processing determination unit 13 judges that the user is about to operate the operation unit 23, and the input control device 1 ends the operation.
In step ST5, when the processing determination unit 13 determines that the detection intensity of the second sensor 22 is not stronger than the detection intensity of the first sensor 21 (step ST5 "NO"), the processing determination unit 13 executes the processing from step ST11 described above.
In this way, since the processing determination unit 13 compares the detection intensity of the object at the first sensor 21 with the detection intensity of the object at the second sensor 22, the input control device 1 can determine more accurately whether or not to perform processing corresponding to the detection of the object by the first sensor 21.
Further, in a different modification of the first embodiment, in addition to the processing of the first embodiment described above, processing is added in which the processing determination unit 13 compares the time at which the first sensor 21 detected the object with the time at which the second sensor 22 detected the object.
This modification is realized by changing the determination process of step ST5 performed by the processing determination unit 13 into a process of determining whether the time at which the second sensor 22 detected the object is earlier than the time at which the first sensor 21 detected the object. In this modification, the detection times of the object at the first sensor 21 and the second sensor 22 are compared, but the lengths of the detection periods of the object at the first sensor 21 and the second sensor 22 may be compared instead.
Further, in yet another modification of the first embodiment, in addition to the processing of the first embodiment described above, processing is added in which the processing determination unit 13 compares the period during which the detection intensity of the object at the first sensor 21 was at or above a threshold with the period during which the detection intensity of the object at the second sensor 22 was at or above a threshold.
This modification is realized by changing the determination process of step ST5 performed by the processing determination unit 13 into a process of determining whether the period during which the detection intensity of the object at the second sensor 22 was at or above the threshold is longer than the period during which the detection intensity of the object at the first sensor 21 was at or above the threshold.
Further, in yet another modification of the first embodiment, in addition to the processing of the first embodiment described above, processing is added in which the processing determination unit 13 compares the detection intensity of the object at the second sensor 22 with a preset threshold.
This modification is realized by changing the determination process of step ST5 performed by the processing determination unit 13 into a process of determining whether the detection intensity of the object at the second sensor 22 is greater than the preset threshold.
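The variants of step ST5 described above differ only in the predicate that decides whether the user is reaching for the operation unit. The alternative predicates can be sketched compactly as follows; the `Reading` record and the function signatures are assumptions introduced for illustration, not the patented interfaces.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    # Simplified per-sensor observation used by the ST5 variants.
    strength: float             # current detection intensity
    detected_at: float          # time the object was first detected (s)
    above_threshold_for: float  # period the intensity stayed at or above a threshold (s)

# Variant 1: the second sensor senses the object more strongly than the first.
def st5_strength(first: Reading, second: Reading) -> bool:
    return second.strength > first.strength

# Variant 2: the second sensor detected the object earlier than the first.
def st5_time(first: Reading, second: Reading) -> bool:
    return second.detected_at < first.detected_at

# Variant 3: the second sensor stayed at or above its threshold for longer.
def st5_duration(first: Reading, second: Reading) -> bool:
    return second.above_threshold_for > first.above_threshold_for

# Variant 4: the second sensor's intensity alone exceeds a preset threshold.
def st5_absolute(second: Reading, threshold: float) -> bool:
    return second.strength > threshold
```

In each variant, a predicate returning True corresponds to step ST5 "YES": the object is judged to be headed for the operation unit, and the processing corresponding to the first sensor's detection is suppressed.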
FIG. 7 is a side cross-sectional view showing a modification of the configuration in which the X-X′ cross section of the display device 3 shown in FIG. 3A is viewed from the right side.
The display device 3 shown in FIG. 3B and the display device 3 shown in FIG. 7 differ in the configuration of the first sensor 21 and the second sensor 22. In FIG. 7, the first receiving unit 21b of the first sensor 21 and the second receiving unit 22b of the second sensor 22 are formed as a single, shared receiving unit 24. When the first sensor 21 and the second sensor 22 are both infrared sensors, the receiving unit 24 is a light receiving element. In FIG. 7, both the reflected light of the infrared light emitted from the first transmitting unit 21a of the first sensor 21 and the reflected light of the infrared light emitted from the second transmitting unit 22a of the second sensor 22 are received by the receiving unit 24. This configuration can reduce component costs.
FIG. 8 is a side cross-sectional view showing another modification, different from FIG. 7, of the X-X′ cross section of the display device 3 shown in FIG. 3A, viewed from the right side.
The display device 3 shown in FIG. 7 and the display device 3 shown in FIG. 8 differ in the following configuration. In FIG. 8, a design light emitting unit 33 that emits design illumination from the inside of the display device 3 toward the front, and a design illumination unit 34 that emits the light produced by the design light emitting unit 33 to the outside of the display device 3 as design illumination, are added to the display device 3. The design illumination unit 34 is disposed on the operation unit 23.
With this configuration, when the second transmitting unit 22a of the second sensor 22 is disposed behind the back side of the operation unit 23, the infrared light emitted from the second transmitting unit 22a and the infrared light reflected by the object pass through the design illumination unit 34 disposed on the operation unit 23. For this reason, it is not necessary to form the operation unit 23 itself from an infrared transmitting member. Furthermore, with this configuration, even if the second transmitting unit 22a of the second sensor 22 is disposed behind the back side of the operation unit 23, the light emission of the second transmitting unit 22a is made inconspicuous by the design illumination, so the design quality is not impaired. For this reason, the degree of freedom in arranging the second transmitting unit 22a can be increased.
In FIG. 8, a light guide 35 that guides the reflected infrared light incident on the operation unit 23 in a predetermined direction is also added. The light guide 35 is arranged so as to connect the back surface of the operation unit 23 to the receiving unit 24. The light guide 35 guides at least part of the reflected infrared light incident from the front surface of the operation unit 23 to the receiving unit 24.
By configuring in this manner, the degree of freedom in the arrangement of the receiving unit 24 can be increased.
Further, the light guide 35 may guide at least part of the infrared light emitted from the second transmitting unit 22a to the front surface of the operation unit 23. Alternatively, part of the operation unit 23 may serve as the light guide 35.
By configuring in this manner, the degree of freedom in the arrangement of the second transmitting unit 22a can be increased.
Second Embodiment
The first embodiment described the case where the input device 2 has a single first sensor 21.
The second embodiment describes the case where the input device 2a has a plurality of first sensors 21.
FIG. 9 is a block diagram showing a configuration of a display device 3a to which the input control device 1 according to the second embodiment is applied.
In the first embodiment, as shown in FIG. 1, the input device 2 has only one first sensor 21, whereas in the second embodiment, as shown in FIG. 9, the input device 2a has a plurality of first sensors 211 and 212. Likewise, in the first embodiment, as shown in FIG. 1, the input device 2 has only one second sensor 22, whereas in the second embodiment, as shown in FIG. 9, the input device 2a has a plurality of second sensors 221 and 222. In the first embodiment, as shown in FIG. 1, the first reception unit 11 receives the first detection information transmitted by the single first sensor 21, whereas in the second embodiment, as shown in FIG. 9, the first reception unit 11 receives all of the first detection information transmitted by each of the plurality of first sensors 211 and 212. In the first embodiment, as shown in FIG. 1, the second reception unit 12 receives the second detection information transmitted by the single second sensor 22, whereas in the second embodiment, as shown in FIG. 9, the second reception unit 12 receives all of the second detection information transmitted by each of the plurality of second sensors 221 and 222.
The other components that are the same as in the first embodiment are given the same reference numerals, and duplicate descriptions are omitted.
FIG. 10 is a front view showing a configuration example of the display device 3a to which the input control device 1 according to the second embodiment is applied, as viewed from the front.
As shown in FIG. 10, a plurality of operation units 231 and 232, a display unit 32, and an exterior panel 53 are disposed.
Inside the display device 3a, a plurality of first sensors 211 and 212 and a plurality of second sensors 221 and 222 are arranged. Each sensor emits infrared light from the inside of the display device 3a toward the front and detects the reflected light of the infrared light it emitted. The second sensor 221 is disposed behind the back side of the operation surface of the operation unit 231, and the second sensor 222 is disposed behind the back side of the operation surface of the operation unit 232.
As shown in FIG. 10, a first detection range 511 is set as a range in which the first sensor 211 detects an object. The first detection range 511 is a range for detecting whether or not an object is present around the display device 3a, and is set in advance in a space around the display device 3a. Similarly, a first detection range 512 is set as a range in which the first sensor 212 detects an object.
A second detection range 521 is set as a range in which the second sensor 221 detects an object. The second detection range 521 is a range for detecting whether or not an object approaches the operation unit 231, and is preset in a space around the operation unit 231. Similarly, a second detection range 522 is set as a range in which the second sensor 222 detects an object.
In the second embodiment, a so-called gesture operation is also possible, in which a user such as the driver or a passenger operates the display device 3a without touching it, by an action such as waving a hand around the display device 3a. The first sensors 211 and 212 are used for inputting the gesture operation. For example, when the user's hand moves from left to right in the space around the display device 3a within a predetermined time, the first sensor 211 detects that an object is present in the first detection range 511, and then the first sensor 212 detects that an object is present in the first detection range 512. For example, when a predetermined detection pattern is established in such an order of detection, the input control device 1 determines that a gesture operation has been performed.
The operation will be described.
FIG. 11 is a flowchart for explaining the operation of the input control device 1 according to the second embodiment.
In the second embodiment, the first sensors 211 and 212 each transmit the first detection information as needed, regardless of whether an object is present in the first detection ranges 511 and 512. Similarly, the second sensors 221 and 222 each transmit the second detection information as needed, regardless of whether an object is present in the second detection ranges 521 and 522.
The input control device 1 repeatedly executes this flowchart; thus, while the input control device 1 is in operation, the display device 3a generates and displays output information based on the determination result information from the input control device 1.
In the second embodiment, when the input control device 1 determines whether or not a pattern corresponding to the above-described gesture operation has been performed, it uses a known technique for determining the direction in which an object moves from the order in which a plurality of sensors detected the object. Therefore, the description of the method of determining whether a pattern corresponding to the gesture operation has been performed is omitted.
When the operation of the input control device 1 is started, the first reception unit 11 receives first detection information from the first sensors 211 and 212 (step ST101).
Based on the plurality of pieces of received first detection information, the processing determination unit 13 determines whether an object is present within at least one of the first detection range 511 and the first detection range 512, that is, in a predetermined area in the space around the display device 3a (step ST102).
In step ST102, when the processing determination unit 13 determines that no object is present in the predetermined area in the space around the display device 3a (step ST102 "NO"), the input control device 1 ends the operation.
In step ST102, when the processing determination unit 13 determines that an object is present in the predetermined area in the space around the display device 3a (step ST102 "YES"), the second reception unit 12 receives, from the second sensors 221 and 222, the second detection information relating to detection of an object approaching the operation units 231 and 232, respectively (step ST103). Next, based on the received second detection information, the processing determination unit 13 determines whether an object is approaching at least one of the operation unit 231 and the operation unit 232 (step ST104).
In step ST104, when the processing determination unit 13 determines that an object is approaching at least one of the operation unit 231 and the operation unit 232 (step ST104 "YES"), the processing determination unit 13 determines whether a predetermined detection pattern is established in the order of detection of objects in the first detection ranges 511 and 512 (step ST105).
In step ST105, when the processing determination unit 13 determines that the predetermined detection pattern is established in the order of detection of objects in the first detection ranges 511 and 512 (step ST105 "YES"), the processing determination unit 13 generates determination result information that instructs the output processing unit 31 to perform output processing corresponding to the predetermined detection pattern (step ST111). Thereafter, the processing determination unit 13 transmits the determination result information to the output processing unit 31 (step ST112), and the input control device 1 ends the operation.
In step ST105, when the processing determination unit 13 determines that the predetermined detection pattern is not established in the order of detection of objects in the first detection ranges 511 and 512 (step ST105 "NO"), the input control device 1 ends the operation.
In step ST104, when the processing determination unit 13 determines that no object is approaching either the operation unit 231 or the operation unit 232 (step ST104 "NO"), the processing determination unit 13 determines whether a predetermined detection pattern is established in the order of detection of objects in the first detection ranges 511 and 512 (step ST121).
In step ST121, when the processing determination unit 13 determines that the predetermined detection pattern is established in the order of detection of objects in the first detection ranges 511 and 512 (step ST121 "YES"), the processing determination unit 13 generates determination result information that instructs the output processing unit 31 to perform output processing corresponding to the predetermined detection pattern (step ST111). Thereafter, the processing determination unit 13 transmits the determination result information to the output processing unit 31 (step ST112), and the input control device 1 ends the operation.
In step ST121, when the processing determination unit 13 determines that the predetermined detection pattern is not established in the order of detection of objects in the first detection ranges 511 and 512 (step ST121 "NO"), the processing determination unit 13 generates determination result information that instructs the output processing unit 31 to perform output processing corresponding to the detection of the object by the first sensor 21 (step ST122). Thereafter, the processing determination unit 13 transmits the determination result information to the output processing unit 31 (step ST123), and the input control device 1 ends the operation.
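The branching of the flowchart of FIG. 11 can be sketched as follows. This is an illustrative reading of the steps ST102 through ST123, not an implementation from the patent; the function name, parameter names, and return labels are assumptions for this example.

```python
def decide(object_in_first_ranges, object_near_operation_unit,
           first_pattern_established):
    """Returns which output processing, if any, the processing
    determination unit instructs, following steps ST102/ST104/ST105/ST121.
    """
    if not object_in_first_ranges:                 # ST102 "NO"
        return None                                # end without any output
    if object_near_operation_unit:                 # ST104 "YES"
        if first_pattern_established:              # ST105 "YES"
            return 'pattern-output'                # ST111 / ST112
        return None                                # ST105 "NO": end
    # ST104 "NO": no object is approaching any operation unit
    if first_pattern_established:                  # ST121 "YES"
        return 'pattern-output'                    # ST111 / ST112
    return 'first-sensor-output'                   # ST121 "NO": ST122 / ST123
```

Note how the object approaching an operation unit suppresses the first-sensor output unless the detection-order pattern is established, which is the safeguard against unintended processing.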
Although the second embodiment shows an example in which two first sensors and two second sensors are applied, the number of sensors is not limited to this; it is sufficient that at least the number of first sensors is two or more.
In the second embodiment described so far, the processing determination unit 13 determines whether a predetermined detection pattern is established in the order of detection of objects in the first detection ranges 511 and 512, and generates, based on that determination, the determination result information that instructs the output processing unit 31.
Next, a modification of the second embodiment will be described.
In the modification, the processing determination unit 13 determines whether or not a predetermined detection pattern is established in the order of detection of objects in the second detection ranges 521 and 522.
The modification performs, for example, the following processing.
Before performing the process of step ST105 shown in FIG. 11, the processing determination unit 13 determines whether a predetermined detection pattern is established in the order of detection of objects in the second detection ranges 521 and 522. When the processing determination unit 13 determines that the predetermined detection pattern is established in the order of detection of objects in the second detection ranges 521 and 522, the input control device 1 ends the operation. When the processing determination unit 13 determines that the predetermined detection pattern is not established in the order of detection of objects in the second detection ranges 521 and 522, the process of step ST105 is performed.
Although the modification shows an example in which two first sensors and two second sensors are applied, the number of sensors is not limited to this; it is sufficient that at least the number of second sensors is two or more.
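The modification's pre-check can be sketched as follows, again as an illustrative reading rather than an implementation from the patent; the names are assumptions for this example.

```python
def decide_with_precheck(second_pattern_established,
                         first_pattern_established):
    """Modification of the second embodiment: before step ST105, check
    whether the predetermined pattern is established in the SECOND
    detection ranges (521, 522); if so, end without any output.
    """
    if second_pattern_established:       # pre-check inserted before ST105
        return None                      # end the operation
    if first_pattern_established:        # ST105 "YES"
        return 'pattern-output'          # ST111 / ST112
    return None                          # ST105 "NO": end
```

A detection-order pattern near the operation units thus takes precedence and cancels the gesture determination, which helps keep an intended button operation from being misread as a gesture.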
With the configuration described above, the input control device 1 can not only make determinations that prevent processing not intended by the user from being executed, but also make determinations so that processing intended by the user is executed more accurately.
In the first and second embodiments, the second sensors themselves may be configured to receive the user's operation, so that each second sensor has the function of an operation unit. For example, a non-contact capacitance sensor may be disposed on the front surface of the operation unit, and both detection of an approaching object and detection of an operation input may be performed by the non-contact capacitance sensor.
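A single capacitance reading serving both roles could be sketched with two thresholds, one for approach and one for the operation input itself. The threshold values and names below are assumptions for this example, not values from the patent.

```python
APPROACH_THRESHOLD = 30   # counts: hand near the operation unit (assumed)
TOUCH_THRESHOLD = 80      # counts: hand on / operating the unit (assumed)

def classify(capacitance_counts):
    """Classify one non-contact capacitance reading so that a single
    sensor acts as both the approach detector and the operation input."""
    if capacitance_counts >= TOUCH_THRESHOLD:
        return 'operation-input'
    if capacitance_counts >= APPROACH_THRESHOLD:
        return 'approach'
    return 'idle'
```

In practice the two thresholds would be calibrated per panel, since capacitance counts vary with electrode size and overlay thickness.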
In the first and second embodiments, the operation unit is provided in the display device, and the display device executes the processing corresponding to the operation; however, the operation unit is not an essential component of the display device. That is, in the first and second embodiments, the operation unit may be provided in an external device different from the display device, and the configuration may be such that, when the operation unit is operated, the external device executes the processing corresponding to the operation. Furthermore, in this case, since the second sensor detects that an object is approaching the operation unit provided in the external device different from the display device, the second sensor is not an essential component of the input device and may be outside the input device.
In the first and second embodiments, when the first sensor that detects an operation input by a gesture operation is mounted on, for example, a so-called remote controller that operates a display device at a remote location, the first sensor and the first reception unit may be connected via a network.
In the first and second embodiments, depending on the arrangement of the operation unit and the first sensor, for example, when the user tries to operate the operation unit, the hand of the driver or the like may pass through the first detection range before the second detection range. When such a case is expected from the arrangement, it is also possible to provide an appropriate waiting period between the time when the processing determination unit determines that an object is present in the predetermined area in the space around the display device and the time when the second detection information is received from the second sensor.
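Such a waiting period could be sketched as follows; the period length, the function name, and the timestamp convention are assumptions for this example.

```python
WAIT_PERIOD = 0.3  # seconds; an assumed value, tuned to the sensor layout

def accept_second_detection(t_first_determined, t_second_received):
    """Accept the second detection information only after the waiting
    period following the first-sensor determination has elapsed, so a
    hand crossing the first detection range on its way to the operation
    unit is not immediately treated as an approach."""
    return (t_second_received - t_first_determined) >= WAIT_PERIOD
```

The appropriate period depends on how far apart the first and second detection ranges are and on typical hand speed.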
In the first and second embodiments, the display unit of the display device may be provided as a separate display output device, and the configuration may be realized as a display system in which the output processing unit and the display output device are connected via a network.
In the first and second embodiments, an example in which the input control device is applied to a vehicle-mounted display device has been shown; however, the display device is not limited to a vehicle-mounted one.
In the first and second embodiments, examples in which the input control device is applied to a display device have been shown; however, the input control device may also be applied to an audio device that produces audio output, a mechanical device that operates mechanically under electrical control, or the like.
Within the scope of the invention, the present invention allows free combination of the embodiments, modification of any component of each embodiment, or omission of any component in each embodiment.
The input control device according to the present invention can be applied to devices on which a user performs input operations, such as display devices.
Reference Signs List: 1 input control device; 2, 2a input device; 3, 3a display device; 11 first reception unit; 12 second reception unit; 13 processing determination unit; 21, 211, 212 first sensor; 21a first transmitter; 21b first receiver; 22, 221, 222 second sensor; 22a second transmitter; 22b second receiver; 23 operation unit; 24 receiver; 31 output processing unit; 32 display unit; 33 design light-emitting unit; 34 design illumination unit; 35 light guide; 51, 511, 512 first detection range; 52, 521, 522 second detection range; 53 panel; 201 processing circuit; 202 HDD; 203 input interface device; 204 output interface device; 205 memory; 206 CPU.

Claims (16)

  1.  An input control device comprising:
     a first reception unit that receives first detection information from a first sensor that detects that an object is present in a predetermined area;
     a second reception unit that receives second detection information from a second sensor that detects that an object is approaching an operation unit; and
     a processing determination unit that determines, on the basis of the first detection information received by the first reception unit and the second detection information received by the second reception unit, whether to perform processing corresponding to detection of an object by the first sensor.
  2.  The input control device according to claim 1, wherein the processing determination unit determines whether to perform processing corresponding to detection of an object by the first sensor on the basis of information on detection intensity in the first detection information received by the first reception unit and in the second detection information received by the second reception unit.
  3.  The input control device according to claim 1, wherein the processing determination unit determines whether to perform processing corresponding to detection of an object by the first sensor on the basis of the length of detection time in the first detection information received by the first reception unit and in the second detection information received by the second reception unit.
  4.  The input control device according to claim 1, wherein the processing determination unit determines whether to perform processing corresponding to detection of an object by the first sensor on the basis of the order of detection in the first detection information received by the first reception unit and the second detection information received by the second reception unit.
  5.  The input control device according to claim 1, wherein the first reception unit receives the first detection information from each of a plurality of the first sensors, and
     the processing determination unit determines whether to perform processing corresponding to detection of an object by the first sensors on the basis of the plurality of pieces of first detection information corresponding to the respective first sensors received by the first reception unit, the order of detection of an object by the plurality of first sensors, and the second detection information received by the second reception unit.
  6.  The input control device according to claim 1, wherein the second reception unit receives the second detection information from each of a plurality of the second sensors, and
     the processing determination unit determines whether to perform processing corresponding to detection of an object by the first sensor on the basis of the first detection information received by the first reception unit, the plurality of pieces of second detection information corresponding to the respective second sensors received by the second reception unit, and the order of detection of an object by the plurality of second sensors.
  7.  An input device comprising:
     the first sensor that detects that an object is present in a predetermined area;
     the second sensor that detects that an object is approaching an operation unit; and
     the input control device according to claim 1.
  8.  The input device according to claim 7, wherein the first sensor and the second sensor are non-contact sensors.
  9.  The input device according to claim 8, wherein each of the first sensor and the second sensor is an infrared sensor, an ultrasonic sensor, or a capacitance sensor.
  10.  The input device according to claim 8, wherein signals of both the first sensor and the second sensor are received by a single receiving unit.
  11.  The input device according to claim 7, further comprising the operation unit.
  12.  The input device according to claim 11, wherein the second sensor is an infrared sensor,
     at least a part of the operation unit is formed of an infrared-transmitting member, and
     the second sensor emits infrared light toward the infrared-transmitting member.
  13.  The input device according to claim 12, further comprising a light guide between the front surface of a light receiving element of the second sensor and the back surface of the operation unit,
     wherein the light guide guides at least a part of the infrared light emitted from the second sensor to the light receiving element of the second sensor.
  14.  The input device according to claim 11, wherein the second sensor is a non-contact capacitance sensor disposed on the front surface of the operation unit.
  15.  An on-vehicle input device comprising the input control device according to claim 1.
  16.  An input control method comprising:
     receiving, by a first reception unit, first detection information from a first sensor that detects that an object is present in a predetermined area;
     receiving, by a second reception unit, second detection information from a second sensor that detects that an object is approaching an operation unit; and
     determining, by a processing determination unit, on the basis of the first detection information received by the first reception unit and the second detection information received by the second reception unit, whether to perform processing corresponding to detection of an object by the first sensor.
PCT/JP2018/002477 2018-01-26 2018-01-26 Input control device, input device, and input control method WO2019146072A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
DE112018006942.7T DE112018006942T5 (en) 2018-01-26 2018-01-26 Input control device, input device and input control method
CN201880087256.7A CN111630474A (en) 2018-01-26 2018-01-26 Input control device, input device, and input control method
PCT/JP2018/002477 WO2019146072A1 (en) 2018-01-26 2018-01-26 Input control device, input device, and input control method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/002477 WO2019146072A1 (en) 2018-01-26 2018-01-26 Input control device, input device, and input control method

Publications (1)

Publication Number Publication Date
WO2019146072A1 true WO2019146072A1 (en) 2019-08-01

Family

ID=67394577

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/002477 WO2019146072A1 (en) 2018-01-26 2018-01-26 Input control device, input device, and input control method

Country Status (3)

Country Link
CN (1) CN111630474A (en)
DE (1) DE112018006942T5 (en)
WO (1) WO2019146072A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2012144217A1 (en) * 2011-04-22 2012-10-26 パナソニック株式会社 Input device for vehicle and input method for vehicle
JP2013058117A (en) * 2011-09-09 2013-03-28 Alps Electric Co Ltd Input device
JP2015201054A (en) * 2014-04-08 2015-11-12 アルプス電気株式会社 Proximity detection type input device
JP2015225423A (en) * 2014-05-27 2015-12-14 京セラディスプレイ株式会社 Display device
JP2016179723A (en) * 2015-03-24 2016-10-13 アルパイン株式会社 On-vehicle apparatus
JP6169298B1 (en) * 2017-02-16 2017-07-26 京セラ株式会社 Electronic device and control method

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9389690B2 (en) * 2012-03-01 2016-07-12 Qualcomm Incorporated Gesture detection based on information from multiple types of sensors
JP6242292B2 (en) * 2014-05-27 2017-12-06 三菱電機株式会社 Software device
JPWO2017130344A1 (en) * 2016-01-28 2018-06-21 三菱電機株式会社 Interface device, contact detection device, and contact detection method
CN106839290A (en) * 2017-01-16 2017-06-13 广东美的制冷设备有限公司 The control method and control device and air-conditioner of gesture identification

Also Published As

Publication number Publication date
CN111630474A (en) 2020-09-04
DE112018006942T5 (en) 2020-11-19

Similar Documents

Publication Publication Date Title
JP6077119B2 (en) Improved operating method of ultrasonic sensor, driver assistance device and automobile
US20160114726A1 (en) Reverse parking assistance with rear visual indicator
JP6080955B2 (en) Information display device
US9672433B2 (en) Multi-directional vehicle maneuvering assistance
US20140358368A1 (en) Method and Device for Operating Functions Displayed on a Display Unit of a Vehicle Using Gestures Which are Carried Out in a Three-Dimensional Space, and Corresponding Computer Program Product
US10175695B2 (en) Apparatus and method for controlling parking-out of vehicle
US20160313438A1 (en) Ultrasonic sensor device for a motor vehicle, motor vehicle and corresponding method
JPWO2004083889A1 (en) Obstacle detection device
US10328843B2 (en) Driving assistance system with short-distance ranging
US11181909B2 (en) Remote vehicle control device, remote vehicle control system, and remote vehicle control method
US20190135179A1 (en) Vehicle and control method thereof
JP6672915B2 (en) Object detection device, object detection method, and program
CN113165643A (en) Parking assist apparatus
WO2019146072A1 (en) Input control device, input device, and input control method
US10237843B2 (en) Vehicle-mounted apparatus
US20200380726A1 (en) Dynamic three-dimensional imaging distance safeguard
JP6942255B2 (en) False start control device and false start control method
WO2014076875A1 (en) Object detecting system and object detecting device
US20210354669A1 (en) Optical apparatus, in-vehicle system, and mobile apparatus
KR102653265B1 (en) Ultrasonic sensor system and defect diagnosis method of ultrasonic sensor and method for detecting object of ultrasonic sensor
CN105589549A (en) Optical control device based on infrared induction
JP2019534448A (en) Sensing device, driving support system, powered vehicle, and method for powered vehicle
JP7192647B2 (en) Adhesion detection device and adhesion detection method
JP6557193B2 (en) Determination device, determination method, and determination program
JP2020076763A5 (en)

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18902561

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 18902561

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP