WO2016002270A1 - Non-contact operation detection device - Google Patents
Non-contact operation detection device
- Publication number
- WO2016002270A1 (application PCT/JP2015/058914)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sensor
- contact operation
- screen
- detection
- user
- Prior art date
Classifications
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
- G06F2203/04101—2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
- B60K35/00—Instruments specially adapted for vehicles; Arrangement of instruments in or on vehicles
- B60K35/10—Input arrangements, i.e. from user to vehicle, associated with vehicle functions or specially adapted therefor
- B60K35/211—Output arrangements, i.e. from vehicle to user, associated with vehicle functions or specially adapted therefor using visual output, e.g. blinking lights or matrix displays producing three-dimensional [3D] effects, e.g. stereoscopic images
- B60K2360/1438—Touch screens
- B60K2360/1446—Touch switches
- B60K2360/145—Instrument input by combination of touch screen and hardware input devices
- B60K2360/146—Instrument input by gesture
- B60R16/037—Arrangement of elements of electric circuits specially adapted for vehicles for occupant comfort, e.g. for automatic adjustment of appliances according to personal settings, e.g. seats, mirrors, steering wheel
- B60R2300/207—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle, characterised by the type of display used, using multi-purpose displays, e.g. camera image and navigation or video on same display
- B60Y2400/92—Driver displays
Definitions
- The present invention relates to a non-contact operation detection device.
- As a related technique, JP 2013-195326 A (Patent Document 1) is known. This publication describes a configuration that "has regions for spatially detecting a hand, and detects that the hand is in a region when the distance to the hand is equal to or less than the threshold distance in the corresponding region", together with a sensor driver 10 that sets the threshold distance for the region C to a threshold distance corresponding to a predetermined wipe operation of the hand.
- According to Patent Document 1, the user can be prevented from being forced into an unnatural hand movement, but there is still room for improvement in terms of ease of operation for the user.
- The present invention has been made in view of the above circumstances, and an object thereof is to provide a non-contact operation detection device that is highly convenient for the user.
- To achieve the above object, a non-contact operation detection device mounted on a vehicle includes a control unit that detects a non-contact operation, determines in a first mode whether the non-contact operation is an effective operation based on a first rule, and determines in a second mode whether the non-contact operation is an effective operation based on a second rule.
- FIG. 1 is a front view of an in-vehicle device according to the present embodiment.
- FIG. 2 is a block diagram showing a functional configuration of the in-vehicle device.
- FIG. 3A is a diagram showing an arrangement state of four sensor units, and
- FIG. 3B is a diagram showing a detection range of an infrared sensor unit.
- FIG. 4 is a diagram illustrating a predetermined direction operation.
- FIG. 5 is a diagram illustrating an example of a screen displayed by the in-vehicle device and screen transition.
- FIG. 6 is a diagram illustrating an example of a screen displayed by the in-vehicle device.
- FIG. 7 is a diagram illustrating an example of a screen displayed by the in-vehicle device.
- FIG. 8 is a flowchart showing the operation of the in-vehicle device.
- FIG. 9 is a flowchart showing the operation of the in-vehicle device.
- FIG. 10 is a diagram illustrating an example of waveforms of detection values of four sensor units.
- FIG. 11 is a diagram showing a pattern of peak order.
- FIG. 12 is a diagram illustrating a first table stored in the in-vehicle device.
- FIG. 13 is a diagram illustrating a second table and a third table stored in the in-vehicle device.
- FIG. 14 is a diagram used for explaining the recognition range of the downward operation.
- FIG. 15 is a diagram used for explaining the recognition range of the diagonally down right operation.
- FIG. 16 is a diagram used for explaining a difference in recognition range of a predetermined operation between the first mode and the third mode.
- FIG. 1 is a front view of an in-vehicle device 1 (non-contact operation detection device) according to the present embodiment.
- the in-vehicle device 1 includes a housing 10 that houses a control board, an interface board, a power supply unit, and the like.
- a touch panel 11 (display unit) is provided on the front surface of the housing 10.
- An operation panel 13 provided with a plurality of operation switches 12 such as a power switch is provided below the touch panel 11 on the front surface of the housing 10.
- An infrared sensor unit 14 is provided at the center in the left-right direction of the operation panel 13. The infrared sensor unit 14 will be described later.
- the in-vehicle device 1 is provided on the dashboard of the vehicle. As will be described later, the user can perform input to the in-vehicle device 1 by performing a non-contact operation in front of the in-vehicle device 1.
- The non-contact operation is not an operation involving contact with the in-vehicle device 1 (such as a touch operation on the touch panel 11 or a press of the operation switch 12), but an operation in which the user's hand moves in a predetermined manner within a predetermined area in front of the in-vehicle device 1.
- FIG. 2 is a block diagram showing a functional configuration of the in-vehicle device 1.
- The in-vehicle device 1 includes a control unit 20, the touch panel 11, an operation unit 21, a storage unit 22, a GPS unit 23, a relative orientation detection unit 24, a beacon receiving unit 25, an FM multiplex receiving unit 26, a short-range wireless communication unit 27, a mail management unit 28, a sensor control unit 29, and the infrared sensor unit 14.
- the control unit 20 includes a CPU, a ROM, a RAM, other peripheral circuits, and the like, and controls each unit of the in-vehicle device 1.
- the touch panel 11 includes a display panel 11a and a touch sensor 11b.
- the display panel 11 a is configured by a liquid crystal display panel or the like, and displays various images under the control of the control unit 20.
- the touch sensor 11b is disposed so as to overlap the display panel 11a, detects a user's touch operation, and outputs it to the control unit 20.
- the operation unit 21 includes an operation switch 12, detects an operation on the operation switch 12, and outputs it to the control unit 20.
- the storage unit 22 includes a nonvolatile memory and stores various data.
- the storage unit 22 stores the map data 22a and the application AP.
- the application AP will be described later.
- The map data 22a includes information used to display a map on the display panel 11a, such as information on facilities existing on the map, information on links indicating roads on the map, and information on nodes indicating connections between links. The data also includes information necessary for routes and for the route guidance described later.
- the GPS unit 23 receives GPS radio waves from GPS satellites via a GPS antenna (not shown), and acquires the current position and traveling direction of the vehicle from the GPS signals superimposed on the GPS radio waves by calculation.
- the GPS unit 23 outputs the acquisition result to the control unit 20.
- the relative orientation detection unit 24 includes a gyro sensor 71 and an acceleration sensor 72.
- the gyro sensor 71 is constituted by, for example, a vibration gyro, and detects a relative direction of the vehicle (for example, a turning amount in the yaw axis direction).
- the acceleration sensor 72 detects acceleration acting on the vehicle (for example, the inclination of the vehicle with respect to the traveling direction).
- The relative orientation detection unit 24 outputs the detection results to the control unit 20.
- The control unit 20 has a function of estimating the current position of the vehicle based on the inputs from the GPS unit 23 and the relative orientation detection unit 24 and on the map data 22a, and displaying the estimated current position on the map shown on the display panel 11a.
- the control unit 20 has a function of displaying a route to the destination while displaying the current position of the vehicle on the map and guiding the route.
- The beacon receiving unit 25 receives radio waves related to road traffic information emitted by radio wave beacons and optical beacons (hereinafter simply referred to as "beacons") provided on the road on which the vehicle travels, generates road traffic information based on the received radio waves, and outputs the generated road traffic information to the control unit 20.
- the road traffic information is VICS (registered trademark) (Vehicle Information and Communication System) information, and includes information on traffic jams, information on road traffic regulations, and the like.
- the FM multiplex receiving unit 26 receives the FM multiplex broadcast wave, extracts road traffic information, and outputs it to the control unit 20. More specifically, the FM multiplex receiving unit 26 includes an FM multiplex dedicated tuner and a multiplex encoder. When receiving the FM multiplex broadcast, the FM multiplex dedicated tuner outputs an FM multiplex signal to the multiplex encoder. The multiplex encoder acquires the road traffic information by decoding the FM multiplex signal input from the FM multiplex dedicated tuner, and outputs it to the control unit 20.
- The short-range wireless communication unit 27 establishes a wireless communication link with the mobile terminal KT owned by a user on board the vehicle and performs wireless communication with the mobile terminal KT in accordance with a communication standard for short-range wireless communication such as Bluetooth (registered trademark).
- the mobile terminal KT is a mobile terminal such as a mobile phone or a tablet terminal, and has a function of accessing the Internet via a telephone line or a wireless LAN.
- The portable terminal KT has a function of receiving mail such as SMS mail and Web mail from a predetermined server.
- the mobile terminal KT receives the mail and transmits the mail data to the in-vehicle device 1.
- The mobile terminal KT may also be configured to receive a notification such as a Web message or a push notification through a function of a predetermined application, such as an installed chat application, a messenger application, or an application provided by a predetermined service provider, and to transmit data corresponding to the received notification to the in-vehicle device 1.
- By using functions of software installed in advance, the mobile terminal KT has a function of responding to inquiries from the mail management unit 28, described later, and a function of transmitting the data of newly received mail in response to a request from the mail management unit 28.
- The mail management unit 28 controls the short-range wireless communication unit 27 to communicate with the mobile terminal KT and to inquire whether the mobile terminal KT has received new mail. When there is a response to the inquiry indicating that there is newly received mail, the mail management unit 28 requests the mobile terminal KT to transmit the new mail data, receives the data, and stores it in a predetermined memory area.
- the sensor control unit 29 controls the infrared sensor unit 14 according to the control of the control unit 20.
- The infrared sensor unit 14 is a sensor unit that detects a non-contact operation by the user using infrared light, and includes a first sensor unit 141 (first sensor), a second sensor unit 142 (second sensor), a third sensor unit 143 (third sensor), and a fourth sensor unit 144 (fourth sensor).
- The first sensor unit 141 includes a first light emitting unit H1 and a first light receiving unit J1, the second sensor unit 142 includes a second light emitting unit H2 and a second light receiving unit J2, the third sensor unit 143 includes a third light emitting unit H3 and a third light receiving unit J3, and the fourth sensor unit 144 includes a fourth light emitting unit H4 and a fourth light receiving unit J4.
- the detailed configuration of the infrared sensor unit 14 and the types of non-contact operations performed by the user will be described.
- the infrared sensor unit 14 includes four sensor units, and each sensor unit includes a light emitting unit that emits infrared light and a light receiving unit that receives reflected light.
- the configuration of the infrared sensor unit 14 is not limited to this.
- the infrared sensor unit 14 may include one light emitting unit and one light receiving unit, and the light receiving surface of the light receiving unit may be divided into four. That is, the configuration of the infrared sensor unit 14 may be any configuration capable of recognizing a non-contact operation described below.
- The number of sensor units, the form of the light emitting units and light receiving units in the sensor units, and the relationship between the light emitting units and the light receiving units are not limited to the configuration according to the present embodiment.
- FIG. 3 (A) is a diagram schematically showing an arrangement state when each sensor portion of the infrared sensor unit 14 is viewed from the front.
- FIG. 3B schematically shows the detection range AA of the user's hand by the infrared sensor unit 14 formed in front of the in-vehicle device 1 while the front view of the in-vehicle device 1 is indicated by a dotted line.
- In the following description, the direction toward the top in FIG. 3A is referred to as the upward direction, the direction toward the bottom as the downward direction, the direction toward the left as the left direction, and the direction toward the right as the right direction.
- In addition, the direction toward the front of the in-vehicle device 1 is defined as the front direction, and the direction opposite to the front direction is defined as the rear direction.
- a second sensor unit 142 is provided to the right of the first sensor unit 141, and a third sensor unit 143 is provided below the second sensor unit 142.
- the fourth sensor unit 144 is provided on the left side of the third sensor unit 143, and the first sensor unit 141 is provided above the fourth sensor unit 144.
- the first sensor unit 141 includes the first light emitting unit H1 and the first light receiving unit J1.
- the first light emitting unit H1 of the first sensor unit 141 emits infrared rays at a predetermined cycle according to the control of the sensor control unit 29.
- the first light receiving unit J1 receives the reflected infrared light emitted by the first light emitting unit H1, and outputs a signal indicating the received light intensity to the sensor control unit 29.
- the sensor control unit 29 performs A / D conversion on the input signal indicating the received light intensity to generate data indicating the received light intensity as a gradation value having a predetermined gradation (hereinafter referred to as “first received light intensity data”).
- first received light intensity data is output from the sensor control unit 29 to the control unit 20 at a predetermined cycle.
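The conversion of each received-light signal into gradation data can be pictured with a small sketch. The function name, voltage range, and 8-bit resolution below are assumptions for illustration; the patent states only that the signal is A/D converted into a gradation value with a predetermined gradation.

```python
# Illustrative sketch (assumptions: 8-bit gradation, 3.3 V full scale) of how a
# received-light intensity signal could be quantized into the gradation value
# that forms the first to fourth received light intensity data.
def to_gradation(voltage: float, full_scale: float = 3.3, levels: int = 256) -> int:
    """Map an analog received-light intensity (0..full_scale) to 0..levels-1."""
    voltage = min(max(voltage, 0.0), full_scale)
    return round(voltage / full_scale * (levels - 1))
```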
- The first detection range A1 schematically shows the detection range of the user's hand (detection target) by the first sensor unit 141. That is, when the user's hand is located within the first detection range A1, the first light receiving unit J1 can receive, from the user's hand, the reflected light of the infrared light emitted by the first light emitting unit H1. Within the first detection range A1, the closer the user's hand is to the first sensor unit 141, the larger the received light intensity indicated by the first received light intensity data generated from the reflected light of the user's hand received by the first light receiving unit J1.
- Note that the first detection range A1 in FIG. 3B is only an approximate indication of the detection range of the user's hand by the first sensor unit 141 and does not indicate the exact detection range. In practice, the first detection range A1 has regions that overlap the other detection ranges. In addition, although the first detection range A1 is drawn as if the detection range of the user's hand by the first sensor unit 141 were cut off at a predetermined position along the front surface 10a of the in-vehicle device 1, the first detection range A1 actually extends in the front-rear direction (depth direction) according to the characteristics of the first sensor unit 141. The second detection range A2 to the fourth detection range A4 have the same characteristics as the first detection range A1 in FIG. 3B.
- the second detection range A2 schematically shows the detection range of the user's hand by the second sensor unit 142. That is, when the user's hand is positioned within the second detection range A2, the second light receiving unit J2 receives the infrared light reflected by the user's hand emitted by the second light emitting unit H2.
- the second light emitting unit H2 of the second sensor unit 142 emits infrared rays at a predetermined cycle according to the control of the sensor control unit 29.
- the second light receiving unit J2 receives the reflected infrared light emitted from the second light emitting unit H2, and outputs a signal indicating the received light intensity of the received reflected light to the sensor control unit 29.
- The sensor control unit 29 performs A/D conversion on the input signal indicating the received light intensity to generate data indicating the received light intensity as a gradation value having a predetermined gradation (hereinafter referred to as "second received light intensity data"), and outputs it to the control unit 20. As a result, the second received light intensity data is output from the sensor control unit 29 to the control unit 20 at a predetermined cycle.
- the third detection range A3 schematically shows the detection range of the user's hand by the third sensor unit 143. That is, when the user's hand is located within the third detection range A3, the third light receiving unit J3 receives the infrared light reflected by the user's hand emitted by the third light emitting unit H3.
- the third light emitting unit H3 of the third sensor unit 143 emits infrared rays at a predetermined cycle according to the control of the sensor control unit 29.
- the third light receiving unit J3 receives the reflected infrared light emitted by the third light emitting unit H3, and outputs a signal indicating the received light intensity of the received reflected light to the sensor control unit 29.
- The sensor control unit 29 performs A/D conversion on the input signal indicating the received light intensity to generate data indicating the received light intensity as a gradation value having a predetermined gradation (hereinafter referred to as "third received light intensity data"), and outputs it to the control unit 20. As a result, the third received light intensity data is output from the sensor control unit 29 to the control unit 20 at a predetermined cycle.
- the fourth detection range A4 schematically shows the detection range of the user's hand by the fourth sensor unit 144. That is, when the user's hand is located within the fourth detection range A4, the fourth light receiving unit J4 receives the infrared light reflected by the user's hand emitted by the fourth light emitting unit H4.
- the fourth light emitting unit H4 of the fourth sensor unit 144 emits infrared rays at a predetermined cycle according to the control of the sensor control unit 29.
- the fourth light receiving unit J4 receives the reflected infrared light emitted by the fourth light emitting unit H4, and outputs a signal indicating the received light intensity of the received reflected light to the sensor control unit 29.
- The sensor control unit 29 performs A/D conversion on the input signal indicating the received light intensity to generate data indicating the received light intensity as a gradation value having a predetermined gradation (hereinafter referred to as "fourth received light intensity data"), and outputs it to the control unit 20. As a result, the fourth received light intensity data is output from the sensor control unit 29 to the control unit 20 at a predetermined cycle.
- the detection range AA is a detection range composed of a first detection range A1 to a fourth detection range A4.
- The control unit 20 detects a non-contact operation performed by the user based on the input first to fourth received light intensity data by a method described later. The control unit 20 then analyzes the non-contact operation by a method described later, recognizes the non-contact operation performed by the user, and determines its type.
- FIGS. 4A to 4H are diagrams illustrating, respectively, a downward operation, an upward operation, a rightward operation, a leftward operation, a diagonally downward right operation, a diagonally upward right operation, a diagonally upward left operation, and a diagonally downward left operation.
- the downward operation is an operation in which the user's hand moves downward so as to pass through the detection range AA.
- FIG. 4A illustrates an example of the movement of the user's hand during the downward operation in relation to the detection range AA in order to explain the downward operation.
- the user's hand enters the detection range AA from the upper side of the detection range AA, and after the user's hand moves downward in the detection range AA, The user's hand leaves the detection range AA from below the detection range AA.
- the upward operation is an operation of moving the user's hand so as to pass through the detection range AA upward as shown in FIG.
- the right direction operation is an operation of moving the user's hand so as to pass the detection range AA in the right direction as shown in FIG.
- the left direction operation is an operation in which the user's hand moves so as to pass the detection range AA in the left direction as shown in FIG.
- the diagonally downward right direction operation is an operation in which the user's hand moves so as to pass through the detection range AA in the diagonally downward right direction as shown in FIG.
- the diagonally upward right operation is an operation in which the user's hand moves so as to pass through the detection range AA in the diagonally upward right direction.
- the diagonally upward left direction operation is an operation in which the user's hand moves so as to pass through the detection range AA in the diagonally upward left direction as shown in FIG.
- the left diagonally downward operation is an operation in which the user's hand moves so as to pass through the detection range AA in the diagonally downward left direction.
- The predetermined direction operations are based on the premise that the user's hand moves linearly within the detection range AA and passes near the center O (see FIG. 3B) of the detection range AA.
- the user is notified in advance that the hand should be moved in the above-described manner.
- When the non-contact operation detected by the method described later is not an operation in which the hand moves linearly within the detection range AA and passes near the center O (see FIG. 3B) of the detection range AA, the control unit 20 does not specify the type of predetermined direction operation for the non-contact operation performed by the user, and therefore the process based on the non-contact operation is not executed.
- A case where a non-contact operation is detected against the user's intention means a case where the user's hand enters the detection range AA and an operation is detected by the infrared sensor unit 14 even though the user does not intend to perform a non-contact operation, for example when the user tries to operate the operation switch 12 of the in-vehicle device 1 or tries to pick up an object placed on the dashboard.
- The in-vehicle device 1 can display a plurality of types of screens with different display contents on the touch panel 11. In this embodiment, whether or not to accept input by a non-contact operation is set for each type of screen. Further, for a screen set to accept input by a non-contact operation, which of the first mode to the third mode is used as its operation mode is set.
- the first mode to the third mode are different in which type of predetermined direction operation is effective among the eight types of predetermined direction operations. Specifically, the first mode enables a downward operation, an upward operation, a right operation, and a left operation. In the first mode, the rule that enables the non-contact operation corresponding to the movement of the hand in the four directions of up, down, left, and right corresponds to the “first rule”. In the second mode, all eight types of predetermined direction operations are valid. That is, the second mode enables a non-contact operation in which the user's hand moves in an arbitrary direction. In the second mode, the rule that validates the non-contact operation corresponding to the movement of the hand in an arbitrary direction corresponds to the “second rule”. In the third mode, the right direction operation and the left direction operation are validated. In the third mode, the rule that enables the non-contact operation corresponding to the movement of the hand in the left and right directions corresponds to the “third rule”.
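As an illustration of the three rules, the relationship between the operation modes and the predetermined direction operations they treat as effective can be written as a simple lookup. The gesture labels and names below are assumptions introduced for clarity; the patent defines the rules only in prose.

```python
# Illustrative sketch: which predetermined direction operations each operation
# mode treats as effective. Labels and names are assumptions for illustration.
ALL_DIRECTIONS = {
    "down", "up", "right", "left",
    "down_right", "up_right", "up_left", "down_left",
}

VALID_OPERATIONS = {
    "first_mode":  {"down", "up", "right", "left"},  # first rule: four directions
    "second_mode": ALL_DIRECTIONS,                   # second rule: any direction
    "third_mode":  {"right", "left"},                # third rule: left/right only
}

def is_effective(mode: str, direction: str) -> bool:
    return direction in VALID_OPERATIONS[mode]
```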
- The first mode to the third mode also differ in the range of movement of the user's hand that is recognized as a predetermined direction operation.
- FIG. 5 is a diagram used for explaining the operation of the in-vehicle device 1 when a non-contact operation is performed by the user.
- a screen G1 is a screen for guiding a route to the destination.
- a map is displayed, and the current position of the vehicle and a route to the destination are displayed on the map.
- The first mode is set as the operation mode for the screen G1. That is, the effective non-contact operations are the downward operation, the upward operation, the rightward operation, and the leftward operation. Therefore, while the screen G1 is displayed, the user can perform any of these four non-contact operations and cause the control unit 20 to execute the process corresponding to that operation.
- When the control unit 20 detects that a downward operation has been performed while the screen G1 is displayed, it changes the screen displayed on the touch panel 11 from the screen G1 to the screen G2.
- the screen G2 is a screen that displays road traffic information.
- The control unit 20 stores the latest road traffic information received by the beacon receiving unit 25 and the FM multiplex receiving unit 26, and displays the latest road traffic information on the screen G2 in a predetermined manner.
- the second mode is set as the operation mode for the screen G2. Therefore, while the screen G2 is displayed, the user can perform an operation of moving his / her hand in an arbitrary direction and cause the control unit 20 to execute a process corresponding to the operation.
- When the control unit 20 detects that an operation of moving the user's hand in an arbitrary direction has been performed while the screen G2 is displayed, it changes the screen displayed on the touch panel 11 from the screen G2 to the screen G1.
- When the user wants to obtain the latest road traffic information while driving toward the destination based on the route provided on the screen G1, the user may simply perform the downward operation while continuing to drive.
- When the user has obtained the road traffic information provided on the screen G2 and wants to display the screen G1 on the touch panel 11 again, the user may simply perform a non-contact operation of moving the hand in an arbitrary direction while continuing to drive. In this way, the user can easily switch the displayed screen between the screen G1 and the screen G2 by performing non-contact operations.
- the transition from the screen G2 to the screen G1 is very simple because it is only necessary to move the hand in any direction.
- When the control unit 20 detects that an upward operation has been performed while the screen G1 is displayed, it changes the screen displayed on the touch panel 11 from the screen G1 to the screen G3.
- The screen G3 is a screen that displays a three-dimensional map of the intersection closest to the current position of the vehicle among the intersections that the vehicle passes through on the route to the destination, and displays, in its right half, information about the three intersections that the vehicle will pass through earliest.
- the information regarding the intersection is the name of the intersection, the expected arrival time, the distance from the current position of the vehicle, and the shape of the intersection.
- the control unit 20 acquires necessary information from the map data 22a, and displays the screen G3 based on the acquired information.
- the second mode is set as the operation mode for the screen G3. Accordingly, while the screen G3 is displayed, the user can perform an operation of moving his / her hand in an arbitrary direction and cause the control unit 20 to execute a process corresponding to the operation.
- When the control unit 20 detects that an operation of moving the user's hand in an arbitrary direction has been performed while the screen G3 is displayed, it changes the screen displayed on the touch panel 11 from the screen G3 to the screen G1. With such a configuration, the user can easily switch the displayed screen between the screen G1 and the screen G3 by performing a non-contact operation.
- When the control unit 20 detects that a rightward operation has been performed while the screen G1 is displayed, it changes the screen displayed on the touch panel 11 from the screen G1 to the screen G4.
- the screen G4 is a screen that displays information related to the mail received by the mobile terminal KT.
- When there is mail newly received by the mobile terminal KT, the control unit 20 controls the mail management unit 28 to acquire the mail, and displays information about the new mail (such as the subject, sender, and body text) on the screen G4 together with information about mail received in the past.
- the second mode is set as the operation mode for the screen G4. Therefore, while the screen G4 is displayed, the user can perform an operation of moving his / her hand in an arbitrary direction and cause the control unit 20 to execute a process corresponding to the operation.
- When the control unit 20 detects that an operation of moving the user's hand in an arbitrary direction has been performed while the screen G4 is displayed, it changes the screen displayed on the touch panel 11 from the screen G4 to the screen G1. With such a configuration, the user can easily switch the displayed screen between the screen G1 and the screen G4 by performing a non-contact operation.
- When the control unit 20 detects that a leftward operation has been performed while the screen G1 is displayed, it changes the screen displayed on the touch panel 11 from the screen G1 to the screen G5.
- the screen G5 is a screen that displays content provided by a pre-installed application AP.
- the control unit 20 displays the content on the screen G5 by the function of the application AP.
- the second mode is set as the operation mode for the screen G5. Therefore, while the screen G5 is displayed, the user can perform an operation of moving his / her hand in an arbitrary direction and cause the control unit 20 to execute a process corresponding to the operation.
- When the control unit 20 detects that an operation of moving the user's hand in an arbitrary direction has been performed while the screen G5 is displayed, it changes the screen displayed on the touch panel 11 from the screen G5 to the screen G1. With such a configuration, the user can easily switch the displayed screen between the screen G1 and the screen G5 by performing a non-contact operation.
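The screen transitions described above for the screen G1 and the screens G2 to G5 can be summarized in a small sketch. The screen identifiers follow the description of FIG. 5, while the function and dictionary names are assumptions for illustration.

```python
# Illustrative sketch of the transitions described for FIG. 5: from G1 (first
# mode) the four valid gestures lead to G2-G5; from G2-G5 (second mode) any
# effective gesture returns to G1.
FROM_G1 = {"down": "G2", "up": "G3", "right": "G4", "left": "G5"}

def next_screen(current: str, direction: str) -> str:
    if current == "G1":
        # directions not valid in the first mode leave the screen unchanged
        return FROM_G1.get(direction, "G1")
    if current in {"G2", "G3", "G4", "G5"}:
        return "G1"
    return current
```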
- FIG. 6 is a diagram illustrating an example in which the above display is performed on the screen G1.
- In FIG. 6, the map displayed on the screen G1 in FIG. 5 is not shown.
- In FIG. 6, an arrow pointing downward indicates that the downward operation is effective, and the character string displayed near the arrow clearly indicates that the screen G2 is displayed when the downward operation is performed. The same applies to the upward, rightward, and leftward operations.
- FIG. 7 is a diagram for explaining another example of the operation of the in-vehicle device 1 when the user performs a non-contact operation.
- FIG. 7A shows a screen G6.
- the screen G6 is automatically switched from the screen G1 and displayed when the distance between the current position of the vehicle and the passing intersection is less than a predetermined distance during route guidance on the screen G1. That is, the screen G6 is a screen that is automatically switched and displayed when a predetermined condition is satisfied as a trigger.
- On the screen G6, a three-dimensional view of the approaching intersection is displayed in roughly the left half, and a map clearly showing the current position of the vehicle and the route to the destination is displayed in roughly the right half.
- the second mode is set as the operation mode for the screen G6. Therefore, while the screen G6 is displayed, the user can perform an operation of moving his / her hand in an arbitrary direction and cause the control unit 20 to execute a process corresponding to the operation.
- When the control unit 20 detects that an operation of moving the user's hand in an arbitrary direction has been performed while the screen G6 is displayed, it changes the screen displayed on the touch panel 11 from the screen G6 to the screen G1.
- The operation mode of a screen that automatically switches and is displayed when a predetermined condition is satisfied, such as the screen G6, or of a pop-up screen that is displayed as a pop-up when a predetermined condition is satisfied as a trigger, is set to the second mode.
- FIG. 7B is a diagram showing a screen G7.
- The screen G7 is a screen that is automatically displayed as a pop-up.
- On the screen G7, a message asking whether or not to display the received road traffic information is displayed: "The latest traffic information has been received. Do you want to display it?"
- On the screen G7, the wording "Yes", which means that the road traffic information is to be displayed, is shown together with an arrow pointing to the left that is associated with the wording. The arrow clearly indicates that, by performing the leftward operation, "Yes" is selected and the road traffic information is displayed.
- Likewise, the wording "No", which means that the road traffic information is not to be displayed, is shown together with an arrow pointing to the right that is associated with the wording. The arrow clearly indicates that, by performing the rightward operation, "No" is selected and the display of the road traffic information is canceled.
- the third mode is set as the operation mode for the screen G7. Therefore, while the screen G7 is displayed, the user can perform a leftward operation or a rightward operation and cause the control unit 20 to execute processing corresponding to the operation.
- the control unit 20 monitors whether a leftward operation or a rightward operation has been performed while the screen G7 is being displayed.
- When the leftward operation is performed, that is, when the user selects to display the road traffic information, the control unit 20 changes the displayed screen to the screen G2 (FIG. 5) and displays it.
- When the rightward operation is performed, that is, when the user selects not to display the road traffic information, the control unit 20 deletes the screen G7 and displays the screen that was displayed before the screen G7, namely the screen G1.
- Note that the operation mode of a pop-up screen that is displayed as a pop-up when a predetermined condition is satisfied, or of a screen that automatically switches when a predetermined condition is satisfied, on which the user selects one of two choices, may also be set to the first mode.
- FIG. 8 is a flowchart showing the operation of the in-vehicle device 1.
- Newly displayed screens include a screen displayed after power-on, a screen newly displayed by a screen transition, a screen newly displayed by automatic switching, and a screen newly displayed as a pop-up.
- the control unit 20 of the in-vehicle device 1 determines whether or not the newly displayed screen is a recognition applicable screen (step SA1).
- the recognition corresponding screen is a screen set to accept an input by a non-contact operation.
- one of the first mode to the third mode is set as the operation mode on the recognition corresponding screen.
- If the newly displayed screen is not a recognition-applicable screen (step SA1: NO), the control unit 20 ends the process.
- If the newly displayed screen is a recognition-applicable screen (step SA1: YES), the control unit 20 starts recognition of a non-contact operation (step SA2).
- Starting recognition of a non-contact operation means establishing a state in which the operation can be detected and the operation can be analyzed when the non-contact operation is performed.
- In step SA2, for example, the control unit 20 controls the sensor control unit 29 to start driving the first sensor unit 141 to the fourth sensor unit 144.
- Next, according to the operation mode set for the displayed screen, the control unit 20 determines which of the first table TB1, the second table TB2, and the third table TB3 stored in the storage unit 22 is to be used to recognize the non-contact operation (to specify the type of the non-contact operation) (step SA3).
- Specifically, the control unit 20 determines the first table TB1 as the table to be used when the operation mode set for the screen is the first mode, the second table TB2 when it is the second mode, and the third table TB3 when it is the third mode.
- the first table TB1 to the third table TB3 will be described later.
- By determining the table to be used in this way, the control unit 20 changes, in accordance with the operation mode, the range of the user's hand movement that is recognized when specifying the type of predetermined direction operation.
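Steps SA1 to SA3 can be sketched as follows. The Screen type and the string table identifiers are assumptions for illustration; the patent defines the first table TB1 to the third table TB3 only as tables stored in the storage unit 22.

```python
# Illustrative sketch of steps SA1-SA3: checking whether a newly displayed
# screen accepts non-contact input and choosing the recognition table by its
# operation mode. Types and names are assumptions for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Screen:
    accepts_noncontact_input: bool        # "recognition-applicable" screen or not
    operation_mode: Optional[str] = None  # "first_mode" / "second_mode" / "third_mode"

TABLES = {"first_mode": "TB1", "second_mode": "TB2", "third_mode": "TB3"}

def on_new_screen(screen: Screen) -> Optional[str]:
    if not screen.accepts_noncontact_input:   # step SA1: NO -> end
        return None
    # step SA2: start recognition (e.g. start driving the sensor units) - omitted
    return TABLES[screen.operation_mode]      # step SA3: table used for recognition
```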
- Next, the control unit 20 executes the sensor signal recognition process (step SA4).
- FIG. 9 is a flowchart showing details of the operation of the control unit 20 when the sensor signal recognition process in step SA4 is executed.
- In the sensor signal recognition process, the control unit 20 monitors the first to fourth received light intensity data input at a predetermined cycle and determines whether the user's hand has entered the detection range AA (step SB1). Based on the first to fourth received light intensity data, the control unit 20 determines whether the detection value (received light intensity) of any one of the first sensor unit 141 to the fourth sensor unit 144 has changed, and when there is a change, determines that the user's hand has entered the detection range AA.
- When the user's hand has entered the detection range AA (step SB1: YES), the control unit 20 performs a predetermined process based on the first to fourth received light intensity data input at the predetermined cycle (step SB2).
- The control unit 20 executes the process of step SB2 until the user's hand leaves the detection range AA. That is, based on the first to fourth received light intensity data on the detection values (received light intensities) of all of the first sensor unit 141 to the fourth sensor unit 144, the control unit 20 determines whether the user's hand has left the detection range AA, and when it has, ends the process of step SB2.
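Steps SB1 and SB2 can be sketched as follows; the baseline comparison, change threshold, and sampling period are assumptions, since the patent only states that a change in any detection value indicates that the hand has entered the detection range AA.

```python
# Illustrative sketch of steps SB1-SB2: detect that a hand entered the detection
# range AA when any sensor's detection value changes, then record the detection
# values together with the elapsed time until the hand leaves. Names, the
# change threshold, and the sampling period are assumptions for illustration.
def hand_present(sample, baseline, delta=2):
    """True if any of the four detection values differs from its baseline."""
    return any(abs(s - b) > delta for s, b in zip(sample, baseline))

def record_gesture(samples, baseline, period_ms=10):
    """samples: iterable of 4-element detection-value lists, one per cycle."""
    log, t = [], 0
    for sample in samples:
        if hand_present(sample, baseline):
            log.append((t, sample))   # store detection values with elapsed time (step SB2)
        elif log:
            break                     # hand left the detection range AA -> end step SB2
        t += period_ms
    return log
```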
- FIG. 10 is a graph showing an example of the change over time of the detection values of the first sensor unit 141 to the fourth sensor unit 144, with elapsed time on the horizontal axis and received light intensity on the vertical axis.
- the control unit 20 stores the detection value (reception intensity) of each of the first sensor unit 141 to the fourth sensor unit 144 together with information indicating the elapsed time. Therefore, it is possible to draw the graph of FIG. 10 based on the stored information.
- the detection value of the first sensor unit 141 reaches a peak (maximum value) at the elapsed time TP1.
- the detection value of the second sensor unit 142 peaks at the elapsed time TP2 (elapsed time TP1 < elapsed time TP2 < elapsed time TP3).
- the detection value of the fourth sensor unit 144 peaks at the elapsed time TP3 (elapsed time TP2 < elapsed time TP3 < elapsed time TP4).
- the detected value of the third sensor unit 143 peaks at the elapsed time TP4.
- the control unit 20 acquires the elapsed time when the detection value of each sensor unit becomes a peak based on the information stored in the predetermined storage area (step SB3).
- In the example of FIG. 10, the control unit 20 acquires the elapsed time TP1 at which the detection value of the first sensor unit 141 peaks, the elapsed time TP2 at which the detection value of the second sensor unit 142 peaks, the elapsed time TP3 at which the detection value of the fourth sensor unit 144 peaks, and the elapsed time TP4 at which the detection value of the third sensor unit 143 peaks.
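Step SB3 can be sketched as follows, working over the (elapsed time, detection values) records stored in step SB2. The function name and data layout are assumptions for illustration.

```python
# Illustrative sketch of step SB3: find, for each sensor, the elapsed time at
# which its detection value reaches its peak. A sensor whose detection value
# never rose above zero keeps None as its peak time.
def peak_times(log):
    peaks = [None] * 4   # peak elapsed time per sensor (TP1..TP4 in the example)
    best = [0] * 4       # peak height per sensor, reused by the checks of step SB4
    for t, values in log:
        for i, v in enumerate(values):
            if v > best[i]:
                best[i], peaks[i] = v, t
    return peaks, best
```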
- The control unit 20 determines, based on the information stored in the predetermined storage area in step SB2, whether or not to specify the type of the non-contact operation (step SB4). In the following cases, the control unit 20 determines that an intentional non-contact operation by the user has not been performed and does not specify the type of the non-contact operation.
- In step SB4, when there is at least one of the first sensor unit 141 to the fourth sensor unit 144 for which the elapsed time at which the detection value peaks cannot be acquired, the control unit 20 determines not to specify the type.
- This is because, in such a case, it is assumed that the user's hand did not move linearly within the detection range AA passing near the center O (see FIG. 3B) of the detection range AA, that is, that no intentional non-contact operation was performed.
- Also in step SB4, when the peak (maximum value) of the detection value of at least one of the first sensor unit 141 to the fourth sensor unit 144 does not exceed a predetermined threshold value, the control unit 20 determines not to specify the type. In such a case as well, it is assumed that the user is not performing an intentional non-contact operation.
- Further, in step SB4, the control unit 20 determines, based on the information stored in the predetermined storage area, whether the user's hand has moved as follows: that is, whether, after the user's hand approached the infrared sensor unit 14 from a distance, a state in which the hand remains within a certain distance continued for a predetermined period. When it is determined that the user's hand has made such a movement, the control unit 20 determines not to specify the type. This is because such a movement is not a movement related to an intentional non-contact operation by the user but a movement performed when the user operates the operation switch 12 provided on the in-vehicle device 1.
- When the type of the non-contact operation is not to be specified (step SB4: NO), the control unit 20 ends the sensor signal recognition process.
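The three rejection conditions of step SB4 can be sketched as follows. The numeric thresholds and the way the "hand approaches and stays close" condition is detected are assumptions; the patent states the conditions only qualitatively.

```python
# Illustrative sketch of step SB4: decide whether to go on and specify the type
# of the non-contact operation. All thresholds are placeholder assumptions.
def should_specify_type(peaks, heights, log,
                        min_peak=10, near_level=50, dwell_cycles=20):
    if any(p is None for p in peaks):         # a sensor whose peak time could not be acquired
        return False
    if any(h <= min_peak for h in heights):   # a peak that does not exceed the threshold
        return False
    # hand approached the sensor unit and then stayed within a certain distance
    # for a predetermined period (e.g. reaching for the operation switch 12)
    close = [max(values) >= near_level for _, values in log]
    if len(close) >= dwell_cycles and all(close[-dwell_cycles:]):
        return False
    return True
```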
- When the type of the non-contact operation is to be specified (step SB4: YES), the control unit 20 performs the following process (step SB5).
- In step SB5, the control unit 20 obtains the order in which the detection values of the sensor units reach their peaks (hereinafter referred to as "peak order").
- In the example of FIG. 10, the peak order is the first sensor unit 141 → the second sensor unit 142 → the fourth sensor unit 144 → the third sensor unit 143.
- each of the first sensor unit 141, the second sensor unit 142, the third sensor unit 143, and the fourth sensor unit 144 is appropriately expressed as a sensor S1, a sensor S2, a sensor S3, and a sensor S4.
- FIG. 11 is a diagram showing a pattern of peak order that can be taken in the present embodiment.
- The patterns of peak order that can occur are as shown in FIG. 11.
- In step SB5, the control unit 20 then calculates a time difference KA between the elapsed time at which the detection value of the sensor unit that peaks first in the peak order reaches its peak (elapsed time TP1 in the example of FIG. 10) and the elapsed time at which the detection value of the sensor unit that peaks second reaches its peak (elapsed time TP2 in the example of FIG. 10).
- Similarly, the control unit 20 calculates a time difference KB between the elapsed time at which the detection value of the sensor unit that peaks second reaches its peak (elapsed time TP2 in the example of FIG. 10) and the elapsed time at which the detection value of the sensor unit that peaks third reaches its peak (elapsed time TP3 in the example of FIG. 10). Further, the control unit 20 calculates a time difference KC between the elapsed time at which the detection value of the sensor unit that peaks third reaches its peak (elapsed time TP3 in the example of FIG. 10) and the elapsed time at which the detection value of the sensor unit that peaks fourth reaches its peak (elapsed time TP4 in the example of FIG. 10). Further, the control unit 20 calculates a time difference KX between the elapsed time at which the detection value of the sensor unit that peaks first reaches its peak (elapsed time TP1 in the example of FIG. 10) and the elapsed time at which the detection value of the sensor unit that peaks fourth reaches its peak (elapsed time TP4 in the example of FIG. 10).
- Next in step SB5, the control unit 20 calculates the ratio of the time difference KA to the time difference KX (time difference KA / time difference KX; hereinafter, the "first ratio α"), the ratio of the time difference KB to the time difference KX (time difference KB / time difference KX; hereinafter, the "second ratio β"), and the ratio of the time difference KC to the time difference KX (time difference KC / time difference KX; hereinafter, the "third ratio γ").
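As a sketch of this step (assuming the peak elapsed times TP1 to TP4, taken in peak order, are already known; this is illustrative code, not the original implementation), the time differences and ratios can be computed as follows. Step SB6 then simply compares the difference between α and γ against the predetermined threshold.

```python
def time_differences_and_ratios(tp1: float, tp2: float, tp3: float, tp4: float):
    """tp1..tp4 are the elapsed times of the 1st to 4th peaks in peak order."""
    ka = tp2 - tp1            # time difference KA: first to second peak
    kb = tp3 - tp2            # time difference KB: second to third peak
    kc = tp4 - tp3            # time difference KC: third to fourth peak
    kx = tp4 - tp1            # time difference KX: first to fourth peak
    alpha = ka / kx           # first ratio α
    beta = kb / kx            # second ratio β
    gamma = kc / kx           # third ratio γ
    return ka, kb, kc, kx, alpha, beta, gamma
```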
- Next, the control unit 20 compares the first ratio α with the third ratio γ and determines whether the difference between these values is below a predetermined threshold (step SB6).
- The predetermined threshold is determined in advance, reflecting the results of prior testing or simulation, from the viewpoint of detecting the user's intentional non-contact operation.
- When the difference is not below the predetermined threshold (step SB6: NO), the control unit 20 does not specify the type of non-contact operation and ends the sensor signal recognition process.
- When the difference between the first ratio α and the third ratio γ is below the predetermined threshold (step SB6: YES), the control unit 20 specifies the type of non-contact operation using the table selected in step SA3 (step SB7).
- Below, the data structure of the tables is described by way of the first table TB1, together with the specific method of specifying the type of non-contact operation.
- FIG. 12 shows the data structure of the first table TB1.
- the first table TB1 is a table used when the operation mode is the first mode.
- One record of the first table TB1 has a pattern field FA1 that stores information indicating a pattern of the peak order (hereinafter, a "peak order pattern").
- One record of the first table TB1 has a time difference comparison information field FA2. The time difference comparison information field FA2 stores one of the following: information indicating that the value of the time difference KA is less than the value of the time difference KB (represented as "KA<KB" in FIG. 12), information indicating that the value of the time difference KA is greater than or equal to the value of the time difference KB (represented as "KA≥KB" in FIG. 12), or information indicating that the magnitude relationship between the time difference KA and the time difference KB does not matter (represented as "ANY" in FIG. 12).
- One record of the first table TB1 has a predetermined direction operation field FA3, which stores information indicating one of the eight types of predetermined direction operations.
- One record of the first table TB1 has a first ratio field FA4, which stores information indicating a range of the first ratio α, and a second ratio field FA5, which stores information indicating a range of the second ratio β.
- One record of the first table TB1 indicates the following. That is, when the pattern field FA1 of a record stores information indicating the peak order pattern corresponding to the peak order acquired by the control unit 20 in step SB5, the time difference comparison information field FA2 stores information indicating the magnitude relationship between the acquired time difference KA and the time difference KB, the acquired first ratio α is within the range indicated by the information stored in the first ratio field FA4, and the acquired second ratio β is within the range indicated by the information stored in the second ratio field FA5, the record indicates that the non-contact operation performed by the user is the predetermined direction operation indicated by the information stored in the predetermined direction operation field FA3.
- For example, when the peak order pattern is the pattern P1, the time difference KA is smaller than the time difference KB, the first ratio α is in the range "value A1min ≤ α ≤ value A1max", and the second ratio β is in the range "value B1min ≤ β ≤ value B1max", this indicates that the non-contact operation performed is a downward operation.
- When the peak order pattern is the pattern P1, the time difference KA is equal to or greater than the time difference KB, the first ratio α is in the range "value A2min ≤ α ≤ value A2max", and the second ratio β is in the range "value B2min ≤ β ≤ value B2max", this indicates that the non-contact operation performed is a diagonally downward-right operation.
- When the peak order pattern is the pattern P2, the first ratio α is in the range "value A3min ≤ α ≤ value A3max", and the second ratio β is in the range "value B3min ≤ β ≤ value B3max", this indicates that the non-contact operation performed is a downward operation regardless of the magnitude relationship between the time difference KA and the time difference KB.
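To make the table lookup concrete, the following is a rough sketch of how such records could be represented; the field names mirror FA1 to FA5 of FIG. 12, but the class, the direction labels, and the numeric range bounds are assumptions, since the actual values (value A1min, value A1max, and so on) are determined by prior testing and are not given in this text.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class DirectionRecord:
    pattern: Tuple[str, ...]          # FA1: peak order pattern, e.g. P1
    ka_vs_kb: str                     # FA2: "KA<KB", "KA>=KB", or "ANY"
    direction: str                    # FA3: predetermined direction operation
    alpha_range: Tuple[float, float]  # FA4: allowed range of the first ratio α
    beta_range: Tuple[float, float]   # FA5: allowed range of the second ratio β

P1 = ("S1", "S2", "S4", "S3")

# Placeholder bounds only; the real values would be fixed in advance by testing.
FIRST_TABLE_TB1 = [
    DirectionRecord(P1, "KA<KB",  "down",                (0.00, 0.20), (0.60, 1.00)),
    DirectionRecord(P1, "KA>=KB", "diagonal down-right", (0.20, 0.45), (0.10, 0.60)),
    # ... further records for the remaining peak order patterns ...
]
```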
- The range of the first ratio α and the range of the second ratio β have the following meaning and are set as follows.
- FIG. 14 is a diagram for explaining the first record R1 of the first table TB1 of FIG. 12; (A) shows the waveform of the detection value of each sensor unit, and (B) shows the detection range AA.
- The first record R1 corresponds to the case where time difference KA < time difference KB. In this case, the waveform of the detection value of each sensor unit is as shown in FIG. 14(A), and the peak order pattern is the pattern P1 (sensor S1 → sensor S2 → sensor S4 → sensor S3). Accordingly, in a non-contact operation in which the peak order pattern is the pattern P1 and time difference KA < time difference KB, the user's hand is assumed to pass through the center O of the detection range AA and move obliquely downward to the right along a path inclined counterclockwise by a predetermined angle with respect to the vertically extending axis Y, as indicated by the arrow YJ1 in FIG. 14(B).
- The range of inclination that is still regarded as a downward operation (the range of the angle θ1 in FIG. 14(B)) can be determined with a certain degree of accuracy by defining the range of the first ratio α (≈ the third ratio γ) and the range of the second ratio β.
- Accordingly, the first ratio field FA4 and the second ratio field FA5 of the first record R1 store information indicating the range of the first ratio α and the range of the second ratio β that define the range of the angle θ1 (the range of hand movement regarded as a downward operation).
- FIG. 15 is a diagram for explaining the second record R2 of the first table TB1 of FIG. 12; (A) shows the waveform of the detection value of each sensor unit, and (B) shows the detection range AA.
- The second record R2 corresponds to the case where time difference KA ≥ time difference KB. In this case, the waveform of the detection value of each sensor unit is as shown in FIG. 15(A), and the peak order pattern is again the pattern P1 (sensor S1 → sensor S2 → sensor S4 → sensor S3). Accordingly, in a non-contact operation in which the peak order pattern is the pattern P1 and time difference KA ≥ time difference KB, the user's hand is assumed to move toward the lower right along a path inclined with respect to the axis Y by a larger angle than in the case of time difference KA < time difference KB, as indicated by the arrow YJ2 in FIG. 15(B).
- The range of inclination that is regarded as a diagonally downward-right operation (the range of the angle θ2 in FIG. 15(B)) can be determined with a certain degree of accuracy by defining the range of the first ratio α (≈ the third ratio γ) and the range of the second ratio β.
- Accordingly, the first ratio field FA4 and the second ratio field FA5 of the second record R2 store information indicating the range of the first ratio α and the range of the second ratio β that define the range of the angle θ2 (the range of hand movement regarded as a diagonally downward-right operation).
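To see how the ratios encode the tilt of the hand's path, the following rough simulation may help. It is an illustration under simplifying assumptions, not part of the original disclosure: the four sensors are placed at the corners of a unit square in the arrangement described elsewhere in this text (S1 top-left, S2 top-right, S3 bottom-right, S4 bottom-left), the hand moves in a straight line through the center O at constant speed, and each sensor's detection value is taken to peak when the hand is closest to it. A small tilt from straight down yields KA < KB (record R1, a downward operation), while a larger tilt yields KA ≥ KB (record R2, a diagonally downward-right operation); in both cases α ≈ γ, which is exactly what step SB6 checks.

```python
import math

SENSORS = {"S1": (0.0, 1.0), "S2": (1.0, 1.0), "S3": (1.0, 0.0), "S4": (0.0, 0.0)}

def peak_times(angle_deg: float) -> dict:
    """Peak time per sensor for a straight path through the center (0.5, 0.5),
    tilted counterclockwise from straight-down by angle_deg (drifting toward
    the lower right), assuming each sensor peaks when the hand is closest."""
    t = math.radians(angle_deg)
    dx, dy = math.sin(t), -math.cos(t)                 # direction of motion
    return {name: (sx - 0.5) * dx + (sy - 0.5) * dy    # projection onto the path
            for name, (sx, sy) in SENSORS.items()}

def analyse(times: dict):
    order = sorted(times, key=times.get)               # peak order pattern
    t1, t2, t3, t4 = (times[s] for s in order)
    ka, kb, kx = t2 - t1, t3 - t2, t4 - t1
    return order, ka / kx, kb / kx, (t4 - t3) / kx     # pattern, α, β, γ

for angle in (10, 30):                                 # small tilt vs. larger tilt
    order, a, b, g = analyse(peak_times(angle))
    print(angle, order, round(a, 2), round(b, 2), round(g, 2))
# 10° gives pattern P1 with KA < KB (downward); 30° gives P1 with KA ≥ KB (down-right)
```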
- FIG. 13(A) shows the data structure of the second table TB2, and FIG. 13(B) shows the data structure of the third table TB3.
- Each of these tables likewise has a first ratio field FA4 and a second ratio field FA5, but for corresponding records the contents of the information stored in these fields differ between the tables. That is, the range of hand movement recognized as one and the same predetermined direction operation differs from table to table.
- For example, the first table TB1 for the first mode and the third table TB3 for the third mode differ as follows in the contents of the information stored in the first ratio field FA4 and the second ratio field FA5.
- FIG. 16 is a diagram used for explaining the difference in the range of hand movement recognized as a predetermined direction operation between the first mode and the third mode.
- FIG. 16(A) shows the range of movement of the user's hand recognized as a right direction operation or a left direction operation in the first mode, expressed as an angle θ3 formed around the center O of the detection range AA.
- FIG. 16(B) shows the range of movement of the user's hand recognized as a right direction operation or a left direction operation in the third mode, expressed as an angle θ4 formed around the center O of the detection range AA.
- As shown in FIG. 16, in the third mode the range of hand movement recognized as a right direction operation or a left direction operation is larger than in the first mode; that is, angle θ4 > angle θ3. Information with appropriate contents is stored in the first ratio field FA4 and the second ratio field FA5 of the corresponding records in the first table TB1 and the third table TB3 so that angle θ4 > angle θ3.
- The reason the range of hand movement recognized as a right direction operation or a left direction operation is made larger in the third mode than in the first mode is as follows. In the third mode, only two predetermined direction operations are effective: the right direction operation and the left direction operation. Therefore, when the operation mode set for the screen being displayed is the third mode, any intentional non-contact operation performed by the user should be a right direction operation or a left direction operation. By setting the recognition ranges of these two operations wider than in the first mode, the probability that a right or left direction operation performed by the user is recognized as an effective operation can be increased, which improves convenience for the user.
- As described above, a predetermined direction operation in which the hand moves up, down, left, or right is recognized as an effective predetermined direction operation even when the hand movement is inclined to a certain degree. For this reason, when the user performs a predetermined direction operation, the hand is not required to move strictly along the vertical or horizontal direction, which improves operability and convenience for the user.
- In particular, a user performing a non-contact operation is aboard the vehicle and cannot easily change posture; because, as described above, the hand movement need not follow the vertical or horizontal direction strictly, operability and convenience for the user can be improved effectively.
- Note that an operation in which the hand moves in an oblique direction (such as a diagonally downward-right operation) may also be provided as a predetermined direction operation and made effective in the first mode and the third mode.
- In step SB7 of FIG. 9, the control unit 20 first specifies, from the records of the table selected in step SA3 of FIG. 8, one record corresponding to the peak order, time difference KA, and time difference KB acquired in step SB5. That is, the control unit 20 specifies the one record whose pattern field FA1 stores information indicating the peak order pattern corresponding to the acquired peak order and whose time difference comparison information field FA2 stores information indicating the magnitude relationship between the acquired time difference KA and time difference KB.
- Next, the control unit 20 determines whether the first ratio α acquired in step SB5 is within the range indicated by the information stored in the first ratio field FA4 of the specified record, and whether the acquired second ratio β is within the range indicated by the information stored in the second ratio field FA5 of the specified record.
- When both ratios are within the respective ranges, the control unit 20 specifies the predetermined direction operation indicated by the information stored in the predetermined direction operation field FA3 of the specified record as the type of the non-contact operation performed.
- Otherwise, the control unit 20 does not specify the type of the non-contact operation.
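Continuing the earlier DirectionRecord sketch (again an assumption for illustration, not the actual implementation), step SB7 could be expressed as follows.

```python
from typing import List, Optional

def specify_operation(table, order: List[str], ka: float, kb: float,
                      alpha: float, beta: float) -> Optional[str]:
    """table is a sequence of records shaped like the DirectionRecord sketch above."""
    relation = "KA<KB" if ka < kb else "KA>=KB"
    for rec in table:
        # Specify the one record matching the peak order pattern and the KA/KB relation.
        if tuple(order) == rec.pattern and rec.ka_vs_kb in ("ANY", relation):
            in_alpha = rec.alpha_range[0] <= alpha <= rec.alpha_range[1]
            in_beta = rec.beta_range[0] <= beta <= rec.beta_range[1]
            # The type is specified only when both ratios fall inside the stored ranges.
            return rec.direction if (in_alpha and in_beta) else None
    return None  # no matching record: the type of non-contact operation is not specified
```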
- the control unit 20 ends the sensor signal recognition process after the process of step SB7.
- In step SA5, the control unit 20 determines whether the type of non-contact operation has been specified by the sensor signal recognition process.
- When the type has not been specified (step SA5: NO), the control unit 20 returns the processing procedure to step SA4 and executes the sensor signal recognition process again.
- When the type of non-contact operation is specified by the sensor signal recognition process (step SA5: YES), the control unit 20 determines whether the specified type of non-contact operation is a non-contact operation that is effective in the operation mode set for the screen being displayed (step SA6). As described above, in the first mode, the downward operation, upward operation, right direction operation, and left direction operation are the effective non-contact operations. In the second mode, in addition to these four operations, the diagonally downward-right, diagonally upward-right, diagonally upward-left, and diagonally downward-left operations are also effective; in other words, in the second mode, an operation in which the hand moves in an arbitrary direction is effective. In the third mode, the right direction operation and the left direction operation are the effective non-contact operations.
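As an illustration only (the mode and operation names here are assumptions, not identifiers from the disclosure), the per-mode validity check of step SA6 could be held as a simple mapping.

```python
VALID_OPERATIONS = {
    "first_mode":  {"up", "down", "left", "right"},
    "second_mode": "any",                      # any movement direction is effective
    "third_mode":  {"left", "right"},
}

def is_effective(mode: str, operation: str) -> bool:
    allowed = VALID_OPERATIONS[mode]
    return allowed == "any" or operation in allowed
```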
- When the specified operation is not effective in the current operation mode (step SA6: NO), the control unit 20 returns the processing procedure to step SA4.
- When the specified type of non-contact operation is effective in the corresponding operation mode (step SA6: YES), the control unit 20 executes a corresponding process (step SA7).
- The corresponding process is, for example, the screen transition described above with reference to FIG. Note that the corresponding process is not limited to a screen transition and may be any process that the control unit 20 can execute.
- In step SA8, the control unit 20 determines whether the process executed in step SA7 involves displaying a new screen. If it does (step SA8: YES), the control unit 20 moves the processing procedure to step SA1.
- When the process executed in step SA7 does not involve displaying a new screen (step SA8: NO), the control unit 20 moves the processing procedure to step SA4 and executes the sensor signal recognition process again.
- As described above, the in-vehicle device 1 according to the present embodiment includes the control unit 20, which detects a non-contact operation and, in the first mode, analyzes the detected non-contact operation and determines based on the first rule whether it is an effective operation, and, in the second mode, analyzes the detected non-contact operation and determines based on the second rule whether it is an effective operation. According to this configuration, the effective non-contact operations can be varied according to the operation mode.
- The second rule, which relates to the second mode, has a wider range in which a non-contact operation is valid than the first rule, which relates to the first mode. According to this configuration, by setting the second mode as the operation mode, non-contact operations covering a wider range of the user's hand movement can be made valid than when the first mode is set.
- the first rule according to the first mode is a rule that enables a non-contact operation corresponding to an operation in which the user's hand (detection target) moves in a predetermined direction.
- The second rule is a rule that validates a non-contact operation corresponding to an operation in which the user's hand moves in an arbitrary direction. More specifically, the first rule is a rule that validates a non-contact operation corresponding to an operation in which the user's hand moves in the four directions of up, down, left, and right. According to this configuration, by setting the operation mode to the first mode, a non-contact operation corresponding to an operation in which the user's hand moves in the four directions of up, down, left, and right can be validated, and by setting the operation mode to the second mode, a non-contact operation corresponding to an operation in which the user's hand moves in an arbitrary direction can be validated.
- The in-vehicle device 1 according to the present embodiment includes four sensors, each of which outputs a larger detection value as the user's hand approaches: the first sensor unit 141 (first sensor), the second sensor unit 142 (second sensor), the third sensor unit 143 (third sensor), and the fourth sensor unit 144 (fourth sensor). The second sensor unit 142 is provided to the right of the first sensor unit 141, the third sensor unit 143 below the second sensor unit 142, the fourth sensor unit 144 to the left of the third sensor unit 143, and the first sensor unit 141 above the fourth sensor unit 144.
- The control unit 20 detects, based on the detection values of the four sensor units, whether the detected non-contact operation corresponds to an operation in which the user's hand moves in a direction corresponding to the up, down, left, or right direction. According to this configuration, based on the four-sensor configuration and the arrangement of the four sensor units, it can be accurately detected whether the detected non-contact operation corresponds to an operation in which the user's hand moves in a direction corresponding to the up, down, left, or right direction.
- Further, the control unit 20 detects, based on the change in the peaks of the detection values of the four sensor units, whether the detected non-contact operation corresponds to an operation in which the user's hand moves in a direction corresponding to the up, down, left, or right direction. According to this configuration, based on the four-sensor configuration and the arrangement of the four sensor units, it can be accurately detected, using the change in the detection value peaks, whether the detected non-contact operation corresponds to an operation in which the user's hand moves in a direction corresponding to the up, down, left, or right direction.
- In the third mode, the control unit 20 determines whether the non-contact operation is an effective operation based on the third rule. The third rule is a rule that validates a non-contact operation corresponding to an operation in which the user's hand moves in the two directions of left and right. According to this configuration, by setting the third mode as the operation mode, a non-contact operation corresponding to an operation in which the user's hand moves in the two directions of left and right can be validated.
- The in-vehicle device 1 includes the touch panel 11 (display unit). An operation mode is set for each type of screen that can be displayed on the touch panel 11. According to this configuration, the operation mode can be set for each screen, and the effective non-contact operations can be varied per screen.
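As an illustration, such a per-screen setting could be held as a simple mapping; the screen type names below are hypothetical and do not come from the disclosure.

```python
SCREEN_OPERATION_MODE = {
    "menu_screen": "first_mode",
    "map_screen":  "second_mode",
    "list_screen": "third_mode",
}

def operation_mode_for(screen_type: str) -> str:
    # Look up the operation mode assigned to the screen currently displayed.
    return SCREEN_OPERATION_MODE[screen_type]
```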
- FIG. 2 is a schematic diagram in which the functional configuration of the in-vehicle device 1 is classified according to the main processing contents in order to facilitate understanding of the present invention; the configuration of each device can also be classified into more components according to the processing contents, or classified so that a single component executes more of the processing.
- Similarly, the processing units of the flowcharts are divided according to the main processing contents in order to facilitate understanding of the processing of the in-vehicle device 1; the present invention is not limited by the way the processing units are divided or by their names. The processing of the in-vehicle device 1 can be divided into more processing units according to the processing contents, or divided so that a single processing unit includes more of the processing.
- the non-contact operation is detected using infrared rays.
- the non-contact operation detection method and the analysis method are not limited to the methods exemplified in the above-described embodiment.
- a configuration may be adopted in which a non-contact operation is detected by other means such as using ultrasonic waves instead of infrared rays or using image data taken by an imaging device.
- In the embodiment described above, a non-contact operation in which the user's hand moves in a direction corresponding to the up, down, left, or right direction is an effective operation in the first mode, and a non-contact operation in which the user's hand moves in a direction corresponding to the left or right direction is an effective operation in the third mode. However, the movement direction of the user's hand in an effective non-contact operation is not limited to the directions corresponding to up, down, left, and right.
- 20 Control unit, 141 First sensor unit (first sensor), 142 Second sensor unit (second sensor), 143 Third sensor unit (third sensor), 144 Fourth sensor unit (fourth sensor)
Abstract
Description
The present invention has been made in view of the above circumstances, and an object thereof is to provide a non-contact operation detection device that is highly convenient for the user.
According to this configuration, the effective non-contact operations can be varied according to the operation mode.
According to this configuration, by setting the second mode as the operation mode, non-contact operations covering a wider range of the user's hand movement can be made valid than when the first mode is set.
According to this configuration, by setting the operation mode to the first mode, non-contact operations corresponding to operations in which the user's hand moves in the four directions of up, down, left, and right can be made valid, and by setting the operation mode to the second mode, non-contact operations corresponding to operations in which the user's hand moves in an arbitrary direction can be made valid.
According to this configuration, based on the four-sensor configuration and the arrangement of the four sensor units, it can be accurately detected whether the detected non-contact operation corresponds to an operation in which the user's hand moves in a direction corresponding to the up, down, left, or right direction.
According to this configuration, based on the four-sensor configuration and the arrangement of the four sensor units, it can be accurately detected, using the change in the peaks of the detection values of the four sensor units, whether the detected non-contact operation corresponds to an operation in which the user's hand moves in a direction corresponding to the up, down, left, or right direction.
According to this configuration, by setting the third mode as the operation mode, non-contact operations corresponding to operations in which the user's hand moves in the two directions of left and right can be made valid.
According to this configuration, the operation mode can be set for each screen, and the effective non-contact operations can be varied per screen.
20 Control unit
141 First sensor unit (first sensor)
142 Second sensor unit (second sensor)
143 Third sensor unit (third sensor)
144 Fourth sensor unit (fourth sensor)
Claims (13)
- 1. A non-contact operation detection device mounted on a vehicle, comprising a control unit that detects a non-contact operation and that, in a first mode, determines based on a first rule whether the non-contact operation is an effective operation and, in a second mode, determines based on a second rule whether the non-contact operation is an effective operation.
- 2. The non-contact operation detection device according to claim 1, further comprising one or more sensors whose output detection value changes according to the position of a detection target, wherein the control unit detects the non-contact operation based on the manner of the detection values of the one or more sensors.
- 3. The non-contact operation detection device according to claim 1 or 2, wherein the second rule is set so that the range in which a non-contact operation is valid is wider than in the first rule.
- 4. The non-contact operation detection device according to claim 3, wherein the first rule is a rule that validates a non-contact operation corresponding to an operation in which the detection target moves in a predetermined direction, and the second rule is a rule that validates a non-contact operation corresponding to an operation in which the detection target moves in an arbitrary direction.
- 5. The non-contact operation detection device according to claim 4, wherein the first rule is a rule that validates a non-contact operation corresponding to an operation in which the detection target moves in one of four predetermined directions.
- 6. The non-contact operation detection device according to claim 5, wherein the four predetermined directions are directions corresponding to up, down, left, and right.
- 7. The non-contact operation detection device according to claim 6, comprising four sensors, namely a first sensor, a second sensor, a third sensor, and a fourth sensor, each of which outputs a larger detection value as the detection target approaches, the four sensors being arranged vertically and horizontally such that the second sensor is located to the right of the first sensor, the third sensor is located below the second sensor, the fourth sensor is located to the left of the third sensor, and the first sensor is located above the fourth sensor, wherein the control unit detects, based on the manner of the detection values of the four sensors, whether the detected non-contact operation corresponds to an operation in which the detection target moves in a direction corresponding to the up, down, left, or right direction.
- 8. The non-contact operation detection device according to claim 7, wherein the control unit detects, based on the change in the peaks of the detection values of the four sensors, whether the detected non-contact operation corresponds to an operation in which the detection target moves in a direction corresponding to the up, down, left, or right direction.
- 9. The non-contact operation detection device according to claim 3, wherein the control unit, in a third mode, determines based on a third rule whether the non-contact operation is an effective operation and, when it is effective, performs processing based on the analysis result, the first rule is a rule that validates a non-contact operation corresponding to an operation in which the detection target moves in one of four predetermined directions, the second rule is a rule that validates a non-contact operation corresponding to an operation in which the detection target moves in an arbitrary direction, and the third rule is a rule that validates a non-contact operation corresponding to an operation in which the detection target moves in one of two predetermined directions.
- 10. The non-contact operation detection device according to claim 9, wherein the four predetermined directions are directions corresponding to up, down, left, and right, and the two predetermined directions are directions corresponding to left and right.
- 11. The non-contact operation detection device according to claim 10, comprising four sensors, namely a first sensor, a second sensor, a third sensor, and a fourth sensor, each of which outputs a larger detection value as the detection target approaches, the four sensors being arranged vertically and horizontally such that the second sensor is located to the right of the first sensor, the third sensor is located below the second sensor, the fourth sensor is located to the left of the third sensor, and the first sensor is located above the fourth sensor, wherein the control unit detects, based on the manner of the detection values of the four sensors, whether the detected non-contact operation corresponds to an operation in which the detection target moves in a direction corresponding to the up, down, left, or right direction, or corresponds to an operation in which the detection target moves in a direction corresponding to the left or right direction.
- 12. The non-contact operation detection device according to claim 11, wherein the control unit detects, based on the change in the peaks of the detection values of the four sensors, whether the detected non-contact operation corresponds to an operation in which the detection target moves in a direction corresponding to the up, down, left, or right direction, or corresponds to an operation in which the detection target moves in a direction corresponding to the left or right direction.
- 13. The non-contact operation detection device according to claim 1, further comprising a display unit that displays a predetermined type of screen, wherein a mode is set for each type of screen.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/310,936 US10067570B2 (en) | 2014-06-30 | 2015-03-24 | Non-contact operation detection device |
JP2016531139A JP6401268B2 (ja) | 2014-06-30 | 2015-03-24 | 非接触操作検出装置 |
EP15814009.5A EP3165993B1 (en) | 2014-06-30 | 2015-03-24 | Non-contact operation detection device |
CN201580035676.7A CN106462252B (zh) | 2014-06-30 | 2015-03-24 | 非接触操作检测装置 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2014133874 | 2014-06-30 | ||
JP2014-133874 | 2014-06-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016002270A1 true WO2016002270A1 (ja) | 2016-01-07 |
Family
ID=55018829
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2015/058914 WO2016002270A1 (ja) | 2014-06-30 | 2015-03-24 | 非接触操作検出装置 |
Country Status (5)
Country | Link |
---|---|
US (1) | US10067570B2 (ja) |
EP (1) | EP3165993B1 (ja) |
JP (1) | JP6401268B2 (ja) |
CN (1) | CN106462252B (ja) |
WO (1) | WO2016002270A1 (ja) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019179388A (ja) * | 2018-03-30 | 2019-10-17 | Necソリューションイノベータ株式会社 | モーション判定装置、モーション判定方法、及びプログラム |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6443418B2 (ja) * | 2016-10-03 | 2018-12-26 | トヨタ自動車株式会社 | 車両運転支援装置 |
WO2019163503A1 (ja) * | 2018-02-22 | 2019-08-29 | 京セラ株式会社 | 電子機器、制御方法およびプログラム |
CN109164914A (zh) * | 2018-08-01 | 2019-01-08 | 江苏捷阳科技股份有限公司 | 一种智能晾衣机手势识别系统及手势控制晾衣机的方法 |
US11367413B2 (en) * | 2020-02-03 | 2022-06-21 | Panasonic Liquid Crystal Display Co., Ltd. | Display device, method for displaying image data and mobile terminal |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007080214A (ja) * | 2005-09-16 | 2007-03-29 | Asahi Kasei Electronics Co Ltd | 位置検出装置、情報入力装置、及びモーションスイッチ |
JP2009276993A (ja) * | 2008-05-14 | 2009-11-26 | Denso Corp | 入力装置 |
JP2010129069A (ja) * | 2008-12-01 | 2010-06-10 | Fujitsu Ten Ltd | ディスプレイ装置 |
JP2013012158A (ja) * | 2011-06-30 | 2013-01-17 | Toshiba Corp | 電子機器および制御方法 |
US20130135188A1 (en) * | 2011-11-30 | 2013-05-30 | Qualcomm Mems Technologies, Inc. | Gesture-responsive user interface for an electronic device |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2000066805A (ja) | 1998-08-18 | 2000-03-03 | Fujitsu Ltd | 赤外線検出式入力装置 |
JP2002083302A (ja) * | 2000-09-07 | 2002-03-22 | Sony Corp | 情報処理装置、動作認識処理方法及びプログラム格納媒体 |
JP2005135439A (ja) | 2004-12-28 | 2005-05-26 | Toshiba Corp | 操作入力装置 |
JP2007072564A (ja) * | 2005-09-05 | 2007-03-22 | Sony Computer Entertainment Inc | マルチメディア再生装置、メニュー操作受付方法およびコンピュータプログラム |
EP2304527A4 (en) * | 2008-06-18 | 2013-03-27 | Oblong Ind Inc | GESTIK BASED CONTROL SYSTEM FOR VEHICLE INTERFACES |
EP2507682A2 (en) * | 2009-12-04 | 2012-10-10 | Next Holdings Limited | Sensor methods and systems for position detection |
JP5617581B2 (ja) | 2010-12-08 | 2014-11-05 | オムロン株式会社 | ジェスチャ認識装置、ジェスチャ認識方法、制御プログラム、および、記録媒体 |
WO2013074897A1 (en) | 2011-11-16 | 2013-05-23 | Flextronics Ap, Llc | Configurable vehicle console |
US20130155237A1 (en) * | 2011-12-16 | 2013-06-20 | Microsoft Corporation | Interacting with a mobile device within a vehicle using gestures |
JP2013195326A (ja) | 2012-03-22 | 2013-09-30 | Pioneer Electronic Corp | 物体検出装置、物体検出方法、物体検出用プログラム及び情報記録媒体 |
US20130293454A1 (en) | 2012-05-04 | 2013-11-07 | Samsung Electronics Co. Ltd. | Terminal and method for controlling the same based on spatial interaction |
JP5916566B2 (ja) | 2012-08-29 | 2016-05-11 | アルパイン株式会社 | 情報システム |
-
2015
- 2015-03-24 WO PCT/JP2015/058914 patent/WO2016002270A1/ja active Application Filing
- 2015-03-24 JP JP2016531139A patent/JP6401268B2/ja active Active
- 2015-03-24 US US15/310,936 patent/US10067570B2/en active Active
- 2015-03-24 CN CN201580035676.7A patent/CN106462252B/zh active Active
- 2015-03-24 EP EP15814009.5A patent/EP3165993B1/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007080214A (ja) * | 2005-09-16 | 2007-03-29 | Asahi Kasei Electronics Co Ltd | 位置検出装置、情報入力装置、及びモーションスイッチ |
JP2009276993A (ja) * | 2008-05-14 | 2009-11-26 | Denso Corp | 入力装置 |
JP2010129069A (ja) * | 2008-12-01 | 2010-06-10 | Fujitsu Ten Ltd | ディスプレイ装置 |
JP2013012158A (ja) * | 2011-06-30 | 2013-01-17 | Toshiba Corp | 電子機器および制御方法 |
US20130135188A1 (en) * | 2011-11-30 | 2013-05-30 | Qualcomm Mems Technologies, Inc. | Gesture-responsive user interface for an electronic device |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019179388A (ja) * | 2018-03-30 | 2019-10-17 | Necソリューションイノベータ株式会社 | モーション判定装置、モーション判定方法、及びプログラム |
JP7048151B2 (ja) | 2018-03-30 | 2022-04-05 | Necソリューションイノベータ株式会社 | モーション判定装置、モーション判定方法、及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
US20170102777A1 (en) | 2017-04-13 |
EP3165993B1 (en) | 2020-05-06 |
JPWO2016002270A1 (ja) | 2017-04-27 |
US10067570B2 (en) | 2018-09-04 |
EP3165993A1 (en) | 2017-05-10 |
CN106462252B (zh) | 2019-05-21 |
JP6401268B2 (ja) | 2018-10-10 |
CN106462252A (zh) | 2017-02-22 |
EP3165993A4 (en) | 2018-01-10 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6401268B2 (ja) | 非接触操作検出装置 | |
KR101569022B1 (ko) | 정보 제공 장치 및 그 방법 | |
US9842503B2 (en) | Driving support apparatus and driving support method | |
WO2016132876A1 (ja) | 情報処理装置 | |
US20160097651A1 (en) | Image display apparatus and operating method of image display apparatus | |
WO2011054549A1 (en) | Electronic device having a proximity based touch screen | |
KR20110045762A (ko) | 이동 단말기의 내비게이션 방법 및 그 장치 | |
JPWO2007023899A1 (ja) | 運転支援装置、運転支援方法、運転支援プログラムおよび記録媒体 | |
JP2016038621A (ja) | 空間入力システム | |
JP2011169621A (ja) | 地図表示装置 | |
JP2011099815A (ja) | ナビゲーション装置及び車線案内方法 | |
KR100716337B1 (ko) | 네비게이션 지도화면 터치영역 확대 표시 방법 | |
JP6360405B2 (ja) | 情報処理システム、及び、情報処理方法 | |
JP2007127447A (ja) | 経路案内装置、情報センタ、経路案内システム、及び経路案内方法 | |
JP2007127450A (ja) | 車両用経路案内システム,車載機器,および管制センタ | |
JP2007163413A (ja) | 到着時刻情報報知装置、ナビゲーション装置、及びプログラム | |
JP6722483B2 (ja) | サーバ装置、情報システム、車載装置 | |
US20220295017A1 (en) | Rendezvous assistance apparatus, rendezvous assistance system, and rendezvous assistance method | |
JP5243107B2 (ja) | ナビゲーション装置、画面表示制御方法 | |
JP2009163436A (ja) | 情報端末装置、コンピュータプログラム及び表示方法 | |
JP2022153363A (ja) | サーバ装置及び情報処理方法並びにサーバプログラム | |
JP2009098086A (ja) | ナビゲーション装置、スクロール方法 | |
JP2012225751A (ja) | 車載情報端末 | |
JP5785478B2 (ja) | 車載機およびナビゲーション画像割り込み表示方法 | |
KR20110055267A (ko) | 이동 단말기의 내비게이션 방법 및 그 장치 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15814009 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2016531139 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15310936 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
REEP | Request for entry into the european phase |
Ref document number: 2015814009 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2015814009 Country of ref document: EP |