JP6184827B2 - Electronic device, gesture input method, and program - Google Patents

Publication number
JP6184827B2
JP6184827B2
Authority
JP
Japan
Prior art keywords
gesture
input
guide
means
index
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
JP2013208171A
Other languages
Japanese (ja)
Other versions
JP2015072609A (en)
Inventor
中里 光晴
Original Assignee
アルパイン株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by アルパイン株式会社
Priority to JP2013208171A
Publication of JP2015072609A
Application granted
Publication of JP6184827B2
Application status: Active
Anticipated expiration

Description

  The present invention relates to an electronic apparatus or electronic system having a gesture input function for inputting an instruction by a gesture, and more particularly to a guide presentation technique for presenting a guide for gesture input.

  Currently, touch panels are installed in various electronic devices such as personal computers, smartphones, tablet terminals, and in-vehicle devices, and various instructions are input by touch operations on the touch panel. Touch input is widely used even by people without IT skills because the input operation can be performed intuitively based on the image shown on the display. In addition to touch input, other intuitive input methods are also attracting attention, such as gesture input, in which operations are entered by finger or hand movements, gaze input, in which operations are entered by the direction of the gaze, and voice input.

  Patent Document 1 discloses an inexpensive instrument fingering guide device that guides a performer to the position to be operated with a finger and the fingering to be used during the operation. Patent Document 2 discloses a virtual image display apparatus for a line-of-sight input system, in which input is performed by gazing at a target, that makes it easy to identify the virtual image being pointed at even when a plurality of virtual images are displayed.

Patent Document 1: Japanese Patent Laid-Open No. 2003-280641
Patent Document 2: Japanese Patent Laid-Open No. 2005-138755

  For example, in an in-vehicle device, adoption of a gesture-based input method is strongly desired in order to further simplify input operations in the vehicle. With gesture input, the user can perform an input operation in front of the display by moving a finger or hand, or by signaling with the fingers, based on the displayed instruction screen. For example, gesture input can be used to enlarge, reduce, or scroll the road map on a navigation screen that provides route guidance, or to select the music to be played on a media playback screen that reproduces music data and the like.

  FIG. 7 is a diagram for explaining a gesture input operation in a conventional in-vehicle apparatus. A conventional in-vehicle device 500 includes a display 502 that displays a navigation screen and the like, and an imaging camera 504 that detects a gesture (input operation) by a user. For example, when the destination candidate list 503 is displayed on the display 502, the user moves the hand 508 in the front-rear direction within the detection area 506 of the imaging camera 504. By analyzing the image data obtained by imaging the movement of the hand, an instruction 509 for scrolling up and down the destination candidate list 503 displayed on the display 502 can be input.

  However, since the user cannot visually recognize the detection area 506 in which a gesture can be input, the user may move the hand 508 at a position outside the detection area 506, as shown in FIG. 7, and the input operation may not be detected or may be misrecognized. Moreover, since a large number of gestures can be registered in advance for gesture input, it may not be clear to the user which gesture is requested on the instruction screen.

  An object of the present invention is to solve such a conventional problem and to provide an electronic device, a gesture input method, and a program capable of visually presenting an area where gesture input can be detected.

  The electronic device according to the present invention is an electronic device capable of detecting an input by a gesture, and includes a display means, a detection means capable of detecting a gesture input, and a visualization means for visualizing, when a user input screen is displayed on the display means, at least a range within the detection region of the detection means in which a gesture input can be detected.

  Preferably, the detection means includes an imaging means, and the visualization means visualizes at least a range within the imaging region of the imaging means in which a gesture input can be detected. The visualization means visualizes the center point of the range in which a gesture input can be detected. The visualization means visualizes a guidance index that guides the gesture required for the user input screen.

  Preferably, the visualization means visualizes the detectable range by laser light irradiation or a projection image. The visualization means includes a swing mechanism that swings the emission direction of the laser light, and the gesture required for the user input screen is indicated by a guidance index swung by the swing mechanism. Preferably, the electronic device further includes an analysis means that analyzes the gesture based on the detection result of the detection means, and an execution means that executes the gesture input based on the analyzed result.

  The gesture input method according to the present invention includes a step of visualizing a range in which a gesture input can be detected by an imaging camera when a user input screen is displayed.

  The gesture input program according to the present invention is executed by an electronic device and includes a step of visualizing a range in which a gesture input can be detected by an imaging camera when a user input screen is displayed.

  According to the present invention, by visually presenting a position where a gesture input can be detected, the user can easily recognize the position where the gesture input should be performed. In addition, it is possible to guide the gesture requested by the displayed user input screen and improve the convenience of the gesture input.

FIG. 1 is a diagram showing an in-vehicle device that is an example of an electronic device according to an embodiment of the present invention. FIG. 2 is a schematic diagram showing the relationship between an imaging camera and a guide presentation unit. FIG. 3 is a block diagram showing a functional configuration example of the guide index visualization program according to the embodiment. FIG. 4 shows examples of guide indices created by the guide index creation means. FIG. 5 shows examples of guidance indices created by the guide index creation means. FIG. 6 is a flowchart of the guide presentation control process for gesture input in the in-vehicle device of the embodiment. FIG. 7 is a diagram explaining gesture input in a conventional in-vehicle device.

  Embodiments of the present invention will be described in detail with reference to the drawings. An electronic device or electronic system according to a preferred aspect of the present invention includes a display, a detection means that detects a gesture such as the movement or shape of a user's finger or hand, a visualization means that visualizes the range or region in which the gesture can be detected, and an execution means that analyzes the gesture detected by the detection means and executes the corresponding input operation when a gesture input is performed on the user input screen displayed on the display.

  The electronic device or electronic system of the present invention can be a notebook PC, a tablet PC, a personal computer, a car navigation device, a game device, a facility guide display device, or any other display device that allows gesture input. It can present a guide index indicating the detectable range or region, or a guidance index that guides the direction and form of the gesture requested by the user input screen. Thereby, when performing a gesture input, the user can correctly recognize the input area and can perform the gesture input requested by the electronic device with the help of the guidance index.

  Next, an in-vehicle device will be described with reference to the drawings as an example of an electronic device according to an embodiment of the present invention. FIG. 1 is a block diagram illustrating a configuration example of the in-vehicle device. The in-vehicle device 10 includes a multimedia playback unit 20 that plays back audio data and video data obtained from CDs, DVDs, Blu-ray discs, a hard disk device, terrestrial digital television broadcasts, and the like; a navigation unit 40 that displays a road map around the vehicle position and guides a route to a destination; an imaging camera 60 that captures a gesture such as the movement of the user's finger or hand; a guide presentation unit 80 that presents a guide indicating the position or range where a gesture input can be detected by the imaging camera 60 or the direction and form of the requested gesture; a gesture analysis unit 100 that analyzes a gesture based on the image data captured by the imaging camera 60; a display 120 that displays a media playback screen, a navigation screen, a user input screen, and the like; a storage unit 140 that stores programs, data, and the like; an input unit 160 that receives instructions from the user through gesture input, a mouse, a keyboard, a touch panel, and the like; and a control unit 180, constituted by a microcontroller or microprocessor, that controls each unit by executing programs.

  The imaging camera 60 is a detection means for detecting a gesture input, and is attached at a predetermined position in the vehicle. For example, the imaging camera 60 is installed in the vicinity of the rear-view mirror or the interior light of the vehicle and captures a gesture, such as a finger or hand movement, performed near the front of the display 120. Alternatively, the imaging camera 60 may be integrally attached to the outer edge of the display 120, or may be attached to the dashboard, a seat, or an instrument panel.

  The guide presentation unit 80 can visualize the detection area (or imaging area) of the imaging camera 60 using LED light, laser light, or a projection image, and can present a guide index indicating the position or range where a gesture input can be detected. Furthermore, when a user input screen is displayed on the display 120, the guide presentation unit 80 can present, as one of the guide indices, a guidance index that guides the gesture requested by the user input screen.

  The guide presentation unit 80 is preferably a laser mechanism that emits laser light or a projector that emits a projection image. Like the imaging camera 60, the guide presentation unit 80 is installed at a predetermined position in the vehicle, for example, in the vicinity of the rear-view mirror or the interior light. The guide presentation unit 80 may also be a module integrated with the imaging camera 60. The guide presentation unit 80 may visualize the guide index as a still image, or may change the guide index dynamically. In the latter case, the guide presentation unit 80 can control the movement of the guide index by using a swing mechanism that swings the emission direction of the laser light, a variable mirror that changes the emission direction, a rotating polygon mirror, or the like. The guide presentation unit 80 visualizes the guide index based on information supplied from the guide index supply means, as will be described later.

  FIG. 2 is a schematic diagram illustrating the relationship between the imaging camera and the guide presentation unit. The imaging camera 60 is installed, for example, in the rear-view mirror, and images the position where the gesture input is performed, that is, the vicinity of the front of the display 120. The guide presentation unit 80, on the other hand, is installed, for example, near the interior light on the ceiling, and presents a guide index that visualizes the gesture input detection area 190 within the imaging range of the imaging camera 60. The light output from the guide presentation unit 80 is visualized by illuminating an arbitrary position or space in the vehicle. In the example of the figure, the guide presentation unit 80 irradiates a projection pattern that visualizes the four corners of the boundary 192 of the detection area 190 as a guide index. Instead of visualizing only the corners, a projection pattern that visualizes the entire contour or boundary of the detection area 190 may be irradiated as the guide index. Further, instead of constantly visualizing the projection pattern, the guide presentation unit 80 may present the guide index by optically scanning the boundary of the detection area 190 with a laser spot. Furthermore, the guide presentation unit 80 can present a guidance index 194 within the detection area 190 by controlling the emission direction of the laser light L. The guidance index 194 may be a dynamic index, for example one in which the laser spot moves in the direction to be guided.
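
  As a rough sketch of how a guide presentation unit of this kind might steer a laser spot toward points in the detection area (for example, to trace the boundary 192 or to draw a guidance index 194), the following Python fragment converts target points on a flat detection plane into pan/tilt angles for a two-axis mirror. The geometry, function names, and the flat-plane assumption are illustrative only; the patent does not specify an implementation.

```python
import math

def pan_tilt_for_point(x, y, emitter_height, x0=0.0, y0=0.0):
    """Convert a target point (x, y) on a horizontal detection plane into
    pan/tilt angles (radians) for a laser emitter mounted at height
    `emitter_height` directly above (x0, y0). Assumes a flat plane and an
    ideal two-axis mirror."""
    dx, dy = x - x0, y - y0
    pan = math.atan2(dy, dx)                                 # rotation about the vertical axis
    tilt = math.atan2(math.hypot(dx, dy), emitter_height)    # angle down from vertical
    return pan, tilt

def scan_boundary(corners, emitter_height, steps_per_edge=20):
    """Yield pan/tilt angles tracing the boundary of a rectangular detection
    area, e.g. to visualize boundary 192 with a moving laser spot."""
    n = len(corners)
    for i in range(n):
        (x1, y1), (x2, y2) = corners[i], corners[(i + 1) % n]
        for t in range(steps_per_edge):
            f = t / steps_per_edge
            yield pan_tilt_for_point(x1 + f * (x2 - x1),
                                     y1 + f * (y2 - y1),
                                     emitter_height)

# Example: trace a 30 cm x 20 cm detection area located 60 cm below the emitter.
corners = [(-0.15, -0.10), (0.15, -0.10), (0.15, 0.10), (-0.15, 0.10)]
for pan, tilt in scan_boundary(corners, emitter_height=0.60, steps_per_edge=4):
    pass  # feed the angles to the mirror/galvo driver here
```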

  The gesture analysis unit 100 can analyze the input gesture based on the image data of the gesture captured by the imaging camera 60. For example, the gesture analysis unit 100 extracts the finger or hand appearing in the image data, detects its shape or movement, and analyzes the direction of the finger or hand and the form it represents. When analyzing the form represented by a finger or hand, the gesture input may be identified by collation with pre-registered gesture patterns. Gesture inputs can be of many types, for example movement in the front-rear direction parallel to a reference, movement in the left-right direction orthogonal to the reference, circular movement, an open hand, a grip, the number of raised fingers, and so on, and the gesture analysis unit 100 can determine the direction and form of the input gesture.
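
  The following is a minimal sketch of the kind of processing the gesture analysis unit 100 might perform: classifying the dominant movement direction of a tracked hand centroid and matching a hand-shape descriptor against pre-registered collation patterns. The feature choices, thresholds, and function names are assumptions made for illustration, not the patented method.

```python
import numpy as np

def classify_movement(centroids, min_travel=0.05):
    """Classify the dominant movement of a sequence of hand centroids
    (x, y) given in normalized image coordinates."""
    pts = np.asarray(centroids, dtype=float)
    if len(pts) < 2:
        return "none"
    delta = pts[-1] - pts[0]
    if np.linalg.norm(delta) < min_travel:
        return "hold"
    # Compare horizontal vs. vertical displacement to pick an axis.
    if abs(delta[0]) >= abs(delta[1]):
        return "right" if delta[0] > 0 else "left"
    return "down" if delta[1] > 0 else "up"

def match_shape(descriptor, registered_patterns):
    """Match a hand-shape descriptor (e.g. a normalized contour signature)
    against pre-registered collation patterns; return the best label or None."""
    best_label, best_dist = None, float("inf")
    for label, pattern in registered_patterns.items():
        dist = np.linalg.norm(np.asarray(descriptor) - np.asarray(pattern))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist < 0.5 else None  # 0.5 is an illustrative threshold

# Example: a hand drifting to the right across a few frames.
print(classify_movement([(0.2, 0.5), (0.3, 0.51), (0.45, 0.5)]))  # -> "right"
```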

  The storage unit 140 can store programs and, in addition, a large number of collation patterns necessary for determining gesture inputs to the in-vehicle device 10. In one example, the collation patterns required for gesture discrimination are stored in a database. In another example, the control unit 180 can access a distribution site that distributes gesture collation patterns and the like via an external network and acquire the necessary information from that site. The storage unit 140 can also store road map data necessary for navigation operations, music data necessary for media playback operations, and the like.

  In a preferred embodiment, the control unit 180 is constituted by a microcontroller including a ROM, a RAM, and the like, and the ROM or RAM can store various programs for controlling the operation of each unit of the in-vehicle device. Furthermore, in this embodiment, a guide index visualization program is stored that uses laser light to visualize a guide index indicating the range in which gesture input can be detected and a guidance index for guiding the required gesture.

  FIG. 3 shows the functional configuration of the guide index visualization program according to this embodiment. The guide index visualization program 200 includes screen display means 202 that displays a user input screen, a menu screen, a media playback screen, a navigation screen, and the like on the display 120; determination means 204 that determines whether a user input screen is displayed on the display; guide index creation means 206 that, when it is determined that a user input screen is displayed on the display 120, creates a guide index that lies within the imaging range of the imaging camera 60 and indicates the gesture input detection area; guide index supply means 208 that supplies the guide index created by the guide index creation means 206 to the guide presentation unit 80; gesture determination means 210 that determines a gesture based on the analysis result of the gesture analysis unit 100; and input execution means 212 that executes an input by the gesture based on the determination result from the gesture determination means 210.
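
  The functional division described for the guide index visualization program 200 could be organized roughly as in the sketch below. Every class name, method name, and interface here is an assumption for illustration; the patent only defines the means 202 to 212 functionally.

```python
from dataclasses import dataclass, field

@dataclass
class GuideIndexVisualizationProgram:
    """Structural sketch of program 200: screen display (202), determination
    (204), guide index creation (206), guide index supply (208), gesture
    determination (210), and input execution (212)."""
    display: object            # display 120
    presenter: object          # guide presentation unit 80
    analyzer: object           # gesture analysis unit 100
    gesture_patterns: dict = field(default_factory=dict)

    def show_screen(self, screen):                   # screen display means 202
        self.display.show(screen)
        if self.is_user_input_screen(screen):        # determination means 204
            index = self.create_guide_index(screen)  # guide index creation means 206
            self.presenter.present(index)            # guide index supply means 208

    def is_user_input_screen(self, screen) -> bool:
        return getattr(screen, "accepts_input", False)

    def create_guide_index(self, screen):
        # Prefer a guidance index when the screen has associated gesture info.
        gesture_info = getattr(screen, "gesture_info", None)
        if gesture_info:
            return {"type": "guidance", "gesture": gesture_info}
        return {"type": "boundary"}

    def on_image(self, frame):                       # gesture determination 210 + execution 212
        gesture = self.analyzer.analyze(frame)
        if gesture in self.gesture_patterns:
            self.gesture_patterns[gesture]()         # input execution means 212
```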

  The screen display means 202 displays a user input screen that prompts the user for input in accordance with the execution of an application such as media playback or navigation. The user input screen may take any form, for example a screen prompting an input instruction, a selection instruction, a scroll instruction, a drag instruction, an enlargement or reduction, and so on. These instructions can be input from a touch panel or a mouse, and can also be input by gestures. For example, when a menu list is displayed as the user input screen, the user can select a desired item from the menu by a gesture.

  The determination unit 204 determines whether or not the content displayed by the screen display unit 202 is a user input screen, and provides the determination result to the guide index creation unit 206.

  The guide index creating means 206 creates a guide index that is visualized within the imaging range of the imaging camera 60 when a user input screen is displayed on the display 120.

  FIG. 4 shows examples of guide indices created by the guide index creation means. As shown in FIG. 4A, the guide index creation means 206 creates a guide index 192a indicating the boundary line at the outer edge of the detection area 190 in which a gesture input can be detected. When the guide presentation unit 80 is a projector that emits a projection image, the guide index created by the guide index creation means 206 is projected. Alternatively, when the guide presentation unit 80 is a laser mechanism that scans with laser light, the guide presentation unit 80 scans the laser light according to the guide index created by the guide index creation means 206. Thereby, a rectangular guide index 192a as shown in FIG. 4A is visualized. The guide index creation means 206 can also create a guide index 192b indicating the corners of the detection area 190 as shown in FIG. 4B, a guide index 192c indicating the center point of the detection area 190 as shown in FIG. 4C, a guide index 192d that divides the inside of the detection area 190 into a grid as shown in FIG. 4D, and the like. By presenting such a guide index, the position or range of the detection area 190 is visualized. In the example of FIG. 4, the detection area 190 is rectangular, but the detection area may be circular or have another shape.
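
  The guide index shapes of FIG. 4 (outer boundary, corner marks, center point, and grid) could be generated as simple point or line sets to be handed to a projector or laser scanner; the following is a sketch under assumed normalized coordinates, not the patented implementation.

```python
def boundary_index(w, h):
    """Guide index 192a: the four edges of a w x h detection area as point pairs."""
    c = [(0, 0), (w, 0), (w, h), (0, h)]
    return [(c[i], c[(i + 1) % 4]) for i in range(4)]

def corner_index(w, h, arm=0.1):
    """Guide index 192b: short L-shaped marks at each corner (arm = fraction of size)."""
    marks = []
    for cx, cy in [(0, 0), (w, 0), (w, h), (0, h)]:
        dx = arm * w if cx == 0 else -arm * w
        dy = arm * h if cy == 0 else -arm * h
        marks.append([((cx, cy), (cx + dx, cy)), ((cx, cy), (cx, cy + dy))])
    return marks

def center_index(w, h):
    """Guide index 192c: the center point of the detection area."""
    return (w / 2, h / 2)

def grid_index(w, h, nx=3, ny=3):
    """Guide index 192d: lines dividing the area into an nx x ny grid."""
    vertical = [((i * w / nx, 0), (i * w / nx, h)) for i in range(1, nx)]
    horizontal = [((0, j * h / ny), (w, j * h / ny)) for j in range(1, ny)]
    return vertical + horizontal
```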

  Further, the guide index creation means 206 can create a guidance index 194 (see FIG. 2) that guides the form of the gesture requested by the user input screen. The guidance index 194 is one of the guide indices 192 indicating the position or range where gesture input can be detected. In a preferred embodiment, gesture information indicating the form of the gesture is stored in the storage unit 140 in association with the user input screen. When it is determined that a user input screen is displayed by the screen display means 202, the guide index creation means 206 reads the gesture information corresponding to the user input screen from the storage unit 140 and creates a guidance index according to the gesture information. In the example of FIG. 2, the user input screen of the display 120 includes an input 122 for instructing scrolling and a selection button 124 for selecting an item, and gesture information associated with the scroll input 122 is stored in the storage unit 140. The guide index creation means 206 then creates a guidance index 194 according to this gesture information and causes the guide presentation unit 80 to display it. Note that the form of the gesture requested on the user input screen may be set arbitrarily by the user.
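
  One way such screen-associated gesture information might be stored and turned into a guidance index 194 is sketched below: a lookup table keyed by the user input screen selects a gesture form, which in turn selects a path to draw inside the detection area. The table contents, screen identifiers, and path shapes are hypothetical.

```python
import math

# Hypothetical gesture information keyed by user input screen, as might be
# held in storage unit 140; the entries are illustrative only.
GESTURE_INFO = {
    "destination_list": "swipe_vertical",   # e.g. the scroll input 122
    "map_view": "circle",
}

# Guidance paths (normalized coordinates inside the detection area) for each
# gesture form, corresponding to guidance indices such as 194a and 194b.
GUIDANCE_PATHS = {
    "swipe_vertical": [(0.5, 0.2), (0.5, 0.8)],
    "swipe_horizontal": [(0.2, 0.5), (0.8, 0.5)],
    "circle": [(0.5 + 0.3 * math.cos(2 * math.pi * k / 16),
                0.5 + 0.3 * math.sin(2 * math.pi * k / 16)) for k in range(17)],
}

def guidance_index_for(screen_id):
    """Return the guidance path for a screen, or None when the screen has no
    associated gesture information (in which case only the area guide index
    192 would be presented)."""
    gesture = GESTURE_INFO.get(screen_id)
    return GUIDANCE_PATHS.get(gesture) if gesture else None

print(guidance_index_for("destination_list"))  # vertical swipe path
print(guidance_index_for("unknown_screen"))    # None -> fall back to guide index 192
```

  Keeping the gesture information in a table separate from the screen definitions, as in this sketch, would also fit the point above that the form of the requested gesture may be set arbitrarily by the user.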

  FIG. 5 shows examples of guidance indices created by the guide index creation means. According to the acquired gesture information, the guide index creation means 206 can create a guidance index 194a that guides a gesture in the vertical or horizontal direction, as shown in FIG. 5A. The guide index creation means 206 is not limited to this; based on the acquired gesture information it can create various other guidance indices, such as a guidance index 194b requesting a circular gesture, a guidance index 194c requesting an L-shaped gesture, and a guidance index 194d, as shown in FIG. 5.

  The guide index supply means 208 provides the guide index created by the guide index creation means 206 to the guide presentation unit 80. The guide presentation unit 80 thereby outputs the guide index by laser beam scanning or as a projection image. As a result, the guide index can be visualized in the range where a gesture input can be detected within the imaging range captured by the imaging camera 60. When a plurality of guidance indices are presented, they may be presented simultaneously so as to overlap one another (FIG. 5), or each guidance index may be presented individually.

  The gesture determination unit 210 receives the gesture image analyzed by the gesture analysis unit 100 and determines a gesture from the gesture image. The determination result is provided to the input execution means 212.

  The input execution unit 212 executes a corresponding input operation based on the determination result of the gesture determination unit 210. For example, the input execution unit 212 selects a specific item from the menu screen, scrolls the road map, or plays the next music.

  Next, the control operation for the gesture input guide index in the in-vehicle device of the present embodiment will be described with reference to the flowchart of FIG. 6. It is assumed that the navigation function is activated in the in-vehicle device 10. When the navigation function is executed, the screen display means 202 displays a navigation screen on the display 120. During this operation, the screen display means 202 displays a user input screen that receives input from the user, such as a destination search screen or a facility search screen (S101).

  When the determination unit 204 determines that the user input screen is displayed, the imaging camera 60 starts imaging the detection area 190 as shown in FIG. 2 in order to enable gesture input (S102). However, the imaging camera 60 does not necessarily have to be interlocked with the display of the user input screen, and may start imaging when the engine is started or when the in-vehicle device 10 is started.

  When the user input screen is displayed, the guide index creation means 206 determines whether or not the user input screen displayed on the display 120 has associated gesture information (S103). If it is determined that there is gesture information, the guide index creation means 206 acquires the gesture information and creates a guidance index 194 as shown in FIG. 5 (S104). By determining the presence or absence of gesture information, the guide index creation means 206 can give priority to creating the guidance index 194.

  If it is determined that there is no gesture information, the user input screen does not hold gesture information, so the guide index creation means 206 creates a guide index 192 indicating the range of the detection area 190 in which a gesture input can be detected, as shown in FIG. 4 (S106).

  When the guide index has been created, the guide index supply means 208 supplies it to the guide presentation unit 80. If it is determined that there is gesture information, a guidance index is supplied to the guide presentation unit 80 together with the guide index indicating the range of the gesture input detection area. When it receives the guide index, the guide presentation unit 80 presents the guide index 192 and/or the guidance index 194 indicating the gesture input detection area 190 to the user as shown in FIG. 2 (S107). The guide index 192 and the guidance index 194 visualize the position and range of the detection area 190, so the user can easily recognize visually the area in which the gesture should be performed, and gesture input outside the detection area 190 is prevented.

  Further, the guide index supply means 208 can present only the guidance index 194 in accordance with preset conditions. The guidance index 194 indicates the gesture input detection area and also indicates the input operation requested by the currently displayed user input screen, so the user can accurately recognize and enter the corresponding gesture by following the visualized guidance index 194.

  Next, the user performs gesture input in the detection area 190 in response to the visualized guide index. This gesture input is imaged by the imaging camera 60, and after gesture analysis is performed by the gesture analysis unit 100, a gesture is determined by the gesture determination unit 210 (S108). Preferably, gesture determination means 210 determines the type of gesture using a collation pattern prepared in advance.

  The input execution means 212 executes the input indicated by the gesture based on the determination result of the gesture determination means 210; for example, the display area of the road map is moved or enlarged/reduced, the destination candidate list is scrolled, or the voice guidance volume is adjusted (S109).
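
  Putting the steps of FIG. 6 together, the control flow from displaying the user input screen (S101) to executing the gesture input (S109) might look roughly like the following sketch; the helper objects and their method names are assumed, not taken from the patent.

```python
def guide_presentation_control(screen, screen_display, camera, creator,
                               presenter, analyzer, determiner, executor):
    """Illustrative walk through steps S101-S109 of the FIG. 6 flowchart."""
    screen_display.show(screen)                          # S101: display the user input screen
    camera.start()                                       # S102: start imaging the detection area

    if creator.has_gesture_info(screen):                 # S103: associated gesture info?
        index = creator.create_guidance_index(screen)    # S104: create guidance index 194
    else:
        index = creator.create_area_index()              # S106: create guide index 192 only

    presenter.present(index)                             # S107: visualize the index in the detection area

    frame = camera.capture()                             # the user gestures inside the visualized area
    gesture = determiner.determine(analyzer.analyze(frame))  # S108: analyze and determine the gesture
    executor.execute(gesture)                            # S109: scroll the list, zoom the map, adjust volume
```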

  In the above embodiment, the guide presentation unit 80 visualizes the guide index using optical means such as laser light. However, a screen may also be provided so that the guide presentation unit 80 can visualize the guide index more clearly. For example, a fog screen can be used so as not to obstruct the gesture input: the laser light is scattered by the fog screen, and the guide index is displayed more clearly. The thickness of the fog screen, like the output of the laser light, can be set as appropriate by the user, so that the sharpness of the guide index can be adjusted.

  As described above, according to the present embodiment, by visualizing the guide index indicating the detection area of the gesture input by the imaging camera together with the guidance index, the user can easily recognize the area in which the gesture should be input and the input method, and the convenience of gesture input is improved.

  Although preferred embodiments of the present invention have been described in detail, the present invention is not limited to the specific embodiments, and various modifications and changes are possible within the scope of the gist of the invention described in the claims.

10: In-vehicle device 20: Multimedia playback unit 40: Navigation unit 60: Imaging camera 80: Guide presentation unit 100: Gesture analysis unit 120: Display 140: Storage unit 160: Input unit 180: Control unit 200: Guide index visualization program 202 : Screen display means 204: Determination means 206: Guide index creation means 208: Guide index supply means 210: Gesture determination means 212: Input execution means

Claims (6)

  1. An electronic device mounted on a vehicle and capable of detecting an input by a gesture, comprising:
    a display having a touch panel function;
    detection means capable of detecting a gesture input by a user in the vehicle by imaging means;
    visualization means for visualizing, when a user input screen is displayed on the display, at least a range in which a gesture input can be detected within a detection area of the detection means, the detection area being an area different from the display area of the display,
    wherein the visualization means includes laser light or LED light for visualizing the detectable range and is arranged at a position different from the display.
  2. The electronic device according to claim 1, wherein the visualization unit is disposed on a ceiling of the vehicle.
  3. The electronic device according to claim 1, wherein the visualization unit visualizes a center point of a range in which a gesture input can be detected.
  4. The electronic device according to claim 1, wherein the visualization unit visualizes a guidance indicator that guides a gesture required for the user input screen.
  5. The electronic device according to any one of claims 1 to 4, wherein the visualization means includes a swing mechanism that swings the emission direction of the laser light, and a gesture required for the user input screen is indicated by a guidance index swung by the swing mechanism.
  6. The electronic device according to any one of claims 1 to 5, further comprising analysis means for analyzing the gesture based on a detection result detected by the detection means, and execution means for executing the gesture input based on a result of the analysis.
JP2013208171A 2013-10-03 2013-10-03 Electronic device, gesture input method, and program Active JP6184827B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2013208171A JP6184827B2 (en) 2013-10-03 2013-10-03 Electronic device, gesture input method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2013208171A JP6184827B2 (en) 2013-10-03 2013-10-03 Electronic device, gesture input method, and program

Publications (2)

Publication Number Publication Date
JP2015072609A JP2015072609A (en) 2015-04-16
JP6184827B2 true JP6184827B2 (en) 2017-08-23

Family

ID=53014928

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2013208171A Active JP6184827B2 (en) 2013-10-03 2013-10-03 Electronic device, gesture input method, and program

Country Status (1)

Country Link
JP (1) JP6184827B2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019003304A1 (en) * 2017-06-27 2019-01-03 マクセル株式会社 Projection image display system
JP2019093529A (en) * 2017-11-27 2019-06-20 アズビル株式会社 Pointer device

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001216069A (en) * 2000-02-01 2001-08-10 Toshiba Corp Operation inputting device and direction detecting method
JP2003280641A (en) * 2002-03-22 2003-10-02 Kawai Musical Instr Mfg Co Ltd Fingering guide device for musical instrument
WO2006038577A1 (en) * 2004-10-05 2006-04-13 Nikon Corporation Electronic device
JP4747810B2 (en) * 2005-11-30 2011-08-17 トヨタ自動車株式会社 In-vehicle remote control device
US20090189858A1 (en) * 2008-01-30 2009-07-30 Jeff Lev Gesture Identification Using A Structured Light Pattern
JP5392900B2 (en) * 2009-03-03 2014-01-22 現代自動車株式会社 In-vehicle device operation device
JP2013257686A (en) * 2012-06-12 2013-12-26 Sony Corp Projection type image display apparatus, image projecting method, and computer program

Also Published As

Publication number Publication date
JP2015072609A (en) 2015-04-16


Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20160804

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20170412

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20170418

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20170608

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20170725

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20170726

R150 Certificate of patent or registration of utility model

Ref document number: 6184827

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150