JP2010146506A - Input device, method for controlling input device, program for controlling input device, computer-readable recording medium, and information terminal device


Info

Publication number
JP2010146506A
Authority
JP
Japan
Prior art keywords
image
display
finger
light sensor
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2008326274A
Other languages
Japanese (ja)
Other versions
JP2010146506A5 (en)
Inventor
Kazuhiro Ishida
一宏 石田
Original Assignee
Sharp Corp
シャープ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sharp Corp (シャープ株式会社)
Priority to JP2008326274A
Publication of JP2010146506A
Publication of JP2010146506A5
Status: Pending

Abstract

Provided is an input device that is convenient and easy to operate for a user.
A detection determination unit 810 determines whether the first display / light sensor unit 300A and the second display / light sensor unit 300B have both detected a finger image. When the detection determination unit 810 determines that both units have detected a finger image, an input detection unit 830 detects the movement of the finger images detected by the first display / light sensor unit 300A and the second display / light sensor unit 300B and generates an input signal associated with that movement. By having the first display / light sensor unit 300A and the second display / light sensor unit 300B detect a wide variety of finger-image movements, the user can perform varied, three-dimensional operations that could not be realized with conventional input devices, and such operation provides excellent convenience and operability.
[Selection] Figure 1

Description

  The present invention relates to an input device that accepts input from two input surfaces, a method for controlling the input device, a program for controlling the input device, a computer-readable recording medium, and an information terminal device.

  In recent years, touch-panel input devices have come into wide use in mobile devices (portable terminals) such as mobile phones and PDAs. As one example, an input device is known that performs input according to the position of the touching finger when the input surface changes from a state in which no finger is in contact to a state in which a finger is in contact (for example, Patent Document 1).

In addition, input devices have been proposed that not only accept simple touch-panel input but also allow more convenient input depending on the application. For example, in a portable terminal device, the user can move an icon to a desired position on the display screen by touching the icon and then moving the finger in the direction in which the icon is to be moved. A folder can also be opened by double-clicking the folder on the display screen with a finger (for example, Patent Document 2).
Patent Document 1: JP 2008-141519 A (published June 19, 2008)
Patent Document 2: JP 11-39093 A (published February 12, 1999)

  However, each of these conventional input devices uses a method of inputting from one input surface (hereinafter referred to as a single-sided input method); an input device using a method of inputting from two input surfaces arranged on the front and back (hereinafter referred to as a double-sided input method) has not existed in the first place.

  Therefore, with the conventional single-sided input devices, the user cannot experience the varied, three-dimensional operations that a double-sided input device realizes, nor the excellent convenience and operability that such operations provide.

  Further, the conventional input devices have the following problem. With a single-sided input device, the possibility that an operation is performed unintentionally, without the user's knowledge, cannot be excluded. Specifically, the user's finger may unintentionally touch an icon on the touch panel and drag the icon somewhere else on the display screen. If the drag destination is a trash folder, the icon is deleted without the user's knowledge. In this way, the conventional input devices also have an inherent problem caused by inputting from a single input surface.

  The present invention has been made in view of the above problems, and an object of the present invention is to provide an input device that is convenient and easy for the user to operate, a method for controlling the input device, a program for controlling the input device, a computer-readable recording medium, and an information terminal device.

  In order to solve the above problems, an input device according to the present invention is a plate-like input device in which planar members for detecting nearby images are arranged on both the front and back surfaces, and includes detection determination means for determining whether the planar members on the front and back surfaces have both detected a finger image, and input detection means for, when the detection determination means determines that the planar members on the front and back surfaces have both detected a finger image, detecting the movement of the finger images detected by the respective planar members and generating an input signal associated with that movement.

  An input device control method according to the present invention is a method for controlling a plate-like input device in which planar members for detecting nearby images are arranged on both the front and back surfaces, and includes a detection determination step of determining whether the planar members on the front and back surfaces have both detected a finger image, and an input detection step of, when it is determined in the detection determination step that the planar members on the front and back surfaces have both detected a finger image, detecting the movement of the finger images detected by the respective planar members and generating an input signal associated with that movement.

  Thus, in the input device and the input device control method according to the present invention, when the planar members for detecting nearby images arranged on the front and back surfaces have both detected a finger image, the input detection means generates an input signal associated with the movement of the finger images. The input signal is a signal associated with the movement of the finger images detected by the planar members on both the front and back surfaces.

  Therefore, by having the planar members on the front and back surfaces detect a variety of finger-image movements, varied and three-dimensional operations that cannot be realized with conventional input devices become possible, providing the user with convenience and operability. In addition, since operations are performed as if pinching the screen between the fingers from the front and back of the plate-like input device, the device is very easy for the user to use.

  In addition, if only one of the planar members on the front and back surfaces detects a finger image, no input signal associated with the movement of the finger image is generated. Therefore, even if the user's finger unintentionally touches an icon on the touch panel, a situation in which the finger image is detected, an input signal is generated, and some processing is executed can be avoided. The risk of accidentally touching the input surface and operating the device unintentionally is thus reduced.
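  As a minimal sketch of the gating rule just described (not the patent's actual implementation; the structure and function names are assumptions chosen for illustration), an input signal is considered only when both surface members report a finger image:

#include <stdbool.h>

/* Detection result and finger-image movement on one surface member (names assumed). */
typedef struct { bool finger_detected; int dx, dy; } SurfaceState;

/* An input signal may be generated only when BOTH the front and the back
   surface members have detected a finger image; a touch on one side alone
   (for example, an accidental contact with an icon) is ignored. */
bool should_generate_input_signal(const SurfaceState *front, const SurfaceState *back)
{
    return front->finger_detected && back->finger_detected;
}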

  Thus, the input device and the control method for the input device according to the present invention can provide the user with excellent operability and convenience.

  Furthermore, the input device according to the present invention may, in the above configuration, further include operation control means for transmitting, to an operation execution device, the command associated with the input signal, on the basis of finger-command correspondence information in which input signals are associated with commands for causing the operation execution device to execute predetermined operations.

  According to the above configuration, in the input device according to the present invention, the input detection means detects the movement of the finger images on the planar members on the front and back surfaces and generates an input signal corresponding to that movement. Then, based on the finger-command correspondence information in which input signals are associated with commands for causing the operation execution device to execute predetermined operations, the command associated with the generated input signal is transmitted to the operation execution device, and the operation execution device executes the predetermined operation in accordance with the command.

  Therefore, when the planar members on both the front and back surfaces detect the movements of various finger images, the operation executing device can execute the operation desired by the user.
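  The finger-command correspondence information can be pictured as a small lookup table from input signals to commands for the operation execution device. The following is only an illustrative sketch; the signal and command names are assumptions, not terms defined by the patent:

#include <stddef.h>

typedef enum { SIG_PINCH_SPREAD, SIG_PINCH_SLIDE, SIG_PINCH_ROTATE } InputSignal;
typedef enum { CMD_FAN_OUT_IMAGES, CMD_MOVE_IMAGE, CMD_ROTATE_IMAGE } OperationCommand;

/* Finger-command correspondence information: input signal -> command. */
static const struct { InputSignal signal; OperationCommand command; } kFingerCommandTable[] = {
    { SIG_PINCH_SPREAD, CMD_FAN_OUT_IMAGES },
    { SIG_PINCH_SLIDE,  CMD_MOVE_IMAGE     },
    { SIG_PINCH_ROTATE, CMD_ROTATE_IMAGE   },
};

/* Returns 0 and sets *out when a command is associated with the signal. */
int command_for_signal(InputSignal sig, OperationCommand *out)
{
    for (size_t i = 0; i < sizeof kFingerCommandTable / sizeof kFingerCommandTable[0]; i++) {
        if (kFingerCommandTable[i].signal == sig) {
            *out = kFingerCommandTable[i].command;
            return 0;
        }
    }
    return -1;  /* no command associated with this input signal */
}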

  Note that there may be one or more operation execution devices; an operation execution device may be integrated with the input device, or may be located at a position physically separated from the input device.

  Furthermore, in the input device according to the present invention, in the above configuration, the input detection means may generate the input signal when a finger image is detected in a predetermined region on each of the planar members on the front and back surfaces.

  According to the above configuration, in the input device according to the present invention, the input detection means generates an input signal when a finger image is detected in the predetermined region on the planar members on the front and back surfaces, and generates no input signal when a finger touches an area other than the predetermined region on those planar members. In this way, the input detection means can limit the operation area on the planar members on the front and back surfaces.

  Therefore, even if the user touches an area other than the predetermined region on the planar members on the front and back surfaces, a situation in which an input signal is generated and the input surface is operated by mistake can be avoided. In addition, by setting the predetermined region to an area that is easy for the user to operate, the input device according to the present invention can provide the user with even better operability and convenience.
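  The region restriction described above amounts to a simple containment test before any input signal is produced; the following sketch uses assumed names and is not the patent's own code:

/* Predetermined operation area on one surface member (coordinates assumed
   to be in that panel's pixel coordinate system). */
typedef struct { int left, top, right, bottom; } OperationArea;

/* Returns 1 only when the detected finger-image position lies inside the
   predetermined area; touches outside the area produce no input signal. */
int finger_in_operation_area(int x, int y, const OperationArea *area)
{
    return x >= area->left && x <= area->right &&
           y >= area->top  && y <= area->bottom;
}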

  Furthermore, the input device according to the present invention may further include image display control means for displaying an image in the predetermined region on the planar members on the front and back surfaces, and the input detection means may generate the input signal associated with that image.

  According to the above configuration, in the input device according to the present invention, the image display control means displays an image in the predetermined region. When a finger image is detected in the predetermined region, the input detection means generates an input signal associated with the displayed image.

  In other words, by displaying an image, the image display control means clearly indicates to the user the range that can be operated (the operation target), and the user can cause the operation execution device to execute a predetermined operation by pinching the image with the fingers from the front and back.

  On the other hand, the input detection means does not generate an input signal even when a finger touches an area other than the area where the image is displayed on the planar members on the front and back surfaces. Therefore, a situation in which a region other than the predetermined region where the image is displayed is accidentally touched and an unintended operation is performed can be avoided.

  Furthermore, in the input device according to the present invention, in the above configuration, the image display control means may cause the planar members to display an image based on the result of the predetermined operation executed by the operation execution device.

  According to the above configuration, in the input device according to the present invention, the image display control means causes the planar member to display an image based on the result of the predetermined operation executed by the operation execution device.

  Therefore, the image display control means can display on the screen the result of the operation executed in association with the movement of the user's fingers, allowing the user to confirm that result.

  In addition, when the user wishes to apply a further operation to the image after the operation, the desired finger-image movement need only be detected by the planar members on the front and back surfaces. That is, the user can check the operation result on the screen and then continuously perform the next operation.

  The input device may be realized by a computer. In that case, an input device control program that causes a computer to operate as each of the above means, thereby realizing the input device on the computer, and a computer-readable recording medium on which the program is recorded also fall within the scope of the present invention. An information terminal device provided with the input device also falls within the scope of the present invention.

  As described above, the input device according to the present invention includes detection determination means for determining whether the planar members on the front and back surfaces have both detected a finger image, and input detection means for, when the detection determination means determines that the planar members on the front and back surfaces have both detected a finger image, detecting the movement of the finger images detected by the respective planar members and generating an input signal associated with that movement.

  In addition, as described above, the input device control method according to the present invention includes a detection determination step of determining whether the planar members on the front and back surfaces have both detected a finger image, and an input detection step of, when it is determined in the detection determination step that the planar members on the front and back surfaces have both detected a finger image, detecting the movement of the finger images detected by the respective planar members and generating an input signal associated with that movement.

  Therefore, an input device that accepts input from two input surfaces and offers the user excellent operability and convenience, and a method for controlling such an input device, can be provided.

  An embodiment of the present invention will be described below with reference to the drawings.

  Generally speaking, the data display / sensor device 100 (input device) according to the present embodiment includes a first display / light sensor unit 300A and a second display / light sensor unit 300B (planar members on the front and back surfaces) that detect nearby images, a detection determination unit 810 (detection determination means) that determines whether the first display / light sensor unit 300A and the second display / light sensor unit 300B have both detected a finger image, and an input detection unit 830 (input detection means) that, when the detection determination unit 810 determines that the first display / light sensor unit 300A and the second display / light sensor unit 300B have both detected a finger image, detects the movement of the finger images detected by the first display / light sensor unit 300A and the second display / light sensor unit 300B and generates an input signal associated with that movement.

  First, an outline of the sensor built-in liquid crystal panel 301 included in the data display / sensor device 100 will be described.

(Outline of LCD panel with built-in sensor)
The sensor built-in liquid crystal panel 301 included in the data display / sensor device 100 is a liquid crystal panel capable of detecting the image of an object in addition to displaying data. Here, detecting the image of an object means, for example, detecting the position pointed to (touched) by the user with a finger or a pen, or reading (scanning) the image of a printed material. The device used for display is not limited to a liquid crystal panel and may be an organic EL (Electro Luminescence) panel or the like.

  The structure of the sensor built-in liquid crystal panel 301 will be described with reference to FIG. FIG. 2 is a diagram schematically showing a cross section of the sensor built-in liquid crystal panel 301. The sensor built-in liquid crystal panel 301 described here is an example, and any structure can be used as long as the display surface and the reading surface are shared.

  As shown in the figure, the sensor built-in liquid crystal panel 301 includes an active matrix substrate 51A disposed on the back-surface side and a counter substrate 51B disposed on the front-surface side, with a liquid crystal layer 52 sandwiched between these substrates. The active matrix substrate 51A is provided with pixel electrodes 56, data signal lines 57, optical sensor circuits 32 (not shown), an alignment film 58, a polarizing plate 59, and the like. The counter substrate 51B is provided with color filters 53r (red), 53g (green), and 53b (blue), a light-shielding film 54, a counter electrode 55, an alignment film 58, a polarizing plate 59, and the like. In addition, a backlight 307 is provided on the back surface of the sensor built-in liquid crystal panel 301.

  The photodiode 6 included in the photosensor circuit 32 is provided in the vicinity of the pixel electrode 56 provided with the blue color filter 53b, but is not limited to this configuration. It may be provided in the vicinity of the pixel electrode 56 provided with the red color filter 53r, or may be provided in the vicinity of the pixel electrode 56 provided with the green color filter 53g.

  Next, with reference to FIGS. 3A and 3B, two types of methods for detecting the position where the user touches the sensor built-in liquid crystal panel 301 with a finger or a pen will be described.

  FIG. 3A is a schematic diagram illustrating a state in which a position touched by the user is detected by detecting a reflected image. When the light 63 is emitted from the backlight 307, the optical sensor circuit 32 including the photodiode 6 detects the light 63 reflected by the object 64 such as a finger. Thereby, the reflected image of the target object 64 can be detected. Thus, the sensor built-in liquid crystal panel 301 can detect the touched position by detecting the reflected image.

  FIG. 3B is a schematic diagram illustrating a state in which a position touched by the user is detected by detecting a shadow image. As shown in FIG. 3B, the optical sensor circuit 32 including the photodiode 6 detects external light 61 transmitted through the counter substrate 51B and the like. However, when there is an object 62 such as a pen, the incident of the external light 61 is hindered, so that the amount of light detected by the optical sensor circuit 32 is reduced. Thereby, a shadow image of the object 62 can be detected. Thus, the sensor built-in liquid crystal panel 301 can also detect a touched position by detecting a shadow image.

  As described above, the photodiode 6 may detect the reflected image formed by light emitted from the backlight 307, or may detect a shadow image caused by external light. The two detection methods may also be used in combination so that a shadow image and a reflected image are detected at the same time.

(Data display / sensor configuration)
Next, the configuration of the main part of the data display / sensor device 100 will be described with reference to FIG. 4. FIG. 4 is a block diagram showing the main configuration of the data display / sensor device 100. As illustrated, the data display / sensor device 100 includes one or more display / light sensor units 300, a circuit control unit 600, a data processing unit 700, a main control unit 800, a storage unit 901, a primary storage unit 902, an operation unit 903, an external communication unit 907, an audio output unit 908, and an audio input unit 909. Here, the data display / sensor device 100 is described as including two display / light sensor units 300 (a first display / light sensor unit 300A and a second display / light sensor unit 300B). When the first display / light sensor unit 300A and the second display / light sensor unit 300B are not distinguished, they are simply referred to as the display / light sensor unit 300.

  The display / light sensor unit 300 is a so-called light-sensor-built-in liquid crystal display device. The display / light sensor unit 300 includes the sensor built-in liquid crystal panel 301, the backlight 307, and a peripheral circuit 309 for driving them.

  The sensor built-in liquid crystal panel 301 includes a plurality of pixel circuits 31 and photosensor circuits 32 arranged in a matrix. The detailed configuration of the sensor built-in liquid crystal panel 301 will be described later.

  The peripheral circuit 309 includes a liquid crystal panel drive circuit 304, an optical sensor drive circuit 305, a signal conversion circuit 306, and a backlight drive circuit 308.

  The liquid crystal panel driving circuit 304 is a circuit that drives the pixel circuits 31 by outputting a control signal (G) and a data signal (S) in accordance with the timing control signal (TC1) and the data signal (D) from the display control unit 601 of the circuit control unit 600. Details of the driving method of the pixel circuits 31 will be described later.

  The optical sensor driving circuit 305 is a circuit that drives the optical sensor circuit 32 by applying a voltage to the signal line (R) in accordance with a timing control signal (TC2) from the sensor control unit 602 of the circuit control unit 600. Details of the driving method of the optical sensor circuit 32 will be described later.

  The signal conversion circuit 306 is a circuit that converts the sensor output signal (SS) output from the optical sensor circuit 32 into a digital signal (DS) and transmits the converted signal to the sensor control unit 602.

  The backlight 307 includes a plurality of white LEDs (Light Emitting Diodes) and is disposed on the back surface of the sensor built-in liquid crystal panel 301. When a power supply voltage is applied from the backlight drive circuit 308, the backlight 307 is turned on and irradiates the sensor built-in liquid crystal panel 301 with light. Note that the backlight 307 is not limited to white LEDs, and may include LEDs of other colors. The backlight 307 may include, for example, a cold cathode fluorescent lamp (CCFL) instead of the LED.

  The backlight driving circuit 308 applies a power supply voltage to the backlight 307 when the control signal (BK) from the backlight control unit 603 of the circuit control unit 600 is at a high level; conversely, when the control signal from the backlight control unit 603 is at a low level, it does not apply the power supply voltage to the backlight 307.

  Next, the circuit control unit 600 will be described. The circuit control unit 600 has a function as a device driver that controls the peripheral circuit 309 of the display / light sensor unit 300. The circuit control unit 600 includes a display control unit 601, a sensor control unit 602, a backlight control unit 603, and a display data storage unit 604.

  The display control unit 601 receives display data from the display data processing unit 701 of the data processing unit 700, transmits a timing control signal (TC1) and a data signal (D) to the liquid crystal panel driving circuit 304 of the display / light sensor unit 300 in accordance with instructions from the display data processing unit 701, and causes the received display data to be displayed on the sensor built-in liquid crystal panel 301.

  The display control unit 601 temporarily stores the display data received from the display data processing unit 701 in the display data storage unit 604 and generates the data signal (D) based on the temporarily stored display data. The display data storage unit 604 is, for example, a VRAM (video random access memory).

  The sensor control unit 602 transmits a timing control signal (TC2) to the optical sensor driving circuit 305 of the display / light sensor unit 300 in accordance with instructions from the sensor data processing unit 703 of the data processing unit 700, and causes the sensor built-in liquid crystal panel 301 to execute a scan.

  In addition, the sensor control unit 602 receives a digital signal (DS) from the signal conversion circuit 306. Then, image data is generated based on the digital signal (DS) corresponding to the sensor output signal (SS) output from all the optical sensor circuits 32 included in the sensor built-in liquid crystal panel 301. That is, the image data read in the entire reading area of the sensor built-in liquid crystal panel 301 is generated. Then, the generated image data is transmitted to the sensor data processing unit 703.

  The backlight control unit 603 transmits a control signal (BK) to the backlight drive circuit 308 of the display / light sensor unit 300 in accordance with instructions from the display data processing unit 701 and the sensor data processing unit 703, thereby causing the backlight 307 to be driven.

  When the data display / sensor device 100 includes a plurality of display / light sensor units 300, the display control unit 601, upon receiving from the data processing unit 700 an instruction specifying which display / light sensor unit 300 is to display the display data, controls the liquid crystal panel drive circuit 304 of that display / light sensor unit 300 in accordance with the instruction. Similarly, when the sensor control unit 602 receives from the data processing unit 700 an instruction specifying which display / light sensor unit 300 is to scan the object, it controls the optical sensor drive circuit 305 of that display / light sensor unit 300 and receives the digital signal (DS) from the signal conversion circuit 306 of that display / light sensor unit 300 in accordance with the instruction.

  Next, the data processing unit 700 will be described. The data processing unit 700 has a function as middleware that gives an instruction to the circuit control unit 600 based on a “command” received from the main control unit 800. Details of the command will be described later.

  The data processing unit 700 includes a display data processing unit 701 and a sensor data processing unit 703. When the data processing unit 700 receives a command from the main control unit 800, at least one of the display data processing unit 701 and the sensor data processing unit 703 operates, depending on the values of the fields (described later) included in the received command.

  The display data processing unit 701 receives display data from the main control unit 800, gives instructions to the display control unit 601 and the backlight control unit 603 in accordance with the command received by the data processing unit 700, and causes the received display data to be displayed on the sensor built-in liquid crystal panel 301. The operation of the display data processing unit 701 according to the command will be described later.

  The sensor data processing unit 703 gives an instruction to the sensor control unit 602 and the backlight control unit 603 according to the command received by the data processing unit 700.

  The sensor data processing unit 703 receives image data from the sensor control unit 602 and stores it as it is in an image data buffer 704. Then, in accordance with the command received by the data processing unit 700, the sensor data processing unit 703 transmits to the main control unit 800 at least one of “whole image data”, “partial image data (including coordinate data)”, and “coordinate data”, based on the image data stored in the image data buffer 704. The whole image data, partial image data, and coordinate data will be described later. The operation of the sensor data processing unit 703 according to the command will also be described later.

  Next, the main control unit 800 will be described. The main control unit 800 executes application programs: it reads a program stored in the storage unit 901 into the primary storage unit 902, which is configured by, for example, a RAM (Random Access Memory), and executes it.

  An application program executed by the main control unit 800 transmits commands and display data to the data processing unit 700 in order to display the display data on the sensor built-in liquid crystal panel 301 or to scan an object placed on the sensor built-in liquid crystal panel 301. When a “data type” is designated in a command, the main control unit 800 receives at least one of whole image data, partial image data, and coordinate data from the data processing unit 700 as a response to the command.

  The circuit control unit 600, the data processing unit 700, and the main control unit 800 can each be configured by a CPU (Central Processing Unit), memory, and the like. The data processing unit 700 may also be configured by a circuit such as an ASIC (application-specific integrated circuit).

  Next, the storage unit 901 stores programs and data executed by the main control unit 800 as shown in the figure. The program executed by the main control unit 800 may be separated into an application-specific program and a general-purpose program that can be shared by each application.

  Next, the operation unit 903 receives an input operation of the user of the data display / sensor device 100. The operation unit 903 includes input devices such as a switch, a remote controller, a mouse, and a keyboard, for example. Then, the operation unit 903 generates a control signal corresponding to the user's input operation of the data display / sensor device 100, and transmits the generated control signal to the main control unit 800.

  As an example of the switch, a hardware switch such as a power switch 905 that switches power on and off and a user switch 906 to which a predetermined function is assigned in advance is assumed.

  In addition, the data display / sensor device 100 may include, as appropriate, an external communication unit 907 for communicating with external devices by wireless or wired communication, an audio output unit 908 such as a speaker for outputting audio, and an audio input unit 909 such as a microphone for inputting audio signals.

(Command details)
Next, details of commands transmitted from the main control unit 800 to the data processing unit 700 will be described with reference to FIGS. 5 and 6. FIG. 5 is a diagram schematically illustrating an example of a command frame structure. FIG. 6 is a diagram for explaining an example of values that can be specified for each field included in the command and an outline thereof.

  As shown in FIG. 5, a command includes the fields “header”, “data acquisition timing”, “data type”, “scan method”, “scan image gradation”, “scan resolution”, “scan panel”, “display panel”, and “reserved”. In each field, the values shown in FIG. 6, for example, can be designated.
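  For illustration only, the command frame described above might be represented in software roughly as follows; the struct and field names are assumptions, not the patent's actual frame layout, and the value comments follow the examples quoted from FIG. 6 below.

#include <stdint.h>

/* Sketch of a command frame sent from the main control unit 800 to the
   data processing unit 700.  One byte per field is an assumption. */
typedef struct {
    uint8_t header;             /* marks the start of the frame                     */
    uint8_t acquisition_timing; /* "00" sense / "01" event / "10" all               */
    uint8_t data_type;          /* "001" coordinates / "010" partial / "100" whole  */
    uint8_t scan_method;        /* "00" reflection / "01" transmission / "10" both  */
    uint8_t scan_gradation;     /* "00" binary / "01" multivalue                    */
    uint8_t scan_resolution;    /* "0" high / "1" low                               */
    uint8_t scan_panel;         /* "001" unit 300A / "010" unit 300B                */
    uint8_t display_panel;      /* "001" unit 300A / "010" unit 300B                */
    uint8_t reserved;           /* additional information when needed               */
} CommandFrame;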

  The “header” field is a field indicating the start of a frame. As long as it is possible to identify the “header” field, the value of the “header” field may be any value.

  Next, the “data acquisition timing” field is a field for designating a timing at which data should be transmitted to the main control unit 800. In the “data acquisition timing” field, for example, values “00” (sense), “01” (event), and “10” (all) can be specified.

  Here, “sense” designates that the latest data is to be transmitted immediately. Therefore, when the sensor data processing unit 703 receives a command whose “data acquisition timing” field is “sense”, it immediately transmits the latest data of the type specified in the “data type” field to the main control unit 800.

  “Event” designates transmission at the timing when a change occurs in the image data received from the sensor control unit 602. Therefore, when the sensor data processing unit 703 receives a command whose “data acquisition timing” field is “event”, it transmits the data specified in the “data type” field to the main control unit 800 at the timing when a change larger than a predetermined threshold occurs in the image data received from the sensor control unit 602.

  “All” designates transmission of data at a predetermined cycle. Therefore, when the sensor data processing unit 703 receives a command whose “data acquisition timing” field is “all”, it transmits the data specified in the “data type” field to the main control unit 800 at a predetermined cycle. The predetermined cycle coincides with the cycle at which the optical sensor circuit 32 performs scanning.
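  Taken together, the three timing values lead to a decision of the kind sketched below each time new scan data becomes available. This is only an illustration under assumed names; the helper send_latest_data_to_main_control is hypothetical.

typedef enum { TIMING_SENSE = 0, TIMING_EVENT = 1, TIMING_ALL = 2 } AcquisitionTiming;

void send_latest_data_to_main_control(void);  /* hypothetical transport helper */

/* Called after each scan; decides whether the new data is forwarded. */
void on_scan_data_ready(AcquisitionTiming timing, int change_magnitude, int change_threshold)
{
    switch (timing) {
    case TIMING_SENSE:                        /* answer the command immediately with the latest data */
        send_latest_data_to_main_control();
        break;
    case TIMING_EVENT:                        /* only when the image changed more than the threshold  */
        if (change_magnitude > change_threshold)
            send_latest_data_to_main_control();
        break;
    case TIMING_ALL:                          /* every scan cycle                                     */
        send_latest_data_to_main_control();
        break;
    }
}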

  Next, the “data type” field designates the type of data to be acquired from the sensor data processing unit 703. In the “data type” field, for example, the values “001” (coordinates), “010” (partial image), and “100” (whole image) can be designated. Furthermore, by adding these values, “coordinates” and “partial image” or “whole image” can be designated simultaneously; for example, to designate “coordinates” and “partial image” at the same time, “011” can be specified.
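  The “adding” of these values behaves like combining bit flags, as the following sketch (names assumed) illustrates:

#include <stdint.h>

enum {
    DATA_COORDINATES   = 0x1,  /* "001" */
    DATA_PARTIAL_IMAGE = 0x2,  /* "010" */
    DATA_WHOLE_IMAGE   = 0x4   /* "100" */
};

/* Request coordinate data and partial image data at the same time ("011"). */
static const uint8_t data_type_both = DATA_COORDINATES | DATA_PARTIAL_IMAGE;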

  When the sensor data processing unit 703 receives a command whose value of the “data type” field is “whole image”, the sensor data processing unit 703 transmits the image data itself stored in the image data buffer 704 to the main control unit 800. The image data itself stored in the image data buffer 704 is referred to as “whole image data”.

  In addition, when the sensor data processing unit 703 receives a command whose “data type” field is “partial image”, it extracts, from the image data received from the sensor control unit 602, a region including the portion where a change larger than a predetermined threshold has occurred, and transmits the image data of the extracted region to the main control unit 800. This image data is referred to as “partial image data”. When a plurality of pieces of partial image data are extracted, the sensor data processing unit 703 transmits each of the extracted pieces of partial image data to the main control unit 800.

  Further, when the sensor data processing unit 703 receives a command whose “data type” field is “partial image”, it also detects representative coordinates within the partial image data and transmits coordinate data indicating the position of the representative coordinates within the partial image data to the main control unit 800. The representative coordinates are, for example, the coordinates of the center of the partial image data or the coordinates of its center of gravity.
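  As a sketch of the two examples of representative coordinates mentioned above (the center of the partial image data and its center of gravity), assuming the partial image is held as a bounding rectangle plus a per-pixel change map:

typedef struct { int left, top, right, bottom; } PartialRect;

/* Representative coordinates as the center of the partial image data. */
void partial_image_center(const PartialRect *r, int *cx, int *cy)
{
    *cx = (r->left + r->right) / 2;
    *cy = (r->top + r->bottom) / 2;
}

/* Representative coordinates as the center of gravity: the average position
   of all pixels whose brightness change exceeded the threshold
   (changed[y * width + x] is 1 for such pixels, 0 otherwise). */
int partial_image_centroid(const unsigned char *changed, int width, int height,
                           int *cx, int *cy)
{
    long sx = 0, sy = 0, n = 0;
    for (int y = 0; y < height; y++)
        for (int x = 0; x < width; x++)
            if (changed[y * width + x]) { sx += x; sy += y; n++; }
    if (n == 0)
        return -1;                     /* no changed pixels: no partial image */
    *cx = (int)(sx / n);
    *cy = (int)(sy / n);
    return 0;
}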

  Next, when the sensor data processing unit 703 receives a command whose “data type” field is “coordinates”, it transmits coordinate data indicating the position of the representative coordinates within the whole image data to the main control unit 800. When a plurality of pieces of partial image data are extracted, the sensor data processing unit 703 detects the representative coordinates of each extracted piece of partial image data within the whole image data and transmits the coordinate data indicating each set of representative coordinates to the main control unit 800 (multi-point detection).

  Specific examples of the whole image data, the partial image data, and the coordinate data will be described later with reference to schematic diagrams.

  Next, the “scan method” field is a field for designating whether or not the backlight 307 is turned on at the time of executing the scan. In the “scan method” field, for example, values of “00” (reflection), “01” (transmission), and “10” (reflection / transmission) can be designated.

  “Reflection” designates that scanning is performed with the backlight 307 turned on. Therefore, when the sensor data processing unit 703 receives a command whose “scan method” field is “reflection”, it instructs the sensor control unit 602 and the backlight control unit 603 so that the optical sensor driving circuit 305 and the backlight driving circuit 308 operate in synchronization.

  “Transmission” designates that scanning is performed with the backlight 307 turned off. Therefore, when the sensor data processing unit 703 receives a command whose “scan method” field is “transmission”, it instructs the sensor control unit 602 and the backlight control unit 603 so that the optical sensor driving circuit 305 operates and the backlight driving circuit 308 does not operate. Note that “reflection/transmission” designates that scanning is performed using both “reflection” and “transmission”.

  Next, the “scanned image gradation” field is a field for designating gradations of the partial image data and the entire image data. In the “scanned image gradation” field, for example, values of “00” (binary) and “01” (multivalue) can be designated.

  When the sensor data processing unit 703 receives a command whose “scan image gradation” field value is “binary”, the sensor data processing unit 703 transmits the partial image data and the entire image data to the main control unit 800 as monochrome data. .

  When the sensor data processing unit 703 receives a command whose “scanned image gradation” field value is “multivalued”, the sensor data processing unit 703 transmits the partial image data and the entire image data to the main control unit 800 as multitone data. To do.

  Next, the “scan resolution” field designates the resolution of the partial image data and the whole image data. In the “scan resolution” field, for example, the values “0” (high) and “1” (low) can be designated.

  Here, “high” designates a high resolution. Therefore, when the sensor data processing unit 703 receives a command whose “scan resolution” field value is “high”, the sensor data processing unit 703 transmits the partial image data and the entire image data to the main control unit 800 with high resolution. For example, it is desirable to designate “high” for image data (image data such as a fingerprint) to be subjected to image processing such as image recognition.

  “Low” designates a low resolution. Therefore, when the sensor data processing unit 703 receives a command whose “scan resolution” field value is “low”, the sensor data processing unit 703 transmits the partial image data and the entire image data to the main control unit 800 at a low resolution. For example, it is desirable to designate “low” for image data (such as touched finger or hand image data) that only needs to be recognized.

  Next, the “scan panel” field is a field for designating which display / light sensor unit 300 is to scan the object. In the “scan panel” field, for example, values “001” (first display / light sensor unit 300A) and “010” (second display / light sensor unit 300B) can be designated. By adding these values, a plurality of display / light sensor units 300 can be specified at the same time. For example, when “first display / light sensor unit 300A” and “second display / light sensor unit 300B” are specified at the same time, “011” can be specified.

  When the sensor data processing unit 703 receives a command whose “scan panel” field is “first display / light sensor unit 300A”, it instructs the sensor control unit 602 and the backlight control unit 603 to control the optical sensor drive circuit 305 and the backlight drive circuit 308 of the first display / light sensor unit 300A.

  Next, the “display panel” field is a field for designating which display / light sensor unit 300 displays the display data. In the “display panel” field, for example, values of “001” (first display / light sensor unit 300A) and “010” (second display / light sensor unit 300B) can be designated. By adding these values, a plurality of display / light sensor units 300 can be specified at the same time. For example, when “first display / light sensor unit 300A” and “second display / light sensor unit 300B” are specified at the same time, “011” can be specified.

  Here, for example, when the display data processing unit 701 receives a command whose “display panel” field is “first display / light sensor unit 300A”, it instructs the display control unit 601 and the backlight control unit 603 to control the liquid crystal panel driving circuit 304 and the backlight driving circuit 308 of the first display / light sensor unit 300A so that the display data is displayed on the first display / light sensor unit 300A.

  Next, the “reserved” field is a field that is appropriately specified when it is necessary to further specify information other than information that can be specified in the above-described fields.

  Note that an application executed by the main control unit 800 does not need to use all the fields described above when transmitting a command; an invalid value (such as a NULL value) may be set in any field that is not used.

  When the coordinate data of a position touched with a finger or pen is to be acquired, a command designating “coordinates” in the “data type” field is transmitted to the data processing unit 700. Since the touched position needs to be followed as it changes, it is desirable to designate “all” in the “data acquisition timing” field of this command. Further, since it is only necessary to acquire the coordinate data of the touched position, the scanning accuracy need not be high, so “low” may be designated as the value of the “scan resolution” field.

  Further, when “coordinates” is designated in the “data type” field of the command, the coordinate data of each touched position can be acquired even when, for example, the user touches the sensor built-in liquid crystal panel 301 with a plurality of fingers or pens at the same time (multi-point detection).

  When acquiring image data of an object such as a document, a command designating “whole image” in the “data type” field is transmitted to the data processing unit 700. Since such a scan is usually performed with the object stationary, it does not need to be performed periodically; in this case it is therefore desirable to designate “sense” or “event” in the “data acquisition timing” field. When scanning an object such as a document, the scanning accuracy should be high so that the user can easily read the characters, so it is desirable to designate “high” in the “scan resolution” field.
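  Using the CommandFrame sketch shown earlier, the two recommended settings above might be filled in as follows (the field values follow the text; the encoding itself remains an assumption):

/* Tracking the coordinates of a touched position. */
CommandFrame touch_tracking = {
    .header             = 0x01,  /* any identifiable value                 */
    .acquisition_timing = 0x2,   /* "all": report every scan cycle         */
    .data_type          = 0x1,   /* "coordinates"                          */
    .scan_resolution    = 0x1,   /* "low": only the position is needed     */
    .scan_panel         = 0x1    /* first display / light sensor unit 300A */
};

/* Scanning a document placed on the panel. */
CommandFrame document_scan = {
    .header             = 0x01,
    .acquisition_timing = 0x0,   /* "sense": scan once, immediately        */
    .data_type          = 0x4,   /* "whole image"                          */
    .scan_resolution    = 0x0,   /* "high": characters must stay legible   */
    .scan_panel         = 0x1
};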

(Whole image data / Partial image data / Coordinate data)
Next, the whole image data, the partial image data, and the coordinate data will be described with reference to FIG. The image data shown in FIG. 7A is image data obtained as a result of scanning the entire sensor built-in liquid crystal panel 301 when the object is not placed on the sensor built-in liquid crystal panel 301. The image data shown in FIG. 7B is image data obtained as a result of scanning the entire sensor-equipped liquid crystal panel 301 when the user touches the sensor-equipped liquid crystal panel 301 with a finger.

  When the user touches the sensor built-in liquid crystal panel 301 with a finger, the amount of light received by the photosensor circuits 32 in the vicinity of the touched portion changes, so the voltage output by those photosensor circuits 32 changes; as a result, in the image data generated by the sensor control unit 602, the brightness of the pixel values in the portion touched by the user changes.

  In the image data shown in FIG. 7B, the brightness of the pixel values in the portion corresponding to the user's finger is higher than in the image data shown in FIG. 7A. In the image data shown in FIG. 7B, the smallest rectangular area (area PP) that includes all pixel values whose brightness has changed by more than a predetermined threshold is the “partial image data”.

  The image data indicated by the area AP is “whole image data”.

  The coordinate data of the representative coordinates Z of the partial image data (area PP) within the whole image data (area AP) is (Xa, Ya), and the coordinate data of the representative coordinates Z within the partial image data (area PP) is (Xp, Yp).
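  A sketch of how the partial image data (area PP) might be derived from the whole image data (area AP): find the smallest rectangle enclosing every pixel whose brightness changed by more than a threshold between the scan without an object and the scan with the finger touching. The function and type names are illustrative only.

#include <stdlib.h>

typedef struct { int left, top, right, bottom; } ImageArea;

/* before/after: whole image data without and with the finger touching.
   Returns 0 and fills *pp with the smallest rectangle (area PP) containing
   every pixel whose brightness changed by more than `threshold`;
   returns -1 when nothing changed (no partial image data). */
int extract_partial_image(const unsigned char *before, const unsigned char *after,
                          int width, int height, int threshold, ImageArea *pp)
{
    int found = 0;
    pp->left = width; pp->top = height; pp->right = -1; pp->bottom = -1;
    for (int y = 0; y < height; y++) {
        for (int x = 0; x < width; x++) {
            int diff = abs((int)after[y * width + x] - (int)before[y * width + x]);
            if (diff > threshold) {
                if (x < pp->left)   pp->left = x;
                if (x > pp->right)  pp->right = x;
                if (y < pp->top)    pp->top = y;
                if (y > pp->bottom) pp->bottom = y;
                found = 1;
            }
        }
    }
    return found ? 0 : -1;
}

  The representative coordinates Z could then be taken, for example, as the center of the returned rectangle, giving (Xa, Ya) in whole-image coordinates and (Xp, Yp) = (Xa - left, Ya - top) in partial-image coordinates.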

(Configuration of sensor built-in liquid crystal panel)
Next, the configuration of the sensor built-in liquid crystal panel 301 and the configuration of the peripheral circuit 309 of the sensor built-in liquid crystal panel 301 will be described with reference to FIG. FIG. 8 is a block diagram showing the main part of the display / light sensor unit 300, particularly the configuration of the sensor built-in liquid crystal panel 301 and the configuration of the peripheral circuit 309.

  The sensor built-in liquid crystal panel 301 includes a pixel circuit 31 for setting light transmittance (brightness) and an optical sensor circuit 32 that outputs a voltage corresponding to the intensity of light received by the sensor. Note that the pixel circuit 31 is a generic term for the R pixel circuit 31r, the G pixel circuit 31g, and the B pixel circuit 31b corresponding to the red, green, and blue color filters, respectively.

  The pixel circuits 31 are arranged on the sensor built-in liquid crystal panel 301 in a matrix, m in the column direction (vertical direction) and 3n in the row direction (horizontal direction). A set of an R pixel circuit 31r, a G pixel circuit 31g, and a B pixel circuit 31b is arranged consecutively in the row direction (horizontal direction), and this set forms one pixel.

  In order to set the light transmittance of a pixel circuit 31, first, a high-level voltage (a voltage that turns on the TFT 33) is applied to the scanning signal line Gi connected to the gate terminal of the TFT (Thin Film Transistor) 33 included in the pixel circuit 31. Thereafter, a predetermined voltage is applied to the data signal line SRj connected to the source terminal of the TFT 33 of the R pixel circuit 31r. The light transmittance is set similarly for the G pixel circuit 31g and the B pixel circuit 31b. By setting these light transmittances, an image is displayed on the sensor built-in liquid crystal panel 301.

  Next, the photosensor circuits 32 are arranged one per pixel. Alternatively, one photosensor circuit may be arranged in the vicinity of each of the R pixel circuit 31r, the G pixel circuit 31g, and the B pixel circuit 31b.

  In order for the optical sensor circuit 32 to output a voltage corresponding to the amount of received light, first, predetermined voltages are applied to the sensor readout line RWi connected to one electrode of the capacitor 35 and to the sensor reset line RSi connected to the anode terminal of the photodiode 36. In this state, when light is incident on the photodiode 36, a current corresponding to the amount of incident light flows through the photodiode 36, and the voltage at the connection point between the other electrode of the capacitor 35 and the cathode terminal of the photodiode 36 (hereinafter referred to as connection node V) decreases accordingly. When the power supply voltage VDD is applied to the voltage application line SDj connected to the drain terminal of the sensor preamplifier 37, the voltage at the connection node V is amplified and output from the source terminal of the sensor preamplifier 37 to the sensing data output line SPj. Based on this output voltage, the amount of light received by the optical sensor circuit 32 can be calculated.

  Next, the liquid crystal panel drive circuit 304, the optical sensor drive circuit 305, and the sensor output amplifier 44, which are peripheral circuits of the sensor built-in liquid crystal panel 301, will be described.

  The liquid crystal panel drive circuit 304 is a circuit for driving the pixel circuit 31, and includes a scanning signal line drive circuit 3041 and a data signal line drive circuit 3042.

  The scanning signal line driving circuit 3041 sequentially selects one scanning signal line from the scanning signal lines G1 to Gm for each line time based on the timing control signal TC1 received from the display control unit 601, and A high level voltage is applied to the selected scanning signal line, and a low level voltage is applied to the other scanning signal lines.

  Based on the display data D (DR, DG, and DB) received from the display control unit 601, the data signal line driver circuit 3042 applies, for each line time, predetermined voltages corresponding to one row of display data to the data signal lines SR1 to SRn, SG1 to SGn, and SB1 to SBn (line-sequential method). Note that the data signal line driver circuit 3042 may instead be driven by a dot-sequential method.

  The optical sensor driving circuit 305 is a circuit for driving the optical sensor circuits 32. Based on the timing control signal TC2 received from the sensor control unit 602, the optical sensor driving circuit 305 applies, for each line time, a predetermined readout voltage to the sensor readout signal line selected from the sensor readout signal lines RW1 to RWm, and applies a voltage other than the predetermined readout voltage to the other sensor readout signal lines. Similarly, based on the timing control signal TC2, it applies, for each line time, a predetermined reset voltage to the sensor reset signal line selected from the sensor reset signal lines RS1 to RSm, and applies a voltage other than the predetermined reset voltage to the other sensor reset signal lines.

The sensing data output signal lines SP1 to SPn are grouped into p groups (p is an integer from 1 to n), and the sensing data output signal lines belonging to each group are connected to the sensor output amplifier 44 via switches 47 that are sequentially turned on in a time-division manner. The sensor output amplifier 44 amplifies the voltage from the group of sensing data output signal lines connected by the switch 47 and outputs it to the signal conversion circuit 306 as the sensor output signals SS (SS1 to SSp).
[Embodiment 1]
The main feature of the present invention is that, in the data display / sensor device 100, the first display / light sensor unit 300A and the second display / light sensor unit 300B for detecting nearby images are arranged on the front and back surfaces, and the device includes the detection determination unit 810, which determines whether the first display / light sensor unit 300A and the second display / light sensor unit 300B have both detected a finger image, and the input detection unit 830, which, when the detection determination unit 810 determines that the first display / light sensor unit 300A and the second display / light sensor unit 300B have both detected a finger image, detects the movement of the finger images detected by the first display / light sensor unit 300A and the second display / light sensor unit 300B and generates an input signal.

  Thus, in the data display / sensor device 100, the input detection unit 830 generates an input signal when the first display / light sensor unit 300A and the second display / light sensor unit 300B arranged on the front and back surfaces have both detected a finger image.

  Therefore, if only one of the first display / light sensor unit 300A and the second display / light sensor unit 300B detects a finger image, no input signal is generated. By having the first display / light sensor unit 300A and the second display / light sensor unit 300B detect a variety of finger-image movements, the data display / sensor device 100 makes possible varied and three-dimensional operations that could not be realized with conventional input devices. Further, the data display / sensor device 100 provides the user with excellent convenience and operability through such operations.

Furthermore, with the data display / sensor device 100, a situation in which the user's finger unintentionally touches an icon on the touch panel and drags the icon somewhere else on the display screen can be avoided. Since an input signal is generated only by input from the two input surfaces of the first display / light sensor unit 300A and the second display / light sensor unit 300B, the risk of operating the first display / light sensor unit 300A or the second display / light sensor unit 300B by mistake is reduced.
(Description of data display / sensor device operation)
The data display / sensor device 100 according to the first embodiment will be described with reference to FIGS. 1, 9 to 18, and 28 to 30.

  First, specific examples of various operations performed by the data display / sensor device 100 will be described with reference to FIGS. Next, the configuration of the data display / sensor device 100 will be described with reference to FIG. Then, the flow of processing for realizing various operations will be described with reference to FIG.

  Here, the data display / sensor device 100 may display an image on either the first display / light sensor unit 300A or the second display / light sensor unit 300B, but in the following description of the present embodiment it is assumed that the image is displayed on the first display / light sensor unit 300A. In the drawings referred to below, the first display / light sensor unit 300A is located on the near side.

In addition, although the user may operate the data display / sensor device 100 with either the right hand or the left hand, in the present embodiment operation with the right hand is described.
(Operations performed on the display image)
The outline of FIGS. 9 to 18 will be described below.

  FIG. 9A shows a state in which a rectangular image 25a is displayed on the first display / light sensor unit 300A, the user touches the rectangular image 25a with the thumb 20 from the upper surface, and the index finger 21 touches the area on the second display / light sensor unit 300B where the reverse side of the rectangular image 25a would be displayed if it were shown there.

  Although the rectangular image 25a displayed on the first display / light sensor unit 300A is not actually touched by the index finger 21 from the lower surface, in the following description of the present embodiment this state is, for convenience of explanation, expressed as “touching with the thumb 20 from the upper surface and with the index finger 21 from the lower surface”. The same applies to the descriptions given with reference to the subsequent figures.

  FIG. 9B shows a state in which the thumb 20 and the index finger 21 have been moved in opposite directions (horizontally in the drawing, the thumb 20 to the right and the index finger 21 to the left), so that a plurality of rectangular images 25a are displayed on the first display / light sensor unit 300A spread out in a fan shape. That is, in FIG. 9A the plurality of rectangular images 25a are in a closed state, bundled like strips, whereas in FIG. 9B the plurality of rectangular images 25a are expanded into a fan shape.

  FIG. 10A shows a state in which a circular image 26a is displayed on the first display / light sensor unit 300A and the circular image 26a is touched with the thumb 20 from the top and the index finger 21 from the bottom. FIG. 10B shows a state in which the thumb 20 and the index finger 21 have been moved in opposite directions (in the vertical direction of the drawing, the thumb 20 upward and the index finger 21 downward) and a circular image 26b of larger size is displayed on the first display / light sensor unit 300A.

  FIG. 11A shows a state in which a substantially square image 27a is displayed on the first display / light sensor unit 300A and the substantially square image 27a is touched with the thumb 20 from the top and the index finger 21 from the bottom. FIG. 11B shows a state in which the thumb 20 and the index finger 21 have been moved in the same direction (rightward in the horizontal direction of the drawing) to move the substantially square image 27a to the outside of the screen of the first display / light sensor unit 300A.

  FIG. 12A shows a state in which a pipe-shaped image 28a, whose pipe axis (pipe length) direction is the horizontal direction of the drawing, is displayed on the first display / light sensor unit 300A and the pipe-shaped image 28a is touched with the thumb 20 from the top and the index finger 21 from the bottom. FIG. 12B shows a state in which the thumb 20 and the index finger 21 have been moved in opposite directions (in the vertical direction of the drawing, the thumb 20 upward and the index finger 21 downward) and the pipe-shaped image 28a is rotated.

  FIG. 13A shows a state in which rectangular images 29a to 29d are displayed in a row in the horizontal direction of the drawing near the upper end of the screen of the first display / light sensor unit 300A and one rectangular image 29c is touched with the thumb 20 from the top and the index finger 21 from the bottom. FIG. 13B shows a state in which, by moving the thumb 20 and the index finger 21 in the same direction (downward in the drawing) and pulling the rectangular image 29c downward, the rectangular image 29c is stretched in the vertical direction of the drawing and becomes a strip-shaped image 29c'.

  FIG. 14A shows a state in which a rectangular image 30a is displayed on the first display / light sensor unit 300A and the display screen of the first display / light sensor unit 300A is touched with the thumb 20 from the top and the index finger 21 from the bottom. FIG. 14B shows a state in which, by moving the thumb 20 and the index finger 21 in the same direction (downward in the drawing) so as to divide the rectangular image 30a, the rectangular image 30a is divided into two rectangular images 30b.

  FIG. 15A shows a state in which a plurality of rectangular images 31a are arranged circumferentially on the first display / light sensor unit 300A, the vicinity of the center point of the circle is touched with the thumb 20 from above, and one image 31a is touched with the index finger 21 from below. FIG. 15B shows a state in which the plurality of rectangular images 31a are rotated counterclockwise along the circumference of the circle by moving the index finger 21 downward in the drawing without moving the thumb 20.

  FIG. 16A shows a state in which an image 32c formed by combining a circular image 32a and a rectangular image 32b is displayed on the first display / light sensor unit 300A, the vicinity of the center point of the circular image 32a is touched with the thumb 20 from the top, and the rectangular image 32b is touched with the index finger 21 from the bottom. FIG. 16B shows a state in which the index finger 21 is moved to the left side of the drawing without moving the thumb 20, and the rectangular image 32b is swung to the left like a pendulum around the center point of the circular image 32a.

  FIG. 17A shows a state in which a rectangular image 33a is displayed over almost the entire screen of the first display / light sensor unit 300A and the rectangular image 33a is touched with the thumb 20 from the upper surface and the index finger 21 from the lower surface. FIG. 17B shows a state in which, by moving the thumb 20 and the index finger 21 in opposite directions (in the vertical direction of the drawing, the thumb 20 upward and the index finger 21 downward), a rectangular image 33b appears from the lower side of the drawing and the image 33a and the image 33b are displayed side by side in the vertical direction of the drawing. FIG. 17C shows a state in which, by moving the thumb 20 and the index finger 21 further in opposite directions (the thumb 20 upward and the index finger 21 downward in the vertical direction of the drawing), the image displayed on the screen is switched from the image 33a to the image 33b.

FIG. 18A shows a state in which a substantially square image 34a is displayed on the first display / light sensor unit 300A. FIG. 18B is a diagram illustrating a state in which the substantially square image 34a is “selected” by touching the approximately square image 34a with the index finger 21 from the top and the middle finger 22 from the bottom. .
(Specific description of operation performed on display image)
Next, the operations described above with reference to FIGS. 9 to 18 will be described more specifically.

  As shown in FIG. 9A, a rectangular image 25a is displayed on the first display / light sensor unit 300A. The user then touches the rectangular image 25a with the thumb 20 from the upper surface side (the side on which the first display / light sensor unit 300A is disposed) and with the index finger 21 from the lower surface side (the side on which the second display / light sensor unit 300B is disposed).

  As described above, although the rectangular image 25a displayed on the first display / light sensor unit 300A is touched with the thumb 20 from the top, it is not actually touched with the index finger 21 from the bottom. However, in the following description according to the present embodiment, for convenience of explanation, such a state is expressed as “touching with the thumb 20 from the upper surface and the index finger 21 from the lower surface, respectively”. The same applies to the descriptions given with reference to FIGS.

  In addition, although it has been described that the thumb 20 and the index finger 21 touch the rectangular image 25a from both the front and back surfaces, any finger may be used. That is, the rectangular image 25a may be sandwiched from both the front and back surfaces using any finger, and this is a precondition for executing a predetermined operation on the rectangular image 25a. Therefore, even if the rectangular image 25a is touched only from either the upper surface or the lower surface, the predetermined operation cannot be performed on the rectangular image 25a alone. Further, although it is described that the image (or screen) is touched with a finger, it is not necessary to actually touch the image, and the display / light sensor unit 300 may detect the image of the finger.

  In addition, the asterisk in the figure indicates an approximate location where the first display / light sensor unit 300A and the second display / light sensor unit 300B are detecting the image of the finger. In the following description, for convenience of explanation, it is assumed that the image 25a is touched with the thumb 20 from the upper surface side and the index finger 21 from the lower surface side. The same applies to FIGS. 10 to 18.

  Furthermore, although the image 25a is described as being rectangular, the image 25a may be square and can have any shape. However, in this embodiment, the image 25a is described as being rectangular. Similarly, in FIGS. 10 to 18, the shape of each image is described as a circular shape, a square shape, a pipe shape, or the like, but these shapes are merely examples and are not limited to the described shapes.

  Further, although the speed at which the thumb 20 and the index finger 21 move is not particularly mentioned, the operation associated with the movement of the finger image is executed according to that speed.

  Next, as shown in FIG. 9B, the user moves the thumb 20 and the index finger 21 in opposite directions (in the horizontal direction of the drawing, the thumb 20 to the right and the index finger 21 to the left). By this finger movement, a plurality of rectangular images 25a are displayed on the first display / light sensor unit 300A in a fanned-out state. That is, in FIG. 9A the plurality of rectangular images 25a are in a state where the strips are bundled and closed, and in FIG. 9B the plurality of rectangular images 25a are expanded in a fan shape.

  The degree of the fan-shaped spread described above is determined according to the movement distance of the thumb 20 and the index finger 21. Accordingly, in FIG. 9B, since the thumb 20 and the index finger 21 have moved to the vicinity of the edges of the display screen, the fan-like expansion extends over the entire display screen.
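
  As a rough illustration of how the degree of the fan-shaped spread could follow the finger travel, the following Python sketch maps the relative horizontal displacement of the two finger images onto a fan angle distributed over the bundled strip images. The function name, the maximum fan angle, and the screen-width normalization are hypothetical choices made only for this example and are not taken from the embodiment.

    def fan_angles(thumb_dx, index_dx, num_strips, screen_width, max_angle=90.0):
        """Return one rotation angle (degrees) per strip image for the fan-out display.

        thumb_dx / index_dx: horizontal displacement of the finger images detected
        by the upper and lower display/light sensor units (opposite signs when the
        fingers move apart).
        """
        # Relative travel of the two finger images, normalized by the screen width.
        spread = min(abs(thumb_dx - index_dx) / float(screen_width), 1.0)
        total = max_angle * spread              # the fan opens wider as the fingers travel farther
        if num_strips <= 1:
            return [0.0] * num_strips
        step = total / (num_strips - 1)
        # Distribute the strips evenly between 0 degrees and the total fan angle.
        return [i * step for i in range(num_strips)]

    # Example: the fingers moved almost to the screen edges, so the fan covers the full angle.
    print(fan_angles(thumb_dx=200, index_dx=-200, num_strips=5, screen_width=400))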

  In addition, the opposite movement, that is, returning a plurality of rectangular images 25a spread in a fan shape to a display in which the strips are bundled and closed, can be performed by the following method.

  First, the user touches a plurality of rectangular images 25a spread in a fan shape with the thumb 20 from the upper surface side and the index finger 21 from the lower surface side. At this time, the positions where the thumb 20 and the index finger 21 touch are different positions as shown in FIG. 9B. That is, the thumb 20 touches the image 25a located near the outermost part (left side of the drawing), and the index finger 21 touches the image 25a located near the other outermost part (right side of the drawing). Next, the thumb 20 and the index finger 21 are moved in opposite directions (in the horizontal direction of the drawing, the thumb 20 is moved to the left and the index finger 21 is moved to the right). That is, it moves in the opposite direction to the finger movement from FIG. 9A to FIG. 9B. As a result, a plurality of rectangular images 25a expanded in a fan shape are displayed as if the strips were bundled and closed. In this way, the data display / sensor device 100 can easily and quickly realize various three-dimensional operations shown in FIGS. 9A and 9B.

  Here, the rectangular image 25a has been described as having its longitudinal direction arranged in the vertical direction of the drawing in FIG. 9A and as expanding in a fan shape in the horizontal direction of the drawing in FIG. 9B. However, the rectangular image 25a may also be displayed so that its longitudinal direction is arranged in the horizontal direction of the drawing in FIG. 9A and the fan-shaped expansion occurs in the vertical direction of the drawing in FIG. 9B. Further, the arrangement direction of the rectangular image 25a is not limited to the vertical and horizontal directions in FIG. 9A and may be an arbitrary direction; FIG. 9A is merely an example.

  As examples of use of the operations described in FIGS. 9A and 9B, the following embodiments can be considered. For example, it is assumed that a plurality of rectangular images 25a are assigned as images (objects) indicating addresses such as “news”, “blog”, and “shopping”. Then, by the operations of the thumb 20 and the index finger 21 described above, a plurality of rectangular images 25a displayed as if the strips were bundled and closed are displayed in a fan shape. As a result, the user can easily and quickly find a desired address on the screen from the fan-shaped objects. In addition, since the object indicating the address destination expands in a fan shape, an operation with excellent design can be realized, and the data display / sensor device 100 can provide the user with high satisfaction. Note that a method for selecting a desired object from among several objects that expand in a fan shape is realized by a method described with reference to FIG. 18, for example.

  Next, FIG. 10 will be described. As shown in FIG. 10A, a circular image 26a is displayed on the first display / light sensor unit 300A. Then, the user touches the circular image 26a with the thumb 20 from the upper surface side and the index finger 21 from the lower surface side.

  Next, as shown in FIG. 10B, the user moves the thumb 20 and the index finger 21 in opposite directions (in the vertical direction of the drawing, the thumb 20 is substantially upward and the index finger 21 is generally downward). By the operation of the finger, the circular image 26a is displayed on the first display / light sensor unit 300A as a circular image 26b having a larger size.

  Note that the extent to which the dimensions are enlarged is determined according to the movement distance of the thumb 20 and the index finger 21. Accordingly, in FIG. 10B, the user has moved the thumb 20 and the index finger 21 greatly, so that the image 26b is enlarged to the point where only part of it is displayed on the screen.
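
  A minimal sketch of how the enlargement ratio could follow the finger travel is given below, assuming the scale factor grows linearly with the change in distance between the two finger images; the function name and the gain constant are hypothetical.

    def scaled_diameter(diameter, start_gap, end_gap, gain=1.0):
        """Return the new diameter of the circular image after a pinch-out / pinch-in.

        start_gap / end_gap: distance between the thumb image and the index finger
        image before and after the movement.  A larger end_gap enlarges the image,
        a smaller end_gap reduces it.
        """
        if start_gap <= 0:
            return diameter
        scale = 1.0 + gain * (end_gap - start_gap) / float(start_gap)
        return max(diameter * scale, 1.0)   # never collapse the image completely

    print(scaled_diameter(diameter=100, start_gap=40, end_gap=120))  # enlarged
    print(scaled_diameter(diameter=100, start_gap=120, end_gap=40))  # reduced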

  Further, the circular image 26a having a smaller size than the circular image 26b can be displayed on the first display / light sensor unit 300A by the following method.

  First, the user touches the circular image 26b with the thumb 20 from the upper surface side and the index finger 21 from the lower surface side. At this time, the positions where the thumb 20 and the index finger 21 touch the image 26b are different positions as shown in FIG. 10B. In the example shown in FIG. 10B, the thumb 20 touches the vicinity of the upper end of the image 26b, and the index finger 21 touches the vicinity of the lower end of the image 26b.

  Next, the thumb 20 and the index finger 21 are moved in opposite directions (in the vertical direction of the drawing, the thumb 20 substantially downward and the index finger 21 substantially upward). By this finger movement, the circular image 26b is displayed on the first display / light sensor unit 300A as a circular image 26a of smaller size. That is, by performing a movement opposite to the finger movement from FIG. 10A to FIG. 10B, the circular image 26b becomes the circular image 26a of smaller size and is displayed on the first display / light sensor unit 300A. In this way, the user can easily and quickly enlarge and reduce an image.

  When the thumb 20 and the index finger 21 are moved in directions opposite to each other, the direction is not limited to the above-described vertical direction of the drawing and may be any direction; the example given here is merely one example.

  As examples of use of the operations described in FIGS. 10A and 10B, the following embodiments can be considered. For example, when the circular image 26a is a photographic image, the user may want to view the image in an enlarged state. In such a case, the above-described operation of the thumb 20 and the index finger 21 allows the photographic image to be enlarged and displayed, so that the data display / sensor device 100 can respond to the user's request to view the enlarged image.

  Next, FIG. 11 will be described. In FIG. 11A, a square image 27a is displayed on the first display / light sensor unit 300A, and the square image 27a is touched with the thumb 20 from the top and the index finger 21 from the bottom. Then, as shown in FIG. 11B, the user moves the thumb 20 and the index finger 21 in the same direction (rightward in the horizontal direction of the drawing) to move the square image 27a to the outside of the screen. That is, by the movement of the fingers, the user can move the square image 27a according to the distance the fingers move. If an operation of deleting the square image 27a when it is moved to the outside of the display screen is assigned in advance, the file of the square image 27a can be easily deleted. In this way, the user can move and delete images more easily and quickly.

  In FIGS. 11A and 11B, it has been described that both the thumb 20 and the index finger 21 are moved to the right side in the horizontal direction of the drawing. However, the direction in which the thumb 20 and the index finger 21 are moved is not limited to this, and may be any direction. Therefore, the user can move the image to an arbitrary place.

  As examples of use of the operations described in FIGS. 11A and 11B, the following embodiments can be considered. For example, when the user wants to move an arbitrary file (or folder) displayed on the screen, the user can hold the file from both the front and back sides and move the thumb and index finger to the desired destination, thereby moving the file to that position. In addition, when deleting a file displayed on the screen, the file may be sandwiched from both the front and back sides and moved to the outside of the screen. In this way, the data display / sensor device 100 can respond to a user's request to delete or move a file.

  Next, FIG. 12 will be described. As shown in FIG. 12A, a pipe-shaped image 28a whose pipe axis (pipe length) direction is the horizontal direction of the drawing is displayed on the first display / light sensor unit 300A. The user then touches the pipe-shaped image 28a with the thumb 20 from the upper surface and the index finger 21 from the lower surface. Then, as shown in FIG. 12B, by moving the thumb 20 and the index finger 21 in opposite directions (in the vertical direction of the drawing, the thumb 20 upward and the index finger 21 downward), the first display / light sensor unit 300A displays a state in which the pipe-shaped image 28a rotates in a direction perpendicular to the pipe axis (from the bottom toward the top of the drawing).

  By moving the thumb 20 and the index finger 21 in the reverse directions (in the vertical direction of the drawing, the thumb 20 downward and the index finger 21 upward), the pipe-shaped image 28a rotates in the opposite direction, that is, from the top toward the bottom of the drawing. As described above, the rotation direction of the pipe-shaped image 28a is selected arbitrarily according to the moving direction of the thumb 20 and the index finger 21, and the data display / sensor device 100 can easily and quickly realize the varied, three-dimensional operations shown in FIGS. 12(a) and 12(b).

  As examples of use of the operations described in FIGS. 12A and 12B, the following embodiments can be considered. In FIG. 12A, the pipe-shaped image 28a is formed by a plurality of rectangular images 28b extending in the horizontal direction of the drawing. Each rectangular image 28b is therefore treated as an image (object) indicating an address such as “news”, “blog”, or “shopping”, and the object that comes to the front of the screen when the pipe-shaped image 28a is rotated is treated as the “selected object”. The user then rotates the pipe-shaped image 28a by the operation of the thumb 20 and the index finger 21 described above and moves the object to be selected to the front of the screen (selection position P), whereby a desired object is selected easily and quickly. In addition, because the objects indicating the addresses are rotated, the data display / sensor device 100 can realize an operation with excellent design and can provide the user with high satisfaction.

  Next, FIG. 13 will be described. In FIG. 13A, rectangular images 29a to 29d are displayed in a row in the horizontal direction of the drawing near the upper end of the screen of the first display / light sensor unit 300A, and the rectangular image 29c is pinched with the thumb 20 from the top and the index finger 21 from the bottom. In FIG. 13B, by moving the thumb 20 and the index finger 21 in the same direction (downward in the drawing), the rectangular image 29c is stretched in the vertical direction of the drawing and displayed as a strip-shaped image 29c'.

  Note that the length of the strip-shaped image 29c' in the vertical direction of the drawing is determined according to the movement distance of the thumb 20 and the index finger 21. That is, the length of the strip-shaped image 29c' in the vertical direction of the drawing varies depending on the movement distance of the thumb 20 and the index finger 21.

  In the state where the strip-shaped image 29c' is displayed, the user holds a position on the lower side of the strip-shaped image 29c' with the thumb 20 from the upper surface and the index finger 21 from the lower surface, and moves the thumb 20 and the index finger 21 in the same direction (upward in the drawing). By this operation, the strip-shaped image 29c' returns to the rectangular image 29c, the image before being stretched. As a result, the rectangular image 29c is displayed near the upper end of the screen of the first display / light sensor unit 300A. In this manner, as the thumb 20 and the index finger 21 move up and down in the vertical direction of the drawing, the rectangular image 29c and the strip-shaped image 29c' are displayed on the first display / light sensor unit 300A.

  In FIG. 13A, the images 29a to 29d are displayed near the upper end of the screen of the first display / light sensor unit 300A. However, the position is not limited to this, and the images 29a to 29d may be displayed near the lower end, the right end, or the left end of the screen of the first display / light sensor unit 300A.

  As examples of use of the operations described in FIGS. 13A and 13B, the following embodiments can be considered. Some users wish to keep the number of objects displayed on the screen as small as possible. In such a case, if objects are kept near the upper end, lower end, right end, or left end of the screen and the strip-shaped image 29c' can be pulled out by the above-described operation of the thumb 20 and the index finger 21 only when necessary, the space in the screen can be used effectively and the user's wish to keep the number of displayed objects small can be met.

  Next, FIG. 14 will be described. In FIG. 14A, a rectangular image 30a is displayed on the first display / light sensor unit 300A, and the user touches the display / light sensor unit 300 with the thumb 20 from the top and the index finger 21 from the bottom. Then, as shown in FIG. 14B, by moving the thumb 20 and the index finger 21 in the same direction (downward in the drawing) so as to divide the rectangular image 30a, the rectangular image 30a is divided into two images 30b on the left and right. Note that, by making settings in advance in the finger-command correspondence information table described later, the rectangular image 30a can also be divided vertically or obliquely according to the movement direction of the thumb 20 and the index finger 21.

  As examples of use of the operations described in FIGS. 14A and 14B, the following embodiments can be considered. For example, if an operation of deleting the file of the image 30a when the rectangular image 30a is divided into two is assigned in advance, the file of the rectangular image 30a can be easily deleted by the above-described finger movement.

  Next, FIG. 15 will be described. In FIG. 15A, a plurality of images 31a are arranged circumferentially and displayed on the first display / light sensor unit 300A, and the user touches the vicinity of the center point of the circle with the thumb 20 from above and one image 31a with the index finger 21 from below. Then, as shown in FIG. 15B, by moving the index finger 21 downward without moving the thumb 20, the plurality of images 31a are rotated counterclockwise along the circumference of the circle. Although not shown, the plurality of images 31a rotate clockwise along the circumference of the circle when the index finger 21 is moved upward in the drawing.

  In this manner, the plurality of images 31a are displayed in a rotated state on the first display / light sensor unit 300A. Moreover, the user can stop the rotation of the images by stopping the movement of the finger. If, when the rotation stops, the image (object) positioned at a predetermined location on the screen is treated as the “selected object”, an object can be selected by a novel method that has not been used conventionally. In addition, by setting the plurality of images to rotate according to the movement speed of the index finger, a desired object can be displayed on the first display / light sensor unit 300A and then selected easily and quickly. Since the data display / sensor device 100 can realize an operation with excellent design, it can provide the user with high satisfaction.
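
  The following sketch illustrates one possible way to compute the rotation of the circumferentially arranged objects and the selection made when the rotation stops; the angular gain, the selection angle, and the function names are hypothetical and serve only as an example.

    def rotate_ring(object_angles, index_dy, gain=0.5):
        """Rotate the ring of objects according to the vertical travel of the index
        finger image; a downward movement (positive dy) gives a counterclockwise turn."""
        delta = gain * index_dy
        return [(a + delta) % 360.0 for a in object_angles]

    def selected_object(object_angles, selection_angle=270.0):
        """When the rotation stops, pick the object closest to the predetermined
        selection position on the screen (assumed here to be at 270 degrees)."""
        def distance(a):
            d = abs(a - selection_angle) % 360.0
            return min(d, 360.0 - d)
        return min(range(len(object_angles)), key=lambda i: distance(object_angles[i]))

    angles = rotate_ring([0.0, 90.0, 180.0, 270.0], index_dy=60)
    print(angles, selected_object(angles))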

  Next, FIG. 16 will be described. In FIG. 16A, an image 32c formed by combining a circular image 32a and a rectangular image 32b is displayed on the first display / light sensor unit 300A, and the user touches the vicinity of the center point of the image 32a with the thumb 20 from the upper surface and the image 32b with the index finger 21 from the lower surface.

  Next, as shown in FIG. 16B, by moving the index finger 21 to the left side of the drawing without moving the thumb 20, the image 32b swings to the left like a pendulum with the vicinity of the center point of the image 32a as a fulcrum.

  The movement of the image 32b described above is determined according to the movement distance of the index finger 21. Further, by moving the index finger 21 in the reverse direction (rightward in the drawing), the image 32b moves to the right around the center point of the image 32a as a fulcrum, according to the distance the index finger 21 has moved.

  In this way, the data display / sensor device 100 can easily and quickly realize the various three-dimensional operations shown in FIGS. 16 (a) and 16 (b).

  In FIG. 16, the image 32b is displayed on the upper side of the image 32a, but the position may be selected arbitrarily. Also, only one pair of images 32a and 32b is displayed in FIG. 16, but a plurality of such images may be displayed.

  As examples of use of the operations described in FIGS. 16A and 16B, the following embodiments can be considered. For example, the user may wish to adjust the volume output from the data display / sensor device 100 by a simple operation on the screen. In such a case, if an operation is assigned in advance such that, when the index finger 21 moves in the left-right direction of the drawing, the image 32b moves to the left or right accordingly and the volume is adjusted linearly according to the amount of movement, then moving the image 32b to the left lowers the volume output from the data display / sensor device 100, and moving the image 32b to the right raises it.
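
  As an illustration of the volume-adjustment example described above, the sketch below linearly maps the horizontal travel of the index finger image (and hence of the pendulum-like image 32b) onto an output volume; the volume range and the sensitivity constant are hypothetical.

    def adjust_volume(current_volume, index_dx, sensitivity=0.2,
                      min_volume=0, max_volume=100):
        """Return the new output volume after the index finger image moves by index_dx.

        Moving the finger (and image 32b) to the right (positive dx) raises the volume,
        moving it to the left lowers it, in proportion to the amount of movement.
        """
        new_volume = current_volume + sensitivity * index_dx
        return max(min_volume, min(max_volume, new_volume))

    print(adjust_volume(50, index_dx=-100))  # swing to the left: volume lowered
    print(adjust_volume(50, index_dx=+100))  # swing to the right: volume raised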

  Next, FIG. 17 will be described. In FIG. 17A, a rectangular image 33a is displayed over almost the entire screen of the first display / light sensor unit 300A, and the user touches the image 33a with the thumb 20 from the top and the index finger 21 from the bottom.

  Then, as shown in FIG. 17B, when the thumb 20 and the index finger 21 are moved in opposite directions (in the vertical direction of the drawing, the thumb 20 upward and the index finger 21 downward), an image 33b appears, and the rectangular image 33a and the rectangular image 33b are displayed side by side in the vertical direction of the drawing. Then, as shown in FIG. 17C, when the thumb 20 and the index finger 21 are moved further in opposite directions (in the vertical direction of the drawing, the thumb 20 upward and the index finger 21 downward), the displayed image is switched from the image 33a to the image 33b.

  That is, the image displayed on the screen is switched from the image 33a to the image 33b by the movement of the thumb 20 and the index finger 21 described above. In addition, by stopping the movement of the finger before the switching is completely performed, the rectangular image 33a and the image 33b are displayed side by side in the vertical direction of the drawing. Further, switching from the rectangular image 33a to the image 33b is performed at a desired speed in accordance with the speed of finger movement. Further, the images are switched from the image 33b to the rectangular image 33a by moving the thumb 20 and the index finger 21 in the opposite directions.

  As examples of use of the operations described in FIGS. 17A and 17B, the following embodiments can be considered. For example, consider a case where the user wants to view a plurality of photographs in an electronic album one after another. In such a case, when the user wants to move from the photograph displayed on the screen (image 33a) to the next photograph (image 33b), the switch from the image 33a to the image 33b can be performed easily and quickly by the above-described operation of the thumb 20 and the index finger 21. In this way, the data display / sensor device 100 can reliably meet the user's request to switch the image and view the next image.

  In FIG. 17, the image switching is performed in the vertical direction of the drawing, but can be performed in the horizontal direction of the drawing.

  Next, FIG. 18 will be described. In FIG. 18A, a substantially square image 34 is displayed on the first display / light sensor unit 300A. Then, as shown in FIG. 18B, the thumb 20 and the index finger 21 touch the substantially square image 34 from both the front and back sides almost simultaneously, and the substantially square image 34 is selected.

  That is, the user can select the substantially square image 34 by touching the substantially square image 34 with the thumb 20 and the index finger 21 from both the front and back sides almost simultaneously. Accordingly, the substantially square image 34 is not selected by the movement of the thumb 20 or the index finger 21 alone; moreover, even if the second display / light sensor unit 300B detects the image of the index finger 21 some time after the first display / light sensor unit 300A detects the image of the thumb 20, the substantially square image 34 is not selected.

  Therefore, the user cannot select a desired object unless the thumb 20 and the index finger 21 touch the substantially square image 34 from both the front and back sides at the same time, and the risk of erroneous input is reduced accordingly.

  The elapsed time from when the first display / light sensor unit 300A detects the image of the thumb 20 until the second display / light sensor unit 300B detects the image of the index finger 21 (or from when the second display / light sensor unit 300B detects the image of the index finger 21 until the first display / light sensor unit 300A detects the image of the thumb 20) may be set as appropriate. However, if the elapsed time is set to “0”, the object cannot be selected easily and the user often feels inconvenienced. On the other hand, if the elapsed time is too long, there is a risk of erroneous input. Therefore, by adopting a configuration in which the elapsed time can be set as appropriate, both the inconvenience felt by the user and the risk of erroneous input can be reduced.

  On the other hand, the substantially square image 34 can be selected by touching the substantially square image 34 from both the front and back surfaces with the thumb 20 and the index finger 21 without setting the elapsed time. This eliminates the need for the user to touch the substantially square image 34 from both the front and back sides while paying attention to the elapsed time.

  The elapsed time may be set by the user inputting it from the outside, with the input value stored in the storage unit 901. The input detection unit 830 then reads the input value from the storage unit 901 and determines whether the elapsed time has passed by comparing it with the value that the input detection unit 830 itself has measured. In addition, when the measured elapsed time exceeds the input value, the input detection unit 830 may be configured not to generate an input signal. Details of the input detection unit 830 and the storage unit 901 will be described later.
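
  A minimal sketch of the elapsed-time check described above is shown below, assuming the user-set limit is held in the storage unit and compared with the interval actually measured between the two detections; the class, method, and key names are hypothetical.

    import time

    class ElapsedTimeChecker:
        """Compares the measured interval between the front-side and back-side finger
        detections with the user-set limit read from the storage unit."""

        def __init__(self, storage):
            self.storage = storage            # stands in for storage unit 901
            self.first_detection_time = None

        def on_first_detection(self):
            self.first_detection_time = time.monotonic()

        def on_second_detection(self):
            """Return True if an input signal may be generated, False otherwise."""
            limit = self.storage.get("elapsed_time_limit", 0.3)   # user-set value (seconds)
            measured = time.monotonic() - self.first_detection_time
            return measured <= limit          # exceeding the limit suppresses the input signal

    checker = ElapsedTimeChecker({"elapsed_time_limit": 0.5})
    checker.on_first_detection()
    print(checker.on_second_detection())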

  Further, by registering it in advance in the finger-command correspondence information table described later, it is also possible to select the image 34 when the thumb 20 and the index finger 21 double-click the substantially square image 34 from both the front and back surfaces. In this case, the elapsed time may be the time from the first click to the second click.

  Specific examples of the various operations executed in response to the movement of the finger image have been described above with reference to FIGS. In the above description, the movement directions of the thumb 20 and the index finger 21 and the operations to be executed are described in association with each other; however, the operations to be executed are not limited to those described above. In other words, if a movement of the finger image and an operation to be executed are associated with each other and registered in the finger-command correspondence information table described later, different operations can be executed even for the same finger movement.

  For example, the operation of moving the square image 27a described with reference to FIG. 11 is executed by moving the thumb 20 and the index finger 21 in the same direction. However, by associating the movement of the thumb 20 and the index finger 21 in the same direction with an operation of enlarging (or reducing) the image, a different operation can be executed for the same finger movement. Alternatively, for the same movement of the thumb 20 and the index finger 21 in the same direction, an operation of enlarging the image may be assigned for a circular image and an operation of moving the image may be assigned for a rectangular image; in this way, different operations can be executed for the same finger movement depending on the shape of the image.

  Here, the following methods are conceivable for recognizing the movement of a specific finger image from among the countless possible variations of finger image movement and for executing a predetermined operation associated with that movement.

  The first method recognizes the movement based on the relationship between the movement of the finger image detected by the first display / light sensor unit 300A and the movement of the finger image detected by the second display / light sensor unit 300B. In this method, operations are associated in advance such that, for example, the object moves when the finger images on the front and back surfaces move in the same direction, the object expands when they move away from each other, and the object contracts when they move toward each other; in this way, the movement of a specific finger image is recognized from among the many variations of finger image movement.

  The second method, in addition to the first method described above, takes into account the positional relationship with the data display / sensor device 100 and recognizes the movement of a specific finger image from the angle of the movement with respect to the data display / sensor device 100. In this method, reference coordinates (X and Y axes) of the data display / sensor device 100 are determined, and operations are associated in advance such that, for example, the object is enlarged if the finger images on both surfaces move apart along the X axis and the object is rotated if they move apart along the Y axis; in this way, the movement of a specific finger image is recognized from among the many variations of finger image movement.
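
  The second method could be sketched as follows: the single vectors of the front-side and back-side finger images are compared against the reference X and Y axes of the device, and the combination is mapped to an operation. The tolerance value and the operation names are hypothetical.

    def classify_gesture(front_vec, back_vec, tolerance=0.5):
        """Classify the pair of finger-image movements against the device's X/Y axes.

        front_vec / back_vec: (dx, dy) single vectors of the finger images detected
        by the front and back display/light sensor units.
        """
        fx, fy = front_vec
        bx, by = back_vec
        opposite_x = fx * bx < 0 and abs(fx) > tolerance and abs(bx) > tolerance
        opposite_y = fy * by < 0 and abs(fy) > tolerance and abs(by) > tolerance
        same_dir = fx * bx > 0 and fy * by >= 0
        if opposite_x:
            return "enlarge"      # images moved apart along the X axis
        if opposite_y:
            return "rotate"       # images moved apart along the Y axis
        if same_dir:
            return "move"         # both images moved the same way
        return "none"

    print(classify_gesture((10, 0), (-12, 0)))   # -> enlarge
    print(classify_gesture((0, 8), (0, -9)))     # -> rotate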

  The third method recognizes the movement of a specific finger image in accordance with the operating state (scene) of the application, mode, or the like. For example, in one application, moving the finger images on the front and back sides in opposite directions may be associated in advance with fanning out an object, in another application the same opposite movement may be associated with a different operation, and in yet another application, moving the finger images on the front and back surfaces in the same direction may be associated with enlarging an object. In this way, the movement of a specific finger image is recognized from among the many variations of finger image movement.

  If the method to be used is decided in advance, the movement of a specific finger image is recognized from among the countless possible variations of finger image movement, and the predetermined operation associated with that movement is executed.

  In addition, the above three methods may be combined, or the movement of a specific finger image may be recognized from among the countless possible variations of finger image movement by a method other than the above, and the predetermined operation associated with that movement may be executed.

  In this manner, a predetermined operation associated with the movement of a specific finger image is executed. The predetermined operation is, for example, the selection, enlargement, reduction, rotation, fan-shaped expansion, switching to another image, movement, deletion, stretching, or division of an image (object) as described with reference to FIGS. The operation result is displayed on the first display / light sensor unit 300A.

  The predetermined operation to be executed is not limited to the above-described operations, and various operations are possible. Moreover, operations obtained by suitably combining the operations described with reference to FIGS. 9 to 18 can also be executed.

  Here, an example of another operation will be described with reference to FIGS. 11(a) and 11(b). The user touches the substantially square image 27a with the thumb 20 from the top and the index finger 21 from the bottom, and then moves the thumb 20 and the index finger 21 in the same direction (rightward in the horizontal direction of the drawing). By this finger movement, the substantially square image 27a moves toward the outside of the screen of the first display / light sensor unit 300A.

  At this time, assume that the data display / sensor device is configured so that, if the thumb 20 and/or the index finger 21 moves away from the substantially square image 27a before the image finishes moving to the outside of the screen of the first display / light sensor unit 300A, the image returns to its initial position (state). In this case, the user can return the substantially square image 27a to the initial position (state) by separating the thumb 20 and/or the index finger 21 from the substantially square image 27a. That is, the user can cancel the predetermined operation (moving the image) by releasing the finger.
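
  A sketch of the cancellation behavior described above is shown below, assuming the device tracks whether the move has completed when a finger image is no longer detected over the image; the class and method names are hypothetical.

    class MoveWithCancel:
        """Moves an image while both finger images stay on it; if a finger is released
        before the move completes, the image snaps back to its initial position."""

        def __init__(self, initial_pos):
            self.initial_pos = initial_pos
            self.current_pos = initial_pos
            self.completed = False

        def drag_to(self, pos, finished=False):
            self.current_pos = pos
            self.completed = finished

        def on_finger_released(self):
            if not self.completed:
                self.current_pos = self.initial_pos   # cancel: return to the initial state
            return self.current_pos

    move = MoveWithCancel(initial_pos=(10, 10))
    move.drag_to((120, 10))            # image partway off the screen
    print(move.on_finger_released())   # finger released early -> back to (10, 10)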

  That is, if the data display / sensor device is configured as described above, another operation for returning the image to the initial state is not required, and the burden on the user is reduced. The operation of canceling the predetermined operation can be applied to the other embodiments described with reference to FIG.

  In this way, the movement of the finger and the predetermined operation can be associated in various patterns. The predetermined operation to be executed is not restricted to the operations described above, and various operations are possible.

(Explanation of main parts of data display / sensor device)
Next, a more detailed configuration of the data display / sensor device 100 will be described with reference to FIG. 1. Here, in order to keep the description easy to understand, the description of the operations of the data processing unit 700 and the circuit control unit 600, which are located between the main control unit 800 and the display / light sensor unit 300, is omitted. To be precise, however, when image display and image detection are performed, the input interface control unit 805 of the main control unit 800 transmits a command to the data processing unit 700, the data processing unit 700 controls the circuit control unit 600 based on the command, and the circuit control unit 600 transmits a signal to the display / light sensor unit 300. Further, the main control unit 800 acquires the whole image data, the partial image data, and the coordinate data from the data processing unit 700 as a response to the command transmitted to the data processing unit 700.

  FIG. 1 is a block diagram showing a main configuration of the data display / sensor device 100. As shown in the figure, the main control unit 800 includes an input interface control unit 805 and an application execution unit 850 (operation execution device). The input interface control unit 805 includes a detection determination unit 810, a superimposition determination unit 820, an input detection unit 830, an operation control unit 840 (operation control unit), and an image display control unit 860 (image display control unit). .

  The detection determination unit 810 determines whether the first display / light sensor unit 300A and the second display / light sensor unit 300B have both detected a finger image. More specifically, when the first display / light sensor unit 300A and the second display / light sensor unit 300B have both detected a finger image, coordinate data, whole image data, and partial image data are transmitted from the data processing unit 700 to the detection determination unit 810. Accordingly, when the detection determination unit 810 receives the coordinate data, whole image data, and partial image data originating from both the first display / light sensor unit 300A and the second display / light sensor unit 300B, it can determine that the first display / light sensor unit 300A and the second display / light sensor unit 300B have both detected a finger image. The detection determination unit 810 can also make this determination upon receiving at least one of the coordinate data, the whole image data, and the partial image data from each unit.
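
  A simple sketch of the detection determination described above is given below, assuming each display / light sensor unit reports coordinate data for the detected finger image (or None when no finger image is detected); the function and parameter names are hypothetical.

    def both_detected(front_coords, back_coords):
        """Return True only when both the front and back display/light sensor units
        have reported data for a finger image (coordinate data, whole image data, or
        partial image data); otherwise no positive determination result is produced."""
        return front_coords is not None and back_coords is not None

    # Only the front unit detects a finger image: no input signal will be generated.
    print(both_detected((120, 80), None))        # -> False
    print(both_detected((120, 80), (118, 83)))   # -> True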

  Here, in FIG. 1, the coordinate data, whole image data, and partial image data transmitted from the data processing unit 700 to the detection determination unit 810 (and to the superimposition determination unit 820 and the input detection unit 830) are indicated by solid lines and broken lines. This is because the detection determination unit 810 acquires coordinate data, whole image data, and partial image data from both the first display / light sensor unit 300A and the second display / light sensor unit 300B: the coordinate data, whole image data, and partial image data originating from the first display / light sensor unit 300A are indicated by solid lines, and those originating from the second display / light sensor unit 300B are indicated by broken lines.

  Next, the detection determination unit 810 outputs a determination result indicating that “the first display / light sensor unit 300A and the second display / light sensor unit 300B have both detected the finger image” to the input detection unit 830. Conversely, when only the first display / light sensor unit 300A or only the second display / light sensor unit 300B detects a finger image, or when neither detects a finger image, the detection determination unit 810 outputs a determination result indicating that “the first display / light sensor unit 300A and the second display / light sensor unit 300B have not both detected the finger image” to the input detection unit 830, or alternatively outputs nothing to the input detection unit 830.

  The superimposition determination unit 820 determines whether or not the display image region, which is the region of the image displayed on the first display / light sensor unit 300A, overlaps the image region, which is the region of the finger image detected by the first display / light sensor unit 300A and the second display / light sensor unit 300B. Note that, in the display / light sensor unit 300, an image is displayed on the first display / light sensor unit 300A and no image is displayed on the second display / light sensor unit 300B.

  More specifically, the superimposition determination unit 820 acquires the coordinate data, whole image data, and partial image data from the data processing unit 700 and specifies the image region from these data. In addition, the superimposition determination unit 820 reads image information related to the image displayed on the first display / light sensor unit 300A from the storage unit 901. Since the display image region of the image is specified by coordinates in the image information, the superimposition determination unit 820 can specify the display image region by reading the image information from the storage unit 901.

  As an example, image information will be described with reference to FIGS. 28 and 29. FIG. 28 is a diagram for explaining the image information about the circular images 26a and 26b according to FIG. 10. FIG. 29 is a diagram for explaining the image information about the rectangular image 27a according to FIG. 11.

  As shown in FIG. 28A, the circular image 26a is a circle having a diameter φT1 with the coordinates (A, B) as its center point. Further, the circular image 26b in FIG. 28B, which is enlarged to a larger size by the method described with reference to FIG. 10, is a circle having a diameter φT2 with the coordinates (A, B) as its center point. Here, the display image regions of the circular images 26a and 26b are specified by the coordinates (A, B) and the diameters φT1 and φT2, but they may instead be specified by the coordinates of the respective circumferences of the circular images 26a and 26b. It is sufficient that the display image regions of the circular images 26a and 26b are specified in some manner.

  As shown in FIG. 29A, the rectangular image 27a is a square image whose side length is C and whose center is at the position (A, B1). The image 27a in FIG. 29B, which has been moved to the right side of the drawing by the method described with reference to FIG. 11, is a square image whose side length is C and whose center is at (A, B2). Here, the square image 27a is specified by the coordinates (A, B1) and (A, B2) and the length C, but it may instead be specified by coordinates indicating the four sides of the square image 27a. It is sufficient that the display image region of the square image 27a is specified in some manner.

  As described above, by treating the coordinates (A, B) and the diameters φT1 and φT2 in FIG. 28, and the coordinates (A, B1) and (A, B2) and the length C in FIG. 29, as image information, the superimposition determination unit 820 can specify the display image region by reading out the image information from the storage unit 901.

  By the above method, the superimposition determination unit 820 can specify both the image region and the display image region, and can therefore determine whether or not a superimposition region in which the image region and the display image region overlap exists. The superimposition determination unit 820 then outputs the determination result to the input detection unit 830. Alternatively, when it determines that no superimposition region exists, the superimposition determination unit 820 may simply output nothing to the input detection unit 830, and this in itself can be treated as the determination result that no superimposition region exists.
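
  As an illustration of the superimposition determination for the circular image of FIG. 28, the sketch below checks whether the representative coordinates of a detected finger image fall within the circular display image region specified by its center point and diameter, and applies the relaxed either-surface condition discussed in the following paragraphs; the function names and the margin parameter are hypothetical.

    import math

    def finger_overlaps_circle(finger_xy, center_xy, diameter, margin=0.0):
        """Return True when the finger-image region (represented here by its
        representative coordinates) overlaps the circular display image region
        specified by its center point and diameter."""
        dx = finger_xy[0] - center_xy[0]
        dy = finger_xy[1] - center_xy[1]
        return math.hypot(dx, dy) <= diameter / 2.0 + margin

    def superimposition_exists(front_xy, back_xy, center_xy, diameter):
        """Relaxed condition: a superimposition region exists if either the front-side
        or the back-side finger image overlaps the display image region."""
        return any(p is not None and finger_overlaps_circle(p, center_xy, diameter)
                   for p in (front_xy, back_xy))

    print(superimposition_exists((105, 98), (140, 150), center_xy=(100, 100), diameter=40))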

  In the above description, the superimposition determination unit 820 has been described as acquiring the coordinate data, whole image data, and partial image data from the data processing unit 700. However, the superimposition determination unit 820 may acquire the coordinate data, whole image data, and partial image data from the data processing unit 700 via the detection determination unit 810.

  In determining whether or not a superimposition region exists, the superimposition determination unit 820 may determine whether the display image region, which is the region of the image displayed on the first display / light sensor unit 300A, overlaps the image region, which is the region of the finger image detected by either one of the first display / light sensor unit 300A and the second display / light sensor unit 300B. In this way, the condition for determining that “a superimposition region exists” can be relaxed compared with the case where it is determined whether the image regions detected by both the first display / light sensor unit 300A and the second display / light sensor unit 300B overlap the display image region.

  Here, the reason why it is preferable to relax the condition for the superimposition determination will be described using a specific example. In FIG. 9A, the user touches, with the index finger 21, the area on the second display / light sensor unit 300B corresponding to the rectangular image 25a displayed on the first display / light sensor unit 300A. That is, if the rectangular image 25a were displayed on the second display / light sensor unit 300B, the index finger 21 touches the area where the rectangular image 25a would be displayed.

  At this time, since the second display / light sensor unit 300B does not display the rectangular image 25a on its screen, the user intuitively estimates the region on the second display / light sensor unit 300B corresponding to the rectangular image 25a displayed on the first display / light sensor unit 300A and then touches that region. That is, the touch of the region with the index finger 21 is performed intuitively and is not always performed accurately.

  Accordingly, if it is determined that “a superimposition region exists” when the image region, which is the region of the finger image detected by at least one of the first display / light sensor unit 300A and the second display / light sensor unit 300B, overlaps the display image region, the data display / sensor device 100 can provide the user with better operability and convenience.

  When the detection determination unit 810 determines that the first display / light sensor unit 300A and the second display / light sensor unit 300B have both detected a finger image, and the superimposition determination unit 820 determines that a superimposition region exists, the input detection unit 830 detects the movements of the finger images detected by the first display / light sensor unit 300A and the second display / light sensor unit 300B, generates an input signal associated with the detected movements of the finger images, and outputs the input signal to the operation control unit 840.

  At this time, the input detection unit 830 further generates the input signal in association with the operating state (scene), such as the application and mode, or with an image attribute. Thereby, the application execution unit 850 described later can execute a predetermined operation on the image displayed on the first display / light sensor unit 300A. Note that the input detection unit 830 can acquire the image information related to the image either directly from the storage unit 901 or via the superimposition determination unit 820. Thereby, the input detection unit 830 can generate an input signal associated with the image displayed on the first display / light sensor unit 300A.

  Here, the input signal generated by the input detection unit 830 will be specifically described by taking the operation of FIG. 9 as an example.

  The first display / light sensor unit 300A and the second display / light sensor unit 300B detect the image of the thumb 20 and the image of the index finger 21 moving in opposite directions. At this time, the input detection unit 830 acquires the whole image data, the partial image data, and the coordinate data before and after the movement of the images of the thumb 20 and the index finger 21 from the data processing unit 700. The input detection unit 830 then calculates the movement amount and movement direction of each finger image from these data.

  More specifically, the coordinate data of the representative coordinates Z of the partial image data (region PP) within the whole image data (region AP) are represented as (Xa, Ya), and the coordinate data within the partial image data (region PP) are represented as (Xp, Yp). Accordingly, the input detection unit 830 acquires the coordinate data before and after the movement of the image of the thumb 20 and the image of the index finger 21 and calculates the differences between them, thereby calculating the movement amount and movement direction of each finger image.

  In this way, the input detection unit 830 detects the movements of the finger images detected by the first display / light sensor unit 300A and the second display / light sensor unit 300B and calculates the coordinate movement amount and movement direction of each finger (its single vector). The input detection unit 830 then calculates, from the single vectors of the respective fingers, a relative vector indicating in which direction and by how much the fingers have moved relative to each other. The input detection unit 830 thus calculates the single vectors and the relative vector, and generates an input signal associated with them.
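
  The calculation of the single vectors and the relative vector could be sketched as follows, using the coordinate data of the representative coordinates before and after the movement; the function names are hypothetical.

    def single_vector(before_xy, after_xy):
        """Movement amount and direction of one finger image (its single vector),
        from the representative coordinates before and after the movement."""
        return (after_xy[0] - before_xy[0], after_xy[1] - before_xy[1])

    def relative_vector(front_single, back_single):
        """How much and in which direction the two finger images moved relative to
        each other, derived from their single vectors."""
        return (front_single[0] - back_single[0], front_single[1] - back_single[1])

    thumb = single_vector((50, 50), (90, 50))     # thumb image moved 40 to the right
    index = single_vector((52, 120), (12, 120))   # index finger image moved 40 to the left
    print(thumb, index, relative_vector(thumb, index))  # relative vector (80, 0): fingers moved apart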

  Further, the input detection unit 830 generates input signals associated with the following various items: the operating state (scene) such as the application and mode, the image information read from the storage unit 901, the “elapsed time” described later with reference to FIG. 19, and the speed of the finger movement. The input signal does not have to be generated in association with all the items listed here, and may be generated in association with only some of them.

  As described above, the input detection unit 830 generates an input signal associated with the various items and outputs the input signal to the operation control unit 840.

  The operation control unit 840 receives the input signal generated by the input detection unit 830 and, based on the finger-command correspondence information, which associates input signals with commands for causing the application execution unit 850 to execute predetermined operations, transmits an operation execution command to the application execution unit 850.

  More specifically, the operation control unit 840 receives the input signal generated by the input detection unit 830. The operation control unit 840 then reads from the storage unit 901 the finger-command correspondence information, which associates input signals with commands for causing the application execution unit 850 to execute predetermined operations, and transmits, based on this finger-command correspondence information, an operation execution command corresponding to the input signal to the application execution unit 850.

  Here, the finger-command correspondence information will be described more specifically. The finger-command correspondence information is information that associates an input signal generated by the input detection unit 830 with a command for causing the application execution unit 850 to execute a predetermined operation. For example, an input signal indicating that the finger images detected by the first display / light sensor unit 300A and the second display / light sensor unit 300B have moved in the same direction is associated with a command for moving the selected image in that direction, and an input signal indicating that the finger images detected by the first display / light sensor unit 300A and the second display / light sensor unit 300B have moved in opposite directions is associated with a command for enlarging (reducing) the selected image (see FIG. 30). In addition, as described above, the input detection unit 830 generates input signals associated with various items such as the single vectors, the relative vector, the operating state (scene) such as the application and mode, the elapsed time, and the speed of finger movement. The finger-command correspondence information is therefore formed by associating these various items with the predetermined operations to be executed.

  FIG. 30 is a table illustrating an example of finger-command correspondence information that associates the shape of an image, the movement of a finger, and the resulting operation, which will be described with reference to FIGS. 9 to 19. However, each item shown here is merely an example, and the present invention is not limited to this.

  The application execution unit 850 receives the operation execution command from the operation control unit 840 and executes the operation indicated by the command. Then, the application execution unit 850 transmits operation result information indicating that the operation has been performed to the image display control unit 860.

  For example, suppose that the finger images detected by the first display / light sensor unit 300A and the second display / light sensor unit 300B move in the same direction (say, from point A to point B), and that the operation associated with this movement of the finger images is "move the display image from point A to point B". In this case, the application execution unit 850 performs a process of moving the display position of the display image from point A to point B. Then, the application execution unit 850 transmits operation result information indicating that "the process of moving the display image from point A to point B has been performed" to the image display control unit 860.

  In this way, the application execution unit 850 receives the operation execution command from the operation control unit 840, executes the operation indicated by the command, and further displays the operation result information indicating that the operation has been performed as an image display. It transmits to the control unit 860.

  In FIG. 1, the application execution unit 850 is included in the main control unit 800. However, the application execution unit 850 may be provided outside the main control unit 800, and a predetermined operation may be executed via the external communication unit 907. Further, the application execution unit 850 may have one or a plurality of applications.

  Upon receiving the operation result information from the application execution unit 850, the image display control unit 860 transmits display data to the data processing unit 700. The display data prompts the data processing unit 700 to display the operation result executed by the application execution unit 850 on the first display / light sensor unit 300A.

  Therefore, when the operation of “moving the display image from point A to point B” is executed by the application execution unit 850, the image display control unit 860 transmits display data indicating that to the data processing unit 700. As a result, the operation result is displayed on the first display / light sensor unit 300A. That is, a state in which the display image moves from the point A to the point B is displayed on the first display / light sensor unit 300A.

  In FIG. 1, the storage unit 901 is provided outside the main control unit 800, but may be configured inside the main control unit 800.

  In addition, the detection determination unit 810, the superimposition determination unit 820, and the input detection unit 830 request the data processing unit 700 to transmit the whole image data, partial image data, and coordinate data of the finger image for each of the first display / light sensor unit 300A and the second display / light sensor unit 300B. This is done by the application program executed by the main control unit 800 controlling the data processing unit 700 so that it transmits the whole image data, partial image data, and coordinate data of the finger image for each of the first display / light sensor unit 300A and the second display / light sensor unit 300B.

(Flowchart explanation)
Next, the processing flow of the data display / sensor device 100 and the details of the commands transmitted from the input interface control unit 805 to the data processing unit 700 will be described with reference to FIGS. 9 to 18 and FIG. 20.

  FIG. 20 is a flowchart showing the flow of processing of the data display / sensor device 100.

  First, with the data display / sensor device 100 powered off, the user presses the power switch 905 to turn on the data display / sensor device 100 (S11).

  In step S13, when the user presses the power switch 905, an image is displayed on the display / light sensor unit 300. Here, it is assumed that an image is displayed on the first display / light sensor unit 300A. In this state, the data display / sensor device 100 waits until the first display / light sensor unit 300A and the second display / light sensor unit 300B detect the image of a finger.

  Here, when displaying an image on the first display / light sensor unit 300A, a command specifying the upper screen ("001") as the value of the "display panel" field is output from the input interface control unit 805 to the data processing unit 700. When displaying an image on the second display / light sensor unit 300B, a command specifying the lower screen ("010") as the value of the "display panel" field is output from the input interface control unit 805 to the data processing unit 700. Here, since an image is to be displayed on the first display / light sensor unit 300A, a command specifying the upper screen ("001") as the value of the "display panel" field is output from the input interface control unit 805 to the data processing unit 700.

  Further, at this stage, it is assumed that the display / light sensor unit 300 is in a state in which a nearby image can be detected. To perform the detection, a command in which "event" is specified in the "data acquisition timing" field and both "coordinate" and "whole image" are specified ("101") in the "data type" field is output from the input interface control unit 805 to the data processing unit 700. Note that a command specifying only "coordinates" or only "whole image" may instead be output from the input interface control unit 805 to the data processing unit 700.

  Further, in order to perform the detection in the first display / light sensor unit 300A and the second display / light sensor unit 300B, a command in which both display / light sensor units 300 are specified ("011") in the "scan panel" field is output from the input interface control unit 805 to the data processing unit 700.
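
  As a rough illustration, the commands exchanged between the input interface control unit 805 and the data processing unit 700 could be assembled as follows. The field names follow the text; representing each command as a dictionary of field values, and the helper function names, are assumptions made for this sketch.

```python
# Sketch of assembling commands for the data processing unit 700.
# Field names ("display panel", "scan panel", "data acquisition timing",
# "data type") and the bit-string values follow the description above.

def make_display_command(panel: str) -> dict:
    # "001" selects the upper screen (300A), "010" the lower screen (300B).
    values = {"upper": "001", "lower": "010"}
    return {"display panel": values[panel]}

def make_scan_command() -> dict:
    return {
        "scan panel": "011",                  # scan both 300A and 300B
        "data acquisition timing": "event",   # report when a change occurs
        "data type": "101",                   # both "coordinate" and "whole image"
    }
```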

  Thus, when the sensor built-in liquid crystal panel 301 detects a nearby image, the detection determination unit 810, the superimposition determination unit 820, and the input detection unit 830 can acquire coordinate data, whole image data, and partial image data from the data processing unit 700.

  That is, when the user's finger touches the first display / light sensor unit 300A and the second display / light sensor unit 300B, the light sensor circuit 32 detects at least one of a shadow image and a reflection image of the touched finger. As a result, the image data acquired by the data processing unit 700 from the sensor control unit 602 changes.

  In this manner, by acquiring the coordinate data, whole image data, and partial image data that are output from the data processing unit 700 when a change occurs in the image data, the detection determination unit 810 can recognize that both the first display / light sensor unit 300A and the second display / light sensor unit 300B have detected the finger image.

  In step S15, the detection determination unit 810 determines whether the first display / light sensor unit 300A and the second display / light sensor unit 300B have both detected a finger image. That is, since the coordinate data, whole image data, and partial image data for each display / light sensor unit 300 are input from the data processing unit 700 to the detection determination unit 810, the detection determination unit 810 determines that both the first display / light sensor unit 300A and the second display / light sensor unit 300B have detected the finger image. In FIG. 20, for the sake of convenience, the first display / light sensor unit 300A is referred to as "A", and the second display / light sensor unit 300B is referred to as "B".

  When both the first display / light sensor unit 300A and the second display / light sensor unit 300B detect the finger image (Yes in step S15), the process proceeds to step S17. In step S17, the superimposition determination unit 820 determines whether or not the display image region, which is the region of the image displayed on the first display / light sensor unit 300A, and the detected image region, which is the region of the finger image detected by the first display / light sensor unit 300A and the second display / light sensor unit 300B, are superimposed. That is, by acquiring the coordinate data, whole image data, and partial image data for each display / light sensor unit 300 from the data processing unit 700, the superimposition determination unit 820 compares the image region identified from these data with the display image region, which is the region of the image displayed on the first display / light sensor unit 300A, and determines whether or not an overlapping area (superimposition region) exists. Note that the superimposition determination unit 820 reads image information related to the image displayed on the first display / light sensor unit 300A from the storage unit 901.
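
  The superimposition determination of step S17 amounts to testing whether two regions overlap. A minimal sketch follows, assuming for simplicity that both the display image region and the detected image region can be approximated by axis-aligned rectangles; the patent itself only requires deciding whether an overlapping area exists.

```python
# Sketch of the superimposition determination: do the display image region and
# the detected finger image region overlap? Rectangles are (left, top, right, bottom).
def regions_overlap(display_region, finger_region):
    dl, dt, dr, db = display_region
    fl, ft, fr, fb = finger_region
    return not (fr < dl or dr < fl or fb < dt or db < ft)

# Example: image 25a at (100, 100, 300, 200); a finger image around (150, 160)
# overlaps it, so a superimposition region exists.
assert regions_overlap((100, 100, 300, 200), (140, 150, 180, 190))
```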

  If it is determined that the superimposition region exists (Yes in step S17), the process proceeds to step S19. In step S19, the input detection unit 830 detects the movement of the finger images detected by the first display / light sensor unit 300A and the second display / light sensor unit 300B, generates an input signal associated with the movement of the finger images, and outputs the input signal to the operation control unit 840.

  Subsequently, in step S21, the operation control unit 840 receives the input signal from the input detection unit 830, and reads from the storage unit 901 the finger-command correspondence information in which the input signal is associated with a command for causing the application execution unit 850 to execute a predetermined operation.

  The operation control unit 840 reads the finger-command correspondence information from the storage unit 901, and transmits the operation execution command to the application execution unit 850 based on the finger-command correspondence information.

  Subsequently, in step S23, the application execution unit 850 receives the operation execution command from the operation control unit 840, and executes the operation indicated by the command. Then, the application execution unit 850 transmits operation result information indicating that the operation has been performed to the image display control unit 860.

  In step S25, the image display control unit 860 receives operation result information from the application execution unit 850 and transmits display data to the data processing unit 700. The display data prompts the data processing unit 700 to display the operation result executed by the application execution unit 850 on the first display / light sensor unit 300A. As a result, the first display / light sensor unit 300A displays the operation result executed by the application execution unit 850 based on the operation execution command received from the operation control unit 840.

  At this time, when displaying an image on the first display / light sensor unit 300A, a command specifying the upper screen ("001") as the value of the "display panel" field is output from the input interface control unit 805 to the data processing unit 700. When displaying an image on the second display / light sensor unit 300B, a command specifying the lower screen ("010") as the value of the "display panel" field is output from the input interface control unit 805 to the data processing unit 700. Here, since the image is displayed on the first display / light sensor unit 300A, a command specifying the upper screen ("001") as the value of the "display panel" field is output from the input interface control unit 805 to the data processing unit 700.

  On the other hand, when it progresses to "No" by step S15 and step S17, it returns to step S15 again. In addition, although demonstrated in the order which performs the process of step S17 after step S15, you may carry out simultaneously or in reverse order.

  In the example described with reference to FIG. 23 described later, a desired operation can be executed even when an image is not displayed on the first display / light sensor unit 300A. Therefore, in this case, the process proceeds to the step (step S19) in which the input detection unit 830 generates an input signal without determining whether or not the overlapping region exists (step S17).
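
  The overall flow of FIG. 20 can be summarized in pseudocode as follows. The object and method names are placeholders standing in for the blocks of FIG. 1, and the sketch simply strings steps S11 to S25 together; it is not an implementation taken from the patent.

```python
# Condensed sketch of the processing flow of FIG. 20 (steps S11 to S25).
def main_loop(device):
    device.power_on()                                   # S11
    device.display_image_on_300A()                      # S13
    while True:
        if not device.detection_determination():        # S15: finger image on both 300A and 300B?
            continue
        if device.requires_overlap and not device.superimposition_determination():
            continue                                    # S17 (skipped in the FIG. 23 variant)
        signal = device.input_detection()               # S19: generate the input signal
        command = device.operation_control(signal)      # S21: look up finger-command correspondence
        result = device.application_execution(command)  # S23: execute the operation
        device.image_display_control(result)            # S25: display the result on 300A
```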

  Here, the processing flow of each of the above steps (steps S11 to S25) will be described with reference to FIG. 9.

  In FIG. 9A, a rectangular image 25a is displayed on the first display / light sensor unit 300A. This is because a command specifying the upper screen ("001") as the value of the "display panel" field has been output from the input interface control unit 805 to the data processing unit 700, and as a result, the image 25a is displayed on the first display / light sensor unit 300A.

  At this stage, a command in which "event" is specified in the "data acquisition timing" field and both "coordinate" and "whole image" ("101") are specified in the "data type" field has already been output from the input interface control unit 805 to the data processing unit 700, and the display / light sensor unit 300 is in a state where a nearby image can be detected. Note that a command specifying only "coordinates" or only "whole image" may instead be output from the input interface control unit 805 to the data processing unit 700.

  In addition, in order to perform the detection in the first display / light sensor unit 300A and the second display / light sensor unit 300B, a command in which both display / light sensor units 300 are specified ("011") in the "scan panel" field is output from the input interface control unit 805 to the data processing unit 700.

  In this state, the user touches the rectangular image 25a with the thumb 20 from the top surface (the side on which the first display / light sensor unit 300A is disposed) and with the index finger 21 from the bottom surface (the side on which the second display / light sensor unit 300B is disposed). Since both the first display / light sensor unit 300A and the second display / light sensor unit 300B can detect the images of the fingers, the detection determination unit 810 determines that both the first display / light sensor unit 300A and the second display / light sensor unit 300B have detected the finger images, and outputs the determination result to the input detection unit 830.

  Further, as shown in FIG. 9A, the display image region, which is the region of the image 25a displayed on the first display / light sensor unit 300A, and the image region, which is the region of the images of the thumb 20 and the index finger 21 detected by the first display / light sensor unit 300A and the second display / light sensor unit 300B, are superimposed. Therefore, the superimposition determination unit 820 determines that there is a superimposition region where the display image region and the image region overlap, and outputs the determination result to the input detection unit 830.

  The superimposition determination unit 820 acquires the coordinate data, whole image data, and partial image data of the finger images from the data processing unit 700, and identifies the image region from these data. In addition, the superimposition determination unit 820 identifies the display image region of the image 25a by reading image information related to the image displayed on the first display / light sensor unit 300A from the storage unit 901. Since the superimposition determination unit 820 can thus identify both the image region and the display image region, it can determine whether or not there is a superimposition region where the image region and the display image region overlap.

  When the detection determination unit 810 determines that both the first display / light sensor unit 300A and the second display / light sensor unit 300B have detected the finger images, and the superimposition determination unit 820 determines that the superimposition region exists, the input detection unit 830 detects the movement of the finger images detected by the first display / light sensor unit 300A and the second display / light sensor unit 300B.

  Here, in FIG. 9B, the thumb 20 and the index finger 21 are moved in opposite directions (in the horizontal direction of the drawing, the thumb 20 is moved in the right direction and the index finger 21 is moved in the left direction), so the input detection unit 830 detects this movement of the finger images and generates an input signal associated with the movement of the finger images. The input detection unit 830 acquires image information related to the rectangular image 25a directly from the storage unit 901 or via the superimposition determination unit 820, and the input signal is generated in association with the rectangular image 25a. Then, the input detection unit 830 outputs the input signal to the operation control unit 840.

  The operation control unit 840 reads from the storage unit 901 the finger-command correspondence information in which the input signal is associated with a command for causing the application execution unit 850 to execute a predetermined operation. Next, the operation control unit 840 transmits the operation execution command to the application execution unit 850. Here, the image displayed on the first display / light sensor unit 300A is the rectangular image 25a, and the finger images detected on the front and back surfaces of the data display / sensor device 100 move in opposite directions. In this case, the finger-command correspondence information associates this movement with an operation of changing the rectangular image 25a into the fan-shaped image 25a.

  Therefore, the application execution unit 850 performs a process of expanding the rectangular image 25a into the fan-shaped image 25a, and then transmits operation result information indicating that "the process of expanding the rectangular image 25a into the fan-shaped image 25a has been performed" to the image display control unit 860.

  Upon receiving the operation result information from the application execution unit 850, the image display control unit 860 transmits display data to the data processing unit 700. The display data is data instructing that the rectangular image 25a be expanded into the fan-shaped image 25a for display. Then, based on the display data transmitted from the image display control unit 860, the data processing unit 700 causes the first display / light sensor unit 300A to display an image obtained by expanding the rectangular image 25a into the fan-shaped image 25a.

  The fan-shaped image 25a is displayed because a command specifying the upper screen ("001") as the value of the "display panel" field is output from the input interface control unit 805 to the data processing unit 700; as a result, the fan-shaped image 25a is displayed on the first display / light sensor unit 300A.

  In this manner, the operation described with reference to FIGS. 9A and 9B is executed, and the execution result is displayed on the first display / light sensor unit 300A.

  Various operations described with reference to FIGS. 10 to 18 are executed by the same processing flow, and the execution results are displayed on the first display / light sensor unit 300A.

  Here, the right hand has been described; for the left hand as well, by storing in the storage unit 901 finger-command correspondence information in which the input signal is associated with a command for causing the application execution unit 850 to execute a predetermined operation, a desired operation can be executed on the display image displayed on the first display / light sensor unit 300A.

  Further, here, the display image is displayed on the first display / light sensor unit 300A, but the image may be displayed on the second display / light sensor unit 300B.

  Furthermore, although it has been described that the thumb 20 and the index finger 21 are used as the detection targets of the first display / light sensor unit 300A and the second display / light sensor unit 300B, the fingers to be detected are not limited to these; the middle finger, the ring finger, or the little finger may be used. This is because the predetermined operation is performed on the image displayed on the first display / light sensor unit 300A based on the detected movement of the finger images rather than on the type of finger.

  The effects obtained by the processing of the data display / sensor device 100 will be described as follows.

  Since the data display / sensor device 100 has the above-described configuration, when the first display / light sensor unit 300A and the second display / light sensor unit 300B, which are disposed on the front and back surfaces and each detect a nearby image, both detect a finger image, an input signal associated with the movement of the finger image is generated by the input detection unit 830. In other words, if only one of the first display / light sensor unit 300A and the second display / light sensor unit 300B detects a finger image, no input signal associated with the movement of the finger image is generated.

  As described above, in the data display / sensor device 100, the input signal is generated by input from the two input surfaces of the display / light sensor units 300 on the front and back surfaces. The input signal is a signal associated with the movement of the finger images detected by the first display / light sensor unit 300A and the second display / light sensor unit 300B. Therefore, by causing the first display / light sensor unit 300A and the second display / light sensor unit 300B to detect various movements of the finger images, the user can perform a variety of three-dimensional operations that could not be realized with a conventional input device, and such operations provide excellent convenience and operability. In addition, since the operation is performed so that the screen is sandwiched between the front and back surfaces of the plate-like data display / sensor device 100, the user can use the device very easily.

  Moreover, since an input signal is generated only by input from the two input surfaces, even if the user's finger unintentionally touches an icon on the touch panel, it is possible to avoid a situation in which the finger image is detected, an input signal is generated, and some processing is executed. As a result, the risk of erroneous operation on the input surface is reduced.

  Thus, the data display / sensor device 100 can provide the user with excellent operability and convenience.

  Further, in the data display / sensor device 100, in the above configuration, the operation control unit 840 transmits to the application execution unit 850 the command corresponding to the input signal, based on the finger-command correspondence information in which the input signal is associated with a command for causing the application execution unit 850 to execute a predetermined operation.

  Accordingly, the display / light sensor units 300 on the front and back surfaces detect the movement of the finger images, and an input signal associated with the movement of the finger images is generated by the input detection unit 830. Then, based on the finger-command correspondence information in which the input signal is associated with a command for causing the application execution unit 850 to execute a predetermined operation, the command associated with the input signal is transmitted to the application execution unit 850. The application execution unit 850 then executes the command corresponding to the input signal.

  Therefore, the user of the data display / sensor device 100 can cause the application execution unit 850 to execute a desired operation by causing the display / light sensor unit 300 on the front and back sides to detect various finger image movements.

  Further, in the data display / sensor device 100, in the above-described configuration, the input detection unit 830 can be configured to generate the input signal when the display / light sensor units 300 on the front and back surfaces detect a finger image in a predetermined area on those display / light sensor units 300. The data display / sensor device 100 can also include the image display control unit 860, which displays an image in the predetermined area of the display / light sensor units 300 on the front and back surfaces, and the input detection unit 830 can generate the input signal in association with that image.

  According to the above configuration, the data display / sensor device 100 generates an input signal when a finger image is detected in a predetermined area on the front / back display / light sensor unit 300.

  Therefore, the input detection unit 830 does not generate an input signal when the finger touches another area different from the predetermined area on the display / light sensor unit 300 on both the front and back surfaces.

  Therefore, even when the user touches another area different from the predetermined area on the display / light sensor units 300 on the front and back surfaces with a finger, a situation in which an input signal is generated can be avoided.

  Further, in the data display / sensor device 100, the image display control unit 860 displays an image in a predetermined area. Further, when a finger image is detected in a predetermined region, the input detection unit 830 generates an input signal associated with the image.

  Therefore, the user can cause the application execution unit 850 to execute a predetermined operation by sandwiching the image from both sides with fingers. Conversely, since the input detection unit 830 does not generate an input signal in a region different from the predetermined region where the image is displayed, even if the user touches a region different from the predetermined region where the image is displayed, a situation in which the input surface is erroneously operated can be avoided.

  Therefore, the user can realize a desired operation only for an image displayed in the predetermined area, and as a result, the data display / sensor device 100 can provide the user with excellent operability and convenience.

  Further, in the data display / sensor device 100, the image display control unit 860 can be configured to cause the first display / light sensor unit 300A to display an image based on the result of the predetermined operation executed by the application execution unit 850.

  That is, the image display control unit 860 can cause the first display / light sensor unit 300A to display an image based on a result of a predetermined operation executed by the application execution unit 850.

  Therefore, the image display control unit 860 can display the result of the operation executed in association with the movement of the user's finger on the screen and allow the user to confirm the result.

  In addition, the user can confirm the result of the operation performed based on the movement of his / her finger on the screen. Then, when it is desired to give some operation to the image after the operation, it is only necessary to cause the display / light sensor unit 300 on both the front and back surfaces to detect the desired movement of the finger image. That is, the user can continuously perform the next operation by checking the operation result on the screen.

  Thus, the data display / sensor device 100 can provide the user with more excellent operability and convenience.

(Modification 1)
Next, a data display / sensor device 110, which is a modification of the data display / sensor device 100, will be described with reference to FIG. 19, FIG. 21, and FIG. 22. Descriptions similar to the content explained with reference to FIG. 1, FIGS. 9 to 18, and FIG. 20 are omitted.

  FIG. 19A is a diagram illustrating a state in which a rectangular image 34a is displayed on the first display / light sensor unit 300A. FIG. 19B is a diagram illustrating a state in which the index finger 21 and the middle finger 22 are moved in the same direction (rightward in the horizontal direction in the drawing) to gradually switch from the image 34a to the image 34b. FIG. 19C is a diagram illustrating a state in which the index finger 21 and the middle finger 22 are moved further in the same direction (rightward in the horizontal direction in the drawing) and the image displayed on the screen has been switched from the image 34a to the image 34b.

  In FIG. 19A, a rectangular image 34a is displayed on the first display / light sensor unit 300A, and the user touches the image 34a with the index finger 21 from the upper surface and the middle finger 22 from the lower surface. At this time, the index finger 21 and the middle finger 22 hold the data display / sensor device 110 from both the front and back surfaces by the belly and side surfaces of the finger in addition to the fingertip. Therefore, the contact area between the index finger 21 and the middle finger 22 and the display / light sensor unit 300 is larger than when the display / light sensor unit 300 is held only by the fingertip.

  Next, as shown in FIG. 19B, the user moves the index finger 21 and the middle finger 22 in the same direction (rightward in the horizontal direction of the drawing) with respect to the data display / sensor device 110, so that the image 34b appears from the left side of the drawing and the image 34a and the image 34b are displayed side by side. That is, the switching from the image 34a to the image 34b is performed as the index finger 21 and the middle finger 22 move. Note that the switching from the image 34a to the image 34b is triggered by the contact area between the index finger 21 and the middle finger 22 and the display / light sensor unit 300 being larger than a predetermined contact area. Therefore, when the contact area between the index finger 21 and the middle finger 22 and the display / light sensor unit 300 is smaller than the predetermined contact area, the switching from the image 34a to the image 34b is not executed. This point differs from the switching from the image 33a to the image 33b described with reference to FIG. 17. The predetermined contact area does not need to be a specific numerical value and can be set arbitrarily, and the value is stored in the storage unit 901.

  Subsequently, as shown in FIG. 19C, the index finger 21 and the middle finger 22 are further moved in the same direction (rightward in the horizontal direction in the drawing) to complete the switching from the image 34a to the image 34b.

  Thus, in the data display / sensor device 110 and its control method, the switching from the image 34a to the image 34b is performed in accordance with the movement of the index finger 21 and the middle finger 22, triggered by the contact area between the index finger 21 and the middle finger 22 and the display / light sensor unit 300 being larger than the set area. Therefore, simply sandwiching the image 34a from the front and back sides with the fingertips does not execute the switching from the image 34a to the image 34b, and the user is consciously required to perform the operation using the belly or side of the finger. That is, the data display / sensor device 110 deliberately asks the user for such an operation to reduce the risk of an erroneous operation.

  Note that by stopping the movement of the finger before the switching from the image 34a to the image 34b is completely performed, the state where the images 34a and 34b are displayed side by side can be maintained. In addition, switching from the image 34a to the image 34b can be executed at a desired speed according to the speed of finger movement. Furthermore, the image can be switched from the image 34b to the image 34a by moving the index finger 21 and the middle finger 22 in the opposite directions. The moving direction of the boundary between the image 34a and the image 34b may be the horizontal direction in the drawing, the vertical direction in the drawing, or another direction, and can be changed as appropriate. Various operations described here are realized by setting finger-command correspondence information in advance and referring to the finger-command correspondence information.
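
  The following sketch shows one way the boundary between the image 34a and the image 34b could track the finger movement as described above. Treating the switch as a progress ratio driven by the horizontal displacement of the fingers is an assumption made purely for illustration.

```python
# Sketch: switching progress from image 34a (0.0) to image 34b (1.0) that
# follows the horizontal finger displacement dx reported per detection cycle.
def update_switch_progress(progress: float, dx: float, screen_width: float) -> float:
    progress += dx / screen_width            # rightward movement advances the switch
    return min(max(progress, 0.0), 1.0)

# dx == 0 keeps 34a and 34b displayed side by side at the current boundary;
# dx < 0 (opposite movement) switches back toward image 34a.
```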

  As examples of use of the operations described in FIGS. 19A and 19B, the following embodiments can be considered. For example, consider a case where the user wants to view a plurality of photographs in an electronic album one after another. In such a case, when the user wants to move from the photograph displayed on the screen (image 34a) to the next photograph (image 34b), the user can switch from the image 34a to the image 34b easily and quickly by the operation of the index finger 21 and the middle finger 22 described above. In this way, the data display / sensor device 110 can reliably respond to the user's request to switch images and view the next image.

  As described above, the switching from the image 34a to the image 34b is triggered by the contact area between the index finger 21 and the middle finger 22 and the display / light sensor unit 300 being larger than a predetermined area. Accordingly, the data display / sensor device 110 that realizes the screen switching illustrated in FIG. 19B must be provided with a configuration capable of calculating the contact area between the index finger 21 and the middle finger 22 and the display / light sensor unit 300. Therefore, the configuration of the data display / sensor device 110 will be described with reference to FIG. 21. The same reference numerals are given to the same components as those described above, and detailed description of these components is omitted.

  FIG. 21 is a block diagram illustrating a main configuration of the data display / sensor device 110. As illustrated, the input interface control unit 805 further includes an area calculation determination unit 870.

  The area calculation determination unit 870 acquires partial image data from the data processing unit 700 and calculates the area of the acquired finger image (hereinafter referred to as a contact area). Then, a predetermined area set in advance (hereinafter referred to as a set area) is read from the storage unit 901, and the contact area and the set area are compared. Thereafter, the area calculation determination unit 870 outputs the determination result to the input detection unit 830. The area calculation method may be a general method performed by conventional image processing software.
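
  A minimal sketch of the area calculation is given below. Counting the pixels of the finger image in the partial image data and comparing the count with the set area read from the storage unit 901 is one straightforward interpretation; the brightness threshold and the 8-bit image format are assumptions about the sensor data.

```python
import numpy as np

# Sketch of the area calculation determination unit 870: compute the contact
# area from the partial image data and compare it with the set area.
def contact_area(partial_image: np.ndarray, threshold: int = 128) -> int:
    # Pixels at or above the threshold are treated as belonging to the finger image.
    return int((partial_image >= threshold).sum())

def area_determination(partial_image: np.ndarray, set_area: int) -> bool:
    # True means the input detection unit 830 may generate an input signal.
    return contact_area(partial_image) > set_area
```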

  When the detection determination unit 810 determines that both the first display / light sensor unit 300A and the second display / light sensor unit 300B have detected a finger image, the superimposition determination unit 820 determines that a superimposition region exists, and the area calculation determination unit 870 determines that the contact area is larger than the set area, the input detection unit 830 detects the movement of the finger images detected by the first display / light sensor unit 300A and the second display / light sensor unit 300B, generates an input signal associated with the movement of the fingers, and outputs the input signal to the operation control unit 840.

  Therefore, when the area calculation determination unit 870 determines that the contact area is smaller than the set area, the input detection unit 830 does not generate an input signal. As a result, no matter how the user moves his / her finger, switching from the image 34a to the image 34b cannot be executed.

  As described above, the data display / sensor device 110 performs the switching from the image 34a to the image 34b using, as a trigger, the contact area between the index finger 21 and the middle finger 22 and the display / light sensor unit 300 being larger than the set area.

  Next, the processing flow of the data display / sensor device 110 will be described with reference to FIGS. 19 and 22. FIG. 22 is a flowchart showing the processing flow of the data display / sensor device 110. Note that the same step numbers are assigned to the same steps as in the processing flow of the data display / sensor device 100 described with reference to FIG. 20. Therefore, detailed description of these steps is omitted.

  If the superimposition determination unit 820 determines that a superimposition region exists (Yes in step S17), the process proceeds to step S18. In step S18, the area calculation determination unit 870 acquires partial image data from the data processing unit 700, and calculates the area (contact area) of the acquired finger image. Then, the area calculation determination unit 870 reads a predetermined area (set area) set in advance from the storage unit 901 and determines whether or not the contact area is larger than the set area.

  The comparison between the contact area and the set area by the area calculation determination unit 870 may be made either by comparing the contact area of the finger image detected by the first display / light sensor unit 300A with the set area, or by comparing the contact area of the finger image detected by the second display / light sensor unit 300B with the set area. Alternatively, the configuration may be such that the larger (or smaller) of the contact areas of the finger images detected by the two display / light sensor units 300 is compared with the set area.

  On the other hand, when it progresses to "No" at step S18, it returns to step S15 again. In addition, although demonstrated as performing a process in order of step S15, step S17, and step S18, it is not limited to the order of the process demonstrated here, You may change suitably.

  Thus, in the data display / sensor device 110 and its control method, the switching from the image 34a to the image 34b is performed in accordance with the movement of the index finger 21 and the middle finger 22, triggered by the contact area between the index finger 21 and the middle finger 22 and the display / light sensor unit 300 being larger than the set area. Therefore, simply sandwiching the image 34a from the front and back sides with the fingertips does not execute the switching from the image 34a to the image 34b, and the user is consciously required to perform the operation using the belly or side of the finger. That is, the data display / sensor device 110 deliberately asks the user for such an operation to reduce the risk of an erroneous operation.

(Modification 2)
Next, another modification will be described with reference to FIGS. 23 and 24.

  FIGS. 23A to 23C are diagrams showing how the background color of the screen changes stepwise in a state where no image is displayed on the display / light sensor unit 300. FIG. 24 is a block diagram showing the main configuration of the data display / sensor device 120 according to an embodiment of the present invention.

  Hereinafter, the data display / sensor device 120 will be described with reference to FIGS. 23 and 24. The side shown in the drawings will be described as the first display / light sensor unit 300A. The same reference numerals are given to the same components as those described above, and detailed description of these components is omitted.

  As shown in FIG. 23, the first display / light sensor unit 300A has a signal generation region 170 near the lower end of the screen. The superimposition determination unit 820 reads the coordinates indicating the signal generation region 170 from the storage unit 901 (see FIG. 24). Note that the signal generation region 170 may or may not be displayed on the first display / light sensor unit 300A.

  Here, the superimposition determination unit 820 superimposes the signal generation region 170 and the image region that is the region of the finger image detected by the first display / light sensor unit 300A and / or the second display / light sensor unit 300B. It is determined whether or not.

  More specifically, the superimposition determination unit 820 acquires the coordinate data, whole image data, and partial image data of the finger images detected by the two display / light sensor units 300 from the data processing unit 700, and identifies the image region from these data. Furthermore, the superimposition determination unit 820 reads the coordinates indicating the signal generation region 170 from the storage unit 901. Then, the superimposition determination unit 820 determines whether or not there is a region where the image region and the signal generation region 170 overlap, and outputs the determination result to the input detection unit 830.
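
  Since the signal generation region 170 is held in the storage unit 901 as coordinates, the determination described above reduces to the same kind of rectangle overlap test sketched earlier; the concrete coordinate values below are illustrative only.

```python
# Sketch: hit test against the signal generation region 170 near the lower end
# of the screen. Rectangles are (left, top, right, bottom).
SIGNAL_GENERATION_REGION = (0, 440, 320, 480)

def finger_in_signal_region(finger_region):
    sl, st, sr, sb = SIGNAL_GENERATION_REGION
    fl, ft, fr, fb = finger_region
    return not (fr < sl or sr < fl or fb < st or sb < ft)
```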

  When the detection determination unit 810 determines that both the first display / light sensor unit 300A and the second display / light sensor unit 300B have detected the finger images, and the superimposition determination unit 820 determines that there is a superimposition region between the image region and the signal generation region 170, the input detection unit 830 detects the movement of the finger images detected by the first display / light sensor unit 300A and the second display / light sensor unit 300B, generates an input signal associated with the movement of the finger images, and outputs the input signal to the operation control unit 840.

  Accordingly, when the finger image areas detected by the first display / light sensor unit 300A and the second display / light sensor unit 300B are not superimposed on the signal generation area 170, the input detection unit 830 does not generate an input signal. That is, even if the user touches an area different from the signal generation area 170 with a finger, a desired operation cannot be realized.

  In FIG. 23, the signal generation region 170 is provided on the lower side of the drawing. However, the position, size, and range where the signal generation region 170 is provided may be arbitrarily set. FIG. 23 is only an example.

  The following embodiment can be considered as an example of using the operation described in FIG.

  The user touches the signal generation area 170 with the thumb 20 from the upper surface (the side on which the first display / light sensor unit 300A is disposed) and with the index finger 21 from the lower surface (the side on which the second display / light sensor unit 300B is disposed). Accordingly, the detection determination unit 810 determines that both the first display / light sensor unit 300A and the second display / light sensor unit 300B have detected the finger images, and the superimposition determination unit 820 determines that there is a superimposition region where the image region and the signal generation region 170 overlap.

  Then, the thumb 20 and the index finger 21 are moved in opposite directions (in the vertical direction of the drawing, the thumb 20 upward and the index finger 21 downward). Thereby, the input detection unit 830 outputs an input signal associated with the movement of the finger images to the operation control unit 840. Here, it is assumed that the finger movement in which the thumb 20 and the index finger 21 move in opposite directions (more precisely, the movement of the finger images caused by that finger movement) is associated with an operation in which the background color of the screen of the first display / light sensor unit 300A changes stepwise. Then, the operation control unit 840 reads from the storage unit 901 the finger-command correspondence information in which the input signal is associated with a command for causing the application execution unit 850 to execute a predetermined operation, and transmits the operation execution command corresponding to the input signal to the application execution unit 850. Here, the operation execution command is a command for causing the application execution unit 850 to execute an operation in which the background color of the screen of the first display / light sensor unit 300A changes stepwise. Therefore, finally, the background color of the screen of the first display / light sensor unit 300A changes stepwise in accordance with the processing flow described above.

  As described above, in the data display / sensor device 120, even when the image is not displayed on the first display / light sensor unit 300A, the user causes the data display / sensor device 120 to execute a desired operation. be able to.

  The data display / sensor device 120 is configured to generate an input signal when a finger image is detected in the signal generation area 170 on the front / back display / light sensor unit 300. Therefore, even if the user touches another area different from the signal generation area 170 on the display / light sensor unit 300 on both the front and back surfaces, it is possible to avoid a situation in which an input signal is generated.

  Therefore, the data display / sensor device 120 can provide the user with excellent operability and convenience by setting the signal generation area 170 as an area that is easy for the user to operate. When the user holds the data display / sensor device 120, the signal generation area 170 is provided in the vicinity where the thumb and the index finger are positioned, thereby further facilitating the operation of the user touching the signal generation area 170. Convenience and operability for the user are further improved.

  Further, in the data display / sensor device 120, the user can set the position, size, and range of the signal generation area 170 as appropriate. In addition, in order to restrict use by other users, the signal generation area 170 can also be set to a specific area. Thereby, a situation in which another user uses the data display / sensor device 120 without permission can be avoided. As described above, the data display / sensor device 120 can provide the user with a device that is excellent in convenience and operability.

  Here, as an example derived from this modification, the following example is also conceivable. An example of its use will be described below.

  The first display / light sensor unit 300A has a signal generation region 170 near the lower end of the screen. Here, it is assumed that the rectangular image 25a described with reference to FIG. 9 is displayed in a region different from the signal generation region 170 of the first display / light sensor unit 300A. Note that the display area of the rectangular image 25a is recorded in the storage unit 901 by coordinates.

  Then, when the detection determination unit 810 determines that both the first display / light sensor unit 300A and the second display / light sensor unit 300B have detected the finger images, and the superimposition determination unit 820 determines that there is a superimposition region between the image region and the signal generation region 170, the input detection unit 830 detects the movement of the finger images detected by the first display / light sensor unit 300A and the second display / light sensor unit 300B and generates an input signal associated with the movement of the finger images.

  At this time, the input detection unit 830 reads the display area of the rectangular image 25a from the storage unit 901, and outputs to the operation control unit 840 an input signal associated with the display area of the rectangular image 25a so that the predetermined operation is executed on it.

  In this way, by causing the first display / light sensor unit 300A to detect the movement of the finger images in the signal generation area 170, the user can execute a predetermined operation on the rectangular image 25a displayed in an area different from the signal generation area 170. That is, the user can execute a predetermined operation on the rectangular image 25a without touching the area where the rectangular image 25a is displayed with a finger.

Therefore, even when the rectangular image 25a is displayed at a position far from the fingers and it is difficult to sandwich the rectangular image 25a from both the front and back sides, the user can perform a predetermined operation on the rectangular image 25a by moving his or her fingers in the signal generation area 170. Thereby, the data display / sensor device 120 can provide the user with a device that is excellent in convenience and operability.
[Embodiment 2]
Next, an example of the data display / sensor device 130 according to the present embodiment will be described with reference to FIGS. 25 to 27. In the example described here, the user realizes a desired operation by combining the various operations described with reference to FIGS. 9 to 24. Therefore, detailed description of those operations, of the processing for realizing them, and of the effects obtained by the data display / sensor device 130 according to the present embodiment is omitted.

  FIGS. 25 and 26 are diagrams showing an embodiment of the data display / sensor device 130. FIG. 25A shows a state in which the "menu screen (TEL / TV / WEB)" is displayed on the display / light sensor unit 300. FIG. 25B is a diagram showing a state in which the image portion indicating "TV" is sandwiched on the menu screen and pulled out to the right side of the drawing. FIG. 25C is a diagram showing a state in which "commercial broadcasting" is selected from the TV menu. FIG. 25D is a diagram showing a state in which the commercial broadcast "CH01" is being viewed.

  Next, FIG. 26A is a diagram showing a state in which the commercial broadcasting channel is switched from "CH01" to "CH02". FIG. 26B is a diagram illustrating the state when "CH02" is being viewed. FIG. 26C is a diagram illustrating a state in which "commercial broadcasting CH02" is switched to "CS CH02". FIG. 26D is a diagram showing the state when "CS CH02" is being viewed.

  FIG. 27 is a conceptual diagram of CS, commercial broadcasting, and BS broadcasting in the TV menu, and of channel selection within each of these broadcast waves.

  First, an embodiment of the data display / sensor device 130 will be specifically described with reference to FIGS. 25 and 26.

  FIG. 25A shows the “menu screen (TEL / TV / WEB)” on the first display / light sensor unit 300A. In this figure, the menu screen is described using TEL / TV / WEB, but the item to be selected is not limited to the item described here, and may be a menu such as a photo or a moving image.

  In this state, the user can select any one of TEL, TV, and WEB from the menu screen. This is the same state as the state described above in which the images 29a to 29d are displayed near the upper end of the screen of the first display / light sensor unit 300A.

  FIG. 25B is a diagram illustrating a state in which the image portion indicating "TV" is sandwiched and pulled out from the menu screen. That is, in FIG. 25B, from the "menu screen (TEL / TV / WEB)" displayed near the left end of the screen of the first display / light sensor unit 300A, the user pinches the desired menu with the thumb 20 from above and the index finger 21 from below, and moves the thumb 20 and the index finger 21 in the same direction (rightward in the horizontal direction in the drawing). Thus, for example, when "TV" is selected, the image indicating TV is stretched out in the right direction, as shown in FIG. 25B.

  As shown in the figure, the TV menu further includes the item "CS / commercial broadcasting / BS" (the type of broadcast wave). Therefore, the user can select a desired broadcast wave from this item. Here, the description is made with the item "CS / commercial broadcasting / BS", but other items may be included.

  FIG. 25C is a diagram showing a state in which “Private release” is selected from the TV menu. This state is a state in which the user selects “CS / Private Release / BS” from “CS / PB”, and the thumb 20 and the index finger 21 described with reference to FIG. This corresponds to the operation of selecting the image 34. That is, at the stage of FIG. 26 (c), “CS / Private Release / BS” is displayed on the first display / light sensor unit 300A, and the user selects the desired menu with the thumb 20 from the top and the index finger 21 from the bottom. Touch. Thus, for example, when “private release” is selected, the private release is selected as shown in FIG. 25C, and the state shifts to the state shown in FIG.

  FIG. 25D is a diagram showing a state in which the commercial broadcast "CH01" is being viewed. As a result of selecting "commercial broadcasting", the commercial broadcast "CH01" can be viewed.

  Next, FIG. 26A is a diagram showing a state in which the commercial broadcasting channel is switched from "CH01" to "CH02". When the user wants to switch the channel from "CH01" to "CH02", as shown in the figure, the user moves the thumb 20 and the index finger 21 in opposite directions (in the vertical direction of the drawing, the thumb 20 upward and the index finger 21 downward) to switch from "CH01" to "CH02". This corresponds to the operation, described with reference to FIG. 17, of switching the image 33a to the image 33b by moving the thumb 20 and the index finger 21 in opposite directions.

  FIG. 26B is a diagram illustrating a state in which “CH02” is being viewed. As a result of switching from “CH01” to “CH02”, it is possible to view “CH02” which is a commercial broadcast. It is like that.

  FIG. 26 (c) is a diagram showing a state where “private channel CH02” is switched to “CS CH02”. When the user wants to switch the broadcast radio wave from “private broadcasting” to “CS”, the user moves the thumb 20 and the index finger 21 in the same direction (rightward in the horizontal direction in the drawing) as shown in the figure, The company is switching from “private broadcast” to “CS CH02”. This corresponds to the operation of moving the image by moving the thumb 20 and the index finger 21 in the same direction as described with reference to FIG. Alternatively, this operation is similar to the operation of enlarging the image described with reference to FIG.

  FIG. 26 (d) is a diagram showing a state when “CS CH02” is being viewed. As a result of switching from “CH01” to “CH02”, “CH02” can be viewed. It has become.

  Here, the flow of processing described with reference to FIGS. 25 and 26 will be schematically described with reference to FIG. 27. FIG. 27 is a conceptual diagram of CS, commercial broadcasting, and BS broadcasting in the TV menu, and of channel selection for each of these broadcast waves. In the illustrated state, "TV" has already been selected from the "menu screen (TEL / TV / WEB)".

  Here, in order to understand the flow of processing described with reference to FIGS. 25 and 26, consider a state in which, as shown in FIG. 27, the items "CS / commercial broadcasting / BS" (the types of broadcast waves) are arranged in the horizontal direction of the drawing, and the "channels" are arranged in a ring for each item.

  The user is required to move the fingers in the horizontal direction when selecting the broadcast wave to be viewed. For example, when the user wants to switch from "commercial broadcasting" to "CS", the user pinches the image displayed on the first display / light sensor unit 300A with the thumb 20 from above and the index finger 21 from below, and moves the thumb 20 and the index finger 21 in the same direction (rightward in the horizontal direction of the drawing). Thereby, the broadcast of "CS" is displayed on the first display / light sensor unit 300A. In addition, when the user wants to switch from "commercial broadcasting" to "BS", the fingers are moved in the direction opposite to that used when switching from "commercial broadcasting" to "CS". Thereby, the broadcast of "BS" is displayed on the first display / light sensor unit 300A. That is, when switching among the items "CS / commercial broadcasting / BS" arranged in the horizontal direction shown in FIG. 27, a desired broadcast wave can be selected by moving the fingers in the horizontal direction.

  To switch the channel being viewed, the user moves the fingers in the vertical direction. For example, when the user wants to switch from "CH01" to "CH02", the user pinches the image displayed on the first display/light sensor unit 300A with the thumb 20 from the top surface and the index finger 21 from the bottom surface, and moves the thumb 20 and the index finger 21 in opposite directions (in the vertical direction of the drawing, the thumb 20 upward and the index finger 21 downward). Thereby, the "CH02" broadcast is displayed on the first display/light sensor unit 300A. Conversely, when switching from "CH02" to "CH01", the fingers are moved in the direction opposite to that used when switching from "CH01" to "CH02". Thereby, the "CH01" broadcast is displayed on the first display/light sensor unit 300A. That is, when switching among the "channels" arranged in the vertical direction shown in FIG. 27, the user can switch to a desired channel by moving the fingers in the vertical direction.
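
  The selection model conceptualized in FIG. 27 can thus be thought of as a horizontal axis of broadcast radio wave types and, for each type, a vertical ring of channels. The following minimal sketch (in Python) illustrates this two-axis model; the class, the channel lists, and the rule that the channel position is kept when the type changes are assumptions made for illustration only and are not part of the embodiment.

# Minimal sketch of the FIG. 27 selection model (hypothetical names, not from the embodiment).
BROADCAST_TYPES = ["CS", "commercial broadcasting", "BS"]   # horizontal axis
CHANNELS = {                                                # vertical ring for each type (placeholder lists)
    "CS": ["CH01", "CH02", "CH03"],
    "commercial broadcasting": ["CH01", "CH02", "CH03"],
    "BS": ["CH01", "CH02"],
}

class TvSelection:
    def __init__(self):
        self.type_index = 1      # start on "commercial broadcasting"
        self.channel_index = 0   # start on "CH01"

    def move_horizontal(self, step):
        # Same-direction movement of both fingers: switch the broadcast radio wave type.
        self.type_index = (self.type_index + step) % len(BROADCAST_TYPES)
        ring = CHANNELS[BROADCAST_TYPES[self.type_index]]
        self.channel_index %= len(ring)   # keep the channel position, as in FIG. 26(c) (assumption)

    def move_vertical(self, step):
        # Opposite-direction movement of both fingers: move along the channel ring.
        ring = CHANNELS[BROADCAST_TYPES[self.type_index]]
        self.channel_index = (self.channel_index + step) % len(ring)

    def current(self):
        broadcast_type = BROADCAST_TYPES[self.type_index]
        return broadcast_type, CHANNELS[broadcast_type][self.channel_index]

selection = TvSelection()
selection.move_vertical(1)      # "CH01" -> "CH02", as in FIG. 26(a)
selection.move_horizontal(-1)   # "commercial broadcasting" -> "CS", as in FIG. 26(c)
print(selection.current())      # ('CS', 'CH02'), as in FIG. 26(d)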

  Thus, by referring to the finger-command correspondence information set in advance for each function (TEL/TV/WEB), or for each display screen in a state where a desired function is selected, the movement of the fingers (more precisely, the movement of the finger images detected by the display/light sensor unit 300) is detected, and the operation associated with that movement is executed. Thereby, the user can display a desired screen on the first display/light sensor unit 300A. As described with reference to FIGS. 25 and 26, the user can realize a desired operation by appropriately combining the various operations described above.
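
  As a rough sketch of how such finger-command correspondence information might be organized in software, a per-screen lookup table keyed by the detected movement could be consulted as follows; the table contents, the key names, and the execute() interface of the operation execution device are hypothetical and are not taken from the embodiment.

# Hypothetical finger-command correspondence information: the detected movement of the
# finger images is reduced to a label, and the table for the currently displayed screen
# maps that label to an instruction for the operation execution device.
FINGER_COMMAND_INFO = {
    "TV/viewing": {
        "pinch_move_same_direction_horizontal": "switch_broadcast_type",
        "pinch_move_opposite_directions_vertical": "switch_channel",
    },
    "menu": {
        "pinch_and_pull_out": "select_function",
    },
}

def dispatch(current_screen, detected_movement, operation_execution_device):
    # Look up the instruction associated with the detected movement for the screen
    # currently displayed and send it to the operation execution device; unknown
    # movements are simply ignored.
    instruction = FINGER_COMMAND_INFO.get(current_screen, {}).get(detected_movement)
    if instruction is not None:
        operation_execution_device.execute(instruction)
    return instruction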

  The present invention is not limited to the above-described embodiments, and various modifications can be made within the scope shown in the claims. Embodiments obtained by appropriately combining technical means disclosed in different embodiments are also included in the technical scope of the present invention.

  An information terminal device such as a mobile phone provided with the input device according to the present invention is also included in the technical scope of the present invention.

(Program, computer-readable recording medium)
Finally, each block of the data display/sensor devices 100, 110, 120, and 130 according to the present embodiment, in particular the detection determination unit 810, the superimposition determination unit 820, the input detection unit 830, and the operation control unit 840 of the input interface control unit 805, as well as the application execution unit 850, the image display control unit 860, and the area calculation determination unit 870 of the data display/sensor device 110, may be configured by hardware logic, or may be realized by software using a CPU as follows.

  That is, the data display/sensor devices 100, 110, 120, and 130 according to the present embodiment include a CPU (Central Processing Unit) that executes instructions of a control program realizing each function, a ROM (Read Only Memory) that stores the program, a RAM (Random Access Memory) into which the program is expanded, and a storage device (recording medium) such as a memory that stores the program and various data. The object of the present invention can also be achieved by supplying, to the data display/sensor devices 100, 110, 120, and 130, a computer-readable recording medium on which the program code (executable program, intermediate code program, or source program) of the control program of the data display/sensor devices 100, 110, 120, and 130, which is software for realizing the functions described above, is recorded, and by having the computer (or a CPU or MPU) read and execute the program code recorded on the recording medium.
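
  As a purely illustrative example of realizing some of the above blocks in software, the following sketch (in Python) ties the detection determination unit 810, the input detection unit 830, the operation control unit 840, and the application execution unit 850 into one processing pass; the method names and the internal logic are assumptions, and only the block names follow the embodiment.

# Illustrative software realization of the functional blocks (hypothetical methods).
# A detected finger image is assumed to be None when nothing is detected, and to carry
# a .movement attribute when something is detected.

class DetectionDeterminationUnit:                       # corresponds to 810
    def both_detected(self, front_image, back_image):
        return front_image is not None and back_image is not None

class InputDetectionUnit:                               # corresponds to 830
    def generate_input_signal(self, front_movement, back_movement):
        # Reduce the movements detected on the two surfaces to a single input signal label.
        if front_movement == back_movement:
            return "same_direction_" + front_movement
        return "opposite_directions"

class OperationControlUnit:                             # corresponds to 840
    def __init__(self, finger_command_info, application_execution_unit):
        self.finger_command_info = finger_command_info
        self.application_execution_unit = application_execution_unit   # corresponds to 850

    def handle(self, input_signal):
        instruction = self.finger_command_info.get(input_signal)
        if instruction is not None:
            self.application_execution_unit.execute(instruction)

def process_frame(front_image, back_image, unit_810, unit_830, unit_840):
    # One pass of the input pipeline: determine, detect, then dispatch.
    if unit_810.both_detected(front_image, back_image):
        signal = unit_830.generate_input_signal(front_image.movement, back_image.movement)
        unit_840.handle(signal)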

  Examples of the recording medium include tape systems such as magnetic tapes and cassette tapes; disk systems including magnetic disks such as floppy (registered trademark) disks and hard disks, and optical disks such as CD-ROM/MO/MD/DVD/CD-R; card systems such as IC cards (including memory cards) and optical cards; and semiconductor memory systems such as mask ROM/EPROM/EEPROM/flash ROM.

  The data display/sensor devices 100, 110, 120, and 130 according to the present embodiment may be configured to be connectable to a communication network, and the program code may be supplied via the communication network. The communication network is not particularly limited; for example, the Internet, an intranet, an extranet, a LAN, an ISDN, a VAN, a CATV communication network, a virtual private network, a telephone line network, a mobile communication network, or a satellite communication network can be used. The transmission medium constituting the communication network is also not particularly limited; for example, wired media such as IEEE 1394, USB, power line carrier, cable TV lines, telephone lines, and ADSL lines, as well as wireless media such as infrared (IrDA, remote control), Bluetooth (registered trademark), 802.11 wireless, HDR, mobile phone networks, satellite lines, and terrestrial digital networks, can be used. The present invention can also be realized in the form of a computer data signal embedded in a carrier wave, in which the program code is embodied by electronic transmission.

  The present invention is not limited to the above-described embodiments, and various modifications can be made within the scope shown in the claims. That is, embodiments obtained by combining technical means appropriately modified within the scope of the claims are also included in the technical scope of the present invention.

  The present invention can be applied to mobile devices (portable terminals), such as mobile phones and PDAs, that accept input from two input surfaces. In particular, it is suitable for a portable terminal equipped with an image detection device.

It is a block diagram showing the main configuration of a data display/sensor device according to one embodiment of the present invention.
It is a diagram schematically showing a cross section of the sensor-equipped liquid crystal panel provided in the data display/sensor device according to one embodiment of the present invention.
(a) is a schematic diagram showing how the position touched by the user is detected by sensing a reflected image with the sensor-equipped liquid crystal panel provided in the data display/sensor device according to one embodiment of the present invention; (b) is a schematic diagram showing how the touched position is detected by sensing a shadow image with the same panel.
It is a block diagram showing the main configuration of the data display/sensor device according to one embodiment of the present invention.
It is a diagram schematically showing an example of the frame structure of a command used in the data display/sensor device according to one embodiment of the present invention.
It is a diagram explaining examples of the values that can be specified in each field contained in the command shown in FIG. 5, and their outlines.
(a) is image data obtained as a result of scanning the entire sensor-equipped liquid crystal panel when no object is placed on the panel in the data display/sensor device according to one embodiment of the present invention; (b) is image data obtained as a result of scanning when the user touches the panel with a finger.
It is a block diagram showing the configuration of the sensor-equipped liquid crystal panel provided in the data display/sensor device according to one embodiment of the present invention, and the configuration of its peripheral circuits.
(a) is a diagram showing a state in which a rectangular image is displayed on the display/light sensor unit and the image is touched with the thumb from the upper surface and the index finger from the lower surface; (b) is a diagram showing a state in which the rectangular image is expanded into a fan shape by moving the thumb and the index finger in opposite directions.
(a) is a diagram showing a state in which a circular image is displayed on the display/light sensor unit and the image is touched with the thumb from the upper surface and the index finger from the lower surface; (b) is a diagram showing a state in which a larger, circularly enlarged image is displayed by moving the thumb and the index finger in opposite directions.
(a) is a diagram showing a state in which a substantially square image is displayed on the display/light sensor unit and the image is touched with the thumb from the upper surface and the index finger from the lower surface; (b) is a diagram showing a state in which the image is moved toward the outside of the screen by moving the thumb and the index finger in the same direction.
(a) is a diagram showing a state in which a pipe-shaped image, whose pipe axis (tube length) direction is the horizontal direction of the drawing, is displayed on the display/light sensor unit and the image is touched with the thumb from the upper surface and the index finger from the lower surface; (b) is a diagram showing a state in which the image is rotated in the direction orthogonal to the pipe axis by moving the thumb and the index finger in opposite directions.
(a) is a diagram showing a state in which several images arranged in a row in the horizontal direction of the drawing are displayed near the top edge of the display/light sensor unit and one of the images is touched with the thumb from the upper surface and the index finger from the lower surface; (b) is a diagram showing a state in which a band-like image is pulled out downward by moving the thumb and the index finger in the same direction (downward in the drawing).
(a) is a diagram showing a state in which a rectangular image is displayed on the display/light sensor unit and the display screen of the display/light sensor unit is touched with the thumb from the upper surface and the index finger from the lower surface; (b) is a diagram showing a state in which the thumb and the index finger are moved in the same direction and the image is divided into two images.
(a) is a diagram showing a state in which a plurality of images arranged in a circle are displayed on the display/light sensor unit and are touched, near the center of the circle, with the thumb from the upper surface and the index finger from the lower surface; (b) is a diagram showing a state in which the plurality of images are rotated along the circumference by moving the index finger downward in the drawing without moving the thumb.
(a) is a diagram showing a state in which an image formed by combining a circular image and a rectangular image is displayed on the display/light sensor unit, the vicinity of the center of the circular image is touched with the thumb from the upper surface, and the rectangular image is touched with the index finger from the lower surface; (b) is a diagram showing a state in which, with the center point of the circular image serving as a fulcrum, the index finger is moved to the left side of the drawing without moving the thumb and the rectangular image is swung to the left like a pendulum.
(a) is a diagram showing a state in which an image is displayed on the display/light sensor unit and the image is touched with the thumb from the upper surface and the index finger from the lower surface; (b) is a diagram showing a state in which the thumb and the index finger are moved in opposite directions, another image appears from the lower side of the drawing, and the two images are displayed side by side; (c) is a diagram showing a state in which the thumb and the index finger are moved further in opposite directions and the image displayed on the screen has been switched.
(a) is a diagram showing a state in which an image is displayed on the display/light sensor unit; (b) is a diagram showing a state in which the image is touched with the thumb and the index finger from the front and back surfaces almost simultaneously and the image is thereby "selected".
(a) is a diagram showing a state in which an image is displayed on the display/light sensor unit and the image is touched with the index finger from the upper surface and the middle finger from the lower surface; (b) is a diagram showing a state in which the index finger and the middle finger are moved in the same direction (rightward in the horizontal direction of the drawing) and the display gradually switches to another image; (c) is a diagram showing a state in which the index finger and the middle finger are moved further in the same direction and the image displayed on the screen has been switched.
It is a flowchart showing the flow of processing of the data display/sensor device according to one embodiment of the present invention.
It is a block diagram showing the main configuration of a data display/sensor device according to another embodiment of the present invention.
It is a flowchart showing the flow of processing of the data display/sensor device according to another embodiment of the present invention.
(a) to (c) are diagrams showing how the color of the background changes in steps while no image is displayed on the display/light sensor unit.
It is a block diagram showing the main configuration of a data display/sensor device according to still another embodiment of the present invention.
It is a diagram showing an example of the data display/sensor device according to the present embodiment: (a) shows a state in which the "menu screen (TEL/TV/WEB)" is displayed on the display/light sensor unit; (b) shows a state in which the image representing "TV" is pinched and pulled out from the menu screen; (c) shows a state in which "commercial broadcasting" is selected from the TV menu; (d) shows a state in which commercial broadcast "CH01" is being viewed.
It is a diagram showing an example of the data display/sensor device according to the present embodiment: (a) shows a state in which the commercial broadcast channel is switched from "CH01" to "CH02"; (b) shows a state in which "CH02" is being viewed; (c) shows a state in which commercial broadcast "CH02" is switched to "CS CH02"; (d) shows a state in which "CS CH02" is being viewed.
It is a conceptual diagram of the TV menu, of CS, commercial broadcasting, and BS broadcasting, and of channel selection for each of these broadcast radio waves.
It is a diagram for explaining the image information of the circular images 26a and 26b.
It is a diagram for explaining the image information of the rectangular image 27a.
It is a diagram showing an example of finger-command correspondence information.

Explanation of symbols

100, 110, 120, 130 Data display/sensor device (input device)
170 Signal generation area (predetermined area)
800 Main control unit
805 Input interface control unit
810 Detection determination unit (detection determination means)
820 Superimposition determination unit (superimposition determination means)
830 Input detection unit (input detection means)
840 Operation control unit (operation control means)
850 Application execution unit (operation execution device)
860 Image display control unit (image display control means)
870 Area calculation determination unit
901 Storage unit

Claims (9)

  1. A plate-like input device in which planar members that detect a nearby image are disposed on both the front and back surfaces, the input device comprising:
    detection determination means for determining whether or not the planar members on both the front and back surfaces have each detected a finger image; and
    input detection means for, when the detection determination means determines that the planar members on both the front and back surfaces have both detected finger images, detecting the movements of the finger images respectively detected by the planar members on the front and back surfaces, and generating an input signal associated with the movements of the finger images.
  2.   The input device according to claim 1, further comprising operation control means for transmitting an instruction corresponding to the input signal to an operation execution device, based on finger-command correspondence information in which the input signal is associated with an instruction for causing the operation execution device to execute a predetermined operation.
  3.   The input device according to claim 1 or 2, wherein the input detection means generates the input signal when the planar members on both the front and back surfaces detect a finger image in a predetermined region on the respective planar members.
  4. The input device according to claim 3, further comprising image display control means for displaying an image in the predetermined region on the planar members on both the front and back surfaces,
    wherein the input detection means generates the input signal associated with the image.
  5.   The input device according to claim 4, wherein the image display control means displays, on the planar member, an image based on a result of the predetermined operation executed by the operation execution device.
  6. A method for controlling a plate-like input device in which planar members that detect a nearby image are disposed on both the front and back surfaces, the method comprising:
    a detection determination step of determining whether or not the planar members on both the front and back surfaces have each detected a finger image; and
    an input detection step of, when it is determined in the detection determination step that the planar members on both the front and back surfaces have both detected finger images, detecting the movements of the finger images respectively detected by the planar members on the front and back surfaces, and generating an input signal associated with the movements of the finger images.
  7. A control program for a plate-like input device in which planar members that detect a nearby image are disposed on both the front and back surfaces, the control program causing a computer to execute:
    a detection determination step of determining whether or not the planar members on both the front and back surfaces have each detected a finger image; and
    an input detection step of, when it is determined in the detection determination step that the planar members on both the front and back surfaces have both detected finger images, detecting the movements of the finger images respectively detected by the planar members on the front and back surfaces, and generating an input signal associated with the movements of the finger images.
  8.   A computer-readable recording medium on which the control program for the input device according to claim 7 is recorded.
  9.   An information terminal device comprising the input device according to claim 1.
JP2008326274A 2008-12-22 2008-12-22 Input device, method for controlling input device, program for controlling input device, computer-readable recording medium, and information terminal device Pending JP2010146506A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008326274A JP2010146506A (en) 2008-12-22 2008-12-22 Input device, method for controlling input device, program for controlling input device, computer-readable recording medium, and information terminal device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2008326274A JP2010146506A (en) 2008-12-22 2008-12-22 Input device, method for controlling input device, program for controlling input device, computer-readable recording medium, and information terminal device

Publications (2)

Publication Number Publication Date
JP2010146506A true JP2010146506A (en) 2010-07-01
JP2010146506A5 JP2010146506A5 (en) 2012-02-16

Family

ID=42566839

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2008326274A Pending JP2010146506A (en) 2008-12-22 2008-12-22 Input device, method for controlling input device, program for controlling input device, computer-readable recording medium, and information terminal device

Country Status (1)

Country Link
JP (1) JP2010146506A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2000163031A (en) * 1998-11-25 2000-06-16 Seiko Epson Corp Portable information equipment and information storage medium
JP2000293280A (en) * 1999-04-07 2000-10-20 Sharp Corp Information input device
JP2003330611A (en) * 2002-05-16 2003-11-21 Sony Corp Input method and input device
JP2003345511A (en) * 2002-05-24 2003-12-05 Canon Inc Image recorder/reproducer with touch panel
JP2006244446A (en) * 2005-02-03 2006-09-14 Toshiba Matsushita Display Technology Co Ltd Display device
JP2007141029A (en) * 2005-11-21 2007-06-07 Matsushita Electric Ind Co Ltd Portable information device
JP2009223426A (en) * 2008-03-13 2009-10-01 Sharp Corp Information display device and method

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8830184B2 (en) 2009-09-29 2014-09-09 Fujifilm Corporation Image displaying device, image displaying method, and program for displaying images
JP2011076233A (en) * 2009-09-29 2011-04-14 Fujifilm Corp Image displaying device, image displaying method, and program
CN102375597A (en) * 2010-08-04 2012-03-14 索尼公司 Information processing apparatus, information processing method, and computer program
JP2012037979A (en) * 2010-08-04 2012-02-23 Sony Corp Information processor, information processing method, and computer program
WO2012026395A1 (en) * 2010-08-24 2012-03-01 京セラ株式会社 Mobile terminal and unlocking method
US9398463B2 (en) 2010-08-24 2016-07-19 Kyocera Corporation Portable terminal and lock state canceling method
US9154954B2 (en) 2010-08-24 2015-10-06 Kyocera Corporation Portable terminal and lock state canceling method
JP2012084137A (en) * 2010-09-15 2012-04-26 Kyocera Corp Portable electronic device, screen control method and screen control program
JP2016054005A (en) * 2010-09-15 2016-04-14 京セラ株式会社 Portable electronic device, screen control method, and screen control program
JP2012095069A (en) * 2010-10-27 2012-05-17 Kyocera Corp Portable terminal, lock state control program and lock state control method
JP2012141869A (en) * 2011-01-05 2012-07-26 Sony Corp Information processing apparatus, information processing method, and computer program
JP2012174252A (en) * 2011-02-24 2012-09-10 Kyocera Corp Electronic device, contact operation control program, and contact operation control method
JP2012208758A (en) * 2011-03-30 2012-10-25 Brother Ind Ltd Input apparatus and image display apparatus
US8866737B2 (en) 2011-03-30 2014-10-21 Brother Kogyo Kabushiki Kaisha Input device and image display apparatus
JP2013008340A (en) * 2011-06-27 2013-01-10 Kyocera Corp Portable terminal device, program, and display control method
US9983700B2 (en) 2011-07-14 2018-05-29 Nec Corporation Input device, image display method, and program for reliable designation of icons
JP2014531688A (en) * 2011-09-30 2014-11-27 マイクロソフト コーポレーション Omni-directional gesture input
JP2013089202A (en) * 2011-10-21 2013-05-13 Sony Computer Entertainment Inc Input control unit, input control method and input control program
US9433857B2 (en) 2011-10-31 2016-09-06 Sony Corporation Input control device, input control method, and input control program
CN103890703A (en) * 2011-10-31 2014-06-25 索尼电脑娱乐公司 Input control device, input control method, and input control program
EP2752745A4 (en) * 2011-10-31 2015-06-03 Sony Computer Entertainment Inc Input control device, input control method, and input control program
WO2013065214A1 (en) * 2011-10-31 2013-05-10 株式会社ソニー・コンピュータエンタテインメント Input control device, input control method, and input control program
JP2013097563A (en) * 2011-10-31 2013-05-20 Sony Computer Entertainment Inc Input control device, input control method, and input control program
JP2013117885A (en) * 2011-12-02 2013-06-13 Nintendo Co Ltd Information processing program, information processing equipment, information processing system and information processing method
JP2013140416A (en) * 2011-12-28 2013-07-18 Alpine Electronics Inc Scroll controller and scroll control method
US10394366B2 (en) 2012-06-29 2019-08-27 Nec Corporation Terminal device, display control method, and program
JPWO2014003012A1 (en) * 2012-06-29 2016-06-02 日本電気株式会社 Terminal device, display control method, and program
JP2014029673A (en) * 2012-07-04 2014-02-13 Canon Inc Display device and method for controlling the same
JP2014063319A (en) * 2012-09-20 2014-04-10 Sharp Corp Information processing apparatus, control method, control program, and recording medium
US9594445B2 (en) 2012-10-01 2017-03-14 Canon Kabushiki Kaisha Operation reception device and method for receiving operation on page image, storage medium, and image forming apparatus for use with operation reception device
WO2014054424A1 (en) * 2012-10-01 2014-04-10 Canon Kabushiki Kaisha Operation reception device and method for receiving operation
JP2015133124A (en) * 2015-02-05 2015-07-23 京セラ株式会社 Portable terminal and locked-state releasing method

Legal Events

Date Code Title Description
A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20111222

A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20111222

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20121120

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20121121

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20130118

A02 Decision of refusal

Free format text: JAPANESE INTERMEDIATE CODE: A02

Effective date: 20131022