JP2009187290A - Controller with touch panel and program - Google Patents

Controller with touch panel and program

Info

Publication number
JP2009187290A
JP2009187290A (application number JP2008026624A)
Authority
JP
Japan
Prior art keywords
control
surface
display
touch panel
content
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2008026624A
Other languages
Japanese (ja)
Inventor
Yasushi Kamiya
Koichi Kashiwazaki
Original Assignee
Yamaha Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yamaha Corp
Priority to JP2008026624A
Publication of JP2009187290A
Application status: Pending

Abstract

To provide a control device with a touch panel and a program which have touch panels on both sides of a display panel and can perform various controls in association with operations on the respective touch panels.
In a portable device according to an embodiment of the present invention, a front touch panel and a back touch panel provided on the two sides of a display panel are operated by the user. This enables various controls: control performed by an operation on either one of the front touch panel and the back touch panel, control in which the two operations share distinct roles, and control performed jointly by both operations. Further, since the display panel 105 is a transmissive display, the user can easily operate the back touch panel 108 even while viewing the display panel 105 from the front side.
[Selection] Figure 6

Description

  The present invention relates to a technique for an operation method using a touch panel.

Touch panels are used as operation means in various portable terminals because they allow intuitive operation. Exploiting this intuitive operation, a touch panel is used for character input in, for example, a PDA (Personal Digital Assistant) such as an electronic notebook. The use is not limited to portable terminals: for example, as disclosed in Patent Document 1, in a touch screen that uses a touch panel on a transmissive organic EL (Electroluminescence) display, a user can view an object on the opposite side through the light-transmitting display and operate the touch panel while checking the display and the object together. Further, as disclosed in Patent Document 2, a touch screen that has touch panels on both display surfaces of a double-sided display can be operated by two users within a limited installation area.
Patent Document 1: JP 2000-331557 A
Patent Document 2: JP 2007-304361 A

  In the touch screen disclosed in Patent Document 2, the touch panels provided on the two sides are operated by different users. That is, it is merely two touch screens, each a normal display provided with a touch panel, joined back to back; the touch panels are operated independently of each other, and independent processing is performed according to each operation.

  The present invention has been made in view of the above circumstances, and its purpose is to provide a control device with a touch panel, and a program, that have touch panels on both sides of a display panel and can perform various controls in association with operations on each touch panel.

  In order to solve the above-described problems, the present invention provides a control device with a touch panel comprising: a generation unit that generates video data; a control unit that performs control according to supplied control information; a display panel having a first surface, which is a display surface that performs display according to the video data, and a second surface, which is the back surface of the first surface, the second surface side being visible from the first surface side; a first touch panel provided on the first surface side that outputs first operation information in response to an operation on the first surface; a second touch panel provided on the second surface side that outputs second operation information in response to an operation on the second surface; a first determination unit that determines the control content in the control unit according to the relationship between the display content displayed on the first surface and the first operation information; a second determination unit that determines the control content in the control unit according to the relationship between the display content displayed on the first surface and the second operation information; and a supply unit that supplies the control unit with control information corresponding to the control content determined by each of the first determination unit and the second determination unit, wherein the control content that can be determined by the first determination unit and the control content that can be determined by the second determination unit are set to include different control contents.

  Moreover, in another preferable aspect, the control device further comprises a composite determination unit that, when the relationship between the display content displayed on the first surface, the first operation information, and the second operation information is a predetermined relationship, determines the control content in the control unit according to that relationship. The control content that can be determined by the composite determination unit is set to differ from the control contents that can be determined by the first determination unit and the second determination unit, and when the composite determination unit determines the control content, the supply unit supplies the control unit with control information corresponding to the control content determined by the composite determination unit, in place of the control contents determined by the first determination unit and the second determination unit.

  In addition, the present invention provides a control device with a touch panel comprising: a generation unit that generates video data; a control unit that performs control according to supplied control information; a display panel having a first surface, which is a display surface that performs display according to the video data, and a second surface, which is the back surface of the first surface, the second surface side being visible from the first surface side; a first touch panel provided on the first surface side that outputs first operation information in response to an operation on the first surface; a second touch panel provided on the second surface side that outputs second operation information in response to an operation on the second surface; a composite determination unit that determines the control content in the control unit according to the relationship between the display content displayed on the first surface, the first operation information, and the second operation information; and a supply unit that supplies the control unit with control information corresponding to the control content determined by the composite determination unit.

  In another preferred aspect, one of the control contents determined by the first determination unit and the second determination unit is a control content that specifies a parameter to be controlled, and the other is a control content that specifies the value of that parameter, and the control unit performs control to change the value of the specified parameter in accordance with the supplied control information.
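As an illustrative sketch only (not part of the patent disclosure; all class, function, and parameter names here are hypothetical), the division of roles described above, where one touch panel's determination names the parameter to control and the other supplies its value, could be modeled as follows:

```python
# Hypothetical model: one determination names WHICH parameter to control,
# the other supplies its VALUE; the control unit applies the pair.

class ControlUnit:
    """Stand-in for the control unit that acts on supplied control information."""
    def __init__(self):
        self.params = {"volume": 50, "brightness": 50}

    def apply(self, control_info):
        # control_info is a (parameter name, value) pair.
        name, value = control_info
        if name in self.params:
            self.params[name] = value

def combine(front_decision, back_decision):
    """Merge the two determinations into one piece of control information.
    front_decision: ("param", name); back_decision: ("value", v)."""
    if front_decision[0] == "param" and back_decision[0] == "value":
        return (front_decision[1], back_decision[1])
    return None  # the pair does not form a complete control content
```

For example, a front-side touch that selects the "volume" parameter combined with a back-side gesture yielding the value 80 would produce the control information `("volume", 80)`.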

  In another preferable aspect, the first touch panel detects an operation on the first surface by a predetermined detection method and outputs first operation information corresponding to the detection result, and the second touch panel detects an operation on the second surface by a detection method different from that of the first touch panel and outputs second operation information corresponding to the detection result.

  In another preferable aspect, the display panel makes the second surface side visible from the first surface side by transmitting light from the second surface side to the first surface side.

  In another preferable aspect, the control device further comprises an imaging unit provided on the second surface side that photographs a direction substantially along the normal of the second surface, on the side opposite to the first surface, and outputs photographing data, and the display panel makes the second surface side visible from the first surface side by performing display on the first surface according to the video data and the photographing data output from the imaging unit.

  In another preferable aspect, a mirror image of the display related to the video data is displayed on the second surface of the display panel.

  In another preferable aspect, the control in the control means includes control for changing a display mode of the first surface.

  In another preferred aspect, the control by the control means includes generation of audio data indicating the content of pronunciation.

  Moreover, the present invention provides a portable device comprising the above-described control device with a touch panel and a housing having an opening, wherein the control device with a touch panel is housed in the housing and the first surface and the second surface are exposed from the opening.

  In addition, the present invention provides a program that causes a computer, provided with a display panel having a first surface, which is a display surface that performs display according to video data, and a second surface, which is the back surface of the first surface, the second surface side being visible from the first surface side, a first touch panel provided on the first surface side that outputs first operation information in response to an operation on the first surface, and a second touch panel provided on the second surface side that outputs second operation information in response to an operation on the second surface, to realize: a generation function that generates video data; a control function that performs control according to supplied control information; a first determination function that determines the control content in the control function according to the relationship between the display content displayed on the first surface and the first operation information; a second determination function that determines the control content in the control function according to the relationship between the display content displayed on the first surface and the second operation information; and a supply function that supplies the control function with control information corresponding to the control content determined by each of the first determination function and the second determination function, wherein the control content that can be determined by the first determination function and the control content that can be determined by the second determination function are set to include different control contents.

  In addition, the present invention provides a program that causes a computer, provided with a display panel having a first surface, which is a display surface that performs display according to video data, and a second surface, which is the back surface of the first surface, the second surface side being visible from the first surface side, a first touch panel provided on the first surface side that outputs first operation information in response to an operation on the first surface, and a second touch panel provided on the second surface side that outputs second operation information in response to an operation on the second surface, to realize: a generation function that generates video data; a control function that performs control according to supplied control information; a composite determination function that determines the control content in the control function according to the relationship between the display content displayed on the first surface, the first operation information, and the second operation information; and a supply function that supplies the control function with control information corresponding to the determined control content.

  According to the present invention, it is possible to provide a control device with a touch panel, and a program, that have touch panels on both surfaces of a display panel and can perform various controls in association with operations on each touch panel.

  Hereinafter, an embodiment of the present invention will be described.

<Embodiment>
A portable device according to an embodiment of the present invention is a portable game machine having a transmissive display with touch panels provided on both of its sides, and it includes a control device with a touch panel that controls the portable device according to operations on the touch panels. The user of this portable device can thereby control it by operating the double-sided touch panels. The hardware configuration of the portable device according to the embodiment of the present invention is described below.

  FIG. 1 is a diagram illustrating the hardware configuration of a mobile device 10 according to an embodiment of the present invention, and FIG. 2 is a diagram illustrating the appearance of the mobile device 10. The mobile device 10 includes a CPU (Central Processing Unit) 101, a storage unit 102, a RAM (Random Access Memory) 103, an operation unit 104, a display panel 105, a speaker 106, a front touch panel 107, a back touch panel 108, and an interface 109, which are connected to each other via a bus 100. The components of the mobile device 10 are integrally housed in the housing 200, and the portion corresponding to the display area 300 is exposed from an opening of the housing 200.

  The CPU 101 reads out the program stored in the storage unit 102 to the RAM 103 and executes it. Thereby, the CPU 101 controls each part of the mobile device 10 via the bus 100 and realizes each function as described later. The RAM 103 functions as a work area when the CPU 101 processes each data.

  The storage unit 102 is a storage unit such as a hard disk drive (HDD), a read only memory (ROM), and a nonvolatile memory, and stores the above-described program and various types of information.

  The operation unit 104 is operation means, such as operation keys, separate from the front touch panel 107 and the back touch panel 108. When operated by the user, it outputs a signal representing the operation content to the CPU 101.

  The display panel 105 is display means that performs display according to supplied video data. The display panel 105 is a transmissive display: light on one surface side is transmitted to the other surface side, so that each surface side can be viewed from the other. For example, in an organic EL display, both the anode that supplies current to the organic EL light-emitting material and the electrode layer corresponding to the cathode are made of a material that transmits visible light. Such materials include transparent conductors such as ITO (Indium Tin Oxide), as well as metals such as gold or aluminum thinned to several nm to several tens of nm, through which visible light passes.

  In addition, the display panel 105 performs display corresponding to the supplied video data in the display area 300, as shown in FIG. 2. The three circular shapes in the display area 300 of FIG. 2 represent virtual particles 500-1, 500-2, and 500-3, an example of the display performed according to the video data. In the present embodiment, as shown in FIG. 3A, coordinates are set in the display area 300 with the upper left corner at (0, 0) and the lower right corner at (1024, 756).

  As described above, since the display panel 105 is a transmissive display, its display can be seen from both sides. When viewed from one side of the display panel 105 (hereinafter, the display side), the display area 300 shows the display according to the video data, as in FIG. 3A. When viewed from the other side, the display area 300 has the coordinate relationship shown in FIG. 3B, so the display appears as a mirror image of the display on the display side. Thus, the display panel 105 displays on both sides; hereinafter, the display surface carrying the original display corresponding to the content of the video data is defined as the front surface of the display panel 105, and the surface showing the mirror-image display is defined as the back surface of the display panel 105.

  Returning to FIG. 1, the description will be continued. The speaker 106 is sound emitting means for emitting sound according to the supplied audio data, and has one or a plurality of speaker units.

  The front touch panel 107 and the back touch panel 108 are touch panels provided on the front side and the back side of the display panel 105, respectively. In cross section, the part exposed through the opening of the housing 200 as the display area 300 is therefore a stack, from the front side, of the front touch panel 107, the display panel 105, and the back touch panel 108, as shown in FIG. 4. The part corresponding to the display area 300 is thus a touch screen provided with touch panels on both sides.

  In this embodiment, the front touch panel 107 and the back touch panel 108 are resistive-film touch panels, but other detection methods, such as a capacitive, optical, or ultrasonic method, may also be used. In addition, instead of an analog touch panel that detects a single touched point, a digital (matrix) type capable of detecting multiple points may be used. Further, the front touch panel 107 and the back touch panel 108 may use the same detection method or different detection methods.

  On the front touch panel 107, coordinates are set to correspond to the coordinates of the display area 300, with the upper left corner at (0, 0) and the lower right corner at (1024, 756), as shown in FIG. 3A. When the user touches a specific point on the front touch panel 107, front operation information indicating the coordinates corresponding to the touched point is output to the CPU 101.

  On the other hand, on the back touch panel 108, coordinates are set to correspond to the coordinates of the display area 300, with the upper right corner at (0, 0) and the lower left corner at (1024, 756), as shown in FIG. 3B. As described above, since the display seen from the back side is a mirror image of the display on the front side, the coordinate relationship of this touch panel is also set as a mirror image. When the user touches a specific point on the back touch panel 108, back operation information indicating the coordinates corresponding to the touched point is output to the CPU 101. Note that the coordinates set on the back touch panel 108 may instead be the same as the coordinates set on the front touch panel 107, rather than a mirror image; in that case, the CPU 101 may perform processing that converts the coordinates indicated by the back operation information so that they correspond to the display on the display surface.
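As an illustrative sketch only (not from the patent text; the function name and constants are hypothetical), the conversion mentioned for the non-mirrored variant, where the back touch panel shares the front panel's coordinate assignment and the CPU 101 maps its reports onto the front-surface display, amounts to a horizontal mirror:

```python
# Hypothetical sketch: convert a point reported by the back touch panel 108
# (using the same (0,0)-at-upper-left assignment as the front panel, seen
# from the back) into front-surface coordinates by mirroring the x axis.

WIDTH, HEIGHT = 1024, 756  # display-area coordinate range given in the text

def back_to_front(x, y):
    """Mirror a back-surface coordinate horizontally so it lines up
    with the display as seen from the front side."""
    return WIDTH - x, y
```

Under this assumption, a touch at the back panel's upper-left origin maps to the front panel's upper-right corner, while points on the vertical center line are unchanged.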

  As shown in FIG. 5A, when the user's left hand 1000A touches the front touch panel 107 to designate a specific point in the display area 300 from the front side, the CPU 101 recognizes the point designated by the user in the display area 300 from the front operation information as the front designated point 1071. Likewise, when the user's right hand 1000B touches the back touch panel 108 to designate a specific point in the display area 300 from the back side, the CPU 101 recognizes the point designated by the user in the display area 300 from the back operation information as the back designated point 1081. In the figures referred to below, the front designated point 1071 is indicated by "X" and the back designated point 1081 by "Δ"; these marks may or may not be displayed in the display area 300.

  Returning to FIG. 1, the description will be continued. The interface 109 is connected to external devices by wire or wirelessly and transmits and receives various information. External devices include, for example, storage media such as USB (Universal Serial Bus) memories and flash memory cards, optical disk drives that read storage media such as optical disks, and other portable terminals, servers, computers, and the like reached via a communication network such as the Internet. This concludes the description of the hardware configuration of the mobile device 10.

  Next, functions realized by the CPU 101 of the portable device 10 according to the embodiment of the present invention by reading the program stored in the storage unit 102 into the RAM 103 and executing the program will be described with reference to FIG.

  The control unit 11 is control means that performs each control in the portable device 10, performing control according to control information supplied from the control information supply unit 17 described later, in addition to preset control. The controls in the portable device 10 are various; in the present embodiment, the control unit 11 controls the display content shown in the display area 300 of the display panel 105 and the sound content emitted from the speaker 106, outputting video instruction information indicating the display content and pronunciation instruction information indicating the sound content.

  The preset control in this embodiment is as follows. As shown in FIG. 2, the control unit 11 performs control to display virtual particles 500-1, 500-2, and 500-3 (hereinafter, virtual particles 500 where no distinction is needed) in the display area 300. The control unit 11 also moves the virtual particles 500 according to a predetermined algorithm. For example, the inside of the display area 300 is treated as weightless, each virtual particle 500 moves at a predetermined speed, and the periphery of the display area 300 is treated as a wall surface, so that the particles rebound off the walls or bounce off one another when they collide. In addition, each virtual particle 500 is given a pseudo mass, and momentum is conserved when it rebounds.
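The motion just described can be sketched as follows. This is purely illustrative and makes assumptions the patent does not state (constant time step, circular particles, elastic wall rebound); the class and field names are hypothetical:

```python
# Illustrative sketch of the predetermined algorithm: virtual particles 500
# moving at constant velocity in a weightless display area and rebounding
# off the area's periphery, treated as a wall.

WIDTH, HEIGHT = 1024, 756  # display-area coordinates from the text

class Particle:
    def __init__(self, x, y, vx, vy, radius=20, mass=1.0):
        self.x, self.y = x, y          # position in display-area coordinates
        self.vx, self.vy = vx, vy      # velocity components
        self.radius, self.mass = radius, mass  # pseudo mass per the text

    def step(self, dt=1.0):
        self.x += self.vx * dt
        self.y += self.vy * dt
        # Rebound off the left/right walls, clamping back inside the area.
        if self.x - self.radius < 0 or self.x + self.radius > WIDTH:
            self.vx = -self.vx
            self.x = min(max(self.x, self.radius), WIDTH - self.radius)
        # Rebound off the top/bottom walls.
        if self.y - self.radius < 0 or self.y + self.radius > HEIGHT:
            self.vy = -self.vy
            self.y = min(max(self.y, self.radius), HEIGHT - self.radius)
```

Particle-particle collisions, which the text says conserve momentum and trigger sound generation, would be handled by an additional pairwise check not shown here.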

  The control unit 11 outputs, as video instruction information indicating the display content, at minimum information that identifies the area each virtual particle 500 occupies in the display area 300, such as the position, shape, and size of each controlled virtual particle 500, to the video data generation unit 12, the front surface control determination unit 14, the back surface control determination unit 15, and the double-sided control determination unit 16. Since the video instruction information indicates the display content of the display area 300, it may also include information instructing menu displays related to the various controls when such menus are shown in the display area 300.

  In addition, when a virtual particle 500 collides with another virtual particle 500, the control unit 11 performs control that causes the speaker 106 to produce a sound according to a predetermined algorithm, outputting information indicating the content of the sound, such as waveform data, pitch, volume, and note length, to the audio data generation unit 13 as pronunciation instruction information.

  Next, control according to the control information will be described. The control unit 11 controls the display content and the sound content according to control information supplied from the control information supply unit 17. This includes changes to the preset control content (display content, sound content) described above, for example, changes to the mass, size, and shape of the virtual particles 500, or changes to the gravitational field acting on them. The preset control content can also be changed temporarily; for example, a virtual particle 500 can be temporarily detached from the movement control under the predetermined algorithm and given an initial velocity in another direction. Furthermore, entirely different controls may be used, such as volume control that changes the output level of the speaker 106 or control that changes the display brightness of the display panel 105.

  The video data generation unit 12 generates video data for display according to the video instruction information output from the control unit 11 and supplies it to the display panel 105. For example, it recognizes from the video instruction information the areas the virtual particles 500 occupy in the display area 300, and generates video data so that the virtual particles 500 are displayed in the display area 300 with the content indicated by the video instruction information.

  The audio data generation unit 13 generates audio data indicating the content of the sound generation specified by the sound generation instruction information output from the control unit 11 and supplies the audio data to the speaker 106.

  The front surface control determination unit 14 determines the control content in the control unit 11 based on the video instruction information output from the control unit 11 and the front operation information output from the front touch panel 107, and outputs front control information indicating the determined control content. Specifically, the front surface control determination unit 14 recognizes the display content shown in the display area 300 from the video instruction information, and determines the control content in the control unit 11 according to the relationship between that display content and the position of the front designated point 1071 indicated by the front operation information, or the trajectory produced by the change of that position over time.

  The correspondence between, on the one hand, the relationship between the display content and the front operation information and, on the other, the control content to be determined is preset in the front surface control determination unit 14. For example, one such correspondence is that when the front designated point 1071 lies in a region where a virtual particle 500 exists, the control content is to select that virtual particle 500.

  The front surface control determination unit 14 may also be set with control content, for example control of a predetermined parameter, for the case where the trajectory of the front designated point 1071 is a predetermined trajectory, regardless of the display content and the position of the front designated point 1071. Thus the control content preset in the front surface control determination unit 14 may include control content not directly tied to the relationship between the display content and the front operation information; it suffices that the preset control content includes control content determined from that relationship.
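The particle-selection correspondence described above is essentially a hit test. As an illustrative sketch only (not the patent's implementation; the function name, particle representation, and returned control-content tuple are hypothetical):

```python
# Hypothetical sketch of the decision made by the front surface control
# determination unit 14: if the front designated point 1071 falls inside
# the region occupied by a virtual particle 500 (known from the video
# instruction information), determine "select that particle".

import math

def determine_front_control(particles, point):
    """particles: list of (cx, cy, radius) circles from the video
    instruction information; point: (x, y) from the front operation
    information. Returns a control-content tuple, or None."""
    px, py = point
    for index, (cx, cy, r) in enumerate(particles):
        if math.hypot(px - cx, py - cy) <= r:
            return ("select_particle", index)
    return None  # no control content determined for this touch
```

A fuller version would also match predetermined trajectories of the designated point, as the text notes, independently of the display content.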

  The back surface control determination unit 15 determines the control content in the control unit 11 based on the video instruction information output from the control unit 11 and the back operation information output from the back touch panel 108, and outputs back control information indicating the determined control content. Specifically, the back surface control determination unit 15 recognizes the display content shown in the display area 300 from the video instruction information, and determines the control content in the control unit 11 according to the relationship between that display content and the position of the back designated point 1081 indicated by the back operation information, or the trajectory produced by the change of that position over time.

  The correspondence between the relationship between the display content and the back operation information and the control content to be determined is preset in the back surface control determination unit 15. For example, one such correspondence is that when the back designated point 1081 lies in a region where no virtual particle 500 exists, the trajectory of the back designated point 1081 is recognized and the display brightness of the display panel 105 is changed according to that trajectory.

  The back surface control determination unit 15 may also include, for example, control content that changes the value of a separately specified parameter according to the trajectory of the back designated point 1081, regardless of the display content and the position of the back designated point 1081. Thus, as above, the control content preset in the back surface control determination unit 15 may include control content not directly tied to the relationship between the display content and the back operation information; it suffices that the preset control content includes control content determined from that relationship.

  The control content that can be determined by the front surface control determination unit 14 and the control content that can be determined by the back surface control determination unit 15 are set to include different control contents. That is, while the control contents each unit can determine partly overlap, each also includes control content that only it can determine. Therefore, some of the controls in the control unit 11 can be performed only by operating the front touch panel 107, while others can be performed only by operating the back touch panel 108.

  The double-sided control determination unit 16 determines the control content in the control unit 11 based on the video instruction information output from the control unit 11, the front operation information output from the front touch panel 107, and the back operation information output from the back touch panel 108, and outputs double-sided control information indicating the determined control content. Specifically, the double-sided control determination unit 16 recognizes the display content shown in the display area 300 from the video instruction information, and determines the control content in the control unit 11 according to the relationship between that display content and the positions of the front designated point 1071 indicated by the front operation information and the back designated point 1081 indicated by the back operation information, or the trajectories produced by the change of those positions over time. That is, this control content represents control performed jointly by operations on the front touch panel 107 and the back touch panel 108.

  On the other hand, when the relationship between the display content and the front and back surface operation information, or between the front surface operation information and the back surface operation information, is a predetermined relationship, the double-sided control determination unit 16 determines no control content and outputs no double-sided control information. This predetermined relationship — for example, the case where no applicable control content exists, or where the distance between the front surface designated point 1071 and the back surface designated point 1081 is equal to or longer than a certain length — is preset in the double-sided control determination unit 16. Conversely, the logic may be inverted so that the double-sided control determination unit 16 determines no control content and outputs no double-sided control information when these relationships are not a predetermined relationship.
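  The gating described above — treating the front and back operations as one joint operation only while the two designated points stand in a preset relationship, such as their distance being shorter than a certain length — can be sketched as follows. The threshold value, function names, and point representation are illustrative assumptions, not the patent's actual implementation.

```python
import math

# Joint control applies only when both designated points exist and lie
# within a preset distance of each other (an assumed threshold).
MAX_JOINT_DISTANCE = 40.0  # pixels

def joint_control_applies(front_point, back_point):
    """Return True when the front/back designated points should be
    treated as a single joint operation."""
    if front_point is None or back_point is None:
        return False
    dx = front_point[0] - back_point[0]
    dy = front_point[1] - back_point[1]
    return math.hypot(dx, dy) < MAX_JOINT_DISTANCE
```

When this predicate is false, the double-sided determination would simply produce no output and the individual front/back determinations remain in effect.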

  Here, the correspondence between the display content, the front surface operation information, and the back surface operation information on the one hand and the control content to be determined on the other is set in advance in the double-sided control determination unit 16. For example, when both the front surface designated point 1071 and the back surface designated point 1081 are included in the region where the virtual particle 500 exists, the correspondence specifies the virtual particle 500 and then recognizes the loci of the front surface designated point 1071 and the back surface designated point 1081, yielding control content that moves the position of the virtual particle 500 in accordance with those loci.

  Further, the control content preset in the double-sided control determination unit 16 may include, for example, control content that changes the content of a predetermined algorithm used by the control unit 11 according to the loci of the front surface designated point 1071 and the back surface designated point 1081, regardless of the display content and the positions of those points. Thus, the preset control content may include content that does not depend directly on the relationship between the display content, the front surface operation information, and the back surface operation information; it is sufficient that control content determined from that relationship is included among the preset control content.

  The control content that can be determined by the double-sided control determination unit 16 is set to differ in part from the control content that can be determined by the front surface control determination unit 14 and the back surface control determination unit 15. Therefore, some of the controls in the control unit 11 can be performed only by a joint operation of the front touch panel 107 and the back touch panel 108.

  The control information supply unit 17 generates control information based on the front surface control information indicating the control content determined by the front surface control determination unit 14, the back surface control information indicating the control content determined by the back surface control determination unit 15, and the double-sided control information indicating the control content determined by the double-sided control determination unit 16, and supplies it to the control unit 11. Specifically, this control information is generated as follows.

  First, the control information supply unit 17 generates control information corresponding to each control content indicated by the front surface control information and the back surface control information. When the control content indicated by the front surface control information and that indicated by the back surface control information each independently indicate one control, a piece of control information is generated for each. On the other hand, when the two indicate a single control by sharing roles, one piece of control information is generated from both. For example, when one specifies the parameter to be controlled and the other changes the parameter's value, control information indicating the control of changing the specified parameter's value is generated.

  At this time, when the double-sided control determination unit 16 has determined control content and output double-sided control information, the control information supply unit 17 generates control information corresponding to the control content indicated by the double-sided control information in place of the control content indicated by the front surface control information and the back surface control information. That is, the control information supply unit 17 does not supply control information corresponding to the content of the front surface control information and the back surface control information, but instead supplies control information corresponding to the content of the double-sided control information to the control unit 11. This concludes the description of the functions realized by the CPU 101 executing the program.
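  The supply unit's priority rule described in the last two paragraphs — double-sided information overrides the individual results, and role-sharing pairs collapse into one command — can be sketched as below. The dictionary shapes and key names (`select`, `delta`, `set`) are invented for illustration only.

```python
# Hedged sketch of the control information supply unit's merge rule.
# All data shapes here are assumptions, not the patent's actual format.

def build_control_info(front=None, back=None, double_sided=None):
    """Return the list of control commands handed to the control unit."""
    if double_sided is not None:
        # Joint operation: front/back results are discarded entirely.
        return [double_sided]
    # Role sharing: one side selects a parameter, the other supplies a delta.
    if front and back and front.get("select") and back.get("delta") is not None:
        return [{"set": front["select"], "delta": back["delta"]}]
    # Otherwise each side yields an independent command.
    return [c for c in (front, back) if c]

# Example: front selects the "size" parameter, back supplies the change.
cmds = build_control_info(front={"select": "size"}, back={"delta": +5})
```

This mirrors the text's ordering: the joint (double-sided) result takes precedence, then role sharing, then independent per-side commands.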

  Next, the operation of the mobile device 10 according to the embodiment of the present invention will be described. First, when the mobile device 10 is powered on, the CPU 101 reads out a control program from the storage unit 102 to the RAM 103 and executes it.

  Thereby, virtual particles 500 as shown in FIG. 2 are displayed in the display area 300, and the virtual particles 500 move slowly within the display area 300. When a virtual particle 500 collides with another virtual particle 500, a collision sound is emitted from the speaker 106. In this state, the user can perform various controls by operating the front touch panel 107 and the back touch panel 108. Hereinafter, a plurality of examples of the controls performed on the virtual particles 500 displayed in the display area 300 will be described in order.

  First, as a first example, the user touches the portion of the front surface touch panel 107 corresponding to the region of the virtual particle 500-1. Thereby, as shown in FIG. 7A, the front surface designated point 1071 is designated within the region of the virtual particle 500-1. When the front surface control determination unit 14 recognizes that the front surface designated point 1071 lies within the region of the virtual particle 500-1, it specifies the virtual particle 500-1 and outputs front surface control information indicating the control content for displaying the control menu 400 for controlling the virtual particle 500-1.

  Then, the control unit 11 performs control to display the control menu 400; as illustrated in FIG. 7B, the outline of the virtual particle 500-1 that is the control target is thickened to change its display mode, and the control menu 400 is displayed. The control menu 400 displays, as controllable items, "size" for changing the size of the virtual particle 500-1, "mass" for changing its mass, and "shape" for changing its shape.

  Here, the user operates the back surface touch panel 108 while keeping the front surface designated point 1071 within the region of the virtual particle 500-1. In this operation, as shown in FIG. 8A, when the back surface designated point 1081 is moved from the starting point 1080 in the direction of the arrow, the back surface control determination unit 15 recognizes this locus and outputs back surface control information indicating the control content for changing a predetermined parameter in accordance with the moving speed, moving amount, and moving direction of the back surface designated point 1081.

  When the control information supply unit 17 acquires the front surface control information output from the front surface control determination unit 14 and the back surface control information output from the back surface control determination unit 15 as described above, it outputs control information that changes the movement parameters of the specified virtual particle 500-1 according to the control content indicated by the back surface control information. Here, in the present embodiment, the movement parameters are the moving speed and the moving direction.

  The control information output in this way causes the control unit 11 to change the moving speed of the virtual particle 500-1 to a speed corresponding to the moving speed of the back surface designated point 1081, and to change its moving direction to the moving direction of the back surface designated point 1081, that is, the direction of the arrow in the figure. Thereby, as shown in FIG. 8B, the moving speed and moving direction of the virtual particle 500-1 are changed and it moves accordingly.
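  The mapping just described — the back-side drag's speed and direction becoming the particle's movement parameters — might look like the following sketch. The function name, time base, and gain factor are assumptions added for illustration.

```python
import math

# Illustrative mapping of a back-panel drag to the particle's movement
# parameters (speed and direction), as in the first example above.

def velocity_from_drag(start, end, dt, gain=1.0):
    """Derive (speed, direction_radians) from a drag that moved from
    `start` to `end` over `dt` seconds; the particle adopts both."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    speed = gain * math.hypot(dx, dy) / dt
    direction = math.atan2(dy, dx)
    return speed, direction
```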

  Next, as a second example, in the state shown in FIG. 7B, when the user moves the front surface designated point 1071 onto the item "size" in the control menu 400 as shown in FIG. 9A, the front surface control determination unit 14 outputs front surface control information indicating the control content that specifies the size parameter of the virtual particle 500-1. On the other hand, when the user operates the back surface touch panel 108 to move the back surface designated point 1081 from the starting point 1080 in the direction of the arrow as shown in FIG. 9B, the back surface control determination unit 15 recognizes this locus and outputs back surface control information indicating the control content for changing the predetermined parameter in accordance with the moving speed, moving amount, and moving direction of the back surface designated point 1081. Here, in the present embodiment, the size parameter is the radius of the virtual particle 500.

  When the control information supply unit 17 acquires the front surface control information output from the front surface control determination unit 14 and the back surface control information output from the back surface control determination unit 15 as described above, it outputs control information for changing the size parameter of the specified virtual particle 500-1 according to the control content indicated by the back surface control information.

  The control information output in this way causes the control unit 11 to increase or decrease the radius of the virtual particle 500-1 according to the amount of movement of the back surface designated point 1081 in the vertical component of the display area 300; for example, the radius is increased when the point is moved upward and decreased when it is moved downward. Accordingly, as shown in FIG. 9B, when the user moves the back surface designated point 1081, the radius of the virtual particle 500-1 increases. If such processing were performed with only the front touch panel 107, an additional display for instructing the parameter change would be necessary, making the display content of the display area 300 cluttered and the operation cumbersome. By sharing the roles between the two surfaces in this way, parameters can be controlled easily without largely changing the display contents.
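  The role sharing in this second example — front side selects the "size" parameter, back side's vertical drag adjusts the radius — can be sketched as follows. The scale factor and the lower clamp are illustrative assumptions, not part of the patent's description.

```python
# Sketch of the second example: the back-side vertical drag component
# grows or shrinks the selected radius parameter.

MIN_RADIUS = 1.0  # assumed lower clamp so the particle never vanishes

def adjust_radius(radius, drag_dy, scale=0.1):
    """Grow the radius for an upward drag and shrink it for a downward
    drag, never going below MIN_RADIUS."""
    # Screen coordinates grow downward, so negate dy to get "up = bigger".
    return max(MIN_RADIUS, radius - scale * drag_dy)
```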

  The back surface control information may be output when the user's operation of the back surface touch panel 108 is completed, or may be output continuously as the operation proceeds. Further, in each of the above examples, the double-sided control determination unit 16 finds no control content determined from the relationship among the virtual particle 500, the front surface designated point 1071, and the back surface designated point 1081, and therefore no double-sided control information is output.

  A third example will be described. In this example, as shown in FIG. 10A, virtual particles 501-1, 501-2, and 501-3 having different shapes are displayed in the display area. The user touches the front surface touch panel 107 and the back surface touch panel 108 at positions corresponding to the region of the virtual particle 501-1. As a result, as shown in FIG. 10A, the front surface designated point 1071 and the back surface designated point 1081 are designated within the region of the virtual particle 501-1. Here, even if the virtual particle 501-1 is very small — smaller than the user's finger — two points can be specified within its region by specifying from the front and the back. That is, with a configuration having only the front touch panel 107 and no back touch panel 108, even if the front touch panel 107 can detect a plurality of points, designating two points at a short distance is limited; specifying from the front and the back makes it easy to designate two closely spaced points.

  When the double-sided control determination unit 16 recognizes that the front surface designated point 1071 and the back surface designated point 1081 both lie within the region of the same virtual particle 501-1, it specifies the virtual particle 501-1 and outputs double-sided control information whose control content associates the front surface designated point 1071 and the back surface designated point 1081 with positions in the region of the virtual particle 501-1. Note that the front surface control determination unit 14 and the back surface control determination unit 15 also output front surface control information and back surface control information corresponding to the operation.

  Since the double-sided control information is output from the double-sided control determination unit 16 as described above, the control information supply unit 17 ignores the front surface control information output from the front surface control determination unit 14 and the back surface control information output from the back surface control determination unit 15, and outputs control information indicating the control content indicated by the double-sided control information.

  Then, as shown in FIG. 10B, when the user moves the front surface designated point 1071 and the back surface designated point 1081, the double-sided control determination unit 16 outputs double-sided control information indicating control content that moves the specified virtual particle 501-1 while maintaining the association between the designated points and the positions in the region of the virtual particle 501-1. Accordingly, the virtual particle 501-1 moves as shown in FIG. 10B in accordance with the movement of the front surface designated point 1071 and the back surface designated point 1081. In this way, the control of pinching and moving the virtual particle 501-1 can be performed intuitively.
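  One way to realize the "pinch from both sides and drag" behavior above is to fix the offsets between each designated point and the particle at grab time and preserve them while dragging. The two-function split and the averaging of the two implied positions are assumptions added for illustration.

```python
# Sketch of the third example's grab-and-drag control. Data shapes
# (tuples of coordinates) are illustrative assumptions.

def grab(particle_pos, front_point, back_point):
    """Record the particle-relative offsets of both designated points."""
    return (
        (front_point[0] - particle_pos[0], front_point[1] - particle_pos[1]),
        (back_point[0] - particle_pos[0], back_point[1] - particle_pos[1]),
    )

def drag(front_point, back_point, offsets):
    """Compute the new particle position keeping both associations;
    with two anchors, average the positions each anchor implies."""
    (fox, foy), (box, boy) = offsets
    fx, fy = front_point[0] - fox, front_point[1] - foy
    bx, by = back_point[0] - box, back_point[1] - boy
    return ((fx + bx) / 2, (fy + by) / 2)
```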

  Next, as a fourth example, a case where the user moves only the back surface designated point 1081 in the state shown in FIG. 10A will be described. In this case, control may be performed so as to deform the shape of the virtual particle 501-1 as shown in FIG. 11A, or so as to enlarge the virtual particle 501-1 while maintaining its similar shape as shown in FIG. 11B.

  As described above, in the mobile device 10 according to the embodiment of the present invention, the user operates the front touch panel 107 and the back touch panel 108 provided on the two surfaces of the display panel 105, enabling various kinds of control: control performed by an operation on either the front touch panel 107 or the back touch panel 108 alone, control performed by the two operations sharing roles, and control performed jointly by both operations.

  Further, since the display panel 105 is a transmissive display, the user can easily operate the back touch panel 108 even while viewing the display panel 105 from the front side. Furthermore, by operating from the front and the back, two points can be designated at a shorter distance than with an operation on one side only. In addition, if the device is small enough to be carried, the front and back can be operated with one hand.

  Although an embodiment of the present invention has been described above, the present invention can be implemented in various other aspects, as follows.

<Modification 1>
In the embodiment described above, the mobile device 10 displays the virtual particles 500 and controls them through operations on the front surface touch panel 107 and the back surface touch panel 108, but the device can be used for various other controls. A plurality of examples are described below.

  As a first example, the invention can be applied to the operation of a web browser. For example, clicking a link and entering characters can be performed by operating the front surface touch panel 107, while scrolling the screen can be performed by operating the back surface touch panel 108. In this way, the invention can be applied to the operation of software running on a general computer.

  As a second example, the invention can be applied to musical instruments. For example, the portable device 10 displays a piano keyboard in the display area 300. When the user operates the front surface touch panel 107 with the right hand and touches a key, the portable device 10 emits the piano sound corresponding to that key. When the user operates the back touch panel 108 with the left hand and touches a key, the portable device 10 emits a bass sound corresponding to that key. Even with such a narrow display area 300, it is possible to play with both hands.

  Further, the mobile device 10 may display harp strings in the display area 300. When the user operates the front touch panel 107 and the back touch panel 108 to pluck a string with the left and right hands, the portable device 10 emits the sound corresponding to that string, so the portable device 10 can be used like a harp. In addition, by assigning different pitches to the front and back sides, different sounds can be produced when the same string is plucked via the front touch panel 107 and when it is plucked via the back touch panel 108.
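  The piano and harp variants above both come down to mapping (panel, key) to a voice and pitch. A minimal sketch, with invented voice names and the back-side octave drop as an assumed convention:

```python
# Illustrative side-dependent note mapping for the instrument examples.
# Voice names and the -12 semitone offset are assumptions.

VOICE_BY_SIDE = {"front": "piano", "back": "bass"}

def note_event(side, key_index, base_midi=60):
    """Map a touch on either panel to a (voice, midi_note) pair; the
    back side here sounds one octave lower."""
    offset = 0 if side == "front" else -12
    return VOICE_BY_SIDE[side], base_midi + key_index + offset
```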

  In addition, the mobile device 10 may display the fret portion and strings of a guitar in the display area 300. When the user plucks a string using the front surface touch panel 107 while pressing that string at a fret position using the back surface touch panel 108, the portable device 10 sounds the plucked string at the pitch corresponding to the pressed position. In this way, the portable device 10 can be used like a guitar. Moreover, the user can check the string fingering on the fret portion from the fret side, a viewpoint normally impossible. The invention can thus be applied to various musical instruments.

  As a third example, the mobile device 10 can be applied to AR (Augmented Reality). In this case, as shown by the broken line in FIG. 1, a photographing unit 110 is provided that captures an image in a direction substantially normal to the back surface, opposite to the display surface, and outputs photographing data indicating the captured content, enabling various applications. For example, analysis means for analyzing the image of the photographing data can be provided, and display can be performed according to the analysis result. Specifically, when the photographing unit 110 photographs a store, store information is collected via the Internet from the store's logo. The photographed store is visible in the display area 300 through the display panel 105; the analysis means also determines the coordinates at which the photographing data corresponds to the display area 300, and the collected information may be displayed at the portion showing the store. Various controls may then be performed by operating the front touch panel 107 and the back touch panel 108.

<Modification 2>
In the above-described embodiment, the display panel 105 is a transmissive display; however, a non-transmissive thin display that does not transmit light, such as a liquid crystal display, a plasma display, or a general organic EL display, may be used instead. In this case, as shown by the broken line in FIG. 1, a photographing unit 110 that captures an image in a direction substantially normal to the back surface, opposite to the display surface, and outputs photographing data indicating the captured content may be provided.

  As shown in FIG. 12, the photographing unit 110 outputs the photographing data to the video data generation unit 12. The video data generation unit 12 then composites the video generated from the video instruction information output from the control unit 11 with the video of the photographing data so that the former is superimposed on the latter, and outputs the resulting video data. In this way, the display area 300 of the display panel 105 shows the back side photographed by the photographing unit 110 superimposed with the display according to the video instruction information, so the back side can be viewed indirectly from the display surface side. Even though the back side cannot be seen directly through the display, the hand operating the back touch panel 108 can be seen indirectly, so the operation can be performed as easily as when the back side is viewed directly.
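  The compositing step described above can be sketched per pixel: opaque UI pixels cover the camera frame, transparent ones let it show through. Modeling frames as flat lists of RGBA tuples is an assumption for illustration, not the patent's actual pipeline.

```python
# Minimal sketch of the Modification-2 superimposition: UI over camera.

def composite(ui_frame, camera_frame):
    """Per pixel: keep the UI pixel if it is opaque, else show the
    camera pixel (so the user's rear hand remains visible)."""
    out = []
    for ui_px, cam_px in zip(ui_frame, camera_frame):
        out.append(ui_px if ui_px[3] > 0 else cam_px)
    return out
```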

  When indirect viewing is enabled in this way, the display cannot be seen from the back side, but a non-transmissive thin display may be provided on the back side as well. In this case, as shown in FIG. 13, the display panel 105 may consist of a front display panel 105A and a back display panel 105B arranged with their non-display rear faces in contact. The front touch panel 107 is then provided on the display surface side of the front display panel 105A, and the back touch panel 108 on the display surface side of the back display panel 105B. The back display panel 105B may then display a mirror image of the display on the front display panel 105A.

  As shown in FIG. 14, an inverted video data generation unit 18 that generates inverted video data indicating the mirror-image display content from the display content indicated by the video instruction information output from the control unit 11 may be provided. The video data generated by the video data generation unit 12 is supplied to the front display panel 105A, and the inverted video data generated by the inverted video data generation unit 18 is supplied to the back display panel 105B. Although the photographing unit 110 is omitted from FIG. 14, the photographing data may be input to the video data generation unit 12 as in FIG. 12. Further, when a photographing unit for photographing the front side is also provided, its photographing data may be input to the inverted video data generation unit 18, which performs the same processing as the video data generation unit 12.
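  The inverted (mirror-image) frame for the back display panel amounts to a left-right flip of each pixel row, so the rear panel's image lines up with the front panel when seen from behind. The row-of-pixels frame model is an illustrative assumption.

```python
# Sketch of the inverted video data generation: horizontal mirror.

def mirror_frame(frame):
    """Return the horizontally flipped frame (a list of pixel rows)."""
    return [list(reversed(row)) for row in frame]
```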

<Modification 3>
In the embodiment described above, the front touch panel 107 and the back touch panel 108 recognize the position touched by the user as coordinates, but one of the touch panels may be replaced by another detection method. For example, an impact sensor that detects a shock, a temperature sensor, an optical sensor, or the like may be used, as long as some information can be obtained from the user's operation.

<Modification 4>
In the embodiment described above, the portable device 10 was assumed to be a portable game machine, but it may be a mobile phone, a handheld PC, or the like, with their display areas configured like the display area of the portable device 10. In addition, although the portable device 10 is housed in a single housing so as to be easily carried, its components may be divided among a plurality of housings. For example, the display panel 105, the front surface touch panel 107, and the back surface touch panel 108 may form an input/output device in one housing, while the other components are housed in another housing as a host device, the two being connected by wire or wirelessly.

  Further, the display area 300 may be very large — for example, from several tens of centimeters to several meters — in an apparatus that cannot be carried. With such a large apparatus it is difficult for one person to operate both the front surface touch panel 107 and the back surface touch panel 108, but different people can operate the front side and the back side. Performing various operations jointly, with the people on the front side and the back side able to see each other, can enhance the entertainment value when this device is applied to a game or the like.

<Modification 5>
The program executed by the CPU 101 in the above-described embodiment can be provided stored in a computer-readable recording medium such as a magnetic recording medium (magnetic tape, magnetic disk (HDD, FD), etc.), an optical recording medium (optical disc (CD, DVD), etc.), a magneto-optical recording medium, or a semiconductor memory. It can also be downloaded via a network such as the Internet.

It is a diagram showing the hardware configuration of the portable device according to the embodiment. It is a diagram showing the external appearance of the portable device according to the embodiment. It is an explanatory diagram of the display area according to the embodiment. It is a diagram showing the structure of the display panel, front surface touch panel, and back surface touch panel in the display area according to the embodiment. It is an explanatory diagram of operations on the front surface touch panel and back surface touch panel according to the embodiment. It is a diagram showing the software configuration of the portable device according to the embodiment. It is a diagram showing an example of the display in the display area according to the embodiment. It is a diagram showing an example of the display in the display area according to the embodiment. It is a diagram showing an example of the display in the display area according to the embodiment. It is a diagram showing an example of the display in the display area according to the embodiment. It is a diagram showing an example of the display in the display area according to the embodiment. It is a diagram showing the software configuration of the portable device according to Modification 2. It is a diagram showing the structure of the display panel, front surface touch panel, and back surface touch panel in the display area according to Modification 2. It is a diagram showing the software configuration of the portable device according to Modification 2.

Explanation of symbols

DESCRIPTION OF SYMBOLS: 10 ... portable device; 11 ... control unit; 12 ... video data generation unit; 13 ... audio data generation unit; 14 ... front surface control determination unit; 15 ... back surface control determination unit; 16 ... double-sided control determination unit; 17 ... control information supply unit; 18 ... inverted video data generation unit; 100 ... bus; 101 ... CPU; 102 ... storage unit; 103 ... RAM; 104 ... operation unit; 105 ... display panel; 105A ... front display panel; 105B ... back display panel; 106 ... speaker; 107 ... front surface touch panel; 108 ... back surface touch panel; 109 ... interface; 110 ... photographing unit; 300 ... display area; 400 ... control menu; 500, 501 ... virtual particles; 1071 ... front surface designated point; 1081 ... back surface designated point

Claims (6)

  1. A control device with a touch panel, comprising:
    generating means for generating video data;
    control means for performing control according to supplied control information;
    a display panel having a first surface that is a display surface performing display according to the video data and a second surface that is the back surface of the first surface, the second surface side being visible from the first surface side;
    a first touch panel provided on the first surface side and outputting first operation information in response to an operation on the first surface;
    a second touch panel provided on the second surface side and outputting second operation information in response to an operation on the second surface;
    first determining means for determining the control content in the control means according to the relationship between the display content displayed on the first surface and the first operation information;
    second determining means for determining the control content in the control means according to the relationship between the display content displayed on the first surface and the second operation information; and
    supply means for supplying control information corresponding to each control content determined by the first determining means and the second determining means to the control means,
    wherein the control content that can be determined by the first determining means and the control content that can be determined by the second determining means are set so as to differ in part.
  2. The control device with a touch panel according to claim 1, further comprising composite determination means for determining, when the relationship among the display content displayed on the first surface, the first operation information, and the second operation information is a predetermined relationship, the control content in the control means according to that relationship,
    wherein the control content that can be determined by the composite determination means is set as control content different from the control content that can be determined by the first determining means and the second determining means, and
    when the composite determination means determines the control content, the supply means supplies to the control means control information corresponding to the control content determined by the composite determination means, instead of the control content determined by the first determining means and the second determining means.
  3. Generating means for generating video data;
    Control means for performing control according to the supplied control information;
    A first surface that is a display surface that performs display according to the video data; and a second surface that is a back surface of the first surface, and the second surface from the first surface side. A display panel that can visually recognize the side,
    A first touch panel provided on the first surface side and outputting first operation information in response to an operation on the first surface;
    A second touch panel that is provided on the second surface side and outputs second operation information in response to an operation on the second surface;
    Composite determination means for determining the control content in the control means in accordance with the relationship among the display content displayed on the first surface, the first operation information, and the second operation information; and
    A control device with a touch panel, comprising: supply means for supplying control information corresponding to the control content determined by the composite determination means to the control means.
  4. The control device with a touch panel according to claim 1, wherein, of the control contents determined by the first determination means and the second determination means, one is a control content for specifying a parameter to be controlled and the other is a control content for changing the value of that parameter,
    and the control means performs control to change the value of the specified parameter in accordance with the supplied control information.
  5. A display panel having a first surface that is a display surface for performing display according to video data, and a second surface that is the back surface of the first surface, the second surface side being visually recognizable from the first surface side;
    A first touch panel provided on the first surface side and outputting first operation information in response to an operation on the first surface;
    A second touch panel that is provided on the second surface side and outputs second operation information in response to an operation on the second surface;
    A generation function for generating video data;
    A control function for performing control in accordance with the supplied control information;
    A first determination function for determining a control content in the control function according to a relationship between the display content displayed on the first surface and the first operation information;
    A second determination function for determining the control content in the control function according to the relationship between the display content displayed on the first surface and the second operation information;
    A supply function for supplying, as the control information in the control function, control information corresponding to each control content determined by the first determination function and the second determination function,
    wherein the control content that can be determined by the first determination function and the control content that can be determined by the second determination function are set as control contents different from each other.
  6. A display panel having a first surface that is a display surface for performing display according to video data, and a second surface that is the back surface of the first surface, the second surface side being visually recognizable from the first surface side;
    A first touch panel provided on the first surface side and outputting first operation information in response to an operation on the first surface;
    A second touch panel that is provided on the second surface side and outputs second operation information in response to an operation on the second surface;
    A generation function for generating video data;
    A control function for performing control in accordance with the supplied control information;
    A composite determination function for determining the control content in the control function according to the relationship among the display content displayed on the first surface, the first operation information, and the second operation information; and
    A supply function for supplying control information corresponding to the control content determined in the composite determination function as control information in the control function.
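The claims above describe a device in which front-panel and back-panel touch operations are mapped to different control contents, a composite determination takes precedence when both surfaces are operated in a predetermined relationship (claim 2), and one surface specifies a parameter while the other changes its value (claim 4). A minimal Python sketch of that dispatch logic follows; all names, and the "same target touched from both sides" composite rule, are illustrative assumptions, not details taken from the patent:

```python
# Hypothetical sketch of the claimed control flow. Front-panel operations
# select a displayed parameter, back-panel operations adjust its value,
# and a composite determination (both panels operated together in a
# predetermined relationship) replaces the individual determinations.

class DualTouchController:
    def __init__(self):
        self.params = {"volume": 50, "tempo": 120}
        self.selected = None  # parameter specified via the front panel

    def first_determination(self, display_content, front_op):
        # Front (first) panel: specify which displayed parameter to control.
        if front_op and front_op["target"] in display_content:
            return ("select", front_op["target"])
        return None

    def second_determination(self, display_content, back_op):
        # Back (second) panel: change the value of the specified parameter.
        if back_op and self.selected is not None:
            return ("adjust", back_op["delta"])
        return None

    def composite_determination(self, display_content, front_op, back_op):
        # Predetermined relationship (assumed here): the same displayed item
        # is touched from both sides; this yields a distinct control content.
        if front_op and back_op and front_op["target"] == back_op.get("target"):
            return ("reset", front_op["target"])
        return None

    def handle(self, display_content, front_op=None, back_op=None):
        composite = self.composite_determination(display_content, front_op, back_op)
        if composite is not None:
            # Composite result supplants the individual determinations (claim 2).
            return self.apply(composite)
        for control in (self.first_determination(display_content, front_op),
                        self.second_determination(display_content, back_op)):
            if control is not None:
                self.apply(control)
        return self.params

    def apply(self, control):
        kind, arg = control
        if kind == "select":
            self.selected = arg
        elif kind == "adjust":
            self.params[self.selected] += arg
        elif kind == "reset":
            self.params[arg] = 0
        return self.params
```

For example, a front-panel touch on "volume" selects that parameter, a subsequent back-panel drag adjusts its value, and touching "volume" from both sides at once triggers the composite control instead.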
JP2008026624A 2008-02-06 2008-02-06 Controller with touch panel and program Pending JP2009187290A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2008026624A JP2009187290A (en) 2008-02-06 2008-02-06 Controller with touch panel and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2008026624A JP2009187290A (en) 2008-02-06 2008-02-06 Controller with touch panel and program

Publications (1)

Publication Number Publication Date
JP2009187290A true JP2009187290A (en) 2009-08-20

Family

ID=41070461

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2008026624A Pending JP2009187290A (en) 2008-02-06 2008-02-06 Controller with touch panel and program

Country Status (1)

Country Link
JP (1) JP2009187290A (en)


Cited By (68)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011070609A (en) * 2009-09-28 2011-04-07 Fujitsu Ltd Information terminal device with touch panel, method and program for controlling display
JP2011076233A (en) * 2009-09-29 2011-04-14 Fujifilm Corp Image displaying device, image displaying method, and program
US8830184B2 (en) 2009-09-29 2014-09-09 Fujifilm Corporation Image displaying device, image displaying method, and program for displaying images
JP2013507681A (en) * 2009-10-07 2013-03-04 サムスン エレクトロニクス カンパニー リミテッド UI providing method using a plurality of touch sensors and portable terminal using the same
JP2011141680A (en) * 2010-01-06 2011-07-21 Kyocera Corp Input device, input method and input program
JP2012084137A (en) * 2010-09-15 2012-04-26 Kyocera Corp Portable electronic device, screen control method and screen control program
JP2012073662A (en) * 2010-09-27 2012-04-12 Sony Computer Entertainment Inc Information processor, control method for the same, and program
US9128550B2 (en) 2010-09-27 2015-09-08 Sony Corporation Information processing device
CN103124951A (en) * 2010-09-27 2013-05-29 索尼电脑娱乐公司 Information processing device
WO2012043079A1 (en) * 2010-09-27 2012-04-05 株式会社ソニー・コンピュータエンタテインメント Information processing device
CN103124951B (en) * 2010-09-27 2016-01-20 索尼电脑娱乐公司 Signal conditioning package
JP2014501001A (en) * 2010-11-01 2014-01-16 株式会社ソニー・コンピュータエンタテインメント Control of virtual objects using device touch interface functions
JP5805674B2 (en) * 2011-01-25 2015-11-04 株式会社ソニー・コンピュータエンタテインメント Input device, input method, and computer program
US9411425B2 (en) 2011-01-25 2016-08-09 Sony Corporation Input device, input method, and computer program for inputting characters, numbers, or symbols by using an on-screen keyboard
JP2012221007A (en) * 2011-04-04 2012-11-12 Sharp Corp Transmissive display device, display system and display method
WO2012161324A1 (en) * 2011-05-26 2012-11-29 株式会社コナミデジタルエンタテインメント Information display device, information display method, non-transitory information recording medium and program
JP2012247921A (en) * 2011-05-26 2012-12-13 Konami Digital Entertainment Co Ltd Information display device, information display method, and program
JP2012249880A (en) * 2011-06-03 2012-12-20 Sony Computer Entertainment Inc Mobile terminal, control method, and program
US8827784B2 (en) 2011-06-03 2014-09-09 Sony Corporation Game device, game control program, and method for controlling golf game
US9802117B2 (en) 2011-06-03 2017-10-31 Sony Interactive Entertainment Inc. Game device, game control program, and method for controlling golf game
JP2013000306A (en) * 2011-06-15 2013-01-07 Square Enix Co Ltd Video game processing apparatus and video game processing program
JP2014521174A (en) * 2011-07-20 2014-08-25 ゼットティーイー コーポレイション Method and apparatus for generating dynamic wallpaper
US9195364B2 (en) 2011-07-20 2015-11-24 Zte Corporation Method and apparatus for generating dynamic wallpaper
US8830406B2 (en) 2011-08-31 2014-09-09 Sony Corporation Operation apparatus, information processing method therefor, and information processing apparatus
WO2013031158A1 (en) * 2011-08-31 2013-03-07 ソニー株式会社 Operation device, and information processing method and information processing device therefor
JP2013050907A (en) * 2011-08-31 2013-03-14 Sony Corp Operation device, and information processing method and information processing device therefor
CN103765365A (en) * 2011-08-31 2014-04-30 索尼公司 Operation device, and information processing method and information processing device therefor
CN103765365B (en) * 2011-08-31 2017-10-13 索尼公司 Operation device, its information processing method and information processor
RU2621183C2 (en) * 2011-08-31 2017-05-31 Сони Корпорейшн Handling device, data processing method therein and data processing device
WO2013047294A1 (en) * 2011-09-27 2013-04-04 Necカシオモバイルコミュニケーションズ株式会社 Portable electronic apparatus, input operation reception method, and input operation reception program
JP2013089201A (en) * 2011-10-21 2013-05-13 Sony Computer Entertainment Inc Input control unit, input control method and input control program
US9433857B2 (en) 2011-10-31 2016-09-06 Sony Corporation Input control device, input control method, and input control program
WO2013065214A1 (en) * 2011-10-31 2013-05-10 株式会社ソニー・コンピュータエンタテインメント Input control device, input control method, and input control program
JP2013114429A (en) * 2011-11-28 2013-06-10 Sega Corp Game device and game program
JP2013120564A (en) * 2011-12-08 2013-06-17 Nintendo Co Ltd Information processing system, information processing device, information processing method and information processing program
US9069457B2 (en) 2012-01-03 2015-06-30 Sony Corporation Portable terminal
JP2013162310A (en) * 2012-02-03 2013-08-19 Nikon Corp Electronic apparatus
US9411449B2 (en) 2012-02-08 2016-08-09 Nec Corporation Mobile terminal and operation method therefor
WO2013118522A1 (en) * 2012-02-08 2013-08-15 Necカシオモバイルコミュニケーションズ株式会社 Portable terminal and method for operating same
US20150011263A1 (en) * 2012-02-08 2015-01-08 Shinichi Itamoto Mobile terminal and operation method therefor
JPWO2013118522A1 (en) * 2012-02-08 2015-05-11 Necカシオモバイルコミュニケーションズ株式会社 Mobile terminal and operation method thereof
US9909852B2 (en) 2012-02-29 2018-03-06 Denso Corporation Operation position detection apparatus and vehicular apparatus
US9807381B2 (en) 2012-03-14 2017-10-31 Microsoft Technology Licensing, Llc Imaging structure emitter calibration
JP2015523583A (en) * 2012-03-28 2015-08-13 マイクロソフト コーポレーション Augmented reality light guide display
KR102049132B1 (en) * 2012-03-28 2019-11-26 마이크로소프트 테크놀로지 라이센싱, 엘엘씨 Augmented reality light guide display
US10388073B2 (en) 2012-03-28 2019-08-20 Microsoft Technology Licensing, Llc Augmented reality light guide display
US10191515B2 (en) 2012-03-28 2019-01-29 Microsoft Technology Licensing, Llc Mobile device light guide display
CN104205193B (en) * 2012-03-28 2018-01-26 微软技术许可有限责任公司 Augmented reality light guide is shown
CN104205193A (en) * 2012-03-28 2014-12-10 微软公司 Augmented reality light guide display
KR20140142337A (en) * 2012-03-28 2014-12-11 마이크로소프트 코포레이션 Augmented reality light guide display
US10478717B2 (en) 2012-04-05 2019-11-19 Microsoft Technology Licensing, Llc Augmented reality and physical games
US10296127B2 (en) 2012-04-07 2019-05-21 Samsung Electronics Co., Ltd. Object control method performed in device including transparent display, the device, and computer readable recording medium thereof
JP2015512549A (en) * 2012-04-07 2015-04-27 サムスン エレクトロニクス カンパニー リミテッド Object control method in device having transparent display, device and recording medium
WO2013157280A1 (en) * 2012-04-18 2013-10-24 Necカシオモバイルコミュニケーションズ株式会社 Position input device, position input method, position input program, and information processing device
US10502876B2 (en) 2012-05-22 2019-12-10 Microsoft Technology Licensing, Llc Waveguide optics focus elements
JPWO2014002615A1 (en) * 2012-06-27 2016-05-30 日本電気株式会社 Mobile terminal device, operation method of mobile terminal device, and operation program for mobile terminal device
US10394366B2 (en) 2012-06-29 2019-08-27 Nec Corporation Terminal device, display control method, and program
JPWO2014003012A1 (en) * 2012-06-29 2016-06-02 日本電気株式会社 Terminal device, display control method, and program
JP2014029594A (en) * 2012-07-31 2014-02-13 Canon Inc Information terminal and control method of the same, and program
WO2014020765A1 (en) 2012-08-03 2014-02-06 Necカシオモバイルコミュニケーションズ株式会社 Touch panel device, process determination method, program, and touch panel system
US9817567B2 (en) 2012-08-03 2017-11-14 Nec Corporation Touch panel device, process determination method, program, and touch panel system
KR101476221B1 (en) * 2012-08-10 2014-12-26 주식회사 네오위즈인터넷 Game method, game apparatus, and recording medium
WO2014025101A1 (en) * 2012-08-10 2014-02-13 주식회사 네오위즈인터넷 Game method, game device, and recording medium
US9733667B2 (en) 2012-10-01 2017-08-15 Nec Corporation Information processing device, information processing method and recording medium
JPWO2014054367A1 (en) * 2012-10-01 2016-08-25 日本電気株式会社 Information processing apparatus, information processing method, and program
US10192358B2 (en) 2012-12-20 2019-01-29 Microsoft Technology Licensing, Llc Auto-stereoscopic augmented reality display
US9319497B2 (en) 2013-01-02 2016-04-19 Sony Corporation Portable terminal
JP2014186374A (en) * 2013-03-21 2014-10-02 Casio Comput Co Ltd Information process device, information process method and program

Similar Documents

Publication Publication Date Title
US10209806B1 (en) Tri-state gesture-equipped touch screen system, method, and computer program product
US10185440B2 (en) Electronic device operating according to pressure state of touch input and method thereof
US10048917B2 (en) Remote control of a presentation
JP6382261B2 (en) Advanced camera-based input
JP6463795B2 (en) System and method for using textures in a graphical user interface device
US9575594B2 (en) Control of virtual object using device touch interface functionality
EP2960750B1 (en) Portable terminal and display method thereof
US10101873B2 (en) Portable terminal having user interface function, display method, and computer program
US10401964B2 (en) Mobile terminal and method for controlling haptic feedback
US9355472B2 (en) Device, method, and graphical user interface for adjusting the appearance of a control
US20190155420A1 (en) Information processing apparatus, information processing method, and program
US20170052599A1 (en) Touch Free Interface For Augmented Reality Systems
JP5960199B2 (en) Portable device with two-finger touch triggers selection and conversion of active elements
US20190235636A1 (en) Systems and Methods of Creating a Realistic Displacement of a Virtual Object in Virtual Reality/Augmented Reality Environments
CN103729160B (en) Multi-display equipment and multi display method
US9406281B2 (en) Multi-display device and method of controlling thereof
US9367233B2 (en) Display apparatus and method thereof
US8558790B2 (en) Portable device and control method thereof
US9348504B2 (en) Multi-display apparatus and method of controlling the same
JP6074170B2 (en) Short range motion tracking system and method
CN103049254B (en) DLL for semantic zoom
KR101984683B1 (en) Multi display device and method for controlling thereof
US20190354580A1 (en) Multi-word autocorrection
US10296127B2 (en) Object control method performed in device including transparent display, the device, and computer readable recording medium thereof
US9740321B2 (en) Method for operating application program and mobile electronic device using the same