CN106774875B - Information processing apparatus and method
- Publication number
- CN106774875B CN106774875B CN201611140417.3A CN201611140417A CN106774875B CN 106774875 B CN106774875 B CN 106774875B CN 201611140417 A CN201611140417 A CN 201611140417A CN 106774875 B CN106774875 B CN 106774875B
- Authority
- CN
- China
- Prior art keywords
- user
- input
- information processing
- page
- pages
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Landscapes
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
An information processing apparatus and method are disclosed. The information processing apparatus includes: a display unit, located on a first surface of the information processing apparatus, for displaying information to a user; a sensing unit, located on a second surface of the information processing apparatus, for detecting input by the user on the second surface, wherein the first surface is opposite to the second surface; and a processing unit, located in the information processing apparatus, for executing a predetermined operation corresponding to a predetermined input when the sensing unit detects the predetermined input by the user; wherein the display unit and a portion of the information processing apparatus placed overlapping the display unit each satisfy a predetermined light transmittance.
Description
Technical Field
The present invention relates to an information processing apparatus and method, and more particularly to an information processing apparatus and method that allow a specific gesture to be input on the back of an information processing apparatus having a transparent screen so as to perform an operation corresponding to that gesture.
Background
An existing information processing apparatus (such as a tablet) can perform an operation corresponding to a touch input only when that input is made on the front screen. In some cases, however, accepting input only on the front screen is inconvenient for the user. For example, when a user holds a tablet to browse web pages and wants to switch between multiple overlapping pages, front-screen input alone may cause some links to be clicked by mistake. In other cases, for example when the user wants to send a selected page to the bottom of a stack of overlapping pages, front-screen input alone makes the operation awkward.
Disclosure of Invention
In view of the above, it is desirable to provide a new and improved information processing apparatus and method capable of improving user convenience.
According to an aspect of an embodiment of the present invention, there is provided an information processing apparatus including:
a display unit, located on a first surface of the information processing apparatus, for displaying information to a user;
a sensing unit, located on a second surface of the information processing apparatus, for detecting input by a user on the second surface, wherein the first surface is opposite to the second surface; and
a processing unit, located in the information processing apparatus, for executing a predetermined operation corresponding to a predetermined input when the sensing unit detects the predetermined input by the user;
wherein the display unit and a portion of the information processing apparatus placed overlapping the display unit each satisfy a predetermined light transmittance.
Preferably, in an information processing apparatus according to an embodiment of the present invention, a portion of the information processing apparatus placed overlapping with the display unit includes:
a portion of the sensing unit corresponding to the display unit, and a portion, corresponding to the display unit, located between the display unit and the sensing unit.
Preferably, in the information processing apparatus according to an embodiment of the present invention, in a case where the display unit displays a plurality of overlapping pages in a direction perpendicular to the display unit, when the sensing unit detects a first input by a user, the processing unit performs a transition from a first view mode to a second view mode in which the display unit is capable of displaying a positional relationship between the plurality of pages.
Preferably, in the information processing apparatus according to an embodiment of the present invention, the first view mode is a plane overlap display mode, the second view mode is a plane side-by-side display mode, and in the plane overlap display mode, at least two of the plurality of pages have an overlapping portion, and in the plane side-by-side display mode, any two of the plurality of pages have no overlapping portion.
Preferably, in an information processing apparatus according to an embodiment of the present invention, the first view mode is a plane view mode, the second view mode is a stereoscopic view mode, and in the plane view mode, a page is displayed as a rectangle in which display content is contained, and in the stereoscopic view mode, a page is displayed in a perspective view in which a user can see a page thickness and can see a face containing the display content.
Preferably, in the information processing apparatus according to an embodiment of the present invention, after the transition to the second view mode, when the sensing unit detects a second input by the user, the processing unit selects any one of the plurality of pages according to a selection by the user.
Preferably, in the information processing apparatus according to an embodiment of the present invention, in a case where the display unit displays a plurality of overlapped pages, when the sensing unit detects the second input by the user, the processing unit performs an operation of selecting a specific one of the plurality of overlapped displayed pages according to a selection of the user.
Preferably, in the information processing apparatus according to the embodiment of the present invention, the sensing unit detects a position corresponding to the second input of the user, and if there is only one page corresponding to the position, the processing unit performs an operation of selecting the one page corresponding to the position, and if there are a plurality of pages corresponding to the position, the processing unit performs an operation of selecting a top page or a bottom page of the plurality of pages corresponding to the position.
Preferably, in the information processing apparatus according to an embodiment of the present invention, when the sensing unit detects a third input by the user, the processing unit transposes the selected page to a specific position.
Preferably, in the information processing apparatus according to an embodiment of the present invention, the third input is a user gesture of approaching or moving away from the second surface, the sensing unit detects a distance between the user's hand and the second surface, and the processing unit transposes the selected page to a specific position according to a change in the distance.
According to another aspect of embodiments of the present invention, there is provided an information processing method applied to an information processing apparatus that includes a display unit located on a first surface, wherein the display unit and a portion of the information processing apparatus placed overlapping the display unit each satisfy a predetermined light transmittance, the method including the steps of:
displaying a plurality of overlapping pages;
detecting a user input on a first surface and/or a second surface, wherein the first surface and the second surface are opposite; and
when a predetermined input by a user is detected, a predetermined operation corresponding to the predetermined input is performed.
Preferably, in an information processing method according to an embodiment of the present invention, when a predetermined input by a user is detected, the step of performing a predetermined operation corresponding to the predetermined input includes:
when a first input by a user is detected, a transition is performed from a first view mode to a second view mode in which the display unit is capable of displaying a positional relationship between a plurality of pages.
Preferably, in the information processing method according to an embodiment of the present invention, the first view mode is a plane overlap display mode, the second view mode is a plane side-by-side display mode, and in the plane overlap display mode, at least two of the plurality of pages have an overlapping portion, and in the plane side-by-side display mode, any two of the plurality of pages have no overlapping portion.
Preferably, in the information processing method according to an embodiment of the present invention, the first view mode is a plane view mode, the second view mode is a stereoscopic view mode, and in the plane view mode, the page is displayed as a rectangle containing the display content, and in the stereoscopic view mode, the page is displayed in a perspective view in which the user can see the thickness of the page and can see the face containing the display content.
Preferably, in the information processing method according to an embodiment of the present invention, after the transition to the second view mode, the method further includes the steps of:
when the second input of the user is detected, any one of the plurality of pages is selected according to the selection of the user.
Preferably, in an information processing method according to an embodiment of the present invention, when a predetermined input by a user is detected, the step of performing a predetermined operation corresponding to the predetermined input includes:
when the second input of the user is detected, the operation of selecting a specific one of the plurality of overlapped displayed pages is performed according to the selection of the user.
Preferably, in the information processing method according to an embodiment of the present invention, the step of performing an operation of selecting a specific one of the plurality of overlapped displayed pages when the second input by the user is detected includes:
detecting a position corresponding to a second input of the user;
if there is only one page corresponding to the position, an operation of selecting that one page is performed;
and if there are a plurality of pages corresponding to the position, an operation of selecting a top page or a bottom page among the plurality of pages corresponding to the position is performed.
Preferably, in the information processing method according to the embodiment of the present invention, after the page is selected, the method further includes the steps of:
when the third input of the user is detected, the selected page is transposed to a specific position.
Preferably, in the information processing method according to an embodiment of the present invention, the third input is a user gesture of approaching or moving away from the second surface, and when the third input by the user is detected, the step of swapping the selected page to a specific position includes:
the distance between the user's hand and the second surface is detected, and the selected page is transposed to a specific position according to the change of the distance.
With the information processing apparatus and method described above, a specific gesture can be input on the back of the information processing apparatus, so that more operations that match the user's habits can be completed through back input alone or through a combination of back input and front input, thereby improving user convenience.
Drawings
Fig. 1 is a block diagram showing a functional configuration of an information processing apparatus according to an embodiment of the present invention;
fig. 2 and 3 are diagrams showing the appearance of an information processing apparatus according to an embodiment of the present invention;
fig. 4 shows a first example of a side view and a corresponding front view for explaining an arrangement positional relationship of respective parts in an information processing apparatus according to an embodiment of the present invention;
fig. 5 shows a second example of a side view and a corresponding front view for explaining an arrangement positional relationship of respective parts in the information processing apparatus according to the embodiment of the present invention;
fig. 6 shows a third example of a side view and a corresponding front view for explaining an arrangement positional relationship of respective parts in the information processing apparatus according to the embodiment of the present invention;
fig. 7 to 19 show display examples of a display unit in an information processing apparatus according to an embodiment of the present invention;
fig. 20 is a side view showing a first example for explaining the position of the second sensing unit and the image capturing range in the information processing apparatus according to the embodiment of the present invention;
fig. 21 is a side view showing a second example for explaining the position of the second sensing unit and the image capturing range in the information processing apparatus according to the embodiment of the present invention; and
fig. 22 is a flowchart showing a procedure of an information processing method according to an embodiment of the present invention.
Detailed Description
Various preferred embodiments of the present invention will be described below with reference to the accompanying drawings. The following description with reference to the accompanying drawings is provided to assist in understanding the exemplary embodiments of the invention as defined by the claims and their equivalents. It includes various specific details to assist understanding, but they are to be construed as merely illustrative. Accordingly, those skilled in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the present invention. Also, in order to make the description clearer and simpler, a detailed description of functions and configurations well known in the art will be omitted.
First, an information processing apparatus according to an embodiment of the present invention is described with reference to fig. 1. Fig. 1 is a block diagram showing a functional configuration of an information processing apparatus according to an embodiment of the present invention. As shown in fig. 1, the information processing apparatus 100 includes: a display unit 101, a first sensing unit 102 and a processing unit 103.
The display unit 101 is located on the front surface (defined as a first surface) of the information processing apparatus 100 for displaying information to a user. The first sensing unit 102 is located at the back side (defined as a second surface) of the information processing apparatus 100, and detects an input of a user at the back side of the information processing apparatus. As can be seen, the display unit 101 and the first sensing unit 102 are located on two opposite sides in the information processing apparatus 100.
The processing unit 103 is located in the information processing apparatus 100, and is configured to perform a predetermined operation corresponding to a predetermined input by the user when the predetermined input is detected by the first sensing unit 102.
Here, it should be noted that, in order for the user to make an input (e.g., a specific gesture) on the back of the information processing apparatus 100, the user must be able to see, from the front of the information processing apparatus 100, the hand located at its back. For example, fig. 2 and 3 are schematic external views of the information processing apparatus 100. In fig. 2 and 3, an object located on the back side of the information processing apparatus 100 can be seen from its front side. Therefore, the display unit 101 and a portion of the information processing apparatus 100 placed overlapping the display unit 101 are designed to each satisfy a predetermined light transmittance. For example, the display unit 101 and the portion of the information processing apparatus 100 placed overlapping the display unit 101 may be completely transparent or translucent, as long as an object located behind them can be seen through them.
In the information processing apparatus 100, the first sensing unit 102 may be disposed to completely/partially overlap the display unit 101, while neither the processing unit 103 nor the other components (not mentioned in this specification, since they are not closely related to the present invention) is disposed to overlap the foregoing two.
Fig. 4 shows a first example of a side view of the information processing apparatus 100 in this case and a corresponding front view, in which the shaded portion represents an opaque portion. As shown in fig. 4, the display unit 101 and the first sensing unit 102 are disposed to overlap without including other components therebetween. The display unit 101 is located at an upper portion of the information processing apparatus 100, and the processing unit 103 and other components are located at a lower portion of the information processing apparatus 100. It can be seen that the information processing apparatus shown in fig. 4 corresponds to the appearance shown in fig. 2 mentioned earlier.
Fig. 5 shows a side view of the information processing apparatus 100 in this case and a second example of a corresponding front view, in which the shaded portion represents an opaque portion. As shown in fig. 5, the display unit 101 and the first sensing unit 102 are disposed to overlap without including other components therebetween. The display unit 101 is located in the middle of the information processing apparatus 100, and the processing unit 103 and other components are located in the upper and lower portions of the information processing apparatus 100, respectively. It can be seen that the information processing apparatus shown in fig. 5 corresponds to the appearance shown in fig. 3 mentioned earlier. Of course, this is merely an example, and any other configuration is possible.
In this case, the portion of the information processing apparatus 100 placed to overlap with the display unit 101 includes a portion of the first sensing unit corresponding to the display unit.
Further alternatively, the first sensing unit 102 may be disposed to completely/partially overlap the display unit 101, with part or all of the processing unit 103 and other components therebetween. Fig. 6 shows a side view of the information processing apparatus 100 in this case and a corresponding front view, in which the shaded portion indicates an opaque portion. As shown in fig. 6, the display unit 101 and the first sensing unit 102 are disposed to overlap, with a portion of the processing unit 103 and other components included therebetween; the display unit 101, the first sensing unit 102, and that portion of the processing unit 103 and other components are located at an upper portion of the information processing apparatus 100, while the remaining portion of the processing unit 103 and other components is located at a lower portion of the information processing apparatus. In this case, the portion of the information processing apparatus placed overlapping the display unit includes, in addition to the portion of the first sensing unit corresponding to the display unit mentioned in the former case, the portion corresponding to the display unit that is disposed between the display unit and the first sensing unit.
Display examples of the display unit in the information processing apparatus according to the embodiment of the present invention in the case of performing view conversion are described below with reference to fig. 7 to 11.
Fig. 7 shows a case where the display unit 101 displays a plurality of overlapped pages. As shown in fig. 7, a plurality of overlapped pages on which contents A, B, C, D, E are displayed, respectively, are displayed in a direction perpendicular to the display unit 101. The state shown in fig. 7 is taken as an initial state.
In the initial state, when the first sensing unit 102 detects a first input of a user, for example a single-click operation or a touch held for a predetermined time, the processing unit 103 performs a transition from a first view mode to a second view mode in which the display unit is capable of displaying the positional relationship between the plurality of pages.
Fig. 8 shows a first display example of the display unit after view conversion. In a first display example, the first view mode is a plane overlay display mode, and the second view mode is a plane side-by-side display mode. In the plane overlay display mode, at least two of the plurality of pages have an overlapping portion (as shown in fig. 7), and in the plane side-by-side display mode, any two of the plurality of pages have no overlapping portion (as shown in fig. 8), and the pages are sequentially arranged in a plane form in order from left to right and from top to bottom.
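For illustration only, the following Python sketch shows one way the transition from the plane overlap display mode to the plane side-by-side display mode could be modeled. The Page class, the rectangle representation, and the grid parameters are assumptions introduced for this sketch and are not part of the disclosed apparatus.

```python
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Page:
    """A displayed page; the name and rect (x, y, width, height) fields are illustrative."""
    name: str
    rect: Tuple[int, int, int, int]


def to_side_by_side(pages: List[Page], screen_w: int, screen_h: int,
                    cols: int = 3) -> List[Page]:
    """Rearrange overlapping pages into a non-overlapping grid,
    ordered left to right and top to bottom (the second view mode)."""
    rows = -(-len(pages) // cols)  # ceiling division
    cell_w, cell_h = screen_w // cols, screen_h // rows
    return [Page(p.name, ((i % cols) * cell_w, (i // cols) * cell_h, cell_w, cell_h))
            for i, p in enumerate(pages)]


# Example: five overlapping pages A..E rearranged so that no two overlap.
overlapping = [Page(n, (20 * i, 20 * i, 300, 200)) for i, n in enumerate("ABCDE")]
side_by_side = to_side_by_side(overlapping, screen_w=960, screen_h=640)
```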
In the state shown in fig. 8, the user can select between different pages by moving a finger in the up, down, left, and right directions, as shown in fig. 9. Fig. 9 shows an example in which the user moves a finger in different directions. The page on which the user's finger is currently located is displayed with a bold frame. For example, in fig. 9 the finger is currently located on the page displaying content D (hereinafter referred to as page D; similarly, the pages displaying contents A, B, C, and E are referred to as pages A, B, C, and E, respectively).
Fig. 10 shows a second display example of the display unit after view conversion. In a second display example, the first view mode is a plan view mode, and the second view mode is a stereoscopic view mode. And, in the plan view mode, the page is displayed as a rectangle (as shown in fig. 7) in which the display contents are contained, and in the stereoscopic view mode, the page is displayed in a perspective view in which the user can see the thickness of the page and can see the face containing the display contents (as shown in fig. 10).
In the state shown in fig. 10, the user can select between different pages by moving the finger in the left-right direction, as shown in fig. 11. Fig. 11 shows an example in which the user moves a finger in the left-right direction. The page where the user's finger is currently located is displayed in a bold frame. For example, fig. 11 shows that the page on which the current finger is located is page C.
After the display unit 101 has been switched to the second view mode as described above, when the first sensing unit 102 detects a second input of the user, the processing unit 103 selects any one of the plurality of pages according to the user's selection. That is, the second input by the user is an operation of selecting one of the plurality of pages. The second input may be the same as or different from the first input. For example, the second input may be a single-click operation or a touch held for a predetermined time.
In the above, the case where the processing unit 103 performs the view conversion when the first sensing unit 102 detects the first input of the user in the initial state is described. In this case, since the view conversion is performed, the user is enabled to select any one among a plurality of pages which do not overlap with each other. Of course, as another embodiment, the view conversion may not be performed.
Next, a display example of the display unit in the information processing apparatus according to the embodiment of the present invention without view conversion is described with reference to fig. 12 and 13.
In the case where a plurality of overlapped pages are displayed on the display unit 101, when the first sensing unit 102 detects the second input of the user, the processing unit 103 performs an operation of selecting a specific one of the plurality of overlapped displayed pages according to the selection of the user.
The processing unit 103 selects a specific one of the pages as follows: the first sensing unit 102 first detects a position corresponding to the second input of the user, if there is only one page corresponding to the position, the processing unit 103 performs an operation of selecting the one page corresponding to the position, and if there are a plurality of pages corresponding to the position, the processing unit 103 performs an operation of selecting a top page or a bottom page of the plurality of pages corresponding to the position.
For example, fig. 12 shows an example of selecting one page in a case where the display unit displays a plurality of overlapping pages. In fig. 12, since the page corresponding to the finger position of the user includes page a and page C, the processing unit 103 selects the top one of the two pages. In the case of fig. 12, the top of the two pages is page a. Of course, which of the two pages is selected is not limited thereto. The processing unit 103 may also select the bottom of the two pages, i.e. page C.
As another example, fig. 13 shows a further example of selecting one page in a case where the display unit displays a plurality of overlapping pages. In fig. 13, there is only one page corresponding to the user's finger position, namely page A, and thus the processing unit 103 selects that page.
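The selection rule just described amounts to a hit test over the page stack. The following Python sketch, which reuses the hypothetical Page class from the earlier sketch, shows one possible reading of that rule; the ordering convention (front-most page first) and the pick_top flag are assumptions made for illustration.

```python
from typing import List, Optional


def select_page(pages_top_to_bottom: List[Page], x: int, y: int,
                pick_top: bool = True) -> Optional[Page]:
    """Hit-test the position reported by the back-surface sensing unit.

    Returns the only page under the position, the top-most or bottom-most
    page when several overlap there, or None when no page is hit.
    """
    hits = [p for p in pages_top_to_bottom
            if p.rect[0] <= x < p.rect[0] + p.rect[2]
            and p.rect[1] <= y < p.rect[1] + p.rect[3]]
    if not hits:
        return None
    return hits[0] if pick_top else hits[-1]
```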
In the above, when a plurality of overlapped pages are displayed on the display unit, the description has been made separately for the case of performing view conversion and the case of not performing view conversion. In both cases, the processing unit 103 may eventually select a page.
After the processing unit 103 selects one page, when the first sensing unit 102 detects a third input of the user, the processing unit 103 transposes the selected page to a specific position. Several embodiments for swapping a selected page to a specific position are described below with reference to fig. 14 to 19.
As an embodiment, fig. 14 shows a display example of swapping a page after the selected page shown in fig. 9. In this case, the third input may be a sliding operation moving from the page D to the page E on the first sensing unit 102.
Of course, it should be noted that the third input is not limited to the sliding operation described above. Alternatively, the third input may be an operation of the user's finger approaching or moving away from the first sensing unit 102. For example, when the user's finger approaches, the selected page is swapped forward (i.e., D → C → B → A), and when the user's finger moves away, the selected page is swapped backward (i.e., D → E). Further, the pages may be exchanged step by step according to the change in the distance between the finger and the first sensing unit 102. For example, as the distance decreases, the selected page D is transposed step by step to page C, page B, and page A; alternatively, as the distance increases, the selected page D is transposed to page E. Of course, this is merely an example, and the mapping may equally be reversed. Moreover, the skilled person can also exchange pages in any other possible way.
Fig. 15 shows a display example after the swap page operation shown in fig. 14. As shown in fig. 15, page D has been transposed with page E.
As another embodiment, fig. 16 shows a display example of swapping a page after the selected page shown in fig. 11. In this case, the third input may be a sliding operation in which the user's finger moves from the page C to the page a in the left-right direction on the first sensing unit 102.
Similarly, the third input here is also not limited to the aforementioned sliding operation. Alternatively, the third input may be an operation of the user's finger approaching or moving away from the first sensing unit 102. For example, when the user's finger approaches, the selected page is transposed to the left (i.e., C → B → A), and when the user's finger moves away, the selected page is transposed to the right (i.e., C → D → E). Further, similarly to the above, the pages may be exchanged step by step according to the change in the distance between the finger and the first sensing unit 102. For example, as the distance decreases, the selected page C is transposed step by step to page B and page A; alternatively, as the distance increases, the selected page C is transposed step by step to page D and page E. Of course, this is merely an example, and the mapping may equally be reversed. Moreover, the skilled person can also exchange pages in any other possible way.
Fig. 17 shows a display example after the swap page operation shown in fig. 16. As shown in fig. 17, page C has been transposed with page a.
As still another embodiment, fig. 18 illustrates an operation of swapping the page selected in fig. 12. In this case, the third input may be an operation of the user's finger moving away from the first sensing unit 102. In fig. 18, the rightward arrow indicates the user's finger moving away, and the upward arrow indicates the downward movement of page A. For example, when the first sensing unit 102 detects the third input of the user, the processing unit 103 may set the selected page to the bottom by default, or may move the page down step by step according to the change in the distance of the finger from the first sensing unit 102. Of course, as mentioned above, this exchange is merely an example, and those skilled in the art may exchange pages in any other possible way.
Fig. 19 shows a display example after the swap page operation shown in fig. 18. As shown in fig. 19, page a is set to the bottom.
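The distance-driven swapping described for figs. 14 to 19 can be summarized as mapping a change in the finger-to-surface distance onto a number of swap steps in the ordered page list. The sketch below is one possible such mapping; the 10 mm step size, the sign convention (approaching moves the page forward), and the list-based page order are all assumptions made for illustration.

```python
from typing import List


def swap_by_distance(pages: List[str], selected: int,
                     old_dist_mm: float, new_dist_mm: float,
                     step_mm: float = 10.0) -> int:
    """Move the selected page one slot per `step_mm` of distance change.

    A decreasing distance (finger approaching the back surface) moves the
    page toward the front of the list (e.g., D -> C -> B -> A); an
    increasing distance moves it toward the back (e.g., D -> E).
    Returns the new index of the selected page.
    """
    steps = int((old_dist_mm - new_dist_mm) / step_mm)  # > 0 when approaching
    target = max(0, min(len(pages) - 1, selected - steps))
    pages.insert(target, pages.pop(selected))
    return target


# Example: pages A..E with page D selected; the finger moves 25 mm closer.
order = list("ABCDE")
new_index = swap_by_distance(order, selected=3, old_dist_mm=40.0, new_dist_mm=15.0)
# order is now ['A', 'D', 'B', 'C', 'E'] and new_index is 1
```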
Further, it should be noted that there are three ways of detecting the distance between the user's finger and the back surface (i.e., the second surface) of the information processing apparatus 100 as follows.
As a first implementation, the first sensing unit 102 may adopt an enhanced capacitive screen, which can detect not only the touch of the user's finger but also the approach of the user's finger and the distance to it. Since such enhanced capacitive screens are known in the art, their details are not described here.
As a second embodiment, the information processing apparatus 100 may further include a second sensing unit 104 (i.e., a camera) for detecting a change in the distance of the hand from the second surface. Fig. 20 shows a side view of the information processing apparatus 100 in this case. As shown in fig. 20, the second sensing unit 104 is located in the information processing apparatus 100 with its photographing lens parallel to the second surface. Since the first sensing unit 102 satisfies the predetermined light transmittance, the second sensing unit 104 can capture, from a front viewing angle, pictures of the finger approaching or moving away from the first sensing unit 102. The second sensing unit 104 determines the distance change from the change in the apparent size of the finger in the captured pictures: when the finger appears smaller, the distance has increased, and when the finger appears larger, the distance has decreased. The processing unit 103 then exchanges the pages according to the distance change detected by the second sensing unit 104.
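A minimal sketch of the idea behind this second embodiment is given below: the apparent size of the finger in successive camera frames is compared, and the direction of the distance change is inferred from whether that size grows or shrinks. How the binary finger mask is obtained (segmentation), the frame format, and the 5 % tolerance are assumptions of the sketch, not details disclosed by the patent.

```python
import numpy as np


def distance_trend(prev_mask: np.ndarray, curr_mask: np.ndarray,
                   tolerance: float = 0.05) -> str:
    """Infer whether the finger moved toward or away from the back surface.

    Each argument is a binary mask in which nonzero pixels are classified
    as 'finger'.  A larger apparent finger area means the finger has come
    closer to the camera (and hence to the second surface); a smaller
    area means it has moved away.  Relative changes below `tolerance`
    are reported as 'unchanged'.
    """
    prev_area = int(np.count_nonzero(prev_mask))
    curr_area = int(np.count_nonzero(curr_mask))
    if prev_area == 0:
        return "unknown"
    ratio = (curr_area - prev_area) / prev_area
    if ratio > tolerance:
        return "approaching"
    if ratio < -tolerance:
        return "receding"
    return "unchanged"
```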
As a third embodiment, the information processing apparatus 100 likewise includes a second sensing unit 104 (i.e., a camera) for detecting a change in the distance of the hand from the second surface. Fig. 21 shows a side view of the information processing apparatus 100 in this case. Unlike the aforementioned second embodiment, the photographing lens of the second sensing unit 104 is perpendicular to the second surface and can capture, from a side viewing angle, pictures of the finger approaching or moving away from the first sensing unit 102, so that whether the distance between the finger and the second surface increases or decreases can be determined intuitively from the pictures. The processing unit 103 then exchanges the pages according to the distance change detected by the second sensing unit 104.
In the above, the embodiment has been described in which, after the processing unit selects one page, when the sensing unit detects the third input of the user, the processing unit transposes the selected page to a specific position.
In addition, it should be noted that, as described above, in the case where a plurality of overlapping pages are displayed on the display unit, after the view conversion, page selection, and page exchange have been performed, when the sensing unit detects a fourth input, the processing unit converts the current view back into the initial view for the user to view.
Further, it should be noted that the above description covers the case where the first sensing unit and/or the second sensing unit are provided only on the back surface (second surface) of the information processing apparatus 100. It is understood, however, that a third sensing unit may also be included on the front surface (first surface) of the information processing apparatus 100, and that the operations of view conversion, page selection, page exchange, and the like may be completed through the interplay of the sensing units on the front and back sides. For example, when the first sensing unit located on the back detects the first input, the processing unit performs the view conversion. After the view conversion, when the third sensing unit located on the front detects the second input, the processing unit selects a page. Then, when the first sensing unit and the second sensing unit on the back detect the third input, the processing unit exchanges the position of the selected page. Of course, the above approach is merely an example, and any other approach is possible.
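As a purely illustrative summary of this front/back interplay, the small state machine below sequences the three inputs in the order just described. The event names, the three states, and the returned action labels are all hypothetical and chosen only for this sketch.

```python
class BackFrontController:
    """Toy sequencer: back input converts the view, front input selects a
    page, and a further back input exchanges the selected page's position."""

    def __init__(self) -> None:
        self.state = "initial"  # initial -> converted -> selected

    def on_event(self, surface: str, kind: str) -> str:
        if self.state == "initial" and surface == "back" and kind == "tap":
            self.state = "converted"      # first input: view conversion
            return "convert_view"
        if self.state == "converted" and surface == "front" and kind == "tap":
            self.state = "selected"       # second input: page selection
            return "select_page"
        if self.state == "selected" and surface == "back" and kind == "move":
            return "swap_page"            # third input: position exchange
        return "ignore"


controller = BackFrontController()
actions = [controller.on_event("back", "tap"),
           controller.on_event("front", "tap"),
           controller.on_event("back", "move")]
# actions == ['convert_view', 'select_page', 'swap_page']
```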
Hereinafter, an information processing method according to an embodiment of the present invention will be described with reference to fig. 22. Fig. 22 is a flowchart showing a procedure of an information processing method according to an embodiment of the present invention. The information processing method is applied to the information processing apparatus 100 described hereinabove. As shown in fig. 22, the method includes the steps of:
first, in step S2201, a plurality of overlapped pages are displayed.
Then, in step S2202, it is determined whether or not an input by a user on a first surface and/or a second surface, which are opposite to each other, is detected.
If the determination result in step S2202 indicates that the predetermined input is detected, the processing proceeds to step S2203. In step S2203, a predetermined operation corresponding to the predetermined input is performed.
When a predetermined input by a user is detected, the step of performing a predetermined operation corresponding to the predetermined input includes: when a first input by a user is detected, a transition is performed from a first view mode to a second view mode in which the display unit is capable of displaying a positional relationship between a plurality of pages.
For example, the first view mode is a plane overlap display mode, the second view mode is a plane side-by-side display mode, and in the plane overlap display mode, at least two of the plurality of pages have an overlapping portion, and in the plane side-by-side display mode, any two of the plurality of pages have no overlapping portion.
Further alternatively, the first view mode is a plane view mode, the second view mode is a stereoscopic view mode, and in the plane view mode, a page is displayed as a rectangle in which display contents are contained, and in the stereoscopic view mode, a page is displayed in a perspective view in which a user can see a page thickness and a face containing the display contents.
After the transition to the second view mode, the method further comprises the steps of: when the second input of the user is detected, any one of the plurality of pages is selected according to the selection of the user.
The case where the view conversion is performed is described above. Alternatively, no view conversion may be performed. In this case, when a predetermined input by the user is detected, the step of performing a predetermined operation corresponding to the predetermined input includes: when the second input of the user is detected, the operation of selecting a specific page in the plurality of overlapped displayed pages is executed.
Wherein, when the second input of the user is detected, the step of performing the operation of selecting a specific one of the plurality of overlapped displayed pages includes: detecting a position corresponding to a second input of the user; if only one page corresponding to the position exists at the position, the operation of selecting the page corresponding to the position is executed; and if a plurality of pages corresponding to the position exist at the position, selecting a top page or a bottom page in the plurality of pages corresponding to the position is executed.
After selecting a page, the method further comprises the steps of: when the third input of the user is detected, the selected page is transposed to a specific position.
For example, the third input is a user gesture of approaching or moving away from the second surface, and when the third input by the user is detected, the step of swapping the selected page to a specific position includes: detecting the distance between the user's hand and the second surface, and transposing the selected page to a specific position according to the change in that distance.
Since the steps of the information processing method according to the embodiment of the present invention completely correspond to the respective parts of the information processing apparatus described previously, details regarding the steps are not described in detail again in order to avoid redundancy.
Thus far, the information processing apparatus and method according to the embodiments of the present invention have been described in detail. Unlike the related-art information processing apparatus and method, which accept input only on the front side, the information processing apparatus and method according to the embodiments of the present invention allow a specific gesture to be input on the back side, so that more operations matching the user's habits can be completed through back input alone or through a combination of back input and front input, thereby improving user convenience.
It should be noted that, in the present specification, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a(n) ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Finally, it should be noted that the series of processes described above includes not only processes performed in time series in the order described herein, but also processes performed in parallel or individually, rather than in time series.
Through the above description of the embodiments, those skilled in the art will clearly understand that the present invention may be implemented by software together with a necessary hardware platform, or entirely by software. Based on this understanding, all or part of the contribution that the technical solution of the present invention makes over the background art can be embodied in the form of a software product. The software product can be stored in a storage medium, such as a ROM/RAM, a magnetic disk, or an optical disk, and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute the methods of the embodiments, or of certain parts of the embodiments, of the present invention.
The present invention has been described in detail above. The principles and embodiments of the present invention have been explained herein using specific examples, which are intended only to help in understanding the method and core idea of the present invention. Meanwhile, a person skilled in the art may, following the idea of the present invention, make changes to the specific embodiments and the scope of application. In summary, the content of this specification should not be construed as limiting the present invention.
Claims (11)
1. An information processing apparatus comprising:
a display unit, located on a first surface of the information processing apparatus, for displaying information to a user;
the sensing unit is positioned on a second surface of the information processing device and used for detecting input of a user on the second surface, wherein the first surface is opposite to the second surface; and
the processing unit is positioned in the information processing equipment and used for executing a preset operation corresponding to a preset input when the preset input of a user is detected by the sensing unit;
wherein the display unit and a portion of the information processing apparatus placed overlapping the display unit each satisfy a predetermined light transmittance,
wherein in a case where the display unit displays a plurality of overlapped pages, when the sensing unit detects a second input of the user, the processing unit performs an operation of selecting a specific one of the plurality of overlapped displayed pages according to a selection of the user,
wherein the sensing unit detects a position corresponding to the second input of the user; if there is only one page corresponding to the position, the processing unit performs an operation of selecting that one page, and if there are a plurality of pages corresponding to the position, the processing unit performs an operation of selecting a set-to-bottom page among the plurality of pages corresponding to the position, wherein when the sensing unit detects a third input of the user, the processing unit transposes the selected page to a specific position,
wherein, in a case where the display unit displays a plurality of overlapped pages in a direction perpendicular to the display unit, when the sensing unit detects a first input of the user, the processing unit performs a transition from a first view mode to a second view mode in which the display unit can display a positional relationship between the plurality of pages, wherein the first input is a single-click operation by the user or a touch held for a predetermined time.
2. The information processing apparatus according to claim 1, wherein a portion of the information processing apparatus placed overlapping with the display unit includes:
the display unit comprises a part corresponding to the display unit in the sensing unit and a part corresponding to the display unit between the display unit and the sensing unit.
3. The information processing apparatus according to claim 1, wherein the first view mode is a plane overlap display mode, the second view mode is a plane side-by-side display mode, and in the plane overlap display mode, at least two of the plurality of pages have an overlapping portion, and in the plane side-by-side display mode, any two of the plurality of pages have no overlapping portion.
4. The information processing apparatus according to claim 1, wherein the first view mode is a plane view mode, the second view mode is a stereoscopic view mode, and in the plane view mode, a page is displayed as a rectangle in which display content is contained, and in the stereoscopic view mode, a page is displayed in a perspective view in which a user can see a page thickness and can see a face containing display content.
5. The information processing apparatus according to claim 1, wherein after the transition to the second view mode, when the sensing unit detects a second input by the user, the processing unit selects any one of the plurality of pages according to a selection by the user.
6. The information processing apparatus according to claim 1, wherein when the third input by the user is detected, the sensing unit detects a distance of a hand of the user from the second surface, and the processing unit sets the selected page to a position corresponding to the third input according to a change in the distance.
7. An information processing method applied to an information processing apparatus including a display unit located on a first surface, and satisfying a predetermined light transmittance in both the display unit and a portion of the information processing apparatus placed to overlap the display unit, the method comprising the steps of:
displaying a plurality of overlapping pages;
detecting a user input on a first surface and/or a second surface, wherein the first surface and the second surface are opposite; and
when a predetermined input of a user is detected, executing a predetermined operation corresponding to the predetermined input;
wherein when a predetermined input by a user is detected, the step of performing a predetermined operation corresponding to the predetermined input includes:
when the second input of the user is detected, the operation of selecting a specific page in the plurality of overlapped and displayed pages is executed according to the selection of the user;
wherein, when the second input of the user is detected, the step of performing an operation of selecting a specific one of the plurality of overlapping displayed pages according to the selection of the user includes: detecting a position corresponding to the second input of the user; if there is only one page corresponding to the position, the processing unit performing an operation of selecting that one page; and if there are a plurality of pages corresponding to the position, the processing unit performing an operation of selecting a set-to-bottom page among the plurality of pages corresponding to the position, wherein, when a third input of the user is detected, the selected page is swapped to a specific position,
wherein when a predetermined input by a user is detected, the step of performing a predetermined operation corresponding to the predetermined input includes:
when a first input of a user is detected, a transition from a first view mode to a second view mode in which the display unit is capable of displaying a positional relationship between a plurality of pages is performed, wherein the first input is a single-click operation by the user or a touch held for a predetermined time.
8. The information processing method according to claim 7, wherein the first view mode is a plane overlap display mode, the second view mode is a plane side-by-side display mode, and in the plane overlap display mode, at least two of the plurality of pages have an overlapping portion, and in the plane side-by-side display mode, any two of the plurality of pages have no overlapping portion.
9. The information processing method according to claim 7, wherein the first view mode is a plane view mode, the second view mode is a stereoscopic view mode, and in the plane view mode, a page is displayed as a rectangle in which display content is contained, and in the stereoscopic view mode, a page is displayed in a perspective view in which a user can see a page thickness and a face containing display content.
10. The information processing method according to claim 7, wherein after the transition to the second view mode, further comprising the steps of:
when the second input of the user is detected, any one of the plurality of pages is selected according to the selection of the user.
11. The information processing method of claim 7, wherein the method further comprises: when the third input of the user is detected, setting the selected page to a position corresponding to the third input;
when a third input by the user is detected, the step of setting the selected page to a position corresponding to the third input includes:
detecting a distance between a hand of the user and the second surface, and setting the selected page to a position corresponding to the third input according to a change in the distance.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611140417.3A CN106774875B (en) | 2012-02-20 | 2012-02-20 | Information processing apparatus and method |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611140417.3A CN106774875B (en) | 2012-02-20 | 2012-02-20 | Information processing apparatus and method |
CN201210040061.1A CN103257704B (en) | 2012-02-20 | 2012-02-20 | Messaging device and method |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201210040061.1A Division CN103257704B (en) | 2012-02-20 | 2012-02-20 | Messaging device and method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106774875A CN106774875A (en) | 2017-05-31 |
CN106774875B true CN106774875B (en) | 2021-04-13 |
Family
ID=48961661
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611140417.3A Active CN106774875B (en) | 2012-02-20 | 2012-02-20 | Information processing apparatus and method |
CN201210040061.1A Active CN103257704B (en) | 2012-02-20 | 2012-02-20 | Messaging device and method |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201210040061.1A Active CN103257704B (en) | 2012-02-20 | 2012-02-20 | Messaging device and method |
Country Status (1)
Country | Link |
---|---|
CN (2) | CN106774875B (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110389736A (en) * | 2019-06-05 | 2019-10-29 | 华为技术有限公司 | A kind of throwing screen display methods and electronic equipment |
Family Cites Families (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5543588A (en) * | 1992-06-08 | 1996-08-06 | Synaptics, Incorporated | Touch pad driven handheld computing device |
GB2344905A (en) * | 1998-12-17 | 2000-06-21 | Canon Kk | Hand held electronic device with back entry touch pad |
GB0317715D0 (en) * | 2003-07-29 | 2003-09-03 | Koninkl Philips Electronics Nv | Display and input device |
KR101526970B1 (en) * | 2008-05-29 | 2015-06-16 | 엘지전자 주식회사 | Terminal and method for controlling the same |
KR20110081040A (en) * | 2010-01-06 | 2011-07-13 | 삼성전자주식회사 | Method and apparatus for operating content in a portable terminal having transparent display panel |
JP5095780B2 (en) * | 2010-06-25 | 2012-12-12 | シャープ株式会社 | Image forming apparatus |
KR101712909B1 (en) * | 2010-07-16 | 2017-03-22 | 엘지전자 주식회사 | An electronic device including a touch screen display, a interface method using the same and computer-readable storage medium storing the same |
- 2012-02-20 CN CN201611140417.3A patent/CN106774875B/en active Active
- 2012-02-20 CN CN201210040061.1A patent/CN103257704B/en active Active
Non-Patent Citations (1)
Title |
---|
A Dynamic Web Page Adaptation for Mobile Device Based on Web2.0; Yunpeng Xiao, Yang Tao, Wenji Li; 2008 Advanced Software Engineering and Its Applications; 2008-12-22; entire document *
Also Published As
Publication number | Publication date |
---|---|
CN103257704A (en) | 2013-08-21 |
CN106774875A (en) | 2017-05-31 |
CN103257704B (en) | 2016-12-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6132644B2 (en) | Information processing apparatus, display control method, computer program, and storage medium | |
JP5968905B2 (en) | 3D display terminal device and operation method thereof | |
EP2592541A2 (en) | System and method for executing an e-book reading application in an electronic device | |
JP5907692B2 (en) | Portable terminal device, program, and display control method | |
US20140165013A1 (en) | Electronic device and page zooming method thereof | |
JP5975794B2 (en) | Display control apparatus, display control method, program, and storage medium | |
CN110121693B (en) | Content collision in a multi-layer display system | |
JP5265433B2 (en) | Display device and program | |
KR102033801B1 (en) | User interface for editing a value in place | |
US20140362016A1 (en) | Electronic book display device that performs page turning in response to user operation pressing screen, page turning method, and program | |
CN102163121A (en) | Information processing device, information processing method, and program | |
TW201005599A (en) | Touch-type mobile computing device and control method of the same | |
JP6299608B2 (en) | Document browsing device, electronic document page turning method and program | |
KR102205283B1 (en) | Electro device executing at least one application and method for controlling thereof | |
JP2013016018A (en) | Display control apparatus, control method, and program | |
CN103201716A (en) | Touch-sensitive electronic device | |
KR20150074145A (en) | Method, apparatus and terminal device for controlling movement of application interface | |
CN104951052A (en) | Information processing method and electronic equipment | |
JP6229055B2 (en) | Method and apparatus for displaying objects | |
JP6299245B2 (en) | Display device, control program, scroll display method, and recording medium | |
JP5442128B2 (en) | Object layout editing method and apparatus | |
US20140176454A1 (en) | Touch control method and handheld device utilizing the same | |
CN106774875B (en) | Information processing apparatus and method | |
JP2015046062A (en) | Electronic apparatus | |
US20190332237A1 (en) | Method Of Navigating Panels Of Displayed Content |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
GR01 | Patent grant | |