CN102081493A - Mobile electronic device and control method of 3D (three-dimensional) operation interface thereof - Google Patents


Info

Publication number
CN102081493A
Authority
CN
China
Prior art keywords
angle
dimensional
screen
specific object
mobile electronic device
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN2009102467864A
Other languages
Chinese (zh)
Inventor
邱文宗
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Acer Inc
Original Assignee
Acer Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Acer Inc filed Critical Acer Inc
Priority to CN2009102467864A priority Critical patent/CN102081493A/en
Publication of CN102081493A publication Critical patent/CN102081493A/en
Pending legal-status Critical Current

Abstract

The invention relates to a mobile electronic device and a control method of its 3D (three-dimensional) operation interface. The method comprises the following steps: the screen of the mobile electronic device first displays a first local area of the 3D operation interface at a first viewing angle. Then, if the mobile electronic device continuously detects a selection command for a specific object in the first local area while generating a 3D movement amount in 3D space, the screen is switched according to the 3D movement amount to display a second local area of the 3D operation interface at a second viewing angle. At the same time, the display position of the specific object in the 3D operation interface is changed according to the 3D movement amount, so that the specific object is displayed in the second local area.

Description

Mobile electronic device and control method of its three-dimensional operation interface
Technical field
The present invention relates to a method of operating an electronic device, and more particularly to a method of controlling a three-dimensional (3D) operation interface and a mobile electronic device using the same.
Background technology
So-called virtual reality (VR) refers to technologies such as computer graphics and image synthesis, by which a computer simulates an actual environment and constructs a virtual world. Generally, a user operates objects in the virtual reality through equipment such as a head-mounted display and three-dimensional (3D) sensor gloves. The virtual-reality picture is shown on the head-mounted display, while the 3D sensor gloves detect the motion of the user's hands, change the picture displayed on the head-mounted display accordingly, and let the user touch objects in the virtual reality. However, both the head-mounted display and the 3D sensor gloves require rather complicated technology and expensive manufacturing cost. Ordinary people therefore cannot easily enjoy the convenience brought by virtual reality in daily life.
With the progress of technology, more and more electronic devices provide the user with an operating experience similar to virtual reality through a 3D operation interface. For example, the 3D desktop program of a personal computer presents the desktop background and common icons such as application shortcuts, files, and folders on the screen in stereoscopic form. However, present 3D desktop programs merely present the elements on the desktop in stereoscopic form and are not real 3D virtual-reality designs. Moreover, even if a personal computer could support real 3D virtual reality, since a personal computer takes a mouse or keyboard as its input device, the user would face many difficulties in operating a 3D virtual reality with such two-dimensional (2D) input devices. In other words, real 3D control of a 3D virtual reality still depends on expensive and complicated virtual-reality equipment.
Summary of the invention
The invention provides a control method of a three-dimensional (3D) operation interface, which moves the display position of a specific object in the 3D operation interface according to the 3D movement amount generated by a mobile electronic device in 3D space.
The invention further provides a mobile electronic device that allows the user to operate it more intuitively, with a feeling similar to operating objects in the real world.
The present invention proposes a control method of a 3D operation interface, used in a mobile electronic device having a screen. The method first makes the screen display a first local area of the 3D operation interface of the mobile electronic device at a first viewing angle, wherein the first viewing angle corresponds to a current reference position, a current horizontal azimuth angle, and a current vertical azimuth angle. Then, if the mobile electronic device continuously detects a selection command for a specific object in the first local area while generating a 3D movement amount in 3D space, the screen is controlled according to the 3D movement amount to switch to displaying a second local area of the 3D operation interface at a second viewing angle; at the same time, the display position of the specific object in the 3D operation interface is changed according to the 3D movement amount, so that the specific object is displayed in the second local area.
From another viewpoint, the present invention proposes a mobile electronic device comprising a screen, a selection detection module, a 3D movement detection module, and a processing module. The selection detection module detects a selection command for a specific object in the 3D operation interface of the mobile electronic device. The 3D movement detection module detects the 3D movement amount generated by the mobile electronic device in 3D space. The processing module is connected to the screen, the selection detection module, and the 3D movement detection module respectively. The processing module controls the screen to display a first local area of the 3D operation interface at a first viewing angle, the first viewing angle corresponding to a current reference position, a current horizontal azimuth angle, and a current vertical azimuth angle. If the 3D movement detection module detects the 3D movement amount of the mobile electronic device while the selection detection module continuously detects the selection command for the specific object in the first local area, the processing module controls the screen according to the 3D movement amount to switch to displaying a second local area of the 3D operation interface at a second viewing angle, and simultaneously changes the display position of the specific object in the 3D operation interface according to the 3D movement amount, so that the specific object is displayed in the second local area.
Based on the above, the present invention simulates the way a user operates objects in a real environment: according to the 3D movement amount generated when the user operates the mobile electronic device in 3D space, the viewing angle of the 3D operation interface and the display position of the specific object therein are changed correspondingly. The user can thereby experience on a mobile electronic device the convenience brought by virtual-reality operation, and the complexity of operating the mobile electronic device is significantly reduced.
In order to make the aforementioned features and advantages of the present invention more comprehensible, embodiments are described in detail below with reference to the accompanying drawings.
Description of drawings
Fig. 1 is a block diagram of a mobile electronic device according to an embodiment of the invention.
Fig. 2 is a flowchart of a control method of a 3D operation interface according to an embodiment of the invention.
Fig. 3 is a flowchart of a control method of a 3D operation interface according to another embodiment of the invention.
Fig. 4 is a flowchart of a control method of a 3D operation interface according to another embodiment of the invention.
[Description of main element symbols]
100: mobile electronic device
110: screen
120: selection detection module
130: 3D movement detection module
140: processing module
210~270: steps of the control method of the 3D operation interface according to an embodiment of the invention
310~360: steps of the control method of the 3D operation interface according to another embodiment of the invention
410~440: steps of the control method of the 3D operation interface according to another embodiment of the invention
Embodiment
Fig. 1 is a block diagram of a mobile electronic device according to an embodiment of the invention. Referring to Fig. 1, the mobile electronic device 100 comprises a screen 110, a selection detection module 120, a three-dimensional (3D) movement detection module 130, and a processing module 140. In the present embodiment, the mobile electronic device 100 is, for example, a mobile phone, a personal digital assistant (PDA), a PDA phone, or a smartphone, without limiting the scope thereof.
The screen 110 may be a resistive or capacitive touch screen, used to display the various operation and usage pictures of the mobile electronic device 100. In the present embodiment, the mobile electronic device 100 has a 3D operation interface comprising a plurality of default objects, the 3D operation interface being, for example, a 3D desktop. Each default object in the 3D operation interface has a three-dimensional appearance and respectively represents an application program, a file, or a folder of the mobile electronic device 100. The screen 110 can display this 3D operation interface, allowing the user to use the mobile electronic device 100 through it.
The selection detection module 120 detects a selection command corresponding to a specific object in the 3D operation interface, the specific object being, for example, any one of the default objects. Furthermore, when the screen 110 is a touch screen, the selection detection module 120 can detect the selection command generated when the user touches the specific object on the touch screen with a finger or an input tool such as a stylus.
When the user holds the mobile electronic device 100 and walks around, or shakes, rotates, or flicks it, the mobile electronic device 100 generates a corresponding 3D movement amount (comprising a 3D displacement change, a 3D angle change, and so on). The 3D movement detection module 130 detects the 3D movement amount generated in 3D space when the mobile electronic device 100 is operated by the user. In the present embodiment, the 3D movement detection module 130 comprises an acceleration sensor and an electronic compass. The acceleration sensor may be a gravity acceleration sensor (g-sensor), which senses the acceleration change in order to derive the 3D displacement change of the mobile electronic device 100. The electronic compass accurately obtains the 3D angle change of the mobile electronic device 100.
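The derivation of the 3D displacement change from acceleration samples, and of a heading change from compass readings, can be sketched as follows. This is a minimal illustration, not the patented implementation: it assumes gravity-compensated accelerometer samples at a fixed interval and compass headings in degrees, and all function names are illustrative.

```python
def integrate_displacement(accel_samples, dt):
    """Double-integrate acceleration samples (m/s^2) into a 3D displacement.

    accel_samples: sequence of (ax, ay, az) g-sensor readings with gravity
    already removed; dt: sampling interval in seconds.
    """
    velocity = [0.0, 0.0, 0.0]
    displacement = [0.0, 0.0, 0.0]
    for a in accel_samples:
        for i in range(3):
            velocity[i] += a[i] * dt             # v += a * dt
            displacement[i] += velocity[i] * dt  # s += v * dt
    return displacement

def angle_change(prev_heading, new_heading):
    """Signed change between two e-compass headings, wrapped to (-180, 180]."""
    diff = (new_heading - prev_heading + 180.0) % 360.0 - 180.0
    return 180.0 if diff == -180.0 else diff
```

In practice a real device would also need drift compensation; plain double integration is only a sketch of the idea.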
The processing module 140 is connected to the screen 110, the selection detection module 120, and the 3D movement detection module 130 respectively. When the 3D movement detection module 130 detects the 3D movement amount of the mobile electronic device 100 while the selection detection module 120 continuously detects the selection command for a certain specific object in the first local area, the processing module 140 changes the displayed viewing angle of the 3D operation interface according to the 3D movement amount, and simultaneously moves the specific object to another position of the 3D operation interface according to the 3D movement amount.
Through the operation of the above members of the mobile electronic device 100, when the user holds the device and walks around, the screen 110, under the control of the processing module 140, displays local areas of the 3D operation interface at different viewing angles following the 3D movement amount of the device at that moment. In addition, the user can click any object in the 3D operation interface by pressing the screen 110; if the user keeps clicking the object while holding the mobile electronic device 100 and walking around, the processing module 140 moves the clicked object from one place of the 3D operation interface to another.
The detailed operation flow of the mobile electronic device 100 is further described below with another embodiment. Fig. 2 is a flowchart of the control method of the 3D operation interface according to an embodiment of the invention; please refer to Fig. 1 and Fig. 2 at the same time. In the present embodiment, the 3D operation interface of the mobile electronic device 100 is a 3D desktop, and each default object on the 3D operation interface respectively represents an application program, a file, or a folder of the mobile electronic device 100. The mobile electronic device 100 records a predefined origin position of the 3D operation interface (for example, the center of the 3D operation interface), an initial horizontal azimuth angle on the horizontal plane of the 3D operation interface (between 0 and 360 degrees), and an initial vertical azimuth angle on the vertical plane of the 3D operation interface (between 0 and 90 degrees). In another embodiment, the origin position, the initial horizontal azimuth angle, and the initial vertical azimuth angle may also be set by the user according to his or her own usage habits.
As shown in step 201, the processing module 140 controls the screen 110 to display a first local area of the 3D operation interface at a first viewing angle, the first viewing angle corresponding to a current reference position, a current horizontal azimuth angle, and a current vertical azimuth angle. In detail, the processing module 140 first judges whether the current reference position matches the origin position. If it does, the mobile electronic device 100 may have just been started by the user. At this time, the processing module 140 defines the visible range corresponding to the first viewing angle as the range centered at the origin position, within the initial horizontal azimuth angle plus or minus a first specific angle (for example, 25 degrees) on the horizontal plane, and within the initial vertical azimuth angle plus or minus a second specific angle (for example, 30 degrees) on the vertical plane. Next, the processing module 140 obtains the object position of each default object in the 3D operation interface, and calculates the vector angle formed by each object position and the origin position. Finally, all default objects whose vector angles fall within the visible range are displayed on the screen 110.
However, if the current reference position does not match the origin position, the user has already held the mobile electronic device 100 and walked around after start-up, or has shaken or flicked it. In this case, the processing module 140 defines the visible range corresponding to the first viewing angle as the range centered at the current reference position, within the current horizontal azimuth angle plus or minus the first specific angle on the horizontal plane, and within the current vertical azimuth angle plus or minus the second specific angle on the vertical plane. After obtaining the object position of each default object, the processing module 140 calculates the vector angle formed by each object position and the current reference position, and displays on the screen 110 all default objects whose vector angles fall within the above visible range.
It should be noted that when the current reference position corresponding to the first viewing angle differs, the picture displayed on the screen 110 also differs. In addition, the processing module 140 may use a hidden-surface removal algorithm (for example, the Z-buffer algorithm) to process all the default objects to be presented in the visible range, so as to present the stereoscopic effect of a close view hiding a distant view.
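The visible-range filtering described above can be sketched as follows: compute the vector angle of each default object relative to the reference position and keep the objects whose angles fall within the azimuth windows. The half-width constants mirror the example values in the text; the coordinate convention and function names are illustrative assumptions.

```python
import math

FIRST_SPECIFIC_ANGLE = 25.0   # horizontal half-width in degrees (example value)
SECOND_SPECIFIC_ANGLE = 30.0  # vertical half-width in degrees (example value)

def vector_angles(reference, obj_pos):
    """Horizontal azimuth (0..360) and elevation (deg) of obj_pos seen from reference."""
    dx = obj_pos[0] - reference[0]
    dy = obj_pos[1] - reference[1]
    dz = obj_pos[2] - reference[2]
    azimuth = math.degrees(math.atan2(dy, dx)) % 360.0
    elevation = math.degrees(math.atan2(dz, math.hypot(dx, dy)))
    return azimuth, elevation

def objects_in_view(reference, h_azimuth, v_azimuth, default_objects):
    """Names of default objects whose vector angle falls within
    h_azimuth +/- FIRST_SPECIFIC_ANGLE and v_azimuth +/- SECOND_SPECIFIC_ANGLE."""
    shown = []
    for name, pos in default_objects.items():
        az, el = vector_angles(reference, pos)
        h_diff = abs((az - h_azimuth + 180.0) % 360.0 - 180.0)  # wrap-around safe
        if h_diff <= FIRST_SPECIFIC_ANGLE and abs(el - v_azimuth) <= SECOND_SPECIFIC_ANGLE:
            shown.append(name)
    return shown
```

The objects returned here would then be drawn with a hidden-surface pass (e.g. Z-buffering) as noted above.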
In the present embodiment, the screen 110 is assumed to be a touch screen. In order to judge whether a selection command for a certain specific object in the first local area is continuously detected while the mobile electronic device 100 generates a 3D movement amount in 3D space, as shown in step 205, the screen 110 detects a touch operation acting on it (for example, the user touching the screen 110 with a finger or a stylus) while the mobile electronic device 100 generates a 3D movement amount in 3D space. The touch operation occurs at a first two-dimensional (2D) coordinate of the screen 110, and the 3D movement amount generated by the mobile electronic device 100 in 3D space is detected by the 3D movement detection module 130.
Then, in step 210, it is judged whether this touch operation can serve as a selection command for a specific object in the first local area; that is, whether the user clicks an object in the first local area of the 3D operation interface with a finger or a stylus. Furthermore, the processing module 140 can convert the 2D coordinate of the touch operation into a corresponding position in the 3D operation interface. After obtaining the object positions of all default objects in the 3D operation interface, it can compare whether the corresponding position matches any object position. If it matches none of the object positions, then as shown in step 215, the processing module 140 merely controls the screen 110 according to the 3D movement amount to switch to displaying a second local area of the 3D operation interface at a second viewing angle. The detailed steps of controlling the screen 110 to display the second local area are described later. Next, the flow returns to step 205 to wait for another touch operation.
If the corresponding position of the touch operation matches the object position of a certain default object, the user has pressed this object with a finger (or stylus). Therefore, as shown in step 220, the processing module 140 records the time at which the user clicked the object as a first reference time, and regards the default object whose object position matches the corresponding position as the specific object selected by the user. The selection detection module 120 then takes the touch operation as the selection command for the specific object.
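The coordinate conversion and comparison in steps 210 and 220 amount to a hit test. A minimal sketch follows; the `unproject` function standing in for the 2D-to-3D conversion, the tolerance, and the names are all assumptions, since the text does not specify the projection used.

```python
def hit_test(touch_2d, unproject, object_positions, tolerance=1.0):
    """Map the 2D coordinate of a touch operation into the 3D operation
    interface and return the name of the default object it matches, if any.

    unproject: caller-supplied function from screen (x, y) to a 3D interface
    position; its exact form depends on the projection in use.
    """
    pos = unproject(touch_2d)
    for name, obj_pos in object_positions.items():
        if all(abs(pos[i] - obj_pos[i]) <= tolerance for i in range(3)):
            return name  # this object becomes the selected specific object
    return None          # no match: only the viewing angle is updated (step 215)
```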
Next, in step 225, it is judged whether the selection command continues to exist. If it does, the selection detection module 120 continuously detects the selection command while the 3D movement detection module 130 keeps detecting the 3D movement amount of the mobile electronic device 100; that is, the user holds the mobile electronic device 100 and walks around while keeping clicking the specific object. Therefore, the processing module 140 controls the screen 110 accordingly to display the 3D operation interface at different viewing angles, and the display position of the specific object in the 3D operation interface also changes.
In step 230, the processing module 140 calculates the display position of the specific object in the 3D operation interface according to the 3D movement amount. And as shown in step 235, the processing module 140 controls the screen 110 according to the 3D movement amount to switch to displaying the second local area of the 3D operation interface at the second viewing angle, and simultaneously makes the specific object displayed in the second local area.
In the present embodiment, the processing module 140 calculates the new reference position corresponding to the second viewing angle according to the current reference position corresponding to the first viewing angle and the current 3D displacement change (for example, the new reference position is the sum of the current reference position and the 3D displacement change), takes the horizontal-plane component of the 3D angle change as the new horizontal azimuth angle corresponding to the second viewing angle, and takes the vertical-plane component of the 3D angle change as the new vertical azimuth angle corresponding to the second viewing angle. Then, the processing module 140 defines the visible range corresponding to the second viewing angle as the range centered at the new reference position, within the new horizontal azimuth angle plus or minus the first specific angle on the horizontal plane, and within the new vertical azimuth angle plus or minus the second specific angle on the vertical plane. After obtaining the object position of each default object in the 3D operation interface and calculating the vector angle formed by each object position and the new reference position, the processing module 140 displays on the screen 110 all default objects whose vector angles fall within the visible range of the second viewing angle. At this time, the processing module 140 may also use the hidden-surface removal algorithm to process all the default objects to be presented in the visible range, so as to present the effect of a close view hiding a distant view.
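The update of the second viewing angle described above can be sketched as follows. The summation for the new reference position follows the embodiment; the dictionary key names are illustrative, and the 0-360 / 0-90 degree ranges follow the values stated earlier in the text.

```python
def second_viewing_angle(current_reference, movement):
    """Derive the second viewing angle from a 3D movement amount:
    new reference = current reference + 3D displacement change; the
    horizontal and vertical components of the 3D angle change become
    the new horizontal and vertical azimuth angles.
    """
    disp = movement["displacement"]  # 3D displacement change
    new_reference = tuple(c + d for c, d in zip(current_reference, disp))
    new_h_azimuth = movement["angle_horizontal"] % 360.0  # keep within 0..360
    new_v_azimuth = movement["angle_vertical"]            # 0..90 per the text
    return new_reference, new_h_azimuth, new_v_azimuth
```

The returned triple would then feed the same visible-range filtering used for the first viewing angle.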
While changing the displayed viewing angle of the 3D operation interface in response to the user keeping clicking the object and holding the mobile electronic device 100 while moving in 3D space, the processing module 140 also changes the display position of the specific object accordingly. In the present embodiment, the processing module 140 takes the new reference position as the present display position of the specific object, obtains the 3D model data of the specific object, and displays the specific object at the display position according to the 3D model data, thereby making the specific object displayed in the second local area.
Returning to step 225 of Fig. 2, if it is judged in step 225 that the selection command no longer exists (the selection command disappears because the touch operation disappears), the user may have released the finger (or stylus) and stopped clicking the specific object. Therefore, as shown in step 240, the processing module 140 obtains the second 2D coordinate on the screen 110 corresponding to the touch operation before it disappeared, and records the time at which the touch operation disappeared as a second reference time.
Next, in step 245, the processing module 140 judges whether the difference between the first reference time and the second reference time is less than a first time preset value (for example, 0.5 second). If it is, the user has performed a tap action on the selected specific object; therefore, as shown in step 250, the processing module 140 executes the function corresponding to the specific object. For instance, if the specific object corresponds to an application program of the mobile electronic device 100, the processing module 140 executes the application program. If the specific object corresponds to a file in the mobile electronic device 100, the processing module 140 opens the file and presents its content to the user through the screen 110. If the specific object corresponds to a folder, the processing module 140 opens the folder so that the user can view the files in it. In other words, when the user holds the mobile electronic device 100 and walks around, the processing module 140 controls the screen 110 to display the 3D operation interface at different viewing angles according to the 3D movement amount; and when the local area displayed on the screen 110 includes a specific object the user wishes to execute or open, the user only needs to quickly click and release the specific object within the first time preset value to execute its corresponding function. After the function is executed, the flow returns to step 215, where the processing module 140 controls the screen 110 according to the 3D movement amount to switch to displaying the second local area of the 3D operation interface at the second viewing angle, and then returns to step 205 to wait for another touch operation.
If it is judged in step 245 that the difference between the first reference time and the second reference time is greater than or equal to the first time preset value, then as shown in step 255, it is judged whether the present display position of the specific object matches a specific location of the 3D operation interface. In the present embodiment, the specific location of the 3D operation interface represents, for example, a virtual trash bin or a folder. Therefore, if the present display position of the specific object matches the specific location, the processing module 140 deletes the specific object selected by the user, or moves the specific object into the folder in the 3D operation interface. The flow then likewise returns to step 215, where the processing module 140 controls the screen 110 according to the 3D movement amount to switch to displaying the second local area of the 3D operation interface at the second viewing angle, and then returns to step 205 to wait for another touch operation.
If it is judged in step 255 that the present display position of the specific object does not match the specific location, then as shown in step 265, it is judged whether the distance between the first 2D coordinate and the second 2D coordinate is less than a distance preset value (for example, 10 points), and whether the difference between the first reference time and the second reference time is greater than a second time preset value (for example, 1 second). If so, the user has held the specific object while holding the mobile electronic device 100 and walking around, and then released it. In this case, the processing module 140 fixes the specific object at its present display position. Furthermore, as shown in step 230, the processing module 140 calculates the display position of the specific object in the 3D operation interface according to the 3D movement amount (for example, taking the new reference position as the display position of the specific object). Then, as shown in step 235, the processing module 140 controls the screen 110 according to the 3D movement amount to switch to displaying the second local area of the 3D operation interface at the second viewing angle, and simultaneously makes the specific object displayed in the second local area.
Returning to step 265, if the condition is not met, that is, the distance between the first 2D coordinate and the second 2D coordinate is greater than or equal to the distance preset value, or the difference between the first reference time and the second reference time is less than or equal to the second time preset value, then in subsequent step 270 the processing module 140 calculates the parabolic distance moved from the first 2D coordinate to the second 2D coordinate, calculates a target position in the 3D operation interface according to the parabolic distance, and takes the target position as the display position of the specific object. Then, in step 235, the screen 110 is controlled according to the present 3D movement amount of the mobile electronic device 100 to switch to displaying the second local area of the 3D operation interface at the second viewing angle, and the specific object is simultaneously displayed in the second local area.
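The release-handling decision tree of steps 245 through 270 can be summarized in a short sketch. The threshold constants mirror the example values in the text (0.5 s, 1 s, 10 points); the function and label names are illustrative, and the parabolic move itself is elided.

```python
import math

FIRST_TIME_PRESET = 0.5   # seconds: shorter press counts as a tap (step 245)
SECOND_TIME_PRESET = 1.0  # seconds (step 265)
DISTANCE_PRESET = 10.0    # points between press and release coordinates (step 265)

def classify_release(p1, p2, t_press, t_release, at_specific_location):
    """Classify what happens when the selection command disappears.

    p1/p2: first and second 2D coordinates; t_press/t_release: first and
    second reference times; at_specific_location: whether the object sits
    on the trash bin / folder location. Mirrors steps 245-270 of Fig. 2.
    """
    dt = t_release - t_press
    if dt < FIRST_TIME_PRESET:
        return "execute"          # quick tap: run the object's function (step 250)
    if at_specific_location:
        return "delete_or_file"   # dropped on trash bin or folder (step 255)
    dist = math.hypot(p2[0] - p1[0], p2[1] - p1[1])
    if dist < DISTANCE_PRESET and dt > SECOND_TIME_PRESET:
        return "pin"              # held almost in place: fix at current position
    return "throw"                # otherwise: parabolic move to a target (step 270)
```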
In the present embodiment, after the display action of step 235 is completed, the flow returns to step 205 again to wait for the user's touch operation on the screen 110 while the mobile electronic device 100 generates a 3D movement amount. In other words, the mobile electronic device 100 repeatedly executes the flow of Fig. 2 after start-up. When the user keeps tapping a certain specific object on the screen 110 and holds the mobile electronic device 100 while walking around, not only does the screen 110 correspondingly display local areas of the 3D operation interface at different viewing angles, but the display position of the continuously clicked specific object in the 3D operation interface also changes accordingly. Once the user releases the specific object, the function corresponding to the object is executed depending on how long the object was continuously clicked, or the object is deleted or its position in the 3D operation interface is changed according to its display position at that moment. In this way, the user no longer needs expensive and complicated virtual-reality equipment such as a head-mounted display or 3D sensor gloves, and can simply experience the operating effect of virtual reality through the mobile electronic device 100.
In the control method of the 3D operation interface shown in Fig. 2, the user is not limited to a fixed operating position. In other words, the user can hold the mobile electronic device 100 and walk around to change the displayed viewing angle and the depth of field of the 3D operation interface. The following embodiment further provides a way to change the depth of field in an instant, making the user feel more convenient when controlling the 3D operation interface.
Fig. 3 is a flowchart of the control method of the 3D operation interface according to another embodiment of the invention. Please refer to Fig. 1 and Fig. 3 at the same time. First, as shown in step 310, the processing module 140 controls the screen 110 to display the first local area of the 3D operation interface at the first viewing angle. Since this step is the same as or similar to the previous embodiment, it is not repeated here.
Then, in step 320, it is judged whether the selection detection module 120 detects a selection command for a certain specific object in the first local area. If no selection command is detected, the user has not clicked any object yet; therefore, as shown in step 330, when the 3D displacement change detected by the 3D movement detection module 130 exceeds a preset value within a specific time, the processing module 140 changes the depth of field currently corresponding to the first viewing angle according to the 3D displacement change, and then displays a sub-area of the first local area. That is to say, as long as the user applies a specific operation to the mobile electronic device 100 (for example, quickly shaking or flicking it) so that it generates a large acceleration change in a moment, the depth of field of the picture displayed on the screen 110 changes accordingly.
If the determination result of step 320 shows that the selection detection module 120 detects a selection command for a specific object, then, as shown in step 340, it is determined whether the selection command persists. In the present embodiment, if the selection command disappears, then, as shown in step 350, the processing module 140 fixes the specific object at its current display position. In other embodiments of the invention, when the selection command disappears, the processing module 140 may also decide whether to execute the function corresponding to the specific object according to the length of time between pressing and releasing the object, or decide whether to delete the specific object according to its display position at the moment of release. If the selection command persists, then, as shown in step 360, when the 3D displacement variation exceeds the preset value within the specific time, the processing module 140 changes the depth of field corresponding to the first viewing angle according to the 3D displacement variation, so that the screen 110 displays a sub-region of the first local area, and simultaneously changes the display position of the specific object according to the 3D movement amount, so that the specific object moves into and is displayed in the sub-region. That is to say, as long as the selection detection module 120 keeps detecting the selection command while the 3D displacement variation detected by the 3D movement detection module 130 exceeds the preset value within the specific time (indicating that the user is pressing the specific object while applying a specific operation, such as quickly shaking or flicking the mobile electronic device 100), both the depth of field of the picture displayed on the screen 110 and the display position of the pressed object change instantaneously.
In the following embodiment, a first button (not illustrated) of the mobile electronic device 100 is defined in advance to correspond to a preset 3D movement amount, which comprises a preset 3D displacement variation, a preset horizontal azimuth angle, and a preset vertical azimuth angle. When the user presses this button, the effect is equivalent to the mobile electronic device 100 being operated by the user so as to produce the preset 3D movement amount. Accordingly, the viewing angle of the 3D operation interface and the display positions of objects on the 3D operation interface change as well.
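The button shortcut can be sketched as routing a fixed, preset 3D movement amount into whatever handler normally processes detected movement; the preset values and names below are arbitrary assumptions:

```python
# Assumed preset: (displacement variation, horizontal azimuth, vertical azimuth).
PRESET_MOVE = ((0.0, 0.0, 5.0), 45.0, 10.0)

def on_first_button(apply_movement):
    """Pressing the first button behaves as if the device had produced
    the preset 3D movement amount."""
    displacement, h_azimuth, v_azimuth = PRESET_MOVE
    return apply_movement(displacement, h_azimuth, v_azimuth)
```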
Fig. 4 is a flowchart of a control method of a 3D operation interface according to yet another embodiment of the present invention. Referring to Fig. 1 and Fig. 4 together, first, as shown in step 410, the processing module 140 controls the screen 110 to display a first local area of the 3D operation interface with a first viewing angle. Since the detailed steps of this operation are the same as or similar to those of the previous embodiments, they are not repeated here.
Then, in step 420, it is determined whether the selection detection module 120 detects a selection command for a specific object in the first local area. If the selection detection module 120 does not detect any selection command, the user has not selected any object. Therefore, as shown in step 430, when the first button is pressed, the processing module 140 controls the screen 110, according to the preset 3D movement amount corresponding to the first button, to switch to displaying a third local area of the 3D operation interface with a third viewing angle.
If the selection detection module 120 detects a selection command for a specific object, then, as shown in step 440, while the selection detection module 120 keeps detecting the selection command and the first button is pressed, the processing module 140 controls the screen 110, according to the preset 3D movement amount, to switch to displaying the third local area of the 3D operation interface with the third viewing angle, and simultaneously changes the display position of the specific object according to the preset 3D movement amount, so that the specific object is displayed in the third local area. In the present embodiment, once the selection command disappears, the processing module 140 fixes the specific object at its current display position. In another embodiment, when the selection command disappears, the processing module 140 may instead decide whether to execute the function corresponding to the specific object according to the length of time between pressing and releasing the object, or decide whether to delete the specific object according to its display position at the moment of release.
In the present embodiment, as long as the user presses the first button corresponding to the preset 3D movement amount, the viewing angle and the display positions of objects can be changed quickly. Since the previous embodiments have already described how the processing module 140 controls the screen 110 to display different local areas of the 3D operation interface with different viewing angles according to the 3D movement amount of the mobile electronic device 100, and how the display position of a specific object is changed according to the 3D movement amount so that the object is moved from one place in the 3D operation interface to another, these details are not repeated here.
In an embodiment of the invention, the mobile electronic device 100 has a second button (hereinafter referred to as the restore button), which is defined to correspond to a preset restore position (for example, an origin position), a preset restore horizontal azimuth angle, and a preset restore vertical azimuth angle in the 3D operation interface. As long as the user presses the restore button, the processing module 140 controls the screen 110 to display a picture of the 3D operation interface centered on the preset restore position. More specifically, the processing module 140 controls the screen 110, according to the preset restore position, the preset restore horizontal azimuth angle, and the preset restore vertical azimuth angle, to display a preset local area of the 3D operation interface (that is, the picture centered on the preset restore position) with a fourth viewing angle. Accordingly, even if the user temporarily loses orientation after frequently manipulating the 3D operation interface, the user can quickly return to the preset restore position of the 3D operation interface by pressing the restore button. Since the steps of displaying the preset local area are the same as or similar to those of the previous embodiments, they are not repeated here.
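A minimal sketch of the restore button, assuming a simple view-state record; the class, field names, and preset values are illustrative, not from the patent:

```python
class ViewState:
    """Current reference position and azimuth angles of the displayed view."""
    def __init__(self, position, h_azimuth, v_azimuth):
        self.position = position
        self.h_azimuth = h_azimuth
        self.v_azimuth = v_azimuth

# Assumed preset restore position (the origin) and restore azimuth angles.
PRESET_HOME = ((0.0, 0.0, 0.0), 0.0, 0.0)

def on_restore_button(view):
    """Jump straight back to the preset restore position of the interface."""
    view.position, view.h_azimuth, view.v_azimuth = PRESET_HOME
    return view
```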
The mobile electronic device and the control method of its 3D operation interface described in the foregoing embodiments change the viewing angle and the depth of field of the 3D operation interface according to the 3D movement amount that the mobile electronic device produces in 3D space, while allowing the user to press individual objects in the 3D operation interface, move their display positions, or start their functions. The manner provided by the foregoing embodiments lets the user experience the feeling of being inside the 3D operation interface and physically manipulating its items. Even a user unfamiliar with operating electronic devices does not need to spend extra time learning how to control the 3D operation interface with an input device, which makes the mobile electronic device more convenient and intuitive to use.
Although the present invention has been disclosed above by way of embodiments, they are not intended to limit the invention. Those skilled in the art may make various changes and modifications without departing from the spirit and scope of the invention; accordingly, the scope of protection of the invention shall be defined by the appended claims.

Claims (20)

1. A control method of a three-dimensional operation interface, adapted to a mobile electronic device having a screen, the method comprising:
displaying, on the screen, a first local area of a three-dimensional operation interface of the mobile electronic device with a first viewing angle, wherein the first viewing angle corresponds to a current reference position, a current horizontal azimuth angle, and a current vertical azimuth angle; and
if the mobile electronic device keeps detecting a selection command for a specific object in the first local area while producing a three-dimensional movement amount in a three-dimensional space, controlling the screen according to the three-dimensional movement amount to switch to displaying a second local area of the three-dimensional operation interface with a second viewing angle, and simultaneously changing a display position of the specific object in the three-dimensional operation interface according to the three-dimensional movement amount, so that the specific object is displayed in the second local area.
2. The control method of the three-dimensional operation interface as claimed in claim 1, wherein the three-dimensional operation interface comprises a plurality of preset objects, and before the step of displaying the first local area on the screen with the first viewing angle, the method further comprises:
defining an origin position of the three-dimensional operation interface;
defining an initial horizontal azimuth angle on a horizontal plane of the three-dimensional operation interface; and
defining an initial vertical azimuth angle on a vertical plane of the three-dimensional operation interface;
wherein the step of displaying the first local area on the screen with the first viewing angle comprises:
determining whether the current reference position matches the origin position;
when the current reference position matches the origin position, defining a range centered on the origin position, spanning the initial horizontal azimuth angle plus or minus a first specific angle on the horizontal plane and the initial vertical azimuth angle plus or minus a second specific angle on the vertical plane, as a visual range corresponding to the first viewing angle;
obtaining an object position of each of the preset objects;
calculating a vector angle formed between the object position of each of the preset objects and the origin position; and
displaying all the preset objects whose corresponding vector angles fall within the visual range.
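For illustration only, the visibility test recited in claim 2 can be sketched as follows: an object is shown when the vector from the reference position to its object position lies within the first specific angle of the horizontal azimuth and within the second specific angle of the vertical azimuth. All identifiers are assumptions, and the angle comparison ignores 360-degree wrap-around for brevity:

```python
import math

def vector_angles(ref, obj):
    """Horizontal and vertical angles (degrees) of obj as seen from ref."""
    dx, dy, dz = (o - r for o, r in zip(obj, ref))
    horizontal = math.degrees(math.atan2(dy, dx))               # on the horizontal plane
    vertical = math.degrees(math.atan2(dz, math.hypot(dx, dy))) # on the vertical plane
    return horizontal, vertical

def visible_objects(objects, ref, h_azimuth, v_azimuth, first_angle, second_angle):
    """Return the objects whose vector angles fall within the visual range."""
    shown = []
    for obj in objects:
        h, v = vector_angles(ref, obj)
        if abs(h - h_azimuth) <= first_angle and abs(v - v_azimuth) <= second_angle:
            shown.append(obj)
    return shown
```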
3. The control method of the three-dimensional operation interface as claimed in claim 2, wherein after the step of determining whether the current reference position matches the origin position, the method further comprises:
when the current reference position does not match the origin position, defining a range centered on the current reference position, spanning the current horizontal azimuth angle plus or minus the first specific angle on the horizontal plane and the current vertical azimuth angle plus or minus the second specific angle on the vertical plane, as the visual range corresponding to the first viewing angle;
obtaining the object position of each of the preset objects;
calculating the vector angle formed between the object position of each of the preset objects and the current reference position; and
displaying all the preset objects whose corresponding vector angles fall within the visual range.
4. The control method of the three-dimensional operation interface as claimed in claim 2, wherein the screen is a touch screen, and before the step of detecting the selection command for the specific object, the method further comprises:
detecting a touch action applied to the screen, wherein the touch action corresponds to a first two-dimensional coordinate on the screen;
converting the first two-dimensional coordinate into a corresponding position in the three-dimensional operation interface;
determining whether the corresponding position matches the object position of one of the preset objects; and
if so, taking the preset object whose object position matches the corresponding position as the specific object, and taking the touch action as the selection command for the specific object.
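The hit test of claim 4 can be sketched as follows; the coordinate-conversion function is a caller-supplied stand-in, since the patent does not fix a particular 2D-to-3D projection:

```python
def hit_test(touch_2d, to_3d, objects):
    """Return the preset object whose position matches the converted touch.

    touch_2d -- first two-dimensional coordinate of the touch action
    to_3d    -- converts a 2D screen coordinate to an interface position
    objects  -- iterable of (object_position, object) pairs
    """
    position = to_3d(touch_2d)
    for object_position, obj in objects:
        if object_position == position:
            return obj   # this preset object becomes the specific object
    return None          # no object position matched: nothing selected
```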
5. The control method of the three-dimensional operation interface as claimed in claim 2, wherein the three-dimensional movement amount comprises a three-dimensional displacement variation and a three-dimensional angle variation, and the three-dimensional angle variation comprises a horizontal-plane component and a vertical-plane component; and
the step of controlling the screen according to the three-dimensional movement amount to switch to displaying the second local area with the second viewing angle further comprises:
calculating a new reference position corresponding to the second viewing angle according to the current reference position and the three-dimensional displacement variation;
taking the horizontal-plane component of the three-dimensional angle variation as a new horizontal azimuth angle corresponding to the second viewing angle;
taking the vertical-plane component of the three-dimensional angle variation as a new vertical azimuth angle corresponding to the second viewing angle;
defining a range centered on the new reference position, spanning the new horizontal azimuth angle plus or minus the first specific angle on the horizontal plane and the new vertical azimuth angle plus or minus the second specific angle on the vertical plane, as the visual range corresponding to the second viewing angle;
obtaining the object position of each of the preset objects;
calculating the vector angle formed between the object position of each of the preset objects and the new reference position; and
displaying all the preset objects whose corresponding vector angles fall within the visual range.
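The view update recited in claim 5 reduces to adding the displacement variation to the reference position, with the two components of the angle variation becoming the new azimuth angles; a sketch with assumed names:

```python
def second_view(cur_pos, displacement, h_component, v_component):
    """Return (new reference position, new horizontal azimuth, new vertical azimuth).

    The new position is the current reference position plus the 3D
    displacement variation; per the claim, the horizontal-plane and
    vertical-plane components of the angle variation are taken directly
    as the new azimuth angles.
    """
    new_pos = tuple(p + d for p, d in zip(cur_pos, displacement))
    return new_pos, h_component, v_component
```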
6. The control method of the three-dimensional operation interface as claimed in claim 4, wherein after the step of determining whether the corresponding position matches the object position of one of the preset objects, the method further comprises:
if so, recording the time at that moment as a first reference time; and
after the step of detecting the selection command for the specific object in the first local area, the method further comprises:
obtaining a second two-dimensional coordinate on the screen corresponding to the touch action before the touch action disappears;
recording the time at that moment as a second reference time; and
if a difference between the first reference time and the second reference time is less than a first time preset value, executing a function corresponding to the specific object.
7. The control method of the three-dimensional operation interface as claimed in claim 6, further comprising:
if the difference between the first reference time and the second reference time is greater than or equal to the first time preset value, determining whether the current display position of the specific object matches a specific location of the three-dimensional operation interface;
if so, deleting the specific object from the three-dimensional operation interface; and
if not, when a distance between the first two-dimensional coordinate and the second two-dimensional coordinate is less than a distance preset value and the difference between the first reference time and the second reference time is greater than a second time preset value, fixedly displaying the specific object at the current display position.
8. The control method of the three-dimensional operation interface as claimed in claim 7, further comprising:
when the distance between the first two-dimensional coordinate and the second two-dimensional coordinate is greater than or equal to the distance preset value and the difference between the first reference time and the second reference time is less than or equal to the second time preset value, calculating a parabolic distance of moving from the first two-dimensional coordinate to the second two-dimensional coordinate;
calculating a target location in the three-dimensional operation interface according to the parabolic distance; and
taking the target location as the display position of the specific object.
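Claim 8's throw gesture can be sketched as follows. The patent does not specify how the parabolic distance is derived from the two coordinates or how it maps to a target location, so the scaling factor and the direction-projection below are placeholder assumptions:

```python
import math

def parabola_distance(p1, p2, scale=2.0):
    """Assumed throw length derived from the two touch coordinates."""
    return scale * math.dist(p1, p2)

def target_position(start, direction, p1, p2):
    """Project the throw distance from the object's start position along
    a unit direction vector in the interface to get the target location."""
    d = parabola_distance(p1, p2)
    return tuple(s + d * u for s, u in zip(start, direction))
```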
9. The control method of the three-dimensional operation interface as claimed in claim 5, wherein after the step of displaying the first local area on the screen with the first viewing angle, the method further comprises:
if the three-dimensional displacement variation of the mobile electronic device exceeds a preset value within a specific time, changing a depth of field corresponding to the first viewing angle according to the three-dimensional displacement variation, so as to display a sub-region of the first local area.
10. The control method of the three-dimensional operation interface as claimed in claim 5, wherein after the step of displaying the first local area on the screen with the first viewing angle, the method further comprises:
detecting the selection command for the specific object; and
if the three-dimensional displacement variation exceeds a preset value within a specific time while the selection command persists, changing a depth of field corresponding to the first viewing angle according to the three-dimensional displacement variation so as to display a sub-region of the first local area, and simultaneously changing the display position of the specific object according to the three-dimensional movement amount, so that the specific object is displayed in the sub-region.
11. A mobile electronic device, comprising:
a screen, displaying a first local area of a three-dimensional operation interface with a first viewing angle, wherein the first viewing angle corresponds to a current reference position, a current horizontal azimuth angle, and a current vertical azimuth angle;
a selection detection module, for detecting a selection command for a specific object in the three-dimensional operation interface of the mobile electronic device;
a three-dimensional movement detection module, for detecting a three-dimensional movement amount that the mobile electronic device produces in a three-dimensional space; and
a processing module, coupled to the screen, the selection detection module, and the three-dimensional movement detection module, wherein
the processing module controls the screen to display the first local area of the three-dimensional operation interface with the first viewing angle, the first viewing angle corresponding to the current reference position, the current horizontal azimuth angle, and the current vertical azimuth angle; and
if the three-dimensional movement detection module detects the three-dimensional movement amount of the mobile electronic device while the selection detection module keeps detecting the selection command for the specific object in the first local area, the processing module controls the screen according to the three-dimensional movement amount to switch to displaying a second local area of the three-dimensional operation interface with a second viewing angle, and simultaneously changes a display position of the specific object in the three-dimensional operation interface according to the three-dimensional movement amount, so that the specific object is displayed in the second local area.
12. The mobile electronic device as claimed in claim 11, wherein the three-dimensional operation interface comprises a plurality of preset objects, and the processing module obtains a predefined origin position of the three-dimensional operation interface, an initial horizontal azimuth angle on a horizontal plane of the three-dimensional operation interface, and an initial vertical azimuth angle on a vertical plane of the three-dimensional operation interface; wherein the processing module determines whether the current reference position matches the origin position; when the current reference position matches the origin position, the processing module defines a range centered on the origin position, spanning the initial horizontal azimuth angle plus or minus a first specific angle on the horizontal plane and the initial vertical azimuth angle plus or minus a second specific angle on the vertical plane, as a visual range corresponding to the first viewing angle, obtains an object position of each of the preset objects, calculates a vector angle formed between the object position of each of the preset objects and the origin position, and displays on the screen all the preset objects whose corresponding vector angles fall within the visual range.
13. The mobile electronic device as claimed in claim 12, wherein when the current reference position does not match the origin position, the processing module defines a range centered on the current reference position, spanning the current horizontal azimuth angle plus or minus the first specific angle on the horizontal plane and the current vertical azimuth angle plus or minus the second specific angle on the vertical plane, as the visual range corresponding to the first viewing angle, obtains the object position of each of the preset objects, calculates the vector angle formed between the object position of each of the preset objects and the current reference position, and displays on the screen all the preset objects whose corresponding vector angles fall within the visual range.
14. The mobile electronic device as claimed in claim 12, wherein the screen is a touch screen, and when the screen detects a touch action applied to a first two-dimensional coordinate on the screen, the processing module converts the first two-dimensional coordinate into a corresponding position in the three-dimensional operation interface and determines whether the corresponding position matches the object position of one of the preset objects; if so, the processing module takes the preset object whose object position matches the corresponding position as the specific object, and the selection detection module takes the touch action as the selection command for the specific object.
15. The mobile electronic device as claimed in claim 12, wherein the three-dimensional movement amount comprises a three-dimensional displacement variation and a three-dimensional angle variation, and the three-dimensional angle variation comprises a horizontal-plane component and a vertical-plane component; wherein
the processing module calculates a new reference position corresponding to the second viewing angle according to the current reference position and the three-dimensional displacement variation, takes the horizontal-plane component of the three-dimensional angle variation as a new horizontal azimuth angle corresponding to the second viewing angle, takes the vertical-plane component of the three-dimensional angle variation as a new vertical azimuth angle corresponding to the second viewing angle, defines a range centered on the new reference position, spanning the new horizontal azimuth angle plus or minus the first specific angle on the horizontal plane and the new vertical azimuth angle plus or minus the second specific angle on the vertical plane, as the visual range corresponding to the second viewing angle, obtains the object position of each of the preset objects, calculates the vector angle formed between the object position of each of the preset objects and the new reference position, and displays on the screen all the preset objects whose corresponding vector angles fall within the visual range.
16. The mobile electronic device as claimed in claim 14, wherein when the processing module determines that the corresponding position matches the object position of one of the preset objects, the processing module records the time at that moment as a first reference time; the processing module obtains a second two-dimensional coordinate on the screen corresponding to the touch action before the touch action disappears, records the time at that moment as a second reference time, and executes a function corresponding to the specific object when a difference between the first reference time and the second reference time is less than a first time preset value;
wherein the specific object corresponds to one of an application program, a file, and a folder of the mobile electronic device, and the processing module executes the application program when the specific object corresponds to the application program, opens the file when the specific object corresponds to the file, and opens the folder when the specific object corresponds to the folder.
17. The mobile electronic device as claimed in claim 16, wherein when the difference between the first reference time and the second reference time is greater than or equal to the first time preset value, the processing module determines whether the current display position of the specific object matches a specific location of the three-dimensional operation interface; if so, the processing module deletes the specific object from the three-dimensional operation interface; if not, when a distance between the first two-dimensional coordinate and the second two-dimensional coordinate is less than a distance preset value and the difference between the first reference time and the second reference time is greater than a second time preset value, the processing module fixedly displays the specific object at the current display position.
18. The mobile electronic device as claimed in claim 17, wherein when the distance between the first two-dimensional coordinate and the second two-dimensional coordinate is greater than or equal to the distance preset value and the difference between the first reference time and the second reference time is less than or equal to the second time preset value, the processing module calculates a parabolic distance of moving from the first two-dimensional coordinate to the second two-dimensional coordinate, calculates a target location in the three-dimensional operation interface according to the parabolic distance, and takes the target location as the display position of the specific object.
19. The mobile electronic device as claimed in claim 15, wherein when the screen displays the first local area with the first viewing angle, if the three-dimensional displacement variation of the mobile electronic device exceeds a preset value within a specific time, the processing module changes a depth of field corresponding to the first viewing angle according to the three-dimensional displacement variation, so as to display a sub-region of the first local area.
20. The mobile electronic device as claimed in claim 15, wherein when the screen displays the first local area with the first viewing angle, if the selection detection module keeps detecting the selection command and the three-dimensional displacement variation exceeds a preset value within a specific time, the processing module changes a depth of field corresponding to the first viewing angle according to the three-dimensional displacement variation so as to display a sub-region of the first local area, and simultaneously changes the display position of the specific object according to the three-dimensional movement amount, so that the specific object is displayed in the sub-region.
CN2009102467864A 2009-12-01 2009-12-01 Mobile electronic device and control method of 3D (three-dimensional) operation interface thereof Pending CN102081493A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2009102467864A CN102081493A (en) 2009-12-01 2009-12-01 Mobile electronic device and control method of 3D (three-dimensional) operation interface thereof


Publications (1)

Publication Number Publication Date
CN102081493A true CN102081493A (en) 2011-06-01


Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103164121A (en) * 2011-12-19 2013-06-19 腾讯科技(深圳)有限公司 Method and device for terminal software interface display
WO2013182142A1 (en) * 2012-12-18 2013-12-12 中兴通讯股份有限公司 Terminal device desktop implementation method, system and terminal device
WO2014108024A1 (en) * 2013-01-14 2014-07-17 华为终端有限公司 Method for interface object movement and apparatus for supporting interface object movement
CN104142685A (en) * 2014-08-21 2014-11-12 深圳市佳顺伟业科技有限公司 AGV trackless guide method and system based on optical positioning
CN104914980A (en) * 2014-03-10 2015-09-16 联想(北京)有限公司 Information processing method and device
CN105094294A (en) * 2014-05-12 2015-11-25 联想(北京)有限公司 Method and apparatus for operating naked-eye 3-dimensional graphics display device
CN105183288A (en) * 2015-08-31 2015-12-23 惠州Tcl移动通信有限公司 Single-window multi-task display method and intelligent mobile terminal
CN106257394A (en) * 2015-06-22 2016-12-28 三星电子株式会社 Three-dimensional user interface for head-mounted display
CN109804618A (en) * 2016-10-19 2019-05-24 三星电子株式会社 Electronic equipment for displaying images and computer readable recording medium
WO2019141055A1 (en) * 2018-01-19 2019-07-25 腾讯科技(深圳)有限公司 Viewing angle adjustment method and apparatus, storage medium, and electronic apparatus
CN111921195A (en) * 2020-09-24 2020-11-13 成都完美天智游科技有限公司 Three-dimensional scene generation method and device, storage medium and electronic device


Similar Documents

Publication Publication Date Title
CN102081493A (en) Mobile electronic device and control method of 3D (three-dimensional) operation interface thereof
US20220129060A1 (en) Three-dimensional object tracking to augment display area
US10852913B2 (en) Remote hover touch system and method
JP6812579B2 (en) Methods and devices for detecting planes and / or quadtrees for use as virtual substrates
JP6141300B2 (en) Indirect user interface interaction
KR101946366B1 (en) Display device and Method for controlling the same
US10198854B2 (en) Manipulation of 3-dimensional graphical objects for view in a multi-touch display
US10394346B2 (en) Using a hardware mouse to operate a local application running on a mobile device
Ni et al. Design and evaluation of freehand menu selection interfaces using tilt and pinch gestures
US20120105367A1 (en) Methods of using tactile force sensing for intuitive user interface
TWI590147B (en) Touch modes
KR20100041006A (en) A user interface controlling method using three dimension multi-touch
KR20110104096A (en) User interface for mobile devices
JP2018063700A (en) Contextual pressure sensing haptic responses
EP2201443A1 (en) A system and method for manipulating digital images on a computer display
CN105103112A (en) Apparatus and method for manipulating the orientation of object on display device
KR20090087270A (en) Method and apparatus for 3d location input
CN105474164B (en) Eliminating ambiguity of indirect input
US10073612B1 (en) Fixed cursor input interface for a computer aided design application executing on a touch screen device
TWI502468B (en) Mobile electronic device and method for controlling 3d operation interface thereof
JP6711616B2 (en) Graphic element selection
JP6548956B2 (en) System, method, and program
JP2009223532A (en) Operation control method for icon interface
JP6863918B2 (en) Control programs, control methods and information processing equipment
CN106325613A (en) Touch display device and method thereof

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20110601