CN104156082A - Control system and intelligent terminal for user interfaces and applications oriented to space-time scenes - Google Patents


Info

Publication number: CN104156082A
Application number: CN201410382288.3A
Authority: CN (China)
Prior art keywords: unit, space, user interface, application, user
Legal status: Pending (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Inventors: 王洪亮, 管丽娜
Current assignee: BEIJING XINGYUN TIME AND SPACE TECHNOLOGY Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: BEIJING XINGYUN TIME AND SPACE TECHNOLOGY Co Ltd
Application filed by BEIJING XINGYUN TIME AND SPACE TECHNOLOGY Co Ltd
Priority to CN201410382288.3A
Publication of CN104156082A

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides a control system for user interfaces and applications oriented to space-time scenes, and an intelligent terminal. The system comprises a spatial information sensing unit (1), a clock unit (2), a processing unit (3) and a display output unit (4). The spatial information sensing unit (1) is carried by the user and collects the user's three-dimensional body-space information; the clock unit (2) provides time information; the processing unit (3), connected to the spatial information sensing unit (1) and the clock unit (2), analyses and processes the body-space and time information and outputs control instructions; the display output unit (4), connected to the processing unit (3), forwards the control instructions to an external device, which presents the user interface on a virtual screen and, according to the instructions, switches the user interface and/or makes the application software on it perform the corresponding actions. By precisely locating the posture, orientation and position of the human body and dynamically switching the user interface, the system manages space-time-scene-oriented user interfaces and applications naturally and effectively through simple motion sensing, giving a good user experience.

Description

Control system for user interfaces and applications oriented to space-time scenes, and intelligent terminal
Technical field
The present invention relates to the technical field of networked intelligent terminal control, and in particular to a control system for user interfaces and applications oriented to space-time scenes, and to an intelligent terminal.
Background technology
Traditional mobile intelligent terminals such as mobile phones and tablets rely on technologies like multi-point touch, which often require several trigger operations before the desired information appears. Limited by screen size, virtual-screen techniques cannot be presented properly in traditional application software. For example, on a PC or mobile phone with many applications installed, desktop icons multiply; icons can be merged into folders, but when there are dozens or even hundreds of icons, a great deal of time is spent moving and merging them. Moreover, even after several icons have been combined into a folder, using an application inside it still requires opening the folder first and then selecting the application icon. In addition, in everyday work, traditional mobile intelligent terminals often run several applications at once, and switching between their windows is usually done with buttons. In multi-application management, running an application that is not yet open means leaving the current interface to open it, interrupting the enjoyment of smooth interface interaction.
Specifically, the prior art has the following technical problems:
(1) For new-generation mobile intelligent terminals such as smart glasses, interface positioning is difficult to achieve, so technologies such as multi-point touch feel unnatural in practice and the user experience is poor.
(2) Limited by screen size, virtual-screen techniques cannot be presented on traditional mobile intelligent terminals; as applications multiply, screen space becomes insufficient, or the desired application icon can only be found after repeated trigger operations.
(3) In information display and multi-application management on traditional mobile intelligent terminals, common functions are mostly controlled by buttons; when several applications run at the same time, the current application occupies the screen, and switching between applications is cumbersome.
Summary of the invention
An object of the present invention is to provide a control system for user interfaces and applications oriented to space-time scenes. By precisely locating the posture, orientation and position of the human body and dynamically switching the user interface, the system can manage space-time-scene-oriented user interfaces and applications naturally and effectively through simple motion sensing, with a good user experience.
To achieve the above object, the present invention proposes a control system for user interfaces and applications oriented to space-time scenes, comprising: a spatial information sensing unit 1, worn by the user, for collecting the user's three-dimensional body-space information; a clock unit 2, for providing time information; a processing unit 3, connected to the spatial information sensing unit 1 and the clock unit 2, for analysing and processing the body-space and time information and outputting control instructions; and a display output unit 4, connected to the processing unit 3, for forwarding the control instructions to an external device 20 so that the external device 20 presents the user interface on a virtual screen and, according to the control instructions, switches the user interface and/or makes the application software on it perform the corresponding actions.
According to an aspect of the present invention, the three-dimensional body-space information comprises the azimuth, attitude and position information of the human body.
According to another aspect of the present invention, the spatial information sensing unit 1 comprises: a compass 11, for obtaining the azimuth information of the human body; a gyroscope 12, for obtaining the attitude information of the human body; and a wireless locating module 13, for obtaining the position information of the human body.
According to still another aspect of the invention, the spatial information sensing unit 1 further comprises at least one of the following: an acceleration sensor, a direction sensor, a magnetometric sensor, a gravity sensor, a rotation-vector sensor and a linear acceleration sensor.
In accordance with a further aspect of the present invention, the azimuth and attitude information of the human body comprises: the displacement of the head and/or hand in three-dimensional space, including at least one of forward-backward, up-down and left-right displacement and combinations of at least two of these; the angle change of the head and/or hand, including at least one of left-right horizontal rotation, up-down rotation and lateral rotation and combinations of at least two of these; the absolute displacement of the head and/or hand; and the relative displacement of the head with respect to the hand and of the hand with respect to the head.
According to an aspect of the present invention, the processing unit 3 analyses the three-dimensional body-space information and the time information to obtain the user's motion trend; it further determines, from the current user interface and preset motion trends, the control behaviour intended on that interface, and sends the corresponding control instruction.
According to a further aspect of the invention, the processing unit 3 resolves the relative motion between different sensors, calculates the relative displacement of the human body from the result, and generates control instructions from that relative displacement; the processing unit 3 can also switch off the displacement mode of the spatial information sensing unit 1, retain only the detected change in the unit's space angle, and generate control instructions from that angle change.
In accordance with a further aspect of the present invention, the system further comprises an application management unit 5, connected to the processing unit 3, for classifying, backing up and uninstalling application software.
The control system for user interfaces and applications oriented to space-time scenes of the present invention can, by precisely locating the posture, orientation and position of the human body and dynamically switching the user interface, accomplish simple and practical management of space-time-scene-oriented user interfaces and applications naturally through simple motion sensing, with a good user experience. Moreover, because low-energy motion-sensing control is used, the system is well suited to mobile intelligent terminals and has broad application prospects.
Another object of the present invention is to provide an intelligent terminal. This device can likewise, by precisely locating the posture, orientation and position of the human body and dynamically switching the user interface, manage space-time-scene-oriented user interfaces and applications naturally through simple motion sensing, with a good user experience.
To achieve the above object, the present invention proposes an intelligent terminal, comprising: the control system for user interfaces and applications oriented to space-time scenes 10 provided by the above embodiments; and an external device 20, connected to the control system 10, for presenting the user interface on a virtual screen and, according to the control instructions, switching the user interface and/or making the application software on it perform the corresponding actions.
According to an aspect of the present invention, the external device 20 comprises smart glasses worn on the user's head and/or a finger ring worn on the user's hand, wherein the spatial information sensing unit 1 and the clock unit 2 are integrated in the smart glasses and/or the finger ring, the processing unit 3 and the application management unit 5 are integrated in the smart glasses, the video output device of the display output unit 4 is integrated in the lenses of the smart glasses, and the audio output device of the display output unit 4 extends from the bottom of a temple arm of the smart glasses to the user's external auditory meatus.
The intelligent terminal of the present invention can, by precisely locating the posture, orientation and position of the human body and dynamically switching the user interface, accomplish simple and practical management of space-time-scene-oriented user interfaces and applications naturally through simple motion sensing, with a good user experience. Moreover, because low-energy motion-sensing control is used, the terminal is well suited to mobile intelligent devices, for example smart glasses, and thus has broad application prospects.
Brief description of the drawings
Fig. 1 is a structural diagram of the control system for user interfaces and applications oriented to space-time scenes according to a first embodiment of the invention;
Fig. 2 is a structural diagram of the spatial information sensing unit according to an embodiment of the present invention;
Fig. 3 is a structural diagram of the control system for user interfaces and applications oriented to space-time scenes according to a second embodiment of the invention;
Fig. 4 is a schematic diagram of button single-click and button double-click in motion-sensing control according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of press-and-hold left-right dragging, press-and-hold up-down dragging and press-and-hold forward-backward dragging in motion-sensing control according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of press-and-hold left-right swiping and press-and-hold up-down swiping in motion-sensing control according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of head left-right rotation and up-down rotation in motion-sensing control according to an embodiment of the present invention;
Fig. 8 is a structural diagram of the intelligent terminal according to an embodiment of the present invention.
Detailed description of the embodiments
To make the objects, technical solutions and advantages of the present invention clearer, the invention is described in more detail below with reference to the embodiments and the accompanying drawings. It should be understood that these descriptions are exemplary and are not intended to limit the scope of the invention. In the following description, descriptions of well-known structures and technologies are omitted to avoid unnecessarily obscuring the concepts of the invention.
Fig. 1 is a structural diagram of the control system for user interfaces and applications oriented to space-time scenes according to a first embodiment of the invention.
As shown in Fig. 1, an embodiment of the present invention provides a control system for user interfaces and applications oriented to space-time scenes, comprising a spatial information sensing unit 1, a clock unit 2, a processing unit 3 and a display output unit 4.
Specifically, the spatial information sensing unit 1 is worn by the user and collects the user's three-dimensional body-space information. In the present embodiment, the spatial information sensing unit 1 can be worn on the user's head and/or hand; for example, it can be built into smart glasses on the user's head and/or a finger ring on the hand. The spatial information sensing unit 1 further sends the collected body-space information to the processing unit 3.
In embodiments of the present invention, the three-dimensional body-space information comprises the azimuth, attitude and position information of the human body.
Specifically, the azimuth and attitude information of the human body comprises:
(1) the displacement of the head and/or hand in three-dimensional space, including at least one of forward-backward, up-down and left-right displacement and combinations of at least two of these;
(2) the angle change of the head and/or hand, including at least one of left-right horizontal rotation, up-down rotation and lateral rotation and combinations of at least two of these; and/or
(3) the absolute displacement of the head and/or hand;
(4) the relative displacement of the head with respect to the hand and of the hand with respect to the head.
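For illustration only (not part of the patent text), the body-space information enumerated above, together with the clock unit's time information, can be sketched as a small data structure. All field names (`azimuth_deg`, `attitude`, `position`, `timestamp`) and units are assumptions.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class BodySpatialInfo:
    """Hypothetical record of the three-dimensional body-space information.

    azimuth_deg: compass heading of the head or hand, in degrees (compass 11).
    attitude:    (pitch, yaw, roll) in degrees (gyroscope 12).
    position:    (latitude, longitude) (wireless locating module 13).
    timestamp:   seconds, supplied by the clock unit 2.
    """
    azimuth_deg: float
    attitude: Tuple[float, float, float]
    position: Tuple[float, float]
    timestamp: float

# Example reading: head facing east, tilted up 15 degrees, at an assumed location.
sample = BodySpatialInfo(azimuth_deg=90.0,
                         attitude=(15.0, 0.0, 0.0),
                         position=(39.9, 116.4),
                         timestamp=0.0)
```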
Fig. 2 is a structural diagram of the spatial information sensing unit according to an embodiment of the present invention.
As shown in Fig. 2, the spatial information sensing unit 1 comprises a compass 11, a gyroscope 12 and a wireless locating module 13. The compass 11 obtains the azimuth information of the human body, the gyroscope 12 obtains the attitude information of the human body, and the wireless locating module 13 obtains the position information of the human body.
In embodiments of the present invention, the wireless locating module 13 can obtain the user's position through at least one of the Global Positioning System (GPS), cellular base stations and WiFi. That is, the wireless locating module 13 obtains the user's position information by receiving wireless signals, thereby locating the user.
Further, the spatial information sensing unit 1 may also comprise at least one of the following sensors: an acceleration sensor, a direction sensor, a magnetometric sensor, a gravity sensor, a rotation-vector sensor and a linear acceleration sensor. With these sensors, the spatial information sensing unit 1 can collect all kinds of motion signals of the user's hand. The sensors include, but are not limited to, control buttons, gravity sensors, gyroscopes, acceleration sensors and magnetometric sensors; the motion signals include, but are not limited to, the spatial position of the hand, three-dimensional rotation angles, three-dimensional acceleration and three-dimensional magnetic orientation.
The clock unit 2 provides time information. The clock unit 2 can be a timer that records time information and sends it to the processing unit 3.
In embodiments of the present invention, the clock unit 2 can be worn on the user's head and/or hand; for example, like the spatial information sensing unit 1, it can be built into smart glasses on the user's head and/or a finger ring on the hand.
The processing unit 3 is connected to the spatial information sensing unit 1 and the clock unit 2; it analyses and processes the body-space information from the spatial information sensing unit 1 and the time information from the clock unit 2, and outputs control instructions to the display output unit 4.
Specifically, the processing unit 3 analyses the body-space and time information to obtain the user's motion trend; it then determines, from the current user interface and preset motion trends, the control behaviour intended on that interface, and sends the corresponding control instruction.
In addition, the processing unit 3 resolves the relative motion between different sensors, calculates the relative displacement of the human body from the result, and generates control instructions from that relative displacement. In other words, in a moving state the processing unit 3 learns each sensor's absolute motion from the body-space information of the spatial information sensing unit 1, parses out the relative motion between the sensors, calculates the relative displacement of the human body, and turns that relative displacement into control instructions.
The processing unit 3 can also switch off the displacement mode of the spatial information sensing unit 1, retain only the detected change in the unit's space angle, and generate control instructions from that angle change.
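The two processing modes described above, relative displacement between sensors and angle-only tracking, can be sketched as follows. The function names and the use of plain (x, y, z) tuples are assumptions for illustration, not the patent's implementation.

```python
def relative_displacement(head_disp, hand_disp):
    """Relative displacement of the hand with respect to the head,
    given each sensor's absolute 3-D displacement (x, y, z) in metres."""
    return tuple(hand - head for hand, head in zip(hand_disp, head_disp))

def angle_only_delta(prev_angles, curr_angles):
    """Angle-only mode: displacement tracking is switched off and only
    the change in space angle (pitch, yaw, roll) drives instructions."""
    return tuple(curr - prev for curr, prev in zip(curr_angles, prev_angles))

# The hand moved 2 m right and 2 m up more than the head did:
delta = relative_displacement((1.0, 0.0, 0.0), (3.0, 2.0, 0.0))
```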
The display output unit 4 is connected to the processing unit 3 and forwards the control instructions to the external device, which presents the user interface on a virtual screen and, according to the instructions, switches the user interface and/or makes the application software on it perform the corresponding actions.
Fig. 3 is a structural diagram of the control system for user interfaces and applications oriented to space-time scenes according to a second embodiment of the invention.
As shown in Fig. 3, the control system for user interfaces and applications oriented to space-time scenes of the present invention further comprises an application management unit 5, connected to the processing unit 3, for classifying, backing up and uninstalling application software.
Fig. 4 is a schematic diagram of button single-click and button double-click in motion-sensing control according to an embodiment of the present invention.
As shown in Fig. 4, the control system of the present invention can be preconfigured with the following control logic: when a button single-click is detected, the corresponding control action is performed, for example selecting a file or opening an application or web link; when a button double-click is detected, a file-open operation is performed.
When, after analysing the body-space information, the processing unit 3 judges that the user has performed a button single-click, it generates a file-select instruction, or an application-open or web-link-open instruction, and sends it to the display output unit 4. The display output unit 4 forwards the instruction to the external device, which then selects the file in the virtual screen or opens the application or web link. When the processing unit 3 judges that the user has performed a button double-click, it generates a file-open instruction and sends it to the display output unit 4, which forwards it to the external device; the external device then opens the file in the virtual screen.
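A minimal sketch of how the clock unit's time information could separate single clicks from double clicks: two presses falling within a short window are collapsed into one double-click event. The 0.4-second window and the function name are assumptions, not values from the patent.

```python
DOUBLE_CLICK_WINDOW = 0.4  # seconds; an assumed threshold

def classify_presses(press_times):
    """Collapse a sorted list of button-press timestamps (seconds) into
    'single' / 'double' events, using the clock unit's timing."""
    events, i = [], 0
    while i < len(press_times):
        if (i + 1 < len(press_times)
                and press_times[i + 1] - press_times[i] <= DOUBLE_CLICK_WINDOW):
            events.append('double')  # double-click -> open the file
            i += 2
        else:
            events.append('single')  # single click -> select file / open app or link
            i += 1
    return events
```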
Fig. 5 is a schematic diagram of press-and-hold left-right dragging, press-and-hold up-down dragging and press-and-hold forward-backward dragging in motion-sensing control according to an embodiment of the present invention.
As shown in Fig. 5, when the processing unit 3, after analysing the body-space information, judges that the user's action is a press-and-hold followed by a hand movement to the right, it generates a browse-icons-backward instruction and sends it to the display output unit 4, which forwards it to the external device; the external device then scrolls the icons in the virtual-screen user interface backward.
When the judged action is a press-and-hold followed by a hand movement to the left, a browse-icons-forward instruction is generated and forwarded in the same way, and the external device scrolls the icons forward.
When the hand moves up after the press-and-hold, a browse-icons-upward instruction is generated and forwarded, and the icons are scrolled upward; when the hand moves down, a browse-icons-downward instruction is generated and forwarded, and the icons are scrolled downward.
When the hand moves forward after the press-and-hold, a weather-check instruction is generated and forwarded, and the external device shows the local weather in the virtual screen; when the hand moves backward, a time-check instruction is generated and forwarded, and the external device shows the local clock or calendar in the virtual screen.
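The press-and-hold drag mappings of Fig. 5 amount to a lookup from drag direction to control instruction, which can be summarised, purely for illustration, as a table; the direction keys and instruction names below are hypothetical labels, not identifiers from the patent.

```python
# Hypothetical mapping of press-and-hold drag directions to control
# instructions, following the six cases described for Fig. 5.
PRESS_HOLD_DRAG = {
    'right':    'browse_icons_backward',
    'left':     'browse_icons_forward',
    'up':       'browse_icons_upward',
    'down':     'browse_icons_downward',
    'forward':  'show_weather',
    'backward': 'show_clock_or_calendar',
}

def instruction_for_drag(direction):
    """Instruction the processing unit would forward to the display
    output unit for a given drag direction ('no_op' if unrecognised)."""
    return PRESS_HOLD_DRAG.get(direction, 'no_op')
```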
Fig. 6 is a schematic diagram of press-and-hold left-right swiping and press-and-hold up-down swiping in motion-sensing control according to an embodiment of the present invention.
As shown in Fig. 6, when the processing unit 3, after analysing the body-space information, judges that the user's action is a press-and-hold followed by a quick swipe of the hand to the right or downward, it generates a switch-to-next-application instruction and sends it to the display output unit 4, which forwards it to the external device; the external device then switches to the next application.
When the action is a press-and-hold followed by a quick swipe of the hand to the left or upward, a switch-to-previous-application instruction is generated and forwarded in the same way, and the external device switches to the previous application.
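One plausible way to separate the quick swipes of Fig. 6 from the slow drags of Fig. 5 is by speed; the patent does not specify the mechanism, so the 0.5 m/s threshold, the sign convention (positive displacement for right/down) and the function name below are all assumptions.

```python
SWIPE_SPEED = 0.5  # metres per second; an assumed threshold

def classify_hand_motion(displacement_m, duration_s):
    """Signed displacement along one axis over a duration: a quick flick
    switches applications, a slow press-and-hold drag browses icons."""
    speed = abs(displacement_m) / duration_s
    if speed >= SWIPE_SPEED:
        # Right/down swipe -> next application; left/up swipe -> previous.
        return 'switch_app_next' if displacement_m > 0 else 'switch_app_previous'
    return 'browse_icons'
```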
Fig. 7 is a schematic diagram of head left-right rotation and up-down rotation in motion-sensing control according to an embodiment of the present invention.
As shown in Fig. 7, when the processing unit 3, after analysing the body-space information, judges that the user's head has turned left, it generates a browse-icons-forward instruction and sends it to the display output unit 4, which forwards it to the external device; the external device then scrolls the icons in the virtual-screen user interface forward.
When the head turns right, a browse-icons-backward instruction is generated and forwarded in the same way, and the icons are scrolled backward.
When the head rotates upward, a browse-icons-upward instruction is generated and forwarded, and the icons are scrolled upward; when the head rotates downward, a browse-icons-downward instruction is generated and forwarded, and the icons are scrolled downward.
In summary, the control system for user interfaces and applications oriented to space-time scenes of embodiments of the present invention can present space-time-scene-oriented user interfaces and applications on a virtual large screen, and can therefore manage application icons more conveniently and practically, including the real-time clock, weather forecast and calendar reminders in the above embodiments. In addition, through the virtual-screen user interface and applications, the present invention can also receive tweets and coupon offers and instantly share pictures, blogs, microblog and WeChat content on the screen.
According to the control system for user interfaces and applications oriented to space-time scenes of embodiments of the present invention, by precisely locating the posture, orientation and position of the human body and dynamically switching the user interface, simple and practical management of space-time-scene-oriented user interfaces and applications can be accomplished naturally through simple motion sensing, with a good user experience. Moreover, because low-energy motion-sensing control is used, the system is well suited to mobile intelligent terminals and has broad application prospects.
In summary, the control system for user interfaces and applications oriented to space-time scenes provided by embodiments of the present invention has the following beneficial effects:
(1) it achieves precise positioning of the human body and complex control of the external device;
(2) it uses the spatial information sensing unit to calibrate the three-dimensional orientation;
(3) through communication and control among the processing unit, the display output unit and the external device, it realises space-time scenes and presents a virtual large screen, including general information such as the clock, weather and calendar, and instant on-screen sharing of pictures, blogs, microblogs and the like;
(4) through the same communication and control, it realises multi-application management, with free arrangement of application icon positions;
(5) through the same communication and control, it realises fast switching among multiple application windows without affecting fluency of use.
Fig. 8 is a structural diagram of the intelligent terminal according to an embodiment of the present invention.
As shown in Fig. 8, the intelligent terminal 100 of an embodiment of the present invention comprises the control system 10 for user interfaces and applications oriented to space-time scenes provided by the above embodiments, and an external device 20. The external device 20 is connected to the control system 10, and is used for displaying the user interface through a virtual screen, and for switching the user interface and/or controlling application software in the user interface to perform corresponding actions according to the control instructions.
In embodiments of the present invention, the external device 20 comprises intelligent glasses worn on the user's head and/or a finger ring worn on the user's hand. The spatial information sensing unit 1 and the clock unit 2 are integrated in the intelligent glasses and/or the finger ring, and the processing unit 3 and the application management unit 5 are integrated in the intelligent glasses.
The video output device of the display output unit 4 is integrated in the lenses of the intelligent glasses, and the audio output device of the display output unit 4 extends from the bottom of a temple of the glasses to the user's external auditory canal. In embodiments of the present invention, the display output unit 4 comprises a columnar or drop-shaped audio output device extending from the bottom of the temple to the external auditory canal.
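The composition of the intelligent terminal described above can be sketched as a simple object model. This is not an implementation from the patent, only a hypothetical illustration of how the terminal (100) aggregates the control system's display output unit (4) and the external device (20), with the display output unit forwarding instructions outward.

```python
class ExternalDevice:
    """(20): e.g. intelligent glasses and/or a finger ring."""
    def __init__(self):
        self.last_instruction = None

    def render(self, instruction: str) -> None:
        # Switch the virtual-screen UI or drive an application action.
        self.last_instruction = instruction

class DisplayOutputUnit:
    """(4): forwards control instructions to the external device."""
    def __init__(self, device: ExternalDevice):
        self.device = device

    def forward(self, instruction: str) -> None:
        self.device.render(instruction)

class IntelligentTerminal:
    """(100): aggregates the control system side and the external device."""
    def __init__(self):
        self.device = ExternalDevice()
        self.display_output = DisplayOutputUnit(self.device)

    def handle(self, instruction: str) -> None:
        self.display_output.forward(instruction)

terminal = IntelligentTerminal()
terminal.handle("SWITCH_UI_LEFT")
```

The point of the separation is the one the patent makes: the processing side only emits instructions, and all rendering happens on the worn external device.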
According to the intelligent terminal of embodiments of the present invention, by precisely locating the posture, orientation, and position of the human body and dynamically switching the user interface, the management of user interfaces and applications for space-time scenes can be accomplished naturally through simple motion sensing, with a good user experience. Meanwhile, because the motion-sensing control is implemented with low power consumption, the terminal is well suited to mobile intelligent devices, for example intelligent glasses, and thus has broad application prospects.
It should be understood that the above embodiments of the present invention are only for exemplarily illustrating or explaining the principles of the present invention, and shall not be construed as limiting the present invention. Therefore, any modification, equivalent replacement, or improvement made without departing from the spirit and scope of the present invention shall fall within the protection scope of the present invention. Furthermore, the appended claims are intended to cover all variations and modifications falling within the scope and boundary of the claims, or equivalents of such scope and boundary.

Claims (10)

1. A control system for user interfaces and applications oriented to space-time scenes, comprising:
a spatial information sensing unit (1), carried by a user and used for collecting three-dimensional human-body spatial information of the user;
a clock unit (2), used for providing time information;
a processing unit (3), connected to the spatial information sensing unit (1) and the clock unit (2), and used for analyzing and processing the three-dimensional human-body spatial information and the time information and outputting control instructions; and
a display output unit (4), connected to the processing unit (3), and used for forwarding the control instructions to an external device so that the external device displays a user interface through a virtual screen, and for switching the user interface and/or controlling application software in the user interface to perform corresponding actions according to the control instructions.
2. The control system for user interfaces and applications oriented to space-time scenes according to claim 1, wherein the three-dimensional human-body spatial information comprises: orientation information, posture information, and position information of the human body.
3. The control system for user interfaces and applications oriented to space-time scenes according to claim 1 or 2, wherein the spatial information sensing unit (1) comprises:
a compass (11), used for obtaining the orientation information of the human body;
a gyroscope (12), used for obtaining the posture information of the human body; and
a wireless positioning module (13), used for obtaining the position information of the human body.
4. The control system for user interfaces and applications oriented to space-time scenes according to claim 3, wherein the spatial information sensing unit (1) further comprises at least one of the following: an acceleration sensor, a direction sensor, a magnetic sensor, a gravity sensor, a rotation-vector sensor, and a linear acceleration sensor.
5. The control system for user interfaces and applications oriented to space-time scenes according to claim 2, wherein the orientation information and the posture information of the human body comprise:
a displacement of the head and/or a hand in three-dimensional space, comprising at least one of: forward/backward movement, up/down displacement, left/right displacement, and a combination of at least two of the above displacements;
an angle change of the head and/or a hand, comprising at least one of: left/right horizontal rotation, up/down rotation, lateral rotation, and a combination of at least two of the above rotation modes; and/or
an absolute displacement of the head and/or a hand; and
a relative displacement of the head with respect to a hand, or of a hand with respect to the head.
6. The control system for user interfaces and applications oriented to space-time scenes according to claim 1, wherein the processing unit (3) analyzes the three-dimensional human-body spatial information and the time information to obtain the user's motion trend, and the processing unit (3) further determines, according to preset user motion trends, a control action to be presented on the current user interface, and sends a corresponding control instruction.
7. The control system for user interfaces and applications oriented to space-time scenes according to claim 4, wherein the processing unit (3) is used for resolving relative motion between different sensors, calculating a relative displacement of the human body according to the resolving result, and generating a control instruction according to the relative displacement of the human body;
the processing unit (3) is also used for disabling the displacement model of the spatial information sensing unit (1) while retaining detection of changes in the spatial angle of the spatial information sensing unit (1), and generating a control instruction from the changes in the spatial angle.
8. The control system for user interfaces and applications oriented to space-time scenes according to claim 1, further comprising:
an application management unit (5), connected to the processing unit (3), and used for classifying, backing up, and uninstalling application software.
9. An intelligent terminal (100), comprising:
the control system (10) for user interfaces and applications oriented to space-time scenes according to any one of claims 1 to 8; and
an external device (20), connected to the control system (10) for user interfaces and applications oriented to space-time scenes, and used for displaying the user interface through a virtual screen, and for switching the user interface and/or controlling application software in the user interface to perform corresponding actions according to the control instructions.
10. The intelligent terminal according to claim 9, wherein the external device (20) comprises intelligent glasses worn on the user's head and/or a finger ring worn on the user's hand,
wherein the spatial information sensing unit (1) and the clock unit (2) are integrated in the intelligent glasses and/or the finger ring;
the processing unit (3) and the application management unit (5) are integrated in the intelligent glasses; and
the video output device of the display output unit (4) is integrated in the lenses of the intelligent glasses, and the audio output device of the display output unit (4) extends from the bottom of a temple of the intelligent glasses to the user's external auditory canal.
CN201410382288.3A 2014-08-06 2014-08-06 Control system and intelligent terminal of user interfaces and applications aimed at space-time scenes Pending CN104156082A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410382288.3A CN104156082A (en) 2014-08-06 2014-08-06 Control system and intelligent terminal of user interfaces and applications aimed at space-time scenes

Publications (1)

Publication Number Publication Date
CN104156082A true CN104156082A (en) 2014-11-19

Family

ID=51881601

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410382288.3A Pending CN104156082A (en) 2014-08-06 2014-08-06 Control system and intelligent terminal of user interfaces and applications aimed at space-time scenes

Country Status (1)

Country Link
CN (1) CN104156082A (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104484047A (en) * 2014-12-29 2015-04-01 北京智谷睿拓技术服务有限公司 Interactive method and interactive device based on wearable device, and wearable device
WO2016033762A1 (en) * 2014-09-03 2016-03-10 北京行云时空科技有限公司 Somatosensory control system and method
CN105843407A (en) * 2016-06-08 2016-08-10 北京行云时空科技有限公司 Clicking method and device based on virtual reality system
CN105920838A (en) * 2016-06-08 2016-09-07 北京行云时空科技有限公司 System and method for movement collection and control
CN106095090A (en) * 2016-06-07 2016-11-09 北京行云时空科技有限公司 Control method, device and the system of spatial scene based on intelligence system
WO2017121276A1 (en) * 2016-01-13 2017-07-20 阿里巴巴集团控股有限公司 Display device task launching method and device
CN108230807A (en) * 2017-12-27 2018-06-29 阎东 A kind of human body temperature type sport simulated system
CN108984561A (en) * 2017-06-01 2018-12-11 华为技术有限公司 Site selecting method and equipment
CN109144265A (en) * 2018-08-30 2019-01-04 Oppo广东移动通信有限公司 Display changeover method, device, wearable device and storage medium
CN110134249A (en) * 2019-05-31 2019-08-16 王刘京 Wear interactive display device and its control method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103814382A (en) * 2012-09-14 2014-05-21 华为技术有限公司 Augmented reality processing method and device of mobile terminal
CN103858073A (en) * 2011-09-19 2014-06-11 视力移动技术有限公司 Touch free interface for augmented reality systems


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20141119