CN105204613A - Information processing method and wearable device


Info

Publication number
CN105204613A
CN105204613A (application CN201410302909.2A)
Authority
CN
China
Prior art keywords
wearable
area
projected
content
state
Prior art date
Legal status
Granted
Application number
CN201410302909.2A
Other languages
Chinese (zh)
Other versions
CN105204613B (en)
Inventor
杨晨 (Yang Chen)
马骞 (Ma Qian)
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN201410302909.2A
Publication of CN105204613A
Application granted
Publication of CN105204613B
Legal status: Active
Anticipated expiration


Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The invention provides an information processing method applied to a wearable device. The wearable device has a fixing device for fixing the wearable device to a first main body. The method comprises: acquiring, with at least one sensor of the wearable device, a first operation performed by an operating body in a first area on the first main body, wherein the first area lies within a first preset distance above the first main body and does not include the area covered by the wearable device; and responding to the first operation and generating a first instruction. The invention further discloses a corresponding electronic device.

Description

Information processing method and wearable device
Technical field
The present invention relates to the field of information technology, and in particular to an information processing method and a wearable device.
Background art
With the development of information technology, more and more electronic devices appear in people's daily life and work, for example smart phones, tablet computers and notebook computers. To stay reachable anytime and anywhere with such devices, people must carry them along; yet a smart phone, tablet or notebook is at least the size of a hand, which is inconvenient for the user.
Against this background, wearable devices such as smart watches and smart glasses have emerged. Compared with smart phones, tablets and notebooks, wearable devices are usually smaller and more portable, and are therefore widely welcomed. People generally wear them on the body, which makes them more convenient to use.
However, in the course of arriving at the technical solutions in the embodiments of the present invention, the inventors found that in the prior art a user can control a wearable device only by operating on its display unit; limited by the size of the device itself, this is inconvenient for the user.
The prior art therefore has the following technical problem: a wearable device can respond only to operations performed on its own display unit.
Summary of the invention
The invention provides an information processing method and a wearable device, so as to solve the prior-art technical problem that a wearable device can respond only to operations performed on its own display unit, thereby expanding the interaction area of the wearable device and improving the user experience.
In one aspect, the invention provides an information processing method applied to a wearable device, the wearable device having a fixing device for fixing the wearable device to a first main body, the method comprising:
acquiring, with at least one sensor of the wearable device, a first operation performed by an operating body in a first area on the first main body, wherein the first area lies within a first preset distance above the first main body and does not include the area covered by the wearable device; and
responding to the first operation and generating a first instruction.
Optionally, acquiring the first operation with at least one sensor of the wearable device specifically comprises:
acquiring, with a visible-light image acquisition unit of the wearable device, at least two images while the operating body performs the first operation on the first area; and
analyzing the at least two images to obtain the first operation.
Optionally, acquiring the first operation with at least one sensor of the wearable device specifically comprises:
acquiring a first image with a first image acquisition unit of the wearable device and a second image with a second image acquisition unit of the wearable device while the operating body performs the first operation on the first area, wherein the first image acquisition unit and the second image acquisition unit are oriented at different angles relative to the fixing device; and
analyzing the first image and the second image to obtain the first operation.
Optionally, acquiring the first operation with at least one sensor of the wearable device specifically comprises:
projecting, with an infrared projection unit of the wearable device, structured light produced by an infrared light source of the wearable device onto the first area;
acquiring, with an infrared image acquisition unit of the wearable device, an infrared image while the operating body performs the first operation on the first area; and
analyzing the infrared image to obtain the first operation.
Optionally, before acquiring the first operation with at least one sensor of the wearable device, the method further comprises:
projecting a content to be projected onto the first area with a projection unit of the wearable device.
Optionally, before projecting the structured light produced by the infrared light source onto the first area, the method further comprises:
projecting a content to be projected onto the first area with a projection unit of the wearable device, wherein the projection area of the projection unit is greater than or equal to the infrared projection area of the infrared projection unit.
Optionally, before acquiring the first operation with at least one sensor of the wearable device, the method further comprises:
controlling an off-device operation detection function of the wearable device to be in an on state.
Optionally, controlling the off-device operation detection function of the wearable device to be in the on state specifically comprises:
detecting, with the at least one sensor, a second operation performed on the first main body;
judging whether the second operation is a preset operation, to obtain a first judgment result; and
when the first judgment result is yes, controlling the off-device operation detection function to be in the on state.
Optionally, controlling the off-device operation detection function of the wearable device to be in the on state specifically comprises:
acquiring at least one application program currently running on the wearable device;
judging whether the at least one application program belongs to a preset application program set, to obtain a second judgment result; and
when the second judgment result is yes, controlling the off-device operation detection function to be in the on state.
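The two optional enabling paths above (a preset second operation on the first main body, or a running application belonging to a preset application set) can be sketched as follows. This is a minimal illustration, not part of the patent; the gesture name, the application whitelist and the function name are assumptions introduced for the example.

```python
# Illustrative sketch of the two enabling paths described above.
# PRESET_GESTURE and APP_WHITELIST are invented placeholder values.

PRESET_GESTURE = "double_tap"
APP_WHITELIST = {"music_player", "ebook_reader"}

def should_enable_off_device_detection(second_operation, running_apps):
    """Return True if the off-device operation detection should be switched on."""
    if second_operation == PRESET_GESTURE:   # first judgment result is yes
        return True
    if running_apps & APP_WHITELIST:         # second judgment result is yes
        return True
    return False
```

Either path alone suffices to turn the detection function on, matching the two independent "Optionally" clauses above.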
Optionally, when the current state of the content to be projected is a first state and the content to be projected differs from the content currently displayed on the display unit of the wearable device, the method further comprises:
detecting and acquiring a trigger operation directed at the content to be projected;
responding to the trigger operation by adjusting the current projection state of the content to be projected from the first state to a second state;
obtaining the state change of the content to be projected based on the first state and the second state; and
when the state change reaches a predetermined condition, adjusting the current display state of the currently displayed content based on the second state.
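The optional state-synchronisation steps above can be illustrated with a minimal sketch; representing the projection and display states as numbers, and the predetermined condition as a magnitude threshold, are assumptions made purely for the example.

```python
# Illustrative sketch: the display state follows the projection state only
# when the projection-state change reaches the predetermined condition.
# Numeric states and the threshold are invented for the example.

def sync_display_state(first_state, second_state, display_state, threshold=1.0):
    """Return the new display state after a trigger operation."""
    state_change = abs(second_state - first_state)
    if state_change >= threshold:    # predetermined condition reached
        return second_state          # adjust display based on the second state
    return display_state             # otherwise leave the display unchanged
```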
In another aspect, the invention provides a wearable device having a fixing device for fixing the wearable device to a first main body, the wearable device further comprising:
at least one sensor, configured to acquire a first operation performed by an operating body in a first area on the first main body, wherein the first area lies within a first preset distance above the first main body and does not include the area covered by the wearable device; and
a processor, configured to respond to the first operation and generate a first instruction.
Optionally, the at least one sensor is specifically:
a visible-light image acquisition unit, configured to acquire at least two images while the operating body performs the first operation on the first area;
and the processor is specifically configured to:
analyze the at least two images to obtain the first operation.
Optionally, the at least one sensor specifically comprises:
a first image acquisition unit, configured to acquire a first image while the operating body performs the first operation on the first area; and
a second image acquisition unit, configured to acquire a second image while the operating body performs the first operation on the first area, wherein the first image acquisition unit and the second image acquisition unit are oriented at different angles relative to the fixing device;
and the processor is specifically configured to:
analyze the first image and the second image to obtain the first operation.
Optionally, the at least one sensor specifically comprises:
an infrared projection unit, configured to project structured light produced by an infrared light source of the wearable device onto the first area; and
an infrared image acquisition unit, configured to acquire an infrared image while the operating body performs the first operation on the first area;
and the processor is specifically configured to:
analyze the infrared image to obtain the first operation.
Optionally, the wearable device further comprises:
a projection unit, configured to project a content to be projected onto the first area.
Optionally, the projection unit is specifically configured to:
project a content to be projected onto the first area before the infrared projection unit projects the structured light produced by the infrared light source onto the first area, wherein the projection area of the projection unit is greater than or equal to the infrared projection area of the infrared projection unit.
Optionally, the processor is further configured to:
control an off-device operation detection function of the wearable device to be in an on state before the at least one sensor acquires the first operation.
Optionally, the processor is specifically configured to:
detect, with the at least one sensor, a second operation performed on the first main body;
judge whether the second operation is a preset operation, to obtain a first judgment result; and
when the first judgment result is yes, control the off-device operation detection function to be in the on state.
Optionally, the processor is specifically configured to:
acquire at least one application program currently running on the wearable device;
judge whether the at least one application program belongs to a preset application program set, to obtain a second judgment result; and
when the second judgment result is yes, control the off-device operation detection function to be in the on state.
Optionally, the wearable device further comprises:
a detection unit, configured to, when the current state of the content to be projected is a first state and the content to be projected differs from the content currently displayed on the display unit of the wearable device, detect and acquire a trigger operation directed at the content to be projected;
and the processor is configured to:
respond to the trigger operation by adjusting the current projection state of the content to be projected from the first state to a second state;
obtain the state change of the content to be projected based on the first state and the second state; and
when the state change reaches a predetermined condition, adjust the current display state of the currently displayed content based on the second state.
The one or more technical solutions provided in the embodiments of the present invention have at least the following technical effects or advantages:
1. In the embodiments of the present invention, at least one sensor of the wearable device acquires a first operation performed by an operating body in a first area on the first main body, where the first area lies within a first preset distance above the first main body and does not include the area covered by the wearable device; the first operation is responded to and a first instruction is generated. The wearable device can thus respond to an operation performed in a first area on the first main body wearing the device, which solves the prior-art technical problem that a wearable device can respond only to operations on its own display unit, expands the interaction area of the wearable device and improves the user experience.
2. In the embodiments of the present invention, the at least one sensor may be a visible-light image acquisition unit; or a first image acquisition unit and a second image acquisition unit oriented at different angles relative to the fixing device that fixes the wearable device to the first main body; or an infrared projection unit and an infrared image acquisition unit. Multiple ways of realizing the information processing method of the invention are thereby provided.
Brief description of the drawings
Fig. 1 is a flow chart of an information processing method in Embodiment 1 of the present invention;
Fig. 2 is a schematic diagram of the first area in Embodiment 1 of the present invention;
Fig. 3 is a structural diagram of the electronic device in Embodiment 2 of the present invention.
Detailed description of the embodiments
The embodiments of the present invention provide an information processing method and a wearable device, so as to solve the prior-art technical problem that a wearable device can respond only to operations performed on its own display unit, thereby expanding the interaction area of the wearable device and improving the user experience.
The general idea of the technical solutions in the embodiments of the present invention for solving the above technical problem is as follows:
An information processing method is provided, applied to a wearable device having a fixing device for fixing the wearable device to a first main body. The method comprises: acquiring, with at least one sensor of the wearable device, a first operation performed by an operating body in a first area on the first main body, wherein the first area lies within a first preset distance above the first main body and does not include the area covered by the wearable device; and responding to the first operation and generating a first instruction. The wearable device can thus respond to an operation performed in a first area on the first main body wearing the device, which solves the prior-art technical problem that a wearable device can respond only to operations on its own display unit, expands the interaction area of the wearable device and improves the user experience.
For a better understanding, the above technical solutions are described in detail below with reference to the accompanying drawings and specific embodiments.
The term "and/or" herein merely describes an association relation between associated objects and indicates that three relations may exist; for example, "A and/or B" may mean: A alone, both A and B, or B alone. In addition, the character "/" herein generally indicates an "or" relation between the associated objects.
Embodiment 1
An embodiment of the present invention provides an information processing method applied to a wearable device having a fixing device for fixing the wearable device to a first main body. In this embodiment, the wearable device may be a smart watch, smart glasses or the like. For a smart watch, the fixing device may be a watchband, by which the watch is fixed on the user's arm; for smart glasses, the fixing device may be the frame, by which the glasses are fixed on the user's nose bridge and ears.
The information processing method provided by this embodiment is applied to a wearable device. Specifically, an application can be installed in the wearable device; when this application runs, the wearable device performs the information processing method provided by this embodiment.
As shown in Fig. 1, the information processing method provided by this embodiment comprises:
Step 1: acquiring, with at least one sensor of the wearable device, a first operation performed by an operating body in a first area on the first main body, wherein the first area lies within a first preset distance above the first main body and does not include the area covered by the wearable device;
Step 2: responding to the first operation and generating a first instruction.
In this embodiment, when the wearable device performs the above method, as shown in Fig. 2, the wearable device 10 is fixed on the first main body 20 by its fixing device 101; the first area 201 lies within the first preset distance above the first main body and does not include the area covered by the wearable device 10. The first preset distance is a distance greater than or equal to 0, for example 5 cm, 10 cm or 20 cm.
In this embodiment, the operating body may be a finger, a pen, a pointer, a presentation pointer or the like. When the operating body performs a first operation on the first area, for example a left-to-right or right-to-left slide, or a zoom-in or zoom-out operation jointly completed by the index finger and thumb, at least one sensor of the wearable device can acquire this first operation.
In this embodiment, since the at least one sensor of the wearable device may be a visible-light image acquisition unit, or a first image acquisition unit and a second image acquisition unit oriented at different angles relative to the fixing device that fixes the wearable device to the first main body, or an infrared projection unit and an infrared image acquisition unit, Step 1 has different implementations, which are described in detail below.
In the first implementation of Step 1, the visible-light image acquisition unit of the wearable device acquires at least two images while the operating body performs the first operation on the first area, and the at least two images are analyzed to obtain the first operation.
Specifically, since the first operation takes a certain amount of time to complete, the visible-light image acquisition unit can continuously capture multiple images during that time; by analyzing these images, the concrete operation trajectory of the first operation can be determined.
Taking a smart watch as an example: when the watch is fixed on the user's arm by the watchband and its visible-light image acquisition unit is on, and the user's finger performs a left-to-right slide above or on the surface of the arm, the unit captures multiple images while the slide is completed; the processor of the watch then analyzes these images and determines that the operation is a left-to-right slide.
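As a minimal sketch of this kind of multi-frame analysis, assuming the fingertip's horizontal image coordinate has already been extracted from each captured frame, the slide direction can be classified from the net displacement across the frames. The threshold value and function name are illustrative assumptions, not part of the patent.

```python
# Illustrative classification of a horizontal slide from per-frame
# fingertip x-coordinates extracted from the captured images.

def swipe_direction(fingertip_xs, threshold=20.0):
    """Return the slide direction, or None if no clear slide occurred."""
    if len(fingertip_xs) < 2:
        return None
    dx = fingertip_xs[-1] - fingertip_xs[0]   # net horizontal displacement
    if dx > threshold:
        return "left_to_right"
    if dx < -threshold:
        return "right_to_left"
    return None
```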
In the second implementation of Step 1, the first image acquisition unit of the wearable device acquires a first image and the second image acquisition unit acquires a second image while the operating body performs the first operation on the first area, the two units being oriented at different angles relative to the fixing device; the first image and the second image are analyzed to obtain the first operation.
Specifically, while the operating body performs the first operation on the first area, the first and second image acquisition units simultaneously film the whole process, obtaining a first image and a second image respectively. By analyzing the two images, the concrete position of the operating body in space can be determined. Since the first operation takes a certain amount of time, the two units can continuously capture multiple image pairs during that time; analyzing them reveals the continuous change of the operating body's spatial position, and hence the concrete trajectory of the first operation.
Taking a smart watch as an example: when the watch is fixed on the user's arm by the watchband, the first and second image acquisition units on the watch are on and oriented at different angles relative to the watchband. When the user's finger performs a left-to-right slide above or on the surface of the arm, the two units capture a first image and a second image at the same moment, and first and second images at multiple moments while the slide is completed.
The processor of the watch then analyzes the first and second images of the same moment: a point on the operating body is selected in the first image and the same point in the second image; connecting each image point with its image acquisition unit yields two straight lines, whose extensions necessarily intersect at one point, because the two image points depict the same physical point at the same moment. That intersection is the real three-dimensional spatial position corresponding to the point (two intersecting straight lines determine a point in space). By the reversibility of light paths, a point in space radiates in all directions and is captured by the two cameras simultaneously; reversing the two light paths, the intersection of the extensions drawn from the two cameras is exactly that point in space, namely the concrete spatial position of the point on the operating body. Analyzing in the same way the images captured continuously at the multiple moments of the slide reveals the continuous change of the operating body's spatial position, and hence that the first operation is specifically a left-to-right slide.
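The intersection construction described above is standard two-view triangulation. A minimal sketch follows, assuming the camera centers and the viewing directions toward the same point are already known; it uses the midpoint of closest approach of the two rays, which coincides with the intersection when the rays meet exactly. The function name and inputs are illustrative assumptions.

```python
# Illustrative two-view triangulation: each camera defines a ray
# c + t*d from its center c along direction d toward the observed point.
import numpy as np

def triangulate(c1, d1, c2, d2):
    """Midpoint of closest approach of the two viewing rays."""
    c1, c2 = np.asarray(c1, float), np.asarray(c2, float)
    d1 = np.asarray(d1, float); d1 = d1 / np.linalg.norm(d1)
    d2 = np.asarray(d2, float); d2 = d2 / np.linalg.norm(d2)
    b = c2 - c1
    # Normal equations for minimising |(c1 + t1*d1) - (c2 + t2*d2)|^2
    A = np.array([[d1 @ d1, -(d1 @ d2)],
                  [d1 @ d2, -(d2 @ d2)]])
    t1, t2 = np.linalg.solve(A, [b @ d1, b @ d2])
    return ((c1 + t1 * d1) + (c2 + t2 * d2)) / 2
```

Repeating this per captured frame pair yields the continuous spatial trajectory of the operating body described in the text.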
In the third implementation of Step 1, the infrared projection unit of the wearable device projects structured light produced by the infrared light source of the wearable device onto the first area; the infrared image acquisition unit of the wearable device acquires an infrared image while the operating body performs the first operation on the first area; and the infrared image is analyzed to obtain the first operation.
Specifically, in this embodiment of the application the wearable device has an infrared light source fitted with a cylindrical lens; after passing through the cylindrical lens, the infrared light converges into a narrow band of light, i.e. structured light, which the infrared projection unit projects onto the first area. When the operating body performs the first operation on the first area, the infrared image acquisition unit acquires an infrared image, from which the depth information of the operating body can be determined. Since the first operation takes a certain amount of time, the unit can continuously capture multiple images during that time; analyzing them reveals the continuous change of the operating body's depth information, and hence the concrete trajectory of the first operation.
Taking a smart watch as an example: when the watch is fixed on the user's arm by the watchband, the infrared light source, infrared projection unit and infrared image acquisition unit on the watch are on, and the structured light produced by the infrared light source is projected above or onto the surface of the user's arm. When the user's finger performs a zoom-in operation above or on the surface of the arm, i.e. a slide in which the spacing between thumb and index finger increases, the infrared image acquisition unit captures multiple infrared images while the operation is completed.
The processor of the watch then analyzes the infrared images in turn, obtaining the depth information of the finger in each image; from the multiple pieces of depth information, the continuous change of the operating body's spatial position can be determined, and hence that the first operation is specifically a zoom-in operation.
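Two steps of this pipeline can be sketched under simple assumptions: recovering depth from the displacement of the projected infrared stripe via the standard pinhole triangulation relation Z = f·b/d, and classifying the zoom gesture from the per-frame thumb-to-index spacing. The pinhole model, parameter values and threshold are illustrative assumptions, not details taken from the patent.

```python
# Illustrative structured-light depth recovery and pinch classification.

def depth_from_shift(baseline_mm, focal_px, shift_px):
    """Depth of the surface point that displaced the projected stripe,
    using the triangulation relation Z = f * b / d (assumed pinhole model)."""
    if shift_px <= 0:
        raise ValueError("stripe shift must be positive")
    return focal_px * baseline_mm / shift_px

def classify_pinch(spacings, min_change=10.0):
    """Classify a pinch from per-frame thumb-to-index spacings (pixels)."""
    if len(spacings) < 2:
        return None
    change = spacings[-1] - spacings[0]
    if change > min_change:
        return "zoom_in"      # spacing increasing, as in the example above
    if change < -min_change:
        return "zoom_out"
    return None
```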
In the above three implementations of Step 1, the wearable device is fixed on the first main body, while the first operation is performed in the first area, which lies within the first preset distance above the first main body, does not include the area covered by the wearable device, and is chosen arbitrarily. The wearable device therefore cannot determine the coordinate position of the first area relative to itself; its at least one sensor can determine only the trajectory of the first operation, so the device can respond only to the trajectory and execute simple instructions.
Taking a smart watch as an example: when the watch is fixed on the user's arm by the watchband, at least one sensor on the watch is on, and the user's finger performs a right-to-left slide above or on the surface of the arm, the watch cannot determine the coordinate position of that area relative to itself; its sensor can respond only to the sliding trajectory, so the watch executes only simple instructions such as switching to the next page or the next song.
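A trajectory-to-instruction mapping of this simple kind might look as follows; the trajectory labels and command names are assumptions made purely for the example.

```python
# Illustrative mapping from a detected trajectory to a simple instruction.
TRAJECTORY_COMMANDS = {
    "left_to_right": "next_page",
    "right_to_left": "previous_page",
}

def simple_instruction(trajectory):
    """Return the simple instruction for a trajectory, or None if unknown."""
    return TRAJECTORY_COMMANDS.get(trajectory)
```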
In another embodiment of the application, the following step may also be performed before Step 1:
projecting a content to be projected onto the first area with the projection unit of the wearable device.
Specifically, the wearable device has a projection unit that projects the content to be projected onto the first area, after which the at least one sensor of the wearable device acquires the first operation of the operating body on the first area. Since the content is projected onto the first area, the sensor in effect acquires a first operation directed at the content to be projected.
The content to be projected may be identical to or different from the content displayed on the display unit of the wearable device. When they are identical, the user can operate directly on the projected content in the first area, i.e. on the displayed content; when they differ, the user can operate either on the projected content in the first area or on the display area of the display unit.
Whether or not the projected content matches the displayed content, once the projection unit has projected the content onto the first area and the at least one sensor acquires the first operation of the operating body, the wearable device itself knows the coordinate position of each item within the projected content. Hence, when the operating body operates on a certain item, the sensor can determine the operating point of the first operation within the projected content, and the device can respond to the operating point and execute complex instructions.
Taking a smart watch as an example: when the watch is fixed on the user's arm by the watchband, at least one sensor on the watch is on and the projection unit of the watch projects the content to be projected above or onto the surface of the user's arm. When the user's finger clicks a certain item in the projected content, the watch itself knows the coordinate position of that item within the content, so its sensor can determine the operating point of the click within the projected content, and the watch can respond to the operating point and execute complex instructions such as running an application or editing text.
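Resolving an operating point against the known item coordinates amounts to a hit test. A minimal sketch follows, with an invented two-item layout; the item names and bounding-box coordinates (in the projection's own frame) are assumptions for the example.

```python
# Illustrative hit test: the device knows where each item of the projected
# content lies, so a detected touch point resolves to a command.

ITEMS = {                       # invented layout: name -> (x0, y0, x1, y1)
    "open_app":  (0, 0, 100, 50),
    "edit_text": (0, 50, 100, 100),
}

def hit_test(point, items=ITEMS):
    """Return the name of the item containing the point, or None."""
    x, y = point
    for name, (x0, y0, x1, y1) in items.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return None
```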
In another embodiment of the application, before the structured light produced by the infrared light source of the wearable device is projected by the infrared projection unit of the wearable device onto the first area, the following step may also be performed:
Project a content to be projected onto the first area using the projection unit of the wearable device, where the projected area of the projection unit is greater than or equal to the infrared projection area of the infrared projection unit.
Specifically, the projection unit of the wearable device emits visible light and projects a content to be projected onto the first area, while the infrared projection unit of the wearable device emits invisible infrared structured light, which is also projected onto the first area. Because the infrared image acquisition unit works with the infrared projection unit to capture infrared images, the projected area of the projection unit must be greater than or equal to the infrared projection area of the infrared projection unit to ensure that any operation of the operating body on the content to be projected can be captured by the infrared image acquisition unit. The infrared image acquisition unit captures infrared images, and analyzing them yields the depth information of the operating body. Because the first operation takes a certain amount of time to complete, the infrared image acquisition unit of the wearable device can capture multiple images continuously during that period; analyzing those images reveals the continuous change of the operating body's depth information. Since the wearable device itself knows the coordinate position of each item in the content to be projected, the operating point of the first operation relative to the content to be projected can be determined, that operating point responded to, and a complex instruction executed.
Taking a smart watch as the wearable device: when the smart watch is fixed to the user's arm by its watchband, the projection unit of the smart watch emits visible light to project a content to be projected onto the user's arm or the arm surface, and the infrared projection unit of the smart watch emits invisible infrared structured light, also projected onto the user's arm or the arm surface. When the user's finger performs a click operation on a particular item in the projected content, the infrared image acquisition unit captures infrared images, and analyzing them yields the depth information of the finger. Because the smart watch itself knows the coordinate position of each item in the content to be projected, the operating point of the click operation relative to the content to be projected can be determined, that operating point responded to, and a complex instruction executed, such as running an application or editing text.
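The depth-over-time analysis described above can be sketched as a small decision over a sequence of per-frame fingertip depths. This is a hypothetical illustration only; the touch threshold and the depth values are invented, and real structured-light decoding is far more involved:

```python
# Hypothetical sketch of click detection from per-frame fingertip depth values
# recovered from successive infrared images. Threshold values are illustrative.

def detect_click(depths, touch_threshold=5.0):
    """Return True if the fingertip depth sequence (mm above the arm surface)
    shows an approach-touch-release pattern."""
    touched = any(d <= touch_threshold for d in depths)
    started_high = depths[0] > touch_threshold
    ended_high = depths[-1] > touch_threshold
    return started_high and touched and ended_high

# Finger approaches the arm surface, touches it, then lifts off:
assert detect_click([40.0, 22.0, 8.0, 2.0, 3.0, 15.0, 35.0]) is True
# Finger hovers without ever touching:
assert detect_click([40.0, 30.0, 25.0, 28.0, 37.0]) is False
```

This reflects the point in the description that the first operation unfolds over time, so multiple frames must be analyzed before the operation can be classified and responded to.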
In yet another embodiment of the application, before step 1 is performed, the following step may also be performed:
Control the off-device operation detection function of the wearable device so that it is in the on state.
Here, the off-device operation detection function is the function by which the wearable device can use at least one sensor to obtain a first operation performed outside the device and respond to it. As noted above, the information processing method in the embodiments of the application is applied to a wearable device. Specifically, an application may be installed on the wearable device; when that application runs, the wearable device performs the information processing method provided by the embodiments of the invention. While the wearable device performs the method, its off-device operation detection function is in the on state.
The step of controlling the operation detection function of the wearable device to be in the on state can be implemented in the following ways:
First implementation: detect, by the at least one sensing unit, a second operation performed by the first body; judge whether the second operation is a predetermined operation, obtaining a first judgment result; when the first judgment result is yes, control the off-device operation detection function to be in the on state.
Specifically, taking a smart watch as the wearable device: when the smart watch is fixed to the user's arm by its watchband, a gravity sensor of the smart watch (for example) detects whether the arm has performed an action such as shaking, flicking from top to bottom, or flicking from front to back. Suppose the application on the smart watch, or the user, has set shaking as the predetermined action for enabling the off-device operation detection function; when the user shakes the arm, the off-device operation detection function is controlled to be in the on state.
Second implementation: obtain at least one application currently running on the wearable device; judge whether the at least one application belongs to a preset application set, obtaining a second judgment result; when the second judgment result is yes, control the off-device detection function to be in the on state.
Specifically, taking a smart watch as the wearable device: when the smart watch runs an application, it judges whether that application belongs to a preset application set, where the preset application set is the set of applications that require the off-device operation detection function. When the application the smart watch is running is one that requires the off-device operation detection function, the off-device operation detection function is controlled to be in the on state.
Third implementation: detect current environmental information about the environment the wearable device is in and/or current time information; judge whether the current environmental information and/or the current time information matches preset environmental information and/or preset time information, obtaining a third judgment result; when the third judgment result is yes, control the off-device detection function to be in the on state.
Specifically, taking a smart watch as the wearable device: detect the ambient brightness around the smart watch and/or the current time. Suppose the application on the smart watch, or the user, has set an ambient brightness lower than the backlight brightness of the display unit and/or a time between 9 p.m. and 11 p.m. as the preset environmental information and/or preset time information for enabling the off-device operation detection function; when the ambient brightness around the smart watch is lower than the backlight brightness of the display unit and/or the current time is between 9 p.m. and 11 p.m., the off-device detection function is controlled to be in the on state.
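The three enabling conditions above can be sketched as one decision function. This is a minimal, hypothetical illustration; the gesture name, application whitelist, and the 9 p.m.-11 p.m. window are taken from the examples in the description, but all identifiers are invented:

```python
# Hypothetical sketch combining the three implementations described above:
# a preset gesture, a whitelisted running application, or a preset
# environment/time condition. Names and values are illustrative only.

PRESET_GESTURE = "shake"
WHITELISTED_APPS = {"projection_keyboard", "bowling_game"}

def should_enable(gesture=None, running_app=None,
                  ambient_lux=None, backlight_lux=None, hour=None):
    if gesture == PRESET_GESTURE:                       # first implementation
        return True
    if running_app in WHITELISTED_APPS:                 # second implementation
        return True
    dark = (ambient_lux is not None and backlight_lux is not None
            and ambient_lux < backlight_lux)            # third implementation
    night = hour is not None and 21 <= hour <= 23
    return dark or night

assert should_enable(gesture="shake") is True
assert should_enable(running_app="bowling_game") is True
assert should_enable(ambient_lux=10, backlight_lux=200) is True
assert should_enable(gesture="tap", running_app="clock", hour=12) is False
```

Any one of the three conditions suffices, matching the description's treatment of the implementations as alternatives.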
In the embodiments of the application, when the current state of the content to be projected is a first state, and the content to be projected differs from the currently displayed content shown on the display unit of the wearable device, the method further comprises:
detecting and obtaining a trigger operation directed at the content to be projected;
responding to the trigger operation by adjusting the current projection state of the content to be projected from the first state to a second state;
obtaining the state change of the content to be projected based on the first state and the second state; and
when the state change reaches a predetermined condition, adjusting the current display state of the currently displayed content based on the second state.
This is illustrated below with a concrete example.
Suppose the user wants to play a bowling game. A bowling game application can be opened on the display unit, with part of the content, the pins, shown on the display unit, while the content corresponding to the bowling ball and the bowling lane is projected onto the first area by the projection unit. The user can then trigger the ball to move along the lane; this is the trigger operation directed at the content to be projected. For example, the user makes a bowling release gesture, which is detected by, for instance, the at least one sensor described above, and the ball is then controlled to move along the lane. That is, the current projection state of the content to be projected is adjusted from the first state to the second state, and the state change of the content to be projected is obtained.
When the ball reaches the end of the lane, for example the boundary of the first area, the predetermined condition is reached. At that moment the ball has arrived where the pins stand and strikes them, so the displayed content on the display unit is controlled to undergo a corresponding state change; that is, the current display state of the currently displayed content is adjusted based on the second state. In this example, because the ball strikes the pins, the display is adjusted to a state in which the pins fall.
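The projection-state / display-state interaction in this example can be sketched as a small state machine. This is a hypothetical illustration; the state names, the ball velocity, and the lane-end boundary condition are invented for the sketch:

```python
# Hypothetical sketch of the bowling example: the projected ball advances
# (projection state), and only once the predetermined condition is met does
# the display state (the pins) change. All values are illustrative.

def step_ball(position, velocity, lane_end):
    """Advance the projected ball; return (new_position, reached_end)."""
    position += velocity
    return position, position >= lane_end

def update_display(display_state, reached_end):
    """Adjust the displayed content only when the predetermined condition
    (the ball reaching the boundary of the first area) is met."""
    if reached_end and display_state == "pins_standing":
        return "pins_knocked_down"
    return display_state

pos, display = 0.0, "pins_standing"
for _ in range(5):
    pos, done = step_ball(pos, velocity=30.0, lane_end=100.0)
    display = update_display(display, done)

assert display == "pins_knocked_down"
```

The design point mirrors the description: the projection unit and display unit hold separate states, coupled only through the predetermined condition.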
These are merely examples and do not limit the invention; in practice, many kinds of interaction between the display unit and the projection unit are possible and will not be enumerated here.
Embodiment two
Based on the same inventive concept, the embodiments of the application further provide a wearable device. Because the principle by which this wearable device solves the problem is similar to that of the information processing method above, the implementation of the wearable device can refer to the implementation of the method, and repeated parts are not described again.
The embodiments of the application provide a wearable device having a fixing apparatus for fixing the wearable device to a first body. As shown in Figure 3, the wearable device further comprises:
at least one sensor 01, configured to obtain a first operation of an operating body in a first area of the first body, where the first area is within a first preset distance above the first body and does not include the region covered by the wearable device; and
a processor 02, configured to respond to the first operation and generate a first instruction.
Optionally, the at least one sensor is specifically:
a visible-light image acquisition unit, configured to obtain at least two images while the operating body performs the first operation on the first area;
and the processor is specifically configured to:
analyze the at least two images to obtain the first operation.
Optionally, the at least one sensor specifically comprises:
a first image acquisition unit, configured to obtain a first image while the operating body performs the first operation on the first area; and
a second image acquisition unit, configured to obtain a second image while the operating body performs the first operation on the first area, where the first image acquisition unit and the second image acquisition unit are at different angles relative to the fixing apparatus;
and the processor is specifically configured to:
analyze the first image and the second image to obtain the first operation.
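Where two image acquisition units are mounted at different angles, the depth of the operating body can in principle be recovered by triangulation. A minimal, hypothetical sketch of the classic pinhole-stereo relation follows; the focal length and baseline values are invented, and a real device would need calibrated parameters:

```python
# Hypothetical sketch of recovering fingertip depth from two image acquisition
# units at different angles. Values are illustrative, not from the patent.

def depth_from_disparity(x_left, x_right, focal_px=500.0, baseline_mm=20.0):
    """Pinhole stereo relation: depth = focal * baseline / disparity."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    return focal_px * baseline_mm / disparity

# Fingertip seen at pixel column 320 in one image and 300 in the other:
depth_mm = depth_from_disparity(320.0, 300.0)
assert abs(depth_mm - 500.0) < 1e-9   # 500 * 20 / 20 = 500 mm
```

Analyzing the two images together thus gives the processor the depth cue that a single visible-light image cannot provide.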
Optionally, the at least one sensor specifically comprises:
an infrared projection unit, configured to project structured light produced by an infrared light source of the wearable device onto the first area; and
an infrared image acquisition unit, configured to obtain an infrared image while the operating body performs the first operation on the first area;
and the processor is specifically configured to:
analyze the infrared image to obtain the first operation.
Optionally, the wearable device further comprises:
a projection unit, configured to project a content to be projected onto the first area.
Optionally, the projection unit is specifically configured to:
before the structured light produced by the infrared light source of the wearable device is projected by the infrared projection unit onto the first area, project a content to be projected onto the first area, where the projected area of the projection unit is greater than or equal to the infrared projection area of the infrared projection unit.
Optionally, the processor is further configured to:
before the at least one sensor of the wearable device is used to obtain the first operation of the operating body in the first area of the first body, control the off-device operation detection function of the wearable device to be in the on state.
Optionally, the processor is specifically configured to:
detect, by the at least one sensing unit, a second operation performed by the first body;
judge whether the second operation is a predetermined operation, obtaining a first judgment result; and
when the first judgment result is yes, control the off-device operation detection function to be in the on state.
Optionally, the processor is specifically configured to:
obtain at least one application currently running on the wearable device;
judge whether the at least one application belongs to a preset application set, obtaining a second judgment result; and
when the second judgment result is yes, control the off-device detection function to be in the on state.
Optionally, the wearable device further comprises:
a detection unit, configured to, when the current state of the content to be projected is a first state and the content to be projected differs from the currently displayed content shown on the display unit of the wearable device, detect and obtain a trigger operation directed at the content to be projected;
and the processor is configured to:
respond to the trigger operation by adjusting the current projection state of the content to be projected from the first state to a second state;
obtain the state change of the content to be projected based on the first state and the second state; and
when the state change reaches a predetermined condition, adjust the current display state of the currently displayed content based on the second state.
The one or more technical solutions provided in the embodiments of the application above have at least the following technical effects or advantages:
1. In the embodiments of the invention, at least one sensor of the wearable device is used to obtain a first operation of an operating body in a first area of the first body, where the first area is within a first preset distance above the first body and does not include the region covered by the wearable device; the first operation is responded to, and a first instruction is generated. It follows that the wearable device in the embodiments can respond to an operation in a first area of the first body wearing the wearable device, where that area is within a first preset distance above the first body and does not include the region the wearable device covers. This solves the technical problem in the prior art that a wearable device can only respond to operations on its own display unit, expands the interaction area of the wearable device, and improves the user experience.
2. In the embodiments of the invention, the at least one sensor of the wearable device may be: a visible-light image acquisition unit; or a first image acquisition unit and a second image acquisition unit at different angles relative to the fixing apparatus that fixes the wearable device to the first body; or an infrared projection unit and an infrared image acquisition unit. Multiple ways of implementing the information processing method of the invention are thus provided.
Those skilled in the art will appreciate that the embodiments of the application may be provided as a method, a system, or a computer program product. Accordingly, the application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware. Moreover, the application may take the form of a computer program product embodied on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, and optical storage) containing computer-usable program code.
The application is described with reference to flowcharts and/or block diagrams of methods, devices (systems), and computer program products according to the embodiments of the application. It should be understood that each flow and/or block in the flowcharts and/or block diagrams, and combinations of flows and/or blocks, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general-purpose computer, a special-purpose computer, an embedded processor, or other programmable data processing device to produce a machine, such that the instructions executed by the processor of the computer or other programmable data processing device produce a device for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be stored in a computer-readable memory capable of directing a computer or other programmable data processing device to work in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device that realizes the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
These computer program instructions may also be loaded onto a computer or other programmable data processing device, such that a sequence of operation steps is performed on the computer or other programmable device to produce a computer-implemented process, whereby the instructions executed on the computer or other programmable device provide steps for realizing the functions specified in one or more flows of the flowcharts and/or one or more blocks of the block diagrams.
Specifically, the information processing method in the embodiments of the application is applied to a wearable device having a fixing apparatus for fixing the wearable device to a first body. Computer program instructions corresponding to the method can be stored on storage media such as optical discs, hard disks, or USB flash drives. When the computer program instructions on the storage medium corresponding to the information processing method are read or executed by a wearable device, the following steps are included:
using at least one sensor of the wearable device to obtain a first operation of an operating body in a first area of the first body, where the first area is within a first preset distance above the first body and does not include the region covered by the wearable device; and
responding to the first operation, and generating a first instruction.
Optionally, the computer instructions stored on the storage medium corresponding to the step of using at least one sensor of the wearable device to obtain a first operation of an operating body in a first area of the first body, when executed, specifically comprise the steps of:
using a visible-light image acquisition unit of the wearable device to obtain at least two images while the operating body performs the first operation on the first area; and
analyzing the at least two images to obtain the first operation.
Optionally, the computer instructions stored on the storage medium corresponding to the step of using at least one sensor of the wearable device to obtain a first operation of an operating body in a first area of the first body, when executed, specifically comprise the steps of:
using a first image acquisition unit of the wearable device to obtain a first image while the operating body performs the first operation on the first area, and using a second image acquisition unit of the wearable device to obtain a second image while the operating body performs the first operation on the first area, where the first image acquisition unit and the second image acquisition unit are at different angles relative to the fixing apparatus; and
analyzing the first image and the second image to obtain the first operation.
Optionally, the computer instructions stored on the storage medium corresponding to the step of using at least one sensor of the wearable device to obtain a first operation of an operating body in a first area of the first body, when executed, specifically comprise the steps of:
using an infrared projection unit of the wearable device to project structured light produced by an infrared light source of the wearable device onto the first area;
using an infrared image acquisition unit of the wearable device to obtain an infrared image while the operating body performs the first operation on the first area; and
analyzing the infrared image to obtain the first operation.
Optionally, the storage medium also stores other computer instructions, which are executed before the computer instructions corresponding to the step of using at least one sensor of the wearable device to obtain a first operation of an operating body in a first area of the first body, and which, when executed, comprise the step of:
using a projection unit of the wearable device to project a content to be projected onto the first area.
Optionally, the storage medium also stores other computer instructions, which are executed before the computer instructions corresponding to the step of using the infrared projection unit of the wearable device to project structured light produced by the infrared light source of the wearable device onto the first area, and which, when executed, comprise the step of:
using the projection unit of the wearable device to project a content to be projected onto the first area, where the projected area of the projection unit is greater than or equal to the infrared projection area of the infrared projection unit.
Optionally, the storage medium also stores other computer instructions, which are executed before the computer instructions corresponding to the step of using at least one sensor of the wearable device to obtain a first operation of an operating body in a first area of the first body, and which, when executed, comprise the step of:
controlling the off-device operation detection function of the wearable device to be in the on state.
Optionally, the computer instructions stored on the storage medium corresponding to the step of controlling the operation detection function of the wearable device to be in the on state, when executed, specifically comprise the steps of:
detecting, by the at least one sensing unit, a second operation performed by the first body;
judging whether the second operation is a predetermined operation, obtaining a first judgment result; and
when the first judgment result is yes, controlling the off-device operation detection function to be in the on state.
Optionally, the computer instructions stored on the storage medium corresponding to the step of controlling the operation detection function of the wearable device to be in the on state, when executed, specifically comprise the steps of:
obtaining at least one application currently running on the wearable device;
judging whether the at least one application belongs to a preset application set, obtaining a second judgment result; and
when the second judgment result is yes, controlling the off-device detection function to be in the on state.
Optionally, the storage medium also stores other computer instructions, which are executed when the current state of the content to be projected is a first state and the content to be projected differs from the currently displayed content shown on the display unit of the wearable device, and which, when executed, comprise the steps of:
detecting and obtaining a trigger operation directed at the content to be projected;
responding to the trigger operation by adjusting the current projection state of the content to be projected from the first state to a second state;
obtaining the state change of the content to be projected based on the first state and the second state; and
when the state change reaches a predetermined condition, adjusting the current display state of the currently displayed content based on the second state.
Although preferred embodiments of the invention have been described, those skilled in the art, once aware of the basic inventive concept, can make further changes and modifications to these embodiments. The appended claims are therefore intended to be construed as covering the preferred embodiments and all changes and modifications that fall within the scope of the invention.
Obviously, those skilled in the art can make various changes and variations to the invention without departing from its spirit and scope. Thus, if these modifications and variations of the invention fall within the scope of the claims of the invention and their technical equivalents, the invention is also intended to encompass them.

Claims (20)

1. an information processing method, is applied to a Wearable, and described Wearable has a stationary installation, and for described Wearable is fixed to the first main body, described method comprises:
At least one sensor of described Wearable is utilized to obtain the first operation of the first area of an operating body in described first main body, wherein, described first area is the first preset distance above described first main body, and does not comprise the region that described Wearable covers;
Respond described first operation, and generate the first instruction.
2. the method for claim 1, is characterized in that, described the first operation utilizing at least one sensor of described Wearable to obtain the first area of an operating body in described first main body, is specially:
Utilize the visible images collecting unit of described Wearable obtain described operating body carry out on described first area described first operation time at least two images;
Described at least two images are analyzed, obtains described first operation.
3. the method for claim 1, is characterized in that, described the first operation utilizing at least one sensor of described Wearable to obtain the first area of an operating body in described first main body, is specially:
Utilize the first image acquisition units of described Wearable obtain described operating body carry out on described first area described first operation time the first image, and utilize the second image acquisition units of described Wearable obtain described operating body carry out on described first area described first operation time the second image, wherein, described first image acquisition units and described second image acquisition units have different angles relative to described stationary installation;
Described first image and the second image are analyzed, obtains described first operation.
4. the method for claim 1, is characterized in that, described the first operation utilizing at least one sensor of described Wearable to obtain the first area of an operating body in described first main body, is specially:
Utilize structured light projection that the infrared light supply of described Wearable produces by the infrared projection unit of described Wearable to described first area;
Utilize the infrared image acquisition unit of described Wearable obtain described operating body carry out on described first area described first operation time infrared image;
Described infrared image is analyzed, obtains described first operation.
5. the method as described in claim arbitrary in claim 1-3, is characterized in that, described utilize at least one sensor of described Wearable obtain the first area of an operating body in described first main body first operation before, described method also comprises:
The projecting cell of described Wearable is utilized one content to be projected to be projected to described first area.
6. method as claimed in claim 4, it is characterized in that, before the described structured light projection utilizing the infrared projection unit of described Wearable to be produced by the infrared light supply of described Wearable to described first area, described method also comprises:
Utilize the projecting cell of described Wearable one content to be projected to be projected to described first area, wherein, the projected area of described projecting cell is more than or equal to the infrared projection area of described infrared projection unit.
7. the method for claim 1, is characterized in that, described utilize at least one sensor of described Wearable obtain the first area of an operating body in described first main body first operation before, described method also comprises:
The outer operation detection function of equipment controlling described Wearable is in opening.
8. method as claimed in claim 7, it is characterized in that, the operation detection function of the described Wearable of described control is in opening, is specially:
The second operation undertaken by described first main body of described at least one sensing unit detection;
Judge whether described second operation is a predetermined registration operation, obtains the first judged result;
When described first judged result is for being, controls the outer operation detection function of described equipment and being in opening.
9. method as claimed in claim 7, is characterized in that, the outer operation detection function of equipment of the described Wearable of described control is in opening, is specially:
Obtain current at least one application program being in running status of described Wearable;
Judge whether at least one application program described is the application program that default application program is concentrated, and obtains the second judged result;
When described second judged result is for being, controls described vitro detection function and being in opening.
10. The method of claim 6, wherein, when a current state of the content to be projected is a first state and the content to be projected differs from content currently displayed by a display unit of the wearable device, the method further comprises:
detecting and acquiring a trigger operation directed at the content to be projected;
responding to the trigger operation by adjusting a current projection state of the content to be projected from the first state to a second state;
obtaining, based on the first state and the second state, a state change of the content to be projected;
when the state change reaches a predetermined condition, adjusting a current display state of the currently displayed content based on the second state.
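The flow in claim 10 synchronizes the on-device display with the projection only when the projection's state change crosses a threshold. A sketch under the assumption that the "state" is a scroll offset and the "predetermined condition" is a minimum offset change (both modeling choices are illustrative, not from the claims):

```python
# Sketch of claim 10: the projection and the on-device display show
# different content; a trigger operation moves the projection from a
# first state to a second state, and the display is adjusted only when
# the state change reaches a predetermined condition. States are
# modeled as scroll offsets for illustration.

SYNC_THRESHOLD = 50  # predetermined condition: minimum offset change

def handle_trigger(projection_state: int, display_state: int,
                   new_state: int):
    """Return the updated (projection_state, display_state) pair."""
    state_change = abs(new_state - projection_state)  # first vs. second state
    projection_state = new_state
    if state_change >= SYNC_THRESHOLD:
        display_state = new_state  # adjust display based on second state
    return projection_state, display_state

p, d = handle_trigger(0, 0, 20)   # small change: display untouched
assert (p, d) == (20, 0)
p, d = handle_trigger(p, d, 100)  # change of 80 >= threshold: synchronize
assert (p, d) == (100, 100)
```

The threshold keeps the small on-device display from churning on every minor projected-content adjustment.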
11. A wearable device, having a fixing device for fixing the wearable device to a first body, the wearable device further comprising:
at least one sensor, configured to acquire a first operation of an operating body in a first area on the first body, wherein the first area is within a first preset distance above the first body and excludes the region covered by the wearable device;
a processor, configured to respond to the first operation and generate a first instruction.
12. The wearable device of claim 11, wherein the at least one sensor is specifically:
a visible light image acquisition unit, configured to acquire at least two images while the operating body performs the first operation on the first area;
and the processor is specifically configured to:
analyze the at least two images to obtain the first operation.
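One way to realize the two-image analysis of claim 12 is simple frame differencing: pixels that change between consecutive frames mark the moving operating body. A minimal sketch on tiny grayscale grids (the representation and threshold are assumptions for illustration):

```python
# Sketch of claim 12: a single visible-light camera captures at least
# two images during the first operation; differencing consecutive
# frames localizes the moving operating body. Frames are small
# grayscale grids (lists of lists) for illustration.

DIFF_THRESHOLD = 30  # per-pixel intensity change counted as motion

def motion_pixels(frame_a, frame_b):
    """Return (row, col) pixels whose intensity changed between frames."""
    moved = []
    for r, (row_a, row_b) in enumerate(zip(frame_a, frame_b)):
        for c, (a, b) in enumerate(zip(row_a, row_b)):
            if abs(a - b) >= DIFF_THRESHOLD:
                moved.append((r, c))
    return moved

frame1 = [[0, 0, 0], [0, 0, 0]]
frame2 = [[0, 200, 0], [0, 0, 0]]  # fingertip appears at (0, 1)
assert motion_pixels(frame1, frame2) == [(0, 1)]
```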
13. The wearable device of claim 11, wherein the at least one sensor specifically comprises:
a first image acquisition unit, configured to acquire a first image while the operating body performs the first operation on the first area;
a second image acquisition unit, configured to acquire a second image while the operating body performs the first operation on the first area, wherein the first image acquisition unit and the second image acquisition unit are at different angles relative to the fixing device;
and the processor is specifically configured to:
analyze the first image and the second image to obtain the first operation.
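With two image acquisition units at different angles, claim 13 permits triangulating the fingertip's depth, which distinguishes a touch on the skin from a hover above it. A pinhole-stereo sketch; the focal length, baseline, skin depth, and tolerance are assumed illustrative values, not parameters from the patent:

```python
# Sketch of claim 13: two cameras at different angles let the fingertip
# depth be recovered by stereo triangulation (Z = f * B / disparity),
# so a touch on the first body can be told apart from a hover.
# All constants below are illustrative assumptions.

FOCAL_PX = 500.0     # focal length in pixels
BASELINE_M = 0.04    # distance between the two cameras, metres
SKIN_DEPTH_M = 0.10  # expected depth of the skin surface
TOUCH_TOL_M = 0.005  # tolerance for declaring a touch

def depth_from_disparity(x_left_px: float, x_right_px: float) -> float:
    """Pinhole stereo: depth Z = f * B / disparity."""
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("point must be in front of both cameras")
    return FOCAL_PX * BASELINE_M / disparity

def is_touch(x_left_px: float, x_right_px: float) -> bool:
    z = depth_from_disparity(x_left_px, x_right_px)
    return abs(z - SKIN_DEPTH_M) <= TOUCH_TOL_M

assert is_touch(320, 120)     # disparity 200 -> Z = 0.10 m: touch
assert not is_touch(320, 70)  # disparity 250 -> Z = 0.08 m: hover
```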
14. The wearable device of claim 11, wherein the at least one sensor specifically comprises:
an infrared projection unit, configured to project structured light produced by an infrared light source of the wearable device onto the first area;
an infrared image acquisition unit, configured to acquire an infrared image while the operating body performs the first operation on the first area;
and the processor is specifically configured to:
analyze the infrared image to obtain the first operation.
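For the structured-light variant of claim 14, one common analysis is to compare the observed infrared dot pattern against a reference image of the bare surface: dots landing on a finger shift, and the shifted region localizes the operation. A sketch with one-dimensional dot positions and an illustrative displacement threshold (both assumptions, not details from the patent):

```python
# Sketch of claim 14: an infrared dot pattern is projected onto the
# first area; dots that land on an intruding fingertip shift relative
# to a reference capture of the bare surface, and the shifted region
# localizes the operation. Threshold and 1-D positions are illustrative.

SHIFT_THRESHOLD_PX = 3.0  # dot displacement indicating a finger

def find_operation(reference_dots, observed_dots):
    """Return indices of dots displaced by the operating body, comparing
    observed dot x-positions with the reference pattern."""
    touched = []
    for i, (ref_x, obs_x) in enumerate(zip(reference_dots, observed_dots)):
        if abs(obs_x - ref_x) >= SHIFT_THRESHOLD_PX:
            touched.append(i)
    return touched

reference = [10.0, 20.0, 30.0, 40.0]
observed  = [10.1, 24.5, 34.8, 40.2]  # dots 1 and 2 displaced
assert find_operation(reference, observed) == [1, 2]
```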
15. The wearable device of any one of claims 11-13, wherein the wearable device further comprises:
a projection unit, configured to project content to be projected onto the first area.
16. The wearable device of claim 14, wherein the projection unit is specifically configured to:
project content to be projected onto the first area before the infrared projection unit of the wearable device projects the structured light produced by the infrared light source of the wearable device onto the first area, wherein a projection area of the projection unit is greater than or equal to an infrared projection area of the infrared projection unit.
17. The wearable device of claim 11, wherein the processor is further configured to:
control, before the at least one sensor of the wearable device acquires the first operation of the operating body in the first area on the first body, a device-external operation detection function of the wearable device to be in an on state.
18. The wearable device of claim 17, wherein the processor is specifically configured to:
detect, with the at least one sensor, a second operation performed on the first body;
judge whether the second operation is a predetermined operation, to obtain a first judgment result;
when the first judgment result is yes, control the device-external operation detection function to be in the on state.
19. The wearable device of claim 17, wherein the processor is specifically configured to:
obtain at least one application program currently running on the wearable device;
judge whether the at least one application program belongs to a preset application program set, to obtain a second judgment result;
when the second judgment result is yes, control the device-external operation detection function to be in the on state.
20. The wearable device of claim 16, wherein the wearable device further comprises:
a detection unit, configured to detect and acquire, when a current state of the content to be projected is a first state and the content to be projected differs from content currently displayed by a display unit of the wearable device, a trigger operation directed at the content to be projected;
and the processor is configured to:
respond to the trigger operation by adjusting a current projection state of the content to be projected from the first state to a second state;
obtain, based on the first state and the second state, a state change of the content to be projected;
when the state change reaches a predetermined condition, adjust a current display state of the currently displayed content based on the second state.
CN201410302909.2A 2014-06-27 2014-06-27 Information processing method and wearable device Active CN105204613B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410302909.2A CN105204613B (en) 2014-06-27 2014-06-27 Information processing method and wearable device


Publications (2)

Publication Number Publication Date
CN105204613A true CN105204613A (en) 2015-12-30
CN105204613B CN105204613B (en) 2019-02-05

Family

ID=54952351

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410302909.2A Active CN105204613B (en) Information processing method and wearable device

Country Status (1)

Country Link
CN (1) CN105204613B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106210568A (en) * 2016-07-15 2016-12-07 深圳奥比中光科技有限公司 Image processing method and device
CN107450672A (en) * 2017-09-19 2017-12-08 曾泓程 Wrist-worn intelligent apparatus with high recognition rate
CN108200419A (en) * 2018-03-30 2018-06-22 联想(北京)有限公司 Projection method and electronic equipment

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120249409A1 (en) * 2011-03-31 2012-10-04 Nokia Corporation Method and apparatus for providing user interfaces
CN102985896A (en) * 2010-07-16 2013-03-20 高通股份有限公司 Methods and systems for interacting with projected user interface
US20130086531A1 (en) * 2011-09-29 2013-04-04 Kabushiki Kaisha Toshiba Command issuing device, method and computer program product
CN203178918U (en) * 2013-03-21 2013-09-04 联想(北京)有限公司 Electronic device
US20140055352A1 (en) * 2012-11-01 2014-02-27 Eyecam Llc Wireless wrist computing and control device and method for 3D imaging, mapping, networking and interfacing
TW201419037A (en) * 2012-11-09 2014-05-16 Nat Applied Res Laboratories Wearable multimedia device
CN103827780A (en) * 2011-07-12 2014-05-28 谷歌公司 Methods and systems for a virtual input device
WO2014081181A1 (en) * 2012-11-20 2014-05-30 Samsung Electronics Co., Ltd. Wearable electronic device


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106210568A (en) * 2016-07-15 2016-12-07 深圳奥比中光科技有限公司 Image processing method and device
CN107450672A (en) * 2017-09-19 2017-12-08 曾泓程 Wrist-worn intelligent apparatus with high recognition rate
CN107450672B (en) * 2017-09-19 2024-03-29 曾泓程 Wrist type intelligent device with high recognition rate
CN108200419A (en) * 2018-03-30 2018-06-22 联想(北京)有限公司 Projection method and electronic equipment
CN108200419B (en) * 2018-03-30 2020-09-25 联想(北京)有限公司 Projection method and electronic equipment

Also Published As

Publication number Publication date
CN105204613B (en) 2019-02-05

Similar Documents

Publication Publication Date Title
US10101873B2 (en) Portable terminal having user interface function, display method, and computer program
US10082886B2 (en) Automatic configuration of an input device based on contextual usage
US9329714B2 (en) Input device, input assistance method, and program
US11941181B2 (en) Mechanism to provide visual feedback regarding computing system command gestures
EP3742263A1 (en) Terminal and method for controlling the same based on spatial interaction
KR20150014083A (en) Method For Sensing Inputs of Electrical Device And Electrical Device Thereof
KR20150130431A (en) Enhancing touch inputs with gestures
KR20140005141A (en) Three dimensional user interface effects on a display by using properties of motion
EP2558924B1 (en) Apparatus, method and computer program for user input using a camera
KR20140040246A (en) Gesture-controlled technique to expand interaction radius in computer vision applications
US10474324B2 (en) Uninterruptable overlay on a display
US9400575B1 (en) Finger detection for element selection
CN105204613A (en) Information processing method and wearable equipment
CN105808129B (en) Method and device for quickly starting software function by using gesture
WO2021004413A1 (en) Handheld input device and blanking control method and apparatus for indication icon of handheld input device
KR20130124139A (en) Control method of terminal by using spatial interaction
CN111433832B (en) Entity globe with touch function, display terminal and map display method
CN110837295A (en) Handheld control equipment and tracking and positioning method, equipment and system thereof
CN104063037A (en) Operating command recognition method and device as well as wearable electronic equipment
US9898183B1 (en) Motions for object rendering and selection
CN105786360A (en) Method and device for demonstrating PPT file with mobile terminal
Colaço Sensor design and interaction techniques for gestural input to smart glasses and mobile devices
CN108521497A (en) Terminal control method, control device, terminal and readable storage medium
JP6686319B2 (en) Image projection device and image display system
CN109284051B (en) Method and terminal for realizing application interaction

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant