CN103809751A - Information sharing method and device - Google Patents

Information sharing method and device

Info

Publication number
CN103809751A
CN103809751A (application CN201410049125.3A)
Authority
CN
China
Prior art keywords
region
shared
content
instruction
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201410049125.3A
Other languages
Chinese (zh)
Inventor
周梁
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Zhigu Ruituo Technology Services Co Ltd
Original Assignee
Beijing Zhigu Ruituo Technology Services Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Zhigu Ruituo Technology Services Co Ltd
Priority to CN201410049125.3A
Publication of CN103809751A
Legal status: Pending

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the invention discloses an information sharing method and device. The method includes: presenting at least one piece of content in a virtual interaction region; determining, according to a region-determining instruction, a sharing region in the virtual interaction region corresponding to at least one external device; determining, according to a sharing control instruction, at least one piece of shared content among the content; presenting the shared content in the sharing region; and sending sharing information containing the shared content to the external device. This technique enables convenient and intuitive information sharing between an external device and devices, such as wearable devices, on which complex interaction is difficult to implement.

Description

Information sharing method and information sharing apparatus
Technical field
The application relates to the field of information technology, and in particular to an information sharing method and an information sharing apparatus.
Background technology
With the expanding coverage of wireless networks and increasing network speeds, wearable devices are becoming a new channel for acquiring and sharing personal data. However, because of portability requirements, wearable devices are generally small, so their human-machine interaction is usually realized through interfaces such as buttons, touch pads, or voice control. When sharing the local content of a wearable device through these interfaces, the user must specify which content to share and which external device to share it with; the operation therefore involves many steps, fails to meet the user's demand for convenient interoperation with wearable devices, and makes the sharing process unintuitive.
Summary of the invention
The technical problem to be solved by the application is to provide an information sharing technique that conveniently and intuitively realizes information sharing between external devices and devices, such as wearable devices, on which complex interaction is difficult to implement.
In a first aspect, the application provides an information sharing method, comprising:
presenting at least one piece of content in a virtual interaction region;
determining, according to a region-determining instruction, a sharing region in the virtual interaction region corresponding to at least one external device;
determining, according to a sharing control instruction, at least one piece of shared content among the at least one piece of content, presenting the at least one piece of shared content in the sharing region, and sending sharing information containing the at least one piece of shared content to the at least one external device.
In a second aspect, the application provides an information sharing method, comprising:
determining, according to a region-determining instruction, a sharing region corresponding to at least one external device;
receiving sharing information sent by at least one of the at least one external device and containing at least one piece of shared content;
presenting the at least one piece of shared content in the sharing region according to the sharing information.
In a third aspect, the application provides an information sharing apparatus, comprising:
a presenting module, configured to present at least one piece of content in a virtual interaction region;
a sharing region determination module, configured to determine, according to a region-determining instruction, a sharing region in the virtual interaction region corresponding to at least one external device;
a shared content determination module, configured to determine, according to a sharing control instruction, at least one piece of shared content among the at least one piece of content;
the presenting module being further configured to present the at least one piece of shared content in the sharing region;
a communication module, configured to send sharing information containing the at least one piece of shared content to the at least one external device.
In a fourth aspect, the application provides an information sharing apparatus, comprising:
a sharing region determination module, configured to determine, according to a region-determining instruction, a sharing region corresponding to at least one external device;
a communication module, configured to receive sharing information sent by at least one of the at least one external device and containing at least one piece of shared content;
a presenting module, configured to present the at least one piece of shared content in the sharing region according to the sharing information.
In a fifth aspect, the application provides a wearable device comprising the information sharing apparatus of the third or fourth aspect above.
At least one embodiment of the application sets up, in a virtual interaction region outside the device, a sharing region corresponding to an external device, and shares content by moving it into that sharing region, thereby conveniently and intuitively realizing information sharing between the current device and the external device. In addition, at least one embodiment sets a corresponding sharing permission for the sharing region to control the sharing permission of the content in the region, so that the external device corresponding to the sharing region can only share the content in the region under the corresponding permission, conveniently realizing privacy protection for the shared content.
Brief description of the drawings
Fig. 1 is a flow chart of an information sharing method according to an embodiment of the application;
Fig. 2 is a flow chart of another information sharing method according to an embodiment of the application;
Fig. 3 is a flow chart of yet another information sharing method according to an embodiment of the application;
Fig. 4 is a schematic structural block diagram of an information sharing apparatus according to an embodiment of the application;
Fig. 5 is a schematic structural block diagram of another information sharing apparatus according to an embodiment of the application;
Fig. 6a is a schematic structural block diagram of a presenting module of an information sharing apparatus according to an embodiment of the application;
Fig. 6b is a schematic structural block diagram of a presenting module of another information sharing apparatus according to an embodiment of the application;
Fig. 6c is a schematic structural block diagram of a second instruction acquisition module of an information sharing apparatus according to an embodiment of the application;
Fig. 6d is a schematic structural block diagram of a first instruction acquisition module of an information sharing apparatus according to an embodiment of the application;
Fig. 7 is a schematic structural block diagram of yet another information sharing apparatus according to an embodiment of the application;
Fig. 8 is a schematic structural block diagram of yet another information sharing apparatus according to an embodiment of the application;
Fig. 9a and Fig. 9b are schematic structural block diagrams of two wearable devices according to embodiments of the application;
Fig. 10 is a schematic diagram of an application scenario of an embodiment of the application.
Detailed description of the embodiments
The embodiments of the application are described in further detail below with reference to the accompanying drawings (in which identical reference numbers denote identical elements) and the embodiments. The following embodiments illustrate the application but do not limit its scope.
The interactive interfaces currently available on wearable devices do not satisfy the demand for convenient interoperation well. Taking voice control and touch-sensitive devices as examples: voice input suits simple, well-defined user instructions, but not interaction with virtual objects; touch input, like voice input, is stable but suits only simple tasks. For content sharing between a wearable device and an external device, and in particular between wearable devices, these interfaces are neither natural nor efficient. Therefore, as shown in Fig. 1, an embodiment of the application provides an information sharing method, comprising:
S110: presenting at least one piece of content in a virtual interaction region;
S120: determining, according to a region-determining instruction, a sharing region in the virtual interaction region corresponding to at least one external device;
S130: determining, according to a sharing control instruction, at least one piece of shared content among the at least one piece of content, presenting the at least one piece of shared content in the sharing region, and sending sharing information containing the at least one piece of shared content to the at least one external device.
In the embodiments of the application, a sharing region corresponding to an external device is set in a virtual interaction region, and the content to be shared with the external device is presented in the sharing region. The user can thus easily control which content is presented in the sharing region (that is, which content is to be shared with the external device), and can share information naturally, intuitively, and efficiently. This is especially suitable for information sharing between a wearable device and an external device, in particular between wearable devices.
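The flow of steps S110 to S130 can be sketched as a small state machine. This is a minimal illustration only; all names (`SharingSession`, `share`, the content and device identifiers) are hypothetical and not part of the patent's implementation.

```python
class SharingSession:
    def __init__(self):
        self.contents = []         # S110: content presented in the virtual interaction region
        self.sharing_regions = {}  # S120: device id -> shared content in that region

    def present(self, content):
        """S110: present a piece of content in the virtual interaction region."""
        self.contents.append(content)

    def determine_sharing_region(self, device_id):
        """S120: set up a sharing region bound to an external device."""
        self.sharing_regions[device_id] = []

    def share(self, content, device_id):
        """S130: move content into the sharing region and build the sharing information."""
        if content not in self.contents:
            raise ValueError("content is not presented in the region")
        self.sharing_regions[device_id].append(content)
        return {"to": device_id, "shared_content": [content]}

session = SharingSession()
session.present("photo.jpg")
session.determine_sharing_region("device-B")
info = session.share("photo.jpg", "device-B")
```

The sharing information returned by `share` stands in for the message sent to the external device in S130.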
As shown in Fig. 2, in a possible implementation of the embodiments of the application, the information sharing method comprises:
S110: presenting at least one piece of content in a virtual interaction region;
S120: determining, according to a region-determining instruction, a sharing region in the virtual interaction region corresponding to at least one external device;
S140: setting a sharing permission for the sharing region according to a permission control instruction;
S130: determining, according to a sharing control instruction, at least one piece of shared content among the at least one piece of content, and presenting the at least one piece of shared content in the sharing region, wherein the at least one piece of shared content is subject to the sharing permission of the sharing region, and sending sharing information containing the at least one piece of shared content and the sharing permission to the at least one external device.
On the basis of the embodiment shown in Fig. 1, this implementation adds a step of setting a sharing permission for the sharing region, so that the shared content corresponding to the sharing region is subject to the sharing permission, and the external device corresponding to the sharing region can only view, save, modify, or otherwise operate on the shared content under the corresponding permission, improving the security of information sharing.
In this implementation, when the user needs to share information with different external devices, multiple sharing regions corresponding to the different external devices can be set, and a separate sharing permission can be set for each sharing region.
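Keeping a separate permission per region, as described above, can be sketched with a simple mapping from device to region state. The permission labels and device identifiers are illustrative assumptions, not the patent's vocabulary.

```python
READ_ONLY = "read-only"
SAVE_COPY = "save-local-copy"

# one sharing region per external device, each with its own permission (S140)
regions = {
    "device-B": {"permission": READ_ONLY, "contents": []},
    "device-C": {"permission": SAVE_COPY, "contents": []},
}

def set_permission(device_id, permission):
    """Set the sharing permission of one region without touching the others."""
    regions[device_id]["permission"] = permission

set_permission("device-B", SAVE_COPY)
# device-C's region is unaffected
```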
As can be seen, the technical solution of this implementation improves the convenience and intuitiveness of the sharing operation on the one hand, and the security of information sharing on the other.
The steps of the embodiments of the application are further illustrated by the following implementations:
In a possible implementation, before step S110 the method further comprises: determining the virtual interaction region.
In the embodiments of the application, the virtual interaction region can be determined in several ways, for example:
1) Determining a predefined region as the virtual interaction region.
For example, a region of predetermined shape and size (for example a 30 cm x 20 cm rectangle) located directly in front of the user's line of sight at a certain distance from the eyes (for example 35 cm) is set in advance as the virtual interaction region.
Of course, the direction, distance, and size of the virtual interaction region relative to the user's eyes can be adjusted according to the user's needs or the user's own characteristics.
2) Determining the virtual interaction region according to a user instruction.
The user instruction here can be, for example, a hand gesture: the user traces a rectangular region with a finger in front of the eyes, and that rectangle is determined as the virtual interaction region according to the gesture. Of course, those skilled in the art will understand that the user instruction can also be obtained through other interactions, for example gaze interaction, in which the user instruction is obtained by tracking the position of the user's gaze point. In addition, the user instruction can also be input by the user through an input device such as a button or a touch pad, for example by entering the key parameters of the virtual interaction region.
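The two options above can be sketched as follows: option 1) returns fixed, preset region parameters, while option 2) derives a region from the bounding box of a traced fingertip path. Units, defaults, and function names are illustrative assumptions.

```python
def predefined_region(distance_cm=35, width_cm=30, height_cm=20):
    """Option 1): a fixed region straight ahead of the user's eyes."""
    return {"distance": distance_cm, "width": width_cm, "height": height_cm}

def region_from_trace(points):
    """Option 2): bounding box of a fingertip trace, as a list of (x, y) points."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs), max(ys))

trace = [(2, 1), (12, 1), (12, 9), (2, 9)]   # user draws a rectangle
box = region_from_trace(trace)
```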
S110: presenting at least one piece of content in the virtual interaction region.
In this implementation, a virtual interaction region is provided to facilitate interaction between the device and the user and to make information sharing more convenient.
In the embodiments of the application, presenting the content in step S110 can be, for example:
displaying an image corresponding to each piece of content, the image here being, for example, an icon, a thumbnail, or a title. Of course, when the content is a picture or a text message, the image can also be the picture itself or the text of the message. The purpose is to let the user intuitively and easily identify and select the content to be shared.
Here, in addition to the shareable content described below, the at least one piece of content can also include the border position information of the virtual interaction region and other information; when presenting, the border of the virtual interaction region and this information can be displayed.
In a possible implementation of the embodiments of the application, there are several ways to present the at least one piece of content in the virtual interaction region, for example one of the following:
a) Projecting the image corresponding to the at least one piece of content directly onto the fundus of the user's eye such that it is imaged in the virtual interaction region.
In this implementation, a projection device of a head-mounted wearable device can, for example, project the image corresponding to the at least one piece of content directly onto the fundus of the user's eye, with the projection parameters of the wearable device adjusted so that, after passing through the optical system of the wearable device (between the projection device and the user's eye) and the optical system of the eye itself, the projected image forms a clear image in the virtual interaction region. That is, according to the position of the virtual interaction region relative to the user's eye (including its distance and direction) and the optical parameters of the eye's optical system, the projection parameters of the wearable device are adjusted (these can include, for example, parameters of the projected image and/or optical parameters of the wearable device's optical system) so that the user sees, in the virtual interaction region, an image of the projected content meeting at least one set clarity standard. Here, the position of the virtual interaction region relative to the user's eye can be obtained by detecting the gaze point of the user's eye (gaze-point detection is prior art and is not described further); in addition, when the virtual interaction region lies on a physical surface, the position of the physical surface relative to a depth sensor can also be obtained by the depth sensor, from which the position of the virtual interaction region relative to the user's eye is further obtained.
When adjusting the projection parameters, on the one hand the parameters of the projected image can be adjusted, for example determining the size of the projected image according to the position of the virtual interaction region relative to the user's eye (when the virtual interaction region lies on a physical surface, the deformation parameters of the image may also need to be adjusted according to the angle between the physical surface and the user's line of sight); on the other hand, the optical parameters of the wearable device's optical system can be adjusted: for example, when the wearable device has a focal-length-adjustable lens module between the projection device and the fundus of the user's eye, the adjustment of the projection parameters can also include adjusting the focal length of the lens module.
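One of the image-side adjustments above, sizing the projected image according to the region's distance, follows from similar triangles: to keep the same visual angle, the image width scales linearly with distance. The numbers below are examples only, not values from the patent.

```python
def scaled_width(base_width_cm, base_distance_cm, new_distance_cm):
    """Width that subtends the same visual angle at a new distance
    (similar triangles: width / distance is constant)."""
    return base_width_cm * new_distance_cm / base_distance_cm

# a region 30 cm wide at 35 cm keeps the same visual angle at 70 cm
# if the projected image is doubled in width
w = scaled_width(30.0, 35.0, 70.0)
```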
This implementation can make the content presented in the virtual interaction region visible only to the user, further improving privacy protection during information sharing. In addition, in this implementation the virtual interaction region need not be realized on a physical surface, which makes information interaction more flexible and reduces the constraints imposed by the actual environment.
b) Projecting the image corresponding to the at least one piece of content onto the virtual interaction region.
In this implementation, a projection device can project the image directly onto the virtual interaction region, which must then lie on a physical surface that can display the projected content (for example a desktop, a palm, or a wall). In this implementation, the user's action instructions in the virtual interaction region are easier to recognize, and the interaction is more ergonomic for the user and suitable for long-term use.
As can be seen, the implementations of the embodiments of the application can realize information sharing between devices through a wearable device worn by the user (such as smart glasses) without additional electronic equipment (such as a mobile phone or a computer), meeting the demand for portability.
S120: determining, according to a region-determining instruction, a sharing region in the virtual interaction region corresponding to at least one external device.
In step S120, the region-determining instruction is obtained by detecting an interactive action of the user corresponding to the virtual interaction region.
In some possible implementations of the embodiments of the application, the interactive action comprises: an interactive action of the user's hand, an interactive action of the gaze point of the user's eye, or an interactive action of both the user's hand and the gaze point. Of course, in other possible implementations, the interactive action may also comprise other interactive actions.
In this implementation, obtaining the region-determining instruction by detecting the user's interactive action corresponding to the virtual interaction region comprises, for example:
obtaining the region-determining instruction by detecting an interactive action in which the user outlines the position of the sharing region in the virtual interaction region.
For example, the user outlines a rectangular, circular, or arbitrarily shaped region in the virtual interaction region with an instrument such as a finger or a pen, indicating that this region is to be the sharing region. An image capture device on the wearable device captures images containing the user's action, and image processing converts the action into the region-determining instruction; the instruction can include, for example, the coordinate information of the region the user outlined.
The image capture and image processing above can be accomplished with prior-art equipment and methods, for example capturing images with the environment camera on smart glasses, segmenting the hand contour from the background, then detecting the elliptical part of the contour (corresponding to the finger or pen) and the corresponding vertex position (corresponding to the fingertip or pen tip) to extract feature information. Throughout the interaction, the algorithm analyzes the spatial and temporal information of the extracted features and compares it with predefined instruction information to determine the user's input and thus obtain the region-determining instruction. It follows that, in this implementation, to facilitate subsequent image processing, a region with a relatively uniform background (for example a monochrome surface without mottling) and a large color difference from the hand or pen (for example a color difference exceeding a predetermined threshold) should be selected as the virtual interaction region where possible. Of course, hand contour segmentation can also be performed with other types of camera, such as a depth camera; compared with the environment camera above, a depth camera can adapt to interaction scenarios with complex backgrounds.
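Once the fingertip track has been extracted as a point sequence, turning it into a region-determining instruction involves classifying the drawn shape. A crude hedged heuristic, assumed for illustration only: compare the enclosed area (shoelace formula) with the area of the bounding box; a rectangle fills its box, while a circle fills only about pi/4 of it.

```python
import math

def shoelace_area(points):
    """Area enclosed by a closed polygonal trace (shoelace formula)."""
    n = len(points)
    s = 0.0
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        s += x1 * y2 - x2 * y1
    return abs(s) / 2.0

def classify_trace(points, threshold=0.9):
    """Crude shape classifier: ratio of enclosed area to bounding-box area."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    box_area = (max(xs) - min(xs)) * (max(ys) - min(ys))
    ratio = shoelace_area(points) / box_area
    return "rectangle" if ratio > threshold else "circle"

rect = [(0, 0), (10, 0), (10, 6), (0, 6)]
octagon = [(10 * math.cos(k * math.pi / 4), 10 * math.sin(k * math.pi / 4))
           for k in range(8)]   # coarse approximation of a drawn circle
```

A real pipeline would of course work on noisy fingertip samples rather than exact vertices; the threshold here is an arbitrary example.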
Of course, those skilled in the art will understand that when the user's interactive action is gaze interaction, the region-determining instruction can also be obtained by detecting, with gaze-point detection techniques, the movement position and dwell time of the user's gaze point.
In the embodiments of the application, after the sharing region in the virtual interaction region has been determined, the sharing region can be presented in the virtual interaction region to make use more intuitive, for example by displaying the border and background color of the sharing region; information such as the name of the external device corresponding to the sharing region (or the name of that device's user) and the sharing permission can also be displayed accordingly.
Those skilled in the art will understand that when the virtual interaction region is relatively far from the user and it is difficult to perform the interactive action directly on it, the corresponding interactive action can also be performed between the virtual interaction region and the image capture device, and the action can then be mapped onto the virtual interaction region to obtain the sharing region. For example, the sharing region can be determined by the point where the extension of the line from the user's eye through the fingertip intersects the virtual interaction region.
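The eye-to-fingertip example above is a ray-plane intersection. A minimal sketch, assuming an eye-centered coordinate frame and a virtual interaction region lying in the plane z = d; none of this geometry is prescribed by the patent.

```python
def map_to_region(eye, fingertip, plane_z):
    """Intersect the ray from the eye through the fingertip with the
    region plane z = plane_z; returns the (x, y) point on that plane."""
    ex, ey, ez = eye
    fx, fy, fz = fingertip
    if fz == ez:
        raise ValueError("ray is parallel to the region plane")
    t = (plane_z - ez) / (fz - ez)   # parameter along the eye->fingertip ray
    return (ex + t * (fx - ex), ey + t * (fy - ey))

# eye at the origin, fingertip 20 cm ahead, region plane 60 cm ahead:
point = map_to_region((0.0, 0.0, 0.0), (2.0, 1.0, 20.0), 60.0)
```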
S140: setting a sharing permission for the sharing region according to a permission control instruction.
In the embodiments of the application, after a sharing permission has been set for the sharing region, the content corresponding to the images presented in the sharing region is subject to the sharing permission.
In a possible implementation of the embodiments of the application, the permission control instruction used in step S140 is obtained by detecting an interactive action of the user corresponding to the sharing region.
In some possible implementations of the embodiments of the application, the interactive action is similar to that described above and comprises: an interactive action of the user's hand, an interactive action of the gaze point of the user's eye, or an interactive action of both. Of course, in other possible implementations, the interactive action may also comprise other interactive actions.
In a possible implementation, for example when there is only one sharing region in the virtual interaction region, the user's interactive action corresponding to the permission control instruction does not necessarily need to map physically onto the sharing region. For example, when the permission of the sharing region needs to be set, the user can make, at any position, the gesture corresponding to a certain sharing permission, and that permission is set for the sharing region.
However, when the virtual interaction region contains multiple sharing regions and the sharing permission of one of them needs to be set, then, to facilitate the user's operation and in correspondence with the way the region-determining instruction is obtained above, the interactive action corresponding to the sharing region in this implementation comprises:
an interactive action of the user within the sharing region, or an action of the user outside the sharing region that can be physically mapped onto the sharing region.
In this implementation, an interactive action that the user performs within a sharing region, or that can be physically mapped onto it, yields a permission control instruction that is used directly to set the sharing permission of that sharing region.
For example, the virtual interaction region comprises a first sharing region and a second sharing region, corresponding to the devices of user B and user C respectively. When the sharing permission of the information shared with user B needs to be modified, the corresponding interactive operation need only be performed in the first sharing region to set the permission accordingly; meanwhile, this operation does not affect information sharing with user C's device at all.
In a possible implementation of the embodiments of the application, the method further comprises: presetting the correspondence between the user's interactive actions corresponding to the sharing region and the permission control instructions.
In this implementation, the correspondence between the interactive actions and the permission control instructions can be defined by default or customized according to the user's needs. The correspondence can be, for example:
read-only permission corresponds to drawing a triangle in the sharing region (with a finger or pen);
permission for the external device to save a local copy corresponds to drawing a circle in the sharing region;
ending permission control corresponds to drawing a cross in the sharing region; and so on.
Of course, those skilled in the art will understand that besides the above interactive actions of drawing corresponding tracks, other means suitable for interacting with virtual objects can also be applied in the embodiments of the application, for example using different numbers of fingers in the sharing region to correspond to different sharing permissions (for example two fingers sliding in the sharing region corresponding to one permission, three fingers to another), or setting different permissions by dwelling in the sharing region for different lengths of time (for example a finger pressing in the sharing region for a corresponding time, or the gaze point dwelling in the sharing region for a corresponding time).
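The preset correspondence described above amounts to a lookup table from recognized gestures to permission control instructions. The gesture names and permission labels below are illustrative assumptions following the examples in the text.

```python
# preset gesture -> permission correspondence (default or user-customized)
GESTURE_TO_PERMISSION = {
    "triangle": "read-only",
    "circle": "save-local-copy",
    "cross": "end-permission-control",
    # alternative interactions mentioned in the text (labels assumed):
    "two-finger-slide": "permission-A",
    "three-finger-slide": "permission-B",
}

def permission_for(gesture):
    """Return the permission control instruction for a recognized gesture,
    or None if the gesture has no preset meaning."""
    return GESTURE_TO_PERMISSION.get(gesture)

p = permission_for("triangle")
```

A user-customized table would simply replace or extend the default dictionary.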
S130: determining, according to a sharing control instruction, at least one piece of shared content among the at least one piece of content, presenting the at least one piece of shared content in the sharing region, and sending sharing information containing the at least one piece of shared content and the sharing permission to the at least one external device.
In step S130, similarly to the steps above, the sharing control instruction is also obtained by detecting the user's interactive action in the virtual interaction region. The user's interactive action can be, for example: selecting, among the images corresponding to the content in the virtual interaction region, the at least one piece of shared content to be shared with the external device corresponding to a sharing region, and dragging the selected image into the sharing region. The corresponding instruction is obtained by capturing images of this interactive action and processing them. Such an interactive action is convenient for image processing while clearly conveying the user's intent; of course, those skilled in the art will understand that the sharing control instruction can also be generated through other interactive means according to a predefined instruction generation method.
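The drag interaction above ends with a hit test: does the drop point fall inside a sharing region? If so, the dragged content becomes shared content of that region. The region layout and identifiers are assumed for the example.

```python
def region_hit(point, regions):
    """Return the device id of the sharing region containing the point,
    or None. regions: {device_id: (x_min, y_min, x_max, y_max)}."""
    x, y = point
    for device_id, (x0, y0, x1, y1) in regions.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return device_id
    return None

regions = {
    "device-B": (0, 0, 10, 5),    # first sharing region
    "device-C": (0, 6, 10, 11),   # second sharing region
}
target = region_hit((4, 8), regions)   # drop point inside device-C's region
```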
Besides sharing local content with the external device corresponding to a sharing region of the virtual interaction region as above, in a possible implementation of the embodiments of the application, sharing information sent by the external device can also be received, and the icons of the shared content contained in that sharing information can be presented in the corresponding sharing region, so that the user can learn, in real time and conveniently, the content and permissions shared by the external device. That is, the method further comprises:
Receive the shared information in the outside that comprises the shared content at least one outside that an external unit sends;
Share the described shared information of content carries out sharing corresponding the presenting of content with described at least one outside at described shared region according to described at least one outside.
In a kind of possible embodiment, described external unit may also be provided with a shared region, in the time that image corresponding to described shared content is drawn to the other side's shared region by described external unit, experience in order to improve user, local shared region comprises the process that presents correspondence image and be presented on from less to more described shared region to presenting of described shared content.
The above embodiments of the present application have been described the application's information sharing method as an example of wearable device example, but, those skilled in the art can know, it is not especially easily on portable set that the application's information sharing method also can be applied in miscellaneous equipment, particularly small volume, demonstration and operation.
As shown in Figure 3, an embodiment of the present application further provides an information sharing method, comprising:
S310: determining a shared region corresponding to at least one external device according to a region determination instruction;
S320: receiving shared information, sent by at least one of the at least one external device, comprising at least one shared content;
S330: presenting the at least one shared content correspondingly in the shared region according to the shared information.
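Steps S310 to S330 on the receiving side can be summarized in the following minimal sketch; the class and method names are illustrative assumptions rather than the disclosed implementation:

```python
# Minimal receiver-side sketch of steps S310–S330 (all names are assumptions).

class ReceiverApparatus:
    def __init__(self):
        self.shared_region = None
        self.presented = []      # contents currently presented in the region

    def determine_shared_region(self, region_instruction):
        """S310: determine a shared region corresponding to an external device."""
        self.shared_region = region_instruction["region"]

    def receive_shared_info(self, shared_info):
        """S320: receive shared information comprising at least one shared content."""
        return shared_info.get("contents", [])

    def present(self, shared_info):
        """S330: present the shared contents correspondingly in the shared region."""
        self.presented.extend(self.receive_shared_info(shared_info))
        return self.presented

r = ReceiverApparatus()
r.determine_shared_region({"region": (0, 0, 10, 10), "device": "A"})
r.present({"contents": ["Image1"], "permission": "read-only"})
```

The sharing permission carried in the shared information would constrain what the receiver may do with each content, as described below.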
In the embodiments of the present application, a shared region is first determined, and the shared content sent by the device corresponding to the shared region is presented correspondingly in the shared region, so that the user can intuitively and conveniently view the shared content sent by the external device.
In a possible implementation of the embodiments of the present application, the shared information comprises a sharing permission corresponding to the at least one shared content.
For the description of the sharing permission corresponding to the shared content, reference may be made to the corresponding descriptions in the embodiments shown in Fig. 1 and Fig. 2, which are not repeated here.
In order to enable convenient interaction for devices with which interaction is not very easy (for example the wearable devices mentioned above), in a possible implementation of the embodiments of the present application, the method further comprises:
determining a virtual interaction region.
In this implementation, the determining a shared region corresponding to at least one external device according to a region determination instruction comprises:
determining, on the virtual interaction region, a shared region corresponding to the external device according to a region determination instruction.
Through the virtual interaction region above and the corresponding interaction on the virtual interaction region, the user can intuitively and conveniently share information with an external device via devices, such as wearable devices, for which complex interaction is relatively difficult to implement.
In this implementation, for the virtual interaction region, the method of determining the virtual interaction region, and the method of determining the shared region, reference may be made to the corresponding descriptions in the embodiments shown in Fig. 1 and Fig. 2, which are not repeated here.
It will be appreciated by those skilled in the art that, in the above methods of the embodiments of the present application, the sequence numbers of the steps do not imply an order of execution; the order of execution of the steps should be determined by their functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present application.
As shown in Figure 4, an embodiment of the present application provides an information sharing apparatus 400, comprising:
a presenting module 410, configured to correspondingly present at least one content in a virtual interaction region;
a shared region determination module 420, configured to determine, on the virtual interaction region, a shared region corresponding to at least one external device according to a region determination instruction;
a shared content determination module 430, configured to determine at least one shared content among the at least one content according to a sharing control instruction;
the presenting module 410 being further configured to correspondingly present the at least one shared content in the shared region;
a communication module 440, configured to send shared information comprising the at least one shared content to the at least one external device.
In the embodiments of the present application, by setting, in a virtual interaction region, a shared region corresponding to an external device, and correspondingly presenting in the shared region the content that needs to be shared with that external device, the user can conveniently control the content presented in the shared region (that is, the content that the user needs to share with the external device), and can thus share information naturally, intuitively, and efficiently. The apparatus of the present application is particularly suitable for information sharing between a wearable device and an external device, particularly another wearable device.
As shown in Figure 5, in a possible implementation of the embodiments of the present application, in addition to the presenting module 410, the shared region determination module 420, the shared content determination module 430, and the communication module 440 described above, the information sharing apparatus 400 further comprises:
a sharing permission setting module 450, configured to set a sharing permission for the shared region according to a permission control instruction, wherein the shared content corresponding to the shared region corresponds to the sharing permission. In this case, the shared information further comprises: the sharing permission corresponding to the at least one shared content.
On the basis of the embodiment shown in Fig. 4, this implementation adds the sharing permission setting module 450, which sets a sharing permission for the shared region, so that the shared content corresponding to the shared region corresponds to the sharing permission, and the external device corresponding to the shared region can perform operations such as viewing, saving, and modifying the shared content only under the corresponding sharing permission, thereby improving the security of information sharing.
In this implementation, when the user needs to share information with different external devices under separately controllable sharing permissions, multiple shared regions respectively corresponding to the different external devices may be set, and a sharing permission may be set separately for each shared region.
The modules of the embodiments of the present application are further described below through the following implementations:
In a possible implementation of the embodiments of the present application, the information sharing apparatus 400 further comprises:
an interaction region determination module 460, configured to determine the virtual interaction region.
In the embodiments of the present application, the virtual interaction region may be determined in multiple manners, for example:
1) determining a predefined region as the virtual interaction region.
In this implementation, the apparatus 400 further comprises a storage module 470 configured to store relevant information of the predefined region, and the interaction region determination module 460 obtains the relevant information from the storage module 470 and determines the virtual interaction region.
2) determining the virtual interaction region according to a user instruction.
In this implementation, the interaction region determination module 460 may, for example, comprise an input unit such as a button or a touch pad for the user to input a corresponding instruction, and the interaction region determination module 460 determines the virtual interaction region according to the instruction input by the user.
For specific manners of determining the virtual interaction region, reference may be made to the corresponding description in the embodiment shown in Figure 2, which is not repeated here.
As shown in Figure 6a, in this implementation, the presenting module 410 comprises:
a fundus projecting unit 411, configured to project the image corresponding to the at least one content directly onto the fundus of the user's eye, such that the image is imaged as located in the virtual interaction region.
In this implementation, the fundus projecting unit 411 may, for example, be part of a head-mounted wearable device, and may comprise:
a micro projection subunit, configured to project the image corresponding to the at least one content;
an adjustable optical path component, located in the optical path between the micro projection subunit and the user's eye, configured to adjust the optical parameters of the optical path so that the image can be imaged in the virtual interaction region. That is, the user sees the picture of the image as being located in the virtual interaction region.
This implementation allows the content presented in the virtual interaction region to be seen only by the user himself, further improving the privacy protection level of the information sharing process. In addition, in this implementation, the virtual interaction region need not be realized on a physical surface, which makes information interaction more flexible while reducing the restrictions imposed by the actual environment.
As shown in Figure 6b, in another possible implementation of the embodiments of the present application, the presenting module 410 comprises:
a projecting unit 412, configured to project the image corresponding to the at least one content onto the virtual interaction region.
In this implementation, the image can be directly projected onto the virtual interaction region by the projecting unit 412 and then seen by the user. In this implementation, the virtual interaction region needs to be located on a physical surface capable of presenting the projected content (for example a desktop, a palm, a wall, etc.). In this implementation, the user's action instructions in the virtual interaction region are more easily recognized, and the user can operate on an actual physical surface, which better conforms to ergonomics and is suitable for long-term use.
As can be seen from the above, when the apparatus of the embodiments of the present application is implemented on a user's wearable device (such as smart glasses), information sharing between devices can be achieved through the wearable device without adding extra electronic equipment (such as a mobile phone or a computer), meeting the demand for portability.
In this implementation, the shared region determination module 420 needs to obtain the region determination instruction before determining the shared region; therefore, as shown in Figure 5, in this implementation, the apparatus 400 further comprises:
a second instruction acquisition module 480, configured to obtain the region determination instruction by detecting the user's interactive action corresponding to the virtual interaction region.
Further, in a possible implementation of the embodiments of the present application, the second instruction acquisition module 480 is configured to obtain the region determination instruction by detecting the user's interactive action of drawing the position of the shared region in the virtual interaction region.
For example, the user draws a rectangular, circular, or otherwise arbitrarily shaped region in the virtual interaction region with a tool such as a finger or a pen, indicating that this region is to be used as the shared region. Images of this action of the user are captured, for example by an image capture device on a wearable device such as smart glasses, and processed to convert the user's action into the region determination instruction, which may, for example, comprise the coordinate information of the region drawn by the user.
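One way the coordinate information in the region determination instruction could be derived from a drawn trace is to take the bounding rectangle of the tracked fingertip points. This is a sketch under assumed data structures, not the algorithm stated by this specification:

```python
# Sketch: derive a rectangular shared region from a drawn fingertip trace.
# The trace is assumed to be a list of (x, y) points obtained from image
# processing; the bounding-box reduction is an illustrative assumption.

def region_from_trace(trace_points):
    """Return (x_min, y_min, x_max, y_max) bounding the drawn trace."""
    xs = [p[0] for p in trace_points]
    ys = [p[1] for p in trace_points]
    return (min(xs), min(ys), max(xs), max(ys))

def region_determination_instruction(trace_points, device_id):
    """Package the region coordinates as a region determination instruction."""
    return {"action": "set-shared-region",
            "device": device_id,
            "region": region_from_trace(trace_points)}
```

A circular or free-form trace could be reduced in the same way, or kept as a polygon, depending on how the shared region is later tested for containment.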
As shown in Fig. 6c, in a possible implementation of the embodiments of the present application, the second instruction acquisition module 480 comprises:
a second image capture unit 481, configured to capture images corresponding to the user's interactive action corresponding to the virtual interaction region;
a second image analysis unit 482, configured to analyze the images captured by the second image capture unit to obtain the region determination instruction.
In the embodiments of the present application, the interactive action comprises: an interactive action of the user's hand, an interactive action of the user's eye gaze point, or an interactive action of the user's hand and eye gaze point. Certainly, in other possible implementations of the embodiments of the present application, the interactive action may also comprise other interactive actions. When the interactive action is a hand interactive action, the second image capture unit 481 may be a camera, on the user's head-mounted wearable device (for example smart glasses), that captures images of the environment in front of the user. When the interactive action is a gaze point interactive action, the second image capture unit 481 may be a camera on the wearable device that captures images of the user's eyes.
For further functional descriptions of the second image capture unit 481 and the second image analysis unit 482, reference may be made to the corresponding descriptions of the image capture and image processing procedures in the embodiment shown in Figure 2, which are not repeated here.
In the embodiments of the present application, after the shared region determination module 420 has determined the shared region in the virtual interaction region, for more intuitive use by the user, the presenting module 410 may perform presenting corresponding to the shared region in the virtual interaction region, for example displaying the frame, the background color, etc. of the shared region, and may also correspondingly display information such as the name of the external device corresponding to the shared region (or the user name corresponding to the external device) and the sharing permission.
Those skilled in the art will appreciate that, when the virtual interaction region is relatively far from the user and it is relatively difficult to perform the interactive action directly on the virtual interaction region, the corresponding interactive action may also be performed between the virtual interaction region and the second image capture unit 481, and these interactive actions may be correspondingly mapped onto the virtual interaction region to obtain the shared region. For example, the shared region may be determined by the position of the intersection point, with the virtual interaction region, of the extension line connecting the user's eye and fingertip.
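The eye-to-fingertip mapping mentioned above amounts to intersecting the ray through the eye and fingertip with the plane of the virtual interaction region. The following is a geometric sketch under the illustrative assumption that the region lies in the plane z = 0:

```python
# Geometric sketch (assumption: the virtual interaction region lies in z = 0,
# with the user on the z > 0 side). The intersection of the eye->fingertip
# ray with that plane gives the point the user is pointing at in the region.

def point_on_region(eye, fingertip):
    """eye, fingertip: (x, y, z) positions. Returns the (x, y) intersection
    of the ray from eye through fingertip with the plane z = 0, or None if
    that ray never reaches the plane."""
    ex, ey, ez = eye
    fx, fy, fz = fingertip
    dz = fz - ez
    if dz == 0:
        return None  # ray parallel to the plane
    t = -ez / dz
    if t <= 0:
        return None  # plane lies behind the eye along this ray
    return (ex + t * (fx - ex), ey + t * (fy - ey))
```

In practice the eye and fingertip positions would themselves come from the image capture and analysis units described in this implementation.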
As shown in Figure 5, the apparatus 400 further comprises a first instruction acquisition module 490, configured to obtain the permission control instruction by detecting the user's interactive action corresponding to the shared region.
As shown in Fig. 6d, in a possible implementation, the first instruction acquisition module 490 comprises:
a first image capture unit 491, configured to capture images corresponding to the user's interactive action corresponding to the shared region;
a first image analysis unit 492, configured to analyze the images captured by the first image capture unit to obtain the permission control instruction.
In this implementation, similarly to the above, the interactive action comprises:
an interactive action of the user's hand, an interactive action of the user's eye gaze point, or an interactive action of the user's hand and eye gaze point. Certainly, in other possible implementations of the embodiments of the present application, the interactive action may also comprise other interactive actions. Corresponding to the interactive action, the first image capture unit 491 may take different forms similar to those of the second image capture unit 481 described above. In a possible implementation, the first image capture unit 491 and the second image capture unit 481 are the same unit.
In a possible implementation, for example when there is only one shared region on the virtual interaction surface, the user's interactive action corresponding to the permission control instruction does not necessarily need to be physically mapped to the shared region. For example, when the permission of the shared region needs to be set, the user may make, at any position, the gesture corresponding to a certain sharing permission, and the sharing permission of the shared region is thereby set.
However, when there are multiple shared regions in the virtual interaction region and the sharing permission of one of them needs to be set, to facilitate the user's operation, and corresponding to the obtaining of the region determination instruction described above, in this implementation, the interactive action corresponding to the shared region comprises:
an interactive action of the user in the shared region, or an action of the user outside the shared region that can be physically mapped to the shared region.
In this implementation, the interactive action performed by the user in a shared region, or an interactive action that can be physically mapped to the shared region, is used to obtain a permission control instruction that directly sets the sharing permission of the corresponding shared region.
For example, the virtual interaction region comprises a first shared region and a second shared region, corresponding respectively to the devices of user B and user C. When the sharing permission of the information shared with user B needs to be modified, the corresponding permission setting can be performed simply by carrying out the corresponding interactive operation in the first shared region; meanwhile, this interactive operation does not affect the information sharing with the device of user C at all.
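The independence of per-region permissions in this example can be sketched as a table keyed by shared region, one region per external device. The identifiers and permission names below are illustrative assumptions:

```python
# Sketch: each shared region (one per external device) carries its own
# sharing permission; setting one region's permission leaves the others
# untouched. Region and permission names are illustrative assumptions.

class PermissionTable:
    def __init__(self):
        self.permissions = {}    # shared region id -> sharing permission

    def set_permission(self, region_id, permission):
        self.permissions[region_id] = permission

    def get_permission(self, region_id):
        return self.permissions.get(region_id)

table = PermissionTable()
table.set_permission("region-user-B", "read-only")
table.set_permission("region-user-C", "full-control")
table.set_permission("region-user-B", "read-write")  # modifies B's region only
```

The interactive operation in the first shared region would, via the first instruction acquisition module, produce exactly such a per-region update.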
In a possible implementation, the apparatus 400 further comprises:
an instruction relation presetting module 4100, configured to preset the correspondence between the user's interactive actions and the permission control instructions corresponding to the shared region.
In a possible implementation, the storage module 470 is configured to store the preset correspondence between the user's interactive actions and the permission control instructions corresponding to the shared region.
For further description of the correspondence between the interactive actions and the permission control instructions, reference may be made to the corresponding description in the method embodiments above, which is not repeated here.
In the embodiments of the present application, the apparatus 400 further comprises a sharing control instruction acquisition module 4110, configured to obtain the sharing control instruction, for the shared content determination module 430, by detecting the user's interactive action in the virtual interaction region. Here, the user's interactive action in the virtual interaction region may be, for example: selecting, in the image corresponding to the virtual interaction region, at least one shared content that needs to be shared with the external device corresponding to the shared region, and placing that image in the shared region by dragging. By capturing images of the user's interactive action described above and processing them, the corresponding instruction is obtained. While such an interactive action is convenient for image processing and clearly conveys the user's intention, those skilled in the art will certainly appreciate that the sharing control instruction may also be generated by other interactive manners according to a predefined instruction generation method.
It can be seen that the sharing control instruction acquisition module 4110 may also comprise an image capture unit and an image analysis unit, configured to obtain the sharing control instruction by analyzing the images corresponding to the captured interactive action. The image capture unit of the sharing control instruction acquisition module 4110 may also be the same unit as the first and second image capture units described above.
In addition to sharing local content with the external device corresponding to the shared region through the shared region of the virtual interaction region as described above, in a possible implementation of the embodiments of the present application, shared information sent by the external device may also be received, and icons of the shared contents included in that shared information may be presented through the corresponding shared region, so that the user can learn, in real time and conveniently, the contents and permissions shared by that external device. Therefore, in a possible implementation, in the apparatus 400:
the communication module is further configured to receive external shared information, sent by an external device, comprising at least one external shared content;
the presenting module is further configured to correspondingly present the at least one external shared content in the shared region according to the external shared information.
The above embodiments of the present application describe the information sharing apparatus of the present application by taking its application in a wearable device as an example; however, those skilled in the art will appreciate that the information sharing apparatus of the present application may also be applied to other devices, particularly portable devices of small size on which display and operation are not especially easy.
Fig. 7 is a schematic structural diagram of another information sharing apparatus 700 provided by an embodiment of the present application; the specific embodiments of the present application do not limit the specific implementation of the information sharing apparatus 700. As shown in Figure 7, the information sharing apparatus 700 may comprise:
a processor (processor) 710, a communications interface (Communications Interface) 720, a memory (memory) 730, and a communication bus 740, wherein:
the processor 710, the communications interface 720, and the memory 730 communicate with one another through the communication bus 740;
the communications interface 720 is configured to communicate with network elements such as clients;
the processor 710 is configured to execute a program 732, and may specifically perform the relevant steps in the method embodiments above.
Specifically, the program 732 may comprise program code, the program code comprising computer operation instructions.
The processor 710 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC, Application Specific Integrated Circuit), or one or more integrated circuits configured to implement the embodiments of the present application.
The memory 730 is configured to store the program 732. The memory 730 may comprise a high-speed RAM memory, and may also comprise a non-volatile memory (non-volatile memory), for example at least one disk memory. The program 732 may specifically cause the information sharing apparatus 700 to perform the following steps:
correspondingly presenting at least one content in a virtual interaction region;
determining, on the virtual interaction region, a shared region corresponding to at least one external device according to a region determination instruction;
determining at least one shared content among the at least one content according to a sharing control instruction, correspondingly presenting the at least one shared content in the shared region, and sending shared information comprising the at least one shared content to the at least one external device.
For the specific implementation of each step in the program 732, reference may be made to the corresponding descriptions of the corresponding steps and modules in the embodiments above, which are not repeated here. Those skilled in the art can clearly understand that, for convenience and brevity of description, the specific working processes of the devices and modules described above may be understood with reference to the corresponding processes in the foregoing method embodiments, and are not repeated here.
As shown in Figure 8, an embodiment of the present application further provides an information sharing apparatus 800, comprising:
a shared region determination module 810, configured to determine a shared region corresponding to at least one external device according to a region determination instruction;
a communication module 820, configured to receive shared information, sent by at least one of the at least one external device, comprising at least one shared content;
a presenting module 830, configured to correspondingly present the at least one shared content in the shared region according to the shared information.
In the embodiments of the present application, a shared region is first determined, and the shared content sent by the device corresponding to the shared region is presented correspondingly in the shared region, so that the user can intuitively and conveniently view the shared content sent by the external device.
In a possible implementation of the embodiments of the present application, the shared information comprises a sharing permission corresponding to the at least one shared content.
For the description of the sharing permission corresponding to the shared content, reference may be made to the corresponding descriptions in the embodiments shown in Fig. 1 and Fig. 2, which are not repeated here.
In order to realize convenient sharing interaction for devices with which interaction is not very easy (for example the wearable devices mentioned above), in a possible implementation of the embodiments of the present application, the apparatus 800 further comprises:
an interaction region determination module 840, configured to determine a virtual interaction region.
In a possible implementation, the shared region determination module 810 is further configured to:
determine, on the virtual interaction region, a shared region corresponding to the external device according to a region determination instruction.
Through the virtual interaction region above and the corresponding interaction on the virtual interaction region, the user can intuitively and conveniently share information with an external device via devices, such as wearable devices, for which complex interaction is relatively difficult to implement.
In an embodiment of the present invention, the apparatus 800 further comprises: an instruction acquisition module 850, configured to obtain the region determination instruction by detecting the user's interactive action of drawing the position of the shared region in the virtual interaction region.
In this implementation, for the virtual interaction region, the method of determining the virtual interaction region, and the method of determining the shared region, reference may be made to the corresponding descriptions in the embodiments shown in Fig. 1 and Fig. 2; for the function implementation of each module of the apparatus 800, reference may be made to the corresponding descriptions in the embodiments shown in Fig. 4 to Fig. 6, which are not repeated here.
As shown in Figs. 9a and 9b, an embodiment of the present application further provides a wearable device 900, the wearable device 900 comprising the information sharing apparatus 400 described in the embodiments shown in Fig. 4 to Fig. 6, or comprising the information sharing apparatus 800 described in the embodiment shown in Fig. 8.
As a head-mounted wearable device, smart glasses make it convenient not only to project images onto the user's fundus, but also to capture images of the environment in the user's field of view (such as hand actions) and images of the user's eyes; therefore, compared with other wearable devices, preferably, in a possible implementation, the wearable device is a pair of smart glasses.
Certainly, those skilled in the art will appreciate that, besides smart glasses, other suitable wearable devices, for example smart helmets, smart watches, etc., may also apply the information sharing apparatus of the embodiments of the present application; or, in some possible implementations, a head-mounted wearable device such as smart glasses may be combined with other wearable devices to jointly apply the information sharing apparatus described in the embodiments of the present application.
Figure 10 illustrates an application scenario of the method and apparatus of the embodiments of the present application. In this implementation, user A and user B each use a pair of smart glasses comprising the information sharing apparatus of the embodiments shown in Fig. 4 to Fig. 6, and user A shares photos stored in his smart glasses through his smart glasses and the smart glasses of user B. In this implementation, corresponding instructions are obtained by detecting the interactive actions of the user's finger drawing tracks in the virtual interaction region, and the information sharing process is as follows:
First, a predefinition process is carried out, in which the user defines the trace graphics corresponding to the different operation instructions of the sharing process. Taking permission control instructions as an example: the read-only permission corresponds to a triangle, the local-copy (save) permission corresponds to a circle, and the ending of the sharing corresponds to a cross.
S1010 equipment room connects and presents and treats shared content: in this stage, two users can only see the content on oneself equipment; Having determined after applicable view field (desktop, paper, palm), by corresponding parameter adjustment, the described view field that is presented at correct the content projection that comprises the shared photo of needs on Wearable equipment is formed to virtual interacting region separately.As shown in figure 10, in the present embodiment, described user A has presented two width images in described virtual interacting region: the first image I mag1 and the second photo Image2.
S1020 determines shared region and Share Permissions is set: user A uses rectangle of finger picture in the indication range in its virtual interacting region, information sharing apparatus detects after this interactive action, obtain a region and determine instruction, this rectangular area is set as to shared region and starts sharing, then set the authority (as reading and writing authority, total-control authority) of user B to the shared content showing in the shared region of oneself.For example user A is with pointing at triangle of the shared region inside-paint corresponding with described user B, the information sharing apparatus of user A detects after this interactive action, and the Share Permissions of the shared region that the control of authority instruction of acquisition one read-only authority described user A are corresponding with described user B is set to read-only authority.Now, user B receive described user A by the shared content of described shared region after, can only carry out read-only operation to this content.In this process, the geometric figure of drawing by gesture at user A by system identification after, the equipment of user B, also can reminding user B shares and is about to start, and in its corresponding virtual interacting region, automatically or according to the definition of user B generate the shared region corresponding with described user A.
S1030, sharing the shared content: at this stage, user A intends to share the first photo Image1 with user B, and therefore selects the first photo Image1 with a finger and drags it into the shared region. On detecting this interactive action, the information sharing apparatus generates a share control instruction for sharing the first photo Image1, displays the icon corresponding to the first photo Image1 in the shared region, and sends the information corresponding to the first photo Image1 to user B's information sharing apparatus. The icon corresponding to the first photo Image1 can then be displayed in user B's shared region, and user B can view the first photo Image1 under the read-only permission.
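The drag-to-share step can be sketched as follows, under assumed names: detecting the drag produces a share control instruction, the item's icon is shown in the local shared region, and the shared information (content plus its permission) is handed to a send callback standing in for the communication module.

```python
def share_by_drag(content_id, shared_region, permission, send):
    """Handle a detected drag of `content_id` into the shared region.

    `shared_region` is the list of icons shown locally; `send` stands in
    for the communication module delivering to the peer device.
    """
    shared_region.append(content_id)            # show the icon locally
    send({"content": content_id,                # shared information sent
          "permission": permission})            # e.g. "read_only"
    return shared_region

# User A drags Image1 into the shared region previously set to read-only.
outbox = []                                     # messages bound for user B
region = share_by_drag("Image1", [], "read_only", outbox.append)
print(region)   # ['Image1']
print(outbox)   # [{'content': 'Image1', 'permission': 'read_only'}]
```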
The above constitutes one complete information sharing process. Of course, the information sharing process may also include other steps, for example the following stages:
S1040, changing the permission: user B likes the photo shared by user A very much and wishes to keep a local copy (which requires write permission). With user A's consent, another figure, a circle carrying write permission, is drawn by gesture in the local shared region; on obtaining the corresponding permission control instruction, user A's information sharing apparatus sends a copy of the first photo Image1 in the shared region to the counterpart device. In the background, this first photo Image1 can be saved automatically in the storage module 1010 of user B's device.
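A sketch of the permission upgrade in S1040, with hypothetical names: once the write-permission circle yields a permission control instruction, a read/write copy of the content is delivered and saved automatically in the receiving device's storage module (modeled here as a plain dictionary).

```python
def upgrade_to_write(content_id, peer_storage):
    """Deliver a local copy after write permission has been granted.

    `peer_storage` stands in for the storage module of user B's device.
    """
    copy = {"content": content_id, "permission": "read_write"}
    peer_storage[content_id] = copy     # auto-save in B's storage module
    return copy

storage_b = {}                          # user B's storage module (assumed)
upgrade_to_write("Image1", storage_b)
print(sorted(storage_b))                # ['Image1']
```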
S1050, ending the sharing: either user draws a cross in his or her local shared region at any time, and the corresponding information sharing apparatus ends the corresponding sharing process.
Those of ordinary skill in the art will recognize that the units and method steps of the examples described in connection with the embodiments disclosed herein can be implemented by electronic hardware or by a combination of computer software and electronic hardware. Whether these functions are executed in hardware or in software depends on the particular application and the design constraints of the technical solution. Skilled artisans may implement the described functions in different ways for each particular application, but such implementation should not be considered as going beyond the scope of the present application.
When the functions are implemented in the form of a software functional unit and sold or used as an independent product, they can be stored in a computer-readable storage medium. Based on such an understanding, the technical solution of the present application, in essence, or the part thereof contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product. The computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or some of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash disk, a portable hard drive, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
The above embodiments are intended only to illustrate the present application and not to limit it. Those of ordinary skill in the relevant technical field can make various changes and modifications without departing from the spirit and scope of the present application; therefore, all equivalent technical solutions also fall within the scope of the present application, and the patent protection scope of the present application shall be defined by the claims.

Claims (41)

1. An information sharing method, characterized in that the method comprises:
correspondingly presenting at least one piece of content in a virtual interaction region;
determining, according to a region determination instruction, a shared region corresponding to at least one external device in the virtual interaction region;
determining, according to a share control instruction, at least one piece of shared content in the at least one piece of content, correspondingly presenting the at least one piece of shared content in the shared region, and sending shared information comprising the at least one piece of shared content to the at least one external device.
2. the method for claim 1, is characterized in that, described method also comprises:
Be that described shared region arranges Share Permissions according to a control of authority instruction, the shared content corresponding with described shared region is corresponding to described Share Permissions.
3. method as claimed in claim 2, is characterized in that, described shared information also comprises: the described Share Permissions corresponding with described at least one shared content.
4. method as claimed in claim 2, is characterized in that, described method also comprises: obtain described control of authority instruction by detecting user's interactive action corresponding with described shared region.
5. method as claimed in claim 4, is characterized in that, interactive action corresponding to described and described shared region comprises:
Interactive action in described shared region.
6. method as claimed in claim 4, is characterized in that, described method also comprises: preset the corresponding relation between described user interactive action and the described control of authority instruction corresponding with described shared region.
7. the method for claim 1, is characterized in that, describedly carries out present corresponding with at least one content in a virtual interacting region and comprises:
The image corresponding with described at least one content is directly projected to user eyeground and at described virtual interacting regional imaging.
8. the method for claim 1, is characterized in that, describedly carries out present corresponding with at least one content in a virtual interacting region and comprises:
Image corresponding to described at least one content is projected to described virtual interacting region.
9. the method as described in claim 1 or 7 or 8, is characterized in that, described method carries out also comprising before corresponding with at least one content presenting in a virtual interacting region described: determine described virtual interacting region.
10. method as claimed in claim 9, is characterized in that, described definite described virtual interacting region comprises:
Determine described virtual interacting region according to a user instruction.
11. the method for claim 1, is characterized in that, described method also comprises:
Obtain described region by the detection user interactive action corresponding with described virtual interacting region and determine instruction.
12. methods as described in claim 4 or 11, is characterized in that, described interactive action comprises:
Hand interactive action and/or blinkpunkt interactive action.
13. methods as claimed in claim 11, is characterized in that, the described detection user interactive action corresponding with described virtual interacting region obtains described region and determine that instruction comprises:
Instruction is determined in the described region of acquisition of moving alternately that draws described shared region position by detecting user in described virtual interacting region.
14. the method for claim 1, is characterized in that, described method also comprises:
Receive the shared information in the outside that comprises the shared content at least one outside that an external unit sends;
Share the described shared information of content carries out sharing corresponding the presenting of content with described at least one outside at described shared region according to described at least one outside.
15. An information sharing method, characterized in that the method comprises:
determining, according to a region determination instruction, a shared region corresponding to at least one external device;
receiving shared information, sent by at least one of the at least one external device, that comprises at least one piece of shared content;
correspondingly presenting the at least one piece of shared content in the shared region according to the shared information.
16. The method of claim 15, characterized in that the shared information comprises a sharing permission corresponding to the at least one piece of shared content.
17. The method of claim 15, characterized in that the method further comprises:
determining a virtual interaction region.
18. The method of claim 17, characterized in that the determining, according to a region determination instruction, a shared region corresponding to at least one external device comprises:
determining, according to the region determination instruction, a shared region corresponding to the external device in the virtual interaction region.
19. The method of claim 18, characterized in that the method further comprises:
obtaining the region determination instruction by detecting an interactive action of a user drawing the position of the shared region in the virtual interaction region.
20. An information sharing apparatus, characterized in that the apparatus comprises:
a presenting module, configured to correspondingly present at least one piece of content in a virtual interaction region;
a shared region determination module, configured to determine, according to a region determination instruction, a shared region corresponding to at least one external device in the virtual interaction region;
a shared content determination module, configured to determine, according to a share control instruction, at least one piece of shared content in the at least one piece of content;
the presenting module being further configured to correspondingly present the at least one piece of shared content in the shared region; and
a communication module, configured to send shared information comprising the at least one piece of shared content to the at least one external device.
21. The apparatus of claim 20, characterized in that the apparatus further comprises:
a sharing permission setting module, configured to set a sharing permission for the shared region according to a permission control instruction, wherein the shared content corresponding to the shared region corresponds to the sharing permission.
22. The apparatus of claim 21, characterized in that the shared information further comprises: the sharing permission corresponding to the at least one piece of shared content.
23. The apparatus of claim 21, characterized in that the apparatus further comprises: a first instruction acquisition module, configured to obtain the permission control instruction by detecting an interactive action of a user corresponding to the shared region.
24. The apparatus of claim 23, characterized in that the first instruction acquisition module comprises:
a first image acquisition unit, configured to acquire an image corresponding to the interactive action of the user corresponding to the shared region; and
a first image analysis unit, configured to analyze the image acquired by the first image acquisition unit to obtain the permission control instruction.
25. The apparatus of claim 23, characterized in that the apparatus further comprises:
an instruction relation presetting module, configured to preset a correspondence between the interactive action of the user corresponding to the shared region and the permission control instruction.
26. The apparatus of claim 25, characterized in that the apparatus further comprises:
a storage module, configured to store the preset correspondence between the interactive action of the user corresponding to the shared region and the permission control instruction.
27. The apparatus of claim 20, characterized in that the presenting module comprises:
a fundus projection unit, configured to project the image corresponding to the at least one piece of content directly onto the fundus of a user's eye so that the image is formed in the virtual interaction region.
28. The apparatus of claim 20, characterized in that the presenting module comprises:
a projection unit, configured to project the image corresponding to the at least one piece of content onto the virtual interaction region.
29. The apparatus of claim 20, 27 or 28, characterized in that the apparatus further comprises:
an interaction region determination module, configured to determine the virtual interaction region.
30. The apparatus of claim 29, characterized in that the interaction region determination module is further configured to determine the virtual interaction region according to a user instruction.
31. The apparatus of claim 20, characterized in that the apparatus further comprises:
a second instruction acquisition module, configured to obtain the region determination instruction by detecting an interactive action of a user corresponding to the virtual interaction region.
32. The apparatus of claim 31, characterized in that the second instruction acquisition module comprises:
a second image acquisition unit, configured to acquire an image corresponding to the interactive action of the user corresponding to the virtual interaction region; and
a second image analysis unit, configured to analyze the image acquired by the second image acquisition unit to obtain the region determination instruction.
33. The apparatus of claim 23 or 31, characterized in that the interactive action comprises:
a hand interactive action and/or a gaze-point interactive action.
34. The apparatus of claim 32, characterized in that the second instruction acquisition module is further configured to obtain the region determination instruction by detecting an interactive action of the user drawing the position of the shared region in the virtual interaction region.
35. The apparatus of claim 20, characterized in that:
the communication module is further configured to receive external shared information, sent by an external device, that comprises at least one piece of external shared content; and
the presenting module is further configured to correspondingly present the at least one piece of external shared content in the shared region according to the external shared information.
36. An information sharing apparatus, characterized in that the apparatus comprises:
a shared region determination module, configured to determine, according to a region determination instruction, a shared region corresponding to at least one external device;
a communication module, configured to receive shared information, sent by at least one of the at least one external device, that comprises at least one piece of shared content; and
a presenting module, configured to correspondingly present the at least one piece of shared content in the shared region according to the shared information.
37. The apparatus of claim 36, characterized in that the apparatus further comprises:
an interaction region determination module, configured to determine a virtual interaction region.
38. The apparatus of claim 37, characterized in that the shared region determination module is further configured to:
determine, according to the region determination instruction, a shared region corresponding to the external device in the virtual interaction region.
39. The apparatus of claim 38, characterized in that the apparatus comprises:
an instruction acquisition module, configured to obtain the region determination instruction by detecting an interactive action of a user drawing the position of the shared region in the virtual interaction region.
40. A wearable device, characterized in that the wearable device comprises the information sharing apparatus of claim 20 or 36.
41. The wearable device of claim 40, characterized in that the wearable device is a pair of smart glasses.
CN201410049125.3A 2014-02-12 2014-02-12 Information sharing method and device Pending CN103809751A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410049125.3A CN103809751A (en) 2014-02-12 2014-02-12 Information sharing method and device


Publications (1)

Publication Number Publication Date
CN103809751A true CN103809751A (en) 2014-05-21

Family

ID=50706641

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410049125.3A Pending CN103809751A (en) 2014-02-12 2014-02-12 Information sharing method and device

Country Status (1)

Country Link
CN (1) CN103809751A (en)


Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104077149B (en) * 2014-07-18 2018-02-02 北京智谷睿拓技术服务有限公司 Content share method and device
US10802786B2 (en) 2014-07-18 2020-10-13 Beijing Zhigu Rui Tuo Tech Co., Ltd Content sharing methods and apparatuses
WO2016008343A1 (en) * 2014-07-18 2016-01-21 Beijing Zhigu Rui Tuo Tech Co., Ltd. Content sharing methods and apparatuses
CN104093061A (en) * 2014-07-18 2014-10-08 北京智谷睿拓技术服务有限公司 Content sharing method and device
CN104077149A (en) * 2014-07-18 2014-10-01 北京智谷睿拓技术服务有限公司 Content sharing method and device
CN107110962A (en) * 2014-09-25 2017-08-29 三星电子株式会社 Apparatus and method for measuring wireless distances
CN107003739A (en) * 2014-10-06 2017-08-01 皇家飞利浦有限公司 Docking system
CN107003739B (en) * 2014-10-06 2020-10-27 皇家飞利浦有限公司 Docking system
CN104375646A (en) * 2014-11-24 2015-02-25 联想(北京)有限公司 Information processing method, electronic equipment and wearable electronic equipment
CN104375646B (en) * 2014-11-24 2018-07-06 联想(北京)有限公司 A kind of information processing method, electronic equipment and wearable electronic equipment
CN109074772A (en) * 2016-01-25 2018-12-21 艾维赛特有限公司 Content based on sight shares dynamic self-organization network
CN109361727A (en) * 2018-08-30 2019-02-19 Oppo广东移动通信有限公司 Information sharing method, device, storage medium and wearable device
CN110244844A (en) * 2019-06-10 2019-09-17 Oppo广东移动通信有限公司 Control method and relevant apparatus
WO2021072912A1 (en) * 2019-10-17 2021-04-22 广州视源电子科技股份有限公司 File sharing method, apparatus, and system, interactive smart device, source end device, and storage medium
WO2021227628A1 (en) * 2020-05-14 2021-11-18 华为技术有限公司 Electronic device and interaction method therefor

Similar Documents

Publication Publication Date Title
CN103809751A (en) Information sharing method and device
US11287956B2 (en) Systems and methods for representing data, media, and time using spatial levels of detail in 2D and 3D digital applications
US11467709B2 (en) Mixed-reality guide data collection and presentation
US10733716B2 (en) Method and device for providing image
US10120454B2 (en) Gesture recognition control device
US11947729B2 (en) Gesture recognition method and device, gesture control method and device and virtual reality apparatus
US11663784B2 (en) Content creation in augmented reality environment
US11188143B2 (en) Three-dimensional object tracking to augment display area
US9430093B2 (en) Monitoring interactions between two or more objects within an environment
KR101423536B1 (en) System for constructiing mixed reality using print medium and method therefor
US9323422B2 (en) Spatially-aware projection pen display
US20160034058A1 (en) Mobile Device Input Controller For Secondary Display
US20140028567A1 (en) Display device and control method thereof
CN104598119A (en) Screen capture method and device
US20200142495A1 (en) Gesture recognition control device
TWI637347B (en) Method and device for providing image
CN104102349A (en) Content sharing method and content sharing device
Qian et al. Portalware: Exploring free-hand AR drawing with a dual-display smartphone-wearable paradigm
CN208689558U (en) A kind of intelligence system assisting user
EP3088991B1 (en) Wearable device and method for enabling user interaction
US11983802B2 (en) Systems and methods for annotating a scene to include digital objects
US20230042447A1 (en) Method and Device for Managing Interactions Directed to a User Interface with a Physical Object
González-Zúñiga et al. S3doodle: Case Study for Stereoscopic Gui and user Content Creation
Bauer Large Display Interaction Using Mobile Devices
CN104914981A (en) Information processing method and electronic equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20140521

RJ01 Rejection of invention patent application after publication