CN105786163A - Display processing method and display processing device - Google Patents

Display processing method and display processing device

Info

Publication number
CN105786163A
Authority
CN
China
Prior art keywords
image
display
electronic equipment
unit
display unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410803983.2A
Other languages
Chinese (zh)
Other versions
CN105786163B (en)
Inventor
孙炳川 (Sun Bingchuan)
李斌 (Li Bin)
Current Assignee
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date
Filing date
Publication date
Application filed by Lenovo Beijing Ltd filed Critical Lenovo Beijing Ltd
Priority to CN201410803983.2A priority Critical patent/CN105786163B/en
Publication of CN105786163A publication Critical patent/CN105786163A/en
Application granted granted Critical
Publication of CN105786163B publication Critical patent/CN105786163B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The invention discloses a display processing method and device. The method comprises the following steps: while a first image is output by a first display unit, detecting an input operation through an acquisition unit, wherein the first display unit comprises a first display assembly and a first optical assembly, the first display assembly generates light corresponding to the first image, and the first optical assembly performs optical-path conversion on that light to form a magnified virtual image corresponding to the first image; judging whether the input operation meets a first predetermined condition; and, if the input operation meets the first predetermined condition, storing at least part of the image region of the first image as a first captured image. The invention thus provides a way to capture images from a display unit that is suitable only for viewing by the current user alone, enriching the display processing of an electronic device and improving the user experience.

Description

Display processing method and display processing device
Technical field
The present invention relates to the field of computer technology, and more particularly to a display processing method and a display processing device.
Background technology
In recent years, electronic equipment such as notebook computers, desktop computers, tablet computers (PADs), mobile phones, multimedia players, and personal digital assistants (PDAs) has become increasingly widespread. Such electronic equipment is often integrated with a display unit, so that it can display various pictures to the user.
With the development of modern society, communication between people has become ever more important. However, owing to various constraints, the display unit of some electronic equipment may be suitable only for viewing by the current user alone, which makes it difficult to share pictures among multiple users and thus fails to meet users' interaction needs.
For example, when the display unit is equipped in a head-mounted electronic device (e.g., smart glasses), the display unit works in the particular form of an eyeglass lens and must be worn at a specific position in front of the user's eyes so as to project images into them. In such a working form, the display unit obviously cannot be viewed by multiple users at the same time.
Summary of the invention
In order to solve the above technical problem, according to one aspect of the embodiments of the present invention, a display processing method is provided. The method is applied to an electronic equipment that includes a first display unit and a collecting unit, and comprises: while a first image is output by the first display unit, detecting an input operation through the collecting unit, wherein the first display unit includes a first display module and a first optical module, the first display module generates light corresponding to the first image, and the first optical module performs optical-path conversion on that light to form a magnified virtual image corresponding to the first image; judging whether the input operation meets a first predetermined condition, and obtaining a first judged result; and, if the first judged result indicates that the input operation meets the first predetermined condition, storing at least part of the image region of the first image as a first captured image.
In addition, according to another aspect of the embodiments of the present invention, a display processing device is provided. The device is applied to an electronic equipment that includes a first display unit and a collecting unit, and comprises: an operation receiving unit for detecting, through the collecting unit, an input operation while a first image is output by the first display unit, wherein the first display unit includes a first display module and a first optical module, the first display module generates light corresponding to the first image, and the first optical module performs optical-path conversion on that light to form a magnified virtual image corresponding to the first image; a first judging unit for judging whether the input operation meets a first predetermined condition and obtaining a first judged result; and an image storage unit for storing at least part of the image region of the first image as a first captured image if the first judged result indicates that the input operation meets the first predetermined condition.
Compared with the prior art, the display processing method and device according to the embodiments of the present invention can judge whether an input operation received on an electronic equipment meets a first predetermined condition and, when the first judged result indicates that it does, store at least part of the image region of the first image as a first captured image. The user can keep the first captured image stored in the electronic equipment for later operations such as viewing, image processing, and picture sharing. The present invention therefore provides a way to capture images from a display unit that is suitable only for viewing by the current user alone, enriching the display processing of the electronic equipment and improving the user experience.
Other features and advantages of the present invention will be set forth in the following description, will partly become apparent from the description, or will be understood by practicing the present invention. The objects and other advantages of the present invention can be realized and obtained by the structures particularly pointed out in the description, the claims, and the accompanying drawings.
Accompanying drawing explanation
The accompanying drawings provide a further understanding of the present invention and constitute a part of the description. Together with the embodiments, they serve to explain the present invention and do not limit it. In the drawings:
Fig. 1 illustrates a flow chart of the display processing method according to an embodiment of the present invention.
Fig. 2 illustrates a block diagram of the electronic equipment according to an embodiment of the present invention.
Fig. 3A illustrates a schematic diagram of the first image displayed in the first display unit according to an embodiment of the present invention.
Fig. 3B and Fig. 3C illustrate schematic diagrams of the full-screen screenshot processing according to an embodiment of the present invention.
Fig. 3D and Fig. 3E illustrate schematic diagrams of the region screenshot processing according to an embodiment of the present invention.
Fig. 4 illustrates a block diagram of the display processing device according to an embodiment of the present invention.
Fig. 5 illustrates a functional block diagram of the electronic equipment according to a first embodiment of the present invention.
Fig. 6A and Fig. 6B illustrate structural block diagrams of the electronic equipment according to the first embodiment of the present invention.
Fig. 7A to Fig. 7D respectively illustrate the first to fourth configuration examples of the fixing device in the electronic equipment according to embodiments of the present invention.
Fig. 8A to Fig. 8D illustrate schematic diagrams and embodiments of the near-eye optical display system adopted in the electronic equipment according to embodiments of the present invention.
Fig. 9A to Fig. 9C illustrate schematic diagrams of the display unit in the electronic equipment according to embodiments of the present invention.
Figure 10 illustrates a functional block diagram of the electronic equipment according to a second embodiment of the present invention.
Figure 11 illustrates a structural block diagram of the electronic equipment according to the second embodiment of the present invention.
Figure 12A and Figure 12B respectively illustrate a top view and a side view of the first configuration example of the viewable portion of the electronic equipment according to an embodiment of the present invention.
Figure 12C and Figure 12D respectively illustrate a top view and a side view of the second configuration example of the viewable portion of the electronic equipment according to an embodiment of the present invention.
Figure 12E and Figure 12F respectively illustrate a top view and a side view of the third configuration example of the viewable portion of the electronic equipment according to an embodiment of the present invention.
Figure 13 illustrates an external side view of the electronic equipment according to an embodiment of the present invention.
Detailed description of the invention
Each embodiment of the present invention will be described in detail with reference to the accompanying drawings. It should be noted here that, in the drawings, components having substantially the same or similar structure and function are given the same reference numerals, and repeated description of them will be omitted.
First, a display processing method according to an embodiment of the present invention is described with reference to Fig. 1 and Fig. 2.
Fig. 1 illustrates a flow chart of the display processing method according to an embodiment of the present invention, and Fig. 2 illustrates a block diagram of the electronic equipment according to an embodiment of the present invention.
The display processing method illustrated in Fig. 1 can be applied to the electronic equipment illustrated in Fig. 2.
As illustrated in Fig. 2, the electronic equipment 100 according to an embodiment of the present invention can at least include a first display unit 104 and a collecting unit 106.
The first display unit 104 is used for displaying the first image and has a viewing area such that the first image can be watched from the first display unit 104 only when the user's eyes are aligned with this viewing area.
Specifically, in the electronic equipment 100 according to an embodiment of the present invention, the viewing area of the first display unit 104 is often small; that is, the user's eyes can watch the first image displayed by the first display unit 104 only within a narrow range. Once beyond this narrow range, the user will be unable to watch the first image, or unable to watch it completely, or unable to watch it clearly. Precisely because the viewing area is narrow, the first display unit 104 is likely to be suitable only for viewing by the current user alone and is not suitable for sharing pictures among multiple users.
In an optical system, this viewing area can be characterized by the optical parameter "exit pupil". The image of the aperture diaphragm of an optical system formed in the system's image space is called the exit pupil of the system. The exit pupil mainly involves two physical parameters, namely the exit pupil position (represented by the "exit pupil distance") and the exit pupil diameter, which respectively represent the position and the aperture of the outgoing beam.
The exit pupil distance refers to the distance from the vertex of the last surface of the optical system to the intersection of the exit pupil plane with the optical axis. For example, in visual optical instruments such as telescopes and microscopes, the pupil of the human eye must coincide with the exit pupil in order to see the whole field of view; to prevent the eyelashes from touching the last surface of the system and hindering observation, the exit pupil distance cannot be less than a certain value. For laboratory or general-purpose instruments, the minimum exit pupil distance is required to be about 6 mm; for military optical instruments, considering the addition of eye protection and gas masks, the exit pupil distance is longer, generally about 20 mm.
The exit pupil diameter refers to the diameter of the bright spot formed behind the eyepiece after the light converges through the eyepiece. For optical equipment used with the naked eye, light has to pass through the pupil before forming an image on the retina; the human pupil is approximately 3 millimeters in daytime and at most about 7 millimeters at night. When observing with optical equipment, the bright spot formed by the light converged by the eyepiece is projected onto the pupil. Therefore, the larger the exit pupil diameter, the brighter the picture perceived; however, an exit pupil diameter larger than the pupil diameter is meaningless.
Accordingly, in the electronic equipment 100 according to an embodiment of the present invention, the first display unit 104 can have a relatively small exit pupil distance and exit pupil diameter. In other words, the visible angle and the visual range of the first display unit 104 are each less than or equal to a respective predetermined threshold. Thus, multiple users cannot watch the first image displayed by the first display unit 104 at the same time, even by pressing their eyes close to each other.
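As an illustration of this single-viewer criterion, the check that both exit pupil parameters stay at or below their predetermined thresholds can be sketched as follows. The function name and the numeric thresholds are assumptions made for this sketch, since the patent does not specify concrete values.

```python
# Sketch: deciding whether a display counts as "single-viewer only".
# The numeric thresholds below are illustrative assumptions, not values
# taken from the patent.

def is_single_viewer_display(exit_pupil_distance_mm: float,
                             exit_pupil_diameter_mm: float,
                             max_distance_mm: float = 25.0,
                             max_diameter_mm: float = 8.0) -> bool:
    """True when both exit-pupil parameters fall at or below their
    predetermined thresholds, i.e. the viewing area is too small for
    several users to watch the display at once."""
    return (exit_pupil_distance_mm <= max_distance_mm
            and exit_pupil_diameter_mm <= max_diameter_mm)

# A near-eye display with ~20 mm exit pupil distance and ~4 mm exit
# pupil diameter qualifies; a system with a much larger eye box does not.
print(is_single_viewer_display(20.0, 4.0))    # True
print(is_single_viewer_display(100.0, 30.0))  # False
```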
Specifically, the first display unit 104 can be a display unit following any of various display principles.
For example, as described in the background, the first display unit 104 can be a lens-type conventional display unit equipped in a head-mounted electronic device (e.g., smart glasses). Since the first display unit 104 is close to the user's eyes, it must have a relatively small exit pupil distance and exit pupil diameter so that the user can focus on and watch the picture displayed in it.
Alternatively, other display units following different display principles but likewise having a relatively small exit pupil distance and exit pupil diameter can also be equipped, for example in other wearable electronic equipment or electronic equipment of any form, and used as the first display unit.
For example, the first display unit 104 can include a first display module and a first optical module; the first display module generates light corresponding to the first image, and the first optical module performs optical-path conversion on that light to form a magnified virtual image corresponding to the first image. In this case, the viewing area of the first display unit 104 can be the region corresponding to the light emerging from the first optical module after the optical-path conversion.
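The magnified virtual image formed by the first optical module can be illustrated with the thin-lens equation, under the simplifying assumption that the optical module behaves like a single converging lens with the display module placed inside its focal length. The numbers are hypothetical and are not taken from the patent.

```python
def virtual_image(object_dist_mm: float, focal_mm: float):
    """Gaussian thin-lens sketch (1/do + 1/di = 1/f): an object placed
    inside the focal length of a converging lens gives a negative image
    distance, i.e. a magnified, upright virtual image."""
    inv_di = 1.0 / focal_mm - 1.0 / object_dist_mm
    di = 1.0 / inv_di
    magnification = -di / object_dist_mm
    return di, magnification

# Hypothetical micro-display 15 mm from a lens of 20 mm focal length:
di, m = virtual_image(15.0, 20.0)
# di is about -60 mm (virtual image 60 mm behind the lens), m is about 4
```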
Specifically, the first display unit 104 can be a display unit based on a light-guide optical element (LOE). For example, an LOE can be a very small, transparent optical flat that processes the image transmitted from a display screen and, using optical projection technology, reflects the image into the user's eye, thereby forming a virtual picture in the user's brain. To the user, the projected image appears as clear and brightly colored as a picture presented on a computer display.
Obviously, as described above, the first display unit 104 can also be a display unit based on a non-LOE element (that is, other optical elements), as long as it has a relatively small exit pupil distance and exit pupil diameter.
For an intuitive understanding, reference can be made to Fig. 8A to Fig. 8D described later, in which Fig. 8A and Fig. 8B illustrate display units not based on an LOE, and Fig. 8C and Fig. 8D illustrate display units based on an LOE.
In addition, the collecting unit 106 is used for detecting the input operation. To this end, the collecting unit 106 has a detection region for detecting the input operation, enabling the user to input various input operations in this detection region so that the electronic equipment can perform, according to the input operation, various display processing such as picture sharing. Moreover, as mentioned above, since the first display unit 104 has a relatively small exit pupil distance and exit pupil diameter, the detection region of the collecting unit 106 can be arranged not to overlap, or at least partially not to overlap, with the viewing area of the first display unit 104, so that the user does not interfere with normal watching while inputting various operations.
Specifically, the collecting unit 106 can be a sensing device following any of various acquisition principles.
For example, the collecting unit 106 can be a touch sensing unit, an image capturing unit, a voice capturing unit, a proximity sensor, a pressure sensor, a biological characteristic capturing unit, or the like, for gathering various types of parameter information.
Moreover, in order not to overlap, or at least partially not to overlap, with the viewing area of the first display unit 104, the following can be reasonably arranged on the electronic equipment: the touch area of the touch sensing unit (e.g., the touch surface of a touchpad), the image capture area of the image capturing unit (e.g., the shooting space of a camera), the voice capturing region of the voice capturing unit (e.g., the input aperture area of a microphone), the distance detection region of the proximity sensor (e.g., the aperture area where infrared light exits and enters), the pressure detecting region of the pressure sensor (e.g., the pressing surface of a pressing plate), the feature detection region of the biological characteristic capturing unit (e.g., the contact surface of an organism contact plate), and so on, so that the user (or viewer) does not block the display of the first display unit 104 while performing an input operation on the electronic equipment 100.
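As a rough sketch of this layout constraint, the detection region and the viewing area can be modeled as axis-aligned rectangles and tested for overlap. The coordinates are invented for illustration; a real device would use its own layout geometry.

```python
from typing import NamedTuple

class Rect(NamedTuple):
    x: float
    y: float
    w: float
    h: float

def overlaps(a: Rect, b: Rect) -> bool:
    """True when the two axis-aligned rectangles share any area."""
    return not (a.x + a.w <= b.x or b.x + b.w <= a.x or
                a.y + a.h <= b.y or b.y + b.h <= a.y)

viewing_area = Rect(0, 0, 10, 10)   # in front of the user's eye (made-up units)
touch_area   = Rect(40, 0, 30, 15)  # e.g. a touchpad set off to the side
assert not overlaps(viewing_area, touch_area)
```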
The concrete configuration of the electronic equipment will be described more fully later in the embodiments of the present invention.
As illustrated in Fig. 1, the display processing method according to an embodiment of the present invention may include the following steps.
In step S110, while the first image is output by the first display unit, the input operation is detected through the collecting unit.
For example, when the collecting unit 106 is a touch sensing unit, the input operation detected by the touch sensing unit can be the various actions that the user or an operating body performs on or near the touch area (e.g., a touchpad). When the collecting unit 106 is an image capturing unit, the input operation detected by the image capturing unit can be the various actions that the user or an operating body performs within the image capture area (e.g., the shooting space). When the collecting unit 106 is a voice capturing unit, the input operation detected by the voice capturing unit can be the various sounds made by the user or a sounding body. When the collecting unit 106 is a gyroscope, the input operation detected by the gyroscope can be a gravitational acceleration in a certain direction, and so on.
In step S120, it is judged whether the input operation meets the first predetermined condition, and a first judged result is obtained.
Specifically, the received input operation can be compared with the first predetermined condition to judge whether the two match.
As mentioned above, for instance, when the collecting unit 106 is a touch sensing unit, the first predetermined condition can be that the user or operating body performs a predetermined action on the touch area (e.g., double-tapping the touch area); when the collecting unit 106 is an image capturing unit, the first predetermined condition can be that the user or operating body performs a predetermined action in the image capture area (e.g., clenching the palm into a fist); when the collecting unit 106 is a voice capturing unit, the first predetermined condition can be that the user or sounding body makes a predetermined sound (e.g., the user says "screenshot"); and when the collecting unit 106 is a gyroscope, the first predetermined condition can be that the gravitational acceleration in a certain direction is greater than or equal to a predetermined threshold (e.g., whipping the electronic equipment vertically downward), and so on.
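The per-sensor matching in step S120 can be sketched as a dispatch table. The event field names ("gesture", "phrase", "accel_down") and the gyroscope threshold are assumptions made for this illustration; the patent only names the sensor types and example actions.

```python
# Hypothetical input-operation events, one checker per collecting-unit type.
FIRST_CONDITIONS = {
    "touch":  lambda op: op.get("gesture") == "double_tap",  # double-tap the touch area
    "camera": lambda op: op.get("gesture") == "fist",        # clench palm into a fist
    "voice":  lambda op: op.get("phrase") == "screenshot",   # predetermined utterance
    "gyro":   lambda op: op.get("accel_down", 0.0) >= 15.0,  # assumed threshold
}

def meets_first_condition(unit_type: str, operation: dict) -> bool:
    """First judged result: does the detected operation match the
    first predetermined condition for this collecting unit?"""
    check = FIRST_CONDITIONS.get(unit_type)
    return bool(check and check(operation))

print(meets_first_condition("touch", {"gesture": "double_tap"}))  # True
print(meets_first_condition("voice", {"phrase": "hello"}))        # False
```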
In step S130, if the first judged result indicates that the input operation meets the first predetermined condition, at least part of the image region of the first image is stored as the first captured image.
In one example, the first image can be stored in its entirety as the first captured image; that is, a full-screen screenshot of the first image displayed in the first display unit is taken.
In another example, a part of the image region of the first image can be stored as the first captured image; that is, a region screenshot of the first image displayed in the first display unit is taken. Obviously, in order to take a region screenshot, the image region that the user wishes to capture must first be determined.
To this end, the display processing method according to an embodiment of the present invention can also include the following.
After the input operation on the electronic equipment is received (that is, step S110), judging whether the input operation meets a second predetermined condition, and obtaining a second judged result; if the second judged result indicates that the input operation meets the second predetermined condition, displaying a position indicator superposed on the first image; and determining the position of the at least part of the image region in the first image according to the position of the position indicator in the first image.
For example, after the input operation is received and before the screenshot operation is carried out, it can be judged whether the input operation is for a full-screen screenshot or for a region screenshot. The received input operation can be compared with the second predetermined condition to judge whether the two match.
As mentioned above, for instance, when the collecting unit 106 is a touch sensing unit, the second predetermined condition can be that the user or operating body performs a predetermined action on the touch area (e.g., triple-tapping the touch area); when the collecting unit 106 is an image capturing unit, the second predetermined condition can be that the user or operating body performs a predetermined action in the image capture area (e.g., swinging the palm left and right repeatedly); when the collecting unit 106 is a voice capturing unit, the second predetermined condition can be that the user or sounding body makes a predetermined sound (e.g., the user says "region positioning"); and when the collecting unit 106 is a gyroscope, the second predetermined condition can be that the gravitational acceleration in a certain direction is greater than or equal to a predetermined threshold (e.g., repeatedly rocking the electronic equipment horizontally left and right), and so on.
If it is judged that the input operation is a region screenshot operation, a position indicator is displayed superposed on the first image displayed in the first display unit.
In the first case, the position indicator displayed superposed on the first image can be a prompt figure that is shown at a predetermined position of the first image and has a predetermined shape. For example, the position indicator can be a rectangle occupying the left half of the first picture; or a 100 pixel × 100 pixel square whose top-left vertex is at pixel coordinate (100, 50) (where, for example, when the first picture is a rectangular frame, the top-left vertex of the first picture is taken as the origin (0, 0) of the pixel coordinates, the horizontal direction is taken as the positive direction of the x-axis, and the vertically downward direction is taken as the positive direction of the y-axis); or a circle whose center is the central point of the first picture and whose radius is 50 pixels.
Then, the part of the first image enclosed by the prompt figure can be stored as the first captured image. Such an operation is suitable for capturing a part of the picture whose display position generally falls within a predetermined area.
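The three example indicator shapes above determine a capture region directly from the geometry of the first image. A sketch of the corresponding pixel bounds follows, using the assumed coordinate convention (origin at the top-left vertex, x horizontal, y downward); the shape names are invented for this sketch.

```python
def crop_bounds(img_w: int, img_h: int, shape: str):
    """Pixel bounds (x0, y0, x1, y1) for the three example indicator
    shapes, with the origin at the top-left vertex of the picture."""
    if shape == "left_half":             # rectangle over the left half
        return (0, 0, img_w // 2, img_h)
    if shape == "square_100":            # 100x100 square at (100, 50)
        return (100, 50, 200, 150)
    if shape == "center_circle":         # bounding box of the r=50 circle
        cx, cy, r = img_w // 2, img_h // 2, 50
        return (cx - r, cy - r, cx + r, cy + r)
    raise ValueError(shape)

print(crop_bounds(640, 480, "square_100"))  # (100, 50, 200, 150)
```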
Further, before the position of the at least part of the image region in the first image is determined according to the position of the position indicator in the first image, the display processing method according to an embodiment of the present invention can also include: changing the position of the position indicator on the first image according to the input operation.
That is, by moving a finger or stylus on the touchpad, moving the palm in the shooting space, having the user or sounding body say "move up/down/left/right", or rocking the electronic equipment up/down/left/right, the prompt figure with the predetermined shape can be moved within the first image, enabling the user to take a screenshot of a part of the picture with the predetermined shape at positions of the first image other than the predetermined position.
In the second case, the position indicator displayed superposed on the first image can also simply be an indication icon (e.g., an arrow, circle, triangle, etc.) shown at a predetermined position of the first image.
In this case, determining the position of the at least part of the image region in the first image according to the position of the position indicator in the first image may include: changing the position of the position indicator on the first image according to the input operation; and determining the position of the at least part of the image region in the first image according to the position change of the position indicator in the first image.
That is, by moving a finger or stylus on the touchpad, moving the palm in the shooting space, having the user or sounding body say "move up/down/left/right", or rocking the electronic equipment up/down/left/right, the position of the indication icon can be moved within the first image, enabling the user to take a screenshot of the part of the picture located in the desired region of the first image.
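The movable indication icon and the capture region derived from its final position can be sketched as below; the step size, the square capture region, and its half-width are illustrative assumptions, not details specified by the patent.

```python
class Indicator:
    """An indication icon moved by 'up/down/left/right' input operations;
    the capture region is then taken as a square centered on it."""
    def __init__(self, x: int, y: int):
        self.x, self.y = x, y

    def move(self, command: str, step: int = 10):
        dx, dy = {"left": (-step, 0), "right": (step, 0),
                  "up": (0, -step), "down": (0, step)}[command]
        self.x += dx
        self.y += dy

    def region(self, half: int = 50):
        """(x0, y0, x1, y1) of the square capture region."""
        return (self.x - half, self.y - half, self.x + half, self.y + half)

ind = Indicator(100, 100)
ind.move("right")
ind.move("down")
print(ind.region())  # (60, 60, 160, 160)
```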
Below, a concrete example of the display processing method according to an embodiment of the present invention is described with reference to Fig. 3A to Fig. 3E.
In this concrete example, it is assumed that the collecting unit 106 equipped in the electronic equipment is a touchpad, that the action preset in the electronic equipment for triggering a full-screen screenshot is inputting a circular trace, that the action preset for triggering display of the indication icon for a region screenshot is double-tapping the touchpad, and that the action for triggering the region screenshot is an input trajectory forming a closed shape.
Fig. 3A illustrates a schematic diagram of the first image displayed in the first display unit according to an embodiment of the present invention; Fig. 3B and Fig. 3C illustrate schematic diagrams of the full-screen screenshot processing according to an embodiment of the present invention; and Fig. 3D and Fig. 3E illustrate schematic diagrams of the region screenshot processing according to an embodiment of the present invention.
As illustrated in Fig. 3A, the first display unit of the electronic equipment displays the first image, which includes the word "联想" (Lenovo).
When the user wishes to take a full-screen screenshot of the first image, the user can input a preset gesture through the touchpad equipped in the electronic equipment. After the user operates the touchpad, the electronic equipment can detect the data reported by the touchpad.
As illustrated in Fig. 3B, as mentioned above, only when the user has input a circular trace on the touchpad, that is, only after the electronic equipment detects that the figure drawn by the user is the preset gesture (a circle), does the electronic equipment capture the first image displayed on the LOE in its entirety, as illustrated in Fig. 3C. Next, the electronic equipment can cache this captured image, or further store it in the memory (not shown) of the electronic equipment, either automatically or upon the user's confirmation.
As illustrated in Fig. 3D, as described above, after the user has input a double-tap gesture on the touchpad, a position indication icon (a triangle) can be displayed superposed on the first image. Next, the user can move the position of the triangular icon by moving a finger on the touchpad. Only when the user has input a closed trajectory on the touchpad, that is, only after the electronic equipment detects that the triangular icon, under the user's control, has traced out a closed shape (a circle), does the electronic equipment capture the region of the first image displayed on the LOE that lies within the closed shape, as illustrated in Fig. 3E. Next, the electronic equipment can cache this captured image, or store it further.
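A simple way to decide that the traced trajectory forms a closed shape is to check whether its end point returns close to its start point. This is only a rough stand-in for whatever gesture recognition the device actually uses; the tolerance value is an assumption.

```python
import math

def is_closed_trajectory(points, tolerance: float = 15.0) -> bool:
    """A track counts as closed when its end point returns within
    `tolerance` pixels of its start point and it has enough samples."""
    if len(points) < 3:
        return False
    (x0, y0), (x1, y1) = points[0], points[-1]
    return math.hypot(x1 - x0, y1 - y0) <= tolerance

# Roughly one full turn around (100, 100) closes on itself; a straight
# swipe does not.
circle = [(100 + 40 * math.cos(t / 10), 100 + 40 * math.sin(t / 10))
          for t in range(0, 63)]
assert is_closed_trajectory(circle)
assert not is_closed_trajectory([(0, 0), (50, 0), (100, 0)])
```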
After at least part of the image region of the first image is stored as the first captured image (step S130), the display processing method according to an embodiment of the present invention can also include: after establishing a communication connection with another device, sending the first captured image to the other device so that the other device displays the first captured image.
To this end, the electronic equipment can also include a communication unit (not shown) for establishing communication connections with other devices. For example, through the communication unit, the electronic equipment can connect to other devices via a wired and/or wireless network and transmit the first captured image obtained by the foregoing operations in a data format according to a protocol.
For example, the communication unit can be a wireless communication module for connecting to the Internet through a wireless local area network (WLAN) communication standard, a mobile communication module for connecting to the Internet through a mobile communication standard, or a Bluetooth communication module or near-field communication module for connecting to other devices through a short-range communication standard, etc.
In this way, the first image displayed in the first display unit, which is not suitable for sharing among multiple users because of its narrow viewing area, can be sent in whole or in part to the devices of other users, so that pictures can easily be shared among multiple users.
Further, storing at least a part of the image region in the first image as the first captured image (that is, step S130) may include: within a first time period, repeatedly obtaining the at least a part of the image region at a predetermined time interval, and storing the repeatedly obtained image regions as a plurality of first captured images. In this case, sending the first captured image to the other device may include: sending the plurality of first captured images to the other device, so that the other device displays the plurality of first captured images in sequence.
In this way, by keeping the predetermined time interval between screenshots sufficiently small, video sharing, in addition to picture sharing, can easily be carried out among multiple users. For example, if a video stream is being displayed or played on the current user's electronic equipment, the picture on this electronic equipment can be shared among multiple users by continuously performing the screenshot operation and sending the screenshot pictures. Through the foregoing operations, the first image displayed on the relatively small first display unit can also be rendered on another device that has a larger display unit, which is convenient for the user to view and reduces the user's visual fatigue.
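The interval-capture behavior above can be sketched as a simple loop. `grab_frame`, `period_s`, and `duration_s` are hypothetical names, and real code would be driven by a hardware timer rather than this simulated clock; the sketch only shows the bookkeeping.

```python
def capture_sequence(grab_frame, period_s, duration_s):
    """Capture the displayed image repeatedly at a fixed interval.

    grab_frame: callable returning the current first image (any object).
    period_s: the predetermined time interval between captures.
    duration_s: the first time period during which capture repeats.
    The time source is simulated so the sketch stays deterministic.
    """
    frames = []
    t = 0.0
    while t < duration_s:            # within the first time period...
        frames.append(grab_frame())  # ...store one more first captured image
        t += period_s
    return frames  # sent in order; the other device displays them in sequence
```

With a small enough `period_s`, the sequence of captured images approximates the video stream being played.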
Therefore, in the foregoing description, this scheme is realized based on a specific operation on the touch pad (drawing a specific preset pattern on the touch pad): after the electronic equipment detects that the touch pad operation is the specific operation, it takes a screenshot of the LOE screen, where either the whole screen can be captured, or a region can be captured according to the shape drawn on the touch pad, so that the screenshots can later be saved in the electronic equipment and/or shared with others through wireless and/or wired communication such as a mobile network, the Bluetooth protocol (BT), or the Wireless Fidelity protocol (Wi-Fi).
Conversely, the display processing method according to the embodiment of the present invention may further include: after establishing a communication connection with another device, receiving a second captured image from the other device, where the other device stores at least a part of the image region in a second image output by a second display unit as the second captured image; and switching the output of the first display unit from the first image to the second captured image.
In this way, the user can also obtain operation data from the other device and respond locally, so that the output of the first display unit switches from the local first image to the remote second captured image, thereby better achieving two-way or multi-way remote interaction.
As can be seen, with the display processing method according to the embodiment of the present invention, it can be determined whether an input operation received on the electronic equipment satisfies a first predetermined condition, and when the first judgment result indicates that the input operation satisfies the first predetermined condition, at least a part of the image region in the first image is stored as a first captured image. The user can store the first captured image obtained in the electronic equipment for later operations such as image viewing, image processing, and picture sharing. Therefore, the present invention provides an image capture mode for a display unit that is suitable only for viewing by the current user alone, enriching the display processing of the electronic equipment and improving the user experience.
Next, a display processing apparatus according to an embodiment of the present invention will be described with reference to Fig. 4.
Fig. 4 illustrates a block diagram of the display processing apparatus according to the embodiment of the present invention.
The display processing method according to the embodiment of the present invention illustrated in Fig. 1 can be realized by the display processing apparatus 10 illustrated in Fig. 4, and the display processing apparatus 10 can be applied to the electronic equipment 100 illustrated in Fig. 2. As illustrated in Fig. 2, the electronic equipment 100 according to the embodiment of the present invention may at least include: a first display unit 104 and a collecting unit 106.
The first display unit 104 is used for displaying a first image, and the first display unit 104 has a viewing region such that only when the user's eyes are aligned with this viewing region can the first image be viewed from the first display unit 104.
The collecting unit 106 is used for detecting an input operation. To this end, the collecting unit 106 has a detection region for detecting the input operation, enabling the user to input various input operations in this detection region, so that the electronic equipment performs various display processing, such as picture sharing, according to the input operation.
In addition, the display processing apparatus 10 can communicate with the electronic equipment 100 in any manner.
In one example, the display processing apparatus 10 can be integrated into the electronic equipment 100 as a software module and/or a hardware module; in other words, the electronic equipment 100 can include the display processing apparatus 10. For example, when the electronic equipment 100 is a mobile phone, the display processing apparatus 10 can be a software module in the operating system of the mobile phone, or an application program developed for the mobile phone; of course, the display processing apparatus 10 can equally be one of the numerous hardware modules of the mobile phone.
Alternatively, in another example, the display processing apparatus 10 can also be a device separate from the electronic equipment 100, in which case the display processing apparatus 10 can be connected to the electronic equipment 100 through a wired and/or wireless network and transmit interactive information in an agreed data format.
As illustrated in Fig. 4, the display processing apparatus 10 may include: an operation receiving unit 11, a first judging unit 12, and an image storage unit 13.
The operation receiving unit 11 may be used for detecting an input operation through the collecting unit when the first image is output through the first display unit.
The first judging unit 12 may be used for judging whether the input operation satisfies the first predetermined condition and obtaining a first judgment result.
The image storage unit 13 may be used for storing at least a part of the image region in the first image as a first captured image if the first judgment result indicates that the input operation satisfies the first predetermined condition.
In one example, the first display unit 104 can include a first display assembly and a first optical assembly; the first display assembly is for producing light corresponding to the first image according to the first image, and the first optical assembly is for performing optical path conversion on the light corresponding to the first image, so as to form a magnified virtual image corresponding to the first image.
In one example, the detection region of the collecting unit 106 may be non-overlapping with the viewing region of the first display unit 104, where the detection region of the collecting unit 106 can be the region for detecting the input operation, and the viewing region of the first display unit 104 can be the region corresponding to the light exiting the first optical assembly after optical path conversion.
In one example, the display processing apparatus 10 may further include: a second judging unit 14, a prompt display unit 15, and a position determination unit 16.
The second judging unit 14 may be used for judging, after the input operation on the electronic equipment is received, whether the input operation satisfies a second predetermined condition, and obtaining a second judgment result.
The prompt display unit 15 may be used for displaying a position indicator superimposed on the first image if the second judgment result indicates that the input operation satisfies the second predetermined condition.
The position determination unit 16 can determine the position of the at least a part of the image region in the first image according to the position of the position indicator in the first image.
Specifically, the position determination unit 16 can determine the position of the at least a part of the image region in the first image according to the position of the position indicator in the first image by the following operations: changing the position of the position indicator on the first image according to the input operation; and determining the position of the at least a part of the image region in the first image according to the position change of the position indicator in the first image.
In one example, the display processing apparatus 10 may further include: an image transmitting unit 17.
The image transmitting unit 17 may be used for, after a communication connection with another device is established, sending the first captured image to the other device, so that the other device displays the first captured image.
Specifically, the image storage unit 13 can store at least a part of the image region in the first image as the first captured image by the following operations: repeatedly obtaining the at least a part of the image region at a predetermined time interval within a first time period, and storing the repeatedly obtained image regions as a plurality of first captured images. In this case, the image transmitting unit 17 can send the first captured image to the other device by the following operation: sending the plurality of first captured images to the other device, so that the other device displays the plurality of first captured images in sequence.
In one example, the display processing apparatus 10 may further include: an image receiving unit 18 and an output switching unit 19.
The image receiving unit 18 may be used for receiving a second captured image from another device after a communication connection with the other device is established, where the other device stores at least a part of the image region in a second image output by a second display unit as the second captured image.
The output switching unit 19 can switch the output of the first display unit from the first image to the second captured image.
The concrete configuration and operation of each unit in the display processing apparatus 10 according to the embodiment of the present invention have been discussed in detail above in the display processing method described with reference to Fig. 1 to Fig. 3E, and therefore their repeated description will be omitted.
As can be seen, with the display processing apparatus according to the embodiment of the present invention, it can be determined whether an input operation received on the electronic equipment satisfies the first predetermined condition, and when the first judgment result indicates that the input operation satisfies the first predetermined condition, at least a part of the image region in the first image is stored as the first captured image. The user can store the first captured image obtained in the electronic equipment for later operations such as image viewing, image processing, and picture sharing. Therefore, the present invention provides an image capture mode for a display unit that is suitable only for viewing by the current user alone, enriching the display processing of the electronic equipment and improving the user experience.
It should be noted that although the above units are taken herein as the execution subjects of the respective steps to illustrate the embodiments of the present invention, those skilled in the art will appreciate that the invention is not limited thereto. The execution subject of each step can be served by one or more other units, components, or even modules.
Next, configuration examples of the electronic equipment according to the embodiments of the present invention will be described in further detail.
At present, wearable electronic equipment such as a smart watch is typically equipped with a traditional display, such as a liquid crystal display (LCD), an organic electroluminescent display, or an organic light-emitting diode (OLED) display. However, limited by the size of the wearable electronic equipment such as the smart watch itself, the display area of its traditional display is generally very small and can display only limited information.
Therefore, an embodiment of the present invention further provides electronic equipment that is not subject to the size restriction of the electronic equipment itself and can provide image or video display of larger size and higher resolution. In addition, it can also make picture sharing among multiple users possible in different manners according to the different usage scenarios and demands of users, thereby improving the user experience of the electronic equipment.
First, electronic equipment according to a first embodiment of the present invention will be described with reference to Fig. 5.
Fig. 5 illustrates a functional block diagram of the electronic equipment according to the first embodiment of the present invention.
As illustrated in Fig. 5, the electronic equipment 100 according to the first embodiment of the present invention may include: a processing unit 103, a first display unit 104, and a collecting unit 106.
First, the processing unit 103 is for generating the image to be displayed and performing display control and collection control.
For example, the processing unit 103 can include a central processing unit (CPU), a micro processing unit (MPU), a digital signal processor (DSP), a field programmable gate array (FPGA), an application specific integrated circuit (ASIC), and/or other chips with processing capability.
Secondly, the first display unit 104 includes a first visible portion (also referred to as a first viewing area) and is used for displaying the first image; the first visible portion is the part of the first display unit that the user watches so as to perceive its displayed content. More specifically, the first display unit 104 can, under the display control performed by the processing unit 103, output the first image generated by the processing unit 103, so that the user can perceive the first image through the first visible portion. For example, the first image can be any kind of visual data, including but not limited to: an image, a video, text, or, even more generally, the graphical user interface of an application program or the standby picture of the electronic equipment 100.
For example, the first visible portion of the first display unit 104 can have a viewing region such that only when the user's eyes are aligned with the viewing region can the first image be viewed from the first display unit 104.
For example, the first display unit 104 can be a display unit following any of various display principles. In one embodiment of the present invention, the first display unit 104 can be a near-eye optical display system; that is to say, the first display unit 104 is for outputting a virtual image corresponding to the first image. Specifically, the first display unit 104 can include a first display assembly and a first optical assembly; the first display assembly is for producing light corresponding to the first image according to the first image, and the first optical assembly is for performing optical path conversion on the light corresponding to the first image, so as to form a magnified virtual image corresponding to the first image.
Limited by the display principle of the near-eye optical display system, the visible angle of the viewing region is relatively narrow and the viewing distance is relatively near. That is to say, only when the user's eyes are close to the first display unit 104 and aligned with the first display unit 104 can the user view the magnified virtual image corresponding to the first image from the first viewing area of the first display unit 104.
Finally, the collecting unit 106 is for detecting an input operation within a detection region, so as to control the electronic equipment to perform picture sharing processing according to the input operation. More specifically, the collecting unit 106 can, under the detection control performed by the processing unit 103, detect changes of various parameter values within the detection region.
In order for the collecting unit 106 to be able to detect the various operations input by the user, the collecting unit 106 can include a detection region. Moreover, as mentioned above, since the first display unit 104 is a near-eye optical display system, its viewing region is relatively small. Therefore, so that the user does not interfere with normal viewing when inputting various operations, the detection region of the collecting unit 106 can, for instance, be non-overlapping, or at least partly non-overlapping, with the viewing region of the first display unit 104. Here, the detection region of the collecting unit 106 is the region for detecting the input operation, and the viewing region of the first display unit 104 is the region corresponding to the light exiting the first optical assembly after optical path conversion.
To this end, the collecting unit 106 can be a detection unit following any of various collection principles.
In a first example, the collecting unit 106 can include a touch sensing unit for producing current changes of varying intensity in response to a good conductor approaching or contacting the detection region, as a first parameter for characterizing the input operation, and determining, through track recognition of the positions of the touch points, whether the user wishes to perform picture sharing processing.
Specifically, the sensing unit 106 can be a contact sensing unit, that is, an input unit that controls the electronic equipment 100 through physical contact (a touch operation) of an operating body (such as a finger or a stylus) on a contact sensing region (such as the smooth surface of an operation panel). For example, the touch sensing unit can be a contact touch pad or the like.
Alternatively, the sensing unit 106 can also be a hover sensing unit, that is, an input unit that controls the electronic equipment 100 through non-physical contact (a hover operation) of an operating body (such as a finger or a stylus) within a certain distance (such as 2 centimeters (cm)) above the contact sensing region (such as the smooth surface of the operation panel). For example, the touch sensing unit can be a contactless touch pad or the like.
After the sensing unit 106 detects the sensing control operation issued by the user, the processing unit 103 can, through a mapping relation, by querying a mapping table, a mapping file, or a database, control the electronic equipment to respond to the sensing control operation issued by the user. For example, when the sensing unit 106 senses that the user performs a single-click operation (in the case of contact sensing) or a finger press (in the case of contactless sensing) on the operation panel, the processing unit 103 can determine whether the above interactive control operation of the user satisfies the first predetermined condition, and accordingly control whether the electronic equipment 100 completes the picture sharing processing.
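The mapping-based response described above can be sketched as a dictionary lookup. The gesture names and responses below are illustrative assumptions, since the description leaves the contents of the mapping table unspecified; a real device might keep this mapping in a file or database instead.

```python
# Hypothetical mapping from a recognized sensing control operation to a
# response; the entries here are assumptions for illustration only.
GESTURE_MAP = {
    "circle": "full_screenshot",
    "double_click": "show_position_indicator",
    "closed_shape": "region_screenshot",
}

def dispatch(operation, mapping=GESTURE_MAP):
    """Query the mapping relation and return the control response,
    or None when the operation satisfies no predetermined condition."""
    return mapping.get(operation)
```

An unmapped operation simply produces no response, which corresponds to the input operation failing the predetermined condition.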
In a second example, the collecting unit 106 can include an image capturing unit for capturing images to generate a first image capture signal, as a first parameter for characterizing the input operation, and determining through action or gesture recognition whether the user wishes to perform picture sharing processing.
Specifically, the collecting unit 106 can be an image capturing unit (such as a camera) for capturing the interactive actions of an operating body (such as an iris, a finger, a palm, a limb action, or a user posture) in an image capture space. In the electronic equipment 100, this image capturing unit can be used to shoot the user's form, gesture actions, or limb actions, so that the processing unit 103 judges according to these actions whether the above interactive control operation of the user satisfies the first predetermined condition, and accordingly controls whether the electronic equipment 100 completes the picture sharing processing.
In a third example, the collecting unit 106 can include a sound capturing unit for capturing sound to generate a first sound capture signal, as a first parameter for characterizing the input operation, and determining through speech recognition whether the user wishes to perform picture sharing processing.
Specifically, the collecting unit 106 can be a sound capturing unit (such as a microphone) for capturing and recognizing the user's voice commands. In the electronic equipment 100, this sound capturing unit can be used to capture the sound made by the user or by other sound-producing devices, so that the processing unit 103 judges according to these sounds whether the above interactive control operation of the user satisfies the first predetermined condition, and accordingly controls whether the electronic equipment 100 completes the picture sharing processing.
In the electronic equipment 100, when the first display unit 104 outputs the first image, the collecting unit 106 detects an input operation, and the processing unit 103 judges whether the input operation satisfies the first predetermined condition; if the input operation satisfies the first predetermined condition, at least a part of the image region in the first image is stored as the first captured image.
For example, the storing operation of the first captured image can be volatile caching of the captured image in the screen buffer, or non-volatile storage of the captured image in a physical memory.
To this end, the electronic equipment 100 may further include a storage unit (not shown) for storing the first captured image. Correspondingly, the processing unit 103 can also be used for performing storage control.
In addition, in order to interact with other devices, the electronic equipment 100 may further include a communication unit (not shown) for, after a communication connection with another device is established, sending the first captured image to the other device, so that the other device displays the first captured image.
Furthermore, the communication unit can also be used for receiving operation data from the other device after the communication connection with the other device is established, and the processing unit 103 is further used for switching, in response to the operation data, the output of the first display unit from the first image to a second image.
In this way, through the operation of the communication unit, the electronic equipment 100 can not only send the locally captured image to the other device for display, but can also obtain operation data from the other device and respond locally (such as turning the page of a currently displayed photo, performing an operation on a currently running game, or modifying the text currently being edited), thereby realizing interactive operation between the devices.
Furthermore, by continuously taking screen captures in the electronic equipment 100, transmitting the screenshots to the other device, receiving operation data from the other device, making an operation response locally, continuing the screen capture, continuing to transmit the screenshots to the other device, and so on, uninterrupted remote control of the electronic equipment 100 from the other device can be realized.
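The capture, transmit, receive, respond cycle above can be sketched as a loop with all device interfaces injected as callables; their names (`grab_frame`, `send`, `receive_op`, `apply_op`) are assumptions for illustration, not interfaces defined in this description.

```python
def remote_control_session(grab_frame, send, receive_op, apply_op, rounds):
    """A hedged sketch of the uninterrupted remote-control loop:
    capture -> transmit -> receive operation data -> respond -> repeat.
    All callables are injected so the loop stays testable."""
    log = []
    for _ in range(rounds):
        frame = grab_frame()   # continuously take a screen capture
        send(frame)            # transmit the screenshot to the other device
        op = receive_op()      # receive operation data from the other device
        if op is not None:
            apply_op(op)       # make the operation response locally
        log.append((frame, op))
    return log
```

Because the next captured frame reflects the locally applied operation, the other device sees its own commands take effect, which is what makes the remote control appear uninterrupted.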
As can be seen, the electronic equipment according to the first embodiment of the present invention can display a magnified virtual image by means of the display assembly and the optical system, so that, without being restricted by the size of the electronic equipment itself, it provides image or video display of larger size and higher resolution. In addition, it can use the input operation collection function provided by the various collecting units to realize picture sharing operation and control suitable for this electronic equipment, so that interaction and sharing among multiple users can be carried out with the electronic equipment, providing an optimal user experience. Meanwhile, compared with a micro projector, which can equally perform picture sharing operations to show a larger image, this electronic equipment has very low power consumption, is less restricted by the usage environment, and also provides good privacy of use.
In addition, it can be seen that each step in the display processing method according to the embodiments of the present invention as above, and each operation performed by the operation receiving unit 11, the first judging unit 12, the image storage unit 13, the second judging unit 14, the prompt display unit 15, the position determination unit 16, and the image transmitting unit 17 in the display processing apparatus according to the embodiments of the present invention, can be realized uniformly by the processing unit 103 in the electronic equipment 100.
Next, the external structure and concrete implementations of the electronic equipment according to the embodiments of the present invention will be described in further detail with reference to Fig. 6A to Fig. 8C.
For example, the electronic equipment 100 may be in various forms. For example, the electronic equipment can be wearable electronic equipment or non-wearable electronic equipment.
In one embodiment, so that the user carries the electronic equipment more conveniently, the electronic equipment according to the embodiment of the present invention may be in the form of wearable electronic equipment. That is, the electronic equipment can be wearable electronic equipment that can be worn on the user's arm, wrist, finger, or the like, thus forming arm-band electronic equipment (for example, in armlet form), wrist-worn electronic equipment (for example, in watch or bracelet form), or finger-worn electronic equipment (in ring form), etc.
In other embodiments, the electronic equipment can also be non-wearable electronic equipment, that is, electronic equipment of a common form, which can be attached to a particular body part of the user by being gripped or clamped (for example, held in the user's hand), or can be attached to the particular body part of the user (such as the user's arm, wrist, or finger) through a specific wearing accessory (for example, a carrying bag or a strap tied to the arm, wrist, or finger). In this way, the versatility of the electronic equipment in use and carrying can be well ensured, meeting the usual requirements of various users.
Below, for ease of description, wearable electronic equipment such as a smart watch is taken as an example.
As wearable electronic equipment, in order to be attached to a particular body part of the user, the electronic equipment according to the embodiment of the present invention may include: a body apparatus and a fixing device.
First, the electronic equipment according to the embodiment of the present invention will be described in detail with reference to Fig. 6A and Fig. 6B. For example, the electronic equipment according to the embodiment of the present invention can be wearable electronic equipment such as a smart watch. Of course, as described before, those skilled in the art will readily understand that the electronic equipment according to the embodiment of the present invention is not limited thereto, but can include any electronic equipment having a display unit.
Fig. 6A and Fig. 6B illustrate structural block diagrams of the electronic equipment according to the first embodiment of the present invention. As shown in Fig. 6A and Fig. 6B, the electronic equipment 100 according to the embodiment of the present invention includes a body apparatus 101 and a fixing device 102. The fixing device 102 is connected with the body apparatus 101, and the fixing device 102 is used for fixing the relative positional relation between the electronic equipment and the user.
Obviously, this configuration of the body apparatus 101 and the fixing device 102 can be applied to all wearable electronic equipment, which can be smart glasses, a smart watch, a smart bracelet, a smart finger ring, or the like. Next, the wearable electronic equipment will continue to be described by taking the smart watch as an example.
In this smart watch, the fixing device 102 has at least one fixed state; in the fixed state, the fixing device 102 can serve as at least a part of a ring or approximate ring satisfying a first preset requirement, and the ring or approximate ring can surround the periphery of a column satisfying a second preset requirement.
Hereinafter, a first configuration example to a fourth configuration example of the fixing device will be described with further reference to Fig. 7A to Fig. 7D.
Fig. 7A to Fig. 7D respectively illustrate the first configuration example to the fourth configuration example of the fixing device in the electronic equipment according to the embodiment of the present invention.
In Fig. 7A to Fig. 7D, for brevity and clarity of description, only the body apparatus 101 and the fixing device 102 of the electronic equipment 100 are illustrated.
Specifically, Fig. 7A and Fig. 7B respectively illustrate two fixed states in which the fixing device 102 is connected with the body apparatus 101. In the first fixed state, as shown in Fig. 7A, the fixing device 102 and the body apparatus 101 form a closed ring, with the fixing device 102 and the body apparatus 101 each constituting a part of the ring. In the second fixed state, as shown in Fig. 7B, the fixing device 102 and the body apparatus 101 form an approximate ring with a small opening, with the fixing device 102 and the body apparatus 101 each likewise constituting a part of the ring. In a preferred embodiment of the present invention, the body apparatus 101 is the dial part of the smart watch, and the fixing device 102 is the strap part of the smart watch. The ring or approximate ring formed by the body apparatus 101 and the fixing device 102 can surround the wrist of the user of the smart watch, which serves as the column. The maximum width of the opening part of the approximate ring is less than the minimum width of the user's wrist (that is, the first preset requirement), so that the smart watch will not fall off the user's wrist, and the diameter of the ring or approximate ring is greater than the diameter of the user's wrist and less than the diameter of the user's fist (that is, the second preset requirement), enabling the user to wear the smart watch on his or her own wrist.
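The two preset requirements reduce to simple inequalities, shown here as a hedged sketch with hypothetical measurements (the description gives no concrete dimensions, and all values share one length unit).

```python
def watch_fits(opening_width, ring_diameter, wrist_min_width,
               wrist_diameter, fist_diameter):
    """Check the two preset requirements for the approximate ring.

    First preset requirement: the opening is narrower than the wrist's
    minimum width, so the watch cannot fall off.
    Second preset requirement: the ring is wider than the wrist but
    narrower than the fist, so it can be worn.
    """
    first_ok = opening_width < wrist_min_width
    second_ok = wrist_diameter < ring_diameter < fist_diameter
    return first_ok and second_ok
```

A fully closed ring (Fig. 7A) has no opening, so only the second requirement constrains it.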
Additionally, the ring or near-ring may of course be formed by the fixing device 102 alone. As shown in Fig. 7C and Fig. 7D, the body apparatus 101 may be mounted on the fixing device 102 (that is, the body apparatus 101 is attached to the fixing device 102 in surface contact), so that the fixing device 102 by itself forms the ring (Fig. 7C) or near-ring (Fig. 7D) surrounding the columnar body. The fixing device 102 is provided with a fixing mechanism (not shown) such as a buckle, a clasp, or a zipper.
Returning to Fig. 6A and Fig. 6B, the configuration of the electronic equipment 100 is described further.
More specifically, as shown in Fig. 6A and Fig. 6B, the body apparatus 101 is provided with a processing unit 103 and a first display unit 104. The processing unit 103 generates the images to be displayed and performs display control and acquisition control. The first display unit 104 outputs a first image. More specifically, under the display control performed by the processing unit 103, the first display unit 104 outputs the first image generated by the processing unit 103. In the electronic equipment 100 shown in Fig. 6A, the first display unit 104 is arranged on the body apparatus 101. However, those skilled in the art will readily understand that the invention is not limited to this. For example, in the electronic equipment 100 shown in Fig. 6B, the first display unit 104 may also be arranged on the fixing device 102.
More specifically, when the electronic equipment 100 includes multiple first display units 104, it is apparent that these first display units 104 may be arranged on the body apparatus 101 and/or the fixing device 102 respectively.
However, when the electronic equipment 100 includes only one first display unit 104, since this first display unit 104 may be composed of multiple independent functional modules, these functional modules may also be arranged on the body apparatus 101 and/or the fixing device 102 respectively; that is to say, this single first display unit 104 may also be arranged across the body apparatus 101 and/or the fixing device 102. For example, when the first display unit 104 is a near-eye optical display system, if the near-eye optical display system is narrowly interpreted as including only the first viewable portion, then the first display unit 104 may be arranged on either the body apparatus 101 or the fixing device 102. On the contrary, if the near-eye optical display system is broadly understood to include functional modules such as the first viewable portion, a lens group, and an internal imaging unit, then obviously these different functional modules may be arranged on the body apparatus 101 and/or the fixing device 102 respectively.
Additionally, the first display unit 104 may be a display unit following any of various display principles. For example, the first display unit 104 may be a near-eye optical display system.
More specifically, the first display unit 104 may include a first viewable portion 1041, which is the part of the first display unit 104 that the user views in order to perceive the displayed content. That is to say, depending on its principle, the first display unit 104 described below comprises multiple parts, of which the first viewable portion 1041 is the region where the user actually observes the displayed image content. Accordingly, the position of the first display unit 104 described above may in fact refer to the position of the first viewable portion 1041.
The principle and embodiments of the first display unit 104 are described in detail below with reference to Fig. 8A to Fig. 8D and Fig. 9A to Fig. 9C.
Fig. 8A to Fig. 8D illustrate the schematic diagram and embodiments of the near-eye optical display system adopted in the electronic equipment according to embodiments of the present invention.
Specifically, Fig. 8A illustrates the schematic diagram of the near-eye optical display system adopted as the first display unit 104 in the electronic equipment according to embodiments of the present invention. As shown in Fig. 8A, the light emitted by the micro-display unit 201 of the near-eye optical display system, corresponding to the image it displays, is received by an optical assembly 202 such as a lens group, which performs the corresponding optical path conversion. As a result, the converted light enters the pupil 203 of the viewer and forms a magnified virtual image.
Fig. 8B to Fig. 8D further illustrate three specific embodiments based on the schematic diagram of Fig. 8A. Specifically, the solution illustrated in Fig. 8B adopts a folded diffractive compound curved-surface design, in which the lens group 204 corresponds to the optical assembly 202 shown in Fig. 8A, thereby reducing the required lens volume. The solution illustrated in Fig. 8C adopts a free-form surface design, in which the free-form lens group 205, including curved surfaces 1, 2, and 3, corresponds to the optical assembly 202 shown in Fig. 8A, thereby further reducing the required lens volume. The solution illustrated in Fig. 8D adopts a parallel-plate design, which includes, in addition to the lens group 206 corresponding to the optical assembly 202 shown in Fig. 8A, an optical waveguide plate 207. By using the optical waveguide plate 207, the required lens thickness is reduced while the exit direction of the light forming the magnified virtual image (that is, the display direction of the magnified virtual image) can be controlled, for example translated. Those skilled in the art will readily understand that the near-eye optical display system adopted in the electronic equipment according to embodiments of the present invention is not limited to those shown in Fig. 8B to Fig. 8D, and other embodiments, such as a projection eyepiece design, may also be adopted.
Fig. 9A to Fig. 9C illustrate schematic diagrams of the display unit in the electronic equipment according to embodiments of the present invention. The first display unit 104 in the electronic equipment 100 according to embodiments of the present invention adopts the near-eye optical display system described above with reference to Fig. 8A to Fig. 8D. The first display unit 104 includes a first display module 301 and a first optical assembly 302 (first optical assemblies 302A to 302C in Fig. 9A to Fig. 9C). The first display module 301 produces, according to the first image, light corresponding to the first image; the first optical assembly 302 receives the light corresponding to the first image emitted from the first display module 301 and performs an optical path conversion on it, so as to form the magnified virtual image corresponding to the first image.
Here, the viewing area of the first display unit 104 may be the region corresponding to the exit light of the first optical assembly 302 after the optical path conversion.
Specifically, in Fig. 9A, the first display module 301 may be a micro-display, and the first optical assembly 302A is formed by a lens group. This lens group forms the magnified virtual image corresponding to the first image displayed by the first display module 301.
In Fig. 9B, the first display module 301 may also be a micro-display, and the first optical assembly 302B is formed by optics that perform multiple reflections within the equipment. In this case, compared with the first optical assembly 302A shown in Fig. 9A, the space required for the first display unit 104 can be reduced, thereby facilitating the design and manufacture of a more miniaturized electronic equipment.
In Fig. 9C, the first display module 301 may likewise be a micro-display, and the first optical assembly 302C is formed by a variable-focus lens group that performs flexible zooming within the equipment, driven by a drive unit (not shown). In this case, compared with the first optical assembly 302A shown in Fig. 9A, the size of the magnified virtual image displayed by the first display unit 104 can be adjusted dynamically by zooming, thereby meeting different user needs.
As shown in Fig. 9A to Fig. 9C, the region where the user actually observes the image content of the first display unit 104 is the first viewable portion 1041 described above with reference to Fig. 6A and Fig. 6B.
In the electronic equipment 100 described above with reference to Fig. 9A to Fig. 9C, at least a part of the first optical assembly 302 is an assembly whose light transmittance meets a predetermined condition in the direction pointing outward from the ring or near-ring. This at least a part of the first optical assembly 302 corresponds to the region where the image is displayed when display is performed. More generally, the light transmittance of the electronic equipment 100 in the outward direction of the ring or near-ring, at the location corresponding to the at least a part of the first optical assembly 302, meets the predetermined condition. Specifically, as in Fig. 8D, the at least a part of the first optical assembly 302 located on the display direction of the magnified virtual image corresponds to the portion of the optical waveguide plate 207. The light transmittance of the portion of the optical waveguide plate 207 that is viewed directly by the user's eyes meets the predetermined condition, whereas the portions not viewed directly by the user's eyes, such as those corresponding to the micro-display unit 201 and the lens group 206, need not have a light transmittance meeting the predetermined condition. The predetermined condition may be that the light transmittance is greater than or equal to a predetermined value. For example, this predetermined value may be 30%; preferably, it may be 70%. In this way, the user can observe his or her own skin through the electronic equipment 100.
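The transmittance condition above applies only to the directly viewed part of the optics. A minimal sketch of that check follows; the data layout, part names, and sample transmittance values are assumptions made for illustration:

```python
# A minimal sketch of the transmittance condition described above: only the
# parts of the optical assembly that the user's eyes view directly (e.g. the
# waveguide-plate region) must meet the predetermined transmittance value.
# Part names and values below are illustrative assumptions.

PREDETERMINED_VALUE = 0.70  # preferably 70%; 30% is also given as an example

def parts_meeting_condition(parts, directly_viewed,
                            threshold=PREDETERMINED_VALUE):
    """Check that every directly viewed part has transmittance >= threshold."""
    return all(parts[name] >= threshold for name in directly_viewed)

optics = {
    "waveguide_plate": 0.85,  # viewed directly; must be transparent enough
    "micro_display": 0.05,    # not viewed directly; no requirement
    "lens_group": 0.40,       # not viewed directly; no requirement
}
print(parts_meeting_condition(optics, {"waveguide_plate"}))  # True
```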
Returning again to Fig. 6A and Fig. 6B, the configuration of the electronic equipment 100 is described further.
More specifically, as shown in Fig. 6A and Fig. 6B, the body apparatus 101 is also provided with a collecting unit 106. The processing unit 103 performs detection control for this collecting unit 106. Under the detection control performed by the processing unit 103, the collecting unit 106 detects an input operation, so that the electronic equipment can be controlled to carry out picture-sharing processing according to the input operation. In the electronic equipment 100 shown in Fig. 6A, the collecting unit 106 is arranged on the body apparatus 101; however, those skilled in the art will readily understand that the invention is not limited to this. For example, in the electronic equipment 100 shown in Fig. 6B, the collecting unit 106 may also be arranged on the fixing device 102, as long as the detection region of the collecting unit 106 does not overlap (or at least does not completely overlap) the viewing area of the first viewable portion 1041.
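The picture-sharing processing triggered by the collecting unit can be sketched as follows. This is a hypothetical illustration only: the gesture used as the first predetermined condition (a long press), the image representation, and all names are assumptions, not the patent's implementation:

```python
# Hypothetical sketch of the flow driven by the collecting unit: a detected
# input operation is judged against a first predetermined condition (here,
# an assumed long-press gesture) and, if met, part of the first image is
# stored as a cropped image. All names and data shapes are illustrative.

def crop(image, region):
    """Extract a rectangular region (x, y, w, h) from a 2-D list 'image'."""
    x, y, w, h = region
    return [row[x:x + w] for row in image[y:y + h]]

def handle_input(first_image, operation, region, store):
    # First predetermined condition (assumed): a press lasting >= 1 second.
    if operation.get("type") == "press" and operation.get("duration", 0) >= 1.0:
        store.append(crop(first_image, region))
        return True
    return False

stored = []
image = [[(r, c) for c in range(8)] for r in range(6)]  # toy 8x6 "image"
handled = handle_input(image, {"type": "press", "duration": 1.5},
                       region=(2, 1, 4, 3), store=stored)
print(handled, len(stored), len(stored[0]), len(stored[0][0]))  # True 1 3 4
```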
One or more collecting units 106 may be provided in the electronic equipment 100, and they may be arranged at various positions on the electronic equipment 100.
More specifically, when the electronic equipment 100 includes multiple collecting units 106, it is apparent that these collecting units 106 may be arranged on the body apparatus 101 and/or the fixing device 102 respectively.
However, when the electronic equipment 100 includes only one collecting unit 106, since this collecting unit 106 may be composed of multiple independent functional modules, these functional modules may also be arranged on the body apparatus 101 and/or the fixing device 102 respectively; that is to say, this single collecting unit 106 may also be arranged across the body apparatus 101 and/or the fixing device 102. For example, when the collecting unit 106 is an image capturing unit such as a camera, if the image capturing unit is narrowly interpreted as including only a lens group, then the collecting unit 106 may be arranged on either the body apparatus 101 or the fixing device 102. On the contrary, if the image capturing unit is broadly understood to include functional modules such as the lens group, an internal imaging unit, and a shutter button, then obviously these different functional modules may be arranged on the body apparatus 101 and/or the fixing device 102 respectively.
Figure 10 illustrates the functional block diagram of the electronic equipment according to the second embodiment of the present invention, and Figure 11 illustrates the structural block diagram of the electronic equipment according to the second embodiment of the present invention.
As illustrated in Figure 10, compared with Fig. 5, the electronic equipment 100 according to the second embodiment of the present invention may further include a second display unit 105. The second display unit includes a second viewable portion (also referred to as a second viewing area) and is used to display a second image; the second viewable portion is the part of the second display unit that the user views in order to perceive its displayed content.
For example, the second viewable portion of the second display unit 105 may have a viewing area such that the second image generated by the processing unit 103 can be viewed from the second viewing area of the second display unit 105 only when the user's eyes are within this viewing area.
As shown in Figure 10 and Figure 11, in addition to the first display unit 104, the electronic equipment 100 may also include the second display unit 105, which may, for instance, be arranged on the body apparatus 101. The processing unit 103 generates the images to be displayed and performs display control and acquisition control. The second display unit 105 outputs the second image. More specifically, under the display control performed by the processing unit 103, the second display unit 105 outputs the second image generated by the processing unit 103, so that the user can perceive this second image through the second viewable portion. For example, this second image may be any kind of visual data, including but not limited to an image, a video, text, or, even more generally, the graphical user interface of an application program or the standby screen of the electronic equipment 100.
In the electronic equipment 100 shown in Figure 11, the second display unit 105 is arranged on the body apparatus 101. However, those skilled in the art will readily understand that the invention is not limited to this. For example, the second display unit 105 may also be arranged on the fixing device 102.
The second display unit 105 may be a display unit following any of various display principles. For example, the second display unit 105 may be an ordinary optical display system, including but not limited to a liquid crystal display unit, an organic electroluminescence display unit, an organic light-emitting diode display unit, an E-Ink display unit, and so on. Preferably, the second display unit 105 is a display unit of a different type from the first display unit 104.
More specifically, the second display unit 105 may include a second viewable portion 1051, which is the part of the second display unit 105 that the user views in order to perceive the displayed content. That is to say, depending on its principle, the second display unit 105 described below comprises multiple parts, of which the second viewable portion 1051 is the region where the user actually observes the displayed image content. Accordingly, the position of the second display unit 105 described above may in fact refer to the position of the second viewable portion 1051.
Hereinafter, different configuration examples of the first viewable portion and the second viewable portion will be described with reference to Figure 12A to Figure 12F.
Figure 12A and Figure 12B respectively illustrate the top view and the side view of the first configuration example of the viewable portions of the electronic equipment according to embodiments of the present invention.
As illustrated in Fig. 12A, in the first configuration example the first viewable portion 1041 and the second viewable portion 1051 overlap on the body apparatus 101. The invention is not limited to this; the first viewable portion 1041 and the second viewable portion 1051 may also overlap on the fixing device 102.
Figure 12B further illustrates the side view of the first configuration example, in which the first viewable portion 1041 and the second viewable portion 1051 overlap. Figure 12B shows the first display unit 104 configured with the first viewable portion 1041 and the second display unit 105 configured with the second viewable portion 1051. In the configuration shown in Figure 12B, at least the one of the first viewable portion 1041 and the second viewable portion 1051 located on the outside of the ring or near-ring has a light transmittance meeting a predetermined condition in the direction pointing outward from the ring or near-ring. The predetermined condition may be that the light transmittance is greater than or equal to a predetermined value (for example, 70%). In the example shown in Figure 12A and Figure 12B, the first viewable portion 1041 is located on the outside. The invention is not limited to this; the second viewable portion 1051 may instead be located on the outside. By making the light transmittance of the first viewable portion 1041 greater than or equal to the predetermined value, only one of the first viewable portion 1041 and the second viewable portion 1051 displays at a time, and the viewable portion that is not displaying does not affect the display function of the one that is displaying, thereby achieving a more compact configuration.
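The overlapped configuration implies a simple arbitration rule: because the outer viewable portion is sufficiently transparent when idle, at most one of the two stacked display units is active at a time. A toy sketch of that rule, with all names assumed for illustration:

```python
# Sketch of the arbitration implied by the overlapped configuration above:
# only one of the two stacked display units displays at a time, and the
# inactive (transparent) one does not block the active one. Illustrative
# names only; not the patent's implementation.

class StackedDisplays:
    def __init__(self):
        # "first" = near-eye unit (outer, transparent when idle);
        # "second" = ordinary display unit (inner).
        self.active = None

    def show_on(self, which):
        assert which in ("first", "second")
        self.active = which  # activating one implicitly blanks the other

    def is_displaying(self, which):
        return self.active == which

stack = StackedDisplays()
stack.show_on("second")
# The inactive outer portion stays transparent, so the active inner display
# remains visible through it:
print(stack.is_displaying("second"), stack.is_displaying("first"))  # True False
```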
Figure 12C and Figure 12D respectively illustrate the top view and the side view of the second configuration example of the viewable portions of the electronic equipment according to embodiments of the present invention.
As shown in Fig. 12C, in the second configuration example the first viewable portion 1041 and the second viewable portion 1051 are arranged adjacent to each other on the body apparatus 101 or the fixing device 102. In Figure 12C and Figure 12D, the first viewable portion 1041 and the second viewable portion 1051 are arranged adjacently on the body apparatus 101. The invention is not limited to this; the first viewable portion 1041 and the second viewable portion 1051 may be located on the body apparatus 101 and the fixing device 102 respectively, with the distance between them smaller than a threshold (for example, 1 centimeter).
Figure 12D further illustrates the side view of the second configuration example, in which the first viewable portion 1041 and the second viewable portion 1051 are arranged adjacently. As shown in Figure 12D, the first display unit 104 configured with the first viewable portion 1041 and the second display unit 105 configured with the second viewable portion 1051 are arranged adjacent to each other, and the display directions of both the first viewable portion 1041 and the second viewable portion 1051 are in the direction pointing outward from the ring or near-ring.
Figure 12E and Figure 12F respectively illustrate the top view and the side view of the third configuration example of the viewable portions of the electronic equipment according to embodiments of the present invention.
As shown in Fig. 12E, in the third configuration example the first viewable portion 1041 and the second viewable portion 1051 are arranged adjacently on the body apparatus 101 or the fixing device 102. Different from the second configuration example shown in Figure 12C and Figure 12D, as shown in Figure 12F, the display direction of one of the first viewable portion 1041 and the second viewable portion 1051 is in the direction pointing outward from the ring or near-ring, while the display direction of the other is perpendicular to the outward direction of the ring or near-ring.
Below, the appearance of the electronic equipment when adopting the viewable portions illustrated in Figure 12C and Figure 12D is described in detail with reference to Figure 13.
Figure 13 illustrates an external side view of the electronic equipment according to embodiments of the present invention.
As shown in Figure 13, the first display unit 104 and the second display unit 105 are preferably display units following different display principles. Specifically, the first display unit 104 may be a near-eye optical display system, the first viewable portion 1041 may be at least a part of the surface of the first optical assembly from which the light exits, and the first viewable portion 1041 may be arranged in the body apparatus 101. Additionally, the second display unit 105 may be an ordinary optical display system, and the second viewable portion 1051 may be the region corresponding to the second display screen of the second display unit 105.
Therefore, owing to the display principles of the near-eye display and the ordinary display, the first display unit 104 presents a virtual image that is magnified compared with the size of the first viewable portion 1041, while the second display unit 105 presents a real image as large as the second viewable portion 1051. The concrete imaging principles of both will be described in detail later.
It should be noted that although in Figure 13 the first display module and the first optical assembly (including a collimation unit and a waveguide unit) are all arranged in the body apparatus 101, the invention is not limited to this. For example, the first optical assembly may span the body apparatus 101 and the fixing device 102, while the first display module is arranged in the body apparatus 101.
With continued reference to Figure 13, the first display unit 104 and the second display unit 105 overlap radially on the ring, and the display of the first display unit 104 does not affect the display of the second display unit 105; similarly, the display of the second display unit 105 does not affect the display of the first display unit 104.
The first viewable portion 1041 of the first display unit 104 and the second viewable portion 1051 of the second display unit 105 are arranged adjacently on the same surface (that is, coplanar adjacent). However, the invention is not limited to this. Obviously, the first viewable portion 1041 and the second viewable portion 1051 may also be arranged adjacently on two mutually parallel surfaces (that is, parallel-surface adjacent), or on two surfaces at a certain angle to each other (that is, angle adjacent).
In one embodiment, as illustrated in Figure 13, owing to the display principles of the ordinary display and the near-eye display, the size of the second viewable portion 1051 may be larger than the size of the first viewable portion 1041. Therefore, when the distance between the user and the body apparatus is a second distance (for example, a relatively large distance value), the size of the second image presented according to the second display effect in the second viewable portion 1051, as viewed by the user, is larger than the size of the first image presented according to the first display effect in the first viewable portion 1041, as perceived by the user.
This phenomenon arises for the following reason: when the user (also referred to as the viewer) views from a position far from the electronic equipment 100 (a second distance at which the virtual image cannot be perceived), the size of the second image presented according to the second display effect in the second viewable portion 1051, as viewed by this user, is equal to the size of the second viewable portion 1051, whereas the first image presented according to the first display effect in the first viewable portion 1041, as perceived by this user, fails to form the intended virtual image and produces only a light spot, so its size is approximately equal to the size of the first viewable portion 1041.
In another embodiment, the second display unit 105 may include a second display screen having a second size, the second size being equal to the size of the second viewable portion. Meanwhile, the first display unit 104 includes a first display screen having a first size, the first size being smaller than the size of the first viewable portion.
Owing to the display principles of the ordinary display and the near-eye display, when the distance between the user and the body apparatus is a first distance (for example, a relatively near distance value), the size of the second image presented according to the second display effect in the second viewable portion, as viewed by the user, is equal to the size of the second viewable portion; and when the distance between the user and the body apparatus is this first distance (assumed to be a position at which the virtual image can be perceived), the size of the first image presented according to the first display effect in the first viewable portion, as perceived by the user, is larger than the size of the first viewable portion.
This phenomenon arises for the following reason: when the user views from a position close to the electronic equipment 100 (a first distance at which the virtual image can be perceived), the size of the second image presented according to the second display effect in the second viewable portion 1051, as viewed by this user, is still equal to the size of the second viewable portion 1051, whereas the first image presented according to the first display effect in the first viewable portion 1041, as perceived by this user, forms a magnified virtual image, so its size is larger than the size of the first viewable portion 1041. Depending on the settings of the first display unit 104, when the user views from exactly the appropriate position relative to the electronic equipment 100, the virtual-image size of the first image presented according to the first display effect in the first viewable portion 1041, as perceived by this user, may even reach several times, or even more than ten times, the real-image size of the second image.
As can be seen, in the electronic equipment according to the second embodiment of the present invention, the second display unit may be used to output a real image corresponding to the display image, so that a user located at the second distance from the electronic equipment can view the real image at the second viewable portion, and the first display unit is used to output a virtual image corresponding to the display image, so that a user located within the first distance of the electronic equipment can perceive the virtual image at the first viewable portion, where the size of the viewed real image is equal to the size of the second viewable portion and the size of the perceived virtual image is larger than the size of the first viewable portion. Therefore, this electronic equipment is not constrained by the size of a wearable electronic equipment such as a smart watch itself, and can provide image or video display of larger size and higher resolution.
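The size behaviour summarised above can be captured in a toy model. This is a sketch under stated assumptions only: the magnification factor, the distance values, and the hard threshold between the first and second distances are all hypothetical simplifications of the optics described:

```python
# A toy model of the perceived sizes described above, under assumptions:
# the real image always appears at the size of the second viewable portion;
# the virtual image appears magnified (by an assumed factor) within the
# first distance, but collapses to roughly the size of the first viewable
# portion (a light spot) at the farther second distance. All numbers are
# hypothetical.

def perceived_sizes(first_portion, second_portion, distance,
                    first_distance=0.05, magnification=10.0):
    """Return (virtual_image_size, real_image_size) at a viewing distance."""
    real = second_portion  # ordinary display: real image == portion size
    if distance <= first_distance:
        virtual = first_portion * magnification  # magnified virtual image
    else:
        virtual = first_portion  # only a spot about the portion's own size
    return virtual, real

near = perceived_sizes(first_portion=5.0, second_portion=20.0, distance=0.03)
far = perceived_sizes(first_portion=5.0, second_portion=20.0, distance=0.50)
print(near)  # (50.0, 20.0): virtual image larger than real image up close
print(far)   # (5.0, 20.0): only a spot, smaller than the real image
```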
Through the above description of the embodiments, those skilled in the art can clearly understand that the present invention can be implemented by means of software plus the necessary hardware platform, and of course can also be implemented entirely in software or in hardware. Based on such understanding, the technical solution of the present invention, in whole or in the part that contributes over the background art, can be embodied in the form of a software product. This computer software product can be stored in a storage medium, such as ROM/RAM, a magnetic disk, or an optical disc, and includes instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform the methods described in some parts of the embodiments of the present invention.
The embodiments of the present invention have been described in detail above. However, it should be appreciated by those skilled in the art that various modifications, combinations, or sub-combinations may be made to these embodiments without departing from the principles and spirit of the present invention, and such modifications should fall within the scope of the present invention.

Claims (13)

1. A display processing method, characterized in that the method includes:
when a first image is output by a first display unit, detecting an input operation by a collecting unit, wherein the first display unit includes a first display module and a first optical assembly, the first display module is configured to produce, according to the first image, light corresponding to the first image, and the first optical assembly is configured to perform an optical path conversion on the light corresponding to the first image, so as to form a magnified virtual image corresponding to the first image;
judging whether the input operation meets a first predetermined condition, and obtaining a first judgment result; and
if the first judgment result indicates that the input operation meets the first predetermined condition, storing at least a part of the image region in the first image as a first cropped image.
2. The method according to claim 1, characterized in that the method further includes:
after the input operation is detected by the collecting unit, judging whether the input operation meets a second predetermined condition, and obtaining a second judgment result;
if the second judgment result indicates that the input operation meets the second predetermined condition, displaying a position indicator superimposed on the first image; and
determining the position of the at least a part of the image region in the first image according to the position of the position indicator in the first image.
3. The method according to claim 2, characterized in that determining the position of the at least a part of the image region in the first image according to the position of the position indicator in the first image comprises:
changing the position of the position indicator on the first image according to the input operation; and
determining the position of the at least a part of the image region in the first image according to the change of the position of the position indicator in the first image.
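As an illustrative sketch of claims 2 and 3 (again, not part of the claims), a position indicator could be superimposed on the image, moved by input operations, and the captured region derived from the change of its position. The class and coordinate convention below are assumptions:

```python
# Hypothetical sketch of claims 2-3: a position indicator is superimposed on
# the first image, moved according to input operations, and the region to
# capture is derived from its position change. All names are illustrative.

class PositionIndicator:
    def __init__(self, x=0, y=0):
        self.x, self.y = x, y
        self.start = (x, y)  # position when region selection begins

    def move(self, dx, dy):
        """Change the indicator's position according to the input operation."""
        self.x += dx
        self.y += dy

    def region(self):
        """Rectangle spanned by the start position and the current position."""
        x0, y0 = self.start
        return (min(x0, self.x), min(y0, self.y),
                max(x0, self.x), max(y0, self.y))

ind = PositionIndicator(2, 3)
ind.move(5, -1)   # a drag input operation moves the indicator
ind.move(1, 4)    # a second drag refines the position
```

Here the region is simply the bounding rectangle between where the indicator started and where it ended up; other mappings from indicator movement to region are equally consistent with the claims.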
4. The method according to claim 1, characterized in that the method further comprises:
after a communication connection is established with another device, sending the first captured image to the other device, so that the other device displays the first captured image.
5. The method according to claim 4, characterized in that storing the at least a part of the image region in the first image as the first captured image comprises:
repeatedly obtaining the at least a part of the image region at a predetermined time interval during a first time period, and storing the repeatedly obtained at least a part of the image region as a plurality of first captured images; and
sending the first captured image to the other device comprises:
sending the plurality of first captured images to the other device, so that the other device displays the plurality of first captured images in sequence.
6. The method according to claim 5, characterized in that the method further comprises:
receiving operation data from the other device; and
in response to the operation data, switching an output of the first display unit from the first image to a second image.
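Claims 5 and 6 together describe a loop of periodic capture, transmission, and remote control. A minimal sketch, with the timing simulated by a counter rather than a real clock and all names hypothetical, might be:

```python
# Hypothetical sketch of claims 5-6: capture the region repeatedly at a
# predetermined interval during a first time period, send the captures to
# another device for sequential display, and switch the displayed image
# when operation data arrives from that device. Names are illustrative.

def periodic_capture(get_frame, period_ms, interval_ms):
    """Return one capture per interval over the first time period."""
    captures = []
    for t in range(0, period_ms, interval_ms):
        captures.append(get_frame(t))  # each entry is a "first captured image"
    return captures

def remote_session(captures, operation_data=None):
    """Send captures in order; switch the output if operation data arrives."""
    sent = list(captures)          # the other device displays these in sequence
    output = "first image"
    if operation_data is not None:  # claim 6: respond to the remote operation
        output = "second image"
    return sent, output

caps = periodic_capture(lambda t: f"frame@{t}ms", period_ms=100, interval_ms=25)
sent, output = remote_session(caps, operation_data={"action": "next"})
```

With a 100 ms period and a 25 ms interval this produces four captures; receiving any operation data switches the local output to the second image.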
7. A display processing apparatus, characterized in that the apparatus comprises:
an operation receiving unit, configured to detect, by a collecting unit, an input operation when a first image is output by a first display unit, wherein the first display unit comprises a first display assembly and a first optical assembly, the first display assembly is configured to generate, according to the first image, light corresponding to the first image, and the first optical assembly is configured to perform optical-path conversion on the light corresponding to the first image so as to form a magnified virtual image corresponding to the first image;
a first judging unit, configured to judge whether the input operation meets a first predetermined condition and obtain a first judgment result; and
an image storage unit, configured to, if the first judgment result indicates that the input operation meets the first predetermined condition, store at least a part of an image region in the first image as a first captured image.
8. An electronic device, characterized in that the electronic device comprises:
a processing unit, configured to generate an image to be displayed and to perform display control and collection control;
a first display unit, configured to output a first image, wherein the first display unit comprises a first display assembly and a first optical assembly, the first display assembly is configured to generate, according to the first image, light corresponding to the first image, and the first optical assembly is configured to perform optical-path conversion on the light corresponding to the first image so as to form a magnified virtual image corresponding to the first image; and
a collecting unit, configured to detect an input operation,
wherein, when the first display unit outputs the first image, the collecting unit detects the input operation, and the processing unit judges whether the input operation meets a first predetermined condition; if the input operation meets the first predetermined condition, at least a part of an image region in the first image is stored as a first captured image.
9. The electronic device according to claim 8, characterized in that a detection region of the collecting unit does not overlap a viewing area of the first display unit,
wherein the detection region of the collecting unit is the region for detecting the input operation, and the viewing area of the first display unit is the region corresponding to the light exiting the first optical assembly after the optical-path conversion.
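The non-overlap constraint of claim 9 is purely geometric. As a sketch only, with the regions modeled as axis-aligned rectangles and the example coordinates invented:

```python
# Hypothetical sketch of claim 9: check that the collecting unit's detection
# region does not overlap the first display unit's viewing area. Regions are
# (left, top, right, bottom) rectangles; the coordinates are illustrative.

def rects_overlap(a, b):
    """True if two (left, top, right, bottom) rectangles share any area."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

detection_region = (0, 0, 100, 50)   # e.g. a touch surface beside the optics
viewing_area = (120, 0, 320, 240)    # e.g. the exit-light area of the optics

non_overlapping = not rects_overlap(detection_region, viewing_area)
```

The claim simply requires `non_overlapping` to hold for the device's actual geometry, so that touching the input surface never obscures the virtual image.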
10. The electronic device according to claim 8, characterized in that the electronic device further comprises:
a storage unit, configured to store the first captured image,
wherein the processing unit is further configured to perform storage control.
11. The electronic device according to claim 8, characterized in that the electronic device further comprises:
a communication unit, configured to, after a communication connection is established with another device, send the first captured image to the other device, so that the other device displays the first captured image.
12. The electronic device according to claim 11, characterized in that
the communication unit is further configured to receive operation data from the other device after the communication connection is established with the other device, and
the processing unit is further configured to, in response to the operation data, switch an output of the first display unit from the first image to a second image.
13. The electronic device according to claim 8, characterized in that the electronic device further comprises:
a body apparatus, comprising the processing unit, wherein the processing unit is configured to generate the first image and to perform display control and collection control; and
a fixing apparatus, connected to the body apparatus, wherein the fixing apparatus has at least one fixed state; in the fixed state, at least a part of the fixing apparatus can serve as at least a part of a ring, or of an approximate ring meeting a first preset requirement, and the ring or the approximate ring can surround the periphery of a column meeting a second preset requirement.
CN201410803983.2A 2014-12-19 2014-12-19 Display processing method and display processing unit Active CN105786163B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410803983.2A CN105786163B (en) 2014-12-19 2014-12-19 Display processing method and display processing unit

Publications (2)

Publication Number Publication Date
CN105786163A true CN105786163A (en) 2016-07-20
CN105786163B CN105786163B (en) 2019-04-26

Family

ID=56386326

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410803983.2A Active CN105786163B (en) 2014-12-19 2014-12-19 Display processing method and display processing unit

Country Status (1)

Country Link
CN (1) CN105786163B (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102930263A (en) * 2012-09-27 2013-02-13 百度国际科技(深圳)有限公司 Information processing method and device
CN103455315A (en) * 2012-06-04 2013-12-18 百度在线网络技术(北京)有限公司 Method and equipment used for realizing screen capturing and acquiring corresponding target information
CN103713387A (en) * 2012-09-29 2014-04-09 联想(北京)有限公司 Electronic device and acquisition method
CN104077784A (en) * 2013-03-29 2014-10-01 联想(北京)有限公司 Method for extracting target object and electronic device

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106444023A (en) * 2016-08-29 2017-02-22 北京知境科技有限公司 Super-large field angle binocular stereoscopic display transmission type augmented reality system
CN106651823A (en) * 2016-12-28 2017-05-10 努比亚技术有限公司 Device and method for eliminating picture light spot and mobile terminal
CN106897605A (en) * 2017-02-13 2017-06-27 合肥铭锶伟途信息科技有限公司 A kind of cryptographic system based on multispectral gesture identification
CN111050091A (en) * 2019-12-23 2020-04-21 联想(北京)有限公司 Output control method and device and electronic equipment
CN111813473A (en) * 2020-06-23 2020-10-23 维沃移动通信有限公司 Screen capturing method and device and electronic equipment
CN112053286A (en) * 2020-09-04 2020-12-08 北京字节跳动网络技术有限公司 Image processing method, image processing device, electronic equipment and readable medium
CN112053286B (en) * 2020-09-04 2023-09-05 抖音视界有限公司 Image processing method, device, electronic equipment and readable medium

Also Published As

Publication number Publication date
CN105786163B (en) 2019-04-26

Similar Documents

Publication Publication Date Title
CN114402589B (en) Smart stylus beam and auxiliary probability input for element mapping in 2D and 3D graphical user interfaces
US9753518B2 (en) Electronic apparatus and display control method
EP4172726A1 (en) Augmented reality experiences using speech and text captions
US9727132B2 (en) Multi-visor: managing applications in augmented reality environments
CN105786163A (en) Display processing method and display processing device
CN104898406A (en) Electronic device and acquisition control method
KR102110208B1 (en) Glasses type terminal and control method therefor
US20210407205A1 (en) Augmented reality eyewear with speech bubbles and translation
JP2017091433A (en) Head-mounted type display device, method of controlling head-mounted type display device, and computer program
US11195341B1 (en) Augmented reality eyewear with 3D costumes
CN112835445B (en) Interaction method, device and system in virtual reality scene
JP6638392B2 (en) Display device, display system, display device control method, and program
CN105334718A (en) Display switching method and electronic apparatus
US20240094819A1 (en) Devices, methods, and user interfaces for gesture-based interactions
WO2022005733A1 (en) Augmented reality eyewear with mood sharing
US11826635B2 (en) Context-sensitive remote eyewear controller
US10558951B2 (en) Method and arrangement for generating event data
JP2017120302A (en) Display device, display system, control method of display device, and program
KR20240009984A (en) Contextual visual and voice search from electronic eyewear devices
US11762202B1 (en) Ring-mounted flexible circuit remote control
US11733789B1 (en) Selectively activating a handheld device to control a user interface displayed by a wearable device
US20240134492A1 (en) Digital assistant interactions in extended reality
US20240231558A9 (en) Digital assistant interactions in extended reality
WO2023230354A1 (en) Systems for interpreting thumb movements of in-air hand gestures for controlling user interfaces based on spatial orientations of a user's hand, and method of use thereof
WO2024064016A1 (en) Devices, methods, and user interfaces for gesture-based interactions

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant