CN105468135A - Information processing method and electronic device - Google Patents
- Publication number
- CN105468135A (application number CN201410455816.3A)
- Authority
- CN
- China
- Prior art keywords
- display
- user
- unit
- display area
- electronic device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Landscapes
- User Interface Of Digital Computer (AREA)
Abstract
Embodiments of the present invention disclose an information processing method and an electronic device. The method comprises: controlling a first display area of a first display module to output and display a first object identifier, so that a user in a first interaction mode can perceive the first object identifier through the first display area; obtaining a sensing parameter through a first sensing unit; determining the interaction mode of the electronic device according to the sensing parameter; when the sensing parameter indicates that the electronic device is in a second interaction mode, determining the first object identifier; generating a call instruction based on the first object identifier; in response to the call instruction, starting a first application corresponding to the first object identifier; and controlling a second display area of a second display module to output a first user interaction interface corresponding to the first application, so that the user in the second interaction mode can perceive the first user interaction interface through the second display area.
Description
Technical Field
The present invention relates to information processing technologies, and in particular, to an information processing method and an electronic device.
Background
At present, the notification bar area of an electronic device presents notification messages of the applications on the device. A notification message may be presented as the application's icon or as summary information of the message; in either case, the icon or summary information is merely an entry point to the detailed information corresponding to the notification, and the user must click the icon or the summary information to obtain that detailed information, which results in low information-acquisition efficiency.
If an information processing scheme allowed information to be acquired without a triggering operation on the icon or the summary information, information-acquisition efficiency could be greatly improved, along with the user's operation experience. However, the related art provides no effective solution to this problem.
Disclosure of Invention
In order to solve the existing technical problem, embodiments of the present invention provide an information processing method and an electronic device, which can improve information acquisition efficiency and improve user experience.
In order to achieve the above purpose, the technical solution of the embodiment of the present invention is realized as follows:
the invention provides an information processing method, which is applied to an electronic device, wherein the electronic device is provided with a first sensing unit, a first display module and a second display module; the first display module is provided with a first display area; the second display module is provided with a second display area; when the electronic device interacts with a user of the electronic device in a first interaction mode, the user can observe the first display area of the first display module; when the electronic device interacts with the user in a second interaction mode, the user can observe the second display area of the second display module; the method comprises the following steps:
controlling a first display area of the first display module to output and display a first object identifier, so that the user in the first interaction mode perceives the first object identifier through the first display area;
obtaining sensing parameters through the first sensing unit;
determining an interaction mode of the electronic device according to the sensing parameters;
when the sensing parameters represent that the electronic equipment is in the second interaction mode, determining the first object identifier; generating a call instruction based on the first object identification; responding to the calling instruction to start a first application program corresponding to the first object identifier;
and controlling the second display area of the second display module to output the first user interaction interface corresponding to the first application program, so that the user in the second interaction mode can perceive the first user interaction interface through the second display area.
In the above scheme, the display principles of the first display module and the second display module are different; the second display area is smaller than the first display area; the interaction modes of the user using the electronic equipment are different due to different display principles of the first display module and the second display module;
the first interaction mode is an observation mode that the eyes of the user are far away from the first display area of the first display module on the electronic equipment; wherein the eyes of a user cause the user to perceive a first perception screen when the user observes the first display region of the electronic device in the first interactive manner; the size of the first perception picture is equal to the first display area; the first perceptual picture comprises the first object identification;
the second interaction mode is an observation mode that the eyes of the user approach the second display area of the second display module on the electronic device, wherein when the user observes the second display area of the electronic device in the second interaction mode, the light beams emitted by the second display module are incident on the eyes of the user, so that the user perceives a second perception picture; the size of the second perception picture is larger than that of the second display area; the second perception screen includes the first user interaction interface.
In the above scheme, the second display area is smaller than or equal to a predetermined area; the first display area is larger than the predetermined area.
In the above scheme, the predetermined area is a cross-sectional area of a visual field of the eyes of the user when the eyes of the user and the electronic device meet a predetermined distance.
In the foregoing solution, before the controlling the first display area of the first display module to output and display the first object identifier, the method further includes:
obtaining sensing parameters through the first sensing unit;
when the sensing parameters indicate that the electronic device is in the first interaction mode, obtaining K object identifiers, wherein K is a positive integer;
controlling a first display area of the first display module to display one object identifier in the K object identifiers;
the controlling the first display area of the first display module to output and display the first object identifier includes:
obtaining a selection operation through a second sensing unit of the electronic device;
and determining a first object identifier from the K object identifiers according to the selection operation and displaying the first object identifier in a first display area of the first display module.
In the above scheme, the method further comprises:
obtaining an input operation aiming at the first user interaction interface through a third sensing unit of the electronic equipment;
changing the first user interaction interface to form a second user interaction interface in response to the input operation;
obtaining sensing parameters through the first sensing unit of the electronic device;
when the sensing parameters indicate that the electronic device is in the first interaction mode, processing the second user interaction interface to generate added information;
controlling the first object identifier to change according to the added information, and displaying and outputting the changed first object identifier in the first display area of the first display module; the changed first object identifier includes the added information.
In the above scheme, the method includes:
each time the sensing parameter indicates that the electronic device is in the second interaction mode, controlling the second display area of the second display module to update once the display parameter value of the bearing surface of the first user interaction interface corresponding to the first application program of the same object identifier.
In the above scheme, the electronic device is a wearable electronic device; the wearable electronic device includes: a frame body, a fixing device and a functional main body part;
the fixing device is connected with the frame body, the fixing device is used for fixing the electronic equipment on a support body, and the frame body and the fixing device form an annular space when the electronic equipment is fixed on the support body through the fixing device;
the functional main body part at least comprises a first display module and a second display module; the first display module is arranged in the frame body, and the second display module is arranged in the frame body;
the first display module is a first display screen, the first display screen is used for displaying and outputting second display content, and the size of the first display screen is a first display area of the first display module;
the second display module is an optical projection system, a first part of the second display module is a light conduction assembly, and a second part of the second display module is a display assembly and a collimation assembly;
the light conduction assembly is made of transparent materials;
the display component is used for displaying and outputting first display content and projecting first light beam output in a light beam mode;
the collimation assembly is used for processing and converting the first light beam projected to be output in the light beam mode into the second light beam output;
the light conduction assembly is used for conducting the second light beam in a transparent material forming the light conduction assembly, wherein the light conduction assembly comprises a reflection unit, the reflection unit is arranged in a specific area of the excess part, and the reflection unit is used for projecting the second light beam in a second direction by changing the conduction direction of the second light beam in the transparent material; the second direction is consistent with the output direction of the first content to be displayed of the first display screen of the first display module; the specific area of the light conduction assembly, in which the reflection unit is arranged, is a second display area of the second display module.
The invention provides an electronic device, which is provided with a first sensing unit, a first display unit and a second display unit; the first display unit is provided with a first display area; the second display unit is provided with a second display area; when the electronic device interacts with a user of the electronic device in a first interaction mode, the user can observe the first display area of the first display unit; when the electronic device interacts with the user in a second interaction mode, the user can observe the second display area of the second display unit; the electronic device further includes: a first output control unit, a first determining unit, a calling execution unit and a second output control unit; wherein,
the first output control unit is used for controlling a first display area of the first display unit to output and display a first object identifier, so that the user in the first interaction mode can perceive the first object identifier through the first display area;
the first determining unit is used for obtaining sensing parameters through the first sensing unit and determining an interaction mode of the electronic device according to the sensing parameters;
the calling execution unit is used for determining the first object identifier output and displayed by the first output control unit when the first determination unit determines that the electronic equipment is in the second interaction mode, and generating a calling instruction based on the first object identifier; responding to the calling instruction to start a first application program corresponding to the first object identifier;
the second output control unit is configured to control the second display area of the second display unit to output the first user interaction interface corresponding to the first application started by the call execution unit, so that the user in the second interaction manner perceives the first user interaction interface through the second display area.
In the above scheme, the display principles of the first display unit and the second display unit are different; the second display area is smaller than the first display area; the interaction modes of the users using the electronic equipment are different due to the fact that the display principles of the first display unit and the second display unit are different;
the first interaction mode is a viewing mode that the eyes of the user are far away from the first display area of the first display unit on the electronic equipment; wherein the eyes of a user cause the user to perceive a first perception screen when the user observes the first display region of the electronic device in the first interactive manner; the size of the first perception picture is equal to the first display area; the first perceptual picture comprises the first object identification;
the second interactive manner is an observation manner in which an eye of the user approaches the second display area of the second display unit on the electronic device, wherein when the user observes the second display area of the electronic device in the second interactive manner, a light beam emitted by the second display unit is incident on the eye of the user, so that the user perceives a second perception screen; the size of the second perception picture is larger than that of the second display area; the second perception screen includes the first user interaction interface.
In the above scheme, the second display area is smaller than or equal to a predetermined area; the first display area is larger than the predetermined area.
In the above scheme, the predetermined area is a cross-sectional area of a visual field of the eyes of the user when the eyes of the user and the electronic device meet a predetermined distance.
In the above scheme, the electronic device further includes a second sensing unit for obtaining a selection operation;
the first output control unit is further configured to obtain sensing parameters through the first sensing unit before the first display area of the first display unit is controlled to output and display the first object identifier; when the sensing parameters indicate that the electronic device is in the first interaction mode, obtain K object identifiers, wherein K is a positive integer; control the first display area of the first display unit to display one of the K object identifiers; and further configured to determine a first object identifier from the K object identifiers according to the selection operation obtained by the second sensing unit and display the first object identifier in the first display area of the first display unit.
In the above scheme, the electronic device further includes a third sensing unit, configured to obtain an input operation for the first user interaction interface;
the second output control unit is also used for responding to the input operation obtained by the third sensing unit to change the first user interaction interface to form a second user interaction interface;
the first determining unit is used for obtaining sensing parameters through the first sensing unit and determining an interaction mode of the electronic device according to the sensing parameters;
the first output control unit is further configured to process the second user interaction interface to generate added information when the first determining unit determines that the electronic device is in the first interaction mode; control the first object identifier to change according to the added information, and display and output the changed first object identifier on the first display area of the first display unit; the changed first object identifier includes the added information.
In the foregoing solution, the second output control unit is further configured to, each time the first determining unit determines that the electronic device is in the second interaction mode, control the second display area of the second display unit to update once the display parameter value of the bearing surface of the first user interaction interface corresponding to the first application program of the same object identifier started by the calling execution unit. The information processing method and the electronic device of the embodiments of the present invention are applied to an electronic device provided with a first sensing unit, a first display module and a second display module; the first display module is provided with a first display area; the second display module is provided with a second display area; when the electronic device interacts with a user of the electronic device in a first interaction mode, the user can observe the first display area of the first display module; when the electronic device interacts with the user in a second interaction mode, the user can observe the second display area of the second display module. The method comprises: controlling the first display area of the first display module to output and display a first object identifier, so that the user in the first interaction mode perceives the first object identifier through the first display area; obtaining sensing parameters through the first sensing unit; determining an interaction mode of the electronic device according to the sensing parameters; when the sensing parameters indicate that the electronic device is in the second interaction mode, determining the first object identifier; generating a call instruction based on the first object identifier; responding to the call instruction to start a first application program corresponding to the first object identifier; and controlling the second display area of the second display module to output the first user interaction interface corresponding to the first application program, so that the user in the second interaction mode can perceive the first user interaction interface through the second display area. Therefore, the application program corresponding to the first object identifier can be started without a triggering operation on the first object identifier, and the first user interaction interface corresponding to the first application program is output at the same time, so that information-acquisition efficiency is greatly improved and user experience is improved. On the other hand, the embodiments of the present invention provide two display interfaces: the first display module is used for displaying summary information, and the second display module is used for displaying more comprehensive detailed information. Thus, the electronic device provided by the embodiments of the present invention can provide image or video display of larger size and higher resolution without being limited by the size of the electronic device.
Drawings
Fig. 1 is a schematic flowchart of an information processing method according to a first embodiment of the present invention;
fig. 2a and 2b are schematic diagrams of an electronic device according to a second embodiment of the invention;
fig. 3a to fig. 3c are schematic diagrams illustrating an interaction method of an electronic device according to a second embodiment of the invention;
FIG. 4 is a flowchart illustrating an information processing method according to a third embodiment of the present invention;
FIG. 5 is a flowchart illustrating an information processing method according to a fourth embodiment of the present invention;
fig. 6a and fig. 6b are schematic diagrams of application scenarios according to a fourth embodiment of the present invention;
fig. 7a to 7d are schematic diagrams of a wearable electronic device according to an embodiment of the invention;
FIG. 8 is a schematic diagram of a first structure of an electronic device according to an embodiment of the invention;
FIG. 9 is a diagram illustrating a second structure of an electronic device according to an embodiment of the invention;
fig. 10 is a schematic diagram of a third structure of an electronic device according to an embodiment of the invention.
Detailed Description
The present invention will be described in further detail with reference to the accompanying drawings and specific embodiments.
Example one
The embodiment of the invention provides an information processing method, which is applied to electronic equipment, wherein the electronic equipment is provided with a first sensing unit, a first display module and a second display module; the first display module is provided with a first display area; the second display module is provided with a second display area; when the electronic equipment interacts with a user of the electronic equipment in a first interaction mode, the user can observe the first display area of the first display module; when the electronic equipment interacts with the user in a second interaction mode, the user can observe the second display area of the second display module; fig. 1 is a schematic flowchart of an information processing method according to a first embodiment of the present invention; as shown in fig. 1, the method includes:
step 101: and controlling a first display area of the first display module to output and display a first object identifier, so that the user in the first interaction mode perceives the first object identifier through the first display area.
In this embodiment, the electronic device may be a terminal device such as a smart phone or a tablet computer, or may be a wearable intelligent electronic device such as a smart watch.
Here, the first object identifier may specifically be an object identifier of a certain application in the electronic device, such as an icon of a clock application, an icon of a weather application, or the like; preferably, the object identification may be a live icon (LiveIcon), such as a weather live icon (liveweather icon); the first object identifier may also be a Widget (Widget) capable of presenting at least one piece of information, such as a clock Widget that may display time, date, current weather conditions, pollution conditions, and the like.
Step 102: and obtaining induction parameters through the first induction unit, and determining an interaction mode of the electronic equipment according to the induction parameters.
The present embodiment can be applied to the following scenarios:
scene one: the electronic equipment is provided with two screens, and the two screens are respectively positioned on different planes of the electronic equipment, for example, a first screen is positioned on the front side of the electronic equipment, and a second screen is positioned on the back side of the electronic equipment; the display area of the second screen is the second display area; in this scenario, the first interaction mode is an interaction mode in which a user faces the first screen; the second interaction mode is an interaction mode of a user facing the second screen; wherein the display area of the first screen is the first display area. The first sensing unit in this step may specifically be an image acquisition unit, and the image acquisition unit and the first display area are located on the same plane; the sensing parameters are image data parameters; specifically, image data are acquired through the image acquisition unit, and when the image data comprise characteristic parameters of a human face, the electronic equipment can be determined to be in a first interaction mode; when the electronic equipment is turned over and the image data acquired by the image acquisition unit does not include the characteristic parameters of the human face, the electronic equipment can be determined to be in a second interaction mode.
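The scene-one decision described above can be sketched as follows; this is an illustrative sketch only, and the function name, the mode constants, and the boolean sensing parameter are assumptions rather than anything specified in the patent text:

```python
# Hypothetical sketch of the scene-one mode decision: the first sensing unit is
# a front-facing image acquisition unit in the same plane as the first display
# area, and the sensing parameter is whether the acquired image data include
# the characteristic parameters of a human face. All names are illustrative.

FIRST_INTERACTION_MODE = 1   # user faces the first (front) screen
SECOND_INTERACTION_MODE = 2  # device turned over; user faces the second (back) screen

def determine_interaction_mode(frame_contains_face: bool) -> int:
    """Map the image-based sensing parameter to an interaction mode."""
    if frame_contains_face:
        return FIRST_INTERACTION_MODE  # face detected: first interaction mode
    return SECOND_INTERACTION_MODE     # no face in frame: device has been turned over
```
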
Scene two: the electronic equipment is provided with two screens which are sequentially stacked in a first direction, and the first screen is assumed to be positioned above the second screen; the first screen can slide upwards or downwards through a sliding rail to present the second screen; in this scenario, the first interaction mode is an interaction mode in which the first screen does not slide upwards or downwards so that a user faces the first screen; the second interactive mode is an interactive mode in which the first screen slides upwards or downwards so that a user faces the second screen. The first sensing unit in this step may be a pressure sensor disposed on the slide rail, and the sensing parameter is a pressure value detected by the pressure sensor; specifically, when the detected pressure value is a first preset value, it can be determined that the electronic device is in a first interaction mode; when the first screen slides upwards or downwards and the detected pressure value changes to a second numerical value, the electronic equipment can be determined to be in a second interaction mode.
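The scene-two decision can be sketched the same way; the preset pressure values and the tolerance below are illustrative assumptions, since the patent only states that the pressure value changes from a first preset value to a second value when the first screen slides:

```python
# Hypothetical sketch of the scene-two mode decision: the first sensing unit is
# a pressure sensor on the slide rail, and the sensing parameter is the
# detected pressure value. The preset values and tolerance are illustrative.

FIRST_PRESET_PRESSURE = 1.0   # rail pressure with the first screen in place
SECOND_PRESET_PRESSURE = 0.2  # rail pressure after the first screen slides away

def mode_from_pressure(pressure: float, tolerance: float = 0.05) -> int:
    """Return 1 (first interaction mode) or 2 (second interaction mode)."""
    if abs(pressure - FIRST_PRESET_PRESSURE) <= tolerance:
        return 1  # first screen has not slid: user faces the first screen
    return 2      # pressure changed to the second value: user faces the second screen
```
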
Step 103: and when the sensing parameters represent that the electronic equipment is in the second interaction mode, determining the first object identifier.
Here, based on the two scenarios described in step 102, the second interaction manner is an interaction manner in which the electronic device in scenario one is turned over to make the user face the second screen, or an interaction manner in which the first screen of the electronic device in scenario two is slid up or down to make the user face the second screen.
Wherein the determining of the first object identifier is to select the first object identifier displayed in the first display area output; specifically, the first display area of the first display module only outputs and displays the first object identifier, and when the electronic device is switched from the first interactive mode to the second interactive mode, the first object identifier is triggered and selected.
Step 104: and generating a calling instruction based on the first object identifier, and starting a first application program corresponding to the first object identifier in response to the calling instruction.
Here, when the first object identifier is an object identifier of an application in the electronic device, the attribute parameters of the object identifier include the installation address, in the electronic device, of the application corresponding to the object identifier; when the first object identifier is a widget capable of presenting at least one piece of information, the attribute parameters of the widget include the Uniform Resource Locator (URL) of the webpage to which the widget is connected, or the installation address, in the electronic device, of the application program corresponding to the widget. Generating a call instruction based on the first object identifier, and starting the first application program corresponding to the first object identifier in response to the call instruction, includes: identifying the attribute parameters of the first object identifier, acquiring the installation address of the first application program corresponding to the first object identifier according to those attribute parameters, generating a call instruction based on the installation address of the first application program, and starting the first application program in response to the call instruction.
And when the attribute parameters of the widget comprise the URL of the webpage corresponding to the widget, the first application program is a browser application program.
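The attribute-parameter lookup of step 104 can be sketched as follows; the dictionary layout, key names, and the shape of the "call instruction" tuple are assumptions for illustration, not a definition from the patent:

```python
# Hypothetical sketch of step 104: resolve the first object identifier's
# attribute parameters into a call instruction. The dict keys ("attributes",
# "url", "install_address") and the tuple shape are illustrative assumptions.

def build_call_instruction(object_identifier: dict) -> tuple:
    """Return a (target, argument) call instruction from the identifier's attributes."""
    attrs = object_identifier["attributes"]
    if "url" in attrs:
        # Widget linked to a webpage: the first application is the browser,
        # opened at the URL carried in the attribute parameters.
        return ("browser_install_address", attrs["url"])
    # Icon or widget bound to a locally installed application: the call
    # instruction targets that application's installation address.
    return (attrs["install_address"], None)

# Usage with a hypothetical weather icon and a hypothetical news widget:
weather_icon = {"attributes": {"install_address": "/apps/weather"}}
news_widget = {"attributes": {"url": "http://example.com/news"}}
```
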
Step 105: and controlling the second display area of the second display module to output the first user interaction interface corresponding to the first application program, so that the user in the second interaction mode can perceive the first user interaction interface through the second display area.
Here, following the two scenes in step 102, the second display area is the display area of the second screen after the electronic device in scene one is turned over, or the display area of the second screen presented to the user after the first screen of the electronic device in scene two is slid upward or downward. The first user interaction interface is the interface presented to the user after the first application program is started: when the first application program is an application program installed in the electronic device, the first user interaction interface is the interaction interface presented after that application program is started; when the first application program is a browser application program, the first user interaction interface is the webpage interaction interface, corresponding to the URL of the webpage included in the attribute parameter of the first object identifier, that is presented after the browser application program is started.
By adopting the technical scheme of this embodiment of the invention, the application program corresponding to the first object identifier can be started without a triggering operation aimed at the first object identifier, and the first user interaction interface corresponding to the first application program is output at the same time, so that information acquisition efficiency and user experience are greatly improved. On the other hand, this embodiment provides two display interfaces: the first display module displays summary information, while the second display module displays more comprehensive, detailed information. Thus, the electronic device provided by this embodiment can present image or video display of larger size and higher resolution without being limited by the size of the electronic device.
Embodiment Two
In another preferred embodiment of the present invention, the information processing method described in the first embodiment is applied to an electronic device having a first sensing unit, a first display module, and a second display module. The first display module has a first display area and the second display module has a second display area; the second display area is smaller than the first display area. When the electronic device interacts with its user in a first interaction mode, the user can observe the first display area of the first display module; when the electronic device interacts with the user in a second interaction mode, the user can observe the second display area of the second display module. The display principles of the first display module and the second display module differ, and because of this difference the interaction modes in which the user uses the electronic device also differ:
the first interaction mode is an observation mode in which the eyes of the user are far from the first display area of the first display module on the electronic device; when the user observes the first display area of the electronic device in the first interaction mode, the user perceives a first perception picture whose size is equal to that of the first display area; the first perception picture includes the first object identifier;
the second interaction mode is an observation mode in which the eyes of the user approach the second display area of the second display module on the electronic device; when the user observes the second display area of the electronic device in the second interaction mode, the light beams emitted by the second display module are incident on the eyes of the user, so that the user perceives a second perception picture whose size is larger than that of the second display area; the second perception picture includes the first user interaction interface.
In this embodiment, the electronic device is taken to be a wearable intelligent electronic device such as a smart watch. Fig. 2a and 2b are schematic diagrams of an electronic device according to the second embodiment of the invention. As shown in fig. 2a and 2b, the smart watch has a first display module 21 and a second display module 22, whose display principles differ. Taking the dial of the smart watch as an example, the first display module 21 has a first display area 221 and the second display module 22 has a second display area 222; the second display area 222 is smaller than the first display area 221. Because the display principles of the two modules differ, the first interaction mode in which the user uses the first display module 21 differs from the second interaction mode in which the user uses the second display module 22: the second interaction mode is a viewing mode in which the eyes of the user approach the second display area 222 on the electronic device (the direction shown by arrow 1 in fig. 2a), and the first interaction mode is a viewing mode in which the eyes of the user are far from the first display area 221 (the direction shown by arrow 2 in fig. 2a). When the user approaches the second display area 222 in the second interaction mode, the light beam emitted by the second display module 22 is incident on the eyes of the user, so that the user perceives a second perception picture whose size is larger than the second display area 222; when the user is away from the first display area 221 in the first interaction mode, the user perceives a first perception picture whose size is equal to the first display area 221. The dial of the smart watch is not limited to being round; it may also be square, rectangular, or another shape.
In this example, the first display area 221 and the second display area 222 are located in one dial (display screen) of the smart watch. In other embodiments they may be disposed in two separate display screens of the smart watch, and the two screens may lie in the same plane or in different planes.
In another example, the electronic device may be a smart phone with a structure like that shown in fig. 2a, except that the first display area and the second display area may be disposed on different exterior surfaces of the smart phone. For instance, if the first display area is disposed on a first exterior surface of the smart phone (the surface where the display screen is conventionally located, i.e., the front), the second display area may be disposed on the back or a side of the smart phone; the specific position may be set flexibly according to the layout of the display modules in the smart phone.
Fig. 3a to fig. 3c are schematic diagrams illustrating an interaction method of an electronic device according to the second embodiment of the invention, again taking a smart watch as the example. When the smart watch is in the first interaction mode, i.e., the observation mode in which the eyes of the user are far from the first display area of the first display module, the user wearing the smart watch observes the first display area as shown in fig. 3a: with reference to fig. 2a and fig. 2b, the user lifts the wrist to the front of the chest so that the electronic device 20 is far from the eyes, and observes the first display area 221 in the first interaction mode. When the smart watch is in the second interaction mode, i.e., the observation mode in which the eyes of the user approach the second display area of the second display module, as shown in fig. 3b, light emitted by the second display module enters the eyes of the user directly; that is, a projection mode is adopted. In the user's field of view, the second perception picture 23 is therefore much larger than the second display area 222 and appears to be farther away from the user. Thus, when the user observes the second display area 222 in the second interaction mode, the observed image is larger and the user's visual experience is greatly improved.
Specifically, when the smart watch is in the second interaction mode and light emitted by the second display module enters the eyes of the user directly, the display output of the second display module appears in the second display area 222 as a light spot, such as the light spot 2221 shown in fig. 3c; the light spot 2221 is the second display area 222.
In this embodiment, the second display area is smaller than or equal to a predetermined area, and the first display area is larger than the predetermined area; the predetermined area is the cross-sectional area of the field of view of the user's eyes when the eyes and the electronic device are at a predetermined distance.
The predetermined distance is the distance between the user's eyes and the electronic device when the electronic device is in the second interaction mode. When the user observes the first display area at the predetermined distance, the area of the first display area is clearly larger than the cross-sectional area of the field of view of the user's eyes at that distance. When the user observes the second display area (i.e., the light spot 2221 in fig. 3c) at the predetermined distance, the distance between the eyes and the device can change as the user's head moves back and forth, so the area of the second display area may be smaller than that cross-sectional area, or, when the distance is suitable, equal to it.
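The relationship between the predetermined area and the two display areas can be illustrated numerically. The sketch below assumes a circular field of view with a given half-angle; the half-angle, distance, and example areas are all invented for illustration, since the patent fixes none of these values.

```python
import math

def field_of_view_cross_section(distance_m, half_angle_deg):
    """Cross-sectional area (m^2) of a conical field of view at a given
    distance; the radius is distance * tan(half-angle)."""
    radius = distance_m * math.tan(math.radians(half_angle_deg))
    return math.pi * radius ** 2

# At an assumed predetermined distance of 5 cm with a 30-degree half-angle:
predetermined_area = field_of_view_cross_section(0.05, 30)

second_display_area = 0.0001  # 1 cm^2 light spot (illustrative)
first_display_area = 0.01     # 100 cm^2 (illustrative)

# The embodiment requires exactly this ordering of the three areas:
assert second_display_area <= predetermined_area < first_display_area
```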
By adopting the technical scheme of this embodiment, in addition to the beneficial effects of the first embodiment, the application program corresponding to the first object identifier can be started without a triggering operation aimed at the first object identifier, and the first user interaction interface corresponding to the first application program is output at the same time, so that information acquisition efficiency and user experience are greatly improved. On the other hand, this embodiment applies to two display modules with different display principles and accordingly provides two display interfaces: summary information is displayed through the first display module, while more comprehensive, detailed information is carried by the display of the second display module. Thus, the electronic device provided by this embodiment can present image or video display of larger size and higher resolution without being limited by the size of the electronic device.
Embodiment Three
In another preferred embodiment of the present invention, based on the electronic device described in the second embodiment, a further information processing method is provided. Fig. 4 is a schematic flow chart of the information processing method in the third embodiment of the invention; as shown in fig. 4, the method includes:
Step 301: obtaining sensing parameters through the first sensing unit.
In this embodiment, the electronic device may specifically be a wearable intelligent electronic device with two display modules, such as a smart watch, or a smart phone with two display modules; the display principles of the two display modules differ. The structure and interaction modes of the electronic device may be as described in the second embodiment and are not repeated here.
Here, in this step, the obtaining of the sensing parameters through the first sensing unit may be implemented in any of the following ways:
Mode one: a switching key is arranged on the electronic device. When the electronic device detects that the switching key is triggered, it determines that the interaction mode between the device and the user is switched from the first interaction mode to the second interaction mode; when the switching key is detected to be triggered again, it determines that the interaction mode is switched from the second interaction mode back to the first interaction mode. In this scenario, the first sensing unit may specifically be a key detection unit, and the sensing parameter is a key detection parameter. The correspondence between the key detection parameter and the interaction mode can be preset: for example, a key detection parameter of 0 corresponds to the first interaction mode and a key detection parameter of 1 corresponds to the second interaction mode, so that the interaction mode of the electronic device is determined according to the key state parameter detected by the key detection unit. The switching key may be a physical key on the electronic device or a virtual key; a virtual key may be triggered through the touch display screen corresponding to the first display area.
Mode two: the first sensing unit is a voice recognition unit, and the sensing parameter is a voice data parameter obtained through the voice recognition unit. When the electronic device recognizes, through the voice recognition unit, a voice data parameter corresponding to a preset switching instruction, it determines the interaction mode of the electronic device according to that preset switching instruction.
Mode three: the first sensing unit is a gravity sensing unit or a gyroscope sensor, and the sensing parameter is the inclination angle obtained by that unit. When the electronic device detects, through the gravity sensing unit or the gyroscope sensor, that its inclination angle falls within a first threshold range, it can be determined to be in the first interaction mode; when the inclination angle falls within a second threshold range, it can be determined to be in the second interaction mode.
Mode four: the first sensing unit is a distance sensor, and the sensing parameter is the distance to an obstruction detected by the distance sensor; the distance sensor may be disposed in the same plane as the second display area. When the electronic device detects through the distance sensor that the distance to an obstruction is smaller than a first threshold, it determines that its interaction mode is the second interaction mode; when the detected distance reaches the first threshold, it determines that its interaction mode is the first interaction mode.
Mode five: the first sensing unit comprises a height measuring instrument and a gravity sensor, and the sensing parameters are the height value detected by the height measuring instrument and the inclination angle of the electronic device detected by the gravity sensor. When the electronic device detects through the height measuring instrument that it has risen into a second height range (the height of the user's eyes) and detects through the gravity sensor that its inclination angle has tilted from a first angle range to a second angle range (i.e., the orientation of the device has changed from facing outward at the user's side to facing inward toward the front of the user), it determines that its interaction mode is the second interaction mode. When the height measuring instrument detects that the device has lowered from the second height range to a third height range and the gravity sensor detects no change in orientation, it determines that the interaction mode between the user and the device is the first interaction mode.
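The five sensing modes above can be summarized in one dispatcher. In the sketch below, all thresholds, ranges, and parameter names are placeholders, since the patent leaves the concrete values open; it only illustrates that each kind of sensing parameter maps to one of the two interaction modes.

```python
# Illustrative dispatcher from sensing parameters to an interaction mode,
# one branch per mode described above. All thresholds are assumptions.

MODE_FIRST, MODE_SECOND = "first", "second"

def determine_interaction_mode(sensor, params):
    if sensor == "key":          # mode one: switching key (0 -> first, 1 -> second)
        return MODE_FIRST if params["key_state"] == 0 else MODE_SECOND
    if sensor == "voice":        # mode two: recognized preset switching instruction
        return params["command_mode"]
    if sensor == "tilt":         # mode three: gravity/gyroscope inclination angle
        return MODE_FIRST if params["angle"] <= 45 else MODE_SECOND
    if sensor == "distance":     # mode four: distance sensor vs. first threshold
        return MODE_SECOND if params["distance"] < params["threshold"] else MODE_FIRST
    if sensor == "height_tilt":  # mode five: altimeter plus gravity sensor
        raised = params["height"] >= params["eye_height"]
        turned_toward_user = params["facing"] == "user"
        return MODE_SECOND if raised and turned_toward_user else MODE_FIRST
    raise ValueError(f"unknown sensing unit: {sensor}")

print(determine_interaction_mode("distance", {"distance": 0.03, "threshold": 0.10}))
# → second
```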
Step 302: when the sensing parameters represent that the electronic device is in the first interaction mode, obtaining K object identifiers, where K is a positive integer.
Here, the K object identifiers may specifically be object identifiers corresponding to K application programs in the electronic device, such as the object identifier of a clock application or of a weather application; preferably, an object identifier may be a live icon (Live Icon), such as a live weather icon. The K object identifiers may also be K widgets capable of presenting at least one piece of information, such as a clock widget that displays the time, date, current weather conditions, pollution conditions, and the like.
Step 303: controlling the first display area of the first display module to display one of the K object identifiers.
In this embodiment, the first display area of the electronic device displays only one of the K object identifiers at a time.
Step 304: obtaining a selection operation through a second sensing unit of the electronic device, determining a first object identifier from the K object identifiers according to the selection operation, and displaying the first object identifier in the first display area of the first display module, so that the user in the first interaction mode perceives the first object identifier through the first display area.
Here, the second sensing unit may be a touch sensing unit, which recognizes a touch gesture operation such as a page-turning gesture. Since the first display area displays only one of the K object identifiers at a time, if the identifier currently displayed is not the one the user wants, each page-turning gesture recognized by the touch sensing unit turns the content of the first display area to the next page, displaying the next object identifier, until the first display area displays the identifier the user wants; that identifier is then determined to be the first object identifier.
In another embodiment, the second sensing unit may instead be a voice recognition unit: it recognizes a voice operation and turns the pages of the content displayed in the first display area according to the operation instruction corresponding to the voice operation, until the first display area displays the object identifier chosen by the user, which is then determined to be the first object identifier.
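The page-turning selection among the K identifiers can be sketched as a small cyclic selector. The class name and example identifiers below are illustrative, not from the patent.

```python
# Minimal sketch of step 304: one identifier is shown at a time, and a
# page-turning gesture (or equivalent voice operation) advances through
# the K object identifiers until the desired one is displayed.

class IdentifierSelector:
    def __init__(self, identifiers):
        self.identifiers = identifiers  # the K object identifiers
        self.index = 0                  # only one is displayed at a time

    def displayed(self):
        return self.identifiers[self.index]

    def page_turn(self):
        """Advance to the next identifier, wrapping after the last one."""
        self.index = (self.index + 1) % len(self.identifiers)
        return self.displayed()

selector = IdentifierSelector(["clock", "weather", "calendar"])
selector.page_turn()                    # the user pages once
print(selector.displayed())             # → weather
# Switching the device into the second interaction mode would now select
# "weather" as the first object identifier.
```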
Step 305: obtaining sensing parameters through the first sensing unit, and determining the interaction mode of the electronic device according to the sensing parameters.
Here, this step may be implemented as described in step 301 and is not repeated here.
Step 306: when the sensing parameters represent that the electronic device is in the second interaction mode, determining the first object identifier.
Here, determining the first object identifier means selecting the first object identifier displayed in the first display area. Specifically, the first display area of the first display module outputs and displays only the first object identifier, and when the electronic device is switched from the first interaction mode to the second interaction mode, the first object identifier is triggered and selected.
Step 307: generating a calling instruction based on the first object identifier, and starting a first application program corresponding to the first object identifier in response to the calling instruction.
Here, when the first object identifier is an object identifier of an application program in the electronic device, the attribute parameter of the object identifier includes the installation address, in the electronic device, of the application program corresponding to the object identifier. When the first object identifier is a widget capable of presenting at least one piece of information, the attribute parameter of the widget includes the URL of the webpage corresponding to the widget, or the installation address, in the electronic device, of the application program corresponding to the widget. Generating the calling instruction based on the first object identifier and starting the first application program corresponding to the first object identifier in response to the calling instruction includes: identifying the attribute parameter of the first object identifier, acquiring the installation address of the first application program corresponding to the first object identifier according to that attribute parameter, generating the calling instruction based on the installation address of the first application program, and starting the first application program in response to the calling instruction.
When the attribute parameter of the widget includes the URL of the webpage corresponding to the widget, the first application program is a browser application program.
Step 308: controlling the second display area of the second display module to output a first user interaction interface corresponding to the first application program, so that the user in the second interaction mode perceives the first user interaction interface through the second display area.
Here, the first user interaction interface is the interface presented to the user after the first application program is started: when the first application program is an application program installed in the electronic device, the first user interaction interface is the interaction interface presented after that application program is started; when the first application program is a browser application program, the first user interaction interface is the webpage interaction interface, corresponding to the URL of the webpage included in the attribute parameter of the first object identifier, that is presented after the browser application program is started.
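The sequence of steps 301 through 308 can be condensed into a short sketch. The device interface below is a stand-in for the sensing units and display modules; every name in it is hypothetical, and the scripted device merely simulates a user who pages to one identifier and then moves into the second interaction mode.

```python
# End-to-end sketch of the flow: sense mode -> obtain K identifiers ->
# display/select one -> sense mode again -> launch and display.

def run_flow(device):
    if device.sense() == "first":                               # steps 301-302
        identifiers = device.obtain_identifiers()               # K identifiers
        first_id = device.select_on_first_display(identifiers)  # steps 303-304
        if device.sense() == "second":                          # steps 305-306
            instruction = {"launch": first_id}                  # step 307
            return device.show_on_second_display(instruction)   # step 308
    return None

class FakeDevice:
    """Scripted device: first interaction mode, then second."""
    def __init__(self):
        self._modes = iter(["first", "second"])
    def sense(self):
        return next(self._modes)
    def obtain_identifiers(self):
        return ["clock", "weather"]
    def select_on_first_display(self, ids):
        return ids[1]                      # the user pages to "weather"
    def show_on_second_display(self, instruction):
        return f"UI for {instruction['launch']}"

print(run_flow(FakeDevice()))  # → UI for weather
```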
By adopting the technical scheme of this embodiment of the invention, the application program corresponding to the first object identifier can be started without a triggering operation aimed at the first object identifier, and the first user interaction interface corresponding to the first application program is output at the same time, so that information acquisition efficiency and user experience are greatly improved. On the other hand, this embodiment provides two display interfaces: the first display module displays summary information, while the second display module displays more comprehensive, detailed information. Thus, the electronic device provided by this embodiment can present image or video display of larger size and higher resolution without being limited by the size of the electronic device.
Embodiment Four
In another preferred embodiment of the present invention, based on the electronic device described in the second embodiment, a further information processing method is provided. Fig. 5 is a schematic flow chart of the information processing method in the fourth embodiment of the invention; as shown in fig. 5, the method includes:
Step 401: controlling the first display area of the first display module to output and display a first object identifier, so that the user in the first interaction mode perceives the first object identifier through the first display area.
In this embodiment, the electronic device may specifically be a wearable intelligent electronic device with two display modules, such as a smart watch, or a smart phone with two display modules; the display principles of the two display modules differ. The structure and interaction modes of the electronic device may be as described in the second embodiment and are not repeated here.
Here, the first object identifier may specifically be the object identifier of an application program in the electronic device, such as the object identifier of a clock application or of a weather application; preferably, the object identifier may be a live icon (Live Icon), such as a live weather icon. The first object identifier may also be a widget capable of presenting at least one piece of information, such as a clock widget that displays the time, date, current weather conditions, pollution conditions, and the like.
Step 402: obtaining sensing parameters through the first sensing unit, and determining the interaction mode of the electronic device according to the sensing parameters.
Here, in this step, the obtaining of the sensing parameters through the first sensing unit and the determining of the interaction mode of the electronic device according to the sensing parameters may be implemented in any of the following ways:
Mode one: a switching key is arranged on the electronic device. When the electronic device detects that the switching key is triggered, it determines that the interaction mode between the device and the user is switched from the first interaction mode to the second interaction mode; when the switching key is detected to be triggered again, it determines that the interaction mode is switched from the second interaction mode back to the first interaction mode. In this scenario, the first sensing unit may specifically be a key detection unit, and the sensing parameter is a key detection parameter. The correspondence between the key detection parameter and the interaction mode can be preset: for example, a key detection parameter of 0 corresponds to the first interaction mode and a key detection parameter of 1 corresponds to the second interaction mode, so that the interaction mode of the electronic device is determined according to the key state parameter detected by the key detection unit. The switching key may be a physical key on the electronic device or a virtual key; a virtual key may be triggered through the touch display screen corresponding to the first display area.
Mode two: the first sensing unit is a voice recognition unit, and the sensing parameter is a voice data parameter obtained through the voice recognition unit. When the electronic device recognizes, through the voice recognition unit, a voice data parameter corresponding to a preset switching instruction, it determines the interaction mode of the electronic device according to that preset switching instruction.
Mode three: the first sensing unit is a gravity sensing unit or a gyroscope sensor, and the sensing parameter is the inclination angle obtained by that unit. When the electronic device detects, through the gravity sensing unit or the gyroscope sensor, that its inclination angle falls within a first threshold range, it can be determined to be in the first interaction mode; when the inclination angle falls within a second threshold range, it can be determined to be in the second interaction mode.
Mode four: the first sensing unit is a distance sensor, and the sensing parameter is the distance to an obstruction detected by the distance sensor; the distance sensor may be disposed in the same plane as the second display area. When the electronic device detects through the distance sensor that the distance to an obstruction is smaller than a first threshold, it determines that its interaction mode is the second interaction mode; when the detected distance reaches the first threshold, it determines that its interaction mode is the first interaction mode.
Mode five: the first sensing unit comprises a height measuring instrument and a gravity sensor, and the sensing parameters are the height value detected by the height measuring instrument and the inclination angle of the electronic device detected by the gravity sensor. When the electronic device detects through the height measuring instrument that it has risen into a second height range (the height of the user's eyes) and detects through the gravity sensor that its inclination angle has tilted from a first angle range to a second angle range (i.e., the orientation of the device has changed from facing outward at the user's side to facing inward toward the front of the user), it determines that its interaction mode is the second interaction mode. When the height measuring instrument detects that the device has lowered from the second height range to a third height range and the gravity sensor detects no change in orientation, it determines that the interaction mode between the user and the device is the first interaction mode.
Step 403: when the sensing parameters represent that the electronic device is in the second interaction mode, determining the first object identifier.
Here, determining the first object identifier means selecting the first object identifier displayed in the first display area. Specifically, the first display area of the first display module outputs and displays only the first object identifier, and when the electronic device is switched from the first interaction mode to the second interaction mode, the first object identifier is triggered and selected.
Step 404: generating a call instruction based on the first object identifier, and starting a first application program corresponding to the first object identifier in response to the call instruction.
Here, when the first object identifier is an object identifier of an application in the electronic device, the attribute parameters of the object identifier include the installation address, in the electronic device, of the application corresponding to the object identifier; when the first object identifier is a widget capable of presenting at least one piece of information, the attribute parameters of the widget include the URL of the web page corresponding to the widget, or the installation address, in the electronic device, of the application program corresponding to the widget. Generating a call instruction based on the first object identifier and starting the first application program in response to the call instruction then includes the following steps: identifying the attribute parameters of the first object identifier, acquiring the installation address of the first application program corresponding to the first object identifier according to those attribute parameters, generating a call instruction based on the installation address of the first application program, and starting the first application program in response to the call instruction.
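The dispatch on attribute parameters described above can be sketched as follows; the dictionary keys and the (program, argument) return shape are illustrative assumptions, not part of the disclosure:

```python
# Hypothetical sketch of Step 404: derive a call instruction from the
# attribute parameters of the first object identifier.

def build_call_instruction(attribute_params):
    """Return (program, argument) describing what to launch."""
    if "install_address" in attribute_params:
        # An application icon, or a widget backed by an installed app:
        # launch the program found at its installation address.
        return (attribute_params["install_address"], None)
    if "url" in attribute_params:
        # A widget backed by a web page: launch the browser on the URL.
        return ("browser", attribute_params["url"])
    raise ValueError("first object identifier carries no launch target")
```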
When the attribute parameters of the widget include the URL of the web page corresponding to the widget, the first application program is a browser application program.
Step 405: controlling the second display area of the second display module to output the first user interaction interface corresponding to the first application program, so that the user in the second interaction mode perceives the first user interaction interface through the second display area.
Here, the first user interaction interface is the interface presented to the user after the first application program is started. When the first application program is an application program installed in the electronic device, the first user interaction interface is the interaction interface presented after that application program is started; when the first application program is a browser application program, the first user interaction interface is the web page interaction interface corresponding to the URL included in the attribute parameters of the first object identifier, presented after the browser application program is started.
Step 406: obtaining an input operation aimed at the first user interaction interface through a third sensing unit of the electronic device, and changing the first user interaction interface in response to the input operation to form a second user interaction interface.
Here, the third sensing unit may be a voice recognition unit, and the electronic device may obtain voice information through the voice recognition unit, generate a voice instruction corresponding to the voice information by analyzing the voice information, and generate an input operation for the first user interaction interface based on the voice instruction.
Specifically, when the first user interaction interface is a display interface of a map application, for example the map interface of the city where the user is currently located, displayed after the map application is started, the user can issue a voice instruction through the voice recognition unit, the voice instruction including: setting a starting position A and an end position B, calculating an optimal route between the starting position A and the end position B, and starting navigation. A second user interaction interface can then be generated based on the voice instruction; the second user interaction interface can present the starting position A, the end position B, the optimal route map between them, and the current position of the user, acquire the current position of the user in real time, navigate based on that position, and update the current position of the user in the second user interaction interface.
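As a rough illustration of how recognized voice phrases could drive the interface change of Step 406, the sketch below applies a tiny assumed command grammar to a dictionary-based interface state; the phrase wording is purely an assumption:

```python
# Minimal sketch of voice-driven interface updates; the phrase grammar
# ("set start ...", "set end ...", "start navigation") is an assumption.
import re

def apply_voice_instruction(state, utterance):
    if m := re.match(r"set start (\w+)", utterance):
        state["start"] = m.group(1)
    elif m := re.match(r"set end (\w+)", utterance):
        state["end"] = m.group(1)
    elif utterance == "start navigation":
        # Navigation begins only once both endpoints are known.
        state["navigating"] = bool(state.get("start") and state.get("end"))
    return state

ui_state = {}
for phrase in ("set start A", "set end B", "start navigation"):
    apply_voice_instruction(ui_state, phrase)
```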
Step 407: acquiring sensing parameters through the first sensing unit of the electronic device, and, when the sensing parameters indicate that the electronic device is in the first interaction mode, processing the second user interaction interface to generate added information.
In this step, after the second user interaction interface is formed, when the sensing parameter obtained by the first sensing unit indicates that the electronic device is in the first interaction mode, that is, when the electronic device switches from the second interaction mode to the first interaction mode, the added information is generated according to the second user interaction interface.
When the electronic device switches from the second interaction mode to the first interaction mode, the second user interaction interface is still displayed and output in the second display area through the second display module, and it changes with time or with the position of the user. Thus, the added information may be information capable of characterizing the change presented in the second user interaction interface.
Taking a specific practical application as an example, fig. 6a and fig. 6b are schematic diagrams of an application scenario of the fourth embodiment of the present invention; fig. 6a is a diagram illustrating the second user interaction interface in the fourth embodiment. As shown in fig. 6a, the second user interaction interface presents the starting position A, the end position B, the optimal route pattern between the starting position A and the end position B, and the location C or D where the user is located. When navigating based on the second user interaction interface, the added information may be a direction arrow indicating where the user should currently move. Fig. 6b is a schematic diagram of the added information in the first interaction interface according to the fourth embodiment; as shown in fig. 6b, when the user is at current location C, the user needs to move east according to the navigation route in the second user interaction interface, that is, the added information is an arrow pointing east. Preferably, the distance the user should continue to move east, shown in fig. 6b as the eastward arrow and 2305 meters, can also be included in the added information.
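The arrow-plus-distance added information can be derived from the user's current position and the next route point, for example as below; flat east/north coordinates in meters, the four-direction quantization, and the rounding rule are all assumptions of this sketch:

```python
# Sketch of deriving the added information (heading arrow + remaining
# distance) of Step 407 from two points on an assumed flat plane.
import math

ARROWS = {"E": "→", "N": "↑", "W": "←", "S": "↓"}

def added_info(current, next_point):
    dx = next_point[0] - current[0]  # eastward offset, meters
    dy = next_point[1] - current[1]  # northward offset, meters
    if abs(dx) >= abs(dy):
        direction = "E" if dx >= 0 else "W"
    else:
        direction = "N" if dy >= 0 else "S"
    return ARROWS[direction], round(math.hypot(dx, dy))
```

For the fig. 6b scenario, a user at C who must move 2305 meters east would obtain an eastward arrow with distance 2305.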
Step 408: the change of the first object identification is controlled by the adding information, and the changed first object identification is displayed and output in a first display area of the first display module; the changed first object identification includes the added information.
Here, when the first object identifier is an object identifier of an application in the electronic device, such as an icon of a weather application or an icon of a map application, the added information is added to the first object identifier. When the first object identifier is an icon of a map application, the added information (such as an arrow, or an arrow plus a distance) from step 407 is added to the icon of the map application so that the icon changes along with the current position of the user; the user can then navigate according to the change of the arrow direction in the icon, or according to both the arrow direction and the distance.
When the first object identifier is a widget capable of presenting at least one piece of information, the added information is added to the widget. When the widget is a map widget, the added information (such as an arrow, or an arrow plus a distance) from step 407 is added to the map widget so that the widget changes along with the current position of the user; the user can then navigate according to the change of the arrow direction in the map widget, or according to both the arrow direction and the distance.
With the technical solution of this embodiment of the present invention, the application program corresponding to the first object identifier can be started, and the first user interaction interface corresponding to the first application program output, merely by switching from the first interaction mode to the second interaction mode, without any triggering operation on the first object identifier, which greatly improves information acquisition efficiency. Likewise, the first object identifier can be changed merely by switching from the second interaction mode to the first interaction mode, again without any triggering operation, which improves the user experience. On the other hand, this embodiment provides two display interfaces: the first display module displays summary information, while the second display module displays richer and more comprehensive detailed information. Thus, the electronic device provided by this embodiment can provide image or video display of larger size and higher resolution without being limited by the size of the electronic device.
Embodiment Five
In another preferred embodiment of the present invention, based on the information processing method described in the second embodiment, when the sensing parameter indicates that the electronic device is in the second interaction mode for the same object identifier, the second display area of the second display module is controlled to update, once, the display parameter values of the bearing surface (background) of the first user interaction interface corresponding to the first application program.
Specifically, the technical solution provided by this embodiment can be applied in various environments, such as outdoors in strong light, indoors in dim light, in surroundings rich in color, or in surroundings of a single color. When the electronic device switches from the first interaction mode to the second interaction mode for the same object identifier, the background of the first user interaction interface output in the second display area of the second display module is adapted to the parameters of the surrounding environment. For example, when the current environment is brightly lit, the brightness of the background of the first user interaction interface can be adaptively adjusted to approach the brightness of the current environment; when the current environment is predominantly green (such as a lawn or a forest), the background color of the first user interaction interface can be adaptively adjusted to green. In this way the display of the first user interaction interface matches the external environment, reducing the user's visual fatigue, avoiding the various bodily discomforts caused by an excessive visual difference between the user's two eyes, and improving the user's operating experience.
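One possible realization of this adaptive background is sketched below, with an assumed blend rule and assumed sensor readings (ambient illuminance in lux, dominant environment color as RGB); none of these specifics come from the disclosure:

```python
# Hedged sketch of Embodiment Five: nudge the interface background toward
# the ambient brightness and dominant color. Blend factor is an assumption.

def adapt_background(ambient_lux, dominant_rgb, max_lux=10000, blend=0.5):
    """Return (background brightness 0..1, background RGB)."""
    brightness = min(ambient_lux / max_lux, 1.0)
    neutral = (128, 128, 128)  # assumed default background color
    background = tuple(
        round(n * (1 - blend) + c * blend)
        for n, c in zip(neutral, dominant_rgb)
    )
    return brightness, background
```

On a lawn (dominant green) at moderate illuminance, this yields a half-bright, green-tinted background, matching the behavior described above.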
In the first to fifth embodiments of the present invention, the electronic device is a wearable electronic device; fig. 7a to 7d are schematic diagrams of a wearable electronic device according to an embodiment of the invention; as shown in fig. 7a, the wearable electronic device includes: a frame body 71, a fixing device, and a functional main body portion;
the fixing device is connected with the frame body 71 and is used for fixing the electronic device on a support body; the frame body and the fixing device form an annular space when the electronic device is fixed on the support body through the fixing device;
the functional main body part at least comprises the first display module 21 and the second display module 22, both of which are arranged in the frame body 71. When the electronic device is fixed to the support body by the fixing device, the electronic device has a first cross section, namely the ring formed by cutting the electronic device perpendicular to the support body; the first direction is the direction from the outside of the ring to the inside of the ring, toward its center.
In the embodiment of the present invention, the electronic device is a wearable electronic device; as a preferred embodiment, the electronic device may be a smart watch. The first cross section can be understood as the cross section shown in fig. 7a; seen in that cross section, the frame body 71 and the fixing device of the electronic device enclose a ring, and the support body can pass through the ring. When the ring is a standard circle, the first direction may be understood as the direction from outside the circle to inside the circle, toward its center. In practice, however, the ring is unlikely to be a standard circle, so the explanation above using a standard circle serves only to illustrate the technical solution of this embodiment. For example, when the smart watch is worn on the wrist, the human wrist is not a standard cylinder, and therefore the ring formed by the smart watch is not a standard circle.
In an embodiment of the present invention, the support body may be the user's wrist, or another body part of the user; of course, the support body may also be an object other than a body part of the user, for example a prosthetic arm, or a handrail on a bus or subway train.
In the embodiment of the present invention, the electronic device is a wearable electronic device, and as a preferred embodiment, the electronic device may also be a smart ring. When the electronic device is a smart ring, correspondingly, the support body can be a finger.
In one embodiment of the invention, the electronic device may comprise two fixing devices, a first fixing device 72 and a second fixing device 73, as shown in fig. 7a, wherein: the first end of the first fixing device 72 is movably connected to the first end of the frame body 71, the first end of the second fixing device 73 is movably connected to the second end of the frame body 71, and the second end of the first fixing device 72 and the second end of the second fixing device 73 fit together to fix the electronic device on the support body. In the implementation process, a person skilled in the art can implement the mutual fit between the second end of the first fixing device 72 and the second end of the second fixing device 73 by means of a snap-fit component, an adhesive component, or the like; a person skilled in the art may also realize the movable connection between the first end of the first fixing device 72 and the first end of the frame body 71, and between the first end of the second fixing device 73 and the second end of the frame body 71, through a rotating-shaft connection or the like, which will not be described here.
The first display module 21 is a first display screen; the first display screen 21 is used for displaying and outputting first content to be displayed, and the size of the first display screen defines the first display output area of the first display module. The second display module 22 is an optical projection system; as shown in fig. 7b, the second display module 22 includes a first portion 13 and a second portion, wherein the first portion 13 of the second display module is a light conduction component, and the second portion of the second display module includes a display component 15 and a collimation component 14; wherein:
the display component 15 is used for displaying and outputting the second content to be displayed, projecting it as a first light beam and outputting the first light beam to the collimation component 14; the collimation component 14 is configured to process the first light beam, convert it into a second light beam, and output the second light beam to the light conduction component 13. The light conduction component 13, also called a light path conversion component, is made of a transparent material and conducts the second light beam within that material. The light conduction component includes a reflection unit arranged in a specific area of the light conduction component; the reflection unit is used for changing the conduction direction of the second light beam in the transparent material so that it is projected in a second direction. The second direction is consistent with the output direction of the first content to be displayed on the first display screen of the first display module. The specific area of the light conduction component in which the reflection unit is arranged is the second display output area of the second display module.
Fig. 7c and fig. 7d are schematic structural diagrams of the second display module in an electronic device according to an embodiment of the invention. As shown in figs. 7c and 7d, the display component 15 includes a beam splitting unit 151 and a display unit 152; the collimation component 14 includes a second collimating unit 141, a first collimating unit 142 and a polarization beam splitting unit 143; and the light conduction component 13 includes a waveguide unit 131 and a reflection unit 132. The display component 15 in fig. 7d further includes a light emitting unit 150. The collimation component 14 processes the first light beam projected and output by the display component, converts it into the second light beam, and outputs the second light beam to the light conduction component.
Specifically, the collimation component 14 includes a first collimating unit 142 and a second collimating unit 141 arranged opposite each other, and a polarization beam splitting unit 143 arranged between them; the first light beam output from the display component is reflected to the first collimating unit 142 via the polarization beam splitting unit 143, is collimated by the first collimating unit 142 and the second collimating unit 141, and then exits as the second light beam via the polarization beam splitting unit 143.
Here, the first and second collimating units 142 and 141 may be a single lens or a lens group designed as needed.
The light conduction component 13 is used for conducting the second light beam within the material forming it and finally outputting the second light beam to an observer. The light conduction component 13 includes a waveguide unit 131 and a reflection unit 132; by setting the position and angle of the reflection unit 132, the second light beam can be controlled and guided to exit at a specific position. In one case, the collimation component 14 and the display component 15 are located on a first side of the plane of the waveguide unit 131, and the reflection unit 132 is disposed so that the second light beam exits toward a second side of that plane, where the first side and the second side are opposite sides of the plane of the waveguide unit 131.
Specifically, when the second display module is applied to a smart watch, the configuration described above may be adopted so that the second light beam exits toward the second side, that is, toward the eyes of the user wearing and viewing the wrist-worn electronic device. In further detail, the exit direction of the second display module can be configured according to viewing requirements; for example, the rotation of the reflection unit 132 can be controlled so as to control its exit direction, thereby switching the bidirectional display of the second display module. In the embodiment of the present invention, the reflection unit 132 may be a single prism or a prism group designed as needed.
In the embodiment of the present invention, the first display module 21 has a first display output area, and as described above, the first display module 21 is a first display screen, so that the physical size of the first display screen is consistent with the size of the first display output area.
The second display module 22 has a second display output area, which is the specific area of the light conduction component where the reflection unit is disposed. Generally, the physical size of the reflection unit 132 is larger than or equal to the size of the second display output area, and the size of the display unit 152 in the second display module 22 is smaller than the size of the second display output area. It should be noted that figs. 7c and 7d are provided only to help those skilled in the art understand the technical solution of the present application and do not depict a state of the electronic device in use; for example, the electronic device is fixed on a support body when in use, and when the support body is a wrist, the user lifts an arm to place the electronic device in front of the eyes, so that the user sees the information provided by the electronic device at a front viewing angle.
The two display modules in the electronic device can be in various usage states. Taking the electronic device worn on the wrist as a smart watch for explanation: when the user is walking, the user's arm naturally hangs down, and at this time the first display module and the second display module of the electronic device may both be in a low-power-consumption state such as an off state or a standby state; the off or standby state saves power and prolongs the service time of the electronic device. When the user's arm hangs naturally, the electronic device can be considered to be in a first relative area; that is, the first relative area indicates a relative positional relationship between the user and the electronic device, namely the area at the side of the user's body.
Then, when the user wants to check the time while walking and lifts the arm, the electronic device is assumed to be in a second relative area (namely the chest area of the user's body). When the electronic device detects that it is in the second relative area, the first display module is enabled (started or awakened), and the time is then displayed to the user; or prompt information is displayed to the user through the first display module.
At this time, if the user wants to see data content associated with the prompt information, the user moves the electronic device further forward; the electronic device is then assumed to be in a third relative area (namely the head area of the user's body). When the electronic device is in the third relative area, it enables the second display module, and when the user's eyes approach the second display module, the associated data content can be seen through it.
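The three usage states above form a simple mapping from relative area to display-module power state. One assumed encoding (whether the first display stays active in the head area is not specified in the text and is an assumption here):

```python
# Sketch of enabling display modules per relative area between the user
# and the wrist-worn device. The "head" row's first-display value is an
# assumption of this sketch.
POWER_STATES = {
    "side":  {"first_display": False, "second_display": False},  # arm down
    "chest": {"first_display": True,  "second_display": False},  # arm raised
    "head":  {"first_display": True,  "second_display": True},   # near eye
}

def displays_for(relative_area):
    return POWER_STATES[relative_area]
```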
Of course, while the first display module and the second display module shown in figs. 7a to 7d are disposed in a smart watch, that is, with the smart watch as the carrier, the solution is not limited to smart watches or smart wearable electronic devices; the first display module and the second display module can also be disposed in a smartphone, with the smartphone as the carrier, and the specific arrangement within the smartphone can be flexible.
Embodiment Six
Fig. 8 is a schematic diagram of a first structure of the electronic device according to an embodiment of the present invention. As shown in fig. 8, the electronic device includes a first sensing unit 81, a first display unit 82, and a second display unit 83; the first display unit 82 has a first display area and the second display unit 83 has a second display area. When the electronic device interacts with its user in a first interaction mode, the user can observe the first display area of the first display unit 82; when the electronic device interacts with the user in a second interaction mode, the user can observe the second display area of the second display unit 83. The electronic device further includes: a first output control unit 84, a first determining unit 85, a call execution unit 86, and a second output control unit 87; wherein,
the first output control unit 84 is configured to control the first display area of the first display unit 82 to output and display a first object identifier, so that the user in the first interaction mode perceives the first object identifier through the first display area;
the first determining unit 85 is configured to obtain a sensing parameter through the first sensing unit 81 and determine the interaction mode of the electronic device according to the sensing parameter;
the call execution unit 86 is configured to, when the first determining unit 85 determines that the electronic device is in the second interaction mode, determine the first object identifier output and displayed by the first output control unit 84, generate a call instruction based on the first object identifier, and start the first application program corresponding to the first object identifier in response to the call instruction;
the second output control unit 87 is configured to control the second display area of the second display unit 83 to output the first user interaction interface corresponding to the first application program started by the call execution unit 86, so that the user in the second interaction mode perceives the first user interaction interface through the second display area.
According to another preferred embodiment of the present invention, the first display unit 82 and the second display unit 83 have different display principles, and the second display area is smaller than the first display area; the different display principles of the first display unit 82 and the second display unit 83 result in different ways for the user to interact with the electronic device;
the first interaction mode is a viewing mode in which the user's eyes are far from the first display area of the first display unit 82 on the electronic device; when the user observes the first display area of the electronic device in the first interaction mode, the user perceives a first perception picture; the size of the first perception picture is equal to that of the first display area; the first perception picture includes the first object identifier;
the second interaction mode is a viewing mode in which the user's eye approaches the second display area of the second display unit 83 on the electronic device; when the user observes the second display area of the electronic device in the second interaction mode, a light beam emitted by the second display unit 83 is incident on the user's eye so that the user perceives a second perception picture; the size of the second perception picture is larger than that of the second display area; the second perception picture includes the first user interaction interface.
Here, the second display area is equal to or smaller than a predetermined area, and the first display area is larger than the predetermined area. The predetermined area is the cross-sectional area of the field of view of the user's eyes when the user's eyes are at a predetermined distance from the electronic device.
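The predetermined area can be pictured as the cross-section of a viewing cone: at eye-to-device distance d and field-of-view angle θ, the visible disc has radius d·tan(θ/2). A quick numeric sketch of that geometry (distance and angle values are assumptions, not from the disclosure):

```python
# Cross-sectional area of the eye's field of view at a given distance,
# modeled as a circular cone. Units: mm and degrees in, mm^2 out.
import math

def field_cross_section(distance_mm, fov_deg):
    radius = distance_mm * math.tan(math.radians(fov_deg) / 2)
    return math.pi * radius ** 2

# e.g. at 100 mm with an assumed 90-degree field, the disc radius equals
# the viewing distance; a second display area no larger than this disc
# suits near-eye viewing.
```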
It should be understood by those skilled in the art that the functions of each processing unit in the electronic device according to the embodiment of the present invention may be understood by referring to the description of the information processing method, and each processing unit in the electronic device according to the embodiment of the present invention may be implemented by an analog circuit that implements the functions described in the embodiment of the present invention, or may be implemented by running software that executes the functions described in the embodiment of the present invention on an intelligent terminal.
Embodiment Seven
Fig. 9 is a schematic diagram of a second structure of the electronic device according to an embodiment of the present invention. As shown in fig. 9, the electronic device has a first sensing unit 81, a first display unit 82 and a second display unit 83; the first display unit 82 has a first display area and the second display unit 83 has a second display area. When the electronic device interacts with its user in a first interaction mode, the user can observe the first display area of the first display unit 82; when the electronic device interacts with the user in a second interaction mode, the user can observe the second display area of the second display unit 83. The first display unit 82 and the second display unit 83 have different display principles, and the second display area is smaller than the first display area; the different display principles of the first display unit 82 and the second display unit 83 result in different ways for the user to interact with the electronic device;
the first interaction mode is a viewing mode in which the user's eyes are far from the first display area of the first display unit 82 on the electronic device; when the user observes the first display area of the electronic device in the first interaction mode, the user perceives a first perception picture; the size of the first perception picture is equal to that of the first display area; the first perception picture includes the first object identifier;
the second interaction mode is a viewing mode in which the user's eye approaches the second display area of the second display unit 83 on the electronic device; when the user observes the second display area of the electronic device in the second interaction mode, a light beam emitted by the second display unit 83 is incident on the user's eye so that the user perceives a second perception picture; the size of the second perception picture is larger than that of the second display area; the second perception picture includes the first user interaction interface;
the electronic device further includes: a second sensing unit 88, a first output control unit 84, a first determining unit 85, a call executing unit 86, and a second output control unit 87; wherein,
the second sensing unit 88 is configured to obtain a selection operation;
the first output control unit 84 is configured to obtain sensing parameters through the first sensing unit 81 before controlling the first display area of the first display unit 82 to output and display the first object identifier; to obtain, when the sensing parameters indicate that the electronic device is in the first interaction mode, K object identifiers, K being a positive integer; to control the first display area of the first display unit 82 to display one of the K object identifiers; further to determine a first object identifier from the K object identifiers according to the selection operation obtained by the second sensing unit 88 and display it in the first display area of the first display unit 82; and further to control the first display area of the first display unit 82 to output and display the first object identifier, so that the user in the first interaction mode perceives the first object identifier through the first display area;
the first determining unit 85 is configured to obtain sensing parameters through the first sensing unit 81 and determine the interaction mode of the electronic device according to the sensing parameters;
the call execution unit 86 is configured to, when the first determining unit 85 determines that the electronic device is in the second interaction mode, determine the first object identifier output and displayed by the first output control unit 84, and generate a call instruction based on the first object identifier; and respond to the call instruction to start a first application program corresponding to the first object identifier;
the second output control unit 87 is configured to control the second display area of the second display unit 83 to output the first user interaction interface corresponding to the first application program started by the call execution unit 86, so that the user in the second interaction mode perceives the first user interaction interface through the second display area.
It should be understood by those skilled in the art that the functions of each processing unit in the electronic device according to the embodiment of the present invention may be understood by referring to the description of the information processing method, and each processing unit in the electronic device according to the embodiment of the present invention may be implemented by an analog circuit that implements the functions described in the embodiment of the present invention, or may be implemented by running software that executes the functions described in the embodiment of the present invention on an intelligent terminal.
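As noted above, each processing unit may be implemented in software on an intelligent terminal. The following sketch is a minimal, non-authoritative illustration of the control flow described in this embodiment; all names (the distance threshold, `handle_sensing`, the tuple return values standing in for the two display areas) are hypothetical and not part of the claimed implementation.

```python
# Illustrative sketch (not the patented implementation): a sensing parameter
# (here, an assumed eye-to-device distance) selects between the two
# interaction modes, and display output is dispatched accordingly.

FIRST_MODE_THRESHOLD_MM = 100  # hypothetical threshold for "eyes far away"

def determine_interaction_mode(distance_mm):
    """Map a sensed eye-to-device distance to an interaction mode."""
    return "first" if distance_mm >= FIRST_MODE_THRESHOLD_MM else "second"

def handle_sensing(distance_mm, object_ids, selected_index, displayed_id):
    """Dispatch display output according to the determined mode.

    Returns (display_target, payload) as a stand-in for controlling the
    first or second display area.
    """
    mode = determine_interaction_mode(distance_mm)
    if mode == "first":
        # First interaction mode: show the object identifier chosen by the
        # selection operation obtained through the second sensing unit.
        chosen = object_ids[selected_index]
        return ("first_display_area", chosen)
    # Second interaction mode: generate a call instruction for the
    # identifier already displayed and output its interaction interface.
    call_instruction = {"launch": displayed_id}
    return ("second_display_area", f"ui:{call_instruction['launch']}")
```

For example, a far-away reading yields the selected identifier on the first display area, while a close-up reading launches the displayed identifier's interface on the second display area.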
Example eight
Fig. 10 is a schematic diagram of a third component structure of the electronic device according to the embodiment of the present invention; as shown in fig. 10, the electronic device has a first sensing unit 81, a first display unit 82, and a second display unit 83; the first display unit 82 has a first display area; the second display unit 83 has a second display area; when the electronic device interacts with a user of the electronic device in a first interaction mode, the user can observe the first display area of the first display unit 82; when the electronic device interacts with the user in a second interaction mode, the user can observe the second display area of the second display unit 83; the first display unit 82 and the second display unit 83 differ in display principle; the second display area is smaller than the first display area; because the display principles of the first display unit 82 and the second display unit 83 differ, the ways in which the user interacts with the electronic device also differ;
the first interaction mode is an observation mode in which the eyes of the user are far away from the first display area of the first display unit 82 on the electronic device; when the user observes the first display area of the electronic device in the first interaction mode, the user perceives a first perceived picture; the size of the first perceived picture is equal to that of the first display area; the first perceived picture comprises the first object identifier;
the second interaction mode is an observation mode in which the eyes of the user approach the second display area of the second display unit 83 on the electronic device; when the user observes the second display area of the electronic device in the second interaction mode, a light beam emitted by the second display unit 83 is incident on the eyes of the user, so that the user perceives a second perceived picture; the size of the second perceived picture is larger than that of the second display area; the second perceived picture comprises the first user interaction interface;
the electronic device further includes: a third sensing unit 89, a first output control unit 84, a first determining unit 85, a call executing unit 86, and a second output control unit 87; wherein,
the first output control unit 84 is configured to control a first display area of the first display unit 82 to output and display a first object identifier, so that the user in the first interaction mode perceives the first object identifier through the first display area; the first output control unit 84 is further configured to process the second user interaction interface to generate added information when the first determining unit 85 determines that the electronic device is in the first interaction mode; and to control, with the added information, the first object identifier to change, and display the changed first object identifier in the first display area of the first display unit 82; the changed first object identifier includes the added information;
the first determining unit 85 is configured to obtain sensing parameters through the first sensing unit 81 and determine the interaction mode of the electronic device according to the sensing parameters;
the call execution unit 86 is configured to, when the first determining unit 85 determines that the electronic device is in the second interaction mode, determine the first object identifier output and displayed by the first output control unit 84, and generate a call instruction based on the first object identifier; and respond to the call instruction to start a first application program corresponding to the first object identifier;
the second output control unit 87 is configured to control the second display area of the second display unit 83 to output the first user interaction interface corresponding to the first application program started by the call execution unit 86, so that the user in the second interaction mode perceives the first user interaction interface through the second display area; the second output control unit 87 is further configured to change the first user interaction interface in response to the input operation obtained by the third sensing unit 89, so as to form a second user interaction interface;
the third sensing unit 89 is configured to obtain an input operation for the first user interaction interface.
It should be understood by those skilled in the art that the functions of each processing unit in the electronic device according to the embodiment of the present invention may be understood by referring to the description of the information processing method, and each processing unit in the electronic device according to the embodiment of the present invention may be implemented by an analog circuit that implements the functions described in the embodiment of the present invention, or may be implemented by running software that executes the functions described in the embodiment of the present invention on an intelligent terminal.
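Since this embodiment may likewise be realized in software, the flow it describes can be sketched as follows. This is a hedged illustration only: the use of an unread-message counter as the "input operation", and every function and field name below, are assumptions introduced for the example, not part of the embodiment.

```python
# Illustrative sketch (hypothetical names): an input operation changes the
# first user interaction interface into a second one; when the device returns
# to the first interaction mode, the second interface is processed into
# "added information" that is merged into the displayed object identifier.

def apply_input_operation(first_ui, operation):
    """Form the second user interaction interface from an input operation."""
    second_ui = dict(first_ui)
    second_ui.update(operation)  # e.g. an unread counter was changed
    return second_ui

def generate_added_info(second_ui):
    """Process the second interface into added information (a badge here)."""
    return {"badge": second_ui.get("unread", 0)}

def change_identifier(object_id, added_info):
    """Return the changed first object identifier including the added info."""
    return {"id": object_id, **added_info}

first_ui = {"app": "mail", "unread": 2}
second_ui = apply_input_operation(first_ui, {"unread": 5})
changed = change_identifier("mail", generate_added_info(second_ui))
# `changed` now carries the badge derived from the second interface, ready to
# be shown in the first display area in the first interaction mode
```

Under these assumptions, the identifier shown on the first display area would carry a badge reflecting what the user did while in the second interaction mode.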
Example nine
An embodiment of the present invention further provides an electronic device, as shown in fig. 8; the electronic device includes a first sensing unit 81, a first display unit 82, and a second display unit 83; the first display unit 82 has a first display area; the second display unit 83 has a second display area; when the electronic device interacts with a user of the electronic device in a first interaction mode, the user can observe the first display area of the first display unit 82; when the electronic device interacts with the user in a second interaction mode, the user can observe the second display area of the second display unit 83; the first display unit 82 and the second display unit 83 differ in display principle; the second display area is smaller than the first display area; because the display principles of the first display unit 82 and the second display unit 83 differ, the ways in which the user interacts with the electronic device also differ;
the first interaction mode is an observation mode in which the eyes of the user are far away from the first display area of the first display unit 82 on the electronic device; when the user observes the first display area of the electronic device in the first interaction mode, the user perceives a first perceived picture; the size of the first perceived picture is equal to that of the first display area; the first perceived picture comprises the first object identifier;
the second interaction mode is an observation mode in which the eyes of the user approach the second display area of the second display unit 83 on the electronic device; when the user observes the second display area of the electronic device in the second interaction mode, a light beam emitted by the second display unit 83 is incident on the eyes of the user, so that the user perceives a second perceived picture; the size of the second perceived picture is larger than that of the second display area; the second perceived picture comprises the first user interaction interface;
the electronic device further includes: a first output control unit 84, a first determination unit 85, a call execution unit 86, and a second output control unit 87; wherein,
the first output control unit 84 is configured to control a first display area of the first display unit 82 to output and display a first object identifier, so that the user in the first interaction manner perceives the first object identifier through the first display area;
the first determining unit 85 is configured to obtain sensing parameters through the first sensing unit 81 and determine the interaction mode of the electronic device according to the sensing parameters;
the call execution unit 86 is configured to, when the first determining unit 85 determines that the electronic device is in the second interaction mode, determine the first object identifier output and displayed by the first output control unit 84, and generate a call instruction based on the first object identifier; and respond to the call instruction to start a first application program corresponding to the first object identifier;
the second output control unit 87 is configured to control the second display area of the second display unit 83 to output the first user interaction interface corresponding to the first application program started by the call execution unit 86, so that the user in the second interaction mode perceives the first user interaction interface through the second display area; the second output control unit 87 is further configured to, when the first determining unit 85 determines that the electronic device is in the second interaction mode, update a display parameter value of the bearing surface of the first user interaction interface corresponding to the first application program once each time the second display area of the second display unit 83 is controlled to output the interface for the same object identifier started by the call execution unit 86.
It should be understood by those skilled in the art that the functions of each processing unit in the electronic device according to the embodiment of the present invention may be understood by referring to the description of the information processing method, and each processing unit in the electronic device according to the embodiment of the present invention may be implemented by an analog circuit that implements the functions described in the embodiment of the present invention, or may be implemented by running software that executes the functions described in the embodiment of the present invention on an intelligent terminal.
In the sixth to ninth embodiments of the present invention, the first output control unit 84, the first determining unit 85, the call execution unit 86, and the second output control unit 87 in the electronic device may, in practical applications, be implemented by a Central Processing Unit (CPU), a Digital Signal Processor (DSP), or a Field-Programmable Gate Array (FPGA) in the electronic device; the first display unit 82 may be implemented by a display screen or a display of the electronic device; the second display unit 83 may be implemented by an optical projection system in the electronic device; the first sensing unit 81 may be implemented by a camera, a pressure sensor, a gravity sensor, a distance sensor, a chip with a voice recognition function, a chip with a key detection function, or the like in the electronic device; the second sensing unit 88 may be implemented by a touch-sensitive screen; and the third sensing unit 89 may be implemented by a CPU, a DSP, or an FPGA in the electronic device.
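The per-launch update of the bearing-surface display parameter in the ninth embodiment can be sketched in software as follows. This is a non-authoritative illustration: the choice of background colour as the display parameter, the palette, and the class and method names are all assumptions made for the example.

```python
# Illustrative sketch (hypothetical): each time the same object identifier is
# started in the second interaction mode, a display parameter of the
# interface's bearing surface (here, a background colour) is advanced once,
# so repeated launches are visually distinguishable.

BEARING_SURFACE_COLOURS = ["white", "grey", "blue"]  # assumed palette

class SecondOutputControl:
    """Stand-in for the second output control unit's per-launch update."""

    def __init__(self):
        self._launch_counts = {}  # launches seen per object identifier

    def output_interface(self, object_id):
        """Return the interface with an updated bearing-surface parameter."""
        n = self._launch_counts.get(object_id, 0)
        self._launch_counts[object_id] = n + 1
        colour = BEARING_SURFACE_COLOURS[n % len(BEARING_SURFACE_COLOURS)]
        return {"app": object_id, "bearing_surface": colour}
```

Under these assumptions, launching the same identifier twice in succession would yield two different bearing-surface colours.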
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, apparatus, or computer program product. Accordingly, the present invention may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, optical storage, and the like) having computer-usable program code embodied therein.
The present invention is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus, and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above description is only a preferred embodiment of the present invention, and is not intended to limit the scope of the present invention.
Claims (15)
1. An information processing method is applied to electronic equipment, and the electronic equipment is provided with a first sensing unit, a first display module and a second display module; the first display module is provided with a first display area; the second display module is provided with a second display area; when the electronic equipment interacts with a user of the electronic equipment in a first interaction mode, the user can observe the first display area of the first display module; when the electronic equipment interacts with the user in a second interaction mode, the user can observe the second display area of the second display module; the method comprises the following steps:
controlling a first display area of the first display module to output and display a first object identifier, so that the user in the first interaction mode perceives the first object identifier through the first display area;
obtaining sensing parameters through the first sensing unit;
determining an interaction mode of the electronic equipment according to the sensing parameters;
when the sensing parameters indicate that the electronic equipment is in the second interaction mode, determining the first object identifier; generating a call instruction based on the first object identifier; and responding to the call instruction to start a first application program corresponding to the first object identifier;
and controlling the second display area of the second display module to output the first user interaction interface corresponding to the first application program, so that the user in the second interaction mode can perceive the first user interaction interface through the second display area.
2. The method according to claim 1, wherein the first display module and the second display module differ in display principle; the second display area is smaller than the first display area; because the display principles of the first display module and the second display module differ, the ways in which the user interacts with the electronic equipment also differ;
the first interaction mode is an observation mode in which the eyes of the user are far away from the first display area of the first display module on the electronic equipment; when the user observes the first display area of the electronic equipment in the first interaction mode, the user perceives a first perceived picture; the size of the first perceived picture is equal to that of the first display area; the first perceived picture comprises the first object identifier;
the second interaction mode is an observation mode in which the eyes of the user approach the second display area of the second display module on the electronic equipment; when the user observes the second display area of the electronic equipment in the second interaction mode, a light beam emitted by the second display module is incident on the eyes of the user, so that the user perceives a second perceived picture; the size of the second perceived picture is larger than that of the second display area; the second perceived picture comprises the first user interaction interface.
3. The method of claim 2, wherein the second display area is smaller than or equal to a predetermined area; the first display area is larger than the predetermined area.
4. The method of claim 3, wherein the predetermined area is a cross-sectional area of the field of view of the user's eyes when the user's eyes are at a predetermined distance from the electronic equipment.
5. The method according to claim 2, wherein before controlling the first display area of the first display module to output and display the first object identifier, the method further comprises:
obtaining sensing parameters through the first sensing unit;
when the sensing parameters indicate that the electronic equipment is in the first interaction mode, obtaining K object identifiers, where K is a positive integer;
controlling a first display area of the first display module to display one of the K object identifiers;
the controlling of the first display area of the first display module to output and display the first object identifier comprises:
obtaining a selection operation through a second sensing unit of the electronic device;
and determining a first object identifier from the K object identifiers according to the selection operation and displaying the first object identifier in a first display area of the first display module.
6. The method of claim 2, further comprising:
obtaining an input operation aiming at the first user interaction interface through a third sensing unit of the electronic equipment;
changing the first user interaction interface to form a second user interaction interface in response to the input operation;
obtaining sensing parameters through the first sensing unit of the electronic equipment;
when the sensing parameters indicate that the electronic equipment is in the first interaction mode, processing the second user interaction interface to generate added information;
controlling, with the added information, the first object identifier to change, and outputting and displaying the changed first object identifier in the first display area of the first display module; the changed first object identifier includes the added information.
7. The method of claim 2, wherein the method comprises:
when the sensing parameters indicate that the electronic equipment is in the second interaction mode, updating once, each time the second display area of the second display module is controlled to output for the same object identifier, a display parameter value of the bearing surface of the first user interaction interface corresponding to the first application program.
8. The method of claim 2, wherein the electronic device is a wearable electronic device; the wearable electronic device includes: a frame body, a fixing device and a functional main body part;
the fixing device is connected with the frame body, the fixing device is used for fixing the electronic equipment on a support body, and the frame body and the fixing device form an annular space when the electronic equipment is fixed on the support body through the fixing device;
the functional main body part at least comprises a first display module and a second display module; both the first display module and the second display module are arranged in the frame body;
the first display module is a first display screen, the first display screen is configured to display and output second display content, and the display area of the first display screen is the first display area of the first display module;
the second display module is an optical projection system, a first part of the second display module is a light conduction assembly, and a second part of the second display module is a display assembly and a collimation assembly;
the light conduction assembly is made of transparent materials;
the display assembly is configured to display and output first display content and to project the first display content as a first light beam;
the collimation assembly is configured to process the projected first light beam and convert it into a second light beam for output;
the light conduction assembly is configured to conduct the second light beam within the transparent material forming the light conduction assembly, wherein the light conduction assembly comprises a reflection unit, the reflection unit is arranged in a specific area of the light conduction assembly, and the reflection unit is configured to project the second light beam in a second direction by changing the conduction direction of the second light beam within the transparent material; the second direction is consistent with the output direction of the display content of the first display screen of the first display module; the specific area of the light conduction assembly in which the reflection unit is arranged is the second display area of the second display module.
9. An electronic device is provided with a first sensing unit, a first display unit, and a second display unit; the first display unit is provided with a first display area; the second display unit is provided with a second display area; when the electronic equipment interacts with a user of the electronic equipment in a first interaction mode, the user can observe the first display area of the first display unit; when the electronic equipment interacts with the user in a second interaction mode, the user can observe the second display area of the second display unit; the electronic equipment further includes: a first output control unit, a first determining unit, a call execution unit, and a second output control unit; wherein,
the first output control unit is used for controlling a first display area of the first display unit to output and display a first object identifier, so that the user in the first interaction mode can perceive the first object identifier through the first display area;
the first determining unit is configured to obtain sensing parameters through the first sensing unit and determine an interaction mode of the electronic equipment according to the sensing parameters;
the call execution unit is configured to, when the first determining unit determines that the electronic equipment is in the second interaction mode, determine the first object identifier output and displayed by the first output control unit, and generate a call instruction based on the first object identifier; and respond to the call instruction to start a first application program corresponding to the first object identifier;
the second output control unit is configured to control the second display area of the second display unit to output the first user interaction interface corresponding to the first application started by the call execution unit, so that the user in the second interaction manner perceives the first user interaction interface through the second display area.
10. The electronic device according to claim 9, wherein the first display unit and the second display unit differ in display principle; the second display area is smaller than the first display area; because the display principles of the first display unit and the second display unit differ, the ways in which the user interacts with the electronic equipment also differ;
the first interaction mode is an observation mode in which the eyes of the user are far away from the first display area of the first display unit on the electronic equipment; when the user observes the first display area of the electronic equipment in the first interaction mode, the user perceives a first perceived picture; the size of the first perceived picture is equal to that of the first display area; the first perceived picture comprises the first object identifier;
the second interaction mode is an observation mode in which the eyes of the user approach the second display area of the second display unit on the electronic equipment; when the user observes the second display area of the electronic equipment in the second interaction mode, a light beam emitted by the second display unit is incident on the eyes of the user, so that the user perceives a second perceived picture; the size of the second perceived picture is larger than that of the second display area; the second perceived picture comprises the first user interaction interface.
11. The electronic device according to claim 10, wherein the second display area is smaller than or equal to a predetermined area; the first display area is larger than the predetermined area.
12. The electronic device of claim 11, wherein the predetermined area is a cross-sectional area of the field of view of the user's eyes when the user's eyes are at a predetermined distance from the electronic equipment.
13. The electronic device of claim 10, further comprising a second sensing unit for obtaining a selection operation;
the first output control unit is further configured to obtain sensing parameters through the first sensing unit before the first display area of the first display unit is controlled to output and display the first object identifier; when the sensing parameters indicate that the electronic equipment is in the first interaction mode, obtain K object identifiers, where K is a positive integer; and control the first display area of the first display unit to display one of the K object identifiers; the first output control unit is further configured to determine a first object identifier from the K object identifiers according to the selection operation obtained by the second sensing unit and display the first object identifier in the first display area of the first display unit.
14. The electronic device of claim 10, further comprising a third sensing unit for obtaining an input operation for the first user interaction interface;
the second output control unit is further configured to change the first user interaction interface in response to the input operation obtained by the third sensing unit, so as to form a second user interaction interface;
the first determining unit is configured to obtain sensing parameters through the first sensing unit and determine an interaction mode of the electronic equipment according to the sensing parameters;
the first output control unit is further configured to process the second user interaction interface to generate added information when the first determining unit determines that the electronic equipment is in the first interaction mode; control, with the added information, the first object identifier to change; and display and output the changed first object identifier in the first display area of the first display unit; the changed first object identifier includes the added information.
15. The electronic device according to claim 10, wherein the second output control unit is further configured to, when the first determining unit determines that the electronic equipment is in the second interaction mode, update a display parameter value of the bearing surface of the first user interaction interface corresponding to the first application program once each time the second display area of the second display unit is controlled to output the interface for the same object identifier started by the call execution unit.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410455816.3A CN105468135B (en) | 2014-09-09 | 2014-09-09 | A kind of information processing method and electronic equipment |
US14/584,630 US9727296B2 (en) | 2014-06-27 | 2014-12-29 | Display switching method, information processing method and electronic device |
DE102014019637.2A DE102014019637B4 (en) | 2014-06-27 | 2014-12-30 | Display switching method, data processing method and electronic device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201410455816.3A CN105468135B (en) | 2014-09-09 | 2014-09-09 | A kind of information processing method and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN105468135A true CN105468135A (en) | 2016-04-06 |
CN105468135B CN105468135B (en) | 2019-02-05 |
Family
ID=55605921
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201410455816.3A Active CN105468135B (en) | 2014-06-27 | 2014-09-09 | A kind of information processing method and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN105468135B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107490954A (en) * | 2016-06-12 | 2017-12-19 | 陈亮 | Include the intelligent watch of two table bodies |
CN107491163A (en) * | 2016-06-12 | 2017-12-19 | 陈亮 | Double table body intelligent watch and its table body are towards determination methods, system and display screen lighting system |
CN111143726A (en) * | 2019-12-05 | 2020-05-12 | 维沃移动通信有限公司 | Information display method and mobile terminal |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102375650A (en) * | 2010-08-11 | 2012-03-14 | Shanghai Sanqi Communication Technology Co., Ltd. | Brand-new mobile terminal interface layout and switching interaction mode |
CN102419677A (en) * | 2010-09-27 | 2012-04-18 | Shanghai Sanqi Communication Technology Co., Ltd. | Human-machine interaction method for mixed display of standby information, main menu and submenus |
CN102646016A (en) * | 2012-02-13 | 2012-08-22 | Beijing Baina Information Technology Co., Ltd. | User terminal displaying a unified gesture and speech interaction interface, and display method thereof |
US20130222270A1 (en) * | 2012-02-28 | 2013-08-29 | Motorola Mobility, Inc. | Wearable display device, corresponding systems, and method for presenting output on the same |
CN103733247A (en) * | 2011-08-02 | 2014-04-16 | Microsoft Corporation | Changing between display device viewing modes |
Events
- 2014-09-09: Application CN201410455816.3A filed in China; granted as CN105468135B (status: Active)
Also Published As
Publication number | Publication date |
---|---|
CN105468135B (en) | 2019-02-05 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR102090075B1 (en) | Menu navigation in a head-mounted display | |
CN114402589B (en) | Smart stylus beam and auxiliary probability input for element mapping in 2D and 3D graphical user interfaces | |
US9727296B2 (en) | Display switching method, information processing method and electronic device | |
US10339382B2 (en) | Feedback based remote maintenance operations | |
KR20170126295A (en) | Head mounted display device and method for controlling the same | |
US20190227694A1 (en) | Device for providing augmented reality service, and method of operating the same | |
KR20150110257A (en) | Method and wearable device for providing a virtual input interface | |
CN105334913B (en) | Electronic device | |
US20150193977A1 (en) | Self-Describing Three-Dimensional (3D) Object Recognition and Control Descriptors for Augmented Reality Interfaces | |
CN112136096B (en) | Displaying a physical input device as a virtual object | |
CN105334955B (en) | Information processing method and electronic device | |
CN105450838B (en) | Information processing method and electronic device | |
CN106610781A (en) | Smart wearable device | |
CN105468135B (en) | Information processing method and electronic device | |
JP2016096449A (en) | Image display device, brightness change method, and program | |
CN109845251B (en) | Electronic device and method for displaying images | |
JP6740613B2 (en) | Display device, display device control method, and program | |
US20200218198A1 (en) | Movement control of holographic objects with crown movement of a watch device | |
CN112965773A (en) | Method, apparatus, device and storage medium for information display | |
CN115981481A (en) | Interface display method, device, equipment, medium and program product | |
CN105785749B (en) | Display method and electronic device | |
US9563344B2 (en) | Information processing method and electronic apparatus | |
US11449296B2 (en) | Display system, display method, and program | |
US20210200495A1 (en) | Display system, display method, and program | |
JP2016212769A (en) | Display device, control method for the same and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||