CN111443805B - Display method and wearable electronic equipment


Info

Publication number
CN111443805B
Authority
CN
China
Prior art keywords
display area, wearable electronic device, display, user
Legal status
Active
Application number
CN202010224965.4A
Other languages
Chinese (zh)
Other versions
CN111443805A
Inventor
向永航
程鑫
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202010224965.4A
Publication of CN111443805A
Application granted
Publication of CN111443805B


Classifications

    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • A63F13/212: Input arrangements for video game devices characterised by their sensors, purposes or types, using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • G02B27/017: Head-up displays, head mounted
    • G02B27/0176: Head mounted, characterised by mechanical features
    • G02B2027/0178: Head mounted, eyeglass type
    • G06F2203/012: Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Abstract

The embodiment of the invention discloses a display method and a wearable electronic device, relates to the technical field of communication, and can solve the problem of poor convenience in using wearable electronic devices. The method comprises the following steps: displaying a first picture through the first display area or the second display area; and controlling the first display area to move relative to the second display area according to the working state of the wearable electronic device or the use condition of the wearable electronic device by the user. The embodiment of the invention is applied to the process in which the wearable electronic device controls the relative movement of its display areas.

Description

Display method and wearable electronic equipment
Technical Field
The embodiment of the invention relates to the technical field of communication, in particular to a display method and wearable electronic equipment.
Background
Currently, in the process of playing a Virtual Reality (VR) game, an electronic device (e.g., VR glasses) may display an interface of the VR game on a screen of VR glasses so that a user can play the VR game according to the interface. Specifically, after wearing VR glasses, the user can focus the line of sight on the screen to move in the real space where the user is located according to a prompt in an interface of the VR game in the screen, so as to play the VR game.
However, in the above method, during the VR game, the user can only focus the line of sight on the screen and move according to the prompt in the screen, and the line of sight cannot be focused in the real space, so that the user may collide in the real space, which results in poor convenience of using VR glasses.
Disclosure of Invention
The embodiment of the invention provides a display method and a wearable electronic device, which can solve the problem of poor convenience in using wearable electronic devices.
In order to solve the technical problem, the embodiment of the invention adopts the following technical scheme:
in a first aspect of the embodiments of the present invention, a display method is provided, which is applied to a wearable electronic device, where the wearable electronic device has a first display area and a second display area, and a size of the first display area is smaller than a size of the second display area, and the display method includes: displaying a first picture through the first display area or the second display area; and controlling the first display area to move relative to the second display area according to the working state of the wearable electronic equipment or the use condition of the wearable electronic equipment by the user.
In a second aspect of the embodiments of the present invention, there is provided a wearable electronic device having a first display area and a second display area, a size of the first display area being smaller than a size of the second display area, the wearable electronic device including: the device comprises a display module and a control module. The display module is used for displaying a first picture through the first display area or the second display area. The control module is used for controlling the first display area to move relative to the second display area according to the working state of the wearable electronic equipment or the use condition of the wearable electronic equipment by a user.
In a third aspect of embodiments of the present invention, a wearable electronic device is provided, which includes a processor, a memory, and a computer program stored on the memory and operable on the processor, and when executed by the processor, the computer program implements the steps of the display method according to the first aspect.
A fourth aspect of embodiments of the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when executed by a processor, implements the steps of the display method according to the first aspect.
In the embodiment of the invention, when the first picture is displayed through the first display area or the second display area, the wearable electronic device can control the first display area to move relative to the second display area according to the working state of the wearable electronic device or the use condition of the wearable electronic device by the user. Because the wearable electronic device can determine, according to its working state or the use condition of the wearable electronic device by the user, whether the user needs to know the condition of the external environment in which the user is located, and, when it determines that the user does need to know that condition, can control the first display area to move relative to the second display area so as to change the display position of the first display area relative to the second display area, the user can learn about the external environment through the first display area after its display position is changed. This prevents the user from colliding with objects in the real space, and therefore improves the convenience of using the wearable electronic device.
Drawings
Fig. 1 is a schematic structural diagram of an android operating system according to an embodiment of the present invention;
fig. 2 is a schematic state diagram of a wearable electronic device according to an embodiment of the present invention;
FIG. 3 is a schematic diagram illustrating a display method according to an embodiment of the present invention;
fig. 4A is a second schematic state diagram of the wearable electronic device according to the embodiment of the invention;
fig. 4B is a third schematic state diagram of a wearable electronic device according to an embodiment of the invention;
fig. 5A is a fourth schematic state diagram of the wearable electronic device according to the embodiment of the present invention;
fig. 5B is a fifth schematic state diagram of the wearable electronic device according to the embodiment of the present invention;
fig. 6 is a sixth schematic view illustrating a state of the wearable electronic device according to the embodiment of the present invention;
fig. 7 is a schematic structural diagram of a wearable electronic device according to an embodiment of the present invention;
fig. 8 is a second schematic structural diagram of a wearable electronic device according to an embodiment of the present invention;
fig. 9 is a hardware schematic diagram of a wearable electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
The terms "first" and "second," and the like, in the description and in the claims of embodiments of the present invention are used for distinguishing between different objects and not for describing a particular order of the objects. For example, the first display screen and the second display screen, etc. are used to distinguish between different display screens, rather than to describe a particular order of display screens.
In the description of the embodiments of the present invention, the meaning of "a plurality" means two or more unless otherwise specified. For example, a plurality of elements refers to two elements or more.
The term "and/or" herein is an association relationship describing an associated object, and means that there may be three relationships, for example, a display panel and/or a backlight, which may mean: there are three cases of a display panel alone, a display panel and a backlight at the same time, and a backlight alone. The symbol "/" herein denotes a relationship in which the associated object is or, for example, input/output denotes input or output.
In the embodiments of the present invention, words such as "exemplary" or "for example" are used to indicate an example, illustration, or description. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention is not to be construed as preferred or advantageous over other embodiments or designs. Rather, the words "exemplary" and "for example" are intended to present related concepts in a concrete fashion.
The embodiment of the invention provides a display method and a wearable electronic device. The wearable electronic device can determine, according to its working state or the use condition of the wearable electronic device by the user, whether the user needs to know the condition of the external environment in which the user is located, and, when it determines that the user does, can control a first display area to move relative to a second display area to change the display position of the first display area relative to the second display area. The user can then learn about the external environment through the first display area after its display position is changed, which prevents the user from colliding with objects in the real space and thus improves the convenience of using the wearable electronic device.
The display method and the wearable electronic device provided by the embodiment of the invention can be applied to the process of controlling the relative motion of the display area of the wearable electronic device by the wearable electronic device. Specifically, the method can be applied to a process that the wearable electronic device controls the relative movement of the display area of the wearable electronic device according to the working state of the wearable electronic device or the use condition of the wearable electronic device by the user.
The wearable electronic device in the embodiment of the invention can be a wearable electronic device with an operating system. The operating system may be an Android (Android) operating system, an ios operating system, or other possible operating systems, and embodiments of the present invention are not limited in particular.
The following describes a software environment to which the display method provided by the embodiment of the present invention is applied, by taking an android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the android operating system includes 4 layers, which are respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system.
The application framework layer is a framework of the application, and a developer can develop some applications based on the application framework layer under the condition of complying with the development principle of the framework of the application.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking an android operating system as an example, in the embodiment of the present invention, a developer may develop a software program for implementing the display method provided in the embodiment of the present invention based on the system architecture of the android operating system shown in fig. 1, so that the display method may operate based on the android operating system shown in fig. 1. That is, the processor or the electronic device may implement the display method provided by the embodiment of the present invention by running the software program in the android operating system.
The wearable electronic device in the embodiment of the invention can be a mobile electronic device and can also be a non-mobile electronic device. For example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a Personal Computer (PC), a Television (TV), a teller machine, a self-service machine, and the like, and the embodiment of the present invention is not particularly limited.
A display method and an electronic device provided in an embodiment of the present invention are described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
In the embodiment of the invention, when a user wears the wearable electronic device and plays a VR game through it, the user can watch the picture of the VR game displayed in the display area of the display screen of the wearable electronic device and play the game through a game controller, so as to interact with the VR game. To prevent the user from colliding with objects in the real space during this interaction, the wearable electronic device may control the display area in the display screen to move according to the state of the external environment in which the wearable electronic device is located (for example, an obstacle appears in the external environment), the type of application run by the wearable electronic device (for example, an assistant-type application), or the user's operation of the wearable electronic device (for example, a period during which the user does not operate the wearable electronic device), so that the display area lies outside the user's core visual area (that is, the display area is prevented from blocking the user's line of sight). The user can thereby observe the external environment, avoid collisions in the real space, and use the wearable electronic device more conveniently.
Optionally, in an embodiment of the present invention, the wearable electronic device may specifically be a VR-type electronic device (e.g., VR glasses), or an Augmented Reality (AR) -type electronic device (e.g., AR glasses).
It should be noted that, the wearable electronic device in the embodiment of the present invention may include multiple sets of display screens, each set of display screens in the multiple sets of display screens may include two display screens (for example, a first display screen and a second display screen), and in the following embodiments, the wearable electronic device includes only one set of display screens as an example, and the display method provided in the embodiment of the present invention is described in an exemplary manner.
Optionally, in an embodiment of the present invention, in a case where the wearable electronic device is VR glasses, the wearable electronic device may include a VR glasses frame, two groups of display screens, and two lenses. Each of the two groups of display screens may include a first display screen and a second display screen, each group of display screens corresponds to one lens, both lenses are connected to the VR glasses frame, and each lens corresponds to one eye of the user.
Optionally, in an embodiment of the present invention, one of the two lenses is located on the first surface of the group of display screens corresponding to the one lens, and the other of the two lenses is located on the first surface of the group of display screens corresponding to the other lens.
It should be noted that the above "first surface of the group of display screens" may be understood as: the side of the group of display screens that is close to the user's face when the user wears the wearable electronic device.
Optionally, in the embodiment of the present invention, taking one group of display screens as an example, the group of display screens includes a first display screen and a second display screen, and both are connected to the VR glasses frame. The second display screen is provided with a groove, the groove is arranged opposite the first display screen, and the notch size of the groove may be the same as the size of the first display screen.
Optionally, in an embodiment of the present invention, the second display screen and the first display screen are located on the same plane. It can be understood that the first display screen is located in the groove of the second display screen, that is, the first display screen and the second display screen form a larger display screen, so that a user can watch pictures displayed by the first display screen and the second display screen to use VR glasses.
Optionally, in this embodiment of the present invention, the second display screen may be located on a different plane from the first display screen. It can be understood that the second display screen may be located on the first side of the first display screen, so that the user may directly view the second display screen, and the user's sight may view the first display screen through the groove of the second display screen to view the pictures displayed by the first display screen and the second display screen; alternatively, the second display screen may be located on the second side of the first display screen, so that the user may view the pictures displayed by the first display screen and the second display screen to use VR glasses.
It should be noted that the above "first side of the first display screen" may be understood as: the side of the first display screen that is close to the user's face when the user wears the wearable electronic device. The above "second side of the first display screen" may be understood as: the side of the first display screen that is far from the user's face when the user wears the wearable electronic device.
It is understood that the first side and the second side are two opposite sides of the first display screen.
For example, taking the wearable electronic device as VR glasses for illustration, the VR glasses 10 include two groups of display screens, and each group of display screens includes a first display screen and a second display screen (one group of display screens is taken as an example below). As shown in fig. 2, the VR glasses 10 include a VR glasses frame 11, a first display screen (e.g., a display screen 12) and a second display screen (e.g., a display screen 13). Both the display screen 12 and the display screen 13 are connected to the VR glasses frame 11, a groove 14 is formed on the display screen 13, and the display screen 12 is located in the groove 14 and forms a larger display screen together with the display screen 13, so that the user can directly view the pictures displayed by the display screen 12 and the display screen 13.
Fig. 3 shows a flowchart of a display method provided in an embodiment of the present invention, and the method can be applied to an electronic device having an android operating system shown in fig. 1. As shown in fig. 3, the display method provided by the embodiment of the present invention may include steps 201 and 202 described below.
Step 201, the wearable electronic device displays a first picture through the first display area or the second display area.
In an embodiment of the present invention, the wearable electronic device has a first display area and a second display area, and a size of the first display area is smaller than a size of the second display area.
It should be noted that each group of display screens in the multiple groups of display screens may include a first display screen and a second display screen, and each group of display screens may perform step 201 and step 202, so that the wearable electronic device may change the display position of the first display area relative to the second display area, and thus the user may know the external environment condition of the user through the first display area after changing the display position.
Optionally, in an embodiment of the present invention, the first display area may be all display areas in a first display screen of the wearable electronic device, and the second display area may be all display areas in a second display screen of the wearable electronic device.
For example, assuming that the wearable electronic device is VR glasses, one lens of the VR glasses may include a first display area and a second display area, so that the VR glasses may display a first screen through the first display area or the second display area.
Optionally, in this embodiment of the present invention, the first picture may be a picture in a target electronic device connected to the wearable electronic device; or, may be a picture in the wearable electronic device.
It is understood that the first display region may display a part of the content of the first screen, and the second display region may display another part of the content of the first screen.
Optionally, in an embodiment of the present invention, the target electronic device may be a mobile electronic device (e.g., a mobile phone, a tablet computer, a notebook computer, a palm computer, etc.), or may be a non-mobile electronic device (e.g., a personal computer, a television, etc.).
Optionally, in this embodiment of the present invention, when the user uses the wearable electronic device, the user may trigger the wearable electronic device to connect to the target electronic device (for example, through a wireless or wired connection) and open a first application in the target electronic device through the wearable electronic device, so that the target electronic device runs the first application and provides a picture in the first application (for example, a play picture in the first application). The wearable electronic device may then display the picture of the first application, that is, the first picture, through the first display area or the second display area, and the user can view the content in the first picture through the first display screen and the second display screen.
Optionally, in this embodiment of the present invention, when the user uses the wearable electronic device, the user opens a first application on the wearable electronic device, so that the wearable electronic device runs the first application and displays a picture in the first application (for example, a play picture in the first application). The wearable electronic device may thus display the picture of the first application, that is, the first picture, through the first display area or the second display area, and the user can view the content in the first picture through the first display screen and the second display screen.
It should be noted that the above "the user uses the wearable electronic device" may be understood as: the user wears the wearable electronic device, and the first display screen and the second display screen of the wearable electronic device are both in the screen-on state, so that the user can provide input to the wearable electronic device and the wearable electronic device can respond to that input.
Optionally, in this embodiment of the present invention, the first application may be a game type application, a video type application, a navigation type application, or a translation type application; the first application image may be a game image, a video image, or a navigation image.
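As an illustrative sketch only (not part of the claimed method), step 201 can be pictured as routing the first picture to one of the two display areas, optionally splitting its content between them as noted above; the Kotlin types below (AreaId, Picture, DisplayArea, PictureRouter) are hypothetical placeholders introduced here.

    // Hypothetical sketch of step 201; none of these types come from the patent.
    enum class AreaId { FIRST, SECOND }

    data class Picture(val content: String)

    interface DisplayArea {
        val id: AreaId
        fun show(part: String)
    }

    class PictureRouter(private val areas: Map<AreaId, DisplayArea>) {
        // Show the whole first picture through a single display area.
        fun showOn(area: AreaId, picture: Picture) {
            areas[area]?.show(picture.content)
        }

        // Split the first picture so each area displays a part of its content.
        fun showSplit(picture: Picture) {
            val half = picture.content.length / 2
            areas[AreaId.FIRST]?.show(picture.content.substring(0, half))
            areas[AreaId.SECOND]?.show(picture.content.substring(half))
        }
    }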
Optionally, in the embodiment of the present invention, when the wearable electronic device displays the first screen through the first display area or the second display area, the wearable electronic device may acquire the first information.
In an embodiment of the present invention, the first information is used to indicate an application scenario corresponding to the wearable electronic device. It can be understood that the "application scenario" is: the working state of the wearable electronic device or the use condition of the wearable electronic device by the user.
Optionally, in an embodiment of the present invention, the first information includes at least one of the following: the wearable electronic device comprises environment information of an external environment where the wearable electronic device is located, application information of an application started by the wearable electronic device, state parameters of the wearable electronic device, a first duration of time for which the wearable electronic device does not respond to the operation of the user, and a first input parameter for the operation of the wearable electronic device by the user.
Specifically, under the condition that the first information is the environment information of the external environment where the wearable electronic device is located, the wearable electronic device can acquire the image of the external environment where the wearable electronic device is located in real time through the camera to acquire the environment information.
Specifically, when the first information is application information of an application started by the wearable electronic device, the application information is used for indicating a type of the application started by the wearable electronic device, and the wearable electronic device detects the started application to acquire the application information of the application.
Specifically, when the first information is a state parameter of the wearable electronic device, the state parameter of the wearable electronic device is at least one of the following: a yaw angle value between the wearable electronic device and a ground plane, a motion velocity of the wearable electronic device, and a motion acceleration of the wearable electronic device; the wearable electronic device can detect the state parameter of the wearable electronic device through the gyroscope.
Specifically, in the case that the first information is a first duration during which the wearable electronic device does not respond to the operation of the user, the first duration may be understood as: the duration that the wearable electronic device is in an operating state (e.g., a bright screen state) and does not respond to any operation by the user may be used to indicate the user's usage of the wearable electronic device. Further, the operation of the wearable electronic device by the user may be any one of the following: a press input (e.g., a click input), a slide input, a gaze input, a voice input, and a preset action input (e.g., a blink input);
specifically, when the first information is a first input parameter of the user operating the wearable electronic device, the first input parameter may specifically be: the type of input, the duration of input, or the information (e.g., number of inputs, content of inputs) entered by the user into the wearable electronic device, etc. It should be noted that the "gaze input" may be understood as: and inputting that the gazing duration of the wearable electronic equipment by the user is greater than or equal to the preset time threshold.
Step 202, the wearable electronic device controls the first display area to move relative to the second display area according to the working state of the wearable electronic device or the use condition of the wearable electronic device by the user.
It should be noted that the above "working state of the wearable electronic device" may be understood as: the state of the wearable electronic device corresponding to the environment in which it is working, or the state corresponding to the application started by the wearable electronic device.
Optionally, in this embodiment of the present invention, if the first information meets the third preset condition, the wearable electronic device controls the first display screen to execute a motion action, so as to control the first display area to move relative to the second display area.
Specifically, in the embodiment of the present invention, the third preset condition includes at least one of the following conditions: the first input parameter is matched with the preset input parameter, the external environment where the wearable electronic device is located meets a first preset condition, the application information of the application started by the wearable electronic device is target application information, the state parameter of the wearable electronic device meets a second preset condition, and the first duration is longer than or equal to first preset time.
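A minimal sketch of the third preset condition described above, in which any one of the listed conditions is enough to trigger the movement; the MovementDecider class, its predicate parameters, and the threshold wiring are assumptions introduced here, not the patent's implementation.

    // Hypothetical decision logic: any one condition satisfies the third preset condition.
    data class Thresholds(val stateThreshold: Float, val idlePresetMillis: Long)

    class MovementDecider(
        private val thresholds: Thresholds,
        private val matchesPresetInput: (String) -> Boolean,   // first input parameter check
        private val environmentMeetsCondition: () -> Boolean,  // first preset condition
        private val isTargetApplication: (String) -> Boolean   // target application info check
    ) {
        fun shouldMoveFirstArea(
            inputType: String?,
            startedApp: String?,
            stateParameter: Float?,
            idleMillis: Long
        ): Boolean =
            (inputType != null && matchesPresetInput(inputType)) ||
            environmentMeetsCondition() ||
            (startedApp != null && isTargetApplication(startedApp)) ||
            (stateParameter != null && stateParameter >= thresholds.stateThreshold) ||
            idleMillis >= thresholds.idlePresetMillis
    }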
Optionally, in an embodiment of the present invention, one end of the first display region may be connected to the wearable electronic device through a chute of the wearable electronic device, so that the wearable electronic device may control the first display region to move along a direction perpendicular to the chute, and then move relative to the wearable electronic device along the direction of the chute.
It can be understood that the wearable electronic device can control the first display area to move along a direction perpendicular to the chute first, so that the first display area and the second display area are on different planes, and thus the wearable electronic device can control the first display area to move relative to the wearable electronic device along the direction of the chute.
In the embodiment of the invention, the wearable electronic device can control the first display area to move relative to the second display area, so that the position of the first display area relative to the wearable electronic device is changed, and the user can focus the sight line in the environment where the user is located (namely the external environment where the wearable electronic device is located).
It is understood that the wearable electronic device may control the first display screen to perform a motion action so as to move the first display screen to a position other than the user's visual core area, so that the user can focus the line of sight on the external environment in which the user is located.
In a possible implementation manner of the embodiment of the present invention, the wearable electronic device may determine whether to control the first display area to move relative to the second display area according to environment information corresponding to the wearable electronic device. Specifically, the step 202 can be realized by the following step 202 a.
Step 202a, under the condition that the external environment where the wearable electronic device is located meets a first preset condition, the wearable electronic device controls the first display area to move relative to the second display area.
It can be understood that, under the condition that the first information is environment information of an external environment where the wearable electronic device is located, if the environment information indicates that the external environment where the wearable electronic device is located meets a first preset condition, the wearable electronic device controls the first display screen to execute the motion action.
Optionally, in an embodiment of the present invention, the external environment where the wearable electronic device is located meets a first preset condition includes: a first object appears in a viewing angle range of the wearable electronic device; alternatively, a state of the first object in a viewing angle range of the wearable electronic device changes.
It should be noted that the above "viewing angle range of the wearable electronic device" can be understood as: the range corresponding to the image captured by the camera of the wearable electronic device.
For example, the wearable electronic device may perform image detection on an image of an external environment where the wearable electronic device is located to determine whether a first object (e.g., another user, an obstacle (e.g., a table, etc.)) appears in a viewing angle range of the wearable electronic device, and if the first object appears in the image, the wearable electronic device controls the first display screen to perform a motion action; or, to determine whether a state of the first object in the viewing angle range of the wearable electronic device changes (e.g., the first object changes from a static state to a moving state), if the state of the first object in the image changes, the wearable electronic device controls the first display screen to perform a moving action.
In the embodiment of the invention, if the environment information indicates that the external environment where the wearable electronic device is located meets the first preset condition, it can be considered that the user may collide in the external environment where the user is located, and the user may need to know the external environment condition where the user is located, so that the wearable electronic device can control the first display screen to execute the motion action, and the user can know the external environment condition where the user is located through the first display screen after executing the motion action.
In the embodiment of the invention, the wearable electronic device can determine whether the user needs to know the external environment condition of the user according to the external environment of the wearable electronic device, and under the condition that the external environment meets the first preset condition, the wearable electronic device can control the first display screen to execute the motion action, so that the first display area moves relative to the second display area, and therefore, the user can know the external environment condition of the user through the first display screen after executing the motion action, and the user is prevented from colliding in a real space.
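For illustration, the first preset condition (a first object appears in the viewing angle range, or the state of an object in that range changes) could be checked roughly as follows; the ObjectDetector interface and the per-frame comparison are assumptions introduced here, since the patent does not specify how the image detection is performed.

    // Hypothetical sketch of the first preset condition.
    data class DetectedObject(val id: Int, val moving: Boolean)

    interface ObjectDetector {
        fun detect(frame: ByteArray): List<DetectedObject>
    }

    class EnvironmentMonitor(private val detector: ObjectDetector) {
        private var previous: Map<Int, DetectedObject> = emptyMap()

        // True if a first object newly appears in the viewing range, or if the state of an
        // object already in the viewing range changes (e.g., from static to moving).
        fun meetsFirstPresetCondition(frame: ByteArray): Boolean {
            val current = detector.detect(frame).associateBy { it.id }
            val objectAppeared = current.keys.any { it !in previous.keys }
            val stateChanged = current.values.any { obj ->
                val prev = previous[obj.id]
                prev != null && prev.moving != obj.moving
            }
            previous = current
            return objectAppeared || stateChanged
        }
    }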
Optionally, in another possible implementation manner of the embodiment of the present invention, the wearable electronic device may determine whether to control the first display area to move relative to the second display area according to application information started by the wearable electronic device. Specifically, the step 202 can be realized by the step 202b described below.
Step 202b, in a case where the wearable electronic device starts the target application, the wearable electronic device controls the first display area to move relative to the second display area.
It can be understood that, in the case that the first information is application information of an application started by the wearable electronic device, if the application information matches preset application information in the wearable electronic device (or the target electronic device), the wearable electronic device (or the target electronic device) may determine that the application started by the wearable electronic device is the target application, so that the wearable electronic device controls the first display screen to perform the motion action.
Optionally, in this embodiment of the present invention, the target application may specifically include an assistant application (for example, a translation application, a navigation application, and the like), a music application, a social application, and the like.
In the embodiment of the invention, if the application started by the wearable electronic device is the target application, it can be considered that the user may not need to watch the picture of the target application through the first display screen and the second display screen, so that the wearable electronic device can control the first display screen to execute the motion action, and the user can know the external environment condition of the user through the first display screen after executing the motion action.
In the embodiment of the invention, the wearable electronic device can determine whether the application is the target application according to the application information of the application started by the wearable electronic device, and under the condition that the application is determined to be the target application, the wearable electronic device can control the first display screen to execute the motion action, so that the first display area moves relative to the second display area, and thus a user can know the external environment condition of the user through the first display screen after executing the motion action.
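A short sketch of step 202b under the assumption that the preset application information is simply a set of application identifiers; the package names below are placeholders, not real applications.

    // Hypothetical target-application check for step 202b.
    class TargetAppChecker(private val presetTargetApps: Set<String>) {
        fun isTargetApplication(startedPackage: String): Boolean = startedPackage in presetTargetApps
    }

    fun main() {
        // Placeholder identifiers for assistant-type (translation/navigation) applications.
        val checker = TargetAppChecker(setOf("com.example.translator", "com.example.navigation"))
        if (checker.isTargetApplication("com.example.translator")) {
            println("Target application started: move the first display area out of the core visual area")
        }
    }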
Optionally, in a further possible implementation manner of the embodiment of the present invention, the wearable electronic device may determine whether to control the first display area to move relative to the second display area according to a state parameter of the wearable electronic device. Specifically, the step 202 can be realized by the following step 202 c.
Step 202c, under the condition that the state parameter of the wearable electronic device meets a second preset condition, the wearable electronic device controls the first display area to move relative to the second display area.
It can be understood that, under the condition that the first information is a state parameter of the wearable electronic device, if the state parameter meets a second preset condition, the wearable electronic device controls the first display screen to execute the motion action.
Optionally, in an embodiment of the present invention, the second preset condition may specifically be that the state parameter is greater than or equal to a first preset threshold.
In the embodiment of the present invention, if the state parameter of the wearable electronic device meets the second preset condition, it may be considered that the user wearing the electronic device is in a standing state (or a moving state), and in that state the user may need to know the condition of the external environment in which the user is located, so the wearable electronic device may control the first display screen to execute a motion action, so that the user can learn about the external environment through the first display screen after it executes the motion action.
In the embodiment of the invention, the wearable electronic device can determine the state of the wearable electronic device worn by the user according to the state parameter of the wearable electronic device, and under the condition that the state parameter meets the second preset condition, the wearable electronic device can control the first display screen to execute the motion action, so that the first display area moves relative to the second display area, and therefore the user can know the external environment condition of the user through the first display screen after executing the motion action, and the user is prevented from colliding in the real space.
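Reading the second preset condition as "any state parameter is greater than or equal to its preset threshold", step 202c could look roughly like the following; the threshold values are placeholders introduced for illustration, not figures from the patent.

    // Hypothetical state-parameter check for step 202c.
    data class MotionState(
        val yawAngleDeg: Float,   // yaw angle value between the device and the ground plane
        val speedMps: Float,      // motion velocity
        val accelMps2: Float      // motion acceleration
    )

    fun meetsSecondPresetCondition(
        state: MotionState,
        yawThresholdDeg: Float = 30f,
        speedThresholdMps: Float = 0.5f,
        accelThresholdMps2: Float = 1.0f
    ): Boolean =
        state.yawAngleDeg >= yawThresholdDeg ||
        state.speedMps >= speedThresholdMps ||
        state.accelMps2 >= accelThresholdMps2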
Optionally, in another possible implementation manner of the embodiment of the present invention, the wearable electronic device may determine whether to control the first display area to move relative to the second display area according to a first duration corresponding to the wearable electronic device. Specifically, the step 202 can be realized by the following step 202 d.
Step 202d, in the case that the wearable electronic device does not respond to the operation of the user within the first preset time, the wearable electronic device controls the first display area to move relative to the second display area.
It can be understood that, in the case that the first information is a first duration for which the wearable electronic device does not respond to the operation of the user, if the first duration is greater than or equal to the first preset time, the wearable electronic device controls the first display screen to execute the motion action.
In the embodiment of the present invention, if the first duration is greater than or equal to the first preset time, it may be considered that the user may not need to view the content in the first picture through the first display screen and the second display screen, so that the wearable electronic device may control the first display screen to execute the motion action.
In the embodiment of the invention, the wearable electronic device can control the first display screen to execute the motion action under the condition that the operation of the user is not responded within the first preset time, so that the first display area moves relative to the second display area, and the user can know the external environment condition of the user through the first display screen after executing the motion action.
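Step 202d can be sketched as an idle-time check; the IdleWatcher class and its use of the system clock are assumptions introduced here for illustration only.

    // Hypothetical idle-duration tracking for step 202d.
    class IdleWatcher(private val firstPresetTimeMillis: Long) {
        private var lastOperationAt: Long = System.currentTimeMillis()

        // Call whenever the device responds to a user operation (press, slide, gaze, voice, ...).
        fun onUserOperationResponded() {
            lastOperationAt = System.currentTimeMillis()
        }

        // Move the first display area once the idle time reaches the first preset time.
        fun shouldMoveFirstArea(now: Long = System.currentTimeMillis()): Boolean =
            now - lastOperationAt >= firstPresetTimeMillis
    }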
Optionally, in another possible implementation manner of the embodiment of the present invention, the wearable electronic device may determine whether to control the first display area to move relative to the second display area according to a first input parameter corresponding to the wearable electronic device. Specifically, the step 202 can be realized by the following step 202 e.
Step 202e, in the case that the wearable electronic device receives the user operation, the wearable electronic device controls the first display area to move relative to the second display area.
It can be understood that, in the case that the first information is a first input parameter of the user operating the wearable electronic device, if the first input parameter matches a preset input parameter (that is, the user operating the wearable electronic device is a preset operation), the wearable electronic device controls the first display screen to perform a motion action.
In the embodiment of the invention, if the first input parameter of the operation of the wearable electronic device by the user is matched with the preset input parameter, the user may need to know the external environment condition of the user, so that the wearable electronic device can control the first display screen to execute the motion action, and the user can know the external environment condition of the user through the first display screen after executing the motion action.
In the embodiment of the invention, the wearable electronic device can control the first display screen to execute the motion action under the condition that the wearable electronic device receives the user operation, so that the first display area moves relative to the second display area, and the user can know the external environment condition of the user through the first display screen after executing the motion action.
According to the display method provided by the embodiment of the invention, when the first picture is displayed through the first display area or the second display area, the wearable electronic device can control the first display area to move relative to the second display area according to the working state of the wearable electronic device or the use condition of the wearable electronic device by the user. Because the wearable electronic device can determine, according to its working state or the use condition of the wearable electronic device by the user, whether the user needs to know the condition of the external environment in which the user is located, and, when it determines that the user does, can control the first display area to move relative to the second display area so as to change the display position of the first display area relative to the second display area, the user can learn about the external environment through the first display area after its display position is changed. This prevents the user from colliding with objects in the real space, and therefore improves the convenience of using the wearable electronic device.
Optionally, in an embodiment of the present invention, the controlling the first display area to move relative to the second display area may be controlling the first display area to move to a preset position, or controlling the first display area to rotate by a preset angle value, or controlling the first display area to move to a preset position and rotate by a preset angle value.
In one embodiment, the wearable electronic device controls the first display area to move to the preset position according to the working state of the wearable electronic device or the use condition of the wearable electronic device by a user.
Optionally, in this embodiment of the present invention, the preset position may be any position on the wearable electronic device except for the target area (e.g., the user visual core area).
It should be noted that the above "user visual core area" can be understood as: the area on which the user's line of sight focuses when using the wearable electronic device.
It will be appreciated that the wearable electronic device may control the first display area to move to a location outside of the user's visual core area (i.e., the wearable electronic device is in a small screen mode) so that the user may focus his or her gaze in the environment in which the user is located.
For example, referring to fig. 2, as shown in fig. 4A, in the case that the first information satisfies the preset condition, the VR glasses 10 may first control the display screen 12 to move along the first direction 15, so that the display screen 12 and the display screen 13 are no longer located on the same plane; as shown in fig. 4B, after controlling the display screen 12 to move in the first direction 15, the VR glasses 10 may control the display screen 12 to move in the second direction 16 to the position 17, so that the user can focus the line of sight on the environment in which the user is located.
In the embodiment of the invention, the wearable electronic device can control the first display area to move to the preset position according to the working state of the wearable electronic device or the use condition of the wearable electronic device by the user, namely, the first display area is controlled to move out of the visual core area of the user, so that the user can know the environmental state of the user, the user is prevented from colliding in the real space, and the use convenience of the wearable electronic device is improved.
In one embodiment, the wearable electronic device controls the first display area to rotate by a preset angle value according to the working state of the wearable electronic device or the use condition of the wearable electronic device by a user.
Optionally, in this embodiment of the present invention, the wearable electronic device may control the first display screen to rotate around the rotation axis by a preset angle value according to the third direction, so as to control the first display area to rotate by the preset angle value.
The "third direction" may be understood as: the first display screen is away from a direction of the user's eyes (e.g., away from the VR glasses frame) while the user is using the wearable electronic device.
It will be appreciated that the wearable electronic device may control the first display area to rotate by a preset angular value in the third direction to rotate the first display area to a position other than the target area (e.g., the user's visual core area) so that the user may focus his or her gaze in the environment in which the user is located.
For example, in conjunction with fig. 2, as shown in fig. 6, in the case that the first information satisfies the preset condition, the VR glasses 10 may control the first display screen (e.g., the display screen 12) to rotate around the rotation axis by a preset angle value along the third direction 19, so that the user can focus the line of sight on the environment in which the user is located.
In the embodiment of the invention, the wearable electronic device can control the first display area to rotate by the preset angle value according to the working state of the wearable electronic device or the use condition of the wearable electronic device by the user, namely, the first display area is controlled to rotate out of the visual core area of the user, so that the user can know the environment state of the user, the user is prevented from colliding in the real space, and the use convenience of the wearable electronic device is improved.
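The two movement modes described above (moving to a preset position along the chute, and rotating by a preset angle value) can be pictured with the following sketch; the DisplayActuator interface and the numeric defaults are assumptions introduced here, since the patent does not define an actuator control interface.

    // Hypothetical actuator control for the first display screen.
    interface DisplayActuator {
        fun lift(distanceMm: Float)             // move perpendicular to the chute (first direction)
        fun slideAlongChute(distanceMm: Float)  // move along the chute (second direction)
        fun rotate(angleDeg: Float)             // rotate about the rotation axis (third direction)
    }

    class FirstAreaMover(private val actuator: DisplayActuator) {
        // Move the first display area to a preset position outside the user's visual core area:
        // first leave the plane of the second display area, then slide along the chute.
        fun moveToPresetPosition(liftMm: Float = 5f, slideMm: Float = 40f) {
            actuator.lift(liftMm)
            actuator.slideAlongChute(slideMm)
        }

        // Rotate the first display area by a preset angle value, away from the user's eyes.
        fun rotateByPresetAngle(angleDeg: Float = 60f) {
            actuator.rotate(angleDeg)
        }

        // Combined mode: move to the preset position and rotate by the preset angle value.
        fun moveAndRotate() {
            moveToPresetPosition()
            rotateByPresetAngle()
        }
    }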
Optionally, in an embodiment of the present invention, the display screen corresponding to the first display area is a scroll display screen.
The display method provided by the embodiment of the invention may further include step 202f described below.
Step 202f, the wearable electronic device controls the display screen corresponding to the first display area to stretch along the target direction.
Optionally, in this embodiment of the present invention, the wearable electronic device may control the display screen corresponding to the first display area to extend along the target direction.
It should be noted that the scroll display screen may be understood as: a display screen that the wearable electronic device can control to extend (i.e., enter an extended state) or retract (i.e., enter a retracted state).
Optionally, in an embodiment of the present invention, the display screen corresponding to the first display area may include a reel and a flexible display screen. The reel may be connected to the wearable electronic device through a chute of the wearable electronic device, one end of the flexible display screen is fixedly connected to the reel, and the other end of the flexible display screen may be connected to the wearable electronic device through the chute of the wearable electronic device. After the display screen corresponding to the first display area moves to the preset position along the first direction, the wearable electronic device may control the reel to stop moving and control the other end of the flexible display screen to move along the target direction, so as to control the first display screen to unfold along the target direction.
Exemplarily, referring to fig. 2, as shown in fig. 5A, the display screen 12 includes a reel 121 and a flexible display screen 122; the reel 121 is connected to the VR glasses 10 through a chute of the VR glasses 10, one end of the flexible display screen 122 is fixedly connected to the reel 121, and the other end of the flexible display screen 122 is connected to the VR glasses 10 through the chute of the VR glasses 10, so that the VR glasses 10 can control the display screen 12 to unfold. As shown in fig. 5B, after the display screen 12 is moved to the preset position, the VR glasses 10 may control the reel 121 to stop moving and control the other end of the flexible display screen 122 to move along the target direction 18, so as to control the display screen 12 to unfold along the target direction 18.
It can be appreciated that, by enlarging the size of the display screen corresponding to the first display area, the wearable electronic device allows the first display area to present more display content.
Optionally, in this embodiment of the present invention, the target direction may be the same as the second direction.
In the embodiment of the invention, the wearable electronic device can control the display screen corresponding to the first display area to be unfolded so as to enlarge the display size of the first display area, so that a user can receive more display contents from the first display area, and the use experience of the user can be improved.
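A rough sketch of the scroll display screen control described above, assuming a hypothetical reel motor interface; the dimensions and limits are placeholders, not values from the patent.

    // Hypothetical control of the rollable (scroll) first display screen.
    interface ReelMotor {
        fun hold()                         // stop the reel so that only the free end moves
        fun extendFreeEnd(lengthMm: Float)
        fun retractFreeEnd(lengthMm: Float)
    }

    class RollableFirstDisplay(
        private val motor: ReelMotor,
        private val maxExtensionMm: Float = 30f
    ) {
        private var extendedMm = 0f

        // After the first display screen has reached the preset position, unroll the flexible
        // display screen along the target direction to enlarge the first display area.
        fun unfoldAlongTargetDirection(lengthMm: Float) {
            val step = minOf(lengthMm, maxExtensionMm - extendedMm)
            if (step <= 0f) return
            motor.hold()
            motor.extendFreeEnd(step)
            extendedMm += step
        }

        // Roll the flexible display screen back to its retracted state.
        fun foldBack() {
            if (extendedMm > 0f) {
                motor.retractFreeEnd(extendedMm)
                extendedMm = 0f
            }
        }
    }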
Optionally, in the embodiment of the present invention, after the step 202, the display method provided in the embodiment of the present invention may further include the following step 203 or step 204.
Step 203, the wearable electronic device displays the user interface of the target application in the first display area, and displays the picture of the external environment where the wearable electronic device is located in the second display area.
Optionally, in the embodiment of the present invention, the user interface may specifically be the first picture.
Optionally, in the embodiment of the present invention, the wearable electronic device may collect a picture of an external environment (i.e., an external environment where the wearable electronic device is located) through a camera of the wearable electronic device, so as to display the picture of the external environment in the second display area.
It can be appreciated that after the wearable electronic device controls the first display area to move relative to the second display area, the user may still need to view the screen in the target application, so that the wearable electronic device can display the user interface of the target application in the first display area and display the screen of the external environment in the second display area.
In the embodiment of the invention, the wearable electronic device can display the user interface of the target application in the first display area and display the picture of the external environment in the second display area. The user can thus use the target application by viewing the first display area and learn about the surrounding environment by viewing the second display area, which avoids collisions in real space and improves the convenience of using the wearable electronic device.
And step 204, the wearable electronic device displays the user interface of the target application in the second display area, and displays the picture of the external environment where the wearable electronic device is located in the first display area.
It can be appreciated that after the wearable electronic device controls the first display area to move relative to the second display area, the user may still need to view the screen in the target application, so that the wearable electronic device can display the screen of the external environment in the first display area and display the screen in the target application in the second display area.
In the embodiment of the invention, the wearable electronic device can display the picture of the external environment in the first display area and display the user interface of the target application in the second display area. The user can thus learn about the surrounding environment by viewing the first display area and use the target application by viewing the second display area, which avoids collisions in real space and improves the convenience of using the wearable electronic device.
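The content assignment of steps 203 and 204 can be summarized, purely as an illustrative assumption, by the small Python sketch below; the function and parameter names are introduced here and are not part of the embodiment.

```python
# Hypothetical sketch of steps 203/204: after the first display area has moved,
# the application interface and the camera picture of the external environment
# can be assigned to either display area.

def assign_content(app_ui: str, environment_frame: str,
                   app_on_first_area: bool) -> dict:
    """Return which content each display area should show (step 203 vs step 204)."""
    if app_on_first_area:
        # Step 203: application in the first area, surroundings in the second area.
        return {"first_area": app_ui, "second_area": environment_frame}
    # Step 204: surroundings in the first area, application in the second area.
    return {"first_area": environment_frame, "second_area": app_ui}

print(assign_content("game UI", "camera frame", app_on_first_area=True))
```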
Optionally, in this embodiment of the present invention, the wearable electronic device may also change a display parameter (for example, a display position or a display state) of the display area of the wearable electronic device in another manner (that is, the manner described in step 205 or step 206 below), so that the user can learn about the external environment and avoid colliding with objects in real space.
Optionally, in the embodiment of the present invention, step 202 may be replaced with step 205.
Step 205, the wearable electronic device controls the first display area and the second display area to move according to the working state of the wearable electronic device or the use condition of the wearable electronic device by the user.
Optionally, in this embodiment of the present invention, the wearable electronic device may control the target display screen to move, thereby controlling the first display area and the second display area to move.
Optionally, in an embodiment of the present invention, the target display screen includes at least one of the following: a first display screen (the display screen corresponding to the first display area) and a second display screen (the display screen corresponding to the second display area).
Optionally, in the embodiment of the present invention, one end of the first display screen and one end of the second display screen are both connected to the wearable electronic device through a chute of the wearable electronic device, so that the wearable electronic device can control the first display screen and the second display screen to move relative to the wearable electronic device along a direction of the chute. It is to be understood that the first display screen and the second display screen may be considered as a whole, moving together relative to the wearable electronic device.
Optionally, in the embodiment of the present invention, when the target display screen is a motion display screen, one end of the first display screen and one end of the second display screen are both connected to the wearable electronic device through a rotating shaft of the wearable electronic device, so that the wearable electronic device can control the first display screen and the second display screen to rotate around the rotating shaft relative to the wearable electronic device.
It is understood that the first display screen and the second display screen can be considered as a whole and rotate together around the rotating shaft relative to the wearable electronic device.
In the embodiment of the invention, the wearable electronic device can control the target display screen to move to a certain position (that is, a position outside the user's visual core area), thereby changing the position of the target display screen relative to the wearable electronic device, so that the user can focus the line of sight on the environment where the user is located (that is, the environment where the wearable electronic device is located).
In the embodiment of the invention, according to the working state of the wearable electronic device or the user's usage of the wearable electronic device, the wearable electronic device can control the first display area and the second display area to move to the preset positions, that is, move both display areas out of the user's visual core area; or control the first display area and the second display area to rotate by the preset angle value, that is, rotate both display areas out of the user's visual core area. The user can thus perceive the surrounding environment, collisions in real space are avoided, and the convenience of using the wearable electronic device is improved.
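A minimal Python sketch of step 205, under the assumption that both display screens are driven as a single assembly, is given below; the class name, the preset offset, and the preset angle are illustrative values only.

```python
# Hypothetical sketch of step 205: the first and second display screens are
# treated as one unit and either slid along the chute to a preset position or
# rotated around the shaft by a preset angle.

from dataclasses import dataclass

@dataclass
class DisplayAssembly:
    offset_mm: float = 0.0   # translation of both screens along the chute
    angle_deg: float = 0.0   # rotation of both screens around the shaft

def move_out_of_visual_core(assembly: DisplayAssembly, use_rotation: bool) -> DisplayAssembly:
    """Move both display areas out of the user's visual core area together."""
    if use_rotation:
        assembly.angle_deg += 30.0    # assumed preset angle value
    else:
        assembly.offset_mm += 25.0    # assumed preset position offset
    return assembly

print(move_out_of_visual_core(DisplayAssembly(), use_rotation=False))
```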
Optionally, in the embodiment of the present invention, the step 202 may be replaced with the step 206.
Step 206, the wearable electronic device adjusts the display state of the first display area according to the working state of the wearable electronic device or the use condition of the wearable electronic device by the user.
Optionally, in this embodiment of the present invention, the wearable electronic device may adjust a display state of the first display screen to adjust a display state of the first display area.
Optionally, in this embodiment of the present invention, the first display screen may be a display screen with high transparency.
It should be noted that the above "display screen with high transparency" can be understood as follows: the wearable electronic device may adjust the transparency value of some (or all) regions of the first display screen so that those regions do not display content, that is, the user can observe the surrounding environment through those regions.
Optionally, in the embodiment of the present invention, the wearable electronic device may adjust the display state of the first display screen by adjusting display screen parameters of the first display screen; alternatively, the wearable electronic device may adjust the display state of the first display screen by displaying an image in a region of the first display screen (e.g., an image of an environment in which the wearable electronic device is located).
In the embodiment of the invention, the wearable electronic device can adjust the display state of the first display area so that the user can focus the line of sight on the environment where the user is located (that is, the environment where the wearable electronic device is located), and the power consumption of the wearable electronic device can also be reduced.
It can be understood that the wearable electronic device may adjust the display state of the first display screen by adjusting a display parameter of a certain area in the first display area (that is, the user's visual core area), for example, displaying no content in that area or displaying an image of the environment where the user is located, so that the user can learn about the surrounding environment through the adjusted first display screen.
Optionally, in the embodiment of the present invention, the step 206 may be specifically implemented by the following step 206a or step 206b.
Step 206a, the wearable electronic device controls the target display area in the first display area not to display the content according to the working state of the wearable electronic device or the use condition of the wearable electronic device by the user.
In an embodiment of the present invention, the target display area is used to indicate a gazing area of a user's sight line in the first display area.
Optionally, in this embodiment of the present invention, the target display area may be a whole area or a partial area of the first display area.
Optionally, in this embodiment of the present invention, the target display area may be a central area (e.g., a user visual core area) in the first display area.
Optionally, in this embodiment of the present invention, the display size of the target display area may be determined by any one of the following: a default display size in the setting parameters, historical usage information of the user for the first display area, the size of the gazing area of the user's sight on the first display area, or a display size set by the user.
Optionally, in the embodiment of the present invention, when the user uses the wearable electronic device, the wearable electronic device may collect an eye image of the user through a camera of the wearable electronic device, determine, according to the eye image, the size of the gazing area of the user's sight on the first display area, and determine that size as the display size of the target display area.
Optionally, in the embodiment of the present invention, when using the wearable electronic device, the user may set the display size of the target display area in the wearable electronic device in advance, so that the wearable electronic device determines the display size set by the user as the display size of the target display area.
Optionally, in this embodiment of the present invention, the wearable electronic device may adjust (e.g., increase) a transparency value of the target display area in the first display area, so that the target display area does not display the content, and thus the user may focus the line of sight in the environment where the user is located through the target display area.
In the embodiment of the invention, according to the working state of the wearable electronic device or the user's usage of the wearable electronic device, the wearable electronic device can control the target display area in the first display area not to display content, that is, control the user's visual core area within the first display area to display nothing. The user can thus perceive the surrounding environment, collisions in real space are avoided, and the convenience of using the wearable electronic device is improved.
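Step 206a can be pictured, as an illustrative assumption only, with the Python sketch below: the target display area is sized from one of the listed sources and its transparency is raised so that no content is rendered there; all names and the default fraction are hypothetical.

```python
# Hypothetical sketch of step 206a: determine the target display area (the gaze
# region of the first display area) and stop rendering content there by making
# it fully transparent.

from typing import List, Optional

def target_area_size(user_set_size: Optional[float],
                     gaze_area_size: Optional[float],
                     default_size: float = 0.3) -> float:
    """Pick the target-area size (as a fraction of the first display area) from a
    user setting, the measured gaze area, or a default from the setting parameters."""
    if user_set_size is not None:
        return user_set_size
    if gaze_area_size is not None:
        return gaze_area_size
    return default_size

def clear_gaze_region(alpha_map: List[List[float]], size_fraction: float) -> None:
    """Set the central gaze region of a per-pixel alpha map to fully transparent,
    so no content is rendered there and the real scene shows through."""
    rows, cols = len(alpha_map), len(alpha_map[0])
    r0, r1 = int(rows * (0.5 - size_fraction / 2)), int(rows * (0.5 + size_fraction / 2))
    c0, c1 = int(cols * (0.5 - size_fraction / 2)), int(cols * (0.5 + size_fraction / 2))
    for r in range(r0, r1):
        for c in range(c0, c1):
            alpha_map[r][c] = 0.0

alpha = [[1.0] * 8 for _ in range(8)]                 # 8x8 stand-in for the first display area
clear_gaze_region(alpha, target_area_size(None, 0.5))
print(sum(v == 0.0 for row in alpha for v in row))    # -> 16 transparent pixels
```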
And step 206b, the wearable electronic device controls a target display area in the first display area to display the first image according to the working state of the wearable electronic device or the use condition of the wearable electronic device by the user.
In an embodiment of the present invention, the first image is an environment image collected by a camera of the wearable electronic device.
In the embodiment of the invention, the wearable electronic device can control the target display area in the first display area to display the environment image, so that the user can determine the environment where the user is currently located by viewing the environment image on the first display area.
It is understood that the user may determine whether a first object (e.g., an obstacle, etc.) is present in the environment in which the user is located based on the first image to avoid the user colliding with the first object.
Optionally, in the embodiment of the present invention, before the step 206b, the display method provided in the embodiment of the present invention may further include the following step 301.
Step 301, when an environment parameter value of an external environment where the wearable electronic device is located is less than or equal to a second preset threshold, the wearable electronic device acquires a first image through an infrared camera of the wearable electronic device.
Optionally, in this embodiment of the present invention, the environmental parameter value may include at least one of the following: light intensity values and spectral energy values.
Optionally, in the embodiment of the present invention, the wearable electronic device may acquire, by using the illuminance sensor of the wearable electronic device, a light intensity value of an external environment where the wearable electronic device is located, so that the wearable electronic device may acquire, by using the infrared camera of the wearable electronic device, the first image when the light intensity value is smaller than or equal to the second preset threshold.
It can be understood that when the wearable electronic device is in a dark environment (that is, the light intensity value is less than or equal to the second preset threshold), the wearable electronic device can clearly capture an image of the external environment through the infrared camera. By viewing the environment image on the first display area and the second display area, the user can determine whether a first object (e.g., an obstacle) exists in the surroundings and avoid colliding with it.
In the embodiment of the invention, when the environment parameter value is less than or equal to the second preset threshold (that is, the wearable electronic device is in a dark environment), the wearable electronic device can clearly capture an environment image of the user's surroundings through the infrared camera. By viewing this environment image, the user can learn about the external environment, collisions in real space are avoided, and the convenience of using the wearable electronic device is improved.
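A minimal Python sketch of steps 206b and 301, assuming simple stand-ins for the illuminance sensor and the two cameras, is shown below; the threshold value and all names are hypothetical and do not describe a real device API.

```python
# Hypothetical sketch of steps 206b/301: when the light intensity is at or below
# the threshold, capture the environment image with the infrared camera instead
# of the ordinary camera, then show it in the target display area.

LIGHT_THRESHOLD_LUX = 10.0   # stand-in for the "second preset threshold"

def capture_environment_image(light_intensity_lux, rgb_camera, infrared_camera):
    """Pick the camera according to the ambient light intensity and grab one frame."""
    if light_intensity_lux <= LIGHT_THRESHOLD_LUX:
        return infrared_camera()   # dark environment: the infrared image is clearer
    return rgb_camera()

def show_in_target_area(frame):
    """Step 206b: render the captured environment image in the target display area."""
    return "target display area <- " + frame

# Stand-in capture callables, used only to make the sketch runnable.
frame = capture_environment_image(3.0,
                                  rgb_camera=lambda: "rgb frame",
                                  infrared_camera=lambda: "infrared frame")
print(show_in_target_area(frame))  # light below threshold -> infrared frame is shown
```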
Fig. 7 shows a schematic diagram of a possible structure of an electronic device related to an embodiment of the present invention, where the electronic device is a wearable electronic device, and the electronic device has a first display area and a second display area, and a size of the first display area is smaller than a size of the second display area. As shown in fig. 7, the electronic device 90 may include: a display module 91 and a control module 92. The display module 91 is configured to display a first image through the first display area or the second display area. And the control module 92 is used for controlling the first display area to move relative to the second display area according to the working state of the wearable electronic device or the use condition of the wearable electronic device by the user.
In a possible implementation manner, the control module 92 is specifically configured to control the first display area to move relative to the second display area when an external environment where the wearable electronic device is located meets a first preset condition; or, the control module 92 is specifically configured to control the first display area to move relative to the second display area when the wearable electronic device starts the target application; or, the control module 92 is specifically configured to control the first display area to move relative to the second display area when a state parameter of the wearable electronic device meets a second preset condition, where the state parameter includes at least one of a movement speed and a movement acceleration; or, the control module 92 is specifically configured to control the first display area to move relative to the second display area when the wearable electronic device does not respond to the operation of the user within a first preset time; alternatively, the control module 92 is specifically configured to control the first display area to move relative to the second display area when the wearable electronic device receives a user operation.
In a possible implementation manner, the external environment where the wearable electronic device is located meets the first preset condition includes: a first object appears in a viewing angle range of the wearable electronic device; alternatively, a state of the first object in a viewing angle range of the wearable electronic device changes.
In a possible implementation manner, the control module 92 is specifically configured to control the first display area to move to a preset position; or, the control module 92 is specifically configured to control the first display area to rotate by a preset angle value.
In a possible implementation manner, the display screen corresponding to the first display area is a scroll display screen. The control module 92 is further configured to control the display screen corresponding to the first display area to extend along the target direction after controlling the first display area to move to the preset position.
In a possible implementation manner, the display module 91 is further configured to display a user interface of the target application in the first display area after the control module 92 controls the first display area to move relative to the second display area, and display a picture of an external environment where the wearable electronic device is located in the second display area; or, the display module 91 is further configured to display the user interface of the target application in the second display area after the control module 92 controls the first display area to move relative to the second display area, and display a picture of an external environment where the wearable electronic device is located in the first display area.
In a possible implementation manner, the control module 92 is specifically configured to control a target display area in the first display area not to display content, where the target display area is used to indicate a gazing area of the user's sight line in the first display area; or, the control module 92 is specifically configured to control a target display area in the first display area to display a first image, where the first image is an environment image acquired by a camera of the wearable electronic device.
In a possible implementation manner, the display size of the target display area is determined by any one of the following: a default display size in the setting parameters, historical usage information of the user for the first display area, the size of the gazing area of the user's sight on the first display area, or a display size set by the user.
In a possible implementation manner, with reference to fig. 7, as shown in fig. 8, an electronic device 90 provided in an embodiment of the present invention may further include: an acquisition module 93. The acquiring module 93 is configured to acquire the first image through an infrared camera of the wearable electronic device when an environmental parameter value of an external environment where the wearable electronic device is located is less than or equal to a second preset threshold before the control module 92 controls the target display area in the first display area to display the first image.
The electronic device provided by the embodiment of the present invention can implement each process implemented by the electronic device in the above method embodiments, and for avoiding repetition, detailed descriptions are not repeated here.
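For orientation only, the following Python sketch mirrors the module split of figs. 7 and 8 (display module 91, control module 92, acquiring module 93); the class and method names and the trigger logic are illustrative assumptions, not the claimed implementation.

```python
# Hypothetical sketch of the module structure: a display module, a control module,
# and an acquiring module wired together in one device class.

class DisplayModule:
    def show(self, area: str, content: str) -> None:
        print(f"{area}: {content}")

class ControlModule:
    def should_move(self, obstacle_in_view: bool, app_started: bool,
                    speed: float, speed_limit: float = 1.0) -> bool:
        """Any of the listed working-state / usage conditions triggers the move."""
        return obstacle_in_view or app_started or speed >= speed_limit

    def move_first_area(self) -> None:
        print("first display area moves relative to the second display area")

class AcquiringModule:
    def first_image(self, light_lux: float, threshold: float = 10.0) -> str:
        """Use the infrared camera when the ambient light is at or below the threshold."""
        return "infrared image" if light_lux <= threshold else "camera image"

class WearableElectronicDevice:
    def __init__(self) -> None:
        self.display = DisplayModule()
        self.control = ControlModule()
        self.acquire = AcquiringModule()

    def update(self, obstacle: bool, app_started: bool, speed: float, light_lux: float) -> None:
        self.display.show("first or second display area", self.acquire.first_image(light_lux))
        if self.control.should_move(obstacle, app_started, speed):
            self.control.move_first_area()

WearableElectronicDevice().update(obstacle=True, app_started=False, speed=0.2, light_lux=50.0)
```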
The embodiment of the invention provides an electronic device. The electronic device can determine, according to the working state of the wearable electronic device or the user's usage of the wearable electronic device, whether the user needs to learn about the external environment. When the user does need to learn about the environment, the wearable electronic device can control the first display area to move relative to the second display area, changing the display position of the first display area relative to the second display area, so that the user can observe the external environment through the first display area whose display position has been changed. This avoids collisions in real space and improves the convenience of using the wearable electronic device.
Fig. 9 is a hardware schematic diagram of an electronic device implementing various embodiments of the invention. As shown in fig. 9, electronic device 100 includes, but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111.
It should be noted that, as will be understood by those skilled in the art, the electronic device structure shown in fig. 9 does not constitute a limitation of the electronic device; the electronic device may include more or fewer components than those shown in fig. 9, combine some components, or arrange the components differently. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 110 is configured to control the display unit 106 to display a first image through the first display area or the second display area; and controlling the first display area to move relative to the second display area according to the working state of the wearable electronic equipment or the use condition of the wearable electronic equipment by the user.
The embodiment of the invention provides an electronic device. The electronic device can determine, according to the working state of the wearable electronic device or the user's usage of the wearable electronic device, whether the user needs to learn about the external environment. When the user does need to learn about the environment, the wearable electronic device can control the first display area to move relative to the second display area, changing the display position of the first display area relative to the second display area, so that the user can observe the external environment through the first display area whose display position has been changed. This avoids collisions in real space and improves the convenience of using the wearable electronic device.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used for receiving and sending signals during a message transmission or call process, and specifically, after receiving downlink data from a base station, the downlink data is processed by the processor 110; in addition, the uplink data is transmitted to the base station. Typically, radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 102, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102 or stored in the memory 109 into an audio signal and output as sound. Also, the audio output unit 103 may also provide audio output related to a specific function performed by the electronic apparatus 100 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive an audio or video signal. The input unit 104 may include a Graphics Processing Unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or other storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and process such sound into audio data. In the case of a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 101 and output.
The electronic device 100 also includes at least one sensor 105, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 106 is used to display information input by a user or information provided to the user. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. Touch panel 1071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 1071 (e.g., operations by a user on or near touch panel 1071 using a finger, stylus, or any suitable object or attachment). The touch panel 1071 may include two parts of a touch detection device and a touch controller. The touch detection device detects the touch direction of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch sensing device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 110, and receives and executes commands sent by the processor 110. In addition, the touch panel 1071 may be implemented in various types, such as a resistive type, a capacitive type, an infrared ray, and a surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 9, the touch panel 1071 and the display panel 1061 are two independent components to implement the input and output functions of the electronic device, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the electronic device, and is not limited herein.
The interface unit 108 is an interface for connecting an external device to the electronic apparatus 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 100 or may be used to transmit data between the electronic apparatus 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 109 may include high speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 110 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, performs various functions of the electronic device and processes data by operating or executing software programs and/or modules stored in the memory 109 and calling data stored in the memory 109, thereby performing overall monitoring of the electronic device. Processor 110 may include one or more processing units; alternatively, the processor 110 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 110.
The electronic device 100 may further include a power supply 111 (e.g., a battery) for supplying power to each component, and optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the electronic device 100 includes some functional modules that are not shown, and are not described in detail herein.
Optionally, an embodiment of the present invention further provides an electronic device, which includes the processor 110 shown in fig. 9, the memory 109, and a computer program stored in the memory 109 and capable of running on the processor 110, where the computer program, when executed by the processor 110, implements the processes of the foregoing method embodiment, and can achieve the same technical effect, and details are not described here to avoid repetition.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the processes of the method embodiments, and can achieve the same technical effects, and in order to avoid repetition, the details are not repeated here. The computer-readable storage medium may be, for example, a read-only memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling an electronic device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (14)

1. A display method is applied to a wearable electronic device, wherein the wearable electronic device is provided with a first display area and a second display area, the first display area and the second display area are display areas in different display screens, and the size of the first display area is smaller than that of the second display area, and the method comprises the following steps:
displaying a first picture through the first display area or the second display area;
and controlling the first display area to move relative to the second display area according to the working state of the wearable electronic equipment or the use condition of the wearable electronic equipment by a user.
2. The method of claim 1, wherein the controlling the first display area to move relative to the second display area according to the operating state of the wearable electronic device or the usage of the wearable electronic device by the user comprises:
controlling the first display area to move relative to the second display area under the condition that the external environment where the wearable electronic device is located meets a first preset condition;
or, in the case that the wearable electronic device starts a target application, controlling the first display area to move relative to the second display area;
or, the first display area is controlled to move relative to the second display area under the condition that the state parameters of the wearable electronic device meet a second preset condition, wherein the state parameters comprise at least one of a movement speed and a movement acceleration;
or, in the case that the wearable electronic device does not respond to the operation of the user within a first preset time, controlling the first display area to move relative to the second display area;
or, in the case that the wearable electronic device receives a user operation, controlling the first display area to move relative to the second display area.
3. The method of claim 2, wherein the external environment of the wearable electronic device meets a first preset condition comprises:
a first object appears in a viewing angle range of the wearable electronic device;
or,
a state of a first object in a range of viewing angles of the wearable electronic device changes.
4. The method of claim 1, wherein the controlling the first display region to move relative to the second display region comprises:
controlling the first display area to move to a preset position;
or,
and controlling the first display area to rotate by a preset angle value.
5. The method of claim 4, wherein the display screen corresponding to the first display area is a scroll display screen;
after the controlling the first display area to move to the preset position, the method further includes:
and controlling the display screen corresponding to the first display area to stretch along the target direction.
6. The method of claim 1, wherein after the controlling the movement of the first display region relative to the second display region, the method further comprises:
displaying a user interface of a target application in the first display area, and displaying a picture of an external environment where the wearable electronic device is located in the second display area;
or,
and displaying a user interface of a target application in the second display area, and displaying a picture of an external environment where the wearable electronic device is located in the first display area.
7. A wearable electronic device having a first display area and a second display area, the first display area and the second display area being display areas in different display screens, a size of the first display area being smaller than a size of the second display area, the wearable electronic device comprising: the display module and the control module;
the display module is used for displaying a first picture through the first display area or the second display area;
the control module is used for controlling the first display area to move relative to the second display area according to the working state of the wearable electronic equipment or the use condition of the wearable electronic equipment by a user.
8. The wearable electronic device according to claim 7, wherein the control module is configured to control the first display area to move relative to the second display area when an external environment where the wearable electronic device is located meets a first preset condition;
or, the control module is specifically configured to control the first display area to move relative to the second display area when the wearable electronic device starts a target application;
or the control module is specifically configured to control the first display area to move relative to the second display area when a state parameter of the wearable electronic device meets a second preset condition, where the state parameter includes at least one of a movement speed and a movement acceleration;
or, the control module is specifically configured to control the first display area to move relative to the second display area when the wearable electronic device does not respond to the operation of the user within a first preset time;
or, the control module is specifically configured to control the first display area to move relative to the second display area when the wearable electronic device receives a user operation.
9. The wearable electronic device of claim 8, wherein the external environment of the wearable electronic device meets a first preset condition comprises:
a first object appears in a viewing angle range of the wearable electronic device;
or,
a state of a first object in a range of viewing angles of the wearable electronic device changes.
10. The wearable electronic device of claim 7, wherein the control module is configured to control the first display area to move to a predetermined position;
or,
the control module is specifically used for controlling the first display area to rotate by a preset angle value.
11. The wearable electronic device of claim 10, wherein the display screen corresponding to the first display area is a scroll-type display screen;
the control module is further configured to control the display screen corresponding to the first display area to extend along a target direction after controlling the first display area to move to the preset position.
12. The wearable electronic device of claim 7, wherein the display module is further configured to display a user interface of a target application in the first display area and display a picture of an external environment where the wearable electronic device is located in the second display area after the control module controls the first display area to move relative to the second display area;
or,
the display module is further configured to display a user interface of a target application in the second display area after the control module controls the first display area to move relative to the second display area, and display a picture of an external environment where the wearable electronic device is located in the first display area.
13. Wearable electronic device, characterized in that it comprises a processor, a memory and a computer program stored on the memory and executable on the processor, which computer program, when executed by the processor, carries out the steps of the display method according to any one of claims 1 to 6.
14. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the display method according to any one of claims 1 to 6.
CN202010224965.4A 2020-03-26 2020-03-26 Display method and wearable electronic equipment Active CN111443805B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010224965.4A CN111443805B (en) 2020-03-26 2020-03-26 Display method and wearable electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010224965.4A CN111443805B (en) 2020-03-26 2020-03-26 Display method and wearable electronic equipment

Publications (2)

Publication Number Publication Date
CN111443805A CN111443805A (en) 2020-07-24
CN111443805B true CN111443805B (en) 2022-04-08

Family

ID=71654505

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010224965.4A Active CN111443805B (en) 2020-03-26 2020-03-26 Display method and wearable electronic equipment

Country Status (1)

Country Link
CN (1) CN111443805B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114089814A (en) * 2021-11-08 2022-02-25 广东乐心医疗电子股份有限公司 Display method and device and electronic equipment

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104759095A (en) * 2015-04-24 2015-07-08 吴展雄 Virtual reality head wearing display system
CN106646876A (en) * 2016-11-25 2017-05-10 捷开通讯(深圳)有限公司 Head-mounted display system and safety prompting method thereof
KR20170097521A (en) * 2016-02-18 2017-08-28 삼성전자주식회사 Wearable electronic device having plurality of display and screen providing method thereof
CN107688238A (en) * 2016-08-04 2018-02-13 陈立旭 A kind of VR reality helmet-mounted display system

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9081177B2 (en) * 2011-10-07 2015-07-14 Google Inc. Wearable computer with nearby object response
CN104750249B (en) * 2015-03-02 2020-02-21 联想(北京)有限公司 Information processing method and electronic equipment
CN109213271A (en) * 2017-07-07 2019-01-15 富泰华工业(深圳)有限公司 display screen structure and wearable electronic device
TWI652023B (en) * 2018-01-11 2019-03-01 廣達電腦股份有限公司 Head mounted display device
CN108376019A (en) * 2018-05-28 2018-08-07 Oppo广东移动通信有限公司 Electronic device
CN109976457A (en) * 2019-04-15 2019-07-05 Oppo(重庆)智能科技有限公司 Intelligent wearable device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104759095A (en) * 2015-04-24 2015-07-08 吴展雄 Virtual reality head wearing display system
KR20170097521A (en) * 2016-02-18 2017-08-28 삼성전자주식회사 Wearable electronic device having plurality of display and screen providing method thereof
CN107688238A (en) * 2016-08-04 2018-02-13 陈立旭 A kind of VR reality helmet-mounted display system
CN106646876A (en) * 2016-11-25 2017-05-10 捷开通讯(深圳)有限公司 Head-mounted display system and safety prompting method thereof

Also Published As

Publication number Publication date
CN111443805A (en) 2020-07-24

Similar Documents

Publication Publication Date Title
CN110769155B (en) Camera control method and electronic equipment
CN108628515B (en) Multimedia content operation method and mobile terminal
CN109032486B (en) Display control method and terminal equipment
CN110174993B (en) Display control method, terminal equipment and computer readable storage medium
CN109710349B (en) Screen capturing method and mobile terminal
CN111147743B (en) Camera control method and electronic equipment
CN109257505B (en) Screen control method and mobile terminal
CN110489045B (en) Object display method and terminal equipment
CN111031253B (en) Shooting method and electronic equipment
CN111385415B (en) Shooting method and electronic equipment
CN111562896B (en) Screen projection method and electronic equipment
CN111031234B (en) Image processing method and electronic equipment
US20220286622A1 (en) Object display method and electronic device
CN110830713A (en) Zooming method and electronic equipment
CN111314616A (en) Image acquisition method, electronic device, medium and wearable device
CN111399792B (en) Content sharing method and electronic equipment
CN110866465A (en) Control method of electronic equipment and electronic equipment
CN110457885B (en) Operation method and electronic equipment
CN109859718B (en) Screen brightness adjusting method and terminal equipment
CN109104573B (en) Method for determining focusing point and terminal equipment
CN111240567A (en) Display screen angle adjusting method and electronic equipment
CN110769154A (en) Shooting method and electronic equipment
CN111443805B (en) Display method and wearable electronic equipment
CN109739430B (en) Display method and mobile terminal
CN111026263B (en) Audio playing method and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant