CN111142772A - Content display method and wearable device - Google Patents

Content display method and wearable device

Info

Publication number
CN111142772A
Authority
CN
China
Prior art keywords
user
wearable device
screen
area
content
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201911358662.5A
Other languages
Chinese (zh)
Inventor
王丰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Hangzhou Co Ltd
Original Assignee
Vivo Mobile Communication Hangzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Hangzhou Co Ltd filed Critical Vivo Mobile Communication Hangzhou Co Ltd
Priority to CN201911358662.5A priority Critical patent/CN111142772A/en
Publication of CN111142772A publication Critical patent/CN111142772A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

An embodiment of the invention provides a content display method and a wearable device, relates to the field of communication technology, and can solve the problem that a user cannot attend to both driving safety and viewing a navigation route, resulting in poor human-computer interaction performance. The scheme comprises the following steps: displaying target content shared by an electronic device in a first area of a screen of the wearable device; and updating the display position of the target content to a second area of the screen when the user wears the wearable device. The second area comprises at least one of: a preset position, a touch position of the user on the screen, the position of the gaze point of the user's eyes on the screen, and a position corresponding to a target area in the user's field of view, where the scene change amplitude of the target area within a preset duration is smaller than a preset threshold. The scheme can be applied to a scenario of viewing a navigation route while driving.

Description

Content display method and wearable device
Technical Field
The embodiments of the present invention relate to the field of communication technology, and in particular to a content display method and a wearable device.
Background
With the increasing intelligence of electronic devices, the types of applications that electronic devices can support are also increasing.
Taking a navigation application program (hereinafter referred to as a navigation application) in an electronic device as an example, a user can currently perform route navigation through the navigation application. After the user starts route navigation through the navigation application, the user may proceed to the destination according to the navigation route displayed by the electronic device.
However, if the user is in a driving state after starting route navigation through the navigation application, the user has to view the navigation route displayed by the electronic device while driving the automobile. As a result, the user cannot attend to both driving safety and viewing the navigation route, and the human-computer interaction performance is poor.
Disclosure of Invention
An embodiment of the present invention provides a content display method and a wearable device, aiming to solve the problem that a user cannot attend to both driving safety and viewing a navigation route, resulting in poor human-computer interaction performance.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present invention provides a content display method applied to a wearable device having a screen. The method includes: displaying target content shared by an electronic device in a first area of the screen of the wearable device; and updating the display position of the target content to a second area of the screen when the user wears the wearable device. The second area comprises at least one of: a preset position, a touch position of the user on the screen, the position of the gaze point of the user's eyes on the screen, and a position corresponding to a target area in the user's field of view, where the scene change amplitude of the target area within a preset duration is smaller than a preset threshold.
In a second aspect, an embodiment of the present invention provides a content display method, where the method is applied to an electronic device, and the method includes: receiving a second input of the user; and in response to the second input, sharing the target content with the wearable device. The target content is an image of a first interface displayed by the electronic equipment and collected by the electronic equipment.
In a third aspect, an embodiment of the present invention provides a wearable device, which may include a display module. The display module is configured to display target content shared by an electronic device in a first area of the screen of the wearable device, and to update the display position of the target content to a second area of the screen when the user wears the wearable device. The second area includes at least one of: a preset position, a touch position of the user on the screen, the position of the gaze point of the user's eyes on the screen, and a position corresponding to a target area in the user's field of view, where the scene change amplitude of the target area within a preset duration is smaller than a preset threshold.
In a fourth aspect, an embodiment of the present invention provides an electronic device, which may include a receiving module and a sending module. The receiving module is used for receiving a second input of the user; the sending module is used for responding to the second input received by the receiving module, and sharing target content to the wearable device, wherein the target content is an image of a first interface displayed by the electronic device and acquired by the electronic device.
In a fifth aspect, embodiments of the present invention provide a wearable device, which includes a processor, a memory, and a computer program stored on the memory and operable on the processor, and which, when executed by the processor, implements the steps of the content display method as in the first aspect described above.
In a sixth aspect, embodiments of the present invention provide an electronic device, which includes a processor, a memory, and a computer program stored on the memory and operable on the processor, and when executed by the processor, implement the steps of the content display method in the second aspect.
In a seventh aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, and the computer program, when executed by a processor, implements the steps of the content display method in the first aspect or the second aspect.
In the embodiment of the present invention, when the wearable device receives target content shared by the electronic device, the wearable device can display the target content in a first area of its screen; then, when the user wears the wearable device, the wearable device may update the display position of the target content to a second area of the screen. The second area comprises at least one of: a preset position, a touch position of the user on the screen, the position of the gaze point of the user's eyes on the screen, and a position corresponding to a target area in the user's field of view (whose scene change amplitude within a preset duration is smaller than a preset threshold). According to this scheme, after the user starts route navigation through a navigation application in the electronic device and then drives a vehicle, if the user needs to view the navigation route displayed by the electronic device, the user can trigger the electronic device to send an image of the interface displaying the navigation route to the wearable device. The wearable device may then display the image in a first area of its screen. If the user is wearing the wearable device, the wearable device may move the image from the first area to a second area that does not obstruct the user's view. In this way, the user can view the navigation route directly through the wearable device while driving, without looking at the navigation route displayed by the electronic device; that is, the user can attend to both driving safety and viewing the navigation route, thereby improving human-computer interaction performance.
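The two-step display behavior summarized above (show shared content in a first area, then move it to a non-obstructing second area once the device is worn) can be sketched as illustrative logic. All class and function names below are hypothetical and stand in for the wearable OS's display and sensor APIs; this is a sketch of the claimed flow, not an implementation from the patent:

```python
# Illustrative sketch of the claimed two-step display flow.
# All names are hypothetical; a real device would use its operating
# system's display and wear-detection APIs.

class WearableDisplay:
    def __init__(self, first_area, second_area):
        self.first_area = first_area      # default area for shared content
        self.second_area = second_area    # non-obstructing area (e.g. a preset position)
        self.content_position = None

    def on_content_received(self, target_content):
        # Step 1: display the shared target content in the first area.
        self.content_position = self.first_area
        return self.content_position

    def on_wear_detected(self, is_worn):
        # Step 2: if the user is wearing the device, update the display
        # position of the target content to the second area.
        if is_worn and self.content_position is not None:
            self.content_position = self.second_area
        return self.content_position


display = WearableDisplay(first_area=(0, 0), second_area=(120, 40))
display.on_content_received("navigation_route.png")   # shown in first area
pos = display.on_wear_detected(is_worn=True)          # moved to second area
```

The areas are modeled as plain coordinates only to make the position update concrete; the patent leaves the representation of screen areas open.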
Drawings
Fig. 1 is a schematic structural diagram of an android operating system according to an embodiment of the present invention;
FIG. 2 is a diagram illustrating a content display method according to an embodiment of the present invention;
fig. 3 is one of schematic interfaces of an application of a content display method according to an embodiment of the present invention;
fig. 4 is a second schematic interface diagram of an application of the content display method according to the embodiment of the present invention;
fig. 5 is a third schematic interface diagram of an application of the content display method according to the embodiment of the present invention;
fig. 6 is a schematic structural diagram of a wearable device according to an embodiment of the present invention;
fig. 7 is a second schematic structural diagram of a wearable device according to an embodiment of the present invention;
fig. 8 is a third schematic structural diagram of a wearable device according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
fig. 10 is a hardware schematic diagram of a wearable device provided in an embodiment of the present invention;
fig. 11 is a hardware schematic diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The term "and/or" herein describes an association relationship between associated objects, indicating that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. The symbol "/" herein denotes an "or" relationship between the associated objects; for example, A/B denotes A or B.
The terms "first" and "second," etc. herein are used to distinguish between different objects and are not used to describe a particular order of objects. For example, the first input and the second input are for distinguishing different inputs, rather than for describing a particular order of inputs.
In the embodiments of the present invention, words such as "exemplary" or "for example" are used to serve as examples, illustrations, or descriptions. Any embodiment or design described as "exemplary" or "for example" in the embodiments of the present invention is not to be construed as preferred or advantageous over other embodiments or designs; rather, the use of such words is intended to present related concepts in a concrete fashion.
In the description of the embodiments of the present invention, unless otherwise specified, "a plurality" means two or more, for example, a plurality of elements means two or more elements, and the like.
An embodiment of the present invention provides a content display method and a wearable device. When the wearable device receives target content shared by an electronic device, the wearable device can display the target content in a first area of its screen; then, when the user wears the wearable device, the wearable device may update the display position of the target content to a second area of the screen. The second area comprises at least one of: a preset position, a touch position of the user on the screen, the position of the gaze point of the user's eyes on the screen, and a position corresponding to a target area in the user's field of view (whose scene change amplitude within a preset duration is smaller than a preset threshold). According to this scheme, after the user starts route navigation through a navigation application in the electronic device and then drives a vehicle, if the user needs to view the navigation route displayed by the electronic device, the user can trigger the electronic device to send an image of the interface displaying the navigation route to the wearable device. The wearable device may then display the image in a first area of its screen. If the user is wearing the wearable device, the wearable device may move the image from the first area to a second area that does not obstruct the user's view. Therefore, the user can view the navigation route directly through the wearable device while driving, without looking at the navigation route displayed by the electronic device; that is, the user can attend to both driving safety and viewing the navigation route, thereby improving human-computer interaction performance.
The wearable device in the embodiments of the present invention may be a wearable device with an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system; the embodiments of the present invention are not specifically limited in this respect.
The following describes a software environment to which the content display method provided by the embodiment of the present invention is applied, by taking an android operating system as an example.
Fig. 1 is a schematic diagram of an architecture of a possible android operating system according to an embodiment of the present invention. In fig. 1, the architecture of the android operating system includes 4 layers, which are respectively: an application layer, an application framework layer, a system runtime layer, and a kernel layer (specifically, a Linux kernel layer).
The application program layer comprises various application programs (including system application programs and third-party application programs) in an android operating system. Illustratively, the application layer may include an application program for displaying an interface in an embodiment of the present invention. For example, the application may be any application that may display an interface, such as a navigation application, a communication application, and a browser application.
The application framework layer is a framework of the application, and a developer can develop some applications based on the application framework layer under the condition of complying with the development principle of the framework of the application.
The system runtime layer includes libraries (also called system libraries) and android operating system runtime environments. The library mainly provides various resources required by the android operating system. The android operating system running environment is used for providing a software environment for the android operating system.
The kernel layer is an operating system layer of an android operating system and belongs to the bottommost layer of an android operating system software layer. The kernel layer provides kernel system services and hardware-related drivers for the android operating system based on the Linux kernel.
Taking the Android operating system as an example, in the embodiments of the present invention, a developer may develop, based on the system architecture of the Android operating system shown in fig. 1, a software program implementing the content display method provided by the embodiments of the present invention, so that the content display method can run on the Android operating system shown in fig. 1. That is, the wearable device can implement the content display method provided by the embodiments of the present invention by running the software program on its Android operating system.
Optionally, in an embodiment of the present invention, the wearable device may be an electronic device having an Augmented Reality (AR) function. For example: AR glasses, AR helmets, and the like.
The execution body of the content display method provided by the embodiments of the present invention may be the wearable device, or a functional module and/or functional entity in the wearable device capable of implementing the method; this may be determined according to actual use requirements, and the embodiments of the present invention are not limited. The content display method provided by the embodiments of the present invention is exemplarily described below by taking the wearable device as the execution body.
The electronic device in the embodiment of the invention can be a mobile electronic device or a non-mobile electronic device. For example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiment of the present invention is not particularly limited.
The content display method provided by the embodiments of the present invention can be applied to scenarios in which a user cannot hold the electronic device but needs to view content displayed on its screen. Specifically, the method may be applied to a scenario in which a user views a navigation route (i.e., a navigated driving route) while driving, or to any other possible scenario, such as viewing a message received by the electronic device while tidying a room; this may be determined according to actual use requirements, and the embodiments of the present invention are not limited.
The following specifically describes the content display method provided by the embodiments of the present invention, taking as an example a scenario in which a user views a navigation route while driving. In the embodiment of the present invention, the user may perform route navigation through the electronic device while driving, so as to drive the vehicle to the desired destination according to the navigation route displayed by the electronic device. In this case, if the user wants to view the navigation route while driving, the user may trigger the electronic device to send an image of the interface displaying the navigation route to the wearable device, and the wearable device may then display the image in a first area of its screen. If the user is wearing the wearable device, the wearable device may move the image from the first area to a second area that does not obstruct the user's view. In this way, the user can view the navigation route directly through the wearable device while driving, without looking at the navigation route displayed by the electronic device; that is, the user can attend to both driving safety and viewing the navigation route, thereby improving human-computer interaction performance.
It can be understood that the above is an example of a scene in which the user views the navigation route during driving, and the content display method provided by the embodiment of the present invention is exemplarily described. For other scenes, the description of the content display method provided by the embodiment of the present invention is similar to the above exemplary scene, and for avoiding repetition, the description is omitted here.
The following describes an exemplary content display method according to an embodiment of the present invention with reference to the drawings.
As shown in fig. 2, an embodiment of the present invention provides a content display method, which may include S201 to S205 described below.
S201, the electronic equipment receives a second input of the user.
In an embodiment of the present invention, the second input may be an input of the user with respect to the first interface, that is, the second input may be an input performed by the user while the electronic device displays the first interface.
Optionally, in this embodiment of the present invention, the first interface may be a navigation interface. Specifically, in the embodiment of the present invention, after the user performs route navigation through the navigation application in the electronic device, an interface including an image of a navigation route, that is, the navigation interface, may be displayed on the screen of the electronic device.
Optionally, in the embodiment of the present invention, the navigation application may be: a navigation application installed in the electronic device (hereinafter referred to as navigation application 1); a navigation application not installed in the electronic device but supported by the browser application of the electronic device (hereinafter referred to as navigation application 2); or a navigation application not installed in the electronic device but supported by a communication application in the electronic device (hereinafter referred to as navigation application 3). Correspondingly, in the embodiment of the present invention, the first interface may be an interface displayed after the user performs route navigation through navigation application 1, an interface displayed after the user triggers navigation application 2 to perform route navigation, or an interface displayed after the user triggers navigation application 3 to perform route navigation.
Illustratively, in a case that the navigation application is the navigation application 3, the user may trigger the electronic device to invoke the navigation application 3 by clicking a "location" control in the communication application, and display the first interface through the navigation application 3.
Optionally, in this embodiment of the present invention, the first interface may include an image and/or other content (for example, a target control described below) of the navigation route. The method can be determined according to the use requirement, and the embodiment of the invention is not limited.
Optionally, in this embodiment of the present invention, the second input may be an input of the target control by the user, or may also be a voice input of the user, or may also be any possible input such as a gesture input of the user, which is not limited in this embodiment of the present invention.
For example, the target control may be a control in the first interface displayed by the electronic device (it may be understood that the target control is a control that triggers the electronic device to share (i.e., send) the target content to the wearable device). The voice input may be any possible voice input, such as the user speaking "share" or "send". The gesture input may be a gesture input matching a preset gesture input (it may be understood that the preset gesture input is a gesture input preset for enabling the electronic device to share the target content with the wearable device).
S202, the electronic device responds to the second input and shares the target content with the wearable device.
The target content may be an image, acquired by the electronic device, of the first interface currently displayed by the electronic device; that is, the target content may be an image of the navigation interface currently displayed by the electronic device. It is understood that "currently" here refers to the moment when the electronic device captures the image of the first interface.
In the embodiment of the present invention, after the electronic device receives the second input of the user, the electronic device may share (may also be referred to as sending) the target content to the wearable device in response to the second input. Specifically, the electronic device may obtain an image of a first interface currently displayed by the electronic device, that is, the target content, in a screen capture mode, and then the electronic device shares the target content with the wearable device.
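The electronic-device side of S201-S202 can be sketched as illustrative logic: on the second input, capture the currently displayed first interface and share the resulting image with the wearable device. Here `capture_screen` and the `send_to_wearable` callback are hypothetical stand-ins for a platform screenshot API and the device connection, not names from the patent:

```python
# Illustrative sketch of S201-S202: on a second input, capture the
# currently displayed first interface and share it with the wearable
# device. capture_screen() and send_to_wearable are hypothetical
# stand-ins for platform screenshot and connection APIs.

def capture_screen(interface_id):
    # Stand-in for a platform screen-capture call; returns image bytes.
    return f"screenshot:{interface_id}".encode()

def handle_second_input(interface_id, send_to_wearable):
    # Acquire the image of the first interface by screen capture
    # (the target content), then share it with the wearable device.
    target_content = capture_screen(interface_id)
    send_to_wearable(target_content)
    return target_content

sent = []  # collects what was "shared" with the wearable device
content = handle_second_input("navigation_interface", sent.append)
```

On Android this step would typically go through a screen-capture API and the established Bluetooth/USB/wireless connection described below; the sketch abstracts both away.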
Optionally, in this embodiment of the present invention, before the electronic device sends the target content to the wearable device, a connection may be established between the electronic device and the wearable device. I.e. after establishing a connection between the electronic device and the wearable device, interaction between the electronic device and the wearable device may be possible.
Optionally, in the embodiment of the present invention, the connection established between the electronic device and the wearable device may be a bluetooth connection, a Universal Serial Bus (USB) connection, a wireless connection, or any other possible connection, which may be determined according to actual use requirements, and the embodiment of the present invention is not limited.
S203, the wearable device receives the target content.
S204, the wearable device displays the target content in a first area of a screen of the wearable device.
In the embodiment of the invention, after the wearable device receives the target content shared by the electronic device, the wearable device can display the target content in the first area of the screen of the wearable device.
Optionally, in this embodiment of the present invention, the first area may be any area (or position) on a screen of a preset wearable device. The method can be determined according to actual use requirements, and the embodiment of the invention is not limited.
The methods shown in S201-S204 above are further described in conjunction with fig. 3.
Illustratively, assume that the second input is a voice input in which the user speaks "share". Fig. 3 (a) shows a schematic diagram of a first interface (shown as 31 in fig. 3 (a)) displayed by the electronic device. After the user says "share", that is, after the electronic device receives the second input of the user, the electronic device may, in response to the second input, acquire an image of the interface shown at 31 (that is, the target content) by means of screen capture, and share the target content with the wearable device. After the wearable device receives the target content, as shown in fig. 3 (b), the wearable device may display the target content (shown at 33 in fig. 3 (b)) in a first area of the screen of the wearable device (shown at 32 in fig. 3 (b)). The lane lines and the vehicle dashboard shown in fig. 3 are part of the real-scene image captured by the wearable device within its capture range.
It should be noted that, in the embodiment of the present invention, the image displayed on the screen of the wearable device may include a real-scene image within the capture range of the wearable device, and may further include a virtual-scene image generated by the wearable device according to the real-scene image within the capture range of the wearable device.
Illustratively, fig. 4 shows a schematic diagram of an image displayed on the screen of a wearable device. The "snack room" shown at 41 is a real-scene image within the capture range of the wearable device, and the "expression" (emoji) shown at 42 is a virtual-scene image generated by the wearable device from the "snack room".
And S205, under the condition that the user wears the wearable device, the wearable device updates the display position of the target content to be a second area of the screen.
In the embodiment of the present invention, after the wearable device displays the target content in the first area of the screen of the wearable device, if the user wears the wearable device, the wearable device may update the display position of the target content to the second area of the screen of the wearable device, that is, the wearable device may display the target content in the second area of the screen.
Optionally, in this embodiment of the present invention, the wearable device may detect whether the user wears the wearable device through a sensor (e.g., a temperature sensor, an orientation sensor, or the like) of the wearable device.
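As an illustration of the sensor-based wear detection and the resulting position update (S205), the following sketch treats temperature readings in a typical skin-contact range as "worn". The temperature bounds and function names are assumptions for the sketch, not values from the patent.

```python
# Illustrative wear detection: readings within a skin-contact temperature
# range are treated as the device being worn. Bounds are assumptions.
SKIN_TEMP_RANGE = (28.0, 38.0)  # degrees Celsius, illustrative

def is_worn(temperature_c):
    lo, hi = SKIN_TEMP_RANGE
    return lo <= temperature_c <= hi

def update_display_area(worn, current_position):
    # S205: when the user wears the device, the target content moves
    # from its current position to the second area of the screen.
    return "second_area" if worn else current_position

pos = update_display_area(is_worn(33.5), "first_area")
print(pos)  # second_area
```

An orientation sensor could be substituted for (or combined with) the temperature check in the same pattern; the decision logic stays a simple predicate feeding the position update.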
Optionally, in an embodiment of the present invention, the second area may include at least one of the following: a preset position; a touch position of the user on the screen; a position of a gaze point of the user's eyeball on the screen; or a position corresponding to a target area in the user's field of view. A scene change amplitude of the target area within a preset time length is smaller than a preset threshold.
These four possible implementations of the second region are each illustrated below.
(1) The second area is a preset location on a screen of the wearable device.
In the embodiment of the invention, when the user wears the wearable device, the wearable device can update the display position of the target content to the preset position of the screen, that is, the wearable device can display the target content at the preset position on the screen.
Optionally, the preset position may be any preset position on the screen of the wearable device, for example, a preset position on the screen that does not affect the user's driving. It may be determined according to actual use requirements, and is not limited in this embodiment of the present invention.
It should be noted that, in the embodiment of the present invention, the preset position and the first area may be the same or different, and may be determined specifically according to actual use requirements, and the embodiment of the present invention is not limited.
(2) The second area is a touch position of the user on the screen of the wearable device.
In the embodiment of the present invention, when the user wears the wearable device, the user may trigger the wearable device to update the display location of the target content to the input touch location (also referred to as an input location) through an input (for example, a third input described below) on a screen of the wearable device, that is, the wearable device may display the target content at the input touch location.
Optionally, in an embodiment of the present invention, the third input may be a press input of the user on the screen of the wearable device; alternatively, the third input may be a long-press input of the user on the screen of the wearable device; alternatively, the third input may be a click input of the user on the screen of the wearable device, or the like. It may be determined according to actual use requirements, and is not limited in this embodiment of the present invention.
In this embodiment of the present invention, the press input may be an input in which the user presses with a pressure value reaching a certain threshold, the long-press input may be an input in which the user presses for at least a certain period of time, and the click input may be a single-click input, a double-click input, or an input clicked a preset number of times.
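The distinctions above can be sketched as a small classifier over a touch event's pressure, duration, and tap count. The threshold values below are purely illustrative assumptions; the patent does not specify them.

```python
# Minimal sketch distinguishing press, long-press, and click inputs.
# Thresholds are illustrative assumptions, not values from the patent.
PRESSURE_THRESHOLD = 0.6   # normalized pressure for a "press" input
LONG_PRESS_SECONDS = 0.8   # minimum hold time for a "long press"

def classify_touch(pressure, duration_s, tap_count=1):
    if pressure >= PRESSURE_THRESHOLD:
        return "press"            # pressed with sufficient force
    if duration_s >= LONG_PRESS_SECONDS:
        return "long_press"       # held for sufficient time
    return "double_click" if tap_count == 2 else "click"

print(classify_touch(0.9, 0.1))     # press
print(classify_touch(0.2, 1.2))     # long_press
print(classify_touch(0.2, 0.1, 2))  # double_click
```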
(3) The second area is the position of the gaze point of the user's eyeball on the screen of the wearable device.
In the embodiment of the invention, when the user wears the wearable device, the wearable device can update the display position of the target content to the position of the gaze point of the eyeballs of the user on the screen of the wearable device, that is, the wearable device can display the target content at the position of the gaze point of the eyeballs of the user on the screen. Specifically, the user may perform an input (e.g., a fourth input) on the wearable device, thereby triggering the wearable device to acquire the position of the gaze point of the user's eyeball on the screen of the wearable device, and the wearable device may display the target content at the position.
Optionally, in the embodiment of the present invention, the fourth input may be a voice input of the user, and may also be any possible input such as a gesture input of the user, which is not limited in the embodiment of the present invention.
Illustratively, the voice input may be any possible voice input, such as the user saying "find location". The gesture input may be a gesture input identical to a preset gesture input (it may be understood that, in this embodiment of the present invention, the preset gesture input may be a gesture input for displaying the target content at the position of the gaze point of the user's eyeball on the screen of the wearable device).
Optionally, in this embodiment of the present invention, the wearable device may acquire the position of the gaze point of the user's eyeball on the screen of the wearable device through an eyeball tracking technology. Specifically, the wearable device may trigger its camera to collect movement information of the user's eyeball (such as a movement direction and a movement track), feature points of the eyeball, and feature points around the eyeball, so as to track the eyeball and acquire the position of the gaze point of the user's eyeball on the screen; alternatively, the wearable device may project a light beam, such as infrared light, onto the iris of the eyeball, and the iris reflects the light back to the wearable device, so that the wearable device can extract the feature points, track the eyeball, and acquire the position of the gaze point of the user's eyeball on the screen.
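Whatever tracking method is used, the final step is mapping an estimated gaze direction to a point on the screen. The following is a simplified geometric sketch under stated assumptions (eye at the origin, screen modeled as the plane z = d); real gaze estimation involves calibration and per-eye models that are outside this illustration.

```python
# Geometric sketch: scale the gaze ray (dx, dy, dz), originating at the
# eye, so that it reaches the screen plane z = screen_distance, yielding
# the on-screen gaze point. A flat, axis-aligned screen is assumed.

def gaze_point_on_screen(direction, screen_distance):
    dx, dy, dz = direction
    if dz <= 0:
        return None  # gaze ray never reaches the screen plane
    t = screen_distance / dz  # scale factor to hit z = screen_distance
    return (dx * t, dy * t)

# A gaze ray tilted right and slightly up, screen 2 units away:
print(gaze_point_on_screen((0.5, 0.25, 1.0), 2.0))  # (1.0, 0.5)
```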
(4) The second area is a position corresponding to a target area in the user's field of view.
The position corresponding to the target area in the user's field of view may be understood as: a position on the screen of the wearable device where the image of the target area is displayed.
It should be noted that, in this embodiment of the present invention, the user's field of view is the capture range of the wearable device, so the target area in the user's field of view is a target area, captured by the wearable device, that is within the capture range of the wearable device.
Optionally, in this embodiment of the present invention, a scene change amplitude of the target area within a preset time duration is smaller than a preset threshold.
That the scene change amplitude within the preset time length is smaller than the preset threshold may be understood as: within the preset time length, the change amplitude between the scenes at any two moments is smaller than the preset threshold.
Optionally, in this embodiment of the present invention, the wearable device may find, among the scene images (which may also be referred to as real-scene images) collected within its capture range within a preset time length, an image whose change amplitude is smaller than a preset threshold, and determine the position where that image is displayed as the second area.
In this embodiment of the present invention, after the user performs route navigation through a navigation application in the electronic device, when the user drives a vehicle, if the user needs to view the navigation route displayed by the electronic device, the user can trigger the electronic device to send an image of the interface of the navigation route to the wearable device. The wearable device may then display the image in a first area of its screen. In this case, if the user wears the wearable device, the wearable device may update the image displayed in the first area to be displayed in a second area that does not obstruct the user's view. In this way, while driving the vehicle, the user can view the navigation route directly through the wearable device without needing to look at the navigation route displayed by the electronic device; that is, the user can take into account both driving safety and viewing of the navigation route, thereby improving human-computer interaction performance.
Optionally, in this embodiment of the present invention, the wearable device may collect real-scene images within its capture range at different moments, find the image content included in all of the real-scene images, and determine the display position of that common content as the second area.
For example, in a case where the second area includes a position corresponding to the target area in the user field of view, before S205 described above, the content display method provided in the embodiment of the present invention may further include S206 described below, and S205 described above may be specifically implemented by S205a described below.
S206, the wearable device collects, at different moments, a first live-action image and a second live-action image corresponding to the user's field of view.
That the first live-action image and the second live-action image correspond to the user's field of view may be understood as: the first live-action image and the second live-action image are within the capture range of the wearable device.
In the embodiment of the invention, the wearable device can acquire the live-action image within the acquisition range of the wearable device in real time within the preset time length.
S205a, in the case that the same content exists between the first live view image and the second live view image, the wearable device displays the target content in an area of the screen corresponding to the same content.
Here, the area corresponding to the same content may be understood as: an area on the screen of the wearable device in which the same content is displayed.
In the embodiment of the present invention, in a case where the same content exists between the first live view image and the second live view image, the wearable device may display the target content in an area where the same content image is displayed.
Generally, when the user drives a car without wearing the wearable device, some real scenes in the user's field of view, such as the real scene inside the car, do not interfere with the user. Therefore, in this embodiment of the present invention, while the user wears the wearable device and drives the car, in order to prevent the displayed target content from interfering with driving, the wearable device may display the target content over the live-action image of the inside of the car. Specifically, after the wearable device receives the target content, it may first collect N live-action images in real time within a preset time length, then compare the N live-action images to determine the image content included in every one of them (that is, the same content across the N live-action images), and finally display the target content in the area where that same content is displayed.
It can be understood that, in the process of driving the vehicle, the real scene inside the vehicle seen by the user is basically fixed, while the real scene outside the vehicle changes in real time. Therefore, by comparing the live-action images collected within its capture range over a period of time, the wearable device can find the image content included in all of the live-action images and determine that common content as the real scene inside the vehicle.
For example, as shown in fig. 5, after the wearable device receives the target content, the wearable device may determine, from a plurality of live-action images it has captured, the image content included in every one of them, for example, the "in-vehicle instrument panel" image shown in fig. 5, and then display the target content, as shown at 51 in fig. 5, at the position where the "in-vehicle instrument panel" image is displayed.
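The stable-region selection of S206 and S205a can be sketched as a frame comparison: cells of the image whose values barely change between captures (the in-vehicle scene) qualify as the second area, while rapidly changing cells (the scene outside the vehicle) do not. Representing frames as 2D brightness grids and the threshold value are illustrative simplifications.

```python
# Sketch: find cells whose scene change amplitude over all captured
# frames stays below a threshold; those cells form the candidate
# second area (e.g. the in-vehicle instrument panel).

def stable_cells(frames, threshold):
    rows, cols = len(frames[0]), len(frames[0][0])
    stable = []
    for r in range(rows):
        for c in range(cols):
            values = [f[r][c] for f in frames]
            # Change amplitude: largest difference between any two
            # samples of this cell within the preset time length.
            if max(values) - min(values) < threshold:
                stable.append((r, c))
    return stable

# Two frames: the left column (in-vehicle scene) barely changes,
# the right column (outside scene) changes a lot.
frame1 = [[10, 200], [12, 50]]
frame2 = [[11, 90], [12, 180]]
print(stable_cells([frame1, frame2], threshold=5))  # [(0, 0), (1, 0)]
```

A production system would operate on camera frames and likely use per-block statistics rather than single pixels, but the thresholded "max change over a window" test is the same idea.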
In this embodiment of the present invention, the wearable device may display the target content in the area displaying the in-vehicle live-action image, so that interference of the target content with the user's driving can be avoided and the navigation route can be viewed while the vehicle is being driven; that is, the user can take into account both driving safety and viewing of the navigation route, thereby improving human-computer interaction performance.
Optionally, in this embodiment of the present invention, after the wearable device updates the display position of the target content to the second area of its screen, the user may also trigger the wearable device, through an input, to move the target content to another position (for example, the input position of that input) for display.
For example, after S205 described above, the content display method provided in the embodiment of the present invention may further include S207 and S208 described below.
S207, the wearable device receives a first input of a user on a screen of the wearable device.
Optionally, in this embodiment of the present invention, the first input may be a click input of the user on the screen of the wearable device; alternatively, the first input may be a press input of the user on the screen of the wearable device; alternatively, the first input may be a long-press input of the user on the screen of the wearable device, or the like. It may be determined according to actual use requirements, and is not limited in this embodiment of the present invention.
And S208, in response to the first input, the wearable device updates the target content displayed in the second area to be displayed at the input position of the first input.
In this embodiment of the present invention, after the wearable device receives the first input of the user, the wearable device may, in response to the first input, update the target content displayed in the second area to be displayed at the input position of the first input.
In this embodiment of the present invention, the user can, according to actual requirements, trigger the wearable device through one input to move the target content displayed in the second area to a desired position for display. This makes it convenient for the user to adjust the display position of the target content, avoids interference of the target content with the user's driving, and allows the navigation route to be viewed while the vehicle is being driven; that is, the user can take into account both driving safety and viewing of the navigation route, thereby improving human-computer interaction performance.
Optionally, in the embodiment of the present invention, the target content displayed on the wearable device is an image that the user triggers the electronic device to share in real time with the wearable device. That is, when the interface displayed by the electronic device changes, the electronic device may share the changed interface image to the wearable device, so that the image displayed on the wearable device changes synchronously with the change of the interface displayed by the electronic device. Specifically, after the electronic device sends the target content to the wearable device, if a first interface displayed on the screen of the electronic device changes, that is, the interface displayed on the screen of the electronic device is updated from the first interface to another interface (for example, a second interface described below), the electronic device may send an image of the updated interface to the wearable device, and after the wearable device receives the image of the updated interface, the wearable device may update the target content displayed by the wearable device to an image of another interface. Therefore, the user can conveniently check the image of the interface updated in real time, and the accuracy of the image displayed by the wearable device can be further ensured.
Further, under the condition that the first interface is the navigation interface, the electronic device can synchronously share the changed images of the navigation interface to the wearable device under the condition that the navigation interface displayed by the electronic device is changed, so that the images of the navigation route interface displayed on the wearable device can synchronously change along with the change of the navigation interface displayed by the electronic device, a user can conveniently view the images of the navigation route interface updated in real time, and the accuracy of the navigation route displayed by the wearable device can be ensured.
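The real-time synchronization described above can be sketched as follows: every interface update on the electronic device triggers a capture-and-share, and the wearable device replaces the content in its second area. The class names and the direct method call standing in for the wireless transfer are assumptions for the sketch.

```python
# Hypothetical sketch of S209-S2011: an interface change on the phone
# is captured and pushed to the wearable device, which updates the
# content displayed in its second area.

class Wearable:
    def __init__(self):
        self.second_area_content = None

    def receive(self, content):
        # S2011: replace the target content shown in the second area.
        self.second_area_content = content

class Phone:
    def __init__(self, wearable):
        self.wearable = wearable
        self.interface = None

    def show_interface(self, name):
        # S209: each interface update triggers a capture-and-share.
        self.interface = name
        self.wearable.receive(f"image_of_{name}")

wearable = Wearable()
phone = Phone(wearable)
phone.show_interface("first_interface")
phone.show_interface("second_interface")  # navigation route advanced
print(wearable.second_area_content)  # image_of_second_interface
```

Because the wearable device always holds the image of the most recently displayed interface, the navigation route it shows stays consistent with the electronic device's display.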
For example, after S202, the content display method provided in the embodiment of the present invention may further include S209 described below. After S205 described above, the content display method provided in the embodiment of the present invention may further include S2010-S2011 described below.
S209, the electronic device shares the first content with the wearable device under the condition that the first interface displayed by the electronic device is updated to the second interface.
The first content may be an image, collected by the electronic device, of the second interface currently displayed by the electronic device. It may be understood that "currently" here refers to the moment when the electronic device captures the image of the second interface.
In this embodiment of the present invention, after the user triggers the electronic device to perform route navigation and while the user drives the vehicle, the geographic position of the electronic device changes in real time, so the interface of the navigation route displayed by the electronic device is also continuously updated, that is, it is updated from one interface to another (for example, the first interface displayed by the electronic device is updated to the second interface). In this case, the electronic device may share the image of the updated interface with the wearable device in real time.
S2010, the wearable device receives first content.
S2011, the wearable device updates the target content displayed in the second area of the wearable device to the first content.
In the embodiment of the invention, after the wearable device receives the first content, the wearable device can update the target content displayed in the second area on the screen of the wearable device to the first content. It can be understood that the first content is content that the electronic device shares after sharing the target content.
In the embodiment of the invention, the electronic equipment can send the image of the navigation interface updated in real time to the wearable equipment, so that the wearable equipment can be ensured to update the image of the interface of the displayed navigation route in real time, a user can conveniently check the image, and the accuracy of the wearable equipment in displaying the navigation route can be ensured.
In the embodiment of the present invention, the content display methods shown in the above method drawings are all exemplarily described with reference to one drawing in the embodiment of the present invention. In specific implementation, the content display method shown in each method drawing can also be implemented by combining any other drawing which can be combined and is illustrated in the above embodiments, and details are not described here.
As shown in fig. 6, an embodiment of the present invention provides a wearable device 400, and the wearable device 400 may include a display module 401. The display module 401 may be configured to display target content shared by the electronic device in a first area of a screen of the wearable device, and to update the display position of the target content to a second area of the screen in a case where the user wears the wearable device. The second area may include at least one of the following: a preset position; a touch position of the user on the screen; a position of a gaze point of the user's eyeball on the screen; or a position corresponding to a target area in the user's field of view, where a scene change amplitude of the target area within a preset time length is smaller than a preset threshold.
Optionally, in combination with fig. 6, as shown in fig. 7, in the embodiment of the present invention, in the case that the second area includes a position corresponding to the target area in the field of view of the user, the wearable device 400 may further include an acquisition module 402. The acquiring module 402 may be configured to acquire a first live-action image and a second live-action image corresponding to the field of view of the user at different times before the display module 401 updates the display position of the target content to the second area of the screen; the display module 401 may be specifically configured to display the target content in an area corresponding to the same content in the screen when the same content exists between the first live view image and the second live view image acquired by the acquisition module 402.
Optionally, in combination with fig. 7, as shown in fig. 8, in the embodiment of the present invention, the wearable device 400 may further include a receiving module 403. The receiving module 403 may be configured to receive a first input of a user on the screen after the display module 401 updates the display position of the target content to the second area of the screen; the display module 401 may be further configured to update the target content displayed in the second area to the input position display of the first input in response to the first input received by the receiving module 403.
Optionally, in this embodiment of the present invention, the target content may be an image of a navigation interface displayed by the electronic device.
Optionally, in this embodiment of the present invention, the display module 401 may be further configured to update the target content displayed in the second area to the first content after the display position of the target content is updated to the second area of the screen, where the first content may be a content shared by the electronic device after the target content is shared.
The embodiment of the present invention provides a wearable device. When the wearable device receives target content shared by an electronic device, the wearable device may display the target content in a first area of a screen of the wearable device; and, in a case where the user wears the wearable device, the wearable device may update the display position of the target content to a second area of the screen. The second area includes at least one of the following: a preset position; a touch position of the user on the screen; a position of a gaze point of the user's eyeball on the screen; or a position corresponding to a target area in the user's field of view (a scene change amplitude of the target area within a preset time length is smaller than a preset threshold). With this solution, after the user performs route navigation through a navigation application in the electronic device, when the user drives a vehicle, if the user needs to view the navigation route displayed by the electronic device, the user can trigger the electronic device to send an image of the interface of the navigation route to the wearable device. The wearable device may then display the image in the first area of its screen. In this case, if the user wears the wearable device, the wearable device may automatically update the image displayed in the first area to be displayed in the second area, which does not obstruct the user's view. In this way, while driving the vehicle, the user can view the navigation route directly through the wearable device without needing to look at the navigation route displayed by the electronic device; that is, the user can take into account both driving safety and viewing of the navigation route, thereby improving human-computer interaction performance.
As shown in fig. 9, an embodiment of the present invention provides an electronic device 500, and the electronic device 500 may include a receiving module 501 and a transmitting module 502. The receiving module 501 may be configured to receive a second input of the user; the sending module 502 may be configured to share target content to the wearable device in response to the second input received by the receiving module 501, where the target content is an image of the first interface displayed by the electronic device and acquired by the electronic device.
Optionally, in this embodiment of the present invention, the sending module 502 may be further configured to, after sharing the target content with the wearable device, share the first content with the wearable device when the first interface displayed by the electronic device is updated to the second interface, where the first content is an image, collected by the electronic device, of the second interface displayed by the electronic device.
The embodiment of the present invention provides an electronic device. The electronic device may receive a second input of a user and, in response to the second input, share target content with the wearable device. The target content is an image, collected by the electronic device, of a first interface displayed by the electronic device. With this solution, when the wearable device receives the target content shared by the electronic device, the wearable device may display the target content in a first area of a screen of the wearable device; and, in a case where the user wears the wearable device, the wearable device may update the display position of the target content to a second area of the screen. The second area includes at least one of the following: a preset position; a touch position of the user on the screen; a position of a gaze point of the user's eyeball on the screen; or a position corresponding to a target area in the user's field of view (a scene change amplitude of the target area within a preset time length is smaller than a preset threshold). After the user performs route navigation through a navigation application in the electronic device, when the user drives a vehicle, if the user needs to view the navigation route displayed by the electronic device, the user can trigger the electronic device to send an image of the interface of the navigation route to the wearable device. The wearable device may then display the image in the first area of its screen. In this case, if the user wears the wearable device, the wearable device may automatically update the image displayed in the first area to be displayed in the second area, which does not obstruct the user's view.
In this way, while driving the vehicle, the user can view the navigation route directly through the wearable device without needing to look at the navigation route displayed by the electronic device; that is, the user can take into account both driving safety and viewing of the navigation route, thereby improving human-computer interaction performance.
Fig. 10 is a hardware schematic diagram of a wearable device implementing various embodiments of the invention. As shown in fig. 10, the wearable device 600 includes, but is not limited to: an image pickup device 601, a display device 602, a processor 603, a bus 604, a communication interface 605, a memory 606, a speaker 607, and the like.
The display device 602 may be configured to display target content shared by the electronic device in a first area of the screen, and to update the display position of the target content to a second area of the screen in a case where the user wears the wearable device. The second area includes at least one of the following: a preset position; a touch position of the user on the screen; a position of a gaze point of the user's eyeball on the screen; or a position corresponding to a target area in the user's field of view, where a scene change amplitude of the target area within a preset time length is smaller than a preset threshold.
It is to be understood that, in the embodiment of the present invention, the display module 401 in the structural schematic diagram of the wearable device (for example, fig. 6) may be implemented by the display device 602.
The embodiment of the present invention provides a wearable device. When the wearable device receives target content shared by an electronic device, the wearable device may display the target content in a first area of a screen of the wearable device; then, when the user wears the wearable device, the wearable device may update the display position of the target content to a second area of the screen. The second area includes at least one of the following: a preset position; a touch position of the user on the screen; a position of a gaze point of the user's eyeball on the screen; or a position corresponding to a target area in the user's field of view (a scene change amplitude of the target area within a preset time length is smaller than a preset threshold). With this solution, after the user performs route navigation through a navigation application in the electronic device, when the user drives a vehicle, if the user needs to view the navigation route displayed by the electronic device, the user can trigger the electronic device to send an image of the interface of the navigation route to the wearable device. The wearable device may then display the image in the first area of its screen. In this case, if the user wears the wearable device, the wearable device may update the image displayed in the first area to be displayed in the second area, which does not obstruct the user's view. Therefore, the user can view the navigation route directly through the wearable device while driving the vehicle, without needing to look at the navigation route displayed by the electronic device; that is, the user can take into account both driving safety and viewing of the navigation route, thereby improving human-computer interaction performance.
The bus 604 may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The bus 604 may be divided into an address bus, a data bus, a control bus, and the like. For ease of illustration, only one thick line is shown in FIG. 10, but this is not intended to represent only one bus or type of bus. In addition, the wearable device may further include some other functional modules (e.g., hard disks) not shown in fig. 10, and the embodiments of the present invention are not described herein again.
Fig. 11 is a hardware schematic diagram of an electronic device implementing various embodiments of the invention. As shown in fig. 11, the electronic device 100 includes but is not limited to: radio frequency unit 101, network module 102, audio output unit 103, input unit 104, sensor 105, display unit 106, user input unit 107, interface unit 108, memory 109, processor 110, and power supply 111. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 11 does not constitute a limitation of electronic devices, which may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The user input unit 107 may be configured to receive a second input from the user; the radio frequency unit 101 may be configured to share the target content with the wearable device in response to the second input received by the user input unit 107. The target content is an image, captured by the electronic device, of a first interface displayed by the electronic device.
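The patent does not specify how the captured interface image is packaged when the radio frequency unit 101 shares it with the wearable device. One common pattern for a point-to-point link is a length-prefixed byte message, sketched below; the function names and the 4-byte big-endian header are illustrative assumptions, not part of the patent.

```python
import struct

def frame_image_message(image_bytes: bytes) -> bytes:
    """Prefix the image payload with a 4-byte big-endian length so the
    receiving wearable device knows how many bytes to read."""
    return struct.pack(">I", len(image_bytes)) + image_bytes

def parse_image_message(data: bytes) -> bytes:
    """Inverse of frame_image_message: read the length header and
    return exactly that many payload bytes."""
    (length,) = struct.unpack(">I", data[:4])
    return data[4:4 + length]
```

A round trip through these two helpers returns the original image bytes unchanged, which is the property any such framing scheme must satisfy.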
It can be understood that, in the embodiment of the present invention, the receiving module 501 in the structural schematic diagram of the electronic device (for example, fig. 9) may be implemented by the user input unit 107; the transmitting module 502 in the structural schematic diagram of the electronic device (for example, fig. 9) may be implemented by the radio frequency unit 101.
An embodiment of the invention provides an electronic device. When the wearable device receives target content shared by the electronic device, it may display the target content in a first area of its screen; then, when the wearable device is worn by a user, it may update the display position of the target content to a second area of the screen. The second area includes at least one of: a preset position, a touch position of the user on the screen, a position of a gaze point of the user's eyeballs on the screen, and a position corresponding to a target area in the user's field of view (an area whose scene change amplitude within a preset time is less than a preset threshold). With this scheme, after the user starts route navigation through a navigation application on the electronic device and then drives a vehicle, if the user needs to check the navigation route displayed by the electronic device, the user can trigger the electronic device to send an image of the navigation-route interface to the wearable device. The wearable device may then display the image in the first area of its screen. If the user is wearing the wearable device, the wearable device may move the image from the first area to a second area that does not obstruct the user's view. The user can therefore check the navigation route directly through the wearable device while driving, without looking at the electronic device, so that driving safety and route checking are both taken into account and human-computer interaction performance is improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 101 may be used to receive and send signals during message transmission or a call; specifically, it delivers downlink data received from a base station to the processor 110 for processing, and sends uplink data to the base station. Typically, the radio frequency unit 101 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low-noise amplifier, and a duplexer. In addition, the radio frequency unit 101 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 102, such as assisting the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 103 may convert audio data received by the radio frequency unit 101 or the network module 102, or stored in the memory 109, into an audio signal and output it as sound. The audio output unit 103 may also provide audio output related to a specific function performed by the electronic device 100 (e.g., a call-signal reception sound or a message reception sound). The audio output unit 103 includes a speaker, a buzzer, a receiver, and the like.
The input unit 104 is used to receive audio or video signals. The input unit 104 may include a graphics processing unit (GPU) 1041 and a microphone 1042. The graphics processor 1041 processes image data of still pictures or video obtained by an image capture device (e.g., a camera) in a video capture mode or an image capture mode. The processed image frames may be displayed on the display unit 106. The image frames processed by the graphics processor 1041 may be stored in the memory 109 (or another storage medium) or transmitted via the radio frequency unit 101 or the network module 102. The microphone 1042 may receive sound and process it into audio data. In a phone-call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 101.
The electronic device 100 also includes at least one sensor 105, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 1061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 1061 and/or the backlight when the electronic device 100 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 105 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
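The horizontal/vertical screen switching mentioned above can be illustrated by classifying the gravity vector reported by the accelerometer: whichever axis carries most of gravity indicates how the device is held. The axis convention and the three-way classification below are assumptions made for this sketch, not taken from the patent.

```python
def screen_orientation(ax: float, ay: float, az: float) -> str:
    """Classify device orientation from a 3-axis accelerometer reading
    (in units of g). Gravity mostly along y means the device is upright
    (portrait); mostly along x means it is on its side (landscape);
    mostly along z means it is lying flat."""
    if abs(ay) >= abs(ax) and abs(ay) >= abs(az):
        return "portrait"
    if abs(ax) >= abs(az):
        return "landscape"
    return "flat"
```

A hysteresis band around the decision boundaries would normally be added so the screen does not flicker between orientations near 45 degrees.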
The display unit 106 is used to display information input by a user or information provided to the user. The display unit 106 may include a display panel 1061, and the display panel 1061 may be configured in the form of a Liquid Crystal Display (LCD), an organic light-emitting diode (OLED), or the like.
The user input unit 107 may be used to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 107 includes a touch panel 1071 and other input devices 1072. The touch panel 1071, also referred to as a touch screen, may collect touch operations by the user on or near it (e.g., operations performed on or near the touch panel 1071 with a finger, a stylus, or any suitable object or attachment). The touch panel 1071 may include a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch-point coordinates, sends the coordinates to the processor 110, and receives and executes commands sent by the processor 110. The touch panel 1071 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 1071, the user input unit 107 may include other input devices 1072. Specifically, the other input devices 1072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys and a power key), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 1071 may be overlaid on the display panel 1061, and when the touch panel 1071 detects a touch operation thereon or nearby, the touch panel 1071 transmits the touch operation to the processor 110 to determine the type of the touch event, and then the processor 110 provides a corresponding visual output on the display panel 1061 according to the type of the touch event. Although in fig. 11, the touch panel 1071 and the display panel 1061 are two independent components to implement the input and output functions of the electronic device, in some embodiments, the touch panel 1071 and the display panel 1061 may be integrated to implement the input and output functions of the electronic device, and is not limited herein.
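The detection-then-dispatch split described above, in which the touch panel 1071 reports a touch and the processor 110 chooses the visual response, can be sketched as follows. The class, callback, and event names are hypothetical, and a real touch controller reports far richer event data than this.

```python
from typing import Callable, Dict, Tuple

class TouchController:
    """Converts a raw panel reading into screen coordinates and forwards
    the resulting event to a processor callback, mirroring the
    touch-panel / processor division of labor."""

    def __init__(self, width: int, height: int,
                 on_event: Callable[[str, Tuple[int, int]], str]):
        self.width, self.height = width, height
        self.on_event = on_event

    def handle_raw(self, raw_x: float, raw_y: float, kind: str) -> str:
        # Scale normalized sensor readings (0.0-1.0) to pixel coordinates.
        point = (int(raw_x * self.width), int(raw_y * self.height))
        return self.on_event(kind, point)

def processor(kind: str, point: Tuple[int, int]) -> str:
    """Hypothetical processor-side handler: map the event type to a
    visual output, as the processor 110 does on the display panel."""
    responses: Dict[str, str] = {"tap": "highlight", "long_press": "menu"}
    return f"{responses.get(kind, 'ignore')}@{point}"
```

Separating coordinate conversion from response selection is what allows the same touch hardware to drive different visual outputs per event type.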
The interface unit 108 is an interface for connecting an external device to the electronic apparatus 100. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 108 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 100 or may be used to transmit data between the electronic apparatus 100 and the external device.
The memory 109 may be used to store software programs as well as various data. The memory 109 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required by at least one function (such as a sound-playing function or an image-playing function), and the like; the data storage area may store data (such as audio data and a phonebook) created according to the use of the electronic device. Further, the memory 109 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 110 is the control center of the electronic device; it connects the various parts of the entire electronic device using various interfaces and lines, and performs the various functions of the electronic device and processes data by running or executing the software programs and/or modules stored in the memory 109 and calling the data stored in the memory 109, thereby monitoring the electronic device as a whole. The processor 110 may include one or more processing units. Optionally, the processor 110 may integrate an application processor, which mainly handles the operating system, user interfaces, and application programs, and a modem processor, which mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 110.
The electronic device 100 may further include a power supply 111 (e.g., a battery) for supplying power to each component, and optionally, the power supply 111 may be logically connected to the processor 110 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system.
In addition, the electronic device 100 includes some functional modules that are not shown, and are not described in detail herein.
Optionally, an embodiment of the present invention further provides a wearable device, where the wearable device may include a processor, a memory, and a computer program stored in the memory and capable of running on the processor, and when executed by the processor, the computer program implements the processes of the foregoing method embodiments, and can achieve the same technical effects, and details are not repeated here to avoid repetition.
Optionally, an embodiment of the present invention further provides an electronic device, where the electronic device may include a processor, a memory, and a computer program stored in the memory and capable of running on the processor, and when the computer program is executed by the processor, the computer program implements each process of the foregoing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements the processes of the method embodiments, and can achieve the same technical effects, and in order to avoid repetition, the details are not repeated here. The computer-readable storage medium may include a read-only memory (ROM), a Random Access Memory (RAM), a magnetic or optical disk, and the like.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling an electronic device (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (12)

1. A content display method, applied to a wearable device having a screen, characterized in that the method comprises:
displaying target content shared by electronic equipment in a first area of the screen;
updating a display position of the target content to a second area of the screen in a case where the wearable device is worn by a user;
wherein the second area comprises at least one of: a preset position, a touch position of the user on the screen, a position of a gaze point of the user's eyeballs on the screen, and a position corresponding to a target area in the user's field of view, wherein a scene change amplitude of the target area within a preset time is less than a preset threshold.
2. The method of claim 1, wherein the second region comprises a location corresponding to a target region in a user field of view;
before the updating the display position of the target content to the second area of the screen, the method further includes:
acquiring a first live-action image and a second live-action image corresponding to the visual field of a user at different moments;
the updating the display position of the target content to the second area of the screen specifically includes:
displaying, in a case that the same content exists between the first live-action image and the second live-action image, the target content in an area of the screen corresponding to the same content.
3. The method of claim 1, wherein after updating the display position of the target content to the second area of the screen, the method further comprises:
receiving a first input of a user on the screen;
in response to the first input, updating the target content displayed in the second area to be displayed at an input position of the first input.
4. The method of any one of claims 1-3, wherein the target content is an image of a navigation interface displayed by the electronic device.
5. The method of claim 1, wherein after updating the display position of the target content to the second area of the screen, the method further comprises:
updating the target content displayed in the second area to be first content, wherein the first content is content shared by the electronic equipment after the target content is shared.
6. A wearable device, characterized in that the wearable device comprises a display module;
the display module is used for displaying target content shared by electronic equipment in a first area of a screen of the wearable equipment; updating the display position of the target content to a second area of the screen under the condition that the wearable device is worn by a user;
wherein the second area comprises at least one of: a preset position, a touch position of the user on the screen, a position of a gaze point of the user's eyeballs on the screen, and a position corresponding to a target area in the user's field of view, wherein a scene change amplitude of the target area within a preset time is less than a preset threshold.
7. The wearable device of claim 6, wherein the second region comprises a location corresponding to a target region in a user field of view, the wearable device further comprising an acquisition module;
the acquisition module is configured to acquire, before the display module updates the display position of the target content to the second area of the screen, a first live-action image and a second live-action image corresponding to the user's field of view at different moments;
the display module is specifically configured to display the target content in an area of the screen corresponding to the same content when the same content exists between the first live-action image and the second live-action image acquired by the acquisition module.
8. The wearable device of claim 6, further comprising a receiving module;
the receiving module is used for receiving a first input of a user on the screen after the display module updates the display position of the target content to the second area of the screen;
the display module is further configured to, in response to the first input received by the receiving module, update the target content displayed in the second area to be displayed at an input position of the first input.
9. The wearable device according to any of claims 6-8, wherein the target content is an image of a navigation interface displayed by the electronic device.
10. The wearable device of claim 6,
the display module is further configured to update the target content displayed in the second area to first content after the display position of the target content is updated to the second area of the screen, where the first content is content shared by the electronic device after the target content is shared.
11. A wearable device comprising a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program, when executed by the processor, implementing the steps of the content display method of any of claims 1 to 5.
12. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the content display method according to any one of claims 1 to 5.
CN201911358662.5A 2019-12-25 2019-12-25 Content display method and wearable device Pending CN111142772A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911358662.5A CN111142772A (en) 2019-12-25 2019-12-25 Content display method and wearable device

Publications (1)

Publication Number Publication Date
CN111142772A true CN111142772A (en) 2020-05-12

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111695459A (en) * 2020-05-28 2020-09-22 腾讯科技(深圳)有限公司 State information prompting method and related equipment
CN111880710A (en) * 2020-07-31 2020-11-03 Oppo广东移动通信有限公司 Screen rotation method and related product

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103837149A (en) * 2014-03-25 2014-06-04 深圳市凯立德科技股份有限公司 Navigation device and wearable device as well as interactive method thereof
CN104133550A (en) * 2014-06-27 2014-11-05 联想(北京)有限公司 Information processing method and electronic equipment
CN104731333A (en) * 2015-03-25 2015-06-24 联想(北京)有限公司 Wearable electronic equipment
CN105691304A (en) * 2016-04-11 2016-06-22 青岛理工大学 Automotive display system based on wearable display equipment
US20160259422A1 (en) * 2015-03-06 2016-09-08 Panasonic Intellectual Property Corporation Of America Wearable terminal and method for controlling the same
CN106648108A (en) * 2015-11-03 2017-05-10 通用汽车环球科技运作有限责任公司 Vehicle-wearable device interface and methods for using same
CN106679686A (en) * 2017-01-03 2017-05-17 京东方科技集团股份有限公司 Wearable device



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200512