CN114095542A - Display control method and electronic equipment - Google Patents
- Publication number
- CN114095542A (application number CN202010780354.8A)
- Authority
- CN
- China
- Prior art keywords
- connectable
- electronic device
- layout
- interface
- devices
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks
- H04L67/125—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks involving control of end-device applications over a network
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/023—Services making use of location information using mutual or relative location information between multiple location based services [LBS] targets or of distance thresholds
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/02—Services making use of location information
- H04W4/025—Services making use of location information using location based information parameters
- H04W4/026—Services making use of location information using location based information parameters using orientation information, e.g. compass
Abstract
The application relates to the field of communications technologies and provides a display control method and an electronic device. The display control method includes the following steps: a first electronic device detects a first operation or a triggered interaction event, and searches for connectable devices; the first electronic device obtains a first spatial relationship between each connectable device and the first electronic device, where the first spatial relationship includes a first distance and a first included angle; the first electronic device displays a first layout interface according to the first spatial relationship, where the first layout interface includes one or more display areas, each display area corresponds to a different distance condition, and the first distance of each connectable device displayed in a display area satisfies the distance condition corresponding to that display area; and if multiple connectable devices satisfy the same distance condition and the differences between the first included angles of adjacent connectable devices among them satisfy an angle condition, the adjacent connectable devices are displayed spread apart in the display area corresponding to that distance condition. The embodiments of the application improve interaction efficiency and accuracy.
Description
Technical Field
The present application relates to the field of communications technologies, and in particular, to a display control method and an electronic device.
Background
In the era of the Internet of Everything, smart devices and Internet of Things (IoT) devices are becoming increasingly abundant, and more and more users own multiple smart devices and/or IoT devices. Smart devices include, for example, smart phones, computers, smart televisions, tablets, and smart speakers. IoT devices include, for example, smart audio and video devices, routers, smart cleaning devices, smart kitchen appliances, and smart lighting systems in the home.
Users can control or interconnect with more and more devices. When user equipment needs to interact with one or more specific peripheral devices, for example for control, pairing, data transmission, or screen projection, the user equipment and the specific device must be able to communicate with each other, which requires a communication connection to be established between them. In the process of establishing this connection, the user equipment needs to search for connectable devices nearby. The user equipment may discover many connectable devices on the same network, i.e., devices that it can find and with which it can establish a communication connection.
In some prior-art approaches, the user device displays a menu or list of connectable devices from which the user selects a specific device to connect. When the menu or list contains many connectable devices, the user must identify the icons and/or text corresponding to each device, and sometimes also perform operations such as scrolling to locate the desired one. If the order of the connectable devices depends on signal strength, the order changes whenever the signal fluctuates or the user moves, making mis-taps during selection very likely.
In other prior-art approaches, the user device shows the absolute location of connectable devices on a map. This approach occupies a large area of the screen, and when connectable devices are close together their location markers easily overlap, so complex interactions such as zooming or dragging are needed to avoid tapping the wrong device.
Disclosure of Invention
The embodiments of the application provide a display control method and an electronic device, which address at least one of the above problems in the prior art.
In a first aspect, an embodiment of the present application provides a display control method applied to a first electronic device, the display control method including:
the first electronic device detects a first operation or a triggered interaction event, and searches for connectable devices;
the first electronic device obtains a first spatial relationship between each connectable device and the first electronic device, where the first spatial relationship includes a first distance between each connectable device and the first electronic device, and a first included angle between the line connecting each connectable device to the first electronic device and the current orientation of the first electronic device;
the first electronic device displays a first layout interface according to the first spatial relationship, where the first layout interface includes one or more display areas, each display area corresponds to a different distance condition, and the first distance of each connectable device displayed in a display area satisfies the distance condition corresponding to that display area; and if multiple connectable devices satisfy the same distance condition and the differences between the first included angles of adjacent connectable devices among them satisfy an angle condition, the adjacent connectable devices are displayed spread apart in the display area corresponding to that distance condition.
According to the embodiments of the first aspect of the application, the layout interface maps the distance and included angle of each connectable device, so the user can quickly select the target device and complete the device interaction, which improves interaction efficiency. In addition, when the difference between the first included angles of adjacent connectable devices satisfies the angle condition, the adjacent devices are displayed spread apart. On the one hand, this prevents device identifiers from overlapping and causing the user to select the wrong target device, improving interaction accuracy; on the other hand, it spares the user the complex manual operations (such as dragging or zooming in) that would otherwise be needed to select the target device, improving interaction efficiency.
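The layout logic of the first aspect can be sketched as follows. This is a minimal illustration, not the patented implementation: the ring boundaries, the 15-degree minimum angular gap, and all names (`Device`, `assign_rings`, `disperse`) are assumptions made for the example.

```python
from dataclasses import dataclass

# Hypothetical distance conditions (metres) and minimum angular separation
# (degrees); the patent leaves the concrete thresholds to the implementation.
DISTANCE_RINGS = [(0.0, 1.0), (1.0, 3.0), (3.0, 10.0)]
MIN_ANGLE_GAP = 15.0

@dataclass
class Device:
    name: str
    distance: float  # first distance to the first electronic device
    angle: float     # first included angle vs. the device's current orientation

def assign_rings(devices):
    """Map each device to the display area whose distance condition it satisfies."""
    rings = {i: [] for i in range(len(DISTANCE_RINGS))}
    for d in devices:
        for i, (lo, hi) in enumerate(DISTANCE_RINGS):
            if lo <= d.distance < hi:
                rings[i].append(d)
                break
    return rings

def disperse(ring, min_gap=MIN_ANGLE_GAP):
    """Spread adjacent devices apart when their included-angle difference is
    below min_gap, so their identifiers do not overlap in the display area."""
    shown = []
    for d in sorted(ring, key=lambda d: d.angle):
        angle = d.angle
        if shown and angle - shown[-1][1] < min_gap:
            angle = shown[-1][1] + min_gap  # nudge clockwise to keep the gap
        shown.append((d, angle))
    return shown
```

With this sketch, two devices in the same ring at nearly identical angles (say 30 and 32 degrees) are rendered at 30 and 45 degrees, which corresponds to the dispersed display described above.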
As a possible implementation manner of the first aspect, the first operation is, for example, a user operation of adding a device.
As a possible implementation manner of the first aspect, the method further includes:
the first electronic device obtains its current position and/or the position pointed to by its current orientation, and if it determines that the current position and/or the pointed-to position has changed, obtains a second spatial relationship between each connectable device and the first electronic device, where the second spatial relationship includes a second distance between each connectable device and the first electronic device, and a second included angle between the line connecting each connectable device to the first electronic device and the current orientation of the first electronic device;
and the first electronic device displays a second layout interface of the connectable devices according to the second spatial relationship, where the layout of the connectable devices in the second layout interface changes with the change of the current position and/or the pointed-to position.
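The refresh step above can be sketched minimally as follows, assuming the bearing to each device is already known and only the phone's heading changes. The function names and the heading convention (degrees, clockwise-positive) are assumptions of this example, not part of the patent.

```python
def relative_angle(device_bearing, phone_heading):
    """Included angle between the line to the device and the phone's current
    orientation, normalised to [-180, 180) degrees."""
    return (device_bearing - phone_heading + 180.0) % 360.0 - 180.0

def refresh_layout(device_bearings, old_heading, new_heading):
    """If the phone's heading changed, recompute the second included angle of
    every device; otherwise keep the first layout interface (return None)."""
    if new_heading == old_heading:
        return None
    return {name: relative_angle(b, new_heading)
            for name, b in device_bearings.items()}
```

For instance, if the phone rotates clockwise from heading 0 to 30 degrees, a device at bearing 90 moves from a 90-degree to a 60-degree included angle, so its identifier shifts accordingly in the second layout interface.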
As a possible implementation manner of the first aspect, each display area corresponds to an included-angle range, and the first included angle of each connectable device displayed in a display area falls within the included-angle range corresponding to that display area;
or,
each display area corresponds to an upper limit on the number of devices, and the number of connectable devices displayed in each display area is less than or equal to the corresponding upper limit; or,
the connectable devices displayed in the first layout interface are logged in to the same account as the first electronic device.
As a possible implementation manner of the first aspect, the method further includes:
the first electronic device receives a second operation input by the user and switches the first layout interface to a third layout interface, where the first layout interface and the third layout interface have different interface forms.
As a possible implementation manner of the first aspect, the interface form includes a circular layout, a semicircular layout, an arc layout, or a minimized layout.
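How one of these interface forms might map polar measurements to screen pixels can be sketched as follows; the ring thresholds, ring radii, and screen center used here are illustrative assumptions of this example, not values from the patent.

```python
import math

def place_on_semicircle(angle_deg, distance,
                        ring_radii=(80, 140, 200),
                        rings=((0.0, 1.0), (1.0, 3.0), (3.0, 10.0)),
                        center=(360, 640)):
    """Map a device's (first included angle, first distance) to pixel
    coordinates on a semicircular layout: the distance ring picks the radius,
    the included angle picks the direction (0 deg = straight ahead, i.e. up
    on screen; positive angles to the right)."""
    for (lo, hi), r in zip(rings, ring_radii):
        if lo <= distance < hi:
            radius = r
            break
    else:
        radius = ring_radii[-1]  # out-of-range devices go to the outermost ring
    theta = math.radians(angle_deg)
    x = center[0] + radius * math.sin(theta)
    y = center[1] - radius * math.cos(theta)
    return round(x), round(y)
```

A device half a metre away and straight ahead lands on the innermost ring directly above the center, while a device 2 metres away at 90 degrees lands on the middle ring to the right; a circular or arc layout differs only in the angular span drawn.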
As a possible implementation manner of the first aspect, the method further includes:
the first electronic device receives a first operation of the user acting on a target device and establishes a communication connection with the target device, where the target device is one or more of the connectable devices.
In a second aspect, there is provided a display control apparatus configured to a first electronic device, the display control apparatus including:
a searching module, configured to detect a first operation or a triggered interaction event and search for connectable devices;
a first obtaining module, configured to obtain a first spatial relationship between each connectable device and the first electronic device, where the first spatial relationship includes a first distance between each connectable device and the first electronic device, and a first included angle between the line connecting each connectable device to the first electronic device and the current orientation of the first electronic device;
and a first display module, configured to display a first layout interface according to the first spatial relationship, where the first layout interface includes one or more display areas, each display area corresponds to a different distance condition, and the first distance of each connectable device displayed in a display area satisfies the distance condition corresponding to that display area; and if multiple connectable devices satisfy the same distance condition and the differences between the first included angles of adjacent connectable devices among them satisfy an angle condition, the adjacent connectable devices are displayed spread apart in the display area corresponding to that distance condition.
As a possible implementation manner of the second aspect, the first operation is, for example, a user operation of adding a device.
As a possible implementation manner of the second aspect, the display control apparatus further includes:
a second obtaining module, configured to obtain the current position and/or the position pointed to by the current orientation of the local device, and if it is determined that the current position and/or the pointed-to position has changed, obtain a second spatial relationship between each connectable device and the first electronic device, where the second spatial relationship includes a second distance between each connectable device and the first electronic device, and a second included angle between the line connecting each connectable device to the first electronic device and the current orientation of the first electronic device;
and a second display module, configured to display a second layout interface of the connectable devices according to the second spatial relationship, where the layout of the connectable devices in the second layout interface changes with the change of the current position and/or the pointed-to position.
As a possible implementation manner of the second aspect, each display area corresponds to an included-angle range, and the first included angle of each connectable device displayed in a display area falls within the included-angle range corresponding to that display area;
or,
each display area corresponds to an upper limit on the number of devices, and the number of connectable devices displayed in each display area is less than or equal to the corresponding upper limit; or,
the connectable devices displayed in the first layout interface are logged in to the same account as the first electronic device.
As a possible implementation manner of the second aspect, the display control apparatus further includes:
a switching module, configured to receive a second operation input by the user and switch the first layout interface to a third layout interface, where the first layout interface and the third layout interface have different interface forms.
As a possible implementation manner of the second aspect, the interface form includes a circular layout, a semicircular layout, an arc layout, or a minimized layout.
As a possible implementation manner of the second aspect, the display control apparatus further includes:
an establishing module, configured to receive a first operation of the user acting on a target device and establish a communication connection with the target device, where the target device is one or more of the connectable devices.
In a third aspect, an embodiment of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where execution of the computer program by the processor causes the electronic device to implement the method according to the first aspect or any possible implementation manner of the first aspect.
In a fourth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the method according to the first aspect or any possible implementation manner of the first aspect.
In a fifth aspect, an embodiment of the present application provides a computer program product, which, when run on an electronic device, causes the electronic device to execute the method described in any one of the foregoing first aspect and possible implementations of the first aspect.
It can be understood that, for the advantageous effects of the second to fifth aspects, reference may be made to the description of the first aspect above.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device provided in an embodiment of the present application;
Fig. 2 is a diagram of a software architecture of an electronic device provided by an embodiment of the present application;
Fig. 3A is a first application scenario provided by an embodiment of the present application;
Fig. 3B is a schematic diagram of an interface of a mobile phone in the first application scenario;
Fig. 4A is a schematic diagram of the positioning of a mobile phone, a display screen, and a notebook computer in the first application scenario;
Fig. 4B is a schematic diagram of an angle between the mobile phone and the display screen in the first application scenario;
Fig. 4C is another schematic diagram of an angle between the mobile phone and the display screen in the first application scenario;
Fig. 4D is a schematic diagram of the orientation of the mobile phone in the first application scenario;
Fig. 5 is a schematic diagram illustrating a change in the layout interface before and after clockwise rotation of the mobile phone in the first application scenario;
Fig. 6 is a schematic diagram of another layout interface change before and after clockwise rotation of the mobile phone in the first application scenario;
Fig. 7 is a schematic diagram illustrating the overlap of device identifiers provided by an embodiment of the present application;
Fig. 8A is a schematic diagram of a layout interface for a circular layout provided by an embodiment of the present application;
Fig. 8B is a schematic diagram of a layout interface for a circular layout provided by another embodiment of the present application;
Fig. 9 is a schematic diagram of a layout interface for a semicircular layout provided by an embodiment of the present application;
Fig. 10 is a schematic diagram of a layout interface for an arc layout provided by an embodiment of the present application;
Fig. 11 is a schematic diagram of a layout interface for a semicircular layout or an arc layout provided by an embodiment of the present application;
Fig. 12 is a schematic diagram of a layout interface for a semicircular layout or an arc layout provided by another embodiment of the present application;
Fig. 13 is a schematic diagram of a layout interface for a minimized layout provided by an embodiment of the present application;
Fig. 14 is a schematic diagram of switching layout interfaces provided by an embodiment of the present application;
Fig. 15A is a schematic diagram of a Bluetooth positioning system according to an embodiment of the present application;
Fig. 15B is another schematic diagram of a Bluetooth positioning system according to an embodiment of the present application;
Fig. 15C is another schematic diagram of Bluetooth positioning provided by an embodiment of the present application;
Fig. 15D is another schematic diagram of the Bluetooth direction-finding function provided in an embodiment of the present application;
Fig. 16A is a second application scenario provided by an embodiment of the present application;
Fig. 16B is a schematic diagram illustrating user interface changes in the second application scenario of the present application;
Fig. 16C is a schematic diagram illustrating a scenario for generating control commands in the second application scenario of the present application;
Fig. 16D is a schematic diagram of another scenario for generating control commands in the second application scenario of the present application;
Fig. 16E is a schematic diagram of another scenario for generating control commands in the second application scenario of the present application;
Fig. 17A is a third application scenario provided in an embodiment of the present application;
Fig. 17B is a schematic diagram of a layout interface for an arc layout in the third application scenario.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
The terminology used in the following examples is for the purpose of describing particular embodiments only and is not intended to limit the application. As used in the specification of this application and the appended claims, the singular forms "a", "an", and "the" are intended to include the plural forms as well, such as "one or more", unless the context clearly indicates otherwise.
It should also be understood that in the embodiments of the present application, "a plurality" means two or more, and "one or more" means one, two, or more; "and/or" describes the association relationship of associated objects and indicates that three relationships may exist; for example, "A and/or B" may represent: A alone, both A and B, or B alone, where A and B may be singular or plural. The character "/" generally indicates that the associated objects before and after it are in an "or" relationship.
The terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
As used in this specification and the appended claims, the term "if" may be interpreted, depending on the context, as "when", "upon", "in response to determining", or "in response to detecting".
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
In order to explain the technical means of the present application, the following description will be given by way of specific examples.
The display control method provided by the embodiments of the application can be applied to electronic devices, including but not limited to a mobile phone, a wearable device, an in-vehicle device, an augmented reality (AR)/virtual reality (VR) device, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a tablet computer, a smart speaker, or a set-top box (STB). The embodiments of the application do not limit the specific type of the electronic device.
In some embodiments of the present application, the electronic device may comprise a portable, handheld, or mobile electronic device, such as a cell phone, tablet, wearable device, or portable game console, among others.
Fig. 1 shows a schematic structural diagram of an electronic device 100, taking a mobile phone as an example.
The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a key 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like. The sensor module 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
It is to be understood that the structure illustrated in this embodiment does not constitute a specific limitation on the electronic device 100. In other embodiments of the present application, the electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The controller can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution.
A memory may also be provided in the processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache. The memory may hold instructions or data that the processor 110 has just used or used cyclically. If the processor 110 needs to reuse the instructions or data, it can call them directly from this memory, avoiding repeated accesses and reducing the waiting time of the processor 110, thereby improving system efficiency.
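The benefit of keeping recently used data close to the processor can be illustrated with a short sketch. The following Java example is hypothetical and not from the original: a minimal least-recently-used cache that retains recently accessed entries so repeated accesses avoid the slower backing store.

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch (not from the original): a small least-recently-used
// cache. Recently used entries stay resident; the least recently used entry
// is evicted when capacity is exceeded, mirroring the caching idea above.
public class RecentlyUsedCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    public RecentlyUsedCache(int capacity) {
        super(16, 0.75f, true); // access-order mode: reads refresh recency
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // evict the least recently used entry
    }

    public static void main(String[] args) {
        RecentlyUsedCache<String, Integer> cache = new RecentlyUsedCache<>(2);
        cache.put("a", 1);
        cache.put("b", 2);
        cache.get("a");    // "a" becomes the most recently used entry
        cache.put("c", 3); // evicts "b", the least recently used entry
        System.out.println(cache.containsKey("a")); // still cached
        System.out.println(cache.containsKey("b")); // evicted
    }
}
```

The access-order constructor of `java.util.LinkedHashMap` does the recency bookkeeping; only the eviction rule needs to be supplied.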
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 110 may include multiple sets of I2C buses. The processor 110 may be coupled to the touch sensor 180K, the charger, the flash, the camera 193, etc. through different I2C bus interfaces, respectively. For example: the processor 110 may be coupled to the touch sensor 180K via an I2C interface, such that the processor 110 and the touch sensor 180K communicate via an I2C bus interface to implement the touch functionality of the electronic device 100.
The I2S interface may be used for audio communication. In some embodiments, processor 110 may include multiple sets of I2S buses. The processor 110 may be coupled to the audio module 170 via an I2S bus to enable communication between the processor 110 and the audio module 170. In some embodiments, the audio module 170 may communicate audio signals to the wireless communication module 160 via the I2S interface, enabling answering of calls via a bluetooth headset.
The PCM interface may also be used for audio communication, sampling, quantizing and encoding analog signals. In some embodiments, the audio module 170 and the wireless communication module 160 may be coupled by a PCM bus interface. In some embodiments, the audio module 170 may also transmit audio signals to the wireless communication module 160 through the PCM interface, so as to implement a function of answering a call through a bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus used for asynchronous communication. The bus may be a bidirectional communication bus that converts the data to be transmitted between serial and parallel forms. In some embodiments, a UART interface is generally used to connect the processor 110 with the wireless communication module 160. For example: the processor 110 communicates with a bluetooth module in the wireless communication module 160 through the UART interface to implement a bluetooth function. In some embodiments, the audio module 170 may transmit an audio signal to the wireless communication module 160 through the UART interface, so as to implement the function of playing music through a bluetooth headset.
MIPI interfaces may be used to connect processor 110 with peripheral devices such as display screen 194, camera 193, and the like. The MIPI interface includes a Camera Serial Interface (CSI), a Display Serial Interface (DSI), and the like. In some embodiments, processor 110 and camera 193 communicate through a CSI interface to implement the capture functionality of electronic device 100. The processor 110 and the display screen 194 communicate through the DSI interface to implement the display function of the electronic device 100.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal and may also be configured as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 110 with the camera 193, the display 194, the wireless communication module 160, the audio module 170, the sensor module 180, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, a MIPI interface, and the like.
The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. It may also be used to connect an earphone and play audio through the earphone. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interface connection relationships between the modules illustrated in this embodiment are merely illustrative, and do not constitute a limitation on the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt an interface connection manner different from that in the above embodiments, or a combination of multiple interface connection manners.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), bluetooth (bluetooth, BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of the electronic device 100 is coupled to the mobile communication module 150 and antenna 2 is coupled to the wireless communication module 160, so that the electronic device 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, where N is a positive integer greater than 1.
The electronic device 100 may implement a shooting function through the ISP, the camera 193, the video codec, the GPU, the display 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened and light is transmitted through the lens to the camera's photosensitive element, where the optical signal is converted into an electrical signal; the photosensitive element transmits the electrical signal to the ISP for processing, which converts it into an image visible to the naked eye. The ISP can also perform algorithm optimization on the noise, brightness, and skin color of the image. The ISP can also optimize parameters such as exposure and color temperature of a shooting scene. In some embodiments, the ISP may be provided in the camera 193.
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or N cameras 193, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals; it can process digital image signals as well as other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform a Fourier transform or the like on the frequency bin energy.
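As a non-limiting sketch of this kind of processing, the following Java example (class and method names are illustrative, not from the original) computes the energy of a single frequency bin of a sampled signal via a discrete Fourier transform:

```java
// Hypothetical sketch: the energy |X[k]|^2 of one DFT frequency bin,
// illustrating the Fourier-transform processing described above.
public class FrequencyBinEnergy {
    // Returns the squared magnitude of the k-th DFT bin of the samples.
    public static double binEnergy(double[] samples, int k) {
        double re = 0.0, im = 0.0;
        int n = samples.length;
        for (int t = 0; t < n; t++) {
            double angle = -2.0 * Math.PI * k * t / n;
            re += samples[t] * Math.cos(angle);
            im += samples[t] * Math.sin(angle);
        }
        return re * re + im * im;
    }

    public static void main(String[] args) {
        // A pure cosine tone that falls exactly on bin 5 of a 64-sample window.
        int n = 64;
        double[] samples = new double[n];
        for (int t = 0; t < n; t++) {
            samples[t] = Math.cos(2.0 * Math.PI * 5 * t / n);
        }
        // The signal energy concentrates in bin 5 rather than bin 7.
        System.out.println(binEnergy(samples, 5) > binEnergy(samples, 7));
    }
}
```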
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as: moving Picture Experts Group (MPEG) 1, MPEG2, MPEG3, MPEG4, and the like.
The NPU is a neural-network (NN) computing processor. By drawing on the structure of biological neural networks, for example, the transfer mode between neurons of the human brain, it processes input information quickly and can also continuously self-learn. Applications such as intelligent recognition of the electronic device 100 can be implemented through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 121 may be used to store computer-executable program code, which includes instructions. The internal memory 121 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (such as audio data, phone book, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 110 executes various functional applications of the electronic device 100 and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor.
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc.
The audio module 170 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110.
The speaker 170A, also called a "horn", is used to convert an audio electrical signal into a sound signal. The electronic device 100 can play music or answer a hands-free call through the speaker 170A.
The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person.
The microphone 170C, also referred to as a "mike", is used to convert sound signals into electrical signals. When making a call or sending voice information, the user can input a sound signal by speaking with the mouth close to the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further be provided with three, four, or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on.
The headphone interface 170D is used to connect a wired headphone. The headphone interface 170D may be the USB interface 130, or may be a 3.5 mm Open Mobile Terminal Platform (OMTP) standard interface or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 180A is used to sense a pressure signal and convert it into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. There are many kinds of pressure sensors 180A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 100 detects the intensity of the touch operation by means of the pressure sensor 180A. The electronic device 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations applied to the same touch position but with different intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
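The pressure-threshold dispatch described above can be sketched as follows. The class name, method name, and threshold value are hypothetical, chosen only for illustration:

```java
// Hypothetical sketch: the same touch position on the short message
// application icon triggers different instructions depending on the
// detected touch intensity, as described above.
public class PressureDispatch {
    // Illustrative "first pressure threshold"; the original gives no value.
    static final double FIRST_PRESSURE_THRESHOLD = 0.5;

    // Returns the instruction for a touch on the short message icon.
    public static String instructionForTouch(double intensity) {
        if (intensity < FIRST_PRESSURE_THRESHOLD) {
            return "view short message";       // lighter press
        } else {
            return "create new short message"; // firmer press
        }
    }

    public static void main(String[] args) {
        System.out.println(instructionForTouch(0.2)); // light touch
        System.out.println(instructionForTouch(0.8)); // firm touch
    }
}
```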
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 180B. The gyro sensor 180B may be used for image stabilization during shooting. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance the lens module needs to compensate according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving stabilization. The gyro sensor 180B may also be used for navigation and somatosensory gaming scenarios.
The air pressure sensor 180C is used to measure air pressure. In some embodiments, electronic device 100 calculates altitude, aiding in positioning and navigation, from barometric pressure values measured by barometric pressure sensor 180C.
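One common way to calculate altitude from a barometric reading is the international barometric formula; the original does not specify the formula used, so the following Java sketch is an assumption, taking the standard sea-level pressure of 1013.25 hPa as reference:

```java
// Hypothetical sketch: altitude estimation from barometric pressure using
// the international barometric formula. The reference sea-level pressure
// is assumed; the original does not specify the calculation.
public class BarometricAltitude {
    static final double SEA_LEVEL_HPA = 1013.25; // standard atmosphere

    // Returns the estimated altitude in meters for a pressure in hPa.
    public static double altitudeMeters(double pressureHpa) {
        return 44330.0 * (1.0 - Math.pow(pressureHpa / SEA_LEVEL_HPA, 1.0 / 5.255));
    }

    public static void main(String[] args) {
        System.out.println(altitudeMeters(1013.25)); // sea level: ~0 m
        System.out.println(altitudeMeters(899.0));   // roughly 1000 m
    }
}
```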
The magnetic sensor 180D includes a Hall sensor. The electronic device 100 may detect the opening and closing of a flip holster using the magnetic sensor 180D. In some embodiments, when the electronic device 100 is a flip phone, the electronic device 100 may detect the opening and closing of the flip according to the magnetic sensor 180D. Features such as automatic unlocking upon flip opening may then be set according to the detected open or closed state of the holster or of the flip cover.
The acceleration sensor 180E may detect the magnitude of the acceleration of the electronic device 100 in various directions (typically along three axes), and may detect the magnitude and direction of gravity when the electronic device 100 is stationary. It may also be used to recognize the posture of the electronic device, and is applied in landscape/portrait screen switching, pedometers, and other applications.
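The landscape/portrait recognition mentioned above can be sketched from the gravity components the accelerometer reports: gravity dominates the axis pointing "down", so comparing axis magnitudes indicates the posture. The class and axis conventions below are illustrative assumptions, not from the original:

```java
// Hypothetical sketch: posture recognition from accelerometer gravity
// components. Assumes the y axis runs along the device's long edge and
// the x axis along its short edge (illustrative convention).
public class OrientationDetector {
    public static String orientation(double ax, double ay) {
        // When stationary, the axis with the larger gravity component
        // points down, indicating how the device is being held.
        return Math.abs(ay) >= Math.abs(ax) ? "portrait" : "landscape";
    }

    public static void main(String[] args) {
        System.out.println(orientation(0.3, 9.7)); // gravity mostly on y
        System.out.println(orientation(9.6, 0.5)); // gravity mostly on x
    }
}
```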
The distance sensor 180F is used to measure distance. The electronic device 100 may measure distance by infrared or laser. In some embodiments, in a shooting scenario, the electronic device 100 may use the distance sensor 180F to measure distance for fast focusing.
The proximity light sensor 180G may include, for example, a light-emitting diode (LED) and a light detector, such as a photodiode. The light-emitting diode may be an infrared light-emitting diode. The electronic device 100 emits infrared light outward through the light-emitting diode, and detects infrared light reflected from nearby objects using the photodiode. When sufficient reflected light is detected, the electronic device 100 can determine that there is an object nearby; when insufficient reflected light is detected, it can determine that there is no object nearby. The electronic device 100 can use the proximity light sensor 180G to detect that the user is holding the electronic device 100 close to the ear during a call, so as to automatically turn off the screen to save power. The proximity light sensor 180G may also be used in holster mode and pocket mode to automatically unlock and lock the screen.
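The screen-off decision described above reduces to a simple threshold test on the reflected light. The following sketch is hypothetical; the threshold value and names are illustrative:

```java
// Hypothetical sketch: an object is judged to be near when enough reflected
// infrared light is detected, and the screen is turned off when the device
// is held to the ear during a call, as described above.
public class ProximityDetector {
    // Illustrative threshold (e.g., raw ADC counts); not from the original.
    static final int REFLECTED_LIGHT_THRESHOLD = 100;

    public static boolean objectNearby(int reflectedLight) {
        return reflectedLight >= REFLECTED_LIGHT_THRESHOLD;
    }

    // Screen turns off only when a call is active and an object is near.
    public static boolean shouldTurnOffScreen(boolean inCall, int reflectedLight) {
        return inCall && objectNearby(reflectedLight);
    }

    public static void main(String[] args) {
        System.out.println(shouldTurnOffScreen(true, 250));  // ear detected
        System.out.println(shouldTurnOffScreen(true, 10));   // nothing near
        System.out.println(shouldTurnOffScreen(false, 250)); // not in a call
    }
}
```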
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
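Adaptive brightness maps the sensed illuminance to a display brightness level. The original does not specify the mapping, so the following sketch assumes a logarithmic curve (perceived brightness is roughly logarithmic in illuminance); the scale factor and range are illustrative:

```java
// Hypothetical sketch: adaptive display brightness from ambient illuminance.
// The logarithmic mapping and constants are assumptions for illustration.
public class AdaptiveBrightness {
    // Maps ambient illuminance (lux) to a brightness level in [0, 255].
    public static int brightnessFor(double lux) {
        double level = 51.0 * Math.log10(1.0 + lux);
        return (int) Math.max(0, Math.min(255, Math.round(level)));
    }

    public static void main(String[] args) {
        System.out.println(brightnessFor(0.0));      // dark room -> 0
        System.out.println(brightnessFor(100.0));    // indoor lighting
        System.out.println(brightnessFor(100000.0)); // sunlight -> capped at 255
    }
}
```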
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can utilize the collected fingerprint characteristics to unlock the fingerprint, access the application lock, photograph the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold, to avoid an abnormal shutdown of the electronic device 100 caused by low temperature. In other embodiments, when the temperature is lower than a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
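The three-threshold strategy above can be sketched as a single decision function. All threshold values below are illustrative assumptions; the original does not specify them:

```java
// Hypothetical sketch of the temperature processing strategy described
// above. Threshold values are illustrative, not from the original.
public class ThermalPolicy {
    static final double THROTTLE_ABOVE_C = 45.0; // reduce processor performance
    static final double HEAT_BELOW_C = 0.0;      // heat the battery
    static final double BOOST_BELOW_C = -10.0;   // boost battery output voltage

    public static String action(double temperatureC) {
        if (temperatureC > THROTTLE_ABOVE_C) return "throttle processor";
        if (temperatureC < BOOST_BELOW_C) return "boost battery voltage";
        if (temperatureC < HEAT_BELOW_C) return "heat battery";
        return "normal";
    }

    public static void main(String[] args) {
        System.out.println(action(50.0));  // hot: thermal protection
        System.out.println(action(-5.0));  // cold: heat the battery
        System.out.println(action(-20.0)); // very cold: boost voltage
        System.out.println(action(25.0));  // normal operation
    }
}
```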
The touch sensor 180K is also called a "touch device". The touch sensor 180K may be disposed on the display screen 194; together, the touch sensor 180K and the display screen 194 form a touch screen, also called a "touchscreen". The touch sensor 180K is used to detect a touch operation applied on or near it. The touch sensor can communicate the detected touch operation to the application processor to determine the type of the touch event. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on the surface of the electronic device 100, at a position different from that of the display screen 194.
The bone conduction sensor 180M may acquire a vibration signal. In some embodiments, the bone conduction sensor 180M may acquire a vibration signal of the human vocal part vibrating the bone mass. The bone conduction sensor 180M may also contact the human pulse to receive the blood pressure pulsation signal. In some embodiments, the bone conduction sensor 180M may also be disposed in a headset, integrated into a bone conduction headset. The audio module 170 may analyze a voice signal based on the vibration signal of the bone mass vibrated by the sound part acquired by the bone conduction sensor 180M, so as to implement a voice function. The application processor can analyze heart rate information based on the blood pressure beating signal acquired by the bone conduction sensor 180M, so as to realize the heart rate detection function.
The keys 190 include a power key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic device 100 may receive key input and generate key signal input related to user settings and function control of the electronic device 100.
The motor 191 may generate a vibration cue. The motor 191 may be used for incoming call vibration cues, as well as for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 191 may also respond to different vibration feedback effects for touch operations applied to different areas of the display screen 194. Different application scenes (such as time reminding, receiving information, alarm clock, game and the like) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
Indicator 192 may be an indicator light that may be used to indicate a state of charge, a change in charge, or a message, missed call, notification, etc.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic device 100 by being inserted into or pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards may be inserted into the same SIM card interface 195 at the same time; the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards, as well as with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the electronic device 100 employs an eSIM, namely, an embedded SIM card. The eSIM card can be embedded in the electronic device 100 and cannot be separated from it.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present invention uses an Android system with a layered architecture as an example to exemplarily illustrate a software structure of the electronic device 100.
Fig. 2 is a block diagram of a software configuration of the electronic apparatus 100 according to the embodiment of the present invention.
The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom.
The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used for managing window programs. The window manager can obtain the size of the display screen, judge whether a status bar exists, lock the screen, intercept the screen and the like.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide communication functions of the electronic device 100. Such as management of call status (including on, off, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short dwell and do not require user interaction, such as notifying of download completion or providing message alerts. The notification manager may also present notifications in the form of a chart or scroll-bar text in the top status bar of the system, such as a notification of an application running in the background, or in the form of a dialog window on the screen. Examples include prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, flashing an indicator light, etc.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing the Android system.
The core library comprises two parts: one part contains the functions to be called by the java language, and the other part is the core library of android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files. The virtual machine performs functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), Media Libraries (Media Libraries), three-dimensional graphics processing Libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as MPEG4, h.264, MP3, AAC, AMR, JPG, PNG, and the like.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The inner core layer at least comprises a display driver, a camera driver, an audio driver and a sensor driver.
The workflow of the software and hardware of the electronic device 100 is illustrated below with an example.
When the touch sensor 180K of the electronic device 100 receives a touch operation, a corresponding hardware interrupt is issued to the kernel layer. The kernel layer processes the touch operation into a raw input event (including information such as the touch coordinates and the time stamp of the touch operation). The raw input events are stored at the kernel layer. The application framework layer acquires the raw input event from the kernel layer and identifies the control corresponding to the input event.
The following illustrates application scenarios and implementation flows of the embodiments of the present application by way of non-limiting examples.
The first application scenario is an application scenario of cross-device interaction.
In the present application scenario, a mobile phone 310, a display screen 320, a notebook 330, and a tablet 340 are included. In the scenario shown in fig. 3A, the mobile phone 310 is taken as a search device, and other devices are taken as connectable devices for example. The handset 310 may establish a communication connection with one or more of the other devices. Other devices include a display screen 320, a notebook 330, and a tablet 340. It should be understood that in other application scenarios, other devices may serve as the search device and the handset may serve as the connectable device. The roles of the devices may be interchanged according to the actual situation of the application scenario, which is not limited in this application and is only an exemplary illustration here.
In the scenario illustrated in fig. 3A, the handset 310 is a searching device that searches for surrounding connectable devices, and the handset 310 may wirelessly communicate with any one or more of the other devices using wireless communication technologies supported by those devices. The wireless communication technology includes, but is not limited to, Wi-Fi, BT, IR, GPS, high performance wireless LAN (HiperLAN), radio frequency (RF), wireless USB (WUSB), ultra wide band (UWB), or the like. It should be understood that in practical applications, other wireless communication technologies may also be employed.
In some implementations, in a situation where the display screen 320 and the mobile phone 310 have been connected before, that is, the mobile phone 310 and the display screen 320 have a historical connection record, and the display screen 320 is located within the wireless communication range of the mobile phone 310, the mobile phone 310 can automatically establish a communication connection with the display screen 320. If the mobile phone 310 needs to connect to the notebook 330 in addition to the display screen 320, the handset 310 receives a click operation by the user to add another device, and searches for connectable devices in the periphery as shown in fig. 3B. The mobile phone 310 displays a layout interface of the peripheral connectable devices, receives the notebook 330 selected by the user in the layout interface, and establishes a communication connection with the notebook 330.
In other implementations, in the case that there is no historical connection record between other devices and the mobile phone 310, the mobile phone 310 listens for the trigger interaction event and searches for connectable devices in the vicinity. The mobile phone 310 displays a layout interface of peripheral connectable devices, receives a target device selected by a user in the layout interface, and establishes a communication connection with the target device.
In other implementations, the handset 310 has no historical connection records with other devices. The handset 310 listens for a triggering interaction event to search for surrounding connectable devices. The mobile phone 310 displays a layout interface of the peripheral connectable devices, receives a target device selected by a user in the layout interface, such as the display screen 320 and the notebook 330, and the mobile phone 310 establishes a communication connection with the target device.
In these implementations, an interaction event refers to an event that performs data interaction across devices, including but not limited to data sharing, device control, screen or voice projection across devices, and the like. The interaction event may be triggered by the mobile phone detecting that a preset condition is met, or may be triggered by the mobile phone detecting that a user performs a preset operation. The preset condition may include reaching a preset time, and the like. The preset operation includes, but is not limited to, a voice input operation, a touch operation, an air gesture operation, a cursor control operation, or the like.
After the mobile phone 310 establishes communication connection with the target device, data interaction between the mobile phone 310 and the target device can be realized, so that data sharing from the mobile phone 310 to the target device, control of the target device by the mobile phone 310, screen projection from the mobile phone 310 to the target device, sound projection from the mobile phone 310 to the target device, and the like are realized.
Based on these implementations, in some other implementations, the cell phone 310 generates the haptic signal and/or the visual signal after the cell phone 310 receives the target device selected by the user in the layout interface. Tactile signals include, but are not limited to, sound or vibration, etc. For example, a speaker of the handset 310 emits a particular sound, such as a droplet sound. As another example, the motor of the handset 310 feeds back a vibration cue. The visual signal includes a dynamic cue or a text cue, etc. The dynamic prompt may be, for example, a device identifier corresponding to the target device flashes, and the text prompt may be, for example, a text prompt that a certain device is connected to be displayed. The user is prompted to complete the selection of the target equipment through the tactile signals and/or the visual signals, and the interaction experience degree is improved.
In other implementations, the connectable devices displayed in the layout interface may also be limited to devices logged into the same account as the cell phone.
In an application scenario of cross-device interaction, the handset 310 searches for connectable devices and displays a layout interface of the connectable devices. The layout interface of the connectable device provided by the embodiments of the present application will be described in detail below.
The cell phone obtains the spatial relationship, including distance and included angle, of each connectable device (including the display screen 320, the notebook 330, and the tablet 340) to the cell phone 310. The distance refers to the relative or absolute distance between each connectable device and the cell phone. The included angle is the angle between the line connecting each connectable device and the cell phone, and the orientation of the cell phone.
In the application scenarios, embodiments, and examples that follow in the present application, for convenience of description, the included angle is calculated as the angle swept by rotating clockwise from the straight line of the mobile phone's orientation to the line connecting the mobile phone and the connectable device. It should be understood that this is not intended as a specific limitation on the application.
As shown in fig. 4A, a point a represents a positioning point of the mobile phone 310, a point B represents a positioning point of the display screen 320, and a distance between the point a and the point B is a; point C represents the anchor point of notebook 330, and the distance between points A and C is b. Continuing to refer to fig. 4A, assuming that the orientation of the mobile phone 310 is the direction indicated by the arrow X in fig. 4A, the line of the arrow X rotates clockwise by the angle θ 1 to the line connecting the point a and the point B, and the angle θ 1 is the included angle between the line connecting the display screen 320 and the mobile phone 310 and the line of the orientation of the mobile phone 310, which may be referred to as the included angle between the mobile phone 310 and the display screen 320. The line of the arrow X is rotated clockwise by an angle θ 2 to the line connecting the point a and the point C, and the angle θ 2 is an included angle between the line connecting the notebook computer 330 and the mobile phone 310 and the line connecting the orientation of the mobile phone 310, and may be referred to as an included angle between the mobile phone 310 and the notebook computer 330. The handset 310 displays the layout interface of the connectable devices according to the spatial relationship. Specifically, the layout interface of the connectable devices is mapped according to the spatial relationship between each connectable device and the cell phone.
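The spatial relationship just described (a distance such as a or b, and a clockwise included angle such as θ1 or θ2) can be computed from 2-D anchor-point coordinates. The following is a minimal sketch, not code from the patent; the coordinate convention (heading 0 along the +y axis) and the function names are assumptions:

```python
import math

def clockwise_angle(phone_xy, phone_heading_deg, device_xy):
    """Clockwise angle, in degrees, from the phone's orientation line to the
    line connecting the phone's anchor point (e.g. point A) to the device's
    anchor point (e.g. point B or C)."""
    dx = device_xy[0] - phone_xy[0]
    dy = device_xy[1] - phone_xy[1]
    # Bearing of the device, measured clockwise from the +y axis, which is
    # taken as heading 0 by convention here.
    bearing = math.degrees(math.atan2(dx, dy)) % 360
    return (bearing - phone_heading_deg) % 360

def anchor_distance(phone_xy, device_xy):
    """Absolute distance between the two anchor points (e.g. the length a)."""
    return math.hypot(device_xy[0] - phone_xy[0], device_xy[1] - phone_xy[1])
```

With the phone at the origin facing +y, a device straight ahead yields an included angle of 0°, and a device to the right yields 90°, matching the clockwise convention above.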
It is noted that in some implementations, the orientation of the handset 310 can be set as the direction shown by arrow X1 in fig. 4B, i.e., the ray direction along the long side at the top of the mobile phone is taken as the orientation of the mobile phone. In this case, the included angle between the mobile phone 310 and the display screen 320 is θ1. In other implementations, the orientation of the handset 310 can be set as the direction shown by arrow X2 in fig. 4C; in this case, the included angle between the mobile phone 310 and the display screen 320 is θ2. It should be understood that the orientation of the handset may be set by default in the system or may be customized by the user. The orientation of the mobile phone can be set according to needs and/or habits, and may be set to directions other than the arrow directions shown in fig. 4B and 4C, for example, the direction shown by arrow X3 or X4 in fig. 4D. The present application does not specifically limit the orientation. Preferably, to reduce the difficulty for the user and achieve higher operation efficiency, the orientation of the mobile phone can be set to the direction shown by arrow X1 in fig. 4B.
The distance and/or angle between the mobile phone 310 and the display screen 320 is used as an exemplary description in fig. 4A, 4B, and 4C, and it should be understood that the distance and angle between the notebook 330 and the tablet 340 and the mobile phone 310 can be analogized with reference to these examples.
In the case where the orientation of the cellular phone 310 is set to a certain direction, the orientation is fixed relative to the cellular phone body, i.e., it does not change. When the positions of the mobile phone 310 and the connectable devices are unchanged and the body of the mobile phone 310 is turned over or rotated, the orientation of the mobile phone 310 still does not change relative to the body, so the included angle between the mobile phone 310 and each connectable device may change. Because the distribution of the connectable devices in the layout interface is mapped according to the spatial relationship between the mobile phone and the connectable devices, and the spatial relationship includes the distance and the included angle, the distribution of the connectable devices in the layout interface changes along with the change of the included angle between the connectable devices and the mobile phone.
As a non-limiting example, as shown in fig. 5, the handset 310 is oriented along the ray direction along the long side of the handset, i.e., in the direction indicated by the black arrow. The body of the mobile phone 310 is rotated clockwise by an angle, and the orientation of the mobile phone 310 changes from not pointing at the display screen to pointing at the display screen. The layout interface of the handset 310 changes from the layout interface 3101 shown in diagram A in fig. 5 to the layout interface 3102 shown in diagram B in fig. 5. As can be seen from fig. 5, the layout interface changes as the spatial relationship between the handset and the connectable devices changes.
More generally, as the position of at least one of the handset 310 and the connectable devices changes, and/or the angle between the handset 310 and any one or more of the connectable devices changes, the distribution of the connectable devices in the layout interface displayed by the handset 310 changes with these changes.
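Mapping the spatial relationship into on-screen layout coordinates can be sketched as below (a hypothetical illustration; the scale, screen center, and the `to_screen` name are assumptions, not from the patent). The search device sits at the center, its orientation points straight up, and included angles grow clockwise, so any change in distance or included angle moves the corresponding device identifier on screen:

```python
import math

def to_screen(distance_m, angle_deg, pixels_per_meter=40, center=(200, 200)):
    """Map a connectable device's (distance, included angle) to a layout
    position: the search device sits at `center`, its orientation points
    straight up on screen, and included angles grow clockwise."""
    rad = math.radians(angle_deg)
    x = center[0] + distance_m * pixels_per_meter * math.sin(rad)
    y = center[1] - distance_m * pixels_per_meter * math.cos(rad)
    return (round(x), round(y))
```

A device 1 meter away at 0° lands directly above the center; the same device at 90° lands directly to its right, so rotating the body (which changes every included angle) redraws the whole distribution.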
In some implementations, each connectable device in the layout interface can be identified by a point. The distribution of the points corresponding to the connectable devices in the layout interface is mapped according to the spatial relationship between the connectable devices and the mobile phone. On the basis of these implementations, in some other implementations, in order to better enable a user to distinguish between the connectable devices and improve operation efficiency, the layout interface may further include an identification description corresponding to each connectable device. Identification descriptions include, but are not limited to: device name and/or device type, etc. As an example, continuing with fig. 5, the layout interface includes the device type corresponding to each connectable device, such as display screen, tablet computer, or notebook.
In other implementations, each connectable device in the layout interface may be identified by an image or a symbol, or the like. The distribution of the images or symbols corresponding to the connectable devices in the layout interface is mapped according to the spatial relationship between the connectable devices and the mobile phone. The images may be, for example, appearance diagrams or appearance schematic diagrams of the connectable devices. The symbols may be letters or numbers, or a combination of letters and numbers, etc.
It should be noted that the device identifiers corresponding to the connectable devices in the layout interface may be in other forms besides points, images or symbols. For example, the device identification may take the form of a pattern of a certain geometry, or the like. In addition, the device identifier and/or identifier description may be displayed statically or dynamically, for example, by flashing.
In other implementations, the layout interface may include an identification description for each connectable device. Providing the identification description may help the user find the target device more quickly. Identification descriptions include, but are not limited to: device name and/or device type, etc. The device name may be, for example, a friendly name of the connectable device, such as "Tom's display screen". The friendly name can be a default name of the system or a user-defined name.
In other implementations, the layout interface may include both a device identification and an identification description for each connectable device. The device identification and identification description for a connectable device may be displayed separately, for example as shown in fig. 5. The device identifier and the identifier description may also be displayed in a fused manner, for example, a rectangular frame is used as the device identifier, and the device description is displayed in the rectangular frame.
In other implementations, the layout interface may display the mobile phone, that is, display the device identifier and/or the identifier description corresponding to the search device, in addition to the connectable device.
In other implementations, the layout interface may also include an identification of the orientation of the cell phone, i.e., the search device. The orientation mark can be presented in a static form such as a line or an arrow, and can also be presented in a dynamic form such as a flashing line or an arrow. The display duration of the orientation mark may be consistent with the display duration of the display layout interface, or may be shorter than the display duration of the display layout interface. For example, the heading indicator is displayed for a period of time from the beginning of displaying the layout interface and then disappears. As another example, the heading indicator is displayed for a period of time during which the user rotates, turns, or moves the search device, and then disappears.
As a non-limiting example, as shown in fig. 6, the handset 310 is oriented along the ray direction along the long side of the handset, i.e., in the direction indicated by the arrow. The orientation mark in the layout interface is a dashed line. The body of the mobile phone 310 is rotated clockwise by an angle, and the orientation of the mobile phone 310 changes from not pointing at the display screen to pointing at the display screen. In the layout interface 3103 shown in diagram A in fig. 6, the point corresponding to the display screen is not located on the dashed line 31031. In the layout interface 3104 shown in diagram B of fig. 6, the point corresponding to the display screen is located on the dashed line 31041. As can be seen from fig. 6, the orientation mark in the layout interface, i.e., the position of the dashed line, does not change as the spatial relationship between the handset and the connectable devices changes; but the device identifier and identifier description corresponding to each connectable device, that is, the display position of each connectable device's point and device type, does change. Because the orientation mark is displayed in the layout interface, the user can more intuitively confirm which device the orientation of the mobile phone points at, conveniently relate the layout interface to the actual scene, and select the target device accurately and efficiently.
In the application scenarios, embodiments, implementations, or examples that follow the present application, for convenience of description, a mobile phone is still taken as an example of the search device, and the orientation of the mobile phone is taken as an example of a ray direction along the long side direction of the mobile phone, i.e., a direction indicated by an arrow shown in fig. 5 or fig. 6. It should be understood that no specific limitation with respect to the embodiments or implementations of the application is intended.
In the embodiment of the present application, the interface form of the layout interface includes, but is not limited to: circular layout, semicircular layout, circular arc layout or minimized layout, etc. It should be noted that the exemplary descriptions of the layout interface in the form of each interface are not intended to limit the scope of the present application. Indeed, various modifications, combinations, substitutions, or alterations are contemplated in connection with the illustrative description of the layout interface in the form of various interfaces, all without departing from the application.
1. Circular layout
In the circular layout, the device identifiers and/or identifier descriptions corresponding to the connectable devices can be distributed over the circular area and the circular ring areas corresponding to different distance intervals according to the absolute positioning of the connectable devices in space; they may also be distributed over the circular area and the circular ring areas corresponding to different distance intervals according to the relative positioning of the connectable devices in space. The layout interface of the circular layout includes one or more display regions. A single display area may, for example, comprise a circular display area. A plurality of display areas may include, for example, a circular display area and one or more annular display areas. As a non-limiting example of multiple display regions, a layout interface of a circular layout includes a central circular region, one or more concentric circular ring regions located at the periphery of the central circular region, and one or more device identifiers displayed on the central circular region and/or the concentric circular ring regions. Obviously, when no connectable device exists in the periphery of the search device, the layout interface does not display any device identifier of a connectable device. The central circular region and each of the different annular regions correspond to different distance intervals, and the first distance between the connectable device corresponding to any device identifier and the mobile phone falls within the distance interval of the central circular region or annular region where that device identifier is located. As another non-limiting example of multiple display regions, a layout interface of a circular layout may refer to the layout interfaces shown in figs. 5 and 6.
In a circular layout formed by the absolute positioning of connectable devices in space, device identifiers corresponding to different connectable devices may coincide. For coinciding device identifiers, proceeding in a certain direction, for example from left to right as shown in fig. 7, two adjacent device identifiers 71 and 72 are merged at each step, with the midpoint between the two used as the coordinate of the new device identifier 73, until all coinciding device identifiers are merged; the merged identifier is then replaced by the number of coincident device identifiers. It should be understood that in layout interfaces of other interface forms, when absolute positioning is adopted and the device identifiers corresponding to different connectable devices coincide, this merging mode may also be adopted for display; this is not described again later.
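The pairwise merging above can be sketched as follows. This is a hypothetical illustration, not code from the patent; the `merge_coincident` name, the (x, y, count) representation, and the coincidence test are assumptions:

```python
def merge_coincident(points, min_sep):
    """Single left-to-right pass merging coinciding device identifiers.
    Each point is (x, y, count); an identifier closer than min_sep on both
    axes to the previous merged identifier is folded into it at their
    midpoint, accumulating the count that is later rendered as a number
    in place of the merged identifiers."""
    merged = []
    for x, y, count in sorted(points):
        if merged:
            mx, my, mc = merged[-1]
            if abs(x - mx) < min_sep and abs(y - my) < min_sep:
                merged[-1] = ((mx + x) / 2, (my + y) / 2, mc + count)
                continue
        merged.append((x, y, count))
    return merged
```

For example, two identifiers at x = 0 and x = 1 (with min_sep = 2) collapse into one identifier at x = 0.5 carrying the count 2, while a third identifier at x = 10 is left untouched.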
In some embodiments, for a circular layout interface, an upper limit on the number of connectable devices that can be displayed on the circumference of each display area may also be set by system default or user customization. This prevents the layout interface from displaying so many device identifiers and/or identifier descriptions that they overlap and/or overcrowd, which would make it harder for the user to select the target device; misoperation by the user can thus be avoided. In some implementations, if the number of connectable devices corresponding to a certain display area searched by the mobile phone exceeds the upper limit, in some examples only connectable devices whose included angles fall within a certain angle range may be displayed, and connectable devices in other angle ranges may not be displayed. For example, connectable devices whose included angles fall within 0° to 90° or within 270° to 360° are displayed, and the remaining connectable devices whose included angles fall within 90° to 270° are not displayed. In other examples, for the several connectable devices that exceed the upper limit and whose included angles are closest to 180°, the corresponding device identifiers and/or identifier descriptions may not be displayed. In other examples, the several connectable devices that exceed the upper limit and whose included angles are closest to 180° may be merged and displayed as a number corresponding to the number of such devices.
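The drop-closest-to-180° strategy above can be sketched as follows — a minimal illustration under stated assumptions (the `limit_ring` name and the (angle, name) tuple representation are not from the patent). It keeps at most a fixed number of identifiers per display area, preferring devices whose included angle is farthest from 180°, i.e., closest to the direction the search device faces:

```python
def limit_ring(devices, max_count):
    """Enforce an upper limit on identifiers shown in one display area.
    `devices` is a list of (included_angle_deg, name); when over the limit,
    the devices whose included angles are closest to 180 degrees (roughly
    behind the search device) are dropped first."""
    kept = sorted(devices, key=lambda d: abs(d[0] - 180), reverse=True)[:max_count]
    return sorted(kept)  # restore angular order for display
```

With four devices at 0°, 90°, 179°, and 270° and a limit of three, the device at 179° is the one dropped.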
In other embodiments, even when a circular layout is employed, the user can customize the span and/or range of included angles between the displayed connectable devices and the search device. For example, it may be arranged to display the device identifiers and/or identifier descriptions of connectable devices within the angle ranges of 0° to 90° and/or 270° to 360°, for a total corresponding angle span of 180°.
As a non-limiting example, fig. 8A and 8B show layout interfaces of two different circular layouts for the same application scenario.
In this application scenario, the handset searches for 5 connectable devices: a first device, a second device, a third device, a fourth device, and a fifth device. The mobile phone obtains the distance (unit: meter) and the included angle (unit: degree, symbol: °) between itself and each of the 5 connectable devices, hereinafter referred to as the distance and the included angle of each connectable device, as shown in table one below.
               | First device | Second device | Third device | Fourth device | Fifth device
Distance       | 1.7          | 5.2           | 3.2          | 3             | 2.5
Included angle | 5            | 290           | 87           | 89            | 91
As shown in fig. 8A and 8B, the layout interface of the circular layout includes a display circle region divided by three concentric circles into three regions, i.e., a central circle region 81, a first concentric circular ring region 82 and a second concentric circular ring region 83. The layout interface for the circular layout also includes device identifications for the 5 connectable devices displayed on the display circle region. The distance threshold values of three concentric circles sequentially corresponding from inside to outside are 2 meters, 4 meters and 6 meters, the first distance interval corresponding to the central circle area 81 is 0-2 meters, the second distance interval corresponding to the first concentric ring area 82 is 2-4 meters, and the third distance interval corresponding to the second concentric ring area 83 is 4-6 meters. In the layout interface of the circular layout shown in fig. 8A and 8B, the location point of the search device, i.e., the mobile phone, is at the center position of the concentric circles. The 360 ° circumference of the display circle corresponds to a maximum angle of 360 ° for the connectable device.
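The mapping from a device's distance to its display region can be sketched as a simple lookup. This is a hypothetical illustration, not from the patent: the `ring_index` name and the threshold tuple are assumptions based on the 2/4/6-meter thresholds above, and boundary values (e.g. exactly 2 meters) are assigned to the inner region here, one of the choices the application explicitly leaves open:

```python
def ring_index(distance_m, thresholds=(2, 4, 6)):
    """Display region for a device: 0 = central circle area (0-2 m),
    1 = first concentric ring area (2-4 m), 2 = second concentric ring
    area (4-6 m); None when beyond the outermost circle."""
    for i, t in enumerate(thresholds):
        if distance_m <= t:
            return i
    return None
```

Applied to the distances in table one (1.7, 5.2, 3.2, 3, 2.5 meters), this places the first device in the central circle, the second device in the outermost ring, and the remaining three in the middle ring, matching figs. 8A and 8B.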
Fig. 8A distributes the device identifiers corresponding to the connectable devices over the central circular region and/or the concentric circular ring regions according to the absolute positions of the connectable devices in space. In the layout interface of the circular layout shown in fig. 8A, the device identifiers corresponding to the 5 connectable devices are displayed distributed over the display circle area according to the distance and included angle of each of the 5 connectable devices. Because three connectable devices, i.e., the third device, the fourth device and the fifth device, have coinciding device identifiers, the device identifiers of these three connectable devices are merged: one device identifier 801 is displayed on the display circle area, and the device identifier 801 is replaced by the number "3". The device identifiers corresponding to the other two connectable devices are marked separately, that is, the device identifier 802 corresponding to the first device and the device identifier 803 corresponding to the second device.
Fig. 8B distributes the device identifiers corresponding to the connectable devices over the central circular region and/or the concentric circular ring regions according to the relative and/or absolute positioning of the connectable devices in space. In the layout interface of the circular layout shown in fig. 8B, the mobile phone determines which distance interval the distance of each connectable device belongs to. The distance of the first device, 1.7 meters, belongs to the first distance interval; unlike in fig. 8A, in fig. 8B the device identifier 802 corresponding to the first device is completely displayed in the central circular area 81 according to the relative positioning of the first device. The distance of the second device, 5.2 meters, belongs to the third distance interval; according to the included angle of the second device, 290°, as in fig. 8A, the device identifier 803 corresponding to the second device is displayed in the second concentric circular ring area 83 according to the absolute positioning of the second device. The respective distances of the third device, the fourth device and the fifth device, 3.2 meters, 3 meters and 2.5 meters, all belong to the second distance interval, and their respective included angles, 87°, 89° and 91°, are close to each other; that is, the differences between the included angles are smaller than or equal to a preset threshold (in other words, the differences satisfy an angle condition of being smaller than or equal to the preset threshold), and the preset threshold may be, for example, 5°.
On one hand, this avoids the user mistakenly selecting a target device due to overlap when the device identifiers corresponding to the three devices are displayed according to absolute positioning, improving interaction accuracy; on the other hand, it avoids requiring the user to select the target device through complex manual operations, improving interaction efficiency. Unlike in fig. 8A, in fig. 8B the device identifier 801a corresponding to the third device, the device identifier 801b corresponding to the fourth device, and the device identifier 801c corresponding to the fifth device are dispersedly displayed in the first concentric circular ring area 82 according to the relative positioning of the three devices.
It should be noted that, to distinguish them from other device identifiers that may be displayed in the second concentric circular ring area 82, the device identifiers 801a, 801b, and 801c may, in some examples, be displayed dispersedly at positions close to their included angles, that is, near the position on the circumference corresponding to 90°. In other examples, the device identifiers 801a, 801b, and 801c may be displayed dispersedly in any area of the second concentric circular ring area 82, arranged clockwise according to the size of their included angles.
It should be understood that in the example shown in fig. 8B, a distance equal to 2 meters may belong to either the first distance interval or the second distance interval, and a distance equal to 4 meters may belong to either the second distance interval or the third distance interval. This example imposes no particular limitation; in actual use the assignment may follow the system default or be user-defined.
Because the radiation angle of the circular layout is 360°, the circular layout can correspond to an included angle range of 0° to 360° for the connectable devices. In the case where no maximum distance threshold is defined, or where one is defined but is greater than or equal to the communication range of the search device, the search device may display all connectable devices around it that it searches for. In the case where a maximum distance threshold is defined and is smaller than or equal to the communication range of the search device, the search device may display any connectable device whose distance from the search device is smaller than or equal to the maximum distance threshold, at any included angle. The layout interface of the circular layout comprises a circular display area formed by one circle, or a plurality of display areas divided by a plurality of concentric circles. The distance intervals of the plurality of display areas may be continuous or discontinuous; each display area corresponds to one distance interval, and the spans of the distance intervals may be equal or unequal. It should be noted that, in some implementations, the distance threshold corresponding to the concentric circle with the largest radius is the maximum distance threshold, which may be set by system default or user-defined. It should be appreciated that, in some implementations, when the maximum distance threshold is user-defined, it may be less than, equal to, or greater than the communication range of the search device. It should also be understood that, in some implementations, the distance interval corresponding to each display area may be set by system default or user-defined.
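The assignment of a connectable device to the central circle or to a concentric ring area can be sketched as follows. This is a minimal illustration, not the patented implementation; the interval boundaries of 2 m and 4 m and the 6 m maximum distance threshold are assumed values consistent with the distances in the examples of figs. 8A and 8B.

```python
import bisect

def ring_index(distance_m, boundaries=(2.0, 4.0), max_threshold=6.0):
    """Map a device's distance to a ring index: 0 is the central circle,
    1 the second concentric ring area, 2 the third, and so on.

    `boundaries` and `max_threshold` are illustrative assumptions; the
    text leaves them to system defaults or user customization.
    """
    if distance_m > max_threshold:
        return None  # beyond the maximum distance threshold: not displayed
    # bisect_left places a boundary value in the inner interval, matching
    # the note that a 2 m distance may belong to either adjacent interval.
    return bisect.bisect_left(boundaries, distance_m)
```

With the example distances, the first device (1.7 m) lands in the central circle, the third to fifth devices (2.5 to 3.2 m) in the second ring, and the second device (5.2 m) in the third ring.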
2. Semicircular layout
In the semicircular layout, the device identifiers corresponding to the connectable devices may be distributed over semicircles of different distance intervals according to the absolute positioning and/or relative positioning of the connectable devices in space. Because the maximum radiation angle of the semicircular layout is 180°, the maximum span of the included angles it can display for connectable devices is 180°. That is, the span of the included angle range may be set by system default, or user-defined to any value smaller than or equal to 180°, such as 180°.
In addition, the specific included angle range may be set by system default or user-defined. For example, the device identifiers and/or identifier descriptions corresponding to connectable devices within the angle ranges of 0° to 90° and 270° to 360° may be displayed; each range spans 90°, for a total span of 180°.
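The wrap-around angle check described above (a 0° to 90° range plus a 270° to 360° range) can be sketched as a simple membership test. The default ranges here are the example values from the text and would in practice come from system defaults or user customization.

```python
def in_angle_range(angle_deg, ranges=((0.0, 90.0), (270.0, 360.0))):
    """Return True if a device's included angle falls in any configured
    range. The default ranges mirror the 180-degree-total example above.
    """
    a = angle_deg % 360.0  # normalize so e.g. 360 and 0 are treated alike
    return any(lo <= a <= hi for lo, hi in ranges)
```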
In some embodiments, similar to the layout interface of the circular layout, for the layout interface of the semicircular layout the maximum distance threshold between a displayed connectable device and the search device may also be set by system default or user-defined, and/or an upper limit may be set on the number of device identifiers or identifier descriptions displayed on each semicircle.
As a non-limiting example, a layout interface of the semicircular layout may be seen in diagram A or diagram B of fig. 9. The layout interface shown in diagram A of fig. 9 is displayed in the middle area of the mobile phone, while the layout interface shown in diagram B of fig. 9 is displayed in the lower area of the mobile phone.
3. Arc layout
In the arc layout, the device identifiers and/or identifier descriptions corresponding to the connectable devices may be distributed over arcs in different distance intervals according to the absolute positioning and/or relative positioning of the connectable devices in space. The maximum span of the included angles that the arc layout can display for connectable devices is its maximum radiation angle. The arc layout comprises one arc or a plurality of arcs arranged in sequence, or comprises one or a plurality of arc-shaped display areas. The arcs may be, for example, circular arcs; in that case the arc layout is a circular-arc layout.
In some embodiments, the user may also customize the included angle range between displayed connectable devices and the search device, and/or the maximum distance threshold between displayed connectable devices and the search device, and/or the upper limit of the number displayed on each arc, and/or the distance interval corresponding to each arc, and so on. It will be appreciated that the span of the configured angle range is less than or equal to the maximum radiation angle.
In some embodiments, connectable devices whose included angles satisfy the preset angle range may be displayed, and connectable devices whose included angles do not satisfy the preset angle range may not be displayed. In some other embodiments, connectable devices whose included angles satisfy the preset angle range may be displayed, while for the other connectable devices that do not satisfy the preset angle range, the device identifications and/or identification descriptions may be replaced by the total number of those devices. In still other embodiments, when an upper limit on the number is exceeded, the connectable devices beyond that limit whose included angles are closest to 180° may be combined into a number corresponding to the count of those devices for display.
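The last of these behaviors, folding the devices beyond the upper limit whose included angles are closest to 180° into a count badge, can be sketched as follows. The tuple representation of a device as (identifier, included angle) is an assumption for illustration only.

```python
def apply_arc_limit(devices, upper_limit):
    """Given (device_id, included_angle_deg) pairs on one arc, keep at
    most `upper_limit` individual identifiers and fold the rest -- those
    whose included angles are closest to 180 degrees -- into one count.

    Returns (shown_devices, folded_count).
    """
    if len(devices) <= upper_limit:
        return list(devices), 0
    # Rank devices by how close their included angle is to 180 degrees.
    by_edge = sorted(devices, key=lambda d: abs(d[1] - 180.0))
    folded = by_edge[:len(devices) - upper_limit]  # closest to 180: fold
    shown = [d for d in devices if d not in folded]
    return shown, len(folded)
```

For example, with six devices on an arc and an upper limit of three, the three whose included angles lie nearest 180° are replaced by the number 3.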
As a non-limiting example, a layout interface of the arc layout may be seen in the layout interface shown in fig. 10. The layout interface comprises three arcs, each corresponding to one distance interval; that is, each arc corresponds to one distance condition.
For example, the preset angle range includes: 0° to 15°, and 345° to 360°. The mobile phone searches for 7 connectable devices that belong to the preset angle range, and these are displayed on the arcs of the corresponding distance intervals according to their respective distances. The mobile phone also searches for 6 connectable devices that do not belong to the preset angle range; these are combined according to their respective distances and then displayed on the arcs of the corresponding distance intervals. Two of the arcs each include 3 connectable devices that are not within the preset angle range. For example, the device identifier 1001 is replaced with the number 3, indicating that the number of combined connectable devices on that arc which do not satisfy the preset angle range is 3.
As another example, an upper limit on the number is set for each of the three arcs. On two of the arcs the number of connectable devices exceeds the maximum number of device identifiers the arc can display, and the connectable devices beyond the upper limit whose included angles are closest to 180° may be combined into a number corresponding to their count for display. For instance, if the number of connectable devices belonging to the distance interval corresponding to the middle arc is 6, exceeding the arc's display upper limit of 3, then the 3 connectable devices among the 6 whose included angles are closest to 180° are combined into the number 3, corresponding to the count of 3 devices, for display.
For another example, no upper limit on the number of devices is set for the three arcs, and the preset angle range includes: 0° to 15°, and 345° to 360°. The mobile phone searches for 13 connectable devices that belong to the preset angle range, and these are displayed on the arcs of the corresponding distance intervals according to their respective distances. Taking the middle arc as an example, 6 connectable devices belong to the distance interval corresponding to the middle arc. For these 6 devices, the included angle difference between each pair of adjacent connectable devices is judged in turn: if the difference is smaller than a preset threshold, for example 5°, the device identifications of the two adjacent devices are merged, and the midpoint of their included angles is used as the included angle of the new device identification; the difference between the included angle of the new identification and that of the next adjacent connectable device is then judged to decide whether to merge again, and so on until all coinciding device identifications have been merged, after which the merged identifications are replaced by the number of coinciding devices. As shown in fig. 10, the device identifications of 3 of the 6 connectable devices belonging to the middle arc are combined into the number 3, corresponding to the count of 3 devices, for display.
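The adjacent-angle merging procedure described above can be sketched as follows; this is a minimal illustration that assumes the 5° threshold from the example and represents each resulting entry as a (display angle, member count) pair, where a count greater than 1 would be rendered as a number instead of individual identifiers.

```python
def merge_overlapping(angles, threshold=5.0):
    """Merge adjacent device identifiers whose included angles differ by
    less than `threshold` degrees, as in the middle-arc example above.

    `angles` are included angles in degrees; they are processed in
    ascending order so "adjacent" means neighbors along the arc.
    """
    merged = []
    for a in sorted(angles):
        if merged and a - merged[-1][0] < threshold:
            prev_angle, count = merged[-1]
            # Use the midpoint of the previous (possibly already merged)
            # angle and the current one as the new display angle.
            merged[-1] = ((prev_angle + a) / 2.0, count + 1)
        else:
            merged.append((a, 1))
    return merged
```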
As yet another example, an upper limit on the number is set for each of the three arcs. On two of the arcs the number of connectable devices exceeds the maximum number of device identifiers the arc can display, and the connectable devices beyond the upper limit whose included angles are closest to 180° may be combined into a number corresponding to their count for display. Taking the middle arc as an example, 6 connectable devices belong to the distance interval corresponding to the middle arc; since this exceeds the arc's display upper limit of 3, the 3 devices among the 6 whose included angles are closest to 180° are combined into the number 3, corresponding to the count of 3 devices, for display. The other 3 connectable devices belonging to the middle arc are displayed distributed along the middle arc; when the included angle between adjacent connectable devices is smaller than a preset threshold, for example 10°, the device identifiers corresponding to the 3 connectable devices are displayed dispersedly on the middle arc to avoid the identifiers coinciding.
It should be noted that, as shown in fig. 10, when the mobile phone receives a click operation of a user on the device identifier 1001, that is, the number 3, the mobile phone may display the auxiliary device list 101, and display the device identifiers and identifier descriptions corresponding to the 3 connectable devices corresponding to the number 3 in the auxiliary device list 101.
It should be understood that in layout interfaces of other interface types, connectable devices that cannot be displayed individually may be combined for display in the manner illustrated in fig. 10, including but not limited to connectable devices that do not fall within the included angle range and/or that exceed the upper limit on the number, etc. In the embodiments of the application, at least one of a distance limiting range, an included angle limiting range, and an upper limit on the number of devices on a given circle or arc may be set in the layout interface, so as to avoid displaying too many device identifiers on the layout interface, which would cause overlap and/or overcrowding, make it difficult for the user to select the target device, and lead to misoperation.
In some implementations, if the orientation of the search device points to one or more connectable devices, the layout interface may display a device identifier corresponding to a connectable device closest to the search device in the one or more connectable devices. As a non-limiting example, the layout interface may be as shown in fig. 11, where the orientation of the mobile phone points to a device identifier 111 corresponding to a connectable device closest to the mobile phone among the connectable devices, and the device identifier 111 is located on an orientation identifier 112 of the mobile phone.
In other implementation manners, for the sake of user experience, the device identifiers corresponding to connectable devices whose included angles with the search device satisfy a certain angle range or a deviation angle threshold are displayed dispersedly in the layout interface. It should be noted that, if an upper limit on the number of devices is set for a semicircle, circular arc, or arc, that upper limit must still be respected.
As a non-limiting example, suppose the included angle between a connectable device and the orientation of the mobile phone satisfies a certain angle range, for example 0° to 15° or 345° to 360° (i.e., the angle range is 345° to 15°), or the deviation angle between the connectable device and the orientation direction satisfies a deviation angle threshold, which may be, for example, 15°. If several such connectable devices are closest to the mobile phone, and their number does not exceed the upper limit of device identifiers on the semicircle or circular arc, then the device identifiers corresponding to these connectable devices are arranged in sequence according to the size of their included angles. The layout interface may be as shown in diagrams A and B of fig. 12: the device identifiers 121 corresponding to the connectable devices whose included angle with the orientation of the mobile phone is within 345° to 15° and that are closest to the mobile phone are located on the orientation identifier 122 of the mobile phone or on both sides of the orientation identifier 122. That is, in diagrams A and B of fig. 12, the device identifiers 121 of the connectable devices whose deviation angle from the orientation of the mobile phone is less than or equal to the deviation angle threshold of 15° are displayed.
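The selection of devices near the orientation direction can be sketched as follows. The shortest-angular-distance computation and the (identifier, included angle) representation are illustrative assumptions, with the 15° deviation threshold taken from the example above.

```python
def pointed_devices(devices, heading_deg, dev_threshold=15.0, upper_limit=None):
    """Pick the connectable devices whose deviation from the phone's
    orientation is at most `dev_threshold` degrees (the 345-15 degree
    band when the heading is 0), ordered by that deviation.

    `devices` is a list of (device_id, included_angle_deg) pairs.
    """
    def deviation(angle_deg):
        d = abs((angle_deg - heading_deg) % 360.0)
        return min(d, 360.0 - d)  # shortest angular distance, 0..180

    hits = [d for d in devices if deviation(d[1]) <= dev_threshold]
    hits.sort(key=lambda d: deviation(d[1]))
    # Honor a per-semicircle/arc upper limit on identifiers, if any.
    return hits if upper_limit is None else hits[:upper_limit]
```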
In other non-limiting examples, the device identifier corresponding to the connectable device with the smallest included angle and the smallest distance may also be displayed on the orientation identifier 122 of the mobile phone, for example, on the dotted line representing the orientation identifier 122 in fig. 12.
4. Minimized layout
The layout interface of the minimized layout may display the connectable device with the smallest included angle to the search device and/or the closest distance. For the convenience of user operation, the layout interface of the minimized layout may be displayed in a certain area in the lower left corner, the lower right corner, or the middle portion of the search device, according to the holding state of the search device. For example, fig. 13 shows a layout interface of the minimized layout that displays the connectable device mate30, which has the smallest included angle to the mobile phone 310. As shown in diagram A of fig. 13, the layout interface is displayed in the lower left corner of the display screen; as shown in diagram B of fig. 13, the layout interface is displayed in the left area in the middle of the display screen.
It should be noted that, regardless of the interface type of the layout interface, its display area on the display screen may be adapted to the holding state of the search device. When the holding state of the search device, for example a mobile phone, is a left-hand holding state, the layout interface of the minimized layout is preferably displayed in the lower left corner or an area at the lower left of the display screen; when the holding state is a right-hand holding state, the layout interface of the minimized layout is preferably displayed in the lower right corner or an area at the lower right of the display screen. When the holding state of the mobile phone is a middle holding state, the layout interface may preferably be displayed in the middle display area.
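The mapping from holding state to preferred display region described above can be sketched as a simple lookup; the state names, region names, and fallback value are assumptions for illustration, not identifiers from the patent.

```python
def layout_anchor(grip):
    """Map a detected holding state to the preferred display region for
    the minimized layout, per the preference described above.
    """
    return {
        "left_hand": "lower_left",    # left-hand grip: lower left corner
        "right_hand": "lower_right",  # right-hand grip: lower right corner
        "middle": "center",           # two-handed/middle grip: middle area
    }.get(grip, "lower_right")        # assumed fallback when undetected
```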
When the mobile phone can provide layout interfaces in multiple interface forms and provides an interface form switching function, the mobile phone, upon receiving a switching operation input by the user, can switch the layout interface from one interface form to another. The switching operation may include, but is not limited to, a preset voice input operation, a touch operation, a cursor control operation, or an air gesture. It should be understood that the number of selectable interface forms may be 2, 3, 4, or another number, as is practical; the application is not limited in this regard.
In some implementations, the search device receives a touch operation input by the user of sliding upwards, downwards, leftwards, or rightwards, and switches among layout interfaces of different interface forms. For example, upon receiving multiple upward-sliding touch operations from the user, the switching order may be cyclic: from the minimized layout to the semicircular layout or the circular arc layout (either one), from the semicircular layout or the circular arc layout to the circular layout, and from the circular layout back to the minimized layout. As another example, upon receiving multiple upward-sliding touch operations, the switching order may be one-way: from the minimized layout to the semicircular layout or the circular arc layout (either one), and from there to the circular layout. If the current layout interface is the circular layout and the user continues to input upward-sliding touch operations, the layout interface is not switched. At that point the user may input a downward-sliding touch operation, and the switching order becomes: from the circular layout to the semicircular layout or the circular arc layout (either one), and then to the minimized layout.
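Both switching orders can be sketched as a small state machine over the interface forms. Treating the semicircular and circular-arc layouts as one alternative middle step follows the description above; the function and list names are illustrative assumptions.

```python
# Ordered from smallest to largest display area; the middle step may be
# either the semicircular or the circular arc layout.
CYCLE = ["minimized", "semicircular", "circular"]

def next_layout(current, direction="up", cyclic=True):
    """Compute the interface form after one swipe, for both the cyclic
    order and the one-way order described above.
    """
    i = CYCLE.index(current)
    step = 1 if direction == "up" else -1
    j = i + step
    if cyclic:
        return CYCLE[j % len(CYCLE)]  # circular wraps back to minimized
    # One-way order: clamp at the ends, so further swipes do not switch.
    return CYCLE[max(0, min(j, len(CYCLE) - 1))]
```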
As a non-limiting example, as shown in fig. 14, the search device is a mobile phone 310 displaying a layout interface of the circular layout. When the mobile phone 310 receives an upward sliding operation input by the user on the circular layout interface, it switches to the layout interface of the semicircular layout. When the mobile phone 310 receives a downward sliding operation input by the user on the circular layout interface, it switches to the layout interface of the minimized layout. In the layout interface of the minimized layout, the device identifier and identifier description corresponding to the display screen with the smallest included angle to the mobile phone 310 are displayed; the device identifier is a dot, and the identifier description is "display screen".
It should be noted that layout interfaces of different interface forms occupy display areas of different sizes on the display screen; in general, the circular layout occupies the largest display area and the minimized layout the smallest. When a user needs to display the layout interface together with other windows, or wants convenient one-handed operation, the layout interface can be reduced by switching its interface form. This prevents other windows from being occluded, or the target device from being unselectable with one hand, improving the adaptability of multi-window display, the convenience of one-handed operation, and the user's operation efficiency.
For example, in a case of adopting a unidirectional switching sequence, a user may gradually reduce a display area of the layout interface by inputting a sliding touch operation in the first direction for a plurality of times; the display area of the layout interface can be gradually enlarged by inputting a sliding touch operation in the opposite direction to the first direction for a plurality of times. Or, the user can gradually enlarge the display area of the layout interface by inputting the sliding touch operation in the first direction for multiple times; the display area of the layout interface may be gradually reduced by inputting a sliding touch operation in a direction opposite to the first direction for a plurality of times.
When the mobile phone can provide layout interfaces in multiple interface forms but does not provide the interface form switching function, the mobile phone can display the layout interface in any one interface form according to the system default setting or user customization. The user may select the interface form of the layout interface in the system setting interface.
When the mobile phone provides a layout interface in only one interface form, the interface form of the layout interface cannot be changed by the user. In this case, the mobile phone may provide the layout interface in one form such as the circular layout, the semicircular layout, the circular arc layout, or the minimized layout.
It should be noted that the application scenarios exemplified in the present application may all adopt these layout interfaces, and details will not be described in the subsequent application scenarios.
The layout interfaces of the four interface forms have been introduced above; next, how the spatial relationship is obtained is introduced.
As previously mentioned, the spatial relationship between two devices includes distance and included angle. The present application uses wireless positioning technology to detect the relative or absolute distance and included angle between two devices, such as a first device and a second device. Wireless positioning technologies include, but are not limited to, GPS, UWB, Wi-Fi, Bluetooth, and the like. Through wireless positioning technology, electronic devices can locate their respective positions in space. In the present application, the positions of the devices (including the search device and the connectable devices) are determined, the included angle between each connectable device and the search device is determined, and the distance and included angle of each connectable device are mapped onto the layout interface, so that the user can quickly select the target device and then complete the device interaction.
The most common positioning techniques currently used are mainly: time Difference location techniques, Angle of Arrival (AoA) measurement techniques, Time of Arrival (ToA) location techniques, Time Difference of Arrival (TDoA) location techniques, and so on.
In some implementations of the present application, the distance and the included angle may be obtained through two-way communication between the devices. These implementations require hardware on the device that can transmit and receive wireless location signals, and calculate the distance and angle between the two devices based on the two-way communication between the two devices.
In some other embodiments of the present application, based on the TDoA positioning technique, the absolute position (x, y, z, θ) of each electronic device (including the search device and the connectable devices) may be determined from time differences by triangulation among multiple devices. Each electronic device (including the search device and the connectable devices) includes hardware capable of transmitting and receiving wireless positioning signals. Three or four (or more) pieces of hardware capable of transmitting and receiving wireless positioning signals are fixed in the space; this hardware may exist independently or be attached to other devices in the space in which it is installed. When enough such fixed hardware with the capability to transmit and receive wireless positioning signals is present in the scene, a six-degree-of-freedom (6DoF) position of each electronic device can be determined.
As a non-limiting example, bluetooth positioning technology is used for illustration.
Bluetooth positioning is based on Received Signal Strength Indication (RSSI) values and performs positioning by the triangulation principle. Triangulation is performed among multiple devices: the devices need hardware capable of transmitting and receiving wireless positioning signals, and three or four (or more) pieces of fixed hardware capable of transmitting and receiving wireless positioning signals serve as beacons in the space; the fixed hardware may exist independently or be attached to other devices in the space in which it is installed.
For example, as shown in fig. 15A and 15B, a schematic diagram of Bluetooth positioning is shown. Using Bluetooth positioning technology, 3 wireless Bluetooth beacons, i.e., a first Bluetooth beacon 1521, a second Bluetooth beacon 1522, and a third Bluetooth beacon 1523, are installed in the field in advance, each continuously broadcasting Bluetooth signals. The electronic device, i.e., the mobile phone 1510, receives the Bluetooth signals broadcast by the 3 beacons in real time, determines the RSSI values of the signals from the 3 beacons, converts the 3 RSSI values into the distances from the respective beacons to the mobile phone 1510, and determines the position (x0, y0, z0) of the mobile phone 1510 in combination with the known coordinates of the 3 Bluetooth beacons. That is, the position coordinates (x0, y0, z0) of the mobile phone 1510 are determined based on the first distance r1 from the first Bluetooth beacon 1521 to the mobile phone 1510, the second distance r2 from the second Bluetooth beacon 1522 to the mobile phone 1510, the third distance r3 from the third Bluetooth beacon 1523 to the mobile phone 1510, the position coordinates (x1, y1, z1) of the first Bluetooth beacon 1521, the position coordinates (x2, y2, z2) of the second Bluetooth beacon 1522, and the position coordinates (x3, y3, z3) of the third Bluetooth beacon 1523.
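The RSSI-based ranging and the beacon triangulation described above can be sketched in two dimensions as follows. The log-distance path-loss model, its reference power of −59 dBm at 1 m, and the path-loss exponent of 2 are commonly assumed values, not figures from the patent; the solver linearizes the three circle equations to recover (x0, y0).

```python
import math

def rssi_to_distance(rssi_dbm, tx_power_dbm=-59.0, n=2.0):
    """Log-distance path-loss model: estimated range in meters.
    tx_power_dbm (RSSI at 1 m) and exponent n are assumed values."""
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10.0 * n))

def trilaterate(b1, b2, b3, r1, r2, r3):
    """Solve for the phone position (x0, y0) from three beacon positions
    and ranges, a planar sketch of the triangulation described above.

    Subtracting the first circle equation from the other two yields two
    linear equations A*x + B*y = C and D*x + E*y = F.
    """
    (x1, y1), (x2, y2), (x3, y3) = b1, b2, b3
    A = 2 * (x2 - x1); B = 2 * (y2 - y1)
    C = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    D = 2 * (x3 - x2); E = 2 * (y3 - y2)
    F = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = A * E - B * D  # zero if the beacons are collinear
    x0 = (C * E - B * F) / det
    y0 = (A * F - C * D) / det
    return x0, y0
```

In practice RSSI ranging is noisy, so the three circles rarely intersect at a point; a least-squares fit over more than three beacons is the usual refinement.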
The bluetooth direction finding function can be implemented in the following two ways.
The first is the AoA measurement method, in which the device that needs direction finding uses multiple antennas, while the beacon, or the device on which the beacon depends, uses only one antenna. As shown in fig. 15C, the beacon transmits a special data packet through a low-power transmitter via a single antenna. A nearby low-power receiver, for example the receiver of a mobile phone, has multiple antennas arranged in an array; because the distances from these antennas to the transmitter differ, each antenna of the receiver observes a phase difference in the received signal, and the relative direction of the signal, such as the angle θ1 between the mobile phone and the beacon, is finally obtained by calculation.
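The phase-difference calculation behind AoA can be sketched for a two-element array as follows. The relation θ = arcsin(Δφ·λ / (2π·d)) is the standard narrowband expression for antennas spaced d apart, and the 2.44 GHz carrier is an assumed Bluetooth channel frequency; neither is text from the patent.

```python
import math

def aoa_from_phase(delta_phi_rad, spacing_m, freq_hz=2.44e9):
    """Estimate the arrival angle (radians from broadside) from the
    phase difference measured between two array antennas."""
    lam = 299_792_458.0 / freq_hz  # carrier wavelength in meters
    s = delta_phi_rad * lam / (2.0 * math.pi * spacing_m)
    # Clamp against measurement noise before taking the arcsine.
    return math.asin(max(-1.0, min(1.0, s)))
```

With half-wavelength spacing, a phase difference of π corresponds to a signal arriving along the array axis (90° from broadside), and zero phase difference to a broadside arrival.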
The second is the Angle of Departure (AoD) measurement method, in which the device that needs direction finding uses only one antenna, while the beacon, or the device on which the beacon depends, uses multiple antennas. As shown in fig. 15D, the mobile phone receives signals through a low-power receiver, and the beacon sends special data packets while switching among the active antennas arranged in an array through a low-power transmitter. The relative direction of the signals is finally obtained by calculation, for example the angles θ1 and θ2 between the mobile phone and the two beacons.
The second application scene is an application scene that the mobile phone projects to the display screen.
Fig. 16A shows a schematic diagram of an application scenario in which the mobile phone 1610 projects a screen onto the display screen 1620.
As mobile phone functions grow increasingly rich, many mobile phone brands have released models with a screen projection function. Turning on the screen projection key in the mobile phone starts the wireless screen projection function, which is simple and easy to operate. The wireless screen projection function of the mobile phone also requires a corresponding function key on the display screen to establish the screen projection connection. For convenience of description, the second application scenario is illustrated with a mobile phone having a screen projection function and a display screen.
It should be noted that in other application scenarios, the screen projection may also be implemented by third-party screen projection software.
In some examples, the display screen 1620 turns on its wireless screen projection function, and the mobile phone 1610 turns on Bluetooth and WLAN. The user starts wireless screen projection in the pull-down notification bar, in a system setting interface, or in a certain application interface. The mobile phone 1610 receives the screen projection operation input by the user, searches for peripheral devices, and displays a layout interface of the peripheral connectable devices, where the layout interface may include, but is not limited to, a circular layout, a semicircular layout, an arc layout, or a minimized layout, such as the circular arc layout 1611 shown in diagram A of fig. 16B. When the device identifier of the display screen 1620 appears in the layout interface, for example the device identifier 16111 of the display screen 1620 shown in diagram A of fig. 16B, the mobile phone 1610 receives a click operation applied by the user to the device identifier 16111 corresponding to the display screen 1620, and the mobile phone 1610 establishes a communication connection with the display screen 1620.
In other examples, the handset 1610 may begin searching for peripheral devices and display a layout interface for peripheral connectable devices upon receiving a global gesture input by a user.
In other examples, the difference from the previous example is that: after receiving the click operation of the user on the device identifier corresponding to the display screen 1620, the mobile phone 1610 may also push a prompt, and the user may perform an operation according to the subsequent prompt, so that the mobile phone 1610 establishes a communication connection with the display screen 1620. For example, the handset 1610 receives a click operation of the user on the device identifier corresponding to the display screen 1620, and displays a query interface of whether to screen on the display screen 1620. When the mobile phone 1610 receives a confirmation operation, such as clicking a confirmation button, input by the user on the query interface, the mobile phone 1610 establishes a communication connection with the display screen 1620. For another example, the mobile phone 1610 displays a password input interface after receiving a click operation of the user on the device identifier corresponding to the display screen 1620, and requires the user to input a correct password. The handset 1610 receives the correct password entered by the user and the handset 1610 establishes a communication connection with the display screen 1620.
It should be understood that the name of the wireless screen projection function varies from one manufacturer to another and from one mobile phone model to another. For example, it may also be referred to as wireless display, multi-screen interaction, screen mirroring, or the like.
In some embodiments, the mobile phone 1610 determines that the target device selected by the user in the layout interface is the display screen 1620, and the mobile phone 1610 establishes a communication connection with the display screen 1620. An interface displayed on the mobile phone 1610, such as the display interface of a video application, is then projected to the display screen 1620. After the mobile phone 1610 projects its screen to the display screen 1620, the mobile phone generates a corresponding control instruction according to the received user operation and sends the control instruction to the display screen 1620, so that the display screen 1620 is controlled through the mobile phone.
In other embodiments, the mobile phone determines that the target device selected by the user in the layout interface is the display screen 1620, and the mobile phone 1610 establishes a communication connection with the display screen 1620. The mobile phone then generates a corresponding screen-projection instruction according to the received user operation, to complete screen projection from the mobile phone 1610 to the display screen 1620, or from the display screen 1620 to the mobile phone 1610. In addition, after screen projection is completed, the mobile phone 1610 may further generate a corresponding control instruction according to the received user operation, where the control instruction may be used to control the mobile phone 1610, or may be sent by the mobile phone to the display screen 1620 to control the display screen 1620.
In this application scenario, the mobile phone 1610 may send the control instruction to the display screen 1620 in the following two non-limiting implementation manners.
In a first implementation manner, the mobile phone 1610 generates a corresponding control instruction according to the user operation and, based on the communication connection established between the mobile phone 1610 and the display screen 1620, sends the control instruction to the display screen 1620, so that the display screen 1620 is controlled through the mobile phone 1610.
In a second implementation manner, the mobile phone 1610 generates a control instruction according to the user operation and sends it to a cloud, such as a smart home cloud. Based on the communication connection established between the mobile phone 1610 and the display screen 1620, the smart home cloud obtains the current state of the display screen 1620, forwards the control instruction to the display screen 1620, obtains the execution result of the display screen 1620, and returns the execution result to the mobile phone 1610.
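The two implementation manners can be contrasted in a short sketch: direct delivery over the established connection, versus relay through a smart home cloud that checks the device state, forwards the command, and returns the execution result to the phone. The class and method names below are illustrative stand-ins, not part of the patent:

```python
class Display:
    """Toy stand-in for the display screen 1620 (names hypothetical)."""
    def __init__(self):
        self.volume = 5
        self.online = True
    def execute(self, command):
        if command == "volume_up":
            self.volume += 1
        return {"ok": True, "volume": self.volume}

class SmartHomeCloud:
    """Relay path: check device state, forward command, return result."""
    def get_state(self, device):
        return "online" if device.online else "offline"
    def forward(self, command, device):
        return device.execute(command)

def send_control_command(command, target, via_cloud=False, cloud=None):
    if not via_cloud:
        # First implementation: deliver over the established connection.
        return target.execute(command)
    # Second implementation: the cloud obtains the device state, forwards
    # the instruction, and returns the execution result to the phone.
    if cloud.get_state(target) != "online":
        return {"ok": False, "error": "device offline"}
    return cloud.forward(command, target)
```

The cloud path adds a state check before forwarding, which the direct path does not need.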
In an embodiment, implementation manner, or example of the present application, the mobile phone may generate a corresponding control instruction according to a received user operation. The user operation includes, but is not limited to, one or more of the following: an operation on a virtual key in the interface, an operation on a physical key of the mobile phone, an input preset touch gesture or air gesture, an operation that changes a sensing parameter of a sensor of the mobile phone, and an input preset voice. With this approach, the user does not need to search for and designate one or more target devices in a specific application, which greatly shortens the operation path to the target devices, and the approach can be implemented directly with the software and hardware capabilities of the searching device.
In a non-limiting example, the mobile phone 1610 receives an operation in which the user selects the display screen 1620 as the target device in the layout interface, and the mobile phone 1610 may jump to the device control interface corresponding to the display screen 1620 in the device control application, or directly overlay the device control menu of the display screen on the layout interface. The device control interface and the device control menu each include various virtual controls, and triggering different virtual controls generates different corresponding control instructions. The mobile phone 1610 then receives a user operation, such as clicking or sliding any control button in the device control interface or the device control menu, generates the corresponding control instruction, and sends it to the display screen 1620, thereby controlling the display screen 1620 to execute the control instruction.
For example, continuing with fig. 16B, the mobile phone 1610 receives a click operation of the user on the device identifier 16111 of the display screen, and jumps to the device control interface 1612 corresponding to the display screen 1620 in the device control application; that is, the mobile phone 1610 switches from the interface shown in diagram a to the interface shown in diagram b in fig. 16B. The device control interface 1612 may include a screen-projection button; if the mobile phone 1610 receives a click operation of the user on the screen-projection button, an interface displayed on the mobile phone 1610, for example the display interface of a video application, is projected to the display screen 1620. The device control interface 1612 may further include other control buttons; if the mobile phone 1610 receives a click operation of the user on a control button, such as a volume up button, the mobile phone 1610 generates a volume-up control instruction and sends it to the display screen 1620, and the display screen 1620 raises the volume according to the control instruction.
As another example, continuing with fig. 16B, the mobile phone 1610 receives a click operation of the user on the device identifier 16111 of the display screen, and expands the device control menu 1613 of the display screen on the layout interface; that is, the mobile phone 1610 switches from the interface shown in diagram a to the interface shown in diagram c in fig. 16B. The device control menu 1613 may include a screen-projection button; if the mobile phone 1610 receives a click operation of the user on the screen-projection button, an interface displayed on the mobile phone 1610, for example the display interface of a video application, is projected to the display screen 1620. The device control menu 1613 may further include other control buttons; if the mobile phone 1610 receives a click operation of the user on a control button, such as a pause button, the mobile phone 1610 generates a pause control instruction and sends it to the display screen 1620, and the display screen 1620 pauses video playing according to the pause control instruction.
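The mapping from virtual controls in the device control interface or menu to control instructions can be sketched as a simple dispatch table; the button and instruction names below are hypothetical, chosen only to mirror the screen-projection, volume-up, and pause examples:

```python
# Hypothetical mapping from tapped virtual controls to control instructions.
CONTROL_MAP = {
    "screen_projection_button": "start_projection",
    "volume_up_button": "volume_up",
    "pause_button": "pause_playback",
}

def handle_virtual_control(button, state):
    """Generate the instruction for a tapped control and apply it to a
    toy display state; unknown buttons generate no instruction."""
    instruction = CONTROL_MAP.get(button)
    if instruction == "start_projection":
        state["source"] = "phone"      # phone interface shown on the display
    elif instruction == "volume_up":
        state["volume"] += 1
    elif instruction == "pause_playback":
        state["playing"] = False
    return instruction, state
```

In the patent's flow the instruction would be sent to the display screen 1620 rather than applied locally; the local state here only makes the mapping testable.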
In another non-limiting example, the mobile phone 1610 may generate a corresponding control instruction according to a received user operation on a physical key, and control the display screen 1620 to respond to the control instruction. The physical keys of the mobile phone include, but are not limited to, a power key and/or volume keys, and the like.
For example, referring to fig. 16C, after the display content of the mobile phone 1610 is projected to the display screen 1620, the mobile phone 1610 may generate a volume-up control instruction according to a received pressing operation of the user on the volume up key 1614. The mobile phone 1610 sends the control instruction to the display screen 1620, and the display screen 1620 adjusts the volume in response to the control instruction.
In another non-limiting example, the mobile phone 1610 receives an operation in which the user selects the display screen as the target device in the layout interface, and the mobile phone 1610 establishes a communication connection with the display screen 1620. The mobile phone 1610 may then generate a screen-projection instruction according to a preset touch screen gesture or air gesture input by the user, and control the screen projection.
For example, fig. 16D is an exemplary diagram of generating a screen-projection instruction based on a touch screen gesture or an air gesture of the user. Referring to diagram a in fig. 16D, the mobile phone 1610 receives a leftward-sliding touch screen operation input by the user, generates a first screen-projection instruction, and projects the display content of the mobile phone 1610 to the display screen 1620. If the mobile phone 1610 receives a rightward-sliding touch screen operation input by the user, it generates a second screen-projection instruction and sends it to the display screen, and the display screen 1620 projects its display content to the mobile phone 1610 according to the second screen-projection instruction (this case is not shown in diagram a of fig. 16D).
Referring to diagram b in fig. 16D, if the mobile phone 1610 receives an air gesture input by the user in which the palm swipes away from the user's body, it generates a first screen-projection instruction and projects the display content of the mobile phone 1610 to the display screen 1620. If the mobile phone 1610 receives an air gesture input by the user in which the palm swipes toward the user's body, it generates a second screen-projection instruction and sends it to the display screen, and the display screen 1620 projects its display content to the mobile phone 1610 according to the second screen-projection instruction.
In the example shown in fig. 16D, the screen projection direction, that is, whether to project from the mobile phone to the display screen or from the display screen to the mobile phone, is determined by the direction of the touch screen gesture or air gesture.
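The direction mapping described above, where a left swipe or a palm swipe away from the body projects phone-to-display and a right swipe or a palm swipe toward the body projects display-to-phone, can be sketched as a lookup. The gesture labels are illustrative, not terms from the patent:

```python
def projection_direction(gesture):
    """Map the direction of a touch or air gesture to a projection
    direction, following the fig. 16D examples (labels hypothetical)."""
    toward_display = {"swipe_left", "palm_away_from_body"}
    toward_phone = {"swipe_right", "palm_toward_body"}
    if gesture in toward_display:
        return "phone_to_display"   # first screen-projection instruction
    if gesture in toward_phone:
        return "display_to_phone"   # second screen-projection instruction
    return None                     # unrecognized gesture: no instruction
```

The same lookup shape extends to the content-transmission and audio scenarios mentioned next, where the gesture direction selects first-device-to-second-device or the reverse.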
In other application scenarios, for example an application scenario of content transmission between a first device and a second device, or an audio application scenario between a first device and a second device, the transmission direction of the content or the audio data may likewise be determined by the direction of the touch gesture or air gesture. The transmission direction is either from the first device to the second device or from the second device to the first device.
It should be further noted that, after the mobile phone projects to the display screen or the display screen projects to the mobile phone, in some examples the mobile phone may generate a control instruction according to a received preset touch gesture or air gesture input by the user, to control the mobile phone and/or the display screen.
In another non-limiting example, the mobile phone 1610 may also generate control instructions from changes in the sensing parameters of its sensors. Specifically, the mobile phone 1610 can determine a change in the posture and/or position of the device from a change in a sensing parameter of a sensor; a change in posture includes, but is not limited to, rotation and the like, and a change in position includes, but is not limited to, movement and the like. The mobile phone then generates a corresponding control instruction according to the change in its posture and/or position. The control instruction may be used to control the mobile phone and/or the display screen.
For example, referring to the screen-projection application scenario shown in fig. 16E, after the display content of the mobile phone 1610, such as a certain video, is projected onto the display screen 1620, the user rotates the mobile phone counterclockwise about the axis along the orientation of the mobile phone 1610. The mobile phone 1610 senses the counterclockwise rotation, generates a volume-down control instruction, and sends it to the display screen 1620, and the display screen 1620 decreases the volume in response. If the mobile phone 1610 senses that it is rotated clockwise (not shown in fig. 16E), it generates a volume-up control instruction and sends it to the display screen 1620, and the display screen 1620 increases the volume in response.
For another example, continuing with the screen-projection application scenario shown in fig. 16E, after the display content of the mobile phone 1610, such as a certain video, is projected onto the display screen 1620, the mobile phone 1610 senses that it moves clockwise and generates a volume-up control instruction to control the display screen 1620 to increase the volume; the mobile phone 1610 senses that it moves counterclockwise and generates a volume-down control instruction to control the display screen 1620 to decrease the volume.
As another example, continuing with the screen-projection scenario shown in fig. 16E, after the display content of the mobile phone 1610, such as a certain video, is projected onto the display screen 1620, the mobile phone 1610 senses that it moves away from the ground, that is, moves upward, and generates a volume-up control instruction to control the display screen 1620 to increase the volume. The mobile phone 1610 senses that it moves toward the ground, that is, moves downward, and generates a volume-down control instruction to control the display screen 1620 to decrease the volume.
For another example, continuing with the screen-projection application scenario shown in fig. 16E, after the display content of the mobile phone 1610, such as a certain video, is projected onto the display screen 1620, the mobile phone 1610 senses that it moves left and generates a volume-up control instruction to control the display screen 1620 to increase the volume. The mobile phone 1610 senses that it moves right and generates a volume-down control instruction to control the display screen 1620 to decrease the volume.
For another example, continuing with the screen-projection scenario shown in fig. 16E, the mobile phone 1610 receives a click operation of the user on the device identifier of the display screen in the layout interface, and the mobile phone 1610 establishes a communication connection with the display screen 1620. If the user moves the mobile phone 1610 away from the body, the mobile phone 1610 senses that it moves away from the user side and generates a first screen-projection instruction to project the display content of the mobile phone 1610 to the display screen 1620. If the user moves the mobile phone 1610 toward the body, the mobile phone 1610 senses that it moves toward the user side, generates a second screen-projection instruction, and sends it to the display screen; the display screen 1620 projects its display content to the mobile phone 1610 according to the second screen-projection instruction. In this example, the screen projection direction, that is, whether to project from the mobile phone to the display screen or from the display screen to the mobile phone, is determined by the moving direction of the mobile phone.
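The sensor-driven examples above amount to a mapping from a sensed posture or position change to an instruction: before projection, movement relative to the user selects the projection direction; during projection, rotation and movement map to volume commands. A sketch of that mapping, with hypothetical motion labels and the particular motion-to-volume assignments taken from the fig. 16E examples:

```python
def instruction_from_motion(motion, projecting):
    """Map a sensed posture/position change to a control or
    screen-projection instruction (mapping illustrative, per fig. 16E)."""
    if not projecting:
        # Before projection, the moving direction selects the direction.
        return {"move_away_from_user": "project_phone_to_display",
                "move_toward_user": "project_display_to_phone"}.get(motion)
    # During projection, rotation and movement map to volume commands.
    return {"rotate_counterclockwise": "volume_down",
            "rotate_clockwise": "volume_up",
            "move_up": "volume_up", "move_down": "volume_down",
            "move_left": "volume_up", "move_right": "volume_down"}.get(motion)
```

On a real device the motion labels would come from gyroscope and accelerometer readings; here they are given directly to keep the mapping itself visible.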
The third application scenario is a smart home application scenario.
Fig. 17A shows a schematic diagram of a smart home scenario, in which only the mobile phone 1710 and the lamp 1720 are shown. It should be understood that an actual scenario may include more internet of things devices.
The mobile phone 1710 may start searching for peripheral devices and display a layout interface of the peripheral connectable devices after receiving a global gesture input by the user. The layout interface may include, but is not limited to, a circular layout, a semicircular layout, an arc layout, or a minimized layout. In this application scenario, a layout interface in an arc layout is taken as an example: for the arc layout interface, a declination angle threshold of 15 degrees and an upper limit of 1 on the number of devices are preset, and the device identifier is displayed on the orientation identifier. Therefore, the layout interface displays only the single connectable device with the smallest declination angle among those whose declination angle, between the line connecting the connectable device to the mobile phone and the orientation of the mobile phone, is less than or equal to 15 degrees. When the user rotates the mobile phone 1710, the device pointed to by the orientation of the mobile phone 1710 changes; when the user rotates the mobile phone 1710 so that its orientation essentially points at the lamp 1720, the device identifier corresponding to the lamp 1720 appears in the layout interface. For example, in the arc layout interface 1711 shown in fig. 17B, the device identifier 17111 of the lamp 1720 is displayed on the orientation identifier; the mobile phone 1710 receives a click operation performed by the user on the device identifier 17111 corresponding to the lamp 1720, and the mobile phone 1710 establishes a communication connection with the lamp 1720.
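The arc-layout filtering described above (a 15-degree declination threshold, an upper limit of one device, and the smallest-angle device shown) can be sketched as follows. This is an illustration under stated assumptions, not the patent's implementation: the phone is placed at the origin, devices are given as planar coordinates, and the phone's orientation is a compass-style heading in degrees:

```python
import math

def arc_layout(devices, phone_heading_deg, threshold_deg=15.0, max_devices=1):
    """Return the device names to show in the arc layout.

    devices: iterable of (name, x, y) with the phone at the origin.
    A device qualifies if the angle between the phone-to-device line and
    the phone's heading is within the threshold; of the qualifiers, the
    max_devices with the smallest declination angle are shown.
    """
    candidates = []
    for name, x, y in devices:
        bearing = math.degrees(math.atan2(y, x))
        # Signed angular difference folded into [-180, 180], then abs().
        off = abs((bearing - phone_heading_deg + 180.0) % 360.0 - 180.0)
        if off <= threshold_deg:
            candidates.append((off, name))
    candidates.sort()
    return [name for _, name in candidates[:max_devices]]
```

Rotating the phone changes `phone_heading_deg`, so the displayed identifier switches to whichever device the phone now essentially points at, matching the behavior described for the lamp 1720.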
After the mobile phone 1710 establishes a communication connection with the lamp 1720, the mobile phone 1710 may further generate a corresponding control instruction according to a received user operation and send the control instruction to the lamp 1720, and the lamp 1720 responds to the control instruction.
The mobile phone may generate a corresponding control instruction according to a received user operation. The user operation includes, but is not limited to, one or more of the following: an operation on a virtual key in the interface, an operation on a physical key of the mobile phone, an input preset touch gesture or air gesture, an operation that changes a sensing parameter of a sensor of the mobile phone, and an input preset voice. With this approach, the user does not need to search for and designate one or more target devices in a specific application, which greatly shortens the operation path to the target devices, and the approach can be implemented directly with the software and hardware capabilities of the searching device.
In a non-limiting example, the mobile phone 1710 receives an operation in which the user selects the lamp 1720 as the target device in the layout interface, and the mobile phone 1710 may jump to the device control interface corresponding to the lamp 1720 in the device control application, or directly overlay the device control menu of the lamp 1720 on the layout interface. The device control interface and the device control menu each include various virtual controls, and triggering different virtual controls in the device control interface or the device control menu generates different corresponding control instructions. The mobile phone 1710 then receives a user operation, such as clicking or sliding any control button in the device control interface or the device control menu, generates the corresponding control instruction, and sends it to the lamp 1720, thereby controlling the lamp 1720 to execute the control instruction.
For example, the mobile phone 1710 receives a click operation of the user on the device identifier 17111 of the lamp, and jumps to the device control interface corresponding to the lamp 1720 in the device control application. The device control interface may include a light-on button, a light-off button, a brightness increase button, a brightness adjustment drag bar, and the like. If the mobile phone 1710 receives a drag operation of the user on the brightness adjustment drag bar, the mobile phone 1710 generates a brightness adjustment control instruction and sends it to the lamp 1720, controlling the lamp 1720 to adjust to the target brightness. If the mobile phone 1710 receives a click operation of the user on the light-off button, the mobile phone 1710 generates a light-off control instruction and sends it to the lamp 1720, controlling the lamp 1720 to turn off.
In another non-limiting example, the mobile phone 1710 may generate a corresponding control instruction according to a received user operation on a physical key, and control the lamp 1720 to respond to the control instruction. The physical keys of the mobile phone include, but are not limited to, a power key and/or volume keys, and the like.
For example, the mobile phone 1710 receives a pressing operation of the user on the power key, generates a light on/off instruction, and controls the currently lit lamp 1720 to turn off or controls the currently unlit lamp 1720 to turn on.
For another example, the mobile phone 1710 receives a pressing operation of the user on the volume up or volume down key, generates a brightness increase or decrease instruction, and controls the lamp 1720 to increase or decrease the brightness.
In another non-limiting example, the mobile phone 1710 may generate a control instruction according to a received preset touch screen gesture or air gesture input by the user, and control the lamp 1720 to respond to the control instruction.
For example, the mobile phone 1710 receives a touch screen operation of the user sliding a finger up or down, generates a brightness increase or decrease instruction, and controls the lamp 1720 to increase or decrease the brightness.
For another example, the mobile phone 1710 receives an air gesture input by the user of moving the palm toward the mobile phone, generates a brightness increase instruction, and controls the lamp 1720 to increase the brightness. The mobile phone 1710 receives an air gesture input by the user of moving the palm away from the mobile phone, generates a brightness decrease instruction, and controls the lamp 1720 to decrease the brightness. The mobile phone 1710 receives an air fist-clenching gesture input by the user, generates a light on/off instruction, and controls the currently lit lamp 1720 to turn off or controls the currently unlit lamp 1720 to turn on.
In another non-limiting example, the handset 1710 can also generate control instructions through changes in a sensed parameter of a sensor of the handset 1710.
For example, if the user raises the mobile phone 1710, that is, moves it upward, the mobile phone 1710 senses that it moves upward, generates a brightness increase instruction, and controls the lamp 1720 to increase the brightness. If the user lowers the mobile phone 1710, that is, moves it downward, the mobile phone 1710 senses that it moves downward, generates a brightness decrease instruction, and controls the lamp 1720 to decrease the brightness.
For another example, if the user rotates the mobile phone 1710 clockwise, the mobile phone 1710 senses the clockwise rotation and generates a brightness increase instruction to control the lamp 1720 to increase the brightness. If the user rotates the mobile phone 1710 counterclockwise, the mobile phone 1710 senses the counterclockwise rotation and generates a brightness decrease instruction to control the lamp 1720 to decrease the brightness.
It should be understood that the execution order of the processes in the above embodiments is determined by their functions and internal logic, and does not constitute any limitation on the implementation of the embodiments of the present application.
Corresponding to the display control method described in the above embodiment, the embodiment of the present application further provides a display control apparatus. Each module included in the display control device can correspondingly realize each step of the display control method.
It will be appreciated that the electronic device, in order to implement the above-described functions, includes corresponding hardware and/or software modules for performing the respective functions. In combination with the embodiments disclosed herein, the present application can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the particular application and design constraints of the technical solution. Skilled artisans may use different methods to implement the described functions for each particular application in combination with the embodiments, but such implementations should not be considered beyond the scope of the present application.
It should be noted that, because the contents of information interaction, execution process, and the like between the modules/units are based on the same concept as that of the method embodiment of the present application, specific functions and technical effects thereof may be referred to specifically in the method embodiment section, and are not described herein again.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
The embodiment of the present application further provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the electronic device is enabled to implement the steps in the above method embodiments.
As an example, the electronic device may be a wearable device, a cell phone, a tablet computer, or the like.
The embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the steps in the above-mentioned method embodiments may be implemented.
Embodiments of the present application provide a computer program product, which when executed on an electronic device, enables the electronic device to implement the steps in the above method embodiments.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the processes in the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium and can implement the steps of the embodiments of the methods described above when the computer program is executed by a processor. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer readable medium may include at least: any entity or device capable of carrying computer program code to a photographing apparatus/electronic device, a recording medium, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunication signals, and software distribution medium. Such as a usb-disk, a removable hard disk, a magnetic or optical disk, etc. In certain jurisdictions, computer-readable media may not be an electrical carrier signal or a telecommunications signal in accordance with legislative and patent practice.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed electronic device and method may be implemented in other ways. For example, the above-described electronic device embodiments are merely illustrative. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.
Claims (18)
1. A display control method, applied to a first electronic device, characterized by comprising:
the first electronic equipment monitors a first operation or triggers an interaction event, and searches connectable equipment;
the first electronic device obtains a first spatial relationship between each connectable device and the first electronic device, where the first spatial relationship includes a first distance between each connectable device and the first electronic device, and a first included angle between a connection line between each connectable device and the first electronic device and a current orientation of the first electronic device;
the first electronic device displays a first layout interface according to the first spatial relationship, the first layout interface comprises one or more display areas, each display area corresponds to different distance conditions, and a first distance corresponding to connectable equipment displayed in each display area meets the distance conditions corresponding to the display area; and if the plurality of connectable devices meet the same distance condition and the difference value of the first included angles of the adjacent connectable devices in the plurality of connectable devices meets the angle condition, dispersedly displaying the adjacent connectable devices in the display area corresponding to the distance condition.
2. The display control method according to claim 1, further comprising:
the first electronic device obtains its current position and/or the position pointed to by its current orientation, and, upon determining that the current position and/or the pointed-to position has changed, obtains a second spatial relationship between each connectable device and the first electronic device, wherein the second spatial relationship comprises a second distance between each connectable device and the first electronic device, and a second included angle between the line connecting each connectable device to the first electronic device and the current orientation of the first electronic device;
the first electronic device displays a second layout interface of the connectable devices according to the second spatial relationship, wherein the layout of the connectable devices in the second layout interface changes with the change in the current position and/or the pointed-to position.
3. The display control method according to claim 1 or 2, wherein each display area corresponds to an included-angle range, and the first included angle of each connectable device displayed in a display area satisfies the included-angle range of that display area;
or
each display area corresponds to an upper limit on the number of devices, and the number of connectable devices displayed in each display area is less than or equal to that upper limit; or
the connectable devices displayed in the first layout interface are logged in to the same account as the first electronic device.
4. The display control method according to claim 1 or 2, further comprising:
the first electronic device receives a second operation input by a user and switches the first layout interface to a third layout interface, wherein the first layout interface and the third layout interface have different interface forms.
5. The display control method according to claim 4, wherein the interface form includes a circular layout, a semicircular layout, an arc layout, or a minimized layout.
6. The display control method according to claim 1 or 2, further comprising:
the first electronic device receives a first operation performed by a user on a target device and establishes a communication connection with the target device, wherein the target device is one or more of the connectable devices.
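The layout rule of claim 1 can be sketched in code. The following is a minimal illustration only, not an implementation from the patent: connectable devices are grouped into display areas by distance condition, and devices in the same area whose first included angles are too close are spread apart so that they are displayed dispersedly. All names, the ring boundaries, and the 15-degree threshold are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Device:
    name: str
    distance: float  # first distance to the first electronic device (metres)
    angle: float     # first included angle vs. current orientation (degrees)

# Assumed distance conditions: (area label, maximum distance).
RINGS = [("near", 2.0), ("mid", 5.0), ("far", 10.0)]
MIN_ANGLE_GAP = 15.0  # assumed angle condition for "adjacent" devices

def layout(devices):
    """Assign each device a display area and a display angle."""
    placed = {}
    for label, limit in RINGS:
        # Devices satisfying this distance condition, nearest ring first.
        ring = [d for d in devices if d.distance <= limit and d.name not in placed]
        ring.sort(key=lambda d: d.angle)
        angles = [d.angle for d in ring]
        # Spread adjacent devices whose angle difference is below the threshold.
        for i in range(1, len(angles)):
            if angles[i] - angles[i - 1] < MIN_ANGLE_GAP:
                angles[i] = angles[i - 1] + MIN_ANGLE_GAP
        for d, a in zip(ring, angles):
            placed[d.name] = (label, a)
    return placed

devices = [Device("earbuds", 1.2, 30.0), Device("watch", 1.5, 35.0),
           Device("tv", 4.0, 90.0)]
result = layout(devices)
# earbuds and watch share the "near" area; their 5-degree gap widens to 15.
```

In this sketch, the earbuds and watch satisfy the same distance condition and their included angles differ by only 5 degrees, so the second icon is pushed out to a 15-degree separation, mirroring the "displayed spaced apart" behaviour the claim describes.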
7. A first electronic device comprising a memory, a processor, and a computer program stored in the memory and executable on the processor, wherein the processor, when executing the computer program, causes the first electronic device to perform the following steps:
the first electronic device detects a first operation or a triggered interaction event, and searches for connectable devices;
the first electronic device obtains a first spatial relationship between each connectable device and the first electronic device, wherein the first spatial relationship comprises a first distance between each connectable device and the first electronic device, and a first included angle between the line connecting each connectable device to the first electronic device and the current orientation of the first electronic device;
the first electronic device displays a first layout interface according to the first spatial relationship, wherein the first layout interface comprises one or more display areas, each display area corresponds to a different distance condition, and the first distance of each connectable device displayed in a display area satisfies the distance condition of that display area; and if a plurality of connectable devices satisfy the same distance condition and the difference between the first included angles of adjacent connectable devices among them satisfies an angle condition, the adjacent connectable devices are displayed spaced apart in the display area corresponding to that distance condition.
8. The first electronic device of claim 7, wherein the first electronic device further performs the following steps:
the first electronic device obtains its current position and/or the position pointed to by its current orientation, and, upon determining that the current position and/or the pointed-to position has changed, obtains a second spatial relationship between each connectable device and the first electronic device, wherein the second spatial relationship comprises a second distance between each connectable device and the first electronic device, and a second included angle between the line connecting each connectable device to the first electronic device and the current orientation of the first electronic device;
the first electronic device displays a second layout interface of the connectable devices according to the second spatial relationship, wherein the layout of the connectable devices in the second layout interface changes with the change in the current position and/or the pointed-to position.
9. The first electronic device according to claim 7 or 8, wherein each display area corresponds to an included-angle range, and the first included angle of each connectable device displayed in a display area satisfies the included-angle range of that display area;
or
each display area corresponds to an upper limit on the number of devices, and the number of connectable devices displayed in each display area is less than or equal to that upper limit; or
the connectable devices displayed in the first layout interface are logged in to the same account as the first electronic device.
10. The first electronic device according to claim 7 or 8, wherein the first electronic device further performs the following steps:
the first electronic device receives a second operation input by a user and switches the first layout interface to a third layout interface, wherein the first layout interface and the third layout interface have different interface forms.
11. The first electronic device of claim 10, wherein the interface form comprises a circular layout, a semi-circular layout, an arc layout, or a minimized layout.
12. The first electronic device according to claim 7 or 8, wherein the first electronic device further performs the following steps:
the first electronic device receives a first operation performed by a user on a target device and establishes a communication connection with the target device, wherein the target device is one or more of the connectable devices.
13. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the following steps:
the first electronic device detects a first operation or a triggered interaction event, and searches for connectable devices;
the first electronic device obtains a first spatial relationship between each connectable device and the first electronic device, wherein the first spatial relationship comprises a first distance between each connectable device and the first electronic device, and a first included angle between the line connecting each connectable device to the first electronic device and the current orientation of the first electronic device;
the first electronic device displays a first layout interface according to the first spatial relationship, wherein the first layout interface comprises one or more display areas, each display area corresponds to a different distance condition, and the first distance of each connectable device displayed in a display area satisfies the distance condition of that display area; and if a plurality of connectable devices satisfy the same distance condition and the difference between the first included angles of adjacent connectable devices among them satisfies an angle condition, the adjacent connectable devices are displayed spaced apart in the display area corresponding to that distance condition.
14. The computer-readable storage medium of claim 13, wherein the computer program further implements the following steps:
the first electronic device obtains its current position and/or the position pointed to by its current orientation, and, upon determining that the current position and/or the pointed-to position has changed, obtains a second spatial relationship between each connectable device and the first electronic device, wherein the second spatial relationship comprises a second distance between each connectable device and the first electronic device, and a second included angle between the line connecting each connectable device to the first electronic device and the current orientation of the first electronic device;
the first electronic device displays a second layout interface of the connectable devices according to the second spatial relationship, wherein the layout of the connectable devices in the second layout interface changes with the change in the current position and/or the pointed-to position.
15. The computer-readable storage medium according to claim 13 or 14, wherein each display area corresponds to an included-angle range, and the first included angle of each connectable device displayed in a display area satisfies the included-angle range of that display area;
or
each display area corresponds to an upper limit on the number of devices, and the number of connectable devices displayed in each display area is less than or equal to that upper limit; or
the connectable devices displayed in the first layout interface are logged in to the same account as the first electronic device.
16. The computer-readable storage medium according to claim 13 or 14, wherein the computer program further implements the following steps:
the first electronic device receives a second operation input by a user and switches the first layout interface to a third layout interface, wherein the first layout interface and the third layout interface have different interface forms.
17. The computer-readable storage medium of claim 16, wherein the interface form comprises a circular layout, a semi-circular layout, an arc layout, or a minimized layout.
18. The computer-readable storage medium according to claim 13 or 14, wherein the computer program further implements the following steps:
the first electronic device receives a first operation performed by a user on a target device and establishes a communication connection with the target device, wherein the target device is one or more of the connectable devices.
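The re-layout behaviour of claims 2, 8, and 14 can also be sketched briefly. This is an illustrative assumption, not code from the patent: when the first electronic device's orientation changes, the included angle of every connectable device is recomputed relative to the new heading and the second layout interface is drawn from the updated spatial relationship. Function and variable names are hypothetical.

```python
# Recompute each connectable device's displayed bearing after a heading change.
def recompute_angles(bearings, old_heading, new_heading):
    """Shift each device's displayed angle by the heading change, in degrees."""
    delta = new_heading - old_heading
    # Angles are kept in [0, 360); rotating toward a device shrinks its angle.
    return {name: (angle - delta) % 360.0 for name, angle in bearings.items()}

bearings = {"tv": 90.0, "speaker": 180.0}
updated = recompute_angles(bearings, old_heading=0.0, new_heading=30.0)
# Rotating 30 degrees toward the devices reduces each included angle by 30.
```

Distance conditions are unaffected by a pure rotation, so in this sketch only the angular placement of the device icons moves, which matches the claim's requirement that the layout change with the position pointed to by the current orientation.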
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010780354.8A CN114095542B (en) | 2020-08-05 | 2020-08-05 | Display control method and electronic equipment |
Publications (2)
Publication Number | Publication Date |
---|---|
CN114095542A true CN114095542A (en) | 2022-02-25 |
CN114095542B CN114095542B (en) | 2023-11-28 |
Family
ID=80295197
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010780354.8A Active CN114095542B (en) | 2020-08-05 | 2020-08-05 | Display control method and electronic equipment |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114095542B (en) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20140237064A1 (en) * | 2013-02-20 | 2014-08-21 | Compal Electronics, Inc. | Method for controlling electronic device and electronic apparatus using the same |
CN106533907A (en) * | 2016-11-14 | 2017-03-22 | 北京小米移动软件有限公司 | Information sending method and device |
CN108512997A (en) * | 2018-03-02 | 2018-09-07 | 广东欧珀移动通信有限公司 | Display methods, device, mobile terminal and storage medium |
CN109151210A (en) * | 2018-09-28 | 2019-01-04 | Oppo广东移动通信有限公司 | Mobile terminal, information display method and related product |
CN109862504A (en) * | 2018-12-29 | 2019-06-07 | 维沃移动通信有限公司 | A kind of display methods and terminal device |
CN110721467A (en) * | 2019-09-24 | 2020-01-24 | 咪咕互动娱乐有限公司 | Display control method, electronic device, and computer-readable storage medium |
CN110955373A (en) * | 2019-10-30 | 2020-04-03 | 华为技术有限公司 | Display element display method and electronic equipment |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115223521A (en) * | 2022-07-12 | 2022-10-21 | 海信视像科技股份有限公司 | Display device and relay device display method |
CN115223521B (en) * | 2022-07-12 | 2024-04-30 | 海信视像科技股份有限公司 | Display equipment and relay equipment display method |
WO2024067169A1 (en) * | 2022-09-30 | 2024-04-04 | 华为技术有限公司 | Information processing method and electronic device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11385857B2 (en) | Method for displaying UI component and electronic device | |
CN113645351B (en) | Application interface interaction method, electronic device and computer-readable storage medium | |
CN113794801B (en) | Method and device for processing geo-fence | |
US11474664B2 (en) | Application icon moving method and electronic device | |
CN111602108B (en) | Application icon display method and terminal | |
CN113805487B (en) | Control instruction generation method and device, terminal equipment and readable storage medium | |
CN116233300B (en) | Method for controlling communication service state, terminal device and readable storage medium | |
JP7234379B2 (en) | Methods and associated devices for accessing networks by smart home devices | |
US11921968B2 (en) | Method for interaction between devices based on pointing operation, and electronic device | |
WO2022007707A1 (en) | Home device control method, terminal device, and computer-readable storage medium | |
CN112134995A (en) | Method, terminal and computer readable storage medium for searching application object | |
WO2020103091A1 (en) | Touch operation locking method and electronic device | |
CN114995715B (en) | Control method of floating ball and related device | |
WO2022048453A1 (en) | Unlocking method and electronic device | |
CN114095542B (en) | Display control method and electronic equipment | |
CN114691064B (en) | Dual-path screen throwing method and electronic equipment | |
CN115016697A (en) | Screen projection method, computer device, readable storage medium, and program product | |
CN113885973B (en) | Translation result display method and device and electronic equipment | |
CN114690985A (en) | Display method and electronic equipment | |
CN116048236B (en) | Communication method and related device | |
CN114764300B (en) | Window page interaction method and device, electronic equipment and readable storage medium | |
CN116416994A (en) | Voice cooperative input method, electronic equipment and computer readable storage medium | |
CN113626115A (en) | Method for generating dial plate and related device | |
CN116663044A (en) | Cross-device application program management method, electronic device and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||