CN117762286A - Cursor display method and related equipment

Cursor display method and related equipment

Info

Publication number
CN117762286A
CN117762286A
Authority
CN
China
Prior art keywords
cursor
display
input device
terminal
display screen
Prior art date
Legal status
Pending
Application number
CN202211170992.3A
Other languages
Chinese (zh)
Inventor
戴天瑶
郑江震
陈欣业
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority to CN202211170992.3A
Publication of CN117762286A
Legal status: Pending

Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

A method of displaying a cursor, comprising: when the input device controlling the display position of a cursor on a display screen is a first input device, displaying the cursor in a first form; and when the input device controlling the display position of the cursor on the display screen is switched from the first input device to a second input device, displaying the cursor in a second form, wherein the first form is different from the second form. In this application, when the terminal device detects that the input device has been switched, the form of the cursor changes, so that the user can more easily find the position of the cursor after the switch.

Description

Cursor display method and related equipment
Technical Field
The present disclosure relates to the field of terminals, and in particular, to a cursor display method and related devices.
Background
Computers are common office equipment. Since a conventional display cannot be operated directly by the user, input devices (typically a mouse, keyboard, or touch pad) are provided for the user to perform input operations. Recently, touch-enabled terminal devices have also begun to be provided with input devices to enable more diverse operations. For example, by configuring a touch pad, keyboard, and the like for a tablet computer, the tablet can offer an operating experience similar to that of a conventional computer while still supporting touch operation. When input is performed with an input device, the terminal device indicates the user's input position by displaying a cursor on the display screen. With touch input, by contrast, the user's finger leaves the screen between operations (for example, the positions of two taps are discontinuous), so a cursor is generally not displayed.
Sometimes, to accommodate different operating needs, a device is provided with several different input devices, for example both a mouse and a touch pad. When switching between these operating modes or devices, the user needs to locate the cursor on the screen. The cursor is usually small, and when the user is far from the screen or the screen is small, the cursor is hard to find, resulting in a poor user experience.
Multi-device collaboration is also receiving increasing emphasis in terminal devices. When devices cooperate, a single set of input devices can control multiple cooperating devices simultaneously, so that each device need not be configured with its own input device, and the cursor indicates which device the input device is currently operating. When control is switched between two devices, the cursor moves from the screen of one device to the screen of the other. Because there is usually some physical distance between the two devices, the movement of the cursor between the screens is invisible, and when the cursor appears on the next screen the user has difficulty finding it, which degrades the user experience.
Disclosure of Invention
The embodiments of the present application provide a cursor display method that, by changing the form of the cursor, makes it easier for the user to notice the cursor in time.
In a first aspect, the present application provides a cursor display method, applied to a terminal device including a display screen, the method comprising: when the input device controlling the display position of a cursor on the display screen is a first input device, displaying the cursor in a first form; and when the input device controlling the display position of the cursor on the display screen is switched from the first input device to a second input device, displaying the cursor in a second form, wherein the first form is different from the second form.
In the embodiments of the present application, when the terminal device detects that the input device has been switched, the form of the cursor changes, so that the user can more easily find the position of the cursor after the switch.
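To make the mechanism concrete, the following is a minimal Kotlin sketch of the first aspect; the type names, device identifiers, and the println stand-ins are illustrative assumptions, not the application's actual implementation.

```kotlin
// Minimal sketch: each controlling input device has its own cursor form,
// so a device switch is immediately visible as a form change.
enum class CursorForm { MOUSE_FORM, TOUCHPAD_FORM }

class CursorRenderer {
    private var activeDevice: String? = null

    // Maps each input device to the form the cursor takes while that
    // device controls the cursor position (assumed identifiers).
    private val formFor = mapOf(
        "mouse" to CursorForm.MOUSE_FORM,
        "touchpad" to CursorForm.TOUCHPAD_FORM
    )

    fun onControlInstruction(deviceId: String, x: Int, y: Int) {
        if (activeDevice != null && activeDevice != deviceId) {
            println("input device switched: $activeDevice -> $deviceId")
        }
        activeDevice = deviceId
        draw(x, y, formFor[deviceId] ?: CursorForm.MOUSE_FORM)
    }

    private fun draw(x: Int, y: Int, form: CursorForm) {
        // Stand-in for the platform's actual cursor-drawing call.
        println("cursor at ($x, $y) in form $form")
    }
}
```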
In one possible implementation, the first form differs from the second form in at least one of cursor size or cursor shape.
In one possible implementation, the cursor in the first form and the cursor in the second form differ in size; for example, the cursor in the first form is larger than the cursor in the second form.
In one possible implementation, the cursor in the first form and the cursor in the second form differ in shape.
In one possible implementation, the cursor in the first form may attract more attention than the cursor in the second form.
In one possible implementation, the method further comprises: determining, based on an acquired control instruction from the input device containing the identity (ID) information of the second input device, that the input device controlling the display position of the cursor on the display screen has been switched from the first input device to the second input device; the control instruction is used to control the display position of the cursor on the display screen.
When the input device transmits a control instruction, it may also transmit attributes of the control instruction. For example, a mouse transmits information indicating its properties at the same time as it transmits its optoelectronic signal. As another example, when a touch pad transmits a control instruction converted from a touch signal over a communication link, a field indicating the attributes of the control instruction may be added during the conversion. By way of example, the control instruction may include an ID field populated with an ID that uniquely identifies a particular input device.
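A minimal sketch of the ID-field idea follows; the instruction layout and field names are assumptions, since the application only states that an ID field may be added.

```kotlin
// The last reported device ID is compared against the ID carried in
// each incoming control instruction (assumed layout).
data class ControlInstruction(val deviceId: String, val dx: Int, val dy: Int)

class SwitchDetector {
    private var lastDeviceId: String? = null

    // Returns true when the ID field shows the controlling device changed.
    fun isSwitch(instr: ControlInstruction): Boolean {
        val switched = lastDeviceId != null && lastDeviceId != instr.deviceId
        lastDeviceId = instr.deviceId
        return switched
    }
}
```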
In one possible implementation, the method further comprises: determining, based on receiving a control instruction from an input device through an interface corresponding to the second input device, that the input device controlling the display position of the cursor on the display screen has been switched from the first input device to the second input device; the control instruction is used to control the display position of the cursor on the display screen.
The terminal device may determine the attributes of a control instruction from the interface through which the instruction is received. For example, a mouse is connected to a designated interface on the motherboard, so the terminal device can determine the source of a control instruction from the interface on which it was received. The purpose of determining the attributes of the control instruction is not to identify a specific input device, but to determine, from those attributes, whether the user has switched input devices; the interface to which the input device is connected can therefore also be regarded as a source attribute of the control instruction. When the input device is connected wirelessly, the interface may be a signal interface used for wireless communication.
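The interface-based variant can be sketched the same way; the interface names below are assumptions, and the point is that only a change of source, not the concrete device, needs to be detected.

```kotlin
// Sketch of interface-based source detection (assumed interface names).
enum class SourceInterface { USB_PORT, BLUETOOTH_HID, INTERNAL_TOUCHPAD }

class SourceTracker {
    private var last: SourceInterface? = null

    // Returns true when an instruction arrives on a different interface
    // than the previous one, i.e. the source attribute changed.
    fun instructionArrivedOn(iface: SourceInterface): Boolean {
        val switched = last != null && last != iface
        last = iface
        return switched
    }
}
```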
In one possible implementation, each of the first input device and the second input device is a touch pad or a mouse.
In one possible implementation, after determining the position at which the cursor executes the control instruction, the terminal device may further determine whether a control exists at the cursor position and what type of control it is. If a control exists, or if a control exists and is of a designated type, the size of the cursor can be changed according to the size of the control, to help the user interact accurately with UI elements of different sizes. Specifically, when the display position of the cursor is within the area of the control, the cursor is displayed in a third form; the third form is different from a fourth form, the fourth form being the form of the cursor when it is displayed in an area outside the control; the third form matches the size of the control.
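The following sketch shows one way to size the cursor to the control under it; the Rect and Control helper types, and the "button" check standing in for the designated control type, are assumptions.

```kotlin
// Hit-test the cursor position against UI controls and pick a cursor
// height: the third form matches the control, the fourth is the default.
data class Rect(val left: Int, val top: Int, val right: Int, val bottom: Int) {
    fun contains(x: Int, y: Int) = x in left..right && y in top..bottom
    val height get() = bottom - top
}

data class Control(val bounds: Rect, val type: String)

fun cursorHeightAt(x: Int, y: Int, controls: List<Control>, defaultHeight: Int): Int {
    val hit = controls.firstOrNull { it.type == "button" && it.bounds.contains(x, y) }
    return hit?.bounds?.height ?: defaultHeight
}
```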
The size of the cursor can also be changed in a vehicle environment based on the speed of the vehicle. When the speed of the vehicle is less than a preset value, the cursor has its default size; when the speed of the vehicle is greater than the preset value, the cursor is enlarged, for example by a factor of 2, so that the driver can operate the device more conveniently while driving. For example, the vehicle system may measure the running speed of the vehicle with various on-board sensors and use that speed as a trigger signal to control changes in the cursor form, such as enlargement, reduction, or shape changes. Specifically, in one possible implementation, the terminal device is an in-vehicle device in a vehicle, and when the movement rate of the vehicle is greater than a threshold, the cursor is displayed in a fifth form; the fifth form is different from a sixth form, the sixth form being the form in which the cursor is displayed when the movement rate of the vehicle is less than the threshold.
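A sketch of the speed trigger; the threshold is an assumed parameter, and the 2x factor follows the example above.

```kotlin
// Fifth form (enlarged) above the threshold, sixth form (default) below it.
fun cursorScale(speedKmH: Double, thresholdKmH: Double): Double =
    if (speedKmH > thresholdKmH) 2.0 else 1.0
```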
In a second aspect, the present application provides a cursor display method, applied to a first terminal device including a display screen, the method comprising: when it is detected that a cursor has moved from an external area into the display screen area of the first terminal device, acquiring an initial display position of the cursor on the display screen, where the external area is the display screen area of a second terminal device, a virtual display area of the second terminal device, or a virtual display area of the first terminal device; and displaying the cursor in a first form at the initial display position, where the first form is different from a second form, the second form being the form of the cursor displayed in the display screen area of the second terminal device.
In the prior art, when an input device controls a cursor to shuttle across the display screens of multiple terminal devices, the cursor keeps the same form throughout, so when the cursor shuttles from the display screen of one terminal device to the display screen of another, the user cannot easily find its position.
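A minimal sketch of the second aspect; the event type and the enlarged arrival form are illustrative assumptions.

```kotlin
// When the cursor enters this device's screen from an external area,
// draw it in a form different from the one used on the previous screen.
data class ShuttleEvent(val fromExternalArea: Boolean, val entryX: Int, val entryY: Int)

fun onCursorEnters(event: ShuttleEvent) {
    if (event.fromExternalArea) {
        // First form: e.g. temporarily enlarged so the user can spot it.
        drawCursor(event.entryX, event.entryY, enlarged = true)
    }
}

fun drawCursor(x: Int, y: Int, enlarged: Boolean) {
    println("cursor at ($x, $y), enlarged=$enlarged")
}
```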
In one possible implementation, the first form differs from the second form in at least one of cursor size or cursor shape.
In one possible implementation, each of the first input device and the second input device is a touch pad or a mouse.
In a third aspect, the present application provides a cursor display device, applied to a terminal device including a display screen, the device including:
an acquisition module, configured to acquire target information, where the target information indicates that the input device used to control the cursor display position on the display screen has been switched from a first input device to a second input device;
a display module, configured to display the cursor in a first form according to the target information, where the first form is different from a second form, the second form being the form of the cursor when the first input device controls the cursor display.
In one possible implementation, the first form differs from the second form in at least one of cursor size or cursor shape.
In one possible implementation, the target information is a control instruction from an input device to which the first terminal device is connected, the control instruction including identity ID information for indicating the input device to which the first terminal device is connected.
In one possible implementation, the first terminal device receives a control instruction from the input device through an interface, and the target information is a type of the interface.
In one possible implementation, each of the first input device and the second input device is a touch pad or a mouse.
In one possible implementation, the display module is further configured to:
when the display position of the cursor is within the area of the target control, display the cursor in a third form; the third form is different from a fourth form, the fourth form being the form of the cursor when it is displayed in an area outside the target control; the third form matches the size of the control.
In one possible implementation, the terminal device is an in-vehicle device in a vehicle, and the display module is further configured to:
display the cursor in a fifth form when the movement rate of the vehicle is greater than a threshold; the fifth form is different from a sixth form, the sixth form being the form in which the cursor is displayed when the movement rate of the vehicle is less than the threshold.
In a fourth aspect, the present application provides a cursor display device, applied to a first terminal device including a display screen, the device including:
an acquisition module, configured to acquire target information, where the target information indicates that the cursor display position has moved from an external area into the display screen area of the first terminal device, the external area being the display screen area of a second terminal device, a virtual display area of the second terminal device, or a virtual display area of the first terminal device;
a display module, configured to display the cursor in a first form according to the target information, where the first form is different from a second form, the second form being the form of the cursor displayed on the second terminal device.
In one possible implementation, the first form differs from the second form in at least one of cursor size or cursor shape.
In one possible implementation, each of the first input device and the second input device is a touch pad or a mouse.
In a fifth aspect, the present application provides an image processing apparatus, including a processor, a memory, a camera, and a bus, wherein the processor, the memory, and the camera are connected through the bus;
the camera is used for collecting videos in real time;
the memory is used for storing computer programs or instructions;
the processor is configured to invoke or execute the programs or instructions stored in the memory, and further to invoke the camera, to implement the steps described in the first aspect and any of its possible implementations, and the steps described in the third aspect and any of its possible implementations.
In a sixth aspect, the present application provides a computer storage medium comprising computer instructions which, when run on an electronic device or a server, perform the steps of the first aspect and any of its possible implementations, and the steps of the third aspect and any of its possible implementations.
In a seventh aspect, the present application provides a computer program product which, when run on an electronic device or a server, performs the steps of the first aspect and any of its possible implementations, and the steps of the third aspect and any of its possible implementations.
In an eighth aspect, the present application provides a chip system including a processor configured to support an execution device or a training device in performing the functions involved in the above aspects, for example, sending or processing the data and/or information involved in the above methods. In one possible design, the chip system further includes a memory for holding the program instructions and data necessary for the execution device or the training device. The chip system may consist of chips, or may include chips and other discrete devices.
Drawings
FIG. 1A is a system block diagram of an embodiment of the present application;
FIG. 1B is a system block diagram of an embodiment of the present application;
fig. 2 is a schematic structural diagram of a terminal device provided in an embodiment of the present application;
fig. 3A is a software architecture block diagram of a terminal device according to an embodiment of the present application;
FIG. 3B is a system block diagram of an embodiment of the present application;
fig. 4 is a schematic diagram of an embodiment of a cursor display method provided in an embodiment of the present application;
FIG. 5 is a schematic illustration of a terminal interface in an embodiment of the present application;
FIG. 6 is a schematic illustration of a terminal interface in an embodiment of the present application;
fig. 7 is a schematic diagram of an embodiment of a method for displaying a cursor according to an embodiment of the present application;
FIG. 8 is a schematic illustration of a terminal interface in an embodiment of the present application;
fig. 9 is a schematic structural diagram of a cursor display device according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a cursor display device according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of a terminal device according to an embodiment of the present application.
Detailed Description
Embodiments of the present invention will be described below with reference to the accompanying drawings. The terminology used in the description of the embodiments is for the purpose of describing particular embodiments only and is not intended to limit the invention.
Embodiments of the present application are described below with reference to the accompanying drawings. As those of ordinary skill in the art will appreciate, with the development of technology and the emergence of new scenarios, the technical solutions provided in the embodiments of the present application are equally applicable to similar technical problems.
The terms "first", "second", and the like in the description, claims, and drawings of the present application are used to distinguish between similar objects and do not necessarily describe a particular order or sequence. It should be understood that terms so used are interchangeable under appropriate circumstances, and merely distinguish objects of the same nature when describing the embodiments of the application. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of elements is not necessarily limited to those elements, but may include other elements not expressly listed or inherent to such a process, method, article, or apparatus.
Some related terms and concepts referred to in the embodiments of the present application are described below.
A pixel (PX) is the basic unit of image display. Each pixel may have its own color value and may be displayed using three primary colors, for example divided into three sub-pixels of red, green, and blue (the RGB gamut), or into cyan, magenta, yellow, and black (the CMYK gamut). An image is a collection of pixels; generally, the more pixels per unit area, the higher the resolution and the closer the displayed image is to a real object. On an electronic device, the number of pixels can be divided into the number of horizontal pixels and the number of vertical pixels: the number of horizontal pixels indicates how many pixels are contained in the horizontal direction, and the number of vertical pixels indicates how many are contained in the vertical direction.
Resolution is the number of pixels in the horizontal and vertical directions, in units of px (1 px = 1 pixel). The resolution determines how much information can be displayed, measured as the number of horizontal pixels by the number of vertical pixels, i.e., resolution = number of horizontal pixels x number of vertical pixels, such as 1920 x 1080. For images of the same physical size, when the resolution is relatively low (e.g., 640 x 480), fewer pixels are displayed, each pixel is larger, and the display effect is coarse; when the resolution is relatively high (e.g., 1600 x 1200), more pixels are displayed, each pixel is smaller, and the display effect is finer.
A user interface (UI) is a medium for interaction and information exchange between an application or operating system and the user; it converts between the internal form of information and a form acceptable to the user. A commonly used presentation form of the user interface is the graphical user interface (GUI), which is a user interface related to computer operations that is displayed graphically. It may comprise interface elements such as icons, windows, and controls displayed on the display screen of the electronic device, where a control may include visual interface elements such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, or a Widget.
An input device is a device through which data and information are input into an electronic device; it is the bridge for the user to communicate with the electronic device, or for the electronic device to communicate with other electronic devices. Input devices are one of the main means of information exchange between the user and the electronic device, including but not limited to keyboards, mice, cameras, scanners, tablets, styluses, joysticks, touch screens (touch panels), voice input devices, and so forth. An input device inputs into the electronic device the input data generated from detected user operations; the generated input data may be numeric or non-numeric, such as graphics, images, and sounds. The embodiments of the present application do not limit the type of the input device or the input data it generates.
Fig. 1A illustrates a communication system 10 provided in an embodiment of the present application. The communication system 10 may comprise an electronic device 100 and an electronic device 200 (either of which may also be referred to herein as a terminal device), between which a first connection 105 is established. The electronic device 100 may be a smart phone, a tablet computer, a notebook computer, a desktop computer, or another type of electronic device, which is not limited in this application. In some embodiments, as shown in fig. 1A, the electronic device 100 may be a PC and the electronic device 200 may be a tablet computer.
In the communication system 10 shown in fig. 1A, the electronic device 100 (e.g., a PC) may include a display 101 and input devices such as a mouse 102 and a keyboard 103. Input devices such as the mouse 102 and the keyboard 103 may be connected to the electronic device 100 through a wired connection, such as a universal serial bus (USB) connection, or through a wireless connection, such as a Bluetooth (BT) or wireless fidelity (Wi-Fi) connection; the connection mode of the input device is not particularly limited here. After input devices such as the mouse 102 and keyboard 103 are connected to the electronic device 100, the user can input content to the electronic device 100 through them.
The electronic device 200 may include a screen 106 or the like, and the screen 106 may be used to receive touch operations from a user and display a corresponding user interface.
A first connection 105 is established between the electronic device 100 and the electronic device 200. The first connection 105 may be a wired connection, such as a USB connection, or a wireless connection, such as a Bluetooth or Wi-Fi connection; the type of the first connection is not limited in the embodiments of the present application. The electronic device 100 and the electronic device 200 may each have a Bluetooth (BT) module and/or a wireless local area network (WLAN) module. The Bluetooth module may provide one or more Bluetooth communication solutions including classic Bluetooth (Bluetooth 2.1) or Bluetooth low energy (BLE), and the WLAN module may provide one or more WLAN communication solutions including wireless fidelity peer-to-peer (Wi-Fi P2P), wireless fidelity local area network (Wi-Fi LAN), or wireless fidelity software access point (Wi-Fi softAP). In some embodiments, the first connection 105 may be Wi-Fi P2P. Wi-Fi P2P means that devices in a wireless network are allowed to connect to each other in a point-to-point fashion without passing through a wireless router; it may also be referred to as Wi-Fi direct. Devices connected via Wi-Fi P2P can exchange data directly over Wi-Fi (they must be in the same frequency band) without connecting to a network or a hotspot, realizing point-to-point communication such as the transmission of files, pictures, and videos. Compared with Bluetooth, Wi-Fi P2P has advantages such as faster search and transmission speeds and a longer transmission range.
The electronic device 100 and the electronic device 200 may transmit data over the first connection 105. For example, in some embodiments, the electronic device 100 may send the coordinate data and input events of the mouse 102 and the input events of the keyboard 103 to the electronic device 200 via the first connection 105. Upon receiving a message sent by the electronic device 100, the electronic device 200 may display the cursor 110 on the screen 106 or respond correspondingly to the input events of the mouse 102 and keyboard 103. Accordingly, the user can perform input operations on the electronic device 200 using the mouse 102 and keyboard 103 of the electronic device 100: the two devices share one set of input devices, and the electronic device 200 does not need to be additionally equipped with its own input device.
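A sketch of what forwarding an input event over the first connection 105 might look like; the 5-byte wire format is an assumption made purely for illustration.

```kotlin
import java.nio.ByteBuffer

// Pack mouse deltas and button state into a small frame on device 100;
// device 200 unpacks and replays the event locally.
fun packMouseEvent(dx: Short, dy: Short, buttons: Byte): ByteArray =
    ByteBuffer.allocate(5).putShort(dx).putShort(dy).put(buttons).array()

fun unpackMouseEvent(bytes: ByteArray): Triple<Short, Short, Byte> {
    val buf = ByteBuffer.wrap(bytes)
    return Triple(buf.short, buf.short, buf.get())
}
```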
The electronic device 100 and the electronic device 200 may each run an operating system; the operating systems of the electronic device 100 and the electronic device 200 may be the same or different, which is not limited in this application.
In some embodiments, as shown in fig. 1A, the electronic device 100 displays an interface 104, which is the desktop of the electronic device 100; controls 109 and 111 may be displayed in the interface 104. Control 111 may indicate that the electronic device 100 has established a Bluetooth connection with another electronic device (here, the electronic device 200), and control 109 may indicate that the electronic device 100 shares its input devices, such as the mouse 102 and keyboard 103, with the connected device (here, the electronic device 200). The screen 106 of the electronic device 200 may display an interface 107, which may be the desktop of the electronic device 200; controls 108 and 112 may be displayed in the interface 107. Control 112 may indicate that the electronic device 200 has established a Bluetooth connection with another electronic device (here, the electronic device 100). Control 108 may indicate that the electronic device 200 has established a connection for sharing input devices, i.e., that the electronic device 200 can use the input devices of the connected device, such as the mouse 102 and keyboard 103 of the electronic device 100. When the user moves the mouse cursor (also simply called the cursor) to an edge of the interface 104 of the electronic device 100, the cursor 110 can shuttle to an edge of the interface 107 of the electronic device 200; the cursor 110 is then displayed on the screen 106 of the electronic device 200 and changes position as the mouse 102 moves.
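A sketch of the edge-detection trigger behind this shuttle; the left-to-right screen arrangement and coordinate handling are assumptions.

```kotlin
// When the cursor crosses the right edge of device 100's screen, hand
// control over and re-create the cursor at the left edge of device 200.
data class ScreenPos(val x: Int, val y: Int)

// Returns the entry position on device 200's screen, or null if the
// cursor has not reached the edge yet.
fun shuttlePosition(pos: ScreenPos, widthOf100: Int): ScreenPos? =
    if (pos.x >= widthOf100 - 1) ScreenPos(0, pos.y) else null
```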
In addition, the communication system provided in the embodiments of the present application may also take the form shown in fig. 1B. This communication system 10 may include an on-screen terminal 001 and an input device 003. The on-screen terminal 001 may be equipped with an operating system and support operation through the externally connected input device 003; when operation is performed through the input device 003, a cursor 002 is provided, and the on-screen terminal 001 supports the expanded connection of multiple input devices 003. The cursor 002 may be displayed on the display screen of the on-screen terminal 001 to indicate the position at which the user is operating. The input device 003 may be connected to the on-screen terminal 001 in a wired or wireless manner as an extension device of the on-screen terminal 001; it is typically a mouse, a touch pad, or the like. The user changes the position or form of the cursor on the on-screen terminal 001 by operating the input device 003, and the on-screen terminal executes the program function corresponding to the position at which the cursor operates. Alternatively, one input device 003 may be used at different times to control several different on-screen terminals 001.
Fig. 2 shows a schematic structure of the terminal 200.
The terminal 200 may be a cell phone, tablet computer, desktop computer, laptop computer, handheld computer, notebook computer, ultra-mobile personal computer (UMPC), netbook, cellular telephone, personal digital assistant (PDA), augmented reality (AR) device, virtual reality (VR) device, artificial intelligence (AI) device, wearable device, in-vehicle device, smart home device, and/or smart city device, and may have built-in or externally connected input devices such as a keyboard or mouse; the specific type of the electronic device is not particularly limited in the embodiments of the present application.
The terminal 200 may include a processor 210, an external memory interface 220, an internal memory 221, a universal serial bus (universal serial bus, USB) interface 230, a charge management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone interface 270D, a sensor module 280, keys 290, a motor 291, an indicator 292, a camera 293, a display 294, and a subscriber identity module (subscriber identification module, SIM) card interface 295, etc. The sensor module 280 may include a pressure sensor 280A, a gyroscope sensor 280B, a barometric sensor 280C, a magnetic sensor 280D, an acceleration sensor 280E, a distance sensor 280F, a proximity sensor 280G, a fingerprint sensor 280H, a temperature sensor 280J, a touch sensor 280K, an ambient light sensor 280L, a bone conduction sensor 280M, and the like.
It should be understood that the structure illustrated in the embodiments of the present invention does not constitute a specific limitation on the terminal 200. In other embodiments of the present application, the terminal 200 may include more or fewer components than illustrated, some components may be combined or split, or the components may be arranged differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 210 may include one or more processing units. For example, the processor 210 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be separate devices or may be integrated in one or more processors.
The controller can generate operation control signals according to instruction operation codes and timing signals, to control instruction fetching and instruction execution.
A memory may also be provided in the processor 210 for storing instructions and data. In some embodiments, the memory in the processor 210 is a cache. The memory may hold instructions or data that the processor 210 has just used or uses cyclically. If the processor 210 needs to use the instructions or data again, it can call them directly from the memory. This avoids repeated accesses, reduces the waiting time of the processor 210, and thereby improves the efficiency of the system.
In some embodiments, processor 210 may include one or more interfaces. The interfaces may include an integrated circuit (inter-integrated circuit, I2C) interface, an integrated circuit built-in audio (inter-integrated circuit sound, I2S) interface, a pulse code modulation (pulse code modulation, PCM) interface, a universal asynchronous receiver transmitter (universal asynchronous receiver/transmitter, UART) interface, a mobile industry processor interface (mobile industry processor interface, MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (subscriber identity module, SIM) interface, and/or a universal serial bus (universal serial bus, USB) interface, among others.
The I2C interface is a bi-directional synchronous serial bus comprising a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 210 may contain multiple sets of I2C buses. The processor 210 may be coupled to the touch sensor 280K, charger, flash, camera 293, etc., respectively, through different I2C bus interfaces. For example: the processor 210 may be coupled to the touch sensor 280K through an I2C interface, so that the processor 210 and the touch sensor 280K communicate through an I2C bus interface to implement a touch function of the terminal 200.
The I2S interface may be used for audio communication. In some embodiments, the processor 210 may contain multiple sets of I2S buses. The processor 210 may be coupled to the audio module 270 via an I2S bus to enable communication between the processor 210 and the audio module 270. In some embodiments, the audio module 270 may communicate audio signals to the wireless communication module 260 through the I2S interface to implement a function of answering a call through a bluetooth headset.
PCM interfaces may also be used for audio communication to sample, quantize and encode analog signals. In some embodiments, the audio module 270 and the wireless communication module 260 may be coupled by a PCM bus interface. In some embodiments, the audio module 270 may also transmit audio signals to the wireless communication module 260 through the PCM interface to implement a function of answering a call through the bluetooth headset. Both the I2S interface and the PCM interface may be used for audio communication.
The UART interface is a universal serial data bus for asynchronous communications. The bus may be a bi-directional communication bus. It converts the data to be transmitted between serial communication and parallel communication. In some embodiments, a UART interface is typically used to connect the processor 210 with the wireless communication module 260. For example: the processor 210 communicates with a bluetooth module in the wireless communication module 260 through a UART interface to implement a bluetooth function. In some embodiments, the audio module 270 may transmit an audio signal to the wireless communication module 260 through a UART interface, implementing a function of playing music through a bluetooth headset.
The MIPI interface may be used to connect the processor 210 to peripheral devices such as the display 294, the camera 293, and the like. The MIPI interfaces include camera serial interfaces (camera serial interface, CSI), display serial interfaces (display serial interface, DSI), and the like. In some embodiments, processor 210 and camera 293 communicate through a CSI interface to implement the photographing function of terminal 200. The processor 210 and the display 294 communicate through a DSI interface to implement the display function of the terminal 200.
The GPIO interface may be configured by software. The GPIO interface may be configured as a control signal or as a data signal. In some embodiments, a GPIO interface may be used to connect the processor 210 with the camera 293, display 294, wireless communication module 260, audio module 270, sensor module 280, and the like. The GPIO interface may also be configured as an I2C interface, an I2S interface, a UART interface, an MIPI interface, etc.
The USB interface 230 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, or the like. The USB interface 230 may be used to connect a charger to charge the terminal 200, or to transfer data between the terminal 200 and peripheral devices. It can also be used to connect headphones and play audio through them, and to connect a mouse and a keyboard, through which operation instructions and character strings can be input. The interface may also be used to connect other electronic devices, such as AR devices.
It should be understood that the interfacing relationship between the modules illustrated in the embodiment of the present invention is only illustrative, and does not limit the structure of the terminal 200. In other embodiments of the present application, the terminal 200 may also use different interfacing manners, or a combination of multiple interfacing manners in the foregoing embodiments.
The charge management module 240 is configured to receive a charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charge management module 240 may receive the charging input of a wired charger through the USB interface 230. In some wireless charging embodiments, the charge management module 240 may receive wireless charging input through a wireless charging coil of the terminal 200. The charge management module 240 may also supply power to the electronic device through the power management module 241 while charging the battery 242.
The power management module 241 is used to connect the battery 242, the charge management module 240, and the processor 210. The power management module 241 receives input from the battery 242 and/or the charge management module 240 and supplies power to the processor 210, the internal memory 221, the display 294, the camera 293, the wireless communication module 260, and the like. The power management module 241 may also be used to monitor parameters such as battery capacity, battery cycle count, and battery health (leakage, impedance). In other embodiments, the power management module 241 may also be disposed in the processor 210. In other embodiments, the power management module 241 and the charge management module 240 may be disposed in the same device.
The wireless communication function of the terminal 200 can be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in terminal 200 may be configured to cover a single or multiple communication bands. Different antennas may also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed into a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 250 may provide a solution including 2G/3G/4G/5G wireless communication applied on the terminal 200. The mobile communication module 250 may include at least one filter, switch, power amplifier, low noise amplifier (low noise amplifier, LNA), etc. The mobile communication module 250 may receive electromagnetic waves from the antenna 1, perform processes such as filtering, amplifying, and the like on the received electromagnetic waves, and transmit the processed electromagnetic waves to the modem processor for demodulation. The mobile communication module 250 can amplify the signal modulated by the modem processor, and convert the signal into electromagnetic waves through the antenna 1 to radiate. In some embodiments, at least some of the functional modules of the mobile communication module 250 may be disposed in the processor 210. In some embodiments, at least some of the functional modules of the mobile communication module 250 may be provided in the same device as at least some of the modules of the processor 210.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating the low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then transmits the demodulated low frequency baseband signal to the baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to speaker 270A, receiver 270B, etc.), or displays images or video through display screen 294. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 250 or other functional module, independent of the processor 210.
The wireless communication module 260 may provide solutions for wireless communication including wireless local area network (wireless local area networks, WLAN) (e.g., wireless fidelity (wireless fidelity, Wi-Fi) network), Bluetooth (BT), global navigation satellite system (global navigation satellite system, GNSS), frequency modulation (frequency modulation, FM), near field communication (near field communication, NFC), infrared technology (infrared, IR), etc., applied on the terminal 200. The wireless communication module 260 may be one or more devices that integrate at least one communication processing module. The wireless communication module 260 receives electromagnetic waves via the antenna 2, modulates and filters the electromagnetic wave signals, and transmits the processed signals to the processor 210. The wireless communication module 260 may also receive a signal to be transmitted from the processor 210, frequency modulate it, amplify it, and convert it to electromagnetic waves for radiation via the antenna 2.
In some embodiments, antenna 1 and mobile communication module 250 of terminal 200 are coupled, and antenna 2 and wireless communication module 260 are coupled, so that terminal 200 may communicate with a network and other devices through wireless communication techniques. The wireless communication techniques may include the global system for mobile communications (global system for mobile communications, GSM), general packet radio service (general packet radio service, GPRS), code division multiple access (code division multiple access, CDMA), wideband code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (time-division code division multiple access, TD-SCDMA), long term evolution (long term evolution, LTE), BT, GNSS, WLAN, NFC, FM, and/or IR techniques, among others. The GNSS may include the global positioning system (global positioning system, GPS), the global navigation satellite system (global navigation satellite system, GLONASS), the BeiDou navigation satellite system (beidou navigation satellite system, BDS), the quasi-zenith satellite system (quasi-zenith satellite system, QZSS) and/or satellite based augmentation systems (satellite based augmentation systems, SBAS).
Terminal 200 implements display functions through a GPU, display screen 294, application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 294 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or change display information.
The display 294 is used to display images, videos, and the like. The display 294 includes a display panel. The display panel may employ a liquid crystal display (liquid crystal display, LCD), an organic light-emitting diode (organic light-emitting diode, OLED), an active-matrix organic light-emitting diode (active-matrix organic light emitting diode, AMOLED), a flexible light-emitting diode (flex light-emitting diode, FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (quantum dot light emitting diodes, QLED), or the like. In some embodiments, terminal 200 may include 1 or N displays 294, N being a positive integer greater than 1.
The terminal 200 may implement a photographing function through an ISP, a camera 293, a video codec, a GPU, a display 294, an application processor, and the like.
The ISP is used to process the data fed back by the camera 293. For example, when photographing, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electric signal, and the camera photosensitive element transmits the electric signal to the ISP for processing and is converted into an image visible to naked eyes. ISP can also optimize the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in the camera 293.
The camera 293 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image onto the photosensitive element. The photosensitive element may be a charge coupled device (charge coupled device, CCD) or a Complementary Metal Oxide Semiconductor (CMOS) phototransistor. The photosensitive element converts the optical signal into an electrical signal, which is then transferred to the ISP to be converted into a digital image signal. The ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into an image signal in a standard RGB, YUV, or the like format. In some embodiments, terminal 200 may include 1 or N cameras 293, N being a positive integer greater than 1.
The digital signal processor is used to process digital signals, and can process other digital signals in addition to digital image signals. For example, when the terminal 200 selects a frequency point, the digital signal processor is used to perform a Fourier transform on the frequency point energy, etc.
Video codecs are used to compress or decompress digital video. The terminal 200 may support one or more video codecs, so that the terminal 200 can play or record video in multiple encoding formats, for example: moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, etc.
The NPU is a neural-network (NN) computing processor that processes input information rapidly by drawing on the structure of biological neural networks, for example the transfer mode between human brain neurons, and can also continuously self-learn. Applications such as intelligent cognition of the terminal 200, for example image recognition, face recognition, speech recognition, and text understanding, can be implemented through the NPU.
The internal memory 221 may include one or more random access memories (random access memory, RAM) and one or more non-volatile memories (NVM).
The random access memory may include a static random-access memory (SRAM), a dynamic random-access memory (DRAM), a synchronous dynamic random-access memory (SDRAM), a double data rate synchronous dynamic random-access memory (DDR SDRAM; for example, fifth-generation DDR SDRAM is commonly referred to as DDR5 SDRAM), and the like.
the nonvolatile memory may include a disk storage device, a flash memory (flash memory).
The FLASH memory may include NOR FLASH, NAND FLASH, 3D NAND FLASH, etc. divided according to an operation principle, may include single-level memory cells (SLC), multi-level memory cells (MLC), triple-level memory cells (TLC), quad-level memory cells (QLC), etc. divided according to a storage specification, may include universal FLASH memory (english: universalflash storage, UFS), embedded multimedia memory cards (embedded multi media Card, eMMC), etc. divided according to a storage specification.
The random access memory may be directly read from and written to by the processor 210; it may be used to store executable programs (e.g., machine instructions) of an operating system or other running programs, and may also be used to store data of users and applications, and the like. The nonvolatile memory may also store executable programs, data of users and applications, and the like, which may be loaded into the random access memory in advance for the processor 210 to read and write directly.
The external memory interface 220 may be used to connect an external nonvolatile memory, so as to extend the storage capability of the terminal 200. The external nonvolatile memory communicates with the processor 210 through the external memory interface 220 to implement a data storage function, for example storing files such as music and videos in the external nonvolatile memory.
The terminal 200 may implement audio functions through an audio module 270, a speaker 270A, a receiver 270B, a microphone 270C, an earphone interface 270D, an application processor, and the like. Such as music playing, recording, etc.
The audio module 270 is used to convert digital audio information into an analog audio signal output and also to convert an analog audio input into a digital audio signal. The audio module 270 may also be used to encode and decode audio signals. In some embodiments, the audio module 270 may be disposed in the processor 210, or some functional modules of the audio module 270 may be disposed in the processor 210.
Speaker 270A, also referred to as a "horn," is used to convert audio electrical signals into sound signals. The terminal 200 can play music or take a hands-free call through the speaker 270A.
A receiver 270B, also referred to as a "earpiece", is used to convert the audio electrical signal into a sound signal. When terminal 200 is answering a telephone call or voice message, voice can be received by placing receiver 270B close to the human ear.
The microphone 270C, also referred to as a "mic" or "mike", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can speak with the mouth near the microphone 270C, inputting a sound signal into it. The terminal 200 may be provided with at least one microphone 270C. In other embodiments, the terminal 200 may be provided with two microphones 270C, which can implement a noise reduction function in addition to collecting sound signals. In still other embodiments, the terminal 200 may be provided with three, four, or more microphones 270C to collect sound signals, reduce noise, identify the source of sound, implement directional recording functions, and the like.
The earphone interface 270D is used to connect a wired earphone. The earphone interface 270D may be the USB interface 130, a 3.5 mm open mobile terminal platform (OMTP) standard interface, or a Cellular Telecommunications Industry Association of the USA (CTIA) standard interface.
The pressure sensor 280A is used to sense a pressure signal and may convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 280A may be disposed on the display 294. There are many types of pressure sensors 280A, such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors. A capacitive pressure sensor may comprise at least two parallel plates of conductive material. When a force is applied to the pressure sensor 280A, the capacitance between the electrodes changes, and the terminal 200 determines the strength of the pressure according to the change in capacitance. When a touch operation acts on the display 294, the terminal 200 detects the intensity of the touch operation according to the pressure sensor 280A. The terminal 200 may also calculate the touch position based on the detection signal of the pressure sensor 280A. In some embodiments, touch operations that act on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
The gyro sensor 280B may be used to determine a motion gesture of the terminal 200. In some embodiments, the angular velocity of terminal 200 about three axes (i.e., x, y, and z axes) may be determined by gyro sensor 280B. The gyro sensor 280B may be used for photographing anti-shake. Illustratively, when the shutter is pressed, the gyro sensor 280B detects the shake angle of the terminal 200, calculates the distance to be compensated by the lens module according to the angle, and makes the lens counteract the shake of the terminal 200 through the reverse motion, thereby realizing anti-shake. The gyro sensor 280B may also be used for navigating, somatosensory game scenes.
The air pressure sensor 280C is used to measure air pressure. In some embodiments, the terminal 200 calculates altitude from barometric pressure values measured by the barometric pressure sensor 280C, aiding in positioning and navigation.
The magnetic sensor 280D includes a Hall sensor. The terminal 200 may detect the opening and closing of a flip cover or case using the magnetic sensor 280D. In some embodiments, when the terminal 200 is a flip device, the terminal 200 may detect the opening and closing of the flip according to the magnetic sensor 280D, and then set features such as automatic unlocking upon flip-open according to the detected opening/closing state of the case or flip.
The acceleration sensor 280E may detect the magnitude of the acceleration of the terminal 200 in various directions (typically three axes). The magnitude and direction of gravity may be detected when the terminal 200 is stationary. It can also be used to recognize the posture of the electronic device, for applications such as landscape/portrait switching and pedometers.
A distance sensor 280F for measuring distance. The terminal 200 may measure the distance by infrared or laser. In some embodiments, the terminal 200 may range using the distance sensor 280F to achieve fast focusing.
Proximity light sensor 280G may include, for example, a Light Emitting Diode (LED) and a light detector, such as a photodiode. The light emitting diode may be an infrared light emitting diode. The terminal 200 emits infrared light outward through the light emitting diode. The terminal 200 detects infrared reflected light from nearby objects using a photodiode. When sufficient reflected light is detected, it may be determined that there is an object near the terminal 200. When insufficient reflected light is detected, the terminal 200 may determine that there is no object in the vicinity of the terminal 200. The terminal 200 can detect that the user holds the terminal 200 close to the ear by using the proximity light sensor 280G, so as to automatically extinguish the screen for the purpose of saving power. The proximity light sensor 280G may also be used in holster mode, pocket mode to automatically unlock and lock the screen.
The ambient light sensor 280L is used to sense ambient light level. The terminal 200 may adaptively adjust the brightness of the display 294 according to the perceived ambient light level. The ambient light sensor 280L may also be used to automatically adjust white balance during photographing. The ambient light sensor 280L may also cooperate with the proximity light sensor 280G to detect whether the terminal 200 is in a pocket to prevent false touches.
The fingerprint sensor 280H is used to collect a fingerprint. The terminal 200 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access the application lock, fingerprint photographing, fingerprint incoming call answering and the like.
The temperature sensor 280J is used to detect temperature. In some embodiments, the terminal 200 executes a temperature processing strategy using the temperature detected by the temperature sensor 280J. For example, when the temperature reported by the temperature sensor 280J exceeds a threshold, the terminal 200 reduces the performance of a processor located near the temperature sensor 280J, so as to reduce power consumption and implement thermal protection. In other embodiments, when the temperature is below another threshold, the terminal 200 heats the battery 242 to prevent the low temperature from causing an abnormal shutdown. In still other embodiments, when the temperature is below a further threshold, the terminal 200 boosts the output voltage of the battery 242 to avoid an abnormal shutdown caused by low temperature.
The touch sensor 280K, also referred to as a "touch device". The touch sensor 280K may be disposed on the display screen 294, and the touch sensor 280K and the display screen 294 form a touch screen, which is also referred to as a "touch screen". The touch sensor 280K is used to detect a touch operation acting on or near it. The touch sensor may communicate the detected touch operation to the application processor to determine the touch event type. Visual output related to touch operations may be provided through the display 294. In other embodiments, the touch sensor 280K may also be disposed on a surface of the terminal 200 at a different location than the display 294.
The bone conduction sensor 280M may acquire a vibration signal. In some embodiments, the bone conduction sensor 280M may acquire the vibration signal of the vibrating bone mass of the human vocal part. The bone conduction sensor 280M may also contact the human pulse to receive a blood pressure pulsation signal. In some embodiments, the bone conduction sensor 280M may also be provided in a headset, combined into a bone conduction headset. The audio module 270 may parse out a voice signal based on the vibration signal of the vibrating bone mass of the vocal part obtained by the bone conduction sensor 280M, so as to implement a voice function. The application processor may parse heart rate information based on the blood pressure pulsation signal acquired by the bone conduction sensor 280M, so as to implement a heart rate detection function.
Keys 290 include a power on key, a volume key, etc. The keys 290 may be mechanical keys. Or may be a touch key. The terminal 200 may receive key inputs, generating key signal inputs related to user settings and function controls of the terminal 200.
The motor 291 may generate a vibration alert. The motor 291 may be used for incoming call vibration alerting or for touch vibration feedback. For example, touch operations acting on different applications (e.g., photographing, audio playing, etc.) may correspond to different vibration feedback effects. The motor 291 may also correspond to different vibration feedback effects by touch operations applied to different areas of the display 294. Different application scenarios (such as time reminding, receiving information, alarm clock, game, etc.) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 292 may be an indicator light, which may be used to indicate a state of charge, a change in power, a message indicating a missed call, a notification, etc.
The SIM card interface 295 is used to connect a SIM card. The SIM card may be inserted into the SIM card interface 295 or withdrawn from it, to achieve contact with and separation from the terminal 200. The terminal 200 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 295 may support Nano SIM cards, Micro SIM cards, and the like. Multiple cards may be inserted into the same SIM card interface 295 at the same time; the types of the cards may be the same or different. The SIM card interface 295 may also be compatible with different types of SIM cards, and with external memory cards. The terminal 200 interacts with the network through the SIM card to implement functions such as calls and data communication. In some embodiments, the terminal 200 employs an eSIM, i.e., an embedded SIM card; the eSIM card may be embedded in the terminal 200 and cannot be separated from it.
The following describes a system software architecture provided in the embodiments of the present application, taking the communication system 10 formed by the electronic device 100 and the electronic device 200 as an example.
Fig. 3A is a block diagram of a system software architecture of the communication system 10 provided in an embodiment of the present application.
As shown in fig. 3A, the software architecture of the communication system 10 includes an electronic device 100 and an electronic device 200, and a first connection may be established between the electronic device 100 and the electronic device 200 and the communication may be performed through the first connection, where the first connection may be a bluetooth connection, a Wi-Fi connection, or the like, and the connection manner of the first connection is not limited in this embodiment.
In some embodiments, the software architecture of the communication system 10 may be divided into an application & kernel layer and a device layer. The application layer may include a series of application packages. The system software architecture is described here taking the electronic device 100 as a PC and the electronic device 200 as a tablet computer as an example. The input devices of the PC may be a mouse 102, a keyboard 103, etc., and the tablet computer may have a touch screen.
The device layers of the electronic device 100 may include input and output devices such as a display 101, a mouse 102, a keyboard 103, and the like. Wherein,
The display 101 is an output device for displaying images, videos, and the like. The display 101 includes a display panel. The electronic device 100 may include 1 or N displays 101, N being a positive integer greater than 1.
The mouse 102 is an input device: a pointer used to position the horizontal and vertical coordinates in the display system of the electronic device, which can make operating the electronic device simpler and faster. Mouse types may include trackball mice, optical mice, wireless mice, and the like; in this application, the mouse may be generalized to any device that can generate a cursor and perform clicks.
The keyboard 103 is an input device through which a user can input characters, numerals, punctuation marks, control instructions, and the like to the electronic device.
The application & kernel layers of the electronic device 100 may include a display driver 311, a mouse driver 312, a keyboard driver 313, and the like. The driver can communicate with the hardware device through the bus to control the hardware to enter various working states and acquire the values of the relevant registers of the device, so as to obtain the state of the device. User operational events such as mouse input, keyboard input, rotating an electronic device, etc., may be acquired, such as by a driver, and converted into data.
The display driver 311 may be a program for driving a display.
The mouse driver 312 may be responsible for three tasks: first, displaying the mouse cursor on the screen and maintaining its movement; second, providing applications with the state of the mouse, including the position of the cursor on the screen and whether each mouse button is pressed or released; third, providing applications with auxiliary functions for certain mouse operations.
The keyboard driver 313 is an interrupt program that generates a scan code based on the key pressed, obtains the corresponding American Standard Code for Information Interchange (ASCII) code from the scan code, and then places it in a cache queue for output or other invocation.
The application & kernel layer of the electronic device 100 may also include a virtual screen management module 314, an input event generation module 315, an input event transmission module 316, and the like.
The virtual screen management module 314 may be used to create virtual screens. In some embodiments, for example in the Windows 10 operating system, a virtual screen may be created by creating an IDDCX_MONITOR object; the created virtual screen may have the same resolution as the display screen of the electronic device 200, and the virtual screen is not visible to the user. The electronic device 100 creates the virtual screen so that the cursor can pass beyond the display screen of the electronic device 100, and the coordinates of the cursor on the virtual screen can be sent directly to the electronic device 200 without complex coordinate transformations. If no virtual screen is created and no new display screen is externally connected, the cursor is confined to the edges of the display screen of the electronic device 100, and jump display of the cursor between different display screens (including the virtual screen) cannot be achieved.
In one possible implementation, there is a mapping relationship between the pixel position on the virtual screen and the position on the display screen of the electronic device 200, and after the cursor moves onto the virtual screen, since the resolution of the virtual screen is the same as that of the display screen of the electronic device 200, the coordinates of the cursor on the virtual screen can be directly sent to the electronic device 200, without requiring complex coordinate conversion, which is simple and convenient, and saves consumption of CPU resources.
In one possible implementation, the virtual screen is a transition region between the electronic device 100 and the electronic device 200, and when the cursor moves from the electronic device 100 display to the electronic device 200 display, the cursor may move from the electronic device 100 display to the electronic device 200 display across the virtual screen, and the cursor may not be displayed while on the virtual screen. For example, as shown in fig. 3B.
The input event generation module 315 may be configured to convert an acquired input event of an input device into a corresponding input event that can act on the electronic device 200. For example, when detecting that the cursor reaches the edge of the display screen of the electronic device 100, the electronic device 100 may calculate the starting position at which the cursor should be displayed on the display screen of the electronic device 200 and send it to the electronic device 200 through the input event sending module 316; the electronic device 200 receives the message and displays the cursor at the corresponding position, forming the visual effect of the cursor shuttling from the electronic device 100 to the electronic device 200. After the cursor moves to the display of the electronic device 200, input events of the input devices of the electronic device 100 (such as a mouse, a keyboard, or a handwriting tablet) are captured, and corresponding input events that can act on the electronic device 200 are generated according to the mappings in a first mapping table and sent to the electronic device 200. The input events include, but are not limited to, mouse movement events, mouse click events, mouse wheel scroll events, keyboard input events, remote joystick movement events, voice input events, and the like. For example, the electronic device 100 may map an input event of its own system to an input event of the system of the electronic device 200 according to the first mapping table, and the mapped event can act on the electronic device 200: an event of clicking the left mouse button on the electronic device 100 may be mapped to a single-tap event on the electronic device 200, and an event of clicking the right mouse button may be mapped to a long-press event. The key value of a first key in one system may be mapped to the corresponding key value in the other system (for example, both representing the character "a"), since the key codes of the two systems may not be identical.
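As an illustration only, the first mapping table could be modeled as a simple lookup from source-system events to target-system events. The event names below are hypothetical; a real implementation would use the two systems' actual event and key codes:

```cpp
#include <cstdint>
#include <map>

// Hypothetical event codes; the two systems' real codes need not match.
enum class SrcEvent : uint32_t { LeftClick, RightClick, WheelScroll, KeyA };
enum class DstEvent : uint32_t { Tap, LongPress, Scroll, KeyA };

// A minimal "first mapping table": source-system input events are mapped
// to input events that can act on the electronic device 200.
static const std::map<SrcEvent, DstEvent> kFirstMappingTable = {
    {SrcEvent::LeftClick,   DstEvent::Tap},       // left click -> single tap
    {SrcEvent::RightClick,  DstEvent::LongPress}, // right click -> long press
    {SrcEvent::WheelScroll, DstEvent::Scroll},
    {SrcEvent::KeyA,        DstEvent::KeyA},      // same character "a", different code
};

DstEvent MapInputEvent(SrcEvent e) {
    return kFirstMappingTable.at(e);
}
```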
The input event transmitting module 316 may be configured to transmit an input event or the like to the electronic device 200 through the first connection by the electronic device 100.
The device layers of the electronic device 200 may include a touch sensor 321 and a display screen 322. Wherein,
the touch sensor 321, also referred to as a "touch panel". The touch sensor 321 may be disposed on the display screen 322, and the touch sensor 321 and the display screen 322 form a touch screen, which is also called a "touch screen". The touch sensor 321 is used to detect a touch operation acting thereon or thereabout. The touch sensor 321 may communicate the detected touch operation to the application processor to determine the touch event type.
The display screen 322 is an output device that can be used to display images and colors. The display 322 may provide visual output related to touch operations.
The application & kernel layers of the electronic device 200 may include a touch sensor driver 323, a display driver 324, an input event receiving module 325, an input event response module 326. Among them, the touch sensor driver 323 and the display screen driver 324 are programs that drive the hardware device touch sensor and the display screen.
The input event receiving module 325 may be configured to monitor a communication interface, and obtain, through a first connection, a message sent by the electronic device 100, including but not limited to an instruction for displaying a cursor, an absolute coordinate of the cursor, an offset coordinate of the cursor, a pressing event of a mouse button, a scrolling event of a mouse wheel, a pressing event of a keyboard button, a key value corresponding to the keyboard button, and so on.
The input event response module 326 may be used to process input events after the input event receiving module 325 receives a message from the electronic device 100. For example, upon receiving a message from the electronic device 100 carrying coordinates (padX, padY) to display a cursor, the input event response module 326 may draw the cursor at the coordinates (padX, padY) and display it on the display screen 322 in response to the message. For another example, the input event response module 326 may process input events after the input event receiving module 325 receives input events such as mouse movements, mouse clicks, mouse wheel scrolling, keyboard inputs, remote joystick movements, etc. from the electronic device 100.
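A minimal sketch of how the input event receiving module 325 and the input event response module 326 might cooperate; the message layout and function names are assumptions, since the text does not specify a wire format:

```cpp
#include <cstdint>
#include <iostream>

// Hypothetical message layout; the real format is not specified in the text.
struct CursorMessage {
    uint32_t type;  // e.g. 0 = display cursor at absolute coordinates
    int32_t  padX;  // absolute X on the display screen 322
    int32_t  padY;  // absolute Y on the display screen 322
};

// Placeholder for the real drawing code driven through display driver 324.
void DrawCursorAt(int x, int y) {
    std::cout << "draw cursor at (" << x << ", " << y << ")\n";
}

// Input event response module: draw the cursor at (padX, padY) when a
// "display cursor" message arrives over the first connection.
void OnMessageReceived(const CursorMessage& msg) {
    if (msg.type == 0) {
        DrawCursorAt(msg.padX, msg.padY);
    }
    // Mouse presses, wheel scrolls, key presses, etc. would be dispatched
    // to their own handlers here.
}

int main() {
    OnMessageReceived({0, 640, 360});  // example message
}
```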
Specific implementations may refer to the following descriptions, which are not repeated herein.
The above description of the software architecture of the communication system 10 is merely exemplary; it should be understood that the software architecture illustrated in the embodiments of the present application does not constitute any limitation. In other embodiments of the present application, the communication system 10 may include more or fewer modules than illustrated, some modules may be combined or split, or a different architectural arrangement may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
When control of the cursor is switched between two different devices, the cursor moves from the screen of one device to the screen of the other. Because there is usually some distance between the two devices, the movement of the cursor between the screens is invisible, and when the cursor arrives on the next screen the user has difficulty finding it, which degrades the user experience.
In order to solve the above problems, the present application provides a cursor display method.
An internal implementation method provided in the embodiments of the present application is described below.
In this application, the relative directions up, down, left, and right may be defined as follows: when the electronic device is upright, "down" is the direction of gravity; alternatively, when the user's eyes face a display interface laid out in the forward direction, "left" is the left-hand side of the user's view, "right" is the right-hand side, "up" is the direction from the eyes toward the top of the head, and "down" is the direction from the eyes toward the torso. A display interface laid out in the forward direction means that the characters, symbols, icons, and the like of the interface are arranged in the orientation that best matches the user's reading. Since direction is a relative concept, the above description is only an example and not limiting.
Referring to fig. 4, fig. 4 is a flowchart of a cursor display method provided in an embodiment of the present application, and as shown in fig. 4, the cursor display method provided in the embodiment of the present application includes:
401. When it is detected that the cursor moves from an external area into the display screen area of the first terminal device, the initial display position of the cursor on the display screen is acquired; the external area is the display screen area of the second terminal device, a virtual display area of the second terminal device, or a virtual display area of the first terminal device.
In one possible implementation, the input device may be a mouse, a touch pad, or the like, used to control the display position of the cursor. The first terminal device may be connected to multiple input devices, for example to several mice, or to a mouse and a touch pad simultaneously. The touch pad may be built into the first terminal device or provided separately as an independent accessory.
In one possible implementation, the input device is connected to the first terminal device in a wired or wireless manner. For example, a mouse may be connected to a computer by wire or to the computer's Bluetooth component via Bluetooth; a touch pad on a notebook computer is connected to the motherboard by wire, while a separate touch pad accessory is connected to the computer's Bluetooth module via Bluetooth.
In one possible implementation, the user interacts with the input device when operating the terminal device through the input device, for example, moving the mouse to change its position, or moving a gesture on the touch pad. When the input device receives the user's interactive operation, it generates a corresponding control instruction and transmits it to the terminal device. After receiving the control instruction, the terminal device converts it into a corresponding execution instruction, has the software function carried out through the execution instruction, and presents the execution result on the display screen in the form of a visual window. For example, the mouse generates a series of photoelectric signals while moving, which form the mouse's control instruction; after receiving these signals, the host converts them into a cursor movement instruction, and the cursor movement is shown on the display screen.
As shown in fig. 3B, fig. 3B includes terminal devices A and B and a touch pad C, where the touch pad C is connected to terminal device A through Bluetooth, and terminal devices A and B are logged in to the same account. The user performs a sliding gesture on the touch pad C to drive a cursor to slide on the screen of terminal device A. When the cursor moves to the screen edge of device A, the terminal device searches for nearby devices, finds device B logged in to the same account, and establishes a connection. The user continues to slide on the touch pad C, and the cursor moves out of the screen of device A.
In one possible implementation, device A determines a virtual screen when it establishes a connection with device B. The virtual screen is an area located at the edge of the physical screen of device A and extending outward. It is theoretically infinite, but in this embodiment it only needs to satisfy the following: when the physical screen of device B is projected onto the plane of the virtual screen, the edge of the projection on the side close to device A coincides with an edge of the virtual screen. Fig. 3B shows the relationship between one of the virtual screens of device A and the physical screen of device B.
In one possible implementation, when the physical screens of devices A and B are not in the same plane, the cursor is considered to have entered the physical screen of a device (for example, device B) whenever it moves from the virtual screen into the projection of that device's screen.
In one possible implementation, when the cursor moves out of the physical screen of device A, device A continues to receive the instructions of the touch pad and simulates the movement of the cursor on its virtual screen; when the cursor enters the physical screen of device B from the virtual screen, device B establishes a connection with device C by means of the connection channel between device A and device C, so that the touch pad C can then be used to control device B.
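A sketch of the containment test implied above: the cursor position on device A's virtual screen is compared against the projection of device B's physical screen onto the virtual screen's plane. The coordinate system and the example rectangle are assumptions:

```cpp
#include <iostream>

// Rectangle in the coordinate system of device A's virtual screen.
struct Rect { double left, top, right, bottom; };

// The cursor (vx, vy) on the virtual screen is considered to have entered
// device B's physical screen once it lies inside the projection of that
// screen onto the virtual screen's plane.
bool CursorEnteredDeviceB(double vx, double vy, const Rect& projB) {
    return vx >= projB.left && vx <= projB.right &&
           vy >= projB.top  && vy <= projB.bottom;
}

int main() {
    Rect projB{200.0, 0.0, 2120.0, 1080.0};  // made-up example projection
    std::cout << CursorEnteredDeviceB(250.0, 300.0, projB) << "\n";  // 1
}
```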
In this embodiment, the second terminal device establishes the first connection with the first terminal device. The second terminal device has a display screen for displaying a user interface, and also has input devices such as a mouse and a keyboard; it may create a virtual screen invisible to the user, with the same size and resolution as the display screen of the first terminal device. When the cursor reaches the edge of the display screen of the second terminal device, it can pass beyond that display screen and move onto the virtual screen; meanwhile, the first terminal device can draw the cursor according to the correspondence between the virtual screen of the second terminal device and the display screen of the first terminal device. While the cursor is displayed on the first terminal device, when the second terminal device receives input events from the mouse, keyboard, and the like, these input events can be mapped to corresponding input events on the first terminal device and sent to it. After receiving an input event, the first terminal device responds to it accordingly. In this way, the user can make inputs on the first terminal device using the input devices of the second terminal device.
The implementation of this embodiment can be divided into two parts: (I) creation of the virtual screen; and (II) shuttling of the cursor.
In particular, the second terminal device may create one or more virtual screens through an application programming interface (API) function provided by the operating system. The created virtual screen has the same resolution as the first terminal device's display screen, and the virtual screen is not visible to the user. Of course, if multiple electronic devices need to share the input device, multiple virtual screens may be created, one per electronic device, each with the same screen resolution as its corresponding electronic device. Here, the second terminal device creating a virtual screen with the same screen resolution as the first terminal device is taken as an example. In some embodiments, for example in the Windows operating system, the virtual screen may be created by creating an IDDCX_MONITOR object; specifically, creating the virtual screen may include the following steps:
(1) A specification of the virtual screen is defined.
First, initialize and configure the related parameters: initialize using IDD_CX_CLIENT_CONFIG_INIT and related functions, set the callback functions, and configure display-mode parameters such as resolution and refresh rate. For example, the resolution is set to the resolution of the first terminal device's display screen as acquired by the second terminal device.
(2) A virtual screen is created.
After initialization is complete, the IddCxMonitorCreate function may be used to create an IDDCX_MONITOR object, i.e., the virtual screen.
(3) A virtual screen is inserted.
After the IDDCX_MONITOR object is created successfully, the IddCxMonitorArrival function is called to inform the system that this virtual screen has been inserted. When the system returns a message indicating successful insertion, the virtual screen has been created successfully and can be used. This virtual display screen can then be seen in the system's display settings.
If this virtual screen is later to be deleted or removed from the system, the IddCxMonitorDeparture function may be invoked. Each "insertion" of a virtual display is equivalent to successfully creating an IDDCX_MONITOR object in the system, and "deleting" the virtual display is equivalent to destroying the IDDCX_MONITOR object.
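For orientation, here is a heavily condensed C++ sketch of steps (1) to (3), loosely following the Windows IddCx indirect-display API named above. It runs only inside an indirect display driver and omits the EDID/monitor-description data, adapter callbacks, and error handling that a real driver requires, so treat it as an outline under those assumptions rather than working driver code:

```cpp
// Sketch only: executes inside a Windows indirect display (IddCx) driver,
// not as a standalone program. Most required fields are omitted.
#include <windows.h>
#include <iddcx.h>

IDDCX_MONITOR CreateVirtualScreen(IDDCX_ADAPTER adapter) {
    // (1) Define the specification of the virtual screen. A real driver
    // reports the resolution/refresh-rate modes (matching the first
    // terminal device's display) via monitor description data.
    IDDCX_MONITOR_INFO monitorInfo = {};
    monitorInfo.Size = sizeof(monitorInfo);
    monitorInfo.MonitorType = DISPLAYCONFIG_OUTPUT_TECHNOLOGY_INDIRECT_WIRED;
    monitorInfo.ConnectorIndex = 0;

    IDARG_IN_MONITORCREATE inCreate = {};
    inCreate.pMonitorInfo = &monitorInfo;

    // (2) Create the virtual screen (an IDDCX_MONITOR object).
    IDARG_OUT_MONITORCREATE outCreate = {};
    IddCxMonitorCreate(adapter, &inCreate, &outCreate);

    // (3) "Insert" the virtual screen: tell the system the monitor arrived.
    IDARG_OUT_MONITORARRIVAL outArrival = {};
    IddCxMonitorArrival(outCreate.MonitorObject, &outArrival);
    return outCreate.MonitorObject;
}

void DestroyVirtualScreen(IDDCX_MONITOR monitor) {
    // "Deleting" the virtual screen destroys the IDDCX_MONITOR object.
    IddCxMonitorDeparture(monitor);
}
```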
At this point, the second terminal device has created a virtual screen with the same (or a different) resolution as the first terminal device's display screen. In addition, the relative position of the second terminal device's display screen and the virtual screen can be set. For example, the virtual screen may be arranged on the right side of the display screen of the second terminal device, that is, the right edge of the display screen is joined to the left edge of the virtual screen, so that the cursor can pass out from the right edge of the display screen and pass in at the corresponding position on the left edge of the virtual screen. If the virtual screen is arranged below the display screen of the second terminal device, that is, the lower edge of the display screen is joined to the upper edge of the virtual screen, the cursor can pass out from the lower edge of the display screen and pass in at the corresponding position on the upper edge of the virtual screen. The correspondence of the other edges is the same. If multiple virtual screens are created, then in general, to avoid conflicts, they may be joined to different edges of the second terminal device's display screen; for example, virtual screen 1 may be arranged on the right side of the display screen and virtual screen 2 below it.
(II) Shuttling of the cursor
In visual effect, the cursor may "shuttle" from the display screen of the second terminal device to the display screen of the first terminal device. Here, a cursor shuttling from the right edge of the second terminal device's display screen to the left edge of the first terminal device's display screen is taken as an example. The second terminal device and the first terminal device are arranged side by side horizontally, i.e., their left and right edges are the short edges of the devices. Specifically, the method may include the following steps:
(1) Detect that the cursor has reached the edge of the display screen of the second terminal device.
Specifically, the second terminal device may acquire the absolute coordinates (X, Y) of the cursor position using the GetPhysicalCursorPos function of the system; the maximum values of these absolute coordinates are the display resolution of the second terminal device's display screen. For example, if the current display resolution (screenWidth, screenHeight) of the display screen is 1920×1080 pixels, then the absolute coordinate of the upper-left corner of the display screen may be the origin (0, 0), the lower-right corner (1920, 1080), the lower-left corner (0, 1080), and the upper-right corner (1920, 0). The absolute coordinates of any position on the display screen fall within the range (0, 0) to (1920, 1080). The display resolution of the second terminal device's display screen can be obtained through the GetSystemMetrics function.
When the mouse moves a certain distance, the second terminal device can acquire the cursor offset corresponding to that movement through the Raw Input API; the offset is the difference between the start and end positions of the movement. The offset is a vector, including direction and distance, and may be expressed as offset coordinates (relX, relY).
Taking detection that the cursor reaches the right edge of the second terminal device's display screen as an example: if (X + relX) > screenWidth, a result is returned that the mouse cursor has reached the right edge of the second terminal device's desktop. Likewise, if (Y + relY) > screenHeight, a result is returned that the mouse cursor has reached the lower edge of the desktop. The cursor reaching the upper and left edges can be deduced by analogy and is not described in detail here.
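A minimal user-mode sketch of the edge test just described, using the Win32 functions named in the text (GetPhysicalCursorPos, GetSystemMetrics). The Raw Input plumbing that yields (relX, relY) is omitted, and the comparisons mirror the inequalities above:

```cpp
#include <windows.h>

enum class Edge { None, Left, Right, Top, Bottom };

// Which edge of the second terminal device's desktop the cursor would cross
// after applying the Raw Input offset (relX, relY).
Edge DetectEdge(LONG relX, LONG relY) {
    POINT p = {};
    GetPhysicalCursorPos(&p);                                 // absolute (X, Y)
    const LONG screenWidth  = GetSystemMetrics(SM_CXSCREEN);  // e.g. 1920
    const LONG screenHeight = GetSystemMetrics(SM_CYSCREEN);  // e.g. 1080

    if (p.x + relX > screenWidth)  return Edge::Right;
    if (p.x + relX < 0)            return Edge::Left;
    if (p.y + relY > screenHeight) return Edge::Bottom;
    if (p.y + relY < 0)            return Edge::Top;
    return Edge::None;
}
```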
Of course, the first terminal device may also set up a virtual screen; as shown in fig. 5, the virtual screen of device A and the virtual screen of device B then only need to overlap. Alternatively, when device A and device B are connected, device B computes a virtual screen, and when the cursor moves out of the physical screen of device A, device B connects to the touch pad C.
In one possible implementation, when one input device is used to control multiple terminal devices (e.g., the first terminal device and the second terminal device in the embodiments of the present application), the cursor moves across the screens of the different devices. After receiving a control instruction, a terminal device acquires the movement attribute of the cursor that results from executing the instruction and confirms the position of the cursor after execution.
For example, in fig. 5, when the cursor enters the virtual screen of the device B, the device B parses the control command after receiving the control command of the touch pad C to confirm whether the cursor enters the physical screen of the device B.
In one possible implementation, when the first terminal device detects that the display position of the cursor enters the area of the display screen of the first terminal device from the external area, the display position of the cursor may be acquired, where the display position may be a starting position of the cursor on the display screen of the first terminal device.
For example, when neither the first terminal device nor the second terminal device constructs a virtual screen, the cursor may enter the display screen of the first terminal device directly after leaving the display screen of the second terminal device, and the external area may be the display screen area of the second terminal device. When the first terminal device or the second terminal device constructs a virtual screen, after leaving the display screen of the second terminal device the cursor passes through a stretch in which it is not displayed before entering the display screen of the first terminal device, and the external area may be the virtual screen of the first terminal device or of the second terminal device.
(2) Calculate the initial display position of the cursor on the display screen of the first terminal device.
When the second terminal device detects that the cursor reaches the edge of the display screen of the second terminal device (or the edge of the virtual screen, which is adjacent to the display screen area of the first terminal device), the second terminal device calculates the starting point coordinates of the cursor displayed on the display screen of the first terminal device. Specifically, how to calculate the starting point position of the cursor displayed on the display screen of the first terminal device may refer to implementation of the prior art, which is not described herein again.
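The text defers the start-point calculation to existing implementations. Purely as an assumed illustration, one common approach is to keep the coordinate along the shared edge proportional to the two screens' resolutions:

```cpp
#include <iostream>

struct ScreenSize { int width, height; };
struct Pos { int x, y; };

// One possible mapping (an assumption, not this application's prescription)
// for a cursor leaving the right edge of the second terminal device and
// entering the left edge of the first terminal device: keep the vertical
// position proportional to the two screens' heights, and start at x = 0.
Pos EntryPointOnFirstDevice(Pos exitPos, ScreenSize src, ScreenSize dst) {
    return Pos{0, exitPos.y * dst.height / src.height};
}

int main() {
    Pos entry = EntryPointOnFirstDevice({1920, 540}, {1920, 1080}, {2560, 1600});
    std::cout << entry.x << ", " << entry.y << "\n";  // 0, 800
}
```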
402. The cursor is displayed in a first form according to the initial display position; the first form is different from a second form, the second form being the form of the cursor as displayed in the display screen area of the second terminal device.
In one possible implementation, the first form differs from the second form in at least one of cursor size or cursor shape.
In the prior art, when the input device controls the cursor to shuttle on the display screens of a plurality of terminal devices, the cursor is always in the same form, and when the cursor shuttles from the display screen of one terminal device to the display screen of another terminal device, the user cannot easily find the position of the cursor.
Specifically, when the cursor is displayed on the display screen of the second terminal device, the cursor may be displayed in the second form, and when the cursor is shuttled from the display screen of the second terminal device to the display screen of the first terminal device, the cursor may be displayed in the first form different from the second form, so that the user may capture the cursor on the first terminal device in time.
In one possible implementation, the cursor in the first configuration and the cursor in the second configuration are different in size, e.g., the cursor in the first configuration is larger than the size of the cursor in the second configuration.
In one possible implementation, the cursor in the first configuration and the cursor in the second configuration are shaped differently.
In one possible implementation, the cursor may be of higher interest when in the first configuration than when in the second configuration.
As shown in fig. 3B, the cursor changes from an arrow on the display of the second terminal device to a circle on the display of the first terminal device. For another example, the cursor may be enlarged for a certain time and then reduced, and illustratively, the cursor may be enlarged for 5 seconds and then gradually restored to the normal size.
For example, in fig. 6, when the cursor is moved from the virtual screen to the physical screen of the device B, the movement attribute of the cursor indicates that the cursor is displayed on the physical screen, the cursor is enlarged for 5 seconds while the cursor is displayed on the physical screen, and then gradually restored to the normal size. The change in the shape of the cursor is not limited to enlargement and reduction, and may be a change in shape. Fig. 6 shows one of the shape change processes.
In fig. 6 (a), as the cursor enters the screen from outside, a transition pattern is obtained by joining the edges of the cursor shape outside the screen (e.g., an arrow) and the cursor shape inside the screen; the comet-like tail in fig. 6 (a) may be enlarged. In fig. 6 (b), the cursor has fully entered the screen and moved a distance. The points on the edges of the transition pattern are then moved outward/inward to form a circle, as shown in fig. 6 (c). The circle then shrinks to the device's default cursor size, as shown in fig. 6 (d).
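A sketch of the "enlarge, then gradually restore" behavior: the 5-second hold comes from the example above, while the enlargement factor and restore duration are assumptions:

```cpp
#include <algorithm>

// Cursor scale as a function of time since the cursor appeared on screen:
// enlarged for the first 5 seconds, then shrinking back to the default size.
double CursorScale(double secondsSinceEntry) {
    const double kEnlarged = 2.0;  // assumed enlargement factor
    const double kHold = 5.0;      // hold enlarged for 5 s (from the text)
    const double kFade = 1.0;      // restore duration, assumed

    if (secondsSinceEntry < kHold) return kEnlarged;
    double t = std::min(1.0, (secondsSinceEntry - kHold) / kFade);
    return kEnlarged + (1.0 - kEnlarged) * t;  // linear back to 1.0
}
```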
In addition, as described above, to accommodate different operation needs, a plurality of different input devices may be provided for a device, for example a mouse and a touch pad at the same time. When switching between the different operation modes/devices, the user has to search for the cursor on the screen; the cursor is usually small, and when the user is far from the screen or the screen is small, it is hard to find, which degrades the use experience. To solve this problem, an embodiment of the present application provides another method for displaying a cursor.
Referring to fig. 7, fig. 7 is a flowchart of a cursor display method provided in an embodiment of the present application, and as shown in fig. 7, the cursor display method provided in the embodiment of the present application includes:
701. When the input device for controlling the display position of the cursor on the display screen is the first input device, the cursor is displayed in a first form.
702. When the input device for controlling the display position of the cursor on the display screen is switched from the first input device to the second input device, the cursor is displayed in a second form, the first form being different from the second form.
In one possible implementation, the first form differs from the second form in at least one of cursor size or cursor shape.
In one possible implementation, the first input device and the second input device are a touch pad or a mouse.
In one possible implementation, it may be determined that the input device for controlling the display position of the cursor on the display screen has switched from the first input device to the second input device based on identity (ID) information of the second input device carried in an acquired control instruction from the input device; the control instruction is used to control the display position of the cursor on the display screen.
When transmitting a control instruction, the input device may also transmit the instruction's attributes. For example, a mouse transmits information indicating its properties along with the photoelectric signal. As another example, when the touch pad transmits over a communication link a control instruction converted from a touch signal, a field indicating the instruction's attributes may be added during the conversion. By way of example, the control instruction may include an ID field filled with the ID of the input device, which uniquely identifies a particular input device.
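As a sketch only, the ID field described above might look as follows; the layout and names are hypothetical:

```cpp
#include <cstdint>

// Hypothetical control instruction layout: an ID field uniquely identifying
// the input device, plus the cursor displacement it requests.
struct ControlInstruction {
    uint32_t deviceId;  // uniquely identifies a particular input device
    int32_t  dx, dy;    // requested cursor movement
};

// Detect an input-device switch by comparing the ID field of consecutive
// control instructions.
class InputSwitchDetector {
    uint32_t lastId_ = 0;
    bool     hasLast_ = false;
public:
    // Returns true when the instruction comes from a different device than
    // the previous one, i.e. the user has switched input devices, at which
    // point the cursor form would be changed.
    bool OnInstruction(const ControlInstruction& in) {
        bool switched = hasLast_ && in.deviceId != lastId_;
        lastId_ = in.deviceId;
        hasLast_ = true;
        return switched;
    }
};
```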
In one possible implementation, it may be determined that the input device for controlling the display position of the cursor on the display screen has switched from the first input device to the second input device based on receiving a control instruction from the input device through the interface corresponding to the second input device; the control instruction is used to control the display position of the cursor on the display screen.
The terminal device may determine the attributes of a control instruction from the interface through which the instruction is received. For example, a mouse is connected to a designated interface of the motherboard, so the terminal device may determine the source of a control instruction from the receiving interface. The purpose of determining the instruction's attributes is not to identify a specific input device, but to determine from those attributes whether the user has switched input devices; the interface to which the input device is connected can therefore also be regarded as a source attribute of the control instruction. When the input device is connected wirelessly, the interface may be a signal interface for wireless communication.
In the embodiment of the application, when the terminal equipment detects that the input equipment is switched, the form of the cursor is changed, so that a user can better find the position of the cursor when the input equipment is switched.
In one possible implementation, the cursor in the first configuration and the cursor in the second configuration are different in size, e.g., the cursor in the first configuration is larger than the size of the cursor in the second configuration.
In one possible implementation, the cursor in the first configuration and the cursor in the second configuration are shaped differently.
In one possible implementation, the cursor may be of higher interest when in the first configuration than when in the second configuration.
In one possible implementation, after determining the cursor position resulting from executing the control instruction, the terminal device may further determine whether a control exists at that position and, if so, its type. If a control exists, or if a control exists and is of a designated type, the size of the cursor may be changed according to the size of the control, to help the user interact accurately with UI elements of different sizes. Specifically, when the display position of the cursor is within the area of the control, the cursor is displayed in a third form; the third form is different from a fourth form, the fourth form being the form of the cursor when displayed in an area outside the control; the third form matches the size of the control.
Illustratively, as shown in fig. 8, when the cursor enters the display range of a control, the diameter of the cursor is adjusted to 90% of the height of the control; when the cursor moves out of the display range of the control, the cursor is restored to its normal size. If the diameter of the default cursor is greater than or equal to the height of the control, the size of the cursor is not adjusted.
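A minimal sketch of the sizing rule in this example (90% of the control height, unchanged if the default cursor is already at least as tall as the control):

```cpp
struct ControlRect { int left, top, right, bottom; };

// Cursor diameter while hovering a control, per the rule illustrated above.
int CursorDiameterOverControl(const ControlRect& c, int defaultDiameter) {
    int controlHeight = c.bottom - c.top;
    if (defaultDiameter >= controlHeight) return defaultDiameter;  // unchanged
    return controlHeight * 9 / 10;                                 // 90% of height
}

// Whether the cursor position lies within the control's display range.
bool CursorInsideControl(int x, int y, const ControlRect& c) {
    return x >= c.left && x < c.right && y >= c.top && y < c.bottom;
}
```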
In a vehicle environment, the size of the cursor may also be changed based on the speed of the vehicle. When the speed of the vehicle is less than a preset value, the cursor has its default size; when the speed is greater than the preset value, the cursor is enlarged, for example by a factor of 2, so that the driver can operate conveniently while driving. For example, the vehicle system may measure the running speed of the vehicle with its various sensors and use the speed as a trigger signal to control the change of the cursor form, such as enlargement, reduction, or a change of shape. Specifically, in one possible implementation, the terminal device is an in-vehicle device in a vehicle, and when the movement rate of the vehicle is greater than a threshold, the cursor is displayed in a fifth form; the fifth form is different from a sixth form, the sixth form being the form in which the cursor is displayed when the movement rate of the vehicle is less than the threshold.
In the embodiment of the application, under the vehicle-mounted environment, when the vehicle-mounted speed is greater than a preset value, the cursor is enlarged, so that a vehicle owner can conveniently operate under a driving state.
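A one-function sketch of the vehicle-speed rule; the 2x factor follows the example above, while the threshold value itself is a placeholder for the preset value:

```cpp
// Cursor scale for the in-vehicle scenario: default size below the speed
// threshold, enlarged at or above it.
double InVehicleCursorScale(double speedKmh) {
    const double kThresholdKmh = 30.0;  // assumed preset value
    return (speedKmh > kThresholdKmh) ? 2.0 : 1.0;
}
```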
Referring to fig. 9, fig. 9 is a schematic structural diagram of a display device for a cursor provided in an embodiment of the present application, and as shown in fig. 9, a display device 900 for a cursor provided in an embodiment of the present application includes:
an obtaining module 901, configured to obtain target information, where the target information indicates that an input device for controlling a cursor display position on the display screen is switched from a first input device to a second input device;
for a specific description of the acquiring module 901, reference may be made to the description of step 701 in the above embodiment, which is not repeated here.
The display module 902 is configured to display the cursor in a first form according to the target information; the first form is different from a second form, the second form being the form of the cursor when the first input device controls the cursor display.
For a specific description of the display module 902, reference may be made to the description of step 702 in the above embodiment, which is not repeated here.
In one possible implementation, the first form differs from the second form in at least one of cursor size or cursor shape.
In one possible implementation, the target information is a control instruction from an input device to which the first terminal device is connected, the control instruction including identity ID information for indicating the input device to which the first terminal device is connected.
In one possible implementation, the first terminal device receives a control instruction from the input device through an interface, and the target information is a type of the interface.
In one possible implementation, the first input device and the second input device are a touch pad or a mouse.
In one possible implementation, the display module is further configured to:
when the display position of the cursor is within the area of the target control, display the cursor in a third form; the third form is different from a fourth form, the fourth form being the form of the cursor when displayed in an area outside the target control; the third form matches the size of the control.
In one possible implementation, the terminal device is an in-vehicle device in a vehicle, and the display module is further configured to: display the cursor in a fifth form when the movement rate of the vehicle is greater than a threshold value; the fifth form is different from a sixth form, the sixth form being the form in which the cursor is displayed when the movement rate of the vehicle is less than the threshold.
Referring to fig. 10, fig. 10 is a schematic structural diagram of a display device for a cursor provided in an embodiment of the present application, and as shown in fig. 10, a display device 1000 for a cursor provided in an embodiment of the present application includes:
an obtaining module 1001, configured to obtain target information, where the target information indicates that a cursor display position moves from an external area to a display screen area of the first terminal device, and the external area is a display screen area of a second terminal device, a virtual display area of the second terminal device, or a virtual display area of the first terminal device;
for a specific description of the obtaining module 1001, reference may be made to the description of step 401 in the above embodiment, which is not repeated here.
The display module 1002 is configured to display the cursor in a first form according to the target information; the first form is different from a second form, the second form being the form of the cursor as displayed on the second terminal device.
For a specific description of the display module 1002, reference may be made to the description of step 402 in the above embodiment, which is not repeated here.
In one possible implementation, the first form differs from the second form in at least one of cursor size or cursor shape.
In one possible implementation, the first input device and the second input device are a touch pad or a mouse.
Next, a terminal device provided in the embodiments of the present application is described; it may be the cursor display apparatus of fig. 9 or fig. 10. Referring to fig. 11, fig. 11 is a schematic structural diagram of the terminal device provided in the embodiments of the present application. The terminal device 1100 may specifically be a virtual reality (VR) device, a mobile phone, a tablet, a notebook computer, an intelligent wearable device, or the like, which is not limited here. Specifically, the terminal device 1100 includes: a receiver 1101, a transmitter 1102, a processor 1103, and a memory 1104 (the number of processors 1103 in the terminal device 1100 may be one or more; one processor is taken as an example in fig. 11), where the processor 1103 may include an application processor 11031 and a communication processor 11032. In some embodiments of the present application, the receiver 1101, the transmitter 1102, the processor 1103, and the memory 1104 may be connected by a bus or other means.
The memory 1104 may include a read-only memory and a random access memory, and provides instructions and data to the processor 1103. A portion of the memory 1104 may also include a non-volatile random access memory (NVRAM). The memory 1104 stores operating instructions, executable modules, or data structures, or a subset thereof, or an extended set thereof, where the operating instructions may include various operating instructions for implementing various operations.
The processor 1103 controls the operation of the terminal device. In a specific application, the individual components of the terminal device are coupled together by a bus system, which may include, in addition to a data bus, a power bus, a control bus, a status signal bus, and the like. For clarity of illustration, however, the various buses are all referred to as the bus system in the figure.
The method disclosed in the embodiments of the present application may be applied to the processor 1103 or implemented by the processor 1103. The processor 1103 may be an integrated circuit chip with signal processing capability. During implementation, the steps of the above method may be completed by an integrated logic circuit of hardware in the processor 1103 or by instructions in the form of software. The processor 1103 may be a general-purpose processor, a digital signal processor (DSP), a microprocessor, or a microcontroller, and may further include an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The processor 1103 may implement or perform the methods, steps, and logical blocks disclosed in the embodiments of the present application. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present application may be directly embodied as being performed by a hardware decoding processor, or performed by a combination of hardware and software modules in the decoding processor. The software module may be located in a storage medium mature in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory 1104, and the processor 1103 reads the information in the memory 1104 and completes the steps of the above method in combination with its hardware. Specifically, the processor 1103 may read the information in the memory 1104 and, in combination with its hardware, perform steps 401 to 402 and steps 701 to 702 related to data processing in the above embodiments.
The receiver 1101 may be configured to receive input digital or character information and to generate signal inputs related to the settings and function control of the terminal device. The transmitter 1102 may be configured to output digital or character information through a first interface; the transmitter 1102 may also be configured to send instructions to a disk group through the first interface to modify data in the disk group; and the transmitter 1102 may also include a display device such as a display screen.
Embodiments of the present application also provide a computer program product which, when run on a computer, causes the computer to perform the steps of the cursor display method described in the embodiments corresponding to fig. 4 and fig. 7.
Embodiments of the present application also provide a computer-readable storage medium storing a program for signal processing; when the program is run on a computer, it causes the computer to perform the steps of the cursor display method described in the foregoing embodiments.
The cursor display apparatus provided in this embodiment of the present application may specifically be a chip, where the chip includes a processing unit and a communication unit; the processing unit may be, for example, a processor, and the communication unit may be, for example, an input/output interface, a pin, or a circuit. The processing unit may execute the computer-executable instructions stored in the storage unit, so that the chip in the terminal device performs the cursor display method described in the above embodiments. Optionally, the storage unit is a storage unit in the chip, such as a register or a cache; the storage unit may also be a storage unit located outside the chip in the device, such as a read-only memory (ROM) or another type of static storage device that can store static information and instructions, or a random access memory (RAM).
It should be further noted that the above-described apparatus embodiments are merely illustrative. The units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the objective of the solution of this embodiment. In addition, in the drawings of the apparatus embodiments provided in the present application, the connection relationship between modules indicates that they have a communication connection, which may be specifically implemented as one or more communication buses or signal lines.
From the above description of the embodiments, it will be apparent to those skilled in the art that the present application may be implemented by software plus necessary general-purpose hardware, or certainly by dedicated hardware including an application-specific integrated circuit, a dedicated CPU, a dedicated memory, dedicated components, and the like. Generally, any function performed by a computer program can easily be implemented by corresponding hardware, and the specific hardware structures used to implement the same function may vary, for example, an analog circuit, a digital circuit, or a dedicated circuit. However, for the present application, a software program implementation is the preferred embodiment in most cases. Based on such an understanding, the technical solution of the present application, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a readable storage medium, such as a floppy disk, a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk of a computer, including several instructions to cause a computer device (which may be a personal computer, a server, a network device, or the like) to perform the method described in the embodiments of the present application.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When software is used for implementation, the embodiments may be implemented in whole or in part in the form of a computer program product.
The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) manner. The computer-readable storage medium may be any available medium that a computer can access, or a data storage device, such as a server or a data center, integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a DVD), a semiconductor medium (e.g., a solid state disk (SSD)), or the like.

Claims (23)

1. A cursor display method, characterized in that it is applied to a terminal device comprising a display screen, said method comprising:
when the input device for controlling the display position of the cursor on the display screen is a first input device, displaying the cursor in a first form;
and when the input device for controlling the display position of the cursor on the display screen is switched from the first input device to a second input device, displaying the cursor in a second form, wherein the first form is different from the second form.
2. The method of claim 1, wherein the first form differs from the second form in at least one of cursor size or cursor shape.
3. The method according to claim 1 or 2, characterized in that the method further comprises:
based on the acquired control instruction from the input device, the identity ID information of the second input device is contained, and the input device used for controlling the display position of the cursor on the display screen is determined to be switched from the first input device to the second input device; the control instruction is used for controlling the display position of the cursor on the display screen.
4. The method according to claim 1 or 2, characterized in that the method further comprises:
based on receiving a control instruction from an input device through an interface corresponding to the second input device, determining that the input device for controlling the display position of the cursor on the display screen is switched from the first input device to the second input device; the control instruction is used for controlling the display position of the cursor on the display screen.
5. The method of any one of claims 1 to 4, wherein the first input device and the second input device are each one of a touch pad or a mouse.
6. The method according to any one of claims 1 to 5, further comprising:
when the display position of the cursor is in the area where a control is located, displaying the cursor in a third form; the third form is different from a fourth form, and the fourth form is the form of the cursor when the cursor is displayed in an area outside the control; the third form matches the size of the control.
7. The method according to any one of claims 1 to 6, wherein the terminal device is an in-vehicle device in a vehicle, the method further comprising:
displaying the cursor in a fifth form when the movement rate of the vehicle is greater than a threshold; the fifth form is different from a sixth form, and the sixth form is the form in which the cursor is displayed when the movement rate of the vehicle is less than the threshold.
8. A cursor display method, characterized in that it is applied to a first terminal device comprising a display screen, said method comprising:
when detecting that a cursor moves from an external area to a display screen area of the first terminal device, acquiring an initial display position of the cursor on the display screen; the external area is a display screen area of a second terminal device, a virtual display area of the second terminal device, or a virtual display area of the first terminal device;
displaying the cursor in a first form according to the initial display position; the first form is different from a second form, and the second form is the form of the cursor displayed in the display screen area of the second terminal device.
9. The method of claim 8, wherein the first form differs from the second form in at least one of cursor size or cursor shape.
10. The method of claim 8 or 9, wherein the first input device and the second input device are each one of a touch pad or a mouse.
11. A cursor display device for use with a terminal device including a display screen, the device comprising:
an obtaining module, configured to obtain target information, where the target information indicates that the input device for controlling the display position of the cursor on the display screen is switched from a first input device to a second input device;
a display module, configured to display the cursor in a first form according to the target information, where the first form is different from a second form, and the second form is the form of the cursor displayed when the first input device controls the cursor.
12. The apparatus of claim 11, wherein the first form differs from the second form in at least one of cursor size or cursor shape.
13. The apparatus according to claim 11 or 12, wherein the target information is a control instruction from an input device connected to the terminal device, the control instruction including identity (ID) information indicating the input device connected to the terminal device.
14. The apparatus according to claim 11 or 12, wherein the terminal device receives a control instruction from the input device through an interface, and the target information is the type of the interface.
15. The apparatus of any one of claims 11 to 14, wherein the first input device and the second input device are each one of a touch pad or a mouse.
16. The apparatus of any one of claims 11 to 15, wherein the display module is further configured to:
when the display position of the cursor is in the area where the target control is located, display the cursor in a third form; the third form is different from a fourth form, and the fourth form is the form of the cursor when the cursor is displayed in an area outside the target control; the third form matches the size of the target control.
17. The apparatus according to any one of claims 11 to 16, wherein the terminal device is an in-vehicle device in a vehicle, and the display module is further configured to:
display the cursor in a fifth form when the movement rate of the vehicle is greater than a threshold; the fifth form is different from a sixth form, and the sixth form is the form in which the cursor is displayed when the movement rate of the vehicle is less than the threshold.
18. A cursor display device for use with a first terminal device comprising a display screen, the device comprising:
an obtaining module, configured to obtain target information, where the target information indicates that the display position of the cursor moves from an external area to a display screen area of the first terminal device, and the external area is a display screen area of a second terminal device, a virtual display area of the second terminal device, or a virtual display area of the first terminal device;
a display module, configured to display the cursor in a first form according to the target information, where the first form is different from a second form, and the second form is the form of the cursor displayed on the second terminal device.
19. The apparatus of claim 18, wherein the first form differs from the second form in at least one of cursor size or cursor shape.
20. The apparatus of claim 18 or 19, wherein the first input device and the second input device are each one of a touch pad or a mouse.
21. A cursor display device, the device comprising a processor, a memory, a camera, and a bus, wherein:
the processor, the memory and the camera are connected through the bus;
the camera is used for capturing video in real time;
the memory is used for storing computer programs or instructions;
the processor is configured to invoke or execute the program or instructions stored in the memory to implement the method steps of any one of claims 1 to 10.
22. A computer-readable storage medium comprising a program which, when run on a computer, causes the computer to perform the method of any one of claims 1 to 10.
23. A computer program product comprising instructions which, when run on a terminal, cause the terminal to perform the method of any one of claims 1 to 10.
CN202211170992.3A 2022-09-24 2022-09-24 Cursor display method and related equipment Pending CN117762286A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211170992.3A CN117762286A (en) 2022-09-24 2022-09-24 Cursor display method and related equipment

Publications (1)

Publication Number Publication Date
CN117762286A true CN117762286A (en) 2024-03-26

Family

ID=90313051

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211170992.3A Pending CN117762286A (en) 2022-09-24 2022-09-24 Cursor display method and related equipment

Country Status (1)

Country Link
CN (1) CN117762286A (en)

Legal Events

Date Code Title Description
PB01 Publication