WO2021017836A1 - Method for controlling display of a large-screen device, mobile terminal, and first system - Google Patents

Method for controlling display of a large-screen device, mobile terminal, and first system

Info

Publication number
WO2021017836A1
WO2021017836A1 (PCT/CN2020/102191)
Authority
WO
WIPO (PCT)
Prior art keywords
mobile terminal
screen device
screen
display
posture
Prior art date
Application number
PCT/CN2020/102191
Other languages
English (en)
French (fr)
Inventor
张北航
曾佳
张延海
陈运哲
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Publication of WO2021017836A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Definitions

  • the embodiments of the present application relate to the computer field, and in particular to a method for controlling the display of a large-screen device through a mobile terminal, a mobile terminal, and a first system.
  • the speaker usually needs to make a speech report to the audience.
  • the presenter reports to the audience by playing slides (PowerPoint, PPT) or Keynote on the big screen.
  • the speaker will use a laser pointer to illuminate a visible laser spot on a certain position on the large screen to remind the audience of what is being lectured.
  • the laser pointer only provides basic display functions such as position indication and page turning, and cannot perform operations such as opening a file or underlining.
  • the laser pointer requires frequent battery replacement, and there must be no obstruction between it and the screen during use, which is very inconvenient.
  • the embodiments of the present application provide a method, a mobile terminal, and a first system for controlling the display of a large-screen device, which can use an existing portable device to control various displays of the large-screen device and improve user experience.
  • a method for controlling the display of a large-screen device is provided.
  • the method is applied to a mobile terminal.
  • the mobile terminal includes a plurality of motion sensors.
  • the plurality of motion sensors includes at least an acceleration sensor, a gyroscope sensor, and a magnetic sensor.
  • a communication connection is established between the mobile terminal and a large-screen device, and the mobile terminal is used to control the laser pointer displayed on the large-screen device.
  • the method includes: the mobile terminal collects motion data of the mobile terminal through the multiple motion sensors; the mobile terminal determines the spatial attitude of the mobile terminal from the motion data using a nine-axis fusion algorithm; the mobile terminal determines a first position on the display screen of the large-screen device according to the spatial attitude; and the mobile terminal controls the large-screen device to make a first response at the first position, the first response including displaying a laser pointer at the first position.
  • in this way, the user can control the display of the large-screen device with an existing mobile terminal: the user changes the posture of the mobile terminal, the mobile terminal calculates from its spatial posture the corresponding position on the display screen of the large-screen device, and then sends control information to the large-screen device so that it makes responses such as displaying the laser pointer at, or moving the laser pointer to, the corresponding position.
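The sensing half of the pipeline just described (motion data → spatial attitude) can be sketched in Python. A genuine nine-axis fusion algorithm also blends gyroscope rates, for example with a complementary or Kalman filter; the simplified function below shows only the accelerometer/magnetometer part, and the function name and axis conventions are illustrative assumptions, not the patent's implementation.

```python
import math

def attitude_from_accel_mag(ax, ay, az, mx, my, mz):
    """Estimate yaw/pitch/roll (radians) relative to the ground frame.

    Only the accelerometer/magnetometer half of a nine-axis fusion:
    a full implementation would also blend gyroscope rates to stay
    stable while the device is moving.
    """
    # Pitch and roll from the gravity vector (valid when the device
    # is at rest or moving slowly).
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    roll = math.atan2(ay, az)
    # Tilt-compensate the magnetometer reading, then take yaw.
    mx2 = mx * math.cos(pitch) + mz * math.sin(pitch)
    my2 = (mx * math.sin(roll) * math.sin(pitch)
           + my * math.cos(roll)
           - mz * math.sin(roll) * math.cos(pitch))
    yaw = math.atan2(-my2, mx2)
    return yaw, pitch, roll
```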
  • the spatial attitude is used to identify at least the following information: the yaw angle ψ, the pitch angle θ, and the roll angle φ of the mobile terminal relative to the ground coordinate system.
  • the mobile terminal can determine the position corresponding to the display screen of the large-screen device by calculating its yaw angle, pitch angle, and roll angle.
  • the mobile terminal determining the first position on the display screen of the large-screen device according to the spatial attitude includes: the mobile terminal determines the coordinates (x'_i, y'_i) of the first position on the display screen of the large-screen device using a formula in which W is the width of the display screen of the large-screen device and s is the sensitivity of the mobile terminal. Using this calculation method, the mobile terminal can determine the position corresponding to the display screen of the large-screen device according to its spatial posture.
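The patent's exact mapping formula is not reproduced in this text; it is described only in terms of the screen width W and the sensitivity s. The following is therefore a hedged sketch of one plausible mapping from yaw/pitch deltas to screen coordinates; the linear form, the clamping, and the function name `angles_to_screen` are assumptions, not the claimed formula.

```python
def angles_to_screen(yaw, pitch, yaw0, pitch0, W, H, s):
    """Map the terminal's yaw/pitch (radians) to display coordinates.

    Illustrative only: the pointer sits at the screen centre when the
    terminal is in the realignment pose (yaw0, pitch0), moves linearly
    with the angle deltas scaled by the sensitivity s and the screen
    width W, and is clamped to the visible area W x H.
    """
    x = W / 2 - s * W * (yaw - yaw0)      # positive yaw -> pointer moves left
    y = H / 2 - s * W * (pitch - pitch0)  # positive pitch -> pointer moves up
    x = min(max(x, 0), W)
    y = min(max(y, 0), H)
    return x, y
```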
  • before the mobile terminal collects the motion data of the mobile terminal through the multiple motion sensors, the method further includes: when the mobile terminal is in a first posture, receiving a first instruction from the user, where the first instruction is used to instruct the mobile terminal to perform 2D position realignment with the large-screen device; and in response to the first instruction, the mobile terminal sends first information to the large-screen device, where the first information is used to instruct the large-screen device to display the laser pointer at a preset position; the first posture is any posture of the mobile terminal.
  • the mobile terminal can receive a user's realignment instruction, and realign the position of the laser pointer on the large screen device display screen, which can improve the accuracy of the laser pointer display position and improve user experience.
  • the mobile terminal determining the first position on the display screen of the large-screen device according to the spatial attitude includes: the mobile terminal takes the quaternion q0 corresponding to the first attitude as the starting point and the quaternion qi corresponding to the spatial attitude as the end point, and calculates the relative rotation matrix C of the quaternion qi relative to the quaternion q0 as the mobile terminal transforms from the first attitude to the spatial attitude; the mobile terminal determines a second attitude from the initial yaw angle, the initial pitch angle, and the relative rotation matrix C; and the mobile terminal determines the first position according to the second attitude. Here, the initial yaw angle is the yaw angle of the mobile terminal relative to the ground coordinate system when the mobile terminal is in the first attitude, and the initial pitch angle is the pitch angle of the mobile terminal relative to the ground coordinate system when the mobile terminal is in the first attitude.
  • the mobile terminal can calculate the quaternion corresponding to the spatial attitude transformation and, according to the quaternion, calculate the spatial attitude after the transformation, thereby determining the first position.
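The quaternion step above — take q0 at the first attitude and qi at the current attitude, then form the relative rotation matrix C — can be illustrated as follows. The Hamilton-convention quaternion helpers are standard, but building C from conj(q0)·qi is an assumption about the patent's (unstated) convention.

```python
def quat_conj(q):
    """Conjugate of a (w, x, y, z) quaternion."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def quat_mul(a, b):
    """Hamilton product of two (w, x, y, z) quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def relative_rotation_matrix(q0, qi):
    """Rotation matrix C of attitude qi relative to attitude q0.

    C is built from the relative quaternion qr = conj(q0) * qi
    (both inputs assumed unit quaternions).
    """
    w, x, y, z = quat_mul(quat_conj(q0), qi)
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ]
```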
  • the mobile terminal controlling the large-screen device to make a first response at the first position includes: the mobile terminal sends second information to the large-screen device; wherein, the second information is used to indicate The first position, the second information includes control information, and the control information is used to instruct the large-screen device to make a first response at the first position.
  • the mobile terminal can control the response of the large-screen device by sending the large-screen device specific control information together with the position at which it is expected to respond. This method offers stable performance and a good user experience.
  • the mobile terminal controlling the large-screen device to make the first response at the first position includes: the mobile terminal sends second information to the service device; wherein the second information is used to indicate In the first location, the second information includes control information, and the control information is used to instruct the service device to control the large-screen device to make a first response in the first location.
  • the mobile terminal can make the large-screen device respond by controlling the service device. This can be applied to different application scenarios, is easy to use, and provides a good user experience.
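The "second information" described above — the first position plus control information — could, for illustration, be encoded as a small JSON payload; the field names and the JSON encoding are assumptions, not the patent's actual wire format.

```python
import json

def build_control_message(x, y, command="show_pointer"):
    """Encode a target position plus a control instruction as JSON.

    Illustrative sketch of the 'second information' sent to the
    large-screen device or service device; field names are invented.
    """
    return json.dumps({
        "position": {"x": x, "y": y},  # first position on the display screen
        "control": command,            # e.g. show_pointer, turn_page, mark
    })
```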
  • the method further includes: the mobile terminal receives a first operation of the user; and the mobile terminal controlling the large-screen device to make the first response at the first position includes: in response to the first operation, the mobile terminal controls the large-screen device to make the first response at the first position.
  • the mobile terminal can receive a preset user operation representing a specific control instruction and control the large-screen device to respond accordingly; it can thus support various controls of the large-screen device, with convenient operation and a good user experience.
  • the first response includes at least any one of the following responses: switching the laser pointer to a laser brush, turning pages, opening a file, marking, playing, stopping playing, and adjusting brightness.
  • the mobile terminal can control the large-screen device to respond according to the user's operation, and can support various controls of the large-screen device, with convenient operation and a good user experience.
  • the first operation is a user operation received when the display screen of the mobile terminal displays a human-computer interaction interface.
  • the mobile terminal can receive the user's preset operations on the human-computer interaction interface and control the large-screen device accordingly; it can support various controls of the large-screen device, with convenient operation and a good user experience.
  • the first operation is the user's operation on the human-computer interaction interface; the human-computer interaction interface includes at least one of the following virtual buttons: a left-hand/right-hand interface switching button, a pointer/pen switching button, a 2D position realignment button, a one-key play/one-key exit button, and a personalized settings button.
  • the mobile terminal can receive the user's preset operations on the human-computer interaction interface and control the large-screen device accordingly; it can support various controls of the large-screen device, with convenient operation and a good user experience.
  • the first operation further includes: the user's tap/double-tap/long-press operation in a blank area of the human-computer interaction interface, the user's touch-and-slide in the blank area of the human-computer interaction interface, the user's press of a volume key of the mobile terminal, or the user's pinch of the side frame of the mobile terminal.
  • the mobile terminal can receive the user's preset operations on the human-computer interaction interface or elsewhere on the mobile terminal and control the large-screen device accordingly; it can support various controls of the large-screen device, with convenient operation and a good user experience.
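The preset operations listed above could be dispatched to first responses with a simple lookup table; the operation names and the chosen bindings below are purely illustrative assumptions, not the patent's mappings.

```python
# Hypothetical lookup from preset user operations to first responses;
# every key and mapping here is invented for illustration.
OPERATION_TO_RESPONSE = {
    "double_tap_blank_area": "open_file",
    "long_press_blank_area": "switch_pointer_to_brush",
    "press_volume_key": "turn_page",
    "pinch_side_frame": "mark",
}

def response_for(operation):
    """Return the response for a recognized operation, else None."""
    return OPERATION_TO_RESPONSE.get(operation)
```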
  • in a second aspect, a mobile terminal is provided. The mobile terminal includes a sensor module, the sensor module includes a plurality of motion sensors, and the plurality of motion sensors includes at least an acceleration sensor, a gyroscope sensor, and a magnetic sensor.
  • a communication connection is established between the mobile terminal and the large-screen device, the mobile terminal is used to control the laser pointer displayed on the large-screen device, and the sensor module is used to collect the motion data of the mobile terminal. The mobile terminal further includes: an analysis module, configured to determine the spatial posture of the mobile terminal from the motion data using a nine-axis fusion algorithm, and to determine the first position on the display screen of the large-screen device according to the spatial posture; and a sending module, configured to send control information to the large-screen device for controlling the large-screen device to make a first response at the first position, the first response including displaying a laser pointer at the first position.
  • in this way, the user can control the display of the large-screen device with an existing mobile terminal: the user changes the posture of the mobile terminal, the mobile terminal calculates from its spatial posture the corresponding position on the display screen of the large-screen device, and then sends control information to the large-screen device so that it makes responses such as displaying the laser pointer at, or moving the laser pointer to, the corresponding position.
  • the spatial attitude is used to identify at least the following information: the yaw angle ψ, the pitch angle θ, and the roll angle φ of the mobile terminal relative to the ground coordinate system.
  • the mobile terminal can determine the position corresponding to the display screen of the large-screen device by calculating its yaw angle, pitch angle, and roll angle.
  • the analysis module determining the first position on the display screen of the large-screen device according to the spatial attitude includes: the analysis module determines the coordinates (x'_i, y'_i) of the first position on the display screen of the large-screen device using a formula in which W is the width of the display screen of the large-screen device and s is the sensitivity of the mobile terminal. Using this calculation method, the mobile terminal can determine the position corresponding to the display screen of the large-screen device according to its spatial posture.
  • the mobile terminal further includes a detecting and receiving module, configured to receive a first instruction from the user when the mobile terminal is in a first posture, before the sensor module collects the motion data of the mobile terminal through the multiple motion sensors; the first instruction is used to instruct the mobile terminal to perform 2D position realignment with the large-screen device. The sending module is further configured to, in response to the first instruction, send first information to the large-screen device, where the first information is used to instruct the large-screen device to display the laser pointer at a preset position; the first posture is any posture of the mobile terminal.
  • the mobile terminal can receive a user's realignment instruction, and realign the position of the laser pointer on the large screen device display screen, which can improve the accuracy of the laser pointer display position and improve user experience.
  • the analysis module determining the first position on the display screen of the large-screen device according to the spatial attitude includes: the analysis module takes the quaternion q0 corresponding to the first attitude as the starting point and the quaternion qi corresponding to the spatial attitude as the end point, and calculates the relative rotation matrix C of the quaternion qi relative to the quaternion q0 as the mobile terminal transforms from the first attitude to the spatial attitude; the analysis module determines a second attitude from the initial yaw angle, the initial pitch angle, and the relative rotation matrix C; and the mobile terminal determines the first position according to the second attitude. Here, the initial yaw angle is the yaw angle of the mobile terminal relative to the ground coordinate system when the mobile terminal is in the first attitude, and the initial pitch angle is the pitch angle of the mobile terminal relative to the ground coordinate system when the mobile terminal is in the first attitude.
  • the mobile terminal can calculate the quaternion corresponding to the spatial attitude transformation, calculate the spatial attitude after the transformation according to the quaternion, and then determine the first position.
  • the sending module sending control information to the large-screen device includes: the sending module sends the control information to the service device, and the control information is used to instruct the large-screen device to make the first response at the first position under the control of the service device.
  • the mobile terminal can make the large-screen device respond by controlling the service device. This can be applied to different application scenarios, is easy to use, and provides a good user experience.
  • the mobile terminal further includes: a detection module, configured to detect a user's first operation before the sending module sends control information to the large-screen device; the sending module sends control information to the large-screen device, Including: in response to the first operation, the sending module sends control information to the large-screen device.
  • the mobile terminal can receive a preset user operation representing a specific control instruction and control the large-screen device to respond accordingly; it can thus support various controls of the large-screen device, with convenient operation and a good user experience.
  • the first response includes at least any one of the following responses: switching the laser pointer to a laser brush, turning pages, opening a file, marking, playing, stopping playing, and adjusting brightness.
  • the mobile terminal can control the large-screen device to respond according to the user's operation, and can support various controls of the large-screen device, with convenient operation and a good user experience.
  • the first operation is a user operation received when the display screen of the mobile terminal displays a human-computer interaction interface.
  • the mobile terminal can receive the user's preset operations on the human-computer interaction interface and control the large-screen device accordingly; it can support various controls of the large-screen device, with convenient operation and a good user experience.
  • the first operation is the user's operation on the human-computer interaction interface; the human-computer interaction interface includes at least one of the following virtual buttons: a left-hand/right-hand interface switching button, a pointer/pen switching button, a 2D position realignment button, a one-key play/one-key exit button, and a personalized settings button.
  • the mobile terminal can receive the user's preset operations on the human-computer interaction interface and control the large-screen device accordingly; it can support various controls of the large-screen device, with convenient operation and a good user experience.
  • the first operation further includes: the user's tap/double-tap/long-press operation in a blank area of the human-computer interaction interface, the user's touch-and-slide in the blank area of the human-computer interaction interface, the user's press of a volume key of the mobile terminal, or the user's pinch of the side frame of the mobile terminal.
  • the mobile terminal can receive the user's preset operations on the human-computer interaction interface or elsewhere on the mobile terminal and control the large-screen device accordingly; it can support various controls of the large-screen device, with convenient operation and a good user experience.
  • in a third aspect, a mobile terminal is provided, including: a memory configured to store computer program code, the computer program code including instructions; a radio frequency unit configured to transmit and receive radio signals; and a processor. When the one or more computer programs stored in the memory are executed by the processor, the mobile terminal is caused to execute the method for controlling the display of the large-screen device in any one of the possible implementation manners of the first aspect.
  • in a fourth aspect, a first system is provided, including a mobile terminal and a large-screen device.
  • the mobile terminal is used to control the large-screen device to implement the method for controlling the display of the large-screen device in any one of the possible implementation manners of the first aspect.
  • the first system further includes: a service device, configured to implement the method for controlling the display of the large-screen device in any one of the corresponding possible implementation manners in the first aspect.
  • in a fifth aspect, a chip system is provided, including a processor and a memory in which instructions are stored; when the instructions are executed by the processor, the method for controlling the display of the large-screen device in any possible implementation manner of the first aspect is implemented.
  • the chip system can be composed of chips, or can include chips and other discrete devices.
  • a computer-readable storage medium is provided, which stores computer-executable instructions; when the instructions are executed, the method for controlling the display of the large-screen device in any possible implementation manner of the first aspect can be implemented.
  • a computer program product is provided which, when running on a computer, enables the computer to execute the method for controlling the display of the large-screen device in any possible implementation manner of the first aspect.
  • the computer may be at least one storage node.
  • FIG. 1A is an example of an application scenario of the method for controlling the display of a TV through a mobile phone according to an embodiment of the application;
  • FIG. 1B is an example of an application scenario of the method for controlling the display of a laptop computer through a mobile phone according to an embodiment of the application;
  • FIG. 2 is a second example of an application scenario of the method for controlling the display of a TV through a mobile phone according to an embodiment of the application;
  • FIG. 3 is an example of an application scenario of a method for controlling the display of a projection device provided by an embodiment of the application;
  • FIG. 4 is a schematic diagram of the hardware structure of a mobile phone provided by an embodiment of the application.
  • FIG. 5 is a schematic diagram of a mobile phone space attitude provided by an embodiment of the application.
  • FIG. 6 is a first flowchart of a method for controlling the display of a large-screen device through a mobile terminal according to an embodiment of the application;
  • FIG. 7 is a schematic diagram of the relative position of a mobile phone and a laptop according to an embodiment of the application.
  • FIG. 8 is a schematic diagram of a first position determination method provided by an embodiment of this application.
  • FIG. 9 is a second flowchart of a method for controlling the display of a large-screen device according to an embodiment of the application.
  • FIG. 10 is an example of a human-computer interaction interface provided by an embodiment of the application.
  • FIG. 11 is a third flowchart of a method for controlling the display of a large-screen device according to an embodiment of the application.
  • FIG. 12 is a schematic structural diagram of a mobile terminal provided by an embodiment of this application.
  • FIG. 13 is a schematic structural diagram of another mobile terminal provided by an embodiment of this application.
  • the embodiments of the present application provide a method, electronic device and system for controlling the display of a large-screen device.
  • the electronic device is a mobile terminal or a large-screen device.
  • large-screen devices include display screens.
  • the large-screen device may be a TV, a personal computer (PC), a tablet computer, a netbook, a projection device, etc.
  • the mobile terminal can directly control the content and the display form displayed on the display screen of the large-screen device.
  • the large screen device in the embodiment of the present application is a projection device, for example, the projection device includes a projector and a screen.
  • the content displayed on the screen of the mobile terminal is directly projected on the screen by the projector, and the mobile terminal can directly control the projector to control the content and display form of the projector projected on the screen.
  • the mobile terminal can control the content and display form displayed on the display screen by operating the service device.
  • the projector is connected to a laptop computer, and the interface displayed on the laptop computer screen is projected on the screen.
  • the mobile terminal can control the content and display form of the projector projected on the screen by controlling the laptop computer.
  • the large-screen device in the embodiment of the present application may also be other devices in a human-computer interaction scene, such as a somatosensory game machine.
  • the embodiments of this application do not limit the specific types and forms of large-screen devices.
  • the mobile terminal in the embodiment of the present application may be a handheld device, such as a smart phone, a tablet computer, or a palmtop computer; it may also be another type of electronic device, such as a wearable device (for example, a smart watch), a portable multimedia player (PMP), a dedicated media player, or an augmented reality (AR)/virtual reality (VR) device.
  • the mobile terminal and the large-screen device establish a communication connection.
  • the mobile terminal can control the large-screen device through the communication connection.
  • the communication connection can be a wired connection or a wireless connection.
  • mobile terminals, large-screen devices, and service devices can follow a wireless transmission protocol and transmit information through wireless connection transceivers.
  • the transmitted information includes but is not limited to content data and control instructions that need to be displayed.
  • the wireless transmission protocol may include, but is not limited to, a Bluetooth (BT) transmission protocol or a wireless fidelity (Wireless Fidelity, WiFi) transmission protocol.
  • the wireless connection transceiver includes but is not limited to a transceiver such as Bluetooth or WiFi.
  • the mobile terminal, the large-screen device, and the service device can be connected through a wired connection to realize information transmission.
  • the wired connection is a data transmission line connection.
  • the following embodiments of the present application take as an example only the case in which the mobile terminal, the large-screen device, and the service device follow a wireless transmission protocol to implement information transmission.
  • the embodiments of the present application do not limit the specific application scenarios of human-computer interaction.
  • the scene can be a business exhibition, a seminar, a large-scale road show, or a new product launch as described in the background; it can also be a somatosensory game scene, such as "cutting a watermelon" or "shooting"; or it can be a scene such as TV teaching or a TV meeting.
  • the mobile terminal can directly control the large-screen device to control its display, or control other devices to control the display of the large-screen device; this application does not specifically limit the specific scenarios.
  • the mobile terminal is a mobile phone 100
  • the large-screen device is a television 300 as an example.
  • the mobile phone 100 can directly project the interface displayed on the display screen of the mobile phone 100 to the display screen of the television 300. That is, the display screen of the television 300 can simultaneously display the content on the display screen of the mobile phone 100.
  • the mobile office software installed in the mobile phone 100 displays PPT on the display screen of the TV set 300 through technologies such as "Share WiFi".
  • the user controls the position of the laser pointer on the display screen of the TV 300 through the mobile phone 100, and controls the TV 300 to make corresponding displays, effects, etc., at the corresponding positions, such as color marking, marking, and checking.
  • the mobile phone can also display the content on the display screen of the mobile phone 100 on the display screen of a laptop computer through technologies such as "shared WiFi".
  • the mobile terminal is a mobile phone 100
  • the large-screen device is a laptop computer 200 as an example.
  • the laptop computer 200 can be used only as a display device, on whose display the content on the display of the mobile phone 100 is simultaneously displayed.
  • the laptop computer 200 can also be used as a playback device to play content in the laptop computer 200.
  • the laptop computer 200 is installed with the somatosensory game "Watermelon Cutting" application (APP).
  • the user uses the mobile phone 100 as a somatosensory "mouse"; both the mobile phone 100 and the laptop computer 200 are connected to WiFi.
  • the process of the user "cutting a watermelon" may be as follows: the laptop computer 200 displays a laser pointer at the initial position of the "cutting watermelon" game interface (for example, the center of the game interface); the user holds the mobile phone 100 in the right hand and slides it in any direction in the air, and the laser pointer slides along the corresponding track; the user controls the laser pointer to move to the vicinity of "Watermelon" A and continues to slide the right hand in the air to make a "cut watermelon" action; correspondingly, "Watermelon" A in the game interface displays the simulated visual effect of being cut.
  • the mobile phone 100 can also control the display on the screen of the projection device by operating the laptop computer 200.
  • in the second application scenario example of the method for controlling the display of a TV through a mobile phone provided in this embodiment of the application, the TV set 300 is used only as a display device to display the interface displayed on the display screen of the laptop computer 200.
  • the TV set 300 can also be understood as a projection device.
  • the presenter displays the PPT presentation interface on the display screen of his laptop computer 200 on the display screen of the television 300.
  • the speaker controls the laptop computer 200 through the mobile phone 100 to control the display effects of the PPT presentation interface displayed on the display screen of the television 300, such as color marking, line drawing, and check.
  • the projection device includes a projector 400 and a screen 500.
  • the speaker projects the PPT presentation interface on the display screen of his laptop computer 200 onto the screen 500 through the projector 400.
  • the speaker controls the laptop computer 200 through the mobile phone 100 so that on the display interface projected on the screen 500 through the projector 400, the position of the laser pointer can change with the posture of the mobile phone 100.
  • the mobile phone 100 controls the laptop computer 200, so that corresponding display effects and the like are produced at the corresponding position of the display interface that the laptop computer 200 projects onto the screen 500 through the projector 400.
  • the mobile phone 100 and the laptop computer 200 may both be connected to WiFi.
  • a data transmission line 600, for example, a high-definition multimedia interface (HDMI) cable
  • the TV 300 and the projector 400 may also be connected to WiFi, and receive displayed content data and control instructions from the laptop 200 through WiFi.
  • the mobile terminal such as the mobile phone 100
  • the mobile terminal may have basic functions such as laser pointer indication and page turning.
  • according to the three-dimensional (3D) position (also called the spatial position) of the user's handheld mobile phone 100, the service device or the large-screen device can map it to a two-dimensional (2D) position (also referred to as the plane position) on the large-screen device display screen (for example, the display in Figure 1A, the TV 300 in Figure 2, or the screen 500 in Figure 3), and thereby determine the display position of the laser pointer to remind the audience of what is being delivered.
  • the mobile phone 100 sends a "page turning" instruction to the service device or the large screen device.
  • the mobile phone 100 may also have the function of a wireless mouse.
  • the user can send commands such as "highlight”, “double-click to open”, “play”, and "line” to the service device or large-screen device through the mobile phone 100 to control the corresponding display at the laser pointer on the large-screen device display.
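  The application does not specify a wire format for these commands; purely for illustration, commands such as "highlight" or "page turning" could travel from the mobile phone 100 to the service device as small structured messages over the WiFi connection. A minimal sketch in Python (the field names `cmd` and `pos` are hypothetical, not from the application):

```python
import json

def make_command(cmd: str, x: float, y: float) -> bytes:
    """Encode a control command ("highlight", "page_turn", ...) together
    with the current laser-pointer position as a JSON payload."""
    message = {"cmd": cmd, "pos": {"x": x, "y": y}}
    return json.dumps(message).encode("utf-8")

def parse_command(payload: bytes) -> dict:
    """Decode a received command payload on the service-device side."""
    return json.loads(payload.decode("utf-8"))

# Round trip: the service device recovers the command and the position
# at which to apply the display effect.
packet = make_command("highlight", 0.42, 0.18)
decoded = parse_command(packet)
```

  On the service-device side, `parse_command` would recover the command type and the laser-pointer coordinates at which to render the corresponding effect.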
  • the mobile phone 100 may include a processor 410, an external memory interface 420, an internal memory 421, a universal serial bus (USB) interface 430, a charging management module 440, a power management module 441, and a battery 442, Antenna 1, antenna 2, mobile communication module 450, wireless communication module 460, audio module 470, speaker 470A, receiver 470B, microphone 470C, sensor module 480, buttons 490, motor 491, indicator 492, camera 493, display 494, And subscriber identification module (subscriber identification module, SIM) card interface 495, etc.
  • the sensor module 480 may include a pressure sensor 480A, a gyroscope sensor 480B, a magnetic sensor 480C, an acceleration sensor 480D, a distance sensor 480E, a fingerprint sensor 480F, a touch sensor 480G, an ambient light sensor 480H, etc.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the mobile phone 100.
  • the mobile phone 100 may include more or fewer components than shown, or combine certain components, or split certain components, or arrange different components.
  • the illustrated components can be implemented in hardware, software, or a combination of software and hardware.
  • the processor 410 may include one or more processing units.
  • the processor 410 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc.
  • the different processing units may be independent devices or integrated in one or more processors.
  • a memory may also be provided in the processor 410 for storing instructions and data.
  • the memory in the processor 410 is a cache memory.
  • the memory can store instructions or data that have just been used or recycled by the processor 410. If the processor 410 needs to use the instruction or data again, it can be directly called from the memory. Repeated access is avoided, the waiting time of the processor 410 is reduced, and the efficiency of the system is improved.
  • the processor 410 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
  • the I2C interface is a two-way synchronous serial bus, including a serial data line (SDA) and a serial clock line (SCL).
  • the processor 410 may include multiple sets of I2C buses.
  • the processor 410 may be coupled to the touch sensor 480G, charger, flash, camera 493, etc. through different I2C bus interfaces.
  • there is at least one camera 493; the camera 493 may also be a 360° rotatable camera.
  • the processor 410 may couple the touch sensor 480G through an I2C interface, so that the processor 410 and the touch sensor 480G communicate through the I2C bus interface to implement the touch function of the mobile phone 100.
  • the USB interface 430 is an interface that complies with the USB standard specification, and specifically may be a Mini USB interface, a Micro USB interface, a USB Type C interface, and so on.
  • the USB interface 430 can be used to connect a charger to charge the mobile phone 100, and can also be used to transfer data between the mobile phone 100 and peripheral devices. It can also be used to connect to other electronic devices, such as AR devices.
  • the charging management module 440 is used to receive charging input from the charger.
  • the charger can be a wireless charger or a wired charger.
  • the charging management module 440 may receive the charging input of the wired charger through the USB interface 430.
  • the charging management module 440 may receive the wireless charging input through the wireless charging coil of the mobile phone 100. While the charging management module 440 charges the battery 442, it can also supply power to the electronic device through the power management module 441.
  • the power management module 441 is used to connect the battery 442, the charging management module 440 and the processor 410.
  • the power management module 441 receives input from the battery 442 and/or the charge management module 440, and supplies power to the processor 410, the internal memory 421, the display screen 494, the camera 493, and the wireless communication module 460.
  • the power management module 441 can also be used to monitor parameters such as battery capacity, battery cycle times, and battery health status (leakage, impedance).
  • the power management module 441 may also be provided in the processor 410.
  • the power management module 441 and the charging management module 440 may also be provided in the same device.
  • the wireless communication function of the mobile phone 100 can be implemented by the antenna 1, the antenna 2, the mobile communication module 450, the wireless communication module 460, the modem processor, and the baseband processor.
  • the antenna 1 and the antenna 2 are used to transmit and receive electromagnetic wave signals.
  • Each antenna in the mobile phone 100 can be used to cover a single or multiple communication frequency bands. Different antennas can also be reused to improve antenna utilization.
  • antenna 1 can be multiplexed as a diversity antenna of a wireless local area network.
  • the antenna can be used in combination with a tuning switch.
  • the mobile communication module 450 may provide a wireless communication solution including 2G/3G/4G/5G and the like applied on the mobile phone 100.
  • the mobile communication module 450 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc.
  • the mobile communication module 450 may receive electromagnetic waves by the antenna 1, and perform processing such as filtering, amplifying and transmitting the received electromagnetic waves to the modem processor for demodulation.
  • the mobile communication module 450 can also amplify the signal modulated by the modem processor, and convert it into electromagnetic wave radiation via the antenna 1.
  • at least part of the functional modules of the mobile communication module 450 may be provided in the processor 410.
  • at least part of the functional modules of the mobile communication module 450 and at least part of the modules of the processor 410 may be provided in the same device.
  • the modem processor may include a modulator and a demodulator.
  • the modulator is used to modulate the low frequency baseband signal to be sent into a medium and high frequency signal.
  • the demodulator is used to demodulate the received electromagnetic wave signal into a low-frequency baseband signal. Then the demodulator transmits the demodulated low-frequency baseband signal to the baseband processor for processing.
  • the low-frequency baseband signal is processed by the baseband processor and then passed to the application processor.
  • the application processor outputs a sound signal through an audio device (not limited to the speaker 470A, the receiver 470B, etc.), or displays images or video through the display screen 494.
  • the modem processor may be an independent device.
  • the modem processor may be independent of the processor 410 and be provided in the same device as the mobile communication module 450 or other functional modules.
  • the wireless communication module 460 can provide wireless communication solutions applied on the mobile phone 100, including wireless local area network (WLAN) (such as a WiFi network), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR), and the like.
  • the wireless communication module 460 may be one or more devices integrating at least one communication processing module.
  • the wireless communication module 460 receives electromagnetic waves via the antenna 2, frequency modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 410.
  • the wireless communication module 460 may also receive the signal to be sent from the processor 410, perform frequency modulation, amplify, and convert it into electromagnetic waves to radiate through the antenna 2.
  • the antenna 1 of the mobile phone 100 is coupled with the mobile communication module 450, and the antenna 2 is coupled with the wireless communication module 460, so that the mobile phone 100 can communicate with the network and other devices through wireless communication technology.
  • the wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), broadband Code division multiple access (wideband code division multiple access, WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC , FM, and/or IR technology, etc.
  • the GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the Beidou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite-based augmentation systems (SBAS).
  • the mobile phone 100 implements a display function through a GPU, a display screen 494, and an application processor.
  • the GPU is a microprocessor for image processing, connected to the display 494 and the application processor.
  • the GPU is used to perform mathematical and geometric calculations for graphics rendering.
  • the processor 410 may include one or more GPUs that execute program instructions to generate or change display information.
  • the display screen 494 is used to display images, videos, etc.
  • the display screen 494 includes a display panel.
  • the display panel can adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, a quantum dot light-emitting diode (QLED), etc.
  • the mobile phone 100 may include one or N display screens 494, and N is a positive integer greater than one.
  • the mobile phone 100 can realize a shooting function through an ISP, a camera 493, a video codec, a GPU, a display 494, and an application processor.
  • the ISP is used to process the data fed back from the camera 493. For example, when taking a picture, the shutter is opened, the light is transmitted to the photosensitive element of the camera through the lens, the light signal is converted into an electrical signal, and the photosensitive element of the camera transfers the electrical signal to the ISP for processing and is converted into an image visible to the naked eye.
  • ISP can also optimize the image noise, brightness, and skin color. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be provided in the camera 493.
  • the camera 493 is used to capture still images or videos.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the optical signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • ISP outputs digital image signals to DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other formats.
  • the mobile phone 100 may include one or N cameras 493, and N is a positive integer greater than one.
  • Digital signal processors are used to process digital signals. In addition to digital image signals, they can also process other digital signals. For example, when the mobile phone 100 selects a frequency point, the digital signal processor is used to perform Fourier transform on the energy of the frequency point.
  • Video codecs are used to compress or decompress digital video.
  • the mobile phone 100 may support one or more video codecs. In this way, the mobile phone 100 can play or record videos in a variety of encoding formats, such as: moving picture experts group (MPEG) 1, MPEG2, MPEG3, MPEG4, etc.
  • NPU is a neural-network (NN) computing processor.
  • through the NPU, applications such as intelligent cognition of the mobile phone 100 can be realized, for example, image recognition, face recognition, voice recognition, and text understanding.
  • the external memory interface 420 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the mobile phone 100.
  • the external memory card communicates with the processor 410 through the external memory interface 420 to realize the data storage function. For example, save music, video and other files in an external memory card.
  • the internal memory 421 may be used to store computer executable program code, where the executable program code includes instructions.
  • the internal memory 421 may include a program storage area and a data storage area.
  • the storage program area can store an operating system, at least one application program (such as a sound playback function, an image playback function, etc.) required by at least one function.
  • the data storage area can store data (such as audio data, phone book, etc.) created during the use of the mobile phone 100.
  • the internal memory 421 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (UFS), etc.
  • the processor 410 executes various functional applications and data processing of the mobile phone 100 by running instructions stored in the internal memory 421 and/or instructions stored in a memory provided in the processor.
  • the mobile phone 100 can implement audio functions through an audio module 470, a speaker 470A, a receiver 470B, a microphone 470C, and an application processor. For example, music playback, recording, etc.
  • the audio module 470 is used to convert digital audio information into an analog audio signal for output, and also used to convert an analog audio input into a digital audio signal.
  • the audio module 470 can also be used to encode and decode audio signals.
  • the audio module 470 may be disposed in the processor 410, or some functional modules of the audio module 470 may be disposed in the processor 410.
  • the speaker 470A, also called a "loudspeaker", is used to convert audio electrical signals into sound signals.
  • the mobile phone 100 can perform voice playback or notification through the speaker 470A.
  • the receiver 470B, also called an "earpiece", is used to convert audio electrical signals into sound signals.
  • when the mobile phone 100 answers a call or a voice message, the user can hear the voice by bringing the receiver 470B close to the ear.
  • the microphone 470C, also called a "mic" or "mouthpiece", is used to convert sound signals into electrical signals.
  • the user can input a sound signal into the microphone 470C by speaking with the mouth close to the microphone 470C.
  • the mobile phone 100 can be provided with at least one microphone 470C.
  • the mobile phone 100 may be provided with two microphones 470C, which can implement noise reduction functions in addition to collecting sound signals.
  • the mobile phone 100 may also be provided with three, four or more microphones 470C to collect sound signals, reduce noise, identify sound sources, and realize directional recording functions.
  • the pressure sensor 480A is used to sense the pressure signal and can convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 480A can be used to determine the user's pressing on the frame of the mobile phone 100, so as to facilitate the mobile phone 100 to respond to the control instruction corresponding to the operation.
  • the gyro sensor 480B may be used to determine the posture of the mobile phone 100 during movement.
  • in some embodiments, the angular velocity of the mobile phone 100 around three axes (i.e., the x, y, and z axes) can be determined by the gyroscope sensor 480B.
  • the magnetic sensor 480C includes a Hall sensor. In some embodiments, the magnetic sensor 480C can induce magnetic field strength to measure physical parameters such as current, position, and direction.
  • the acceleration sensor 480D can detect the magnitude of the acceleration of the mobile phone 100 in various directions (generally three axes).
  • the magnitude and direction of gravity can be detected when the mobile phone 100 is stationary; it can also be used to recognize the posture of the mobile phone 100 and for camera viewfinder lens switching, etc.
  • the distance sensor 480E is used to measure distance; the mobile phone 100 can measure distance by infrared or laser. In some embodiments, when shooting a scene, the mobile phone 100 may use the distance sensor 480E to measure distance to achieve fast focusing.
  • the ambient light sensor 480H is used to sense the brightness of the ambient light.
  • the mobile phone 100 can adaptively adjust the brightness of the display 494 according to the perceived brightness of the ambient light.
  • the ambient light sensor 480H can also be used to automatically adjust the white balance when taking pictures.
  • the ambient light sensor 480H can also work with the proximity light sensor.
  • the mobile phone 100 may instruct the large-screen device to adjust the brightness of the display screen of the large-screen device according to the brightness of the ambient light sensed by the ambient light sensor 480H.
  • the fingerprint sensor 480F is used to collect fingerprints. Any type of sensing technology can be used, including but not limited to optical, capacitive, piezoelectric or ultrasonic sensing technology.
  • the mobile phone 100 can use the collected fingerprint characteristics to realize fingerprint unlocking, access application locks, fingerprint photographs, etc.
  • the touch sensor 480G, also called a "touch panel", may be disposed on the display screen 494; the touch sensor 480G and the display screen 494 form a touch screen, also called a "touchscreen".
  • the touch sensor 480G is used to detect touch operations acting on or near it.
  • the touch sensor can transmit the detected touch operation to the application processor to determine the type of touch event.
  • the visual output related to the touch operation can be provided through the display screen 494.
  • the touch sensor 480G may also be disposed on the surface of the mobile phone 100, which is different from the position of the display screen 494.
  • the button 490 includes a power button, a volume button, and so on.
  • the button 490 may be a mechanical button. It can also be a touch button.
  • the mobile phone 100 can receive key input, and generate key signal input related to user settings and function control of the mobile phone 100.
  • the motor 491 can generate vibration prompts.
  • the motor 491 can be used for incoming call vibration notification, and can also be used for touch vibration feedback.
  • touch operations applied to different applications can correspond to different vibration feedback effects.
  • touch operations acting on different areas of the display screen 494 can also correspond to different vibration feedback effects of the motor 491.
  • different application scenarios (for example, time reminders, receiving messages, alarm clocks, and games) can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the indicator 492 can be an indicator light, which can be used to indicate the charging status, power change, and can also be used to indicate messages, missed calls, notifications, and so on.
  • the SIM card interface 495 is used to connect to the SIM card.
  • the SIM card can be connected to and separated from the mobile phone 100 by inserting into the SIM card interface 495 or pulling out from the SIM card interface 495.
  • the mobile phone 100 may support 1 or N SIM card interfaces, and N is a positive integer greater than 1.
  • the SIM card interface 495 can support Nano SIM cards, Micro SIM cards, SIM cards, etc.
  • the same SIM card interface 495 can insert multiple cards at the same time. The types of the multiple cards can be the same or different.
  • the SIM card interface 495 can also be compatible with different types of SIM cards.
  • the SIM card interface 495 may also be compatible with external memory cards.
  • the mobile phone 100 interacts with the network through the SIM card to implement functions such as call and data communication.
  • the mobile phone 100 uses an eSIM, that is, an embedded SIM card.
  • the eSIM card can be embedded in the mobile phone 100 and cannot be separated from the mobile phone 100.
  • the following specifically introduces the method for controlling the display of a large-screen device provided by an embodiment of the present application in conjunction with the mobile phone in FIG. 4.
  • the methods in the following embodiments can all be implemented in a mobile terminal with the above hardware structure or a mobile terminal with a similar structure.
  • the basic principle of the method for controlling the display of a large-screen device in the embodiment of this application is: use the data collected by the acceleration sensor, the gyroscope sensor, and the magnetic sensor to obtain the real-time spatial attitude of the mobile phone through the nine-axis fusion algorithm; map the real-time spatial attitude to a 2D position on the display of the large-screen device to determine the display position of the laser pointer; and control the large-screen device to make a certain response at the determined display position of the laser pointer.
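  The application does not disclose the internals of its nine-axis fusion algorithm. For intuition only, a complementary filter is one standard way to fuse a gyroscope's integrated yaw (smooth but drifting) with a magnetometer's absolute heading (noisy but drift-free); this is a generic sketch, not the patented method:

```python
def fuse_yaw(prev_yaw_deg, gyro_z_dps, dt_s, mag_yaw_deg, alpha=0.98):
    """Complementary filter for yaw: trust the integrated gyroscope in
    the short term and the magnetometer in the long term; alpha is the
    gyroscope weight."""
    gyro_yaw = prev_yaw_deg + gyro_z_dps * dt_s  # dead-reckoned yaw
    return alpha * gyro_yaw + (1.0 - alpha) * mag_yaw_deg

# With no rotation and the magnetometer agreeing, the estimate stays put.
yaw = 30.0
for _ in range(100):
    yaw = fuse_yaw(yaw, gyro_z_dps=0.0, dt_s=0.01, mag_yaw_deg=30.0)
```

  In a real fusion algorithm the same blending idea is applied to all three attitude angles, with the accelerometer anchoring pitch and roll the way the magnetometer anchors yaw here.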
  • the nine-axis fusion algorithm refers to the method of obtaining the posture of an object through acceleration sensors, gyroscope sensors and magnetic sensors.
  • the mobile phone 100 may be integrated with an IMU.
  • the mobile phone 100 can acquire the posture of an object by fusing an acceleration sensor, a gyroscope sensor, and a magnetic sensor in an inertial measurement unit (IMU).
  • IMU is a unit that measures and reports speed, direction and gravity through a combination of sensors (acceleration sensor, gyroscope sensor and magnetic sensor).
  • the working principle of the acceleration sensor is to determine the translation direction and the translation speed of the mobile phone 100 by measuring the component of force along a certain axis.
  • the manifestation of the force condition is the direction (X, Y, Z axis direction) and the magnitude of the acceleration in the corresponding direction.
  • the working principle of the gyroscope sensor is to measure the angle between the vertical axis of the gyroscope rotor and the mobile phone 100 in the three-dimensional coordinate system (the three-dimensional coordinate system includes X-axis, Y-axis, and Z-axis), and calculate the angular velocity.
  • the principle of the magnetic sensor is similar to that of a compass, which can measure the angles between the mobile phone 100 and the four directions of south, east, north, and west.
  • colloquially, the acceleration sensor measures "how far the mobile phone 100 has moved along the X axis", the gyroscope sensor measures "how much the mobile phone 100 has turned", and the magnetic sensor measures "whether the mobile phone 100 is moving westward". Because the acceleration sensor, the gyroscope sensor, and the magnetic sensor can each measure 3-axis motion data, together they are usually called a "nine-axis sensor".
  • the nine-axis fusion algorithm in the embodiment of this application makes full use of the features of the above-mentioned "nine-axis sensor" to fuse and calculate a more accurate real-time spatial attitude of the mobile phone 100, that is, to calculate the spatial coordinates (x_i, y_i, z_i) of the mobile phone 100 in the geographic coordinate system and the attitude angle.
  • the attitude angle is used to reflect the attitude of the mobile phone 100 relative to the ground.
  • the attitude angle includes a yaw angle (yaw), a pitch angle (pitch) and a roll angle (roll) of the mobile phone 100 relative to the ground coordinate system.
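  As a worked illustration of attitude angles (standard textbook formulas, not taken from the application), the pitch and roll of a stationary phone can be estimated from the gravity vector reported by the acceleration sensor alone; yaw additionally needs the magnetic sensor. Sign conventions vary between frameworks:

```python
import math

def pitch_roll_from_accel(ax, ay, az):
    """Estimate pitch and roll (degrees) from a static accelerometer
    reading (ax, ay, az) in the phone body frame, assuming gravity is
    the only force acting on the phone."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

# Phone lying flat, screen up: gravity along +z, so pitch = roll = 0.
pitch, roll = pitch_roll_from_accel(0.0, 0.0, 9.81)
```

  This is why the gyroscope alone is not enough: it reports only angular velocity, while the accelerometer (and magnetometer) provide the absolute reference that keeps the attitude angles from drifting.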
  • FIG. 5 a schematic diagram of a mobile phone spatial attitude provided by an embodiment of this application.
  • the x_b axis, y_b axis, z_b axis, and the origin O constitute the coordinate system of the mobile phone 100.
  • the x_g axis, y_g axis, z_g axis, and the origin O constitute the ground coordinate system.
  • O is the center of mass of the mobile phone 100.
  • the x_b axis lies in the symmetry plane of the mobile phone 100, is parallel to the body axis of the mobile phone 100, and points to the head of the mobile phone.
  • the y_b axis is perpendicular to the symmetry plane of the mobile phone 100 and points to the right side of the mobile phone.
  • the z_b axis is perpendicular to the x_b axis in the symmetry plane and points to the bottom of the mobile phone 100.
  • the x_g axis lies in the horizontal plane and points in a certain direction; the z_g axis is perpendicular to the ground and points to the center of the earth; the y_g axis is perpendicular to the x_g axis in the horizontal plane, with its direction determined according to the right-hand rule.
  • the yaw angle refers to the angle between the projection of the mobile phone 100's x_b axis on the horizontal plane and the x_g axis of the ground coordinate system, with rightward deflection of the front end of the mobile phone 100 taken as positive.
  • the pitch angle refers to the angle θ between the x_b axis of the mobile phone 100 and the ground plane (or horizontal plane), with upward deflection of the front end of the mobile phone taken as positive.
  • the roll angle refers to the angle γ between the z_b axis of the mobile phone 100 and the vertical plane passing through the x_b axis of the mobile phone, with rightward rolling of the mobile phone 100 taken as positive.
  • the method for controlling the display of a large-screen device in the embodiment of the present application can be implemented through S601-S605 in FIG. 6.
  • the mobile phone 100 can perform some or all of the steps in the embodiment of this application. These steps or operations are only examples, and the embodiment of this application can also perform other operations or variations of various operations. In addition, each step may be executed in a different order from that presented in the embodiment of the present application, and it may not be necessary to perform all the operations in the embodiment of the present application.
  • the mobile phone 100 collects motion data of the mobile phone 100 through an acceleration sensor, a gyroscope sensor, and a magnetic sensor.
  • the mobile phone 100 uses a nine-axis fusion algorithm to determine the spatial posture of the mobile phone 100 according to the collected motion data.
  • the spatial attitude of the mobile phone 100 includes the spatial coordinates (x_i, y_i, z_i) of the mobile phone 100 in the geographic coordinate system and the yaw angle, pitch angle θ, and roll angle γ of the mobile phone 100 relative to the ground coordinate system.
  • using the nine-axis fusion algorithm to determine the spatial posture of the mobile phone 100 may include: first calibrating the motion data of the mobile phone 100, and then calculating the current spatial posture of the mobile phone through the fusion algorithm according to the calibrated motion data.
  • the mobile phone 100 determines the first position according to the spatial posture of the mobile phone 100.
  • the first position is the position where the spatial attitude of the mobile phone 100 is mapped on the display screen of the laptop computer.
  • the determined first position is the display position of the laser pointer on the display screen of the laptop computer 200.
  • the display position of the laser pointer on the display screen of the laptop computer 200 is point P in FIG. 7, where the two-dimensional coordinates of point P on the display screen of the laptop computer 200 are (x i ', y i ').
  • the origin of the coordinates is the upper left corner of the display screen of the laptop computer 200; the x-axis runs from left to right along the upper edge of the laptop display, and the y-axis runs from top to bottom along the left edge of the laptop display.
  • the mobile phone 100 can determine R according to the display width W of the laptop computer 200 and the sensitivity s of the mobile phone 100.
  • when the user rotates the mobile phone 100 by the angle s (unit: degrees), the laser pointer correspondingly moves the distance W on the display screen.
  • W is the range over which the laser pointer can be manipulated
  • s is the corresponding controllable rotation angle of the mobile phone 100.
  • when the distance between the person and the display screen is L > W/2*cot(s/2), rotating the mobile phone 100 by the same angle moves the laser pointer a larger distance; the maneuverable space of the mobile phone 100 is smaller (the pointer reaches the edge more easily), and the sensitivity perceived by the user is relatively higher.
  • when the distance between the person and the display screen is L < W/2*cot(s/2), rotating the mobile phone 100 by the same angle moves the laser pointer a smaller distance, and the movement space of the laser pointer becomes larger.
  • in that case the controllable space is larger, and the sensitivity perceived by the user is relatively lower.
  • to adapt to different distances, the sensitivity s of the mobile phone 100 can be adjusted.
  • when L < W/2*cot(s/2), s can be adjusted to be smaller, so that the laser pointer moves the same distance when the mobile phone 100 is controlled to rotate by a smaller angle.
  • when L > W/2*cot(s/2), s can be increased, so that the laser pointer moves a smaller distance when the mobile phone 100 is controlled to rotate by the same angle.
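The relationship above between screen width W, sensitivity s, and viewing distance L = W/2 · cot(s/2) can be sketched as follows. The linear yaw/pitch-to-pixel mapping and the function names are illustrative assumptions; the patent's exact mapping formula is given only as an image.

```python
import math

def pointer_position(yaw_deg, pitch_deg, screen_w, screen_h, s_deg):
    """Map yaw/pitch (degrees, zero when pointing at the screen centre)
    to 2D pixel coordinates; rotating by s degrees sweeps the full width W."""
    px_per_deg = screen_w / s_deg
    x = screen_w / 2 + yaw_deg * px_per_deg
    y = screen_h / 2 - pitch_deg * px_per_deg
    # Clamp so the pointer stops at the screen edges.
    return (min(max(x, 0), screen_w), min(max(y, 0), screen_h))

def min_comfortable_distance(screen_w, s_deg):
    """Distance L at which a rotation of s degrees exactly spans the
    width W: L = W/2 * cot(s/2)."""
    half = math.radians(s_deg) / 2
    return screen_w / 2 / math.tan(half)
```

For example, with a 2 m wide screen and s = 60°, the threshold distance L works out to about 1.73 m; standing farther than that, the user may prefer a larger s (lower sensitivity).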
  • the mobile phone 100 sends the second information to the laptop computer 200.
  • the second information is used to indicate the first position.
  • the second information includes control information.
  • the control information is used to instruct the laptop computer 200 to make a first response at the first position.
  • the mobile phone 100 controls the display device through the service device.
  • the mobile phone 100 controls the display of the television 300 through the laptop computer 200
  • in FIG. 3, the mobile phone 100 controls, through the laptop 200, the display projected on the screen of the projection device.
  • the mobile phone 100 sends the second information to the laptop computer 200, which is used to instruct the laptop computer 200 to control the television 300 to make a first response at the first position.
  • the mobile phone 100 sends the second information to the laptop computer 200, which is used to instruct the laptop computer 200 to control the projector 400 to make a first response at the first position of the screen 500.
  • S605 The laptop computer 200 makes a first response at the first position.
  • the control information may be used to instruct, but is not limited to instructing, the laptop computer 200 to make any one of the following first responses: calling the mouse pointer to click the displayed content at the first position (for example, a video play button), calling the mouse pointer to double-click the displayed content at the first position (for example, a folder), calling the mouse pointer to perform a sliding operation (for example, sliding the video playback progress button) or turn a page, calling the mouse pointer to draw a line at the first position, and so on.
  • control information may also be used to instruct the laptop computer 200 to switch the function of the mouse pointer from the laser pointer function to the laser brush function. For example, by switching the laser pointer to a laser brush, operations such as scribing, drawing, and writing can be completed.
  • the user can instruct the laptop computer 200 to switch the laser pointer to the laser brush by making a preset operation on the mobile phone 100.
  • the preset operation may be sliding on or pressing a certain position of the touch screen of the mobile phone 100, pressing the "+" and "-" volume keys at the same time (1010 in Figure 10), long-pressing a volume key, squeezing the side frame of the mobile phone 100, and so on.
  • the user can instruct the laptop computer 200 to switch the laser pointer to the laser brush by performing a corresponding operation on the human-computer interaction interface of the mobile phone 100, for example: clicking the pointer/brush switching button on the human-computer interaction interface (1007 in Figure 10), or performing a preset operation (such as double-clicking or sliding) on the touchpad of the human-computer interaction interface (1009 in Figure 10).
  • the above-mentioned human-computer interaction interface may be a basic service function in the mobile phone 100, or an application program (APP) installed in the mobile phone 100.
  • APP application program
  • an "air mouse” APP may be installed in the mobile phone 100, and the user may complete the above-mentioned custom settings on the APP interface.
  • the "air mouse” APP is used to enable the mobile phone 100 to implement various functions of a traditional mouse, but it does not need to work in a fixed position or a fixed platform.
  • the method for controlling the display of a large-screen device in the embodiment of the present application may further include:
  • the mobile phone 100 receives the user's first operation.
  • S604 in FIG. 6 then becomes: the mobile phone 100 sends the second information to the laptop computer 200 in response to the first operation, that is, S607 in Figure 9.
  • the mobile phone 100 sending the second information to the laptop computer 200 according to the first operation may include: the mobile phone 100 determines, according to the first operation, the control instruction corresponding to the first operation; then the mobile phone 100 sends the second information to the laptop computer 200.
  • the second information includes a control instruction corresponding to the first operation.
  • the correspondence between different operations and different control commands can be customized by the user.
  • the user can customize the correspondence between different operations and control commands on the above-mentioned human-computer interaction interface.
  • the first operation may be an operation performed by the user on the human-computer interaction interface of the mobile phone 100.
  • the mobile phone 100 may determine the control instruction identified by a specific first operation according to the custom-set correspondence between different operations and different control instructions. For example, the user can click 1008 in Figure 10 to perform the above-mentioned custom settings.
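The user-customizable correspondence between operations and control instructions described above can be sketched as a simple dictionary lookup with user overrides. The gesture names and command strings below are hypothetical, not taken from the patent.

```python
# Hypothetical default bindings from user operations to control instructions.
DEFAULT_BINDINGS = {
    "tap": "CLICK",
    "double_tap": "DOUBLE_CLICK",
    "swipe_left": "PAGE_FORWARD",
    "swipe_right": "PAGE_BACK",
    "long_press_volume": "TOGGLE_BRUSH",
}

def control_instruction(operation, bindings=None):
    """Resolve an operation to a control instruction; user-defined
    bindings override the defaults. Returns None if unbound."""
    table = dict(DEFAULT_BINDINGS)
    if bindings:
        table.update(bindings)  # user customisation wins
    return table.get(operation)
```

The second information sent to the laptop computer 200 would then carry the resolved instruction together with the first position.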
  • the first operation may also be an operation of the user pressing the volume key of the mobile phone 100 (1010 in FIG. 10) or an operation of the user pinching the side frame of the mobile phone 100.
  • the mobile phone 100 detects through the pressure sensor that the pressure applied to a certain position of the side frame is greater than F 0 and the duration is greater than T 0 .
  • the mobile phone 100 determines the corresponding control instruction based on the detected conditions.
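The squeeze detection described above (pressure greater than F 0 sustained longer than T 0) can be sketched as a small state machine. The threshold values and the returned command name are illustrative assumptions.

```python
class SqueezeDetector:
    """Emits a control instruction once the pressure on the side frame
    exceeds f0 continuously for at least t0 seconds (example thresholds)."""

    def __init__(self, f0=3.0, t0=0.5):
        self.f0, self.t0 = f0, t0
        self._start = None  # time when the press first exceeded f0

    def update(self, force, t):
        if force > self.f0:
            if self._start is None:
                self._start = t
            elif t - self._start >= self.t0:
                return "SWITCH_POINTER_TO_BRUSH"  # hypothetical mapping
        else:
            self._start = None  # pressure released: reset
        return None
```

The detector would be fed samples from the pressure sensor, and the returned instruction would be carried in the second information sent to the laptop computer 200.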
  • the human-computer interaction interface may include at least one of the following virtual buttons: a left-handed/right-handed interface switching button (1006 in Figure 10), a pointer/brush switching button (1007 in Figure 10), One-key play/one-key exit button (1001 in Figure 10), personalized setting button (1008 in Figure 10).
  • with the human-computer interaction interface, the basic functions of the "mouse" can be realized, for example: one-key slideshow play/exit, slideshow page turning, drawing, confirming, playing, opening files, writing, etc. The "mouse" can also be configured through the human-computer interaction interface, for example: click 1008 in Figure 10 to configure the display style of the laser pointer, click 1005 in Figure 10 to configure the laser pointer position control sensitivity, and click 1004 in Figure 10 to configure the interface layout.
  • the method for controlling the display of a large-screen device of the present application may further include:
  • in response to receiving the user's first instruction, the mobile phone 100 sends the first information to the laptop computer 200.
  • the first information is used to instruct the laptop computer 200 to display the laser pointer at the preset position.
  • the preset position is the center position of the display screen of the laptop computer 200.
  • the first information includes a 2D position realignment instruction, which is used to instruct the laptop computer 200 to control the laser pointer to display at the center of the display screen of the laptop computer 200 according to the 2D position realignment instruction.
  • the human-computer interaction interface may also include a 2D position realignment button (1002 in FIG. 10). The user can click the button through the mobile phone 100 to instruct the laptop computer 200 to perform 2D position realignment.
  • the laptop computer 200 displays the laser pointer at the preset position according to the first information.
  • the position at which the mobile phone 100 currently points on the display screen of the laptop computer 200 is then the center position of the display screen of the laptop computer 200.
  • the current posture of the mobile phone 100 (that is, the above-mentioned first posture) is the initial posture, wherein the yaw angle and pitch angle of the initial posture can both be regarded as zero.
  • the first position is determined according to the attitude angle of the mobile phone 100, that is, S603 can be implemented through the following steps:
  • Step 1: The mobile phone 100 takes the quaternion q 0 corresponding to the initial posture as the starting point and the quaternion q i corresponding to the current posture as the end point, and calculates the relative rotation matrix C of the quaternion q i with respect to the quaternion q 0 .
  • specifically, the mobile phone 100 determines the rotation quaternion from the quaternion q 0 corresponding to the result of the nine-axis fusion algorithm for the initial posture and the quaternion q i corresponding to the result of the nine-axis fusion algorithm for the current posture, and converts the rotation quaternion into the relative rotation matrix C.
  • Step 2 The mobile phone 100 determines the second attitude according to the initial yaw angle, the initial pitch angle and the relative rotation matrix C.
  • the initial yaw angle is the yaw angle of the mobile phone 100 relative to the ground coordinate system when the mobile phone 100 is in the initial attitude
  • the initial pitch angle is the pitch angle of the mobile phone 100 relative to the ground coordinate system when the mobile phone 100 is in the initial attitude.
  • Step 3 The mobile phone 100 determines the first position according to the second posture.
  • the initial posture corresponds to the center position of the display screen of the laptop computer 200
  • the 3D spatial posture is projected onto the 2D display screen to obtain the display position of the laser pointer, yielding laser pointer control with higher accuracy and higher sensitivity.
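Steps 1-3 above can be sketched with quaternion arithmetic as below. The (w, x, y, z) quaternion layout, the direction of the relative rotation q_rel = q i · conj(q 0), and the angle-extraction formulas follow common conventions and are assumptions; the patent does not spell them out.

```python
import math

def quat_conj(q):
    """Conjugate of a unit quaternion (w, x, y, z), i.e. its inverse."""
    w, x, y, z = q
    return (w, -x, -y, -z)

def quat_mul(a, b):
    """Hamilton product of two quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def relative_yaw_pitch(q0, qi):
    """Yaw and pitch (radians) of the rotation taking the initial
    posture q0 to the current posture qi."""
    w, x, y, z = quat_mul(qi, quat_conj(q0))
    yaw = math.atan2(2 * (w*z + x*y), 1 - 2 * (y*y + z*z))
    pitch = math.asin(max(-1.0, min(1.0, 2 * (w*y - z*x))))
    return yaw, pitch
```

Because only the relative rotation matters, the initial posture can be an arbitrary posture of the phone, which is exactly what makes the 2D position realignment possible.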
  • the left and right hand recognition algorithms can be integrated in the mobile phone 100, so that the human-computer interaction interface of the mobile phone 100 follows the adaptive layout of the left and right hands, and improves the user operation experience.
  • the user can switch the left-hand/right-hand interface by clicking 1006 in Figure 10.
  • the mobile terminal (such as the mobile phone 100) includes hardware structures and/or software modules corresponding to each function.
  • the present application can be implemented in the form of hardware or a combination of hardware and computer software. Whether a certain function is executed by hardware or computer software-driven hardware depends on the specific application and design constraint conditions of the technical solution. Professionals and technicians can use different methods for each specific application to implement the described functions, but such implementation should not be considered beyond the scope of this application.
  • the embodiments of the present application may divide the mobile terminal into functional modules.
  • each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or software functional modules. It should be noted that the division of modules in the embodiments of the present application is illustrative, and is only a logical function division, and there may be other division methods in actual implementation.
  • the mobile terminal may include a sensor module 1210, an analysis module 1220, and a sending module 1230.
  • the sensor module 1210 can be used to support the mobile terminal to perform the above step S601
  • the analysis module 1220 can be used to support the mobile terminal to perform the above steps S602 and S603
  • the sending module 1230 can be used to support the mobile terminal to perform the above steps S604, S607 and S608, and/or other processes used in the techniques described herein.
  • the mobile terminal may further include a detection module 1240, which is used to support the mobile terminal to perform the above step S606, and/or other processes used in the technology described herein.
  • the analysis module 1220 may be the processor 410 shown in FIG. 4. It can implement or execute various exemplary logical blocks, modules and circuits described in conjunction with the disclosure of this application.
  • the processor may also be a combination of computing functions, for example, a combination of one or more microprocessors, a combination of digital signal processing (DSP) and a microprocessor, and so on.
  • DSP digital signal processing
  • the above mobile terminal may also include a radio frequency circuit.
  • the mobile terminal can receive and send wireless signals through a radio frequency circuit.
  • the radio frequency circuit includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like.
  • the radio frequency circuit can also communicate with other devices through wireless communication.
  • the wireless communication can use any communication standard or protocol, including but not limited to global system for mobile communications, general packet radio service, code division multiple access, broadband code division multiple access, long-term evolution, email, short message service, etc.
  • the computer program product includes one or more computer instructions.
  • the computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable devices.
  • the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium.
  • the computer instructions may be transmitted from a website, computer, server, or data center.
  • the computer-readable storage medium may be any available medium that can be accessed by a computer or a data storage device such as a server or a data center integrated with one or more available media.
  • the usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, and a magnetic tape), an optical medium (for example, a DVD), or a semiconductor medium (for example, a solid state disk (SSD)).
  • the steps of the method or algorithm described in the embodiments of the present application may be implemented in a hardware manner, or may be implemented in a manner in which a processor executes software instructions.
  • Software instructions can be composed of corresponding software modules, which can be stored in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, a hard disk, a mobile hard disk, a CD-ROM, or any other form of storage medium known in the art.
  • An exemplary storage medium is coupled to the processor, so that the processor can read information from the storage medium and can write information to the storage medium.
  • the storage medium may also be an integral part of the processor.
  • the processor and the storage medium may be located in the ASIC.
  • the ASIC may be located in the detection device.
  • the processor and the storage medium may also exist as separate components in the detection device.
  • the disclosed user equipment and method may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the modules or units is only a logical function division.
  • there may be other division methods; for example, multiple units or components may be combined or integrated into another device, or some features may be omitted or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, devices or units, and may be in electrical, mechanical or other forms.
  • the units described as separate parts may or may not be physically separate.
  • the parts displayed as units may be one physical unit or multiple physical units; that is, they may be located in one place or distributed to multiple different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • if the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a readable storage medium.
  • on this understanding, the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solutions, can be embodied in the form of a software product, which is stored in a storage medium.
  • a device which may be a single-chip microcomputer, a chip, etc.
  • a processor
  • the aforementioned storage media include: a USB flash drive, a mobile hard disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, an optical disk, or other media that can store program code.

Abstract

This application discloses a method for controlling the display of a large-screen device, a mobile terminal, and a first system, relating to the computer field; an existing portable device can be used to implement diverse control of the display of a large-screen device, improving user experience. An existing mobile terminal (such as a mobile phone) can receive the user's control of the spatial posture of the mobile terminal, calculate from that spatial posture the corresponding position on the display screen of the large-screen device, and, by sending control information to the large-screen device, control the display screen of the large-screen device to make responses at the corresponding position, such as displaying the laser pointer or moving the laser pointer.

Description

Method for controlling display of large-screen device, mobile terminal, and first system
This application claims priority to the Chinese patent application No. 201910693432.8, entitled "Method for controlling display of large-screen device, mobile terminal, and first system", filed with the State Intellectual Property Office on July 30, 2019, the entire contents of which are incorporated herein by reference.
Technical Field
The embodiments of this application relate to the computer field, and in particular to a method for controlling the display of a large-screen device through a mobile terminal, a mobile terminal, and a first system.
Background
Nowadays, business exhibitions, seminars, large road shows, and new-product launch events are held more and more widely. At such events, a speaker usually needs to give a presentation to the audience, for example by playing PowerPoint (PPT) slides or a keynote on a large screen. The speaker uses a laser pen to shine a visible laser dot at a certain position on the large screen to point the audience to the content currently being presented.
However, a laser pen only has general presentation functions such as position indication and page turning, and cannot be used to complete operations such as opening a file or underlining. In addition, a laser pen needs frequent battery replacement and must have an unobstructed line of sight during use, which makes it very inconvenient.
Summary
The embodiments of this application provide a method for controlling the display of a large-screen device, a mobile terminal, and a first system, so that an existing portable device can be used to implement diverse control of the display of a large-screen device, improving user experience.
To achieve the above objective, the embodiments of this application adopt the following technical solutions:
According to a first aspect, a method for controlling the display of a large-screen device is provided. The method is applied to a mobile terminal that includes multiple motion sensors, the multiple motion sensors including at least an acceleration sensor, a gyroscope sensor, and a magnetic sensor. A communication connection is established between the mobile terminal and the large-screen device, and the mobile terminal is used to control a laser pointer displayed on the large-screen device. The method includes: the mobile terminal collects motion data of the mobile terminal through the multiple motion sensors; the mobile terminal determines the spatial posture of the mobile terminal from the motion data using a nine-axis fusion algorithm; the mobile terminal determines a first position on the display screen of the large-screen device according to the spatial posture of the mobile terminal; and the mobile terminal controls the large-screen device to make a first response at the first position, the first response including displaying the laser pointer at the first position.
With the technical solution provided by the first aspect, a user can use an existing mobile terminal and control its posture; after the mobile terminal calculates, from its spatial posture, the corresponding position on the display screen of the large-screen device, it sends control information to the large-screen device to control the display screen of the large-screen device to make responses at the corresponding position, such as displaying or moving the laser pointer.
In a possible implementation, the spatial posture is used to identify at least the following information: the yaw angle ψ, pitch angle θ, and roll angle φ of the mobile terminal relative to the ground coordinate system. From the collected yaw, pitch, and roll angles of the mobile terminal, the corresponding position of the spatial posture on the display screen of the large-screen device can be calculated.
In a possible implementation, the mobile terminal determining the first position on the display screen of the large-screen device according to the spatial posture includes: the mobile terminal determines the coordinates (x' i , y' i ) of the first position on the display screen of the large-screen device using the following formula:
[formula image PCTCN2020102191-appb-000002, not reproduced in the source]
where W is the width of the display screen of the large-screen device and s is the sensitivity of the mobile terminal. With this calculation method, the mobile terminal can determine, from its spatial posture, the position on the display screen of the large-screen device corresponding to that spatial posture.
In a possible implementation, before the mobile terminal collects its motion data through the multiple motion sensors, the method further includes: when the mobile terminal is in a first posture, receiving a first instruction from the user, the first instruction being used to instruct the mobile terminal to perform 2D position realignment with the large-screen device; in response to the first instruction, the mobile terminal sends first information to the large-screen device, the first information being used to instruct the large-screen device to display the laser pointer at a preset position, where the first posture is any posture of the mobile terminal. The mobile terminal can receive the user's realignment instruction and realign the position of the laser pointer on the display screen of the large-screen device, which improves the accuracy of the laser pointer's display position and the user experience.
In a possible implementation, the mobile terminal determining the first position on the display screen of the large-screen device according to the spatial posture includes: the mobile terminal takes the quaternion q 0 corresponding to the first posture as the starting point and the quaternion q i corresponding to the spatial posture as the end point, and calculates the relative rotation matrix C of q i with respect to q 0 when the mobile terminal changes from the first posture to the spatial posture; the mobile terminal determines a second posture from the initial yaw angle, the initial pitch angle, and the relative rotation matrix C; and the mobile terminal determines the first position from the second posture, where the initial yaw angle is the yaw angle of the mobile terminal relative to the ground coordinate system when the mobile terminal is in the first posture, and the initial pitch angle is the pitch angle of the mobile terminal relative to the ground coordinate system when the mobile terminal is in the first posture. By calculating the quaternion corresponding to the posture change and computing the transformed spatial posture from this quaternion, realignment can be performed, improving the accuracy of the laser pointer's display position and the user experience.
In a possible implementation, the mobile terminal controlling the large-screen device to make the first response at the first position includes: the mobile terminal sends second information to the large-screen device, where the second information is used to indicate the first position and includes control information used to instruct the large-screen device to make the first response at the first position. The mobile terminal can control the response of the large-screen device by sending it specific control information and the position at which it expects the large-screen device to respond; this approach is stable and provides a good user experience.
In a possible implementation, the mobile terminal controlling the large-screen device to make the first response at the first position includes: the mobile terminal sends second information to a service device, where the second information is used to indicate the first position and includes control information used to instruct the service device to control the large-screen device to make the first response at the first position. The mobile terminal can control the large-screen device through the service device, which suits different application scenarios, is convenient to use, and provides a good user experience.
In a possible implementation, before the mobile terminal controls the large-screen device to make the first response at the first position, the method further includes: the mobile terminal receives a first operation from the user; and the mobile terminal controlling the large-screen device to make the first response at the first position includes: in response to the first operation, the mobile terminal controls the large-screen device to make the first response at the first position. The mobile terminal can receive a preset user operation representing a certain control instruction and control the large-screen device to respond accordingly, supporting diverse control of the large-screen device with convenient operation and a good user experience.
In a possible implementation, the first response includes at least any one of the following: switching the laser pointer to a laser brush, turning a page, opening a file, drawing a line, playing, stopping playback, or adjusting brightness. The mobile terminal can control the large-screen device to respond according to the user's operation, supporting diverse control of the large-screen device with convenient operation and a good user experience.
In a possible implementation, the first operation is a user operation received while the display screen of the mobile terminal displays a human-computer interaction interface. The mobile terminal can receive a preset user operation on the human-computer interaction interface and control the large-screen device to respond accordingly, supporting diverse control of the large-screen device with convenient operation and a good user experience.
In a possible implementation, the first operation is an operation performed by the user on the human-computer interaction interface, where the human-computer interaction interface includes at least one of the following virtual buttons: a left-hand/right-hand interface switching button, a pointer/brush switching button, a 2D position realignment button, a one-key play/one-key exit button, and a personalized settings button. The mobile terminal can receive a preset user operation on the human-computer interaction interface and control the large-screen device to respond accordingly, supporting diverse control of the large-screen device with convenient operation and a good user experience.
In a possible implementation, the first operation further includes: a click/double-click/long-press by the user on a blank area of the human-computer interaction interface, a touch-and-slide by the user on a blank area of the human-computer interaction interface, an operation of the user pressing a volume key of the mobile terminal, or an operation of the user squeezing the side frame of the mobile terminal. The mobile terminal can receive preset user operations on the human-computer interaction interface or at other positions on the mobile terminal and control the large-screen device to respond accordingly, supporting diverse control of the large-screen device with convenient operation and a good user experience.
According to a second aspect, a mobile terminal is provided. The mobile terminal includes a sensor module that includes multiple motion sensors, the multiple motion sensors including at least an acceleration sensor, a gyroscope sensor, and a magnetic sensor. A communication connection is established between the mobile terminal and a large-screen device, and the mobile terminal is used to control a laser pointer displayed on the large-screen device. The sensor module is used to collect motion data of the mobile terminal. The mobile terminal further includes: an analysis module, used to determine the spatial posture of the mobile terminal from the motion data using a nine-axis fusion algorithm, and to determine a first position on the display screen of the large-screen device according to the spatial posture of the mobile terminal; and a sending module, used to send control information to the large-screen device for controlling the large-screen device to make a first response at the first position, the first response including displaying the laser pointer at the first position.
With the technical solution provided by the second aspect, a user can use an existing mobile terminal and control its posture; after the mobile terminal calculates, from its spatial posture, the corresponding position on the display screen of the large-screen device, it sends control information to the large-screen device to control the display screen of the large-screen device to make responses at the corresponding position, such as displaying or moving the laser pointer.
在一种可能的实现方式中,空间姿态至少用于标识以下信息:移动终端相对于地面坐标系的偏航角
Figure PCTCN2020102191-appb-000003
俯仰角θ和横滚角φ。动终端可以通过采集到的该移动终端的偏航角、俯仰和横滚角,计算得到该空间姿态对应到大屏设备显示屏对应的位置。
在一种可能的实现方式中,分析模块根据空间姿态,确定所述大屏设备的显示屏上的第一位置,包括:该分析模块采用以下确定所述第一位置在大屏设备的显示屏上的坐标(x' i,y' i):
Figure PCTCN2020102191-appb-000004
其中,W为大屏设备显示屏的宽度,s为移动终端的灵敏度。采用该计算方法,移动终端可以根据其空间姿态确定该空间姿态对应到大屏设备显示屏对应的位置。
In a possible implementation, the mobile terminal further includes a detection module, used, before the sensor module collects the motion data of the mobile terminal through the multiple motion sensors and while the mobile terminal is in a first posture, to detect a first instruction from the user, the first instruction being used to instruct the mobile terminal to perform 2D position realignment with the large-screen device; the sending module is further used, in response to the first instruction, to send first information to the large-screen device, the first information being used to instruct the large-screen device to display the laser pointer at a preset position, where the first posture is any posture of the mobile terminal. The mobile terminal can receive the user's realignment instruction and realign the position of the laser pointer on the display screen of the large-screen device, which improves the accuracy of the laser pointer's display position and the user experience.
In a possible implementation, the analysis module determining the first position on the display screen of the large-screen device according to the spatial posture includes: the analysis module takes the quaternion q 0 corresponding to the first posture as the starting point and the quaternion q i corresponding to the spatial posture as the end point, and calculates the relative rotation matrix C of q i with respect to q 0 when the mobile terminal changes from the first posture to the spatial posture; the analysis module determines a second posture from the initial yaw angle, the initial pitch angle, and the relative rotation matrix C; and the mobile terminal determines the first position from the second posture, where the initial yaw angle is the yaw angle of the mobile terminal relative to the ground coordinate system when the mobile terminal is in the first posture, and the initial pitch angle is the pitch angle of the mobile terminal relative to the ground coordinate system when the mobile terminal is in the first posture. By calculating the quaternion corresponding to the posture change and computing the transformed spatial posture from this quaternion, realignment can be performed, improving the accuracy of the laser pointer's display position and the user experience.
In a possible implementation, the sending module sending the control information to the large-screen device includes: the sending module sends the control information to a service device, to instruct the large-screen device to make the first response at the first position under the control of the service device. The mobile terminal can control the large-screen device through the service device, which suits different application scenarios, is convenient to use, and provides a good user experience.
In a possible implementation, the mobile terminal further includes: a detection module, used to detect a first operation of the user before the sending module sends the control information to the large-screen device; the sending module sending the control information to the large-screen device includes: in response to the first operation, the sending module sends the control information to the large-screen device. The mobile terminal can receive a preset user operation representing a certain control instruction and control the large-screen device to respond accordingly, supporting diverse control of the large-screen device with convenient operation and a good user experience.
In a possible implementation, the first response includes at least any one of the following: switching the laser pointer to a laser brush, turning a page, opening a file, drawing a line, playing, stopping playback, or adjusting brightness. The mobile terminal can control the large-screen device to respond according to the user's operation, supporting diverse control of the large-screen device with convenient operation and a good user experience.
In a possible implementation, the first operation is a user operation received while the display screen of the mobile terminal displays a human-computer interaction interface. The mobile terminal can receive a preset user operation on the human-computer interaction interface and control the large-screen device to respond accordingly, supporting diverse control of the large-screen device with convenient operation and a good user experience.
In a possible implementation, the first operation is an operation performed by the user on the human-computer interaction interface, where the human-computer interaction interface includes at least one of the following virtual buttons: a left-hand/right-hand interface switching button, a pointer/brush switching button, a 2D position realignment button, a one-key play/one-key exit button, and a personalized settings button. The mobile terminal can receive a preset user operation on the human-computer interaction interface and control the large-screen device to respond accordingly, supporting diverse control of the large-screen device with convenient operation and a good user experience.
In a possible implementation, the first operation further includes: a click/double-click/long-press by the user on a blank area of the human-computer interaction interface, a touch-and-slide by the user on a blank area of the human-computer interaction interface, an operation of the user pressing a volume key of the mobile terminal, or an operation of the user squeezing the side frame of the mobile terminal. The mobile terminal can receive preset user operations on the human-computer interaction interface or at other positions on the mobile terminal and control the large-screen device to respond accordingly, supporting diverse control of the large-screen device with convenient operation and a good user experience.
According to a third aspect, a mobile terminal is provided, including: a memory for storing computer program code, the computer program code including instructions; and a radio frequency unit for transmitting and receiving radio signals. When the one or more computer programs stored in the memory are executed by a processor, the mobile terminal performs the method for controlling the display of a large-screen device in any possible implementation of the first aspect.
According to a fourth aspect, a first system is provided, including a mobile terminal and a large-screen device, the mobile terminal being used to control the large-screen device to implement the method for controlling the display of a large-screen device in any possible implementation of the first aspect.
In a possible implementation, the first system further includes a service device, used to implement the method for controlling the display of a large-screen device in any corresponding possible implementation of the first aspect.
According to a fifth aspect, a chip system is provided, including a processor and a memory in which instructions are stored; when the instructions are executed by the processor, the method for controlling the display of a large-screen device in any possible implementation of the first aspect is implemented. The chip system may consist of chips, or may include chips and other discrete devices.
According to a sixth aspect, a computer-readable storage medium is provided, on which computer-executable instructions are stored; when the computer-executable instructions are executed by a processor, the method for controlling the display of a large-screen device in any possible implementation of the first aspect is implemented.
According to a seventh aspect, a computer program product is provided which, when run on a computer, causes the method for controlling the display of a large-screen device in any possible implementation of the first aspect to be performed. For example, the computer may be at least one storage node.
Brief Description of the Drawings
FIG. 1A is a first example application scenario of the method for controlling the display of a television through a mobile phone according to an embodiment of this application;
FIG. 1B is an example application scenario of the method for controlling the display of a laptop computer through a mobile phone according to an embodiment of this application;
FIG. 2 is a second example application scenario of the method for controlling the display of a television through a mobile phone according to an embodiment of this application;
FIG. 3 is an example application scenario of a method for controlling the display of a projection device according to an embodiment of this application;
FIG. 4 is a schematic diagram of the hardware structure of a mobile phone according to an embodiment of this application;
FIG. 5 is a schematic diagram of the spatial posture of a mobile phone according to an embodiment of this application;
FIG. 6 is a first flowchart of a method for controlling the display of a large-screen device through a mobile terminal according to an embodiment of this application;
FIG. 7 is a schematic diagram of the relative positions of a mobile phone and a laptop computer according to an embodiment of this application;
FIG. 8 is a schematic diagram of a way of determining the first position according to an embodiment of this application;
FIG. 9 is a second flowchart of a method for controlling the display of a large-screen device according to an embodiment of this application;
FIG. 10 is an example human-computer interaction interface according to an embodiment of this application;
FIG. 11 is a third flowchart of a method for controlling the display of a large-screen device according to an embodiment of this application;
FIG. 12 is a schematic structural diagram of a mobile terminal according to an embodiment of this application;
FIG. 13 is a schematic structural diagram of another mobile terminal according to an embodiment of this application.
Detailed Description
The embodiments of this application provide a method, electronic device, and system for controlling the display of a large-screen device. The electronic device is a mobile terminal or a large-screen device, where the large-screen device includes a display screen. For example, the large-screen device may be a television, a personal computer (PC), a tablet, a netbook, a projection device, etc. The mobile terminal can directly control the content displayed on the display screen of the large-screen device and the form of that display. For example, in an embodiment of this application the large-screen device is a projection device including a projector and a screen: the content displayed on the display screen of the mobile terminal is projected directly onto the screen by the projector, and the mobile terminal can directly control the projector, controlling the content and display form that the projector projects on the screen.
Alternatively, the mobile terminal can control the content and display form on the display screen by controlling a service device. For example, a projector is connected to a laptop computer and projects the interface displayed on the laptop's display screen onto the screen; the mobile terminal can then control, by controlling the laptop computer, the content and display form that the projector projects on the screen.
It should be noted that the large-screen device in the embodiments of this application may also be other devices in human-computer interaction scenarios, such as a motion-sensing game console. The embodiments of this application do not limit the specific category or form of the large-screen device.
The mobile terminal in the embodiments of this application may be a handheld device such as a smartphone, a tablet, or a palmtop computer, or may be another type of electronic device such as a wearable device (for example, a smart watch), a portable multimedia player (PMP), a dedicated media player, or an AR (augmented reality)/VR (virtual reality) device.
The method of controlling a large-screen device in the embodiments of this application includes the mobile terminal controlling the large-screen device directly, or the mobile terminal controlling the large-screen device through another device (such as a service device).
A communication connection is established between the mobile terminal and the large-screen device, through which the mobile terminal can control the large-screen device; the communication connection may be wired or wireless.
For example, the mobile terminal, the large-screen device, and the service device may follow a wireless transmission protocol and transmit information through wireless transceivers. The transmitted information includes, but is not limited to, content data to be displayed and control instructions. The wireless transmission protocol may include, but is not limited to, the Bluetooth (BT) transmission protocol or the Wireless Fidelity (WiFi) transmission protocol, and the wireless transceivers include, but are not limited to, Bluetooth or WiFi transceivers. Information transmission between the mobile terminal and the large-screen device is achieved through wireless pairing. Alternatively, information transmission between the mobile terminal, the large-screen device, and the service device may be achieved through a wired connection, for example a data cable connection. The following embodiments of this application take only wireless-protocol information transmission between the mobile terminal, large-screen device, and service device as an example.
It should be noted that the embodiments of this application do not limit the specific human-computer interaction application scenario either. The scenario may be, as described in the background, business exhibitions, seminars, large road shows, and new-product launches; it may also be a motion-sensing game scenario such as "fruit slicing" or "shooting", or a scenario such as televised teaching or video conferencing. As noted above, the mobile terminal may control the display of the large-screen device directly or by controlling another device; this application does not limit the specific scenario.
FIG. 1A shows a first example application scenario of the method for controlling the display of a television through a mobile phone according to an embodiment of this application. As shown in FIG. 1A, take the mobile terminal being the mobile phone 100 and the large-screen device being the television 300 as an example. The mobile phone 100 can directly project the interface displayed on its display screen onto the display screen of the television 300; that is, the display screen of the television 300 can synchronously display the content on the display screen of the mobile phone 100. For example, suppose mobile office software installed in the mobile phone 100 presents a PPT on the display screen of the television 300 through technologies such as "shared WiFi". The user controls through the mobile phone 100 the position of the laser pointer on the display screen of the television 300, and controls the television 300 to produce corresponding displays and effects at the corresponding position, such as coloring, underlining, or ticking.
Alternatively, the mobile phone can also synchronously display the content of the display screen of the mobile phone 100 on the display screen of a laptop computer through technologies such as "shared WiFi".
As shown in FIG. 1B, take the mobile terminal being the mobile phone 100 and the large-screen device being the laptop computer 200 as an example. The laptop computer 200 may serve only as a display device, its display screen synchronously showing the content on the display screen of the mobile phone 100. In a possible embodiment, the laptop computer 200 may also serve as a playback device, playing content on the laptop computer 200. For example, a motion-sensing "fruit slicing" game application (APP) is installed in the laptop computer 200, and the user uses the mobile phone 100 as a motion-sensing "mouse", with both the mobile phone 100 and the laptop computer 200 connected to WiFi. The user's "fruit slicing" process may be: the laser pointer is displayed at the initial position of the "fruit slicing" game interface on the laptop computer 200 (for example, the center of the game interface); the user holds the mobile phone 100 in the right hand and slides it in any direction in the air, and the laser pointer slides along the corresponding trajectory; the user moves the laser pointer near "watermelon" A and continues sliding the right hand through the air in a slicing motion, and correspondingly "watermelon" A in the game interface shows the simulated visual effect of being cut open.
Alternatively, the mobile phone 100 can also control the display on the screen of a projection device by controlling the laptop computer 200.
FIG. 2 shows a second example application scenario of the method for controlling the display of a television through a mobile phone according to an embodiment of this application. As shown in FIG. 2, the television 300 serves only as a display device showing the interface displayed on the display screen of the laptop computer 200; in this situation, the television 300 can also be understood as a projection device. For example, in a small meeting of 3-4 people, the presenter shows the PPT presentation interface of the laptop computer 200 on the display screen of the television 300. The presenter controls the laptop computer 200 through the mobile phone 100, controlling the display effects of the PPT presentation interface shown on the display screen of the television 300, such as coloring, underlining, or ticking.
Alternatively, as shown in FIG. 3, the projection device includes the projector 400 and the screen 500. For example, in a large conference of hundreds of people or at a new-product launch, the presenter projects the PPT presentation interface of the laptop computer 200 onto the screen 500 through the projector 400. The presenter controls the laptop computer 200 through the mobile phone 100, so that on the display interface projected onto the screen 500 through the projector 400, the position of the laser pointer can follow the posture changes of the mobile phone 100; and the mobile phone 100 controls the laptop computer 200 so that corresponding displays and effects are produced at the corresponding positions of the display interface projected by the projector 400 on the screen 500.
In the examples of FIG. 1B and FIG. 2, the mobile phone 100 and the laptop computer 200 may both be connected to WiFi. The laptop computer 200 and the television 300 in FIG. 2, and the projector 400 and the laptop computer 200 in FIG. 3, may be connected through the data cable 600 (for example, a High Definition Multimedia Interface (HDMI) cable). Alternatively, the television 300 and the projector 400 may also connect to WiFi and receive display content data and control instructions from the laptop computer 200 over WiFi.
Based on the examples in FIG. 1A, FIG. 1B, FIG. 2, and FIG. 3: in the method of controlling a large-screen device through a mobile terminal in the embodiments of this application, the mobile terminal, such as the mobile phone 100, can have the basic functions of a laser pen such as pointing and page turning. For example, the service device or the large-screen device can determine the display position of the laser pointer from the 2-dimensional (2D) position (also called the planar position) on the display screen of the large-screen device (such as the display screen of the television 300 in FIG. 1A or FIG. 2, or the screen 500 in FIG. 3) to which the 3-dimensional (3D) position (also called the spatial position) of the hand-held mobile phone 100 is mapped, to point the audience to the content being presented; and the mobile phone 100 can send a "page turn" instruction to the service device or the large-screen device. The mobile phone 100 can also have the functions of a wireless mouse; for example, the user can send instructions such as "highlight", "double-click to open", "play", and "underline" through the mobile phone 100 to the service device or the large-screen device, to control the display at the laser pointer position on the display screen of the large-screen device.
The following uses a mobile phone as an example to introduce the structure of the mobile terminal in this application. As shown in FIG. 4, the mobile phone 100 may include a processor 410, an external memory interface 420, an internal memory 421, a universal serial bus (USB) interface 430, a charging management module 440, a power management module 441, a battery 442, antenna 1, antenna 2, a mobile communication module 450, a wireless communication module 460, an audio module 470, a speaker 470A, a receiver 470B, a microphone 470C, a sensor module 480, a button 490, a motor 491, an indicator 492, a camera 493, a display screen 494, a subscriber identification module (SIM) card interface 495, and the like. The sensor module 480 may include a pressure sensor 480A, a gyroscope sensor 480B, a magnetic sensor 480C, an acceleration sensor 480D, a distance sensor 480E, a fingerprint sensor 480F, a touch sensor 480G, an ambient light sensor 480H, and the like.
It can be understood that the structure illustrated in this embodiment of the present invention does not constitute a specific limitation on the mobile phone 100. In other embodiments of this application, the mobile phone 100 may include more or fewer components than shown, combine some components, split some components, or arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
The processor 410 may include one or more processing units. For example, the processor 410 may include an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a flight controller, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. The different processing units may be independent devices or may be integrated into one or more processors.
A memory may also be provided in the processor 410 for storing instructions and data. In some embodiments, the memory in the processor 410 is a cache, which can hold instructions or data that the processor 410 has just used or uses cyclically. If the processor 410 needs the instructions or data again, it can call them directly from this memory, avoiding repeated accesses, reducing the waiting time of the processor 410, and thus improving system efficiency.
In some embodiments, the processor 410 may include one or more interfaces. The interfaces may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit sound (I2S) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a mobile industry processor interface (MIPI), a general-purpose input/output (GPIO) interface, a subscriber identity module (SIM) interface, and/or a universal serial bus (USB) interface, etc.
The I2C interface is a bidirectional synchronous serial bus including a serial data line (SDA) and a serial clock line (SCL). In some embodiments, the processor 410 may contain multiple groups of I2C buses, and may be coupled through different I2C bus interfaces to the touch sensor 480G, a charger, a flash, the camera 493, and so on, where there is at least one camera 493; the camera 493 may also be a camera rotatable by 360°. For example, the processor 410 may be coupled to the touch sensor 480G through the I2C interface, so that the processor 410 and the touch sensor 480G communicate through the I2C bus interface to implement the touch function of the mobile phone 100.
The USB interface 430 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type-C interface, etc. The USB interface 430 can be used to connect a charger to charge the mobile phone 100, to transfer data between the mobile phone 100 and peripheral devices, and also to connect other electronic devices such as AR devices.
It can be understood that the interface connection relationships between the modules illustrated in this embodiment of the present invention are only schematic and do not constitute a structural limitation on the mobile phone 100.
The charging management module 440 is used to receive charging input from a charger, which may be a wireless or wired charger. In some wired charging embodiments, the charging management module 440 may receive the charging input of a wired charger through the USB interface 430; in some wireless charging embodiments, the charging management module 440 may receive wireless charging input through the wireless charging coil of the mobile phone 100. While charging the battery 442, the charging management module 440 can also supply power to the electronic device through the power management module 441.
The power management module 441 is used to connect the battery 442, the charging management module 440, and the processor 410. The power management module 441 receives input from the battery 442 and/or the charging management module 440 and supplies power to the processor 410, the internal memory 421, the display screen 494, the camera 493, the wireless communication module 460, and so on. The power management module 441 can also be used to monitor parameters such as battery capacity, battery cycle count, and battery health (leakage, impedance). In some other embodiments, the power management module 441 may also be provided in the processor 410; in still other embodiments, the power management module 441 and the charging management module 440 may be provided in the same device.
The wireless communication function of the mobile phone 100 can be implemented through antenna 1, antenna 2, the mobile communication module 450, the wireless communication module 460, the modem processor, the baseband processor, and so on.
Antenna 1 and antenna 2 are used to transmit and receive electromagnetic wave signals. Each antenna in the mobile phone 100 can be used to cover a single or multiple communication frequency bands, and different antennas can be multiplexed to improve antenna utilization. For example, antenna 1 can be multiplexed as a diversity antenna of a wireless local area network; in some other embodiments, an antenna can be used in combination with a tuning switch.
The mobile communication module 450 can provide wireless communication solutions applied to the mobile phone 100, including 2G/3G/4G/5G. The mobile communication module 450 may include at least one filter, switch, power amplifier, low noise amplifier (LNA), etc. The mobile communication module 450 can receive electromagnetic waves through antenna 1, filter and amplify the received electromagnetic waves, and transmit them to the modem processor for demodulation; it can also amplify the signal modulated by the modem processor and convert it into electromagnetic waves radiated out through antenna 1. In some embodiments, at least some functional modules of the mobile communication module 450 may be provided in the processor 410; in some embodiments, at least some functional modules of the mobile communication module 450 may be provided in the same device as at least some modules of the processor 410.
The modem processor may include a modulator and a demodulator. The modulator modulates the low-frequency baseband signal to be sent into a medium/high-frequency signal, and the demodulator demodulates the received electromagnetic wave signal into a low-frequency baseband signal, which it then passes to the baseband processor. After being processed by the baseband processor, the low-frequency baseband signal is passed to the application processor, which outputs a sound signal through an audio device (not limited to the speaker 470A, the receiver 470B, etc.) or displays an image or video through the display screen 494. In some embodiments, the modem processor may be an independent device; in other embodiments, the modem processor may be independent of the processor 410 and provided in the same device as the mobile communication module 450 or other functional modules.
The wireless communication module 460 can provide wireless communication solutions applied to the mobile phone 100, including wireless local area networks (WLAN) (such as WiFi networks), Bluetooth (BT), global navigation satellite system (GNSS), frequency modulation (FM), near field communication (NFC), infrared (IR) technology, etc. The wireless communication module 460 may be one or more devices integrating at least one communication processing module. The wireless communication module 460 receives electromagnetic waves via antenna 2, frequency-modulates and filters the electromagnetic wave signals, and sends the processed signals to the processor 410; it can also receive signals to be sent from the processor 410, frequency-modulate and amplify them, and convert them into electromagnetic waves radiated out through antenna 2.
In some embodiments, antenna 1 of the mobile phone 100 is coupled to the mobile communication module 450 and antenna 2 is coupled to the wireless communication module 460, so that the mobile phone 100 can communicate with networks and other devices through wireless communication technologies. The wireless communication technologies may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include the global positioning system (GPS), the global navigation satellite system (GLONASS), the BeiDou navigation satellite system (BDS), the quasi-zenith satellite system (QZSS), and/or satellite based augmentation systems (SBAS).
手机100通过GPU,显示屏494,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏494和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器410可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏494用于显示图像,视频等。显示屏494包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极管(active-matrix organic light-emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),MiniLED,MicroLED,Micro-OLED,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,手机100可以包括1个或N个显示屏494,N为大于1的正整数。
手机100可以通过ISP,摄像头493,视频编解码器,GPU,显示屏494以及应用处理器等实现拍摄功能。
ISP用于处理摄像头493反馈的数据。例如,拍照时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将所述电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点,亮度,肤色进行算法优化。ISP还可以对拍摄场景的曝光,色温等参数优化。在一些实施例中,ISP可以设置在摄像头493中。
摄像头493用于捕获静态图像或视频。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,手机100可以包括1个或N个摄像头493,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。例如,当手机100在频点选择时,数字信号处理器用于对频点能量进行傅里叶变换等。
视频编解码器用于对数字视频压缩或解压缩。手机100可以支持一种或多种视频编解码器。这样,手机100可以播放或录制多种编码格式的视频,例如:动态图像专家组(moving picture experts group,MPEG)1,MPEG2,MPEG3,MPEG4等。
NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现手机100的智能认知等应用,例如:图像识别,人脸识别,语音识别,文本理解等。
外部存储器接口420可以用于连接外部存储卡,例如Micro SD卡,实现扩展手机100的存储能力。外部存储卡通过外部存储器接口420与处理器410通信,实现数据存储功能。例如将音乐,视频等文件保存在外部存储卡中。
内部存储器421可以用于存储计算机可执行程序代码,所述可执行程序代码包括指令。内部存储器421可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如声音播放功能,图像播放功能等)等。存储数据区可存储手机100使用过程中所创建的数据(比如音频数据,电话本等)等。此外,内部存储器421可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。处理器410通过运行存储在内部存储器421的指令,和/或存储在设置于处理器中的存储器的指令,执行手机100的各种功能应用以及数据处理。
手机100可以通过音频模块470,扬声器470A,受话器470B,麦克风470C以及应用处理器等实现音频功能。例如音乐播放,录音等。
音频模块470用于将数字音频信息转换成模拟音频信号输出,也用于将模拟音频输入转换为数字音频信号。音频模块470还可以用于对音频信号编码和解码。在一些实施例中,音频模块470可以设置于处理器410中,或将音频模块470的部分功能模块设置于处理器410中。
扬声器470A,也称“喇叭”,用于将音频电信号转换为声音信号。手机100可以通过扬声器470A进行语音播放或通知等。
受话器470B,也称“听筒”,用于将音频电信号转换成声音信号。当手机100接听电话或语音信息时,可以通过将受话器470B靠近人耳接听语音。
麦克风470C,也称“话筒”,“传声器”,用于将声音信号转换为电信号。当拨打电话或发送语音信息时,用户可以通过人嘴靠近麦克风470C发声,将声音信号输入到麦克风470C。手机100可以设置至少一个麦克风470C。在另一些实施例中,手机100可以设置两个麦克风470C,除了采集声音信号,还可以实现降噪功能。在另一些实施例中,手机100还可以设置三个,四个或更多麦克风470C,实现采集声音信号,降噪,还可以识别声音来源,实现定向录音功能等。
压力传感器480A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,可以通过压力传感器480A确定用户施加在手机100边框上的按压,进而方便手机100对该操作对应的控制指令进行响应。
陀螺仪传感器480B可以用于确定手机100的运动过程中的姿态。在一些实施例中,可以通过陀螺仪传感器480B确定手机100围绕三个轴(即,x,y和z轴)的角速度。
磁传感器480C包括霍尔传感器。在一些实施例中,可以通过磁传感器480C感应磁场强度来测量电流、位置、方向等物理参数。
加速度传感器480D可检测手机100在各个方向上(一般为三轴)加速度的大小。当手机100静止时可检测出重力的大小及方向。还可以用于识别手机100的姿态,应用于摄像头取景镜头切换等。
距离传感器480E,用于测量距离。手机100可以通过红外或激光测量距离。在一些拍摄场景中,手机100可以利用距离传感器480E测距以实现快速对焦。
环境光传感器480H用于感知环境光亮度。手机100可以根据感知的环境光亮度自适应调节显示屏494亮度。环境光传感器480H也可用于拍照时自动调节白平衡。环境光传感器480H还可以与接近光传感器配合工作。在一些实施例中,手机100可以根据环境光传感器480H感知的环境光亮度指示大屏设备调节大屏设备的显示屏亮度。
指纹传感器480F用于采集指纹。可以采用任何类型的感测技术,包括但不限于光学式、电容式、压电式或超声波传感技术等。手机100可以利用采集的指纹特性实现指纹解锁,访问应用锁,指纹拍照等。
触摸传感器480G,也称“触控器件”。触摸传感器480G(也称为触控面板)可以设置于显示屏494,由触摸传感器480G与显示屏494组成触摸屏,也称“触控屏”。触摸传感器480G用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触控事件类型。可以通过显示屏494提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器480G也可以设置于手机100的表面,与显示屏494所处的位置不同。
按键490包括开机键,音量键等。按键490可以是机械按键。也可以是触摸式按键。手机100可以接收按键输入,产生与手机100的用户设置以及功能控制有关的键信号输入。
马达491可以产生振动提示。马达491可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如拍照,音频播放等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏494不同区域的触摸操作,马达491也可对应不同的振动反馈效果。不同的应用场景(例如:时间提醒,接收信息,闹钟,游戏等)也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
指示器492可以是指示灯,可以用于指示充电状态,电量变化,也可以用于指示消息,未接来电,通知等。
SIM卡接口495用于连接SIM卡。SIM卡可以通过插入SIM卡接口495,或从SIM卡接口495拔出,实现和手机100的接触和分离。手机100可以支持1个或N个SIM卡接口,N为大于1的正整数。SIM卡接口495可以支持Nano SIM卡,Micro SIM卡,SIM卡等。同一个SIM卡接口495可以同时插入多张卡。所述多张卡的类型可以相同,也可以不同。SIM卡接口495也可以兼容不同类型的SIM卡。SIM卡接口495也可以兼容外部存储卡。手机100通过SIM卡和网络交互,实现通话以及数据通信等功能。在一些实施例中,手机100采用eSIM,即:嵌入式SIM卡。eSIM卡可以嵌在手机100 中,不能和手机100分离。
以下结合图4中的手机,具体介绍本申请实施例提供的控制大屏设备显示的方法。以下实施例中的方法均可以在具有上述硬件结构的移动终端或者具有类似结构的移动终端中实现。
本申请实施例中的控制大屏设备显示的方法的基本原理是:利用加速度传感器、陀螺仪传感器和磁传感器采集到的数据,通过九轴融合算法得到手机的实时空间姿态;将实时空间姿态映射为大屏设备显示屏上的2D位置,从而确定激光指针的显示位置;并控制大屏设备在所确定的激光指针显示位置做出某一响应。
其中,九轴融合算法是指通过加速度传感器、陀螺仪传感器和磁传感器,来获取物体姿态的方法。
在一种可能的实现方式中,手机100中可以集成有惯性测量单元(inertial measurement unit,IMU)。手机100可以通过融合IMU中的加速度传感器、陀螺仪传感器和磁传感器的数据获取物体姿态。其中,IMU是一种通过传感器组合(加速度传感器、陀螺仪传感器和磁传感器)来测量和报告速度、方向和重力的单元。
在本申请实施例的控制大屏设备显示的方法中,加速度传感器的工作原理是通过测量组件在某个轴向的受力情况判断手机100的平移方向和平移速度。其中,受力情况的表现形式为方向(X,Y,Z轴方向)和在对应方向的加速度大小。陀螺仪传感器的工作原理是通过测量三维坐标系(三维坐标系包括X轴,Y轴,Z轴)内陀螺转子的垂直轴与手机100之间的夹角,并计算角速度,通过夹角和角速度来判别物体在三维空间的运动姿态。磁传感器的原理跟指南针类似,可以测量出手机100与东南西北四个方向上的夹角。也就是说,加速度传感器测到“手机沿X轴走了多远”,陀螺仪传感器测到“手机100转了个身”,磁传感器测到“手机100向西运动”。由于加速度传感器、陀螺仪传感器和磁传感器均可以测量3轴的运动数据,因此通常称为“九轴传感器”。
本申请实施例中的九轴融合算法是充分利用上述“九轴传感器”的特征,融合计算手机100更加准确的实时空间姿态,即计算出手机100的空间坐标在地理坐标系中的(x_i,y_i,z_i)以及姿态角。其中,姿态角用于反映手机100相对于地面的姿态。姿态角包括手机100相对于地面坐标系的偏航角(yaw),俯仰角(pitch)和横滚角(roll)。
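九轴融合算法通常先解算出姿态四元数,再换算为上述姿态角。作为示意,下面的Python片段给出由单位四元数换算偏航角、俯仰角、横滚角的一种常见做法(采用Z-Y-X欧拉角约定;该约定与函数名均为说明用的假设,并非本申请限定的具体实现):

```python
import math

def quat_to_euler(w, x, y, z):
    """将单位姿态四元数换算为偏航角(yaw)、俯仰角(pitch)、横滚角(roll),单位:度。
    采用Z-Y-X欧拉角约定,仅作示意。"""
    yaw = math.degrees(math.atan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z)))
    # asin的入参做限幅,避免数值误差导致定义域越界
    pitch = math.degrees(math.asin(max(-1.0, min(1.0, 2 * (w * y - z * x)))))
    roll = math.degrees(math.atan2(2 * (w * x + y * z), 1 - 2 * (x * x + y * y)))
    return yaw, pitch, roll
```

例如,绕z轴旋转90°的四元数(cos45°, 0, 0, sin45°)换算结果约为偏航角90°、俯仰角0°、横滚角0°。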
其中,如图5所示,为本申请实施例提供的一种手机空间姿态示意图。如图5所示,x_b轴、y_b轴、z_b轴和原点O组成了手机100的坐标系,x_g轴、y_g轴、z_g轴和原点O组成了地面坐标系。其中,O为手机100的质心,x_b轴在手机100对称平面内并平行于手机100的机身轴线指向手机头部;y_b轴垂直于手机100对称平面指向手机右侧;z_b轴垂直于x_b轴,并指向手机100机身下方。x_g轴在水平面内并指向某一方向;z_g轴垂直于地面并指向地心;y_g轴在水平面内垂直于x_g轴,其指向按照右手定则确定。
如图5所示,偏航角是指手机100的x_b轴在水平面上的投影与地面坐标系x_g轴之间的夹角ψ,以手机100前端部分向右偏转为正;俯仰角是指手机100的x_b轴与地平面(或水平面)之间的夹角θ,以手机前端部分向上偏转为正;横滚角是指手机100的z_b轴与通过手机轴x_b的铅垂面间的夹角φ,以手机100右滚为正。
以图1B中所示的场景为例,本申请实施例的控制大屏设备显示的方法可以通过图6中的S601-S605实现。
可以理解的是,本申请实施例中,手机100可以执行本申请实施例中的部分或全部步骤,这些步骤或操作仅是示例,本申请实施例还可以执行其它操作或者各种操作的变形。此外,各个步骤可以按照本申请实施例呈现的不同的顺序来执行,并且有可能并非要执行本申请实施例中的全部操作。
S601、手机100通过加速度传感器、陀螺仪传感器和磁传感器采集手机100的运动数据。
S602、手机100根据采集的运动数据,采用九轴融合算法确定手机100的空间姿态。
其中,如上文所述,手机100的空间姿态包括手机100的空间坐标在地理坐标系中的(x_i,y_i,z_i)和手机100相对于地面坐标系的偏航角ψ、俯仰角θ和横滚角φ。
在一种可能的实现方式中,采用九轴融合算法确定手机100的空间姿态,可以包括:首先对手机100的运动数据进行校准,然后根据校准后的运动数据,通过融合算法计算当前运动数据下的手机空间姿态。
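上述“校准后融合”的单步计算,常见思路是以陀螺仪角速度积分为主,用加速度/磁传感器解算出的参考角缓慢修正积分漂移(互补滤波思想)。以下Python片段是单个角度通道的简化草图,alpha为假设的融合权重,并非本申请实际采用的九轴融合算法:

```python
def fuse_step(angle_prev, gyro_rate, dt, angle_ref, alpha=0.98):
    """互补滤波单步示意:
    angle_prev —— 上一时刻角度(度);
    gyro_rate  —— 陀螺仪角速度(度/秒);
    dt         —— 采样间隔(秒);
    angle_ref  —— 加速度/磁传感器解算出的参考角(度);
    alpha      —— 融合权重,示例取值。"""
    return alpha * (angle_prev + gyro_rate * dt) + (1 - alpha) * angle_ref
```

陀螺仪积分响应快但会漂移,参考角无漂移但噪声较大,二者按权重融合可兼顾灵敏度与稳定性。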
S603、手机100根据手机100的空间姿态,确定第一位置。
其中,第一位置是手机100的空间姿态映射在手提电脑显示屏上的位置。确定的第一位置即激光指针在手提电脑200显示屏上的显示位置。
在手机100控制准确的情况下,激光指针在手提电脑200显示屏上的显示位置即图7中的7A中的P点处,其中,P点在手提电脑200显示屏上的二维坐标为(x'_i,y'_i)。其中,坐标原点为手提电脑200显示屏左上角,x轴为沿着手提电脑显示屏上边缘由左至右,y轴为沿着手提电脑显示屏左边缘由上至下。
手机100可以根据手提电脑200的显示屏宽度W和手机100的灵敏度s确定R。其中,灵敏度s是指:用户操控手机100转动角度s(单位:度)时,对应的,激光指针在显示屏上移动距离W。也就是说,W是激光指针的可操控范围,s是对应的手机100的可操控角度。
示例性的,如图8所示,当人到显示屏的距离L=R=W/2×cot(s/2)时,假设此时的s取值能让人操作舒适。当人到显示屏的距离L>W/2×cot(s/2)时,操控手机100转动相同的角度,激光指针移动的距离变大,手机100的可操控空间更小(更容易到边沿),人感知到的灵敏度相对更高。同理,当人到显示屏的距离L<W/2×cot(s/2)时,操控手机100转动相同的角度,激光指针移动的距离变小,激光指针的移动空间变大,手机100的可操控空间更大,人感知到的灵敏度相对更低。
在一些实施例中,手机100对应的s是可以调整的。示例性的,在人到显示屏的距离L固定且L<W/2×cot(s/2)时,可以将s调小,这样,操控手机100转动更小的角度,激光指针便可以移动同样的距离。或者,在L固定且L>W/2×cot(s/2)时,可以将s调大,这样,操控手机100转动相同角度,激光指针可以移动更远的距离。
以L=R为例,如图7中的7B所示,dx=R×tan ψ;如图7中的7C所示,dy=R×tan θ。又由于R=W/2×cot(s/2),因此,可以计算得到第一位置在手提电脑200显示屏上的坐标:x'_i=W/2+W/2×cot(s/2)×tan ψ,y'_i=H/2-W/2×cot(s/2)×tan θ,其中H为显示屏高度。
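基于上述推导,假设坐标原点为显示屏左上角、显示屏高度为H,第一位置的计算可以草绘为如下Python片段(正负号与限幅处理为示意性约定,并非本申请限定的具体实现):

```python
import math

def laser_position(yaw_deg, pitch_deg, W, H, s_deg):
    """由偏航角、俯仰角计算激光指针在显示屏上的坐标(示意实现)。
    W、H为显示屏宽、高,s_deg为灵敏度角s(度)。"""
    R = W / 2 / math.tan(math.radians(s_deg) / 2)  # R = W/2 × cot(s/2)
    x = W / 2 + R * math.tan(math.radians(yaw_deg))
    y = H / 2 - R * math.tan(math.radians(pitch_deg))
    # 超出屏幕范围时限制在边沿
    return min(max(x, 0.0), float(W)), min(max(y, 0.0), float(H))
```

初始姿态(偏航角、俯仰角均为0)对应显示屏中心;偏航角增大,指针右移;俯仰角增大,指针上移。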
S604、手机100向手提电脑200发送第二信息。
其中,该第二信息用于指示第一位置。该第二信息包括控制信息。该控制信息用于指示手提电脑200在第一位置做出第一响应。
或者,对于手机100通过服务设备控制显示设备的情况,如图2中手机100通过手提电脑200控制电视机300显示,或者图3中手机100通过手提电脑200投影在投影设备幕布上显示的情况:S604中手机100向手提电脑200发送第二信息,用于指示手提电脑200控制电视机300在第一位置做出第一响应;或者,S604中手机100向手提电脑200发送第二信息,用于指示手提电脑200控制投影仪400在幕布500的第一位置做出第一响应。
S605、手提电脑200在第一位置做出第一响应。
在一些可能的实施例中,控制信息可以用于指示但不限于手提电脑200做出以下中的任一种第一响应:调用鼠标指针单击第一位置处的显示内容(例如,视频播放按钮)、调用鼠标指针双击第一位置处的显示内容(例如,文件夹)、调用鼠标指针做出滑动操作(例如,滑动视频播放进度按钮),以及翻页、调用鼠标指针在第一位置处划线等。
在一些可能的实施例中,控制信息还可以用于指示手提电脑200将鼠标指针的功能从激光指针功能切换为激光画笔功能。例如,通过将激光指针切换为激光画笔,完成划线、画图、写字等操作。
在一种可能的实现方式中,用户可以通过在手机100上做出预设操作,指示手提电脑200将激光指针切换为激光画笔。例如:该预设操作为在手机100触摸屏某一位置滑动或按压,同时按下“+”、“-”音量键(如图10中的1010),长按音量键,长时间捏握手机100侧边框等。
在一种可能的实现方式中,用户可以通过在手机100的人机交互界面上做出相应操作,指示手提电脑200将激光指针切换为激光画笔。例如:点击人机交互界面上的指针/画笔切换按钮(如图10中的1007),或者在人机交互界面的触摸板(如图10中的1009)上做出预设操作(如双击、滑动等操作)。
在一些实施例中,上述人机交互界面可以是手机100中的基础服务功能,还可以是手机100中安装的应用程序(Application,APP)。例如,手机100中可以安装有“空中鼠标”APP,用户可以在该APP界面完成上述自定义设置。其中,“空中鼠标”APP用于使手机100实现传统鼠标的各种功能,但是又无需在固定位置、固定平台工作。
在一些实施例中,如图9所示,在手机100向手提电脑200发送第二信息之前,本申请实施例的控制大屏设备显示的方法还可以包括:
S606、手机100接收用户的第一操作。
在这种情况下,图6中S604实际上为:手机100响应于第一操作,向手提电脑200发送第二信息。即图9中的S607。
其中,手机100根据第一操作,向手提电脑200发送第二信息,可以包括:手机100根据第一操作确定与该第一操作对应的控制指令;手机100向手提电脑200发送第二信息。其中,该第二信息包括第一操作对应的控制指令。
在一种可能的实现方式中,不同操作与不同控制指令的对应关系可以由用户自定义设置。例如,用户可以在上述人机交互界面自定义设置不同操作与控制指令的对应 关系。
其中,第一操作可以是用户在手机100的人机交互界面上的操作。例如:用户点击人机交互界面上的虚拟按钮。或者,用户在人机交互界面空白处的点击/双击/长按操作,触摸滑动操作。手机100可以根据自定义设置的不同操作与不同控制指令的对应关系,确定具体第一操作所标识的控制指令。例如,可以通过点击图10中的1008进行上述自定义设置。
或者,第一操作还可以是用户按压手机100的音量键(如图10中的1010)的操作或者用户捏握手机100侧边框的操作。例如,手机100通过压力传感器检测到侧边框某一位置被施加的压力大于F_0,且持续时间大于T_0。手机100根据检测到的上述条件确定对应的控制指令。
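以“压力大于F_0且持续时间大于T_0”的捏握判定为例,可以草绘为如下Python片段,其中samples、F0、T0等名称均为说明用的假设参数:

```python
def detect_squeeze(samples, F0, T0):
    """判断压力采样序列中是否出现“压力持续大于F0且持续时间超过T0”的捏握。
    samples为按时间升序的(时间戳秒, 压力值)列表。示意性实现。"""
    start = None  # 本次连续按压的起始时刻
    for t, f in samples:
        if f > F0:
            if start is None:
                start = t
            if t - start > T0:
                return True
        else:
            start = None  # 压力中断,重新计时
    return False
```

满足条件时,手机100即可将该捏握映射为对应的控制指令。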
在一些实施例中,人机交互界面可以包括以下中的至少一个虚拟按钮:左手/右手界面切换按钮(如图10中的1006),指针/画笔切换按钮(如图10中的1007),一键播放/一键退出按钮(如图10中的1001),个性化设置按钮(如图10中的1008)。
基于人机交互界面,可以实现“鼠标”的基本功能。例如:幻灯片一键播放/一键退出,幻灯片翻页,划线,确认,播放,打开文件,写字等。或者,还可以通过人机交互界面进行“鼠标”的相关配置。例如:通过点击图10中的1008配置激光指针显示样式,通过点击图10中的1005配置激光指针位置操控灵敏度,通过点击图10中的1004配置界面布局等。
在一些实施例中,如图11所示,在S601之前,在手机100处于第一姿态时,本申请的控制大屏设备显示的方法还可以包括:
S608、响应于接收到用户的第一指令,手机100向手提电脑200发送第一信息。
其中,第一信息用于指示手提电脑200在预设位置显示激光指针。
例如:该预设位置为手提电脑200显示屏的中心位置。第一信息包括2D位置重对准指令,用于指示手提电脑200根据该2D位置重对准指令控制激光指针显示在手提电脑200显示屏的中心位置。
在一种可能的实现方式中,如图10所示,人机交互界面还可以包括2D位置重对准按钮(图10中的1002),用户可以通过手机100点击该按钮,指示手提电脑200进行2D位置重对准。
S609、手提电脑200根据第一信息在预设位置显示激光指针。
例如,在手提电脑200进行2D位置重对准之后,手机100当前指向手提电脑200显示屏的位置,即为手提电脑200显示屏的中心位置。手机100的当前姿态(即上述第一姿态)为初始姿态,其中,初始姿态的偏航角和俯仰角均可以认为是0。
在这种情况下,手机100在后续运动过程中,采用九轴融合算法确定手机100的姿态角后,根据手机100的姿态角,确定第一位置,即S603可以通过如下步骤实现:
步骤1:手机100以初始姿态对应的四元数q_0作为起点,当前姿态对应的四元数q_i作为终点,计算手机100由初始姿态变换为空间姿态时,四元数q_i相对四元数q_0的相对旋转矩阵C。
具体的,手机100通过初始姿态九轴融合算法的结果对应的四元数q_0,和当前姿态九轴融合算法的结果对应的四元数q_i,确定出旋转四元数,将该旋转四元数换算为相对旋转矩阵C。
步骤2:手机100根据初始偏航角、初始俯仰角和相对旋转矩阵C,确定第二姿态。
其中,初始偏航角为手机100处于初始姿态时,手机100相对于地面坐标系的偏航角,初始俯仰角为手机100处于初始姿态时,手机100相对于地面坐标系的俯仰角。
步骤3:手机100根据第二姿态,确定第一位置。
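步骤1中由q_0、q_i求相对旋转,可以用四元数乘法草绘如下。该片段假设q_0、q_i均为单位四元数,并采用q_rel = q_i ⊗ q_0*(共轭)的约定;实际约定取决于参考系定义,此处仅作示意:

```python
def quat_mul(a, b):
    """四元数乘法,四元数以(w, x, y, z)表示。"""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
            w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
            w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
            w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2)

def quat_conj(q):
    """单位四元数的共轭即其逆。"""
    w, x, y, z = q
    return (w, -x, -y, -z)

def relative_rotation(q0, qi):
    """计算由初始姿态q0变换到当前姿态qi的旋转四元数(示意),
    结果可进一步换算为相对旋转矩阵C。"""
    return quat_mul(qi, quat_conj(q0))
```

当q0为单位元(1, 0, 0, 0)时,相对旋转即qi本身;当qi=q0时,相对旋转为单位元,对应指针不动。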
通过上述方法,以初始姿态对应手提电脑200显示屏中心位置,将3D空间姿态投影到2D显示屏上,获取激光指针的显示位置,可以获得准确度更高、灵敏度更高的激光指针控制。
在一种可能的实现方式中,手机100中可以集成左右手持识别算法,实现手机100人机交互界面跟随左右手自适应布局,提升用户操作体验。用户可以通过点击图10中的1006进行左手/右手界面切换。
可以理解的是,移动终端(例如手机100)为了实现上述任一个实施例的功能,其包含了执行各个功能相应的硬件结构和/或软件模块。本领域技术人员应该很容易意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,本申请能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的范围。
本申请实施例可以对移动终端进行功能模块的划分,例如,可以对应各个功能划分各个功能模块,也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。需要说明的是,本申请实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式。
比如,以采用集成的方式划分各个功能模块的情况下,如图12所示,为本申请实施例提供的一种移动终端的结构示意图。该移动终端可以包括传感器模块1210、分析模块1220和发送模块1230。
其中,传感器模块1210可以用于支持移动终端执行上述步骤S601,分析模块1220可以用于支持移动终端执行上述步骤S602和S603,发送模块1230用于支持移动终端执行上述步骤S604、S607和S608,和/或用于本文所描述的技术的其他过程。
需要说明的是,上述方法实施例涉及的各步骤的所有相关内容均可以援引到对应功能模块的功能描述,在此不再赘述。在一种可能的结构中,如图13所示,该移动终端还可以包括检测模块1240,用于支持移动终端执行上述步骤S606,和/或用于本文所描述的技术的其他过程。
其中,分析模块1220可以是图4中所示的处理器410。其可以实现或执行结合本申请公开内容所描述的各种示例性的逻辑方框,模块和电路。处理器也可以是实现计算功能的组合,例如包含一个或多个微处理器组合,数字信号处理(digital signal processing,DSP)和微处理器的组合等等。
需要说明的是,上述移动终端还可以包括射频电路。具体的,移动终端可以通过射频电路进行无线信号的接收和发送。通常,射频电路包括但不限于天线、至少一个放大器、收发信机、耦合器、低噪声放大器、双工器等。此外,射频电路还可以通过无线通信和其他设备通信。所述无线通信可以使用任一通信标准或协议,包括但不限于全球移动通讯系统、通用分组无线服务、码分多址、宽带码分多址、长期演进、电子邮件、短消息服务等。
在一种可选的方式中,当使用软件实现数据传输时,可以全部或部分地以计算机程序产品的形式实现。所述计算机程序产品包括一个或多个计算机指令。在计算机上加载和执行所述计算机程序指令时,全部或部分地实现本申请实施例所述的流程或功能。所述计算机可以是通用计算机、专用计算机、计算机网络、或者其他可编程装置。所述计算机指令可以存储在计算机可读存储介质中,或者从一个计算机可读存储介质向另一个计算机可读存储介质传输,例如,所述计算机指令可以从一个网站站点、计算机、服务器或数据中心通过有线(例如同轴电缆、光纤、数字用户线(DSL))或无线(例如红外、无线、微波等)方式向另一个网站站点、计算机、服务器或数据中心进行传输。所述计算机可读存储介质可以是计算机能够存取的任何可用介质或者是包含一个或多个可用介质集成的服务器、数据中心等数据存储设备。所述可用介质可以是磁性介质(例如软盘、硬盘、磁带)、光介质(例如DVD)、或者半导体介质(例如固态硬盘(solid state disk,SSD))等。
结合本申请实施例所描述的方法或者算法的步骤可以硬件的方式来实现,也可以是由处理器执行软件指令的方式来实现。软件指令可以由相应的软件模块组成,软件模块可以被存放于RAM存储器、闪存、ROM存储器、EPROM存储器、EEPROM存储器、寄存器、硬盘、移动硬盘、CD-ROM或者本领域熟知的任何其它形式的存储介质中。一种示例性的存储介质耦合至处理器,从而使处理器能够从该存储介质读取信息,且可向该存储介质写入信息。当然,存储介质也可以是处理器的组成部分。处理器和存储介质可以位于ASIC中。另外,该ASIC可以位于探测装置中。当然,处理器和存储介质也可以作为分立组件存在于探测装置中。
通过以上的实施方式的描述,所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,仅以上述各功能模块的划分进行举例说明,实际应用中,可以根据需要而将上述功能分配由不同的功能模块完成,即将装置的内部结构划分成不同的功能模块,以完成以上描述的全部或者部分功能。
在本申请所提供的几个实施例中,应该理解到,所揭露的用户设备和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅是示意性的,例如,所述模块或单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个装置,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是一个物理单元或多个物理单元,即可以位于一个地方,或者也可以分布到多个不同地方。可以根据实际的需要选择其中的部分或者全部单元来实现本实施 例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。上述集成的单元既可以采用硬件的形式实现,也可以采用软件功能单元的形式实现。
所述集成的单元如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个可读取存储介质中。基于这样的理解,本申请实施例的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的全部或部分可以以软件产品的形式体现出来,该软件产品存储在一个存储介质中,包括若干指令用以使得一个设备(可以是单片机,芯片等)或处理器(processor)执行本申请各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(Read-Only Memory,ROM)、随机存取存储器(Random Access Memory,RAM)、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何在本申请揭露的技术范围内的变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (29)

  1. 一种控制大屏设备显示的方法,其特征在于,应用于移动终端,所述移动终端包括多个运动传感器,所述多个运动传感器至少包括加速度传感器、陀螺仪传感器和磁传感器,所述移动终端与大屏设备之间建立了通信连接,所述移动终端用于控制所述大屏设备上显示的激光指针,所述方法包括:
    所述移动终端通过所述多个运动传感器采集所述移动终端的运动数据;
    所述移动终端根据所述运动数据,采用九轴融合算法确定所述移动终端的空间姿态;
    所述移动终端根据所述移动终端的空间姿态,确定所述大屏设备的显示屏上的第一位置;
    所述移动终端控制所述大屏设备在所述第一位置做出第一响应,所述第一响应包括在所述第一位置显示激光指针。
  2. 根据权利要求1所述的方法,其特征在于,所述空间姿态至少用于标识以下信息:所述移动终端相对于地面坐标系的偏航角ψ、俯仰角θ和横滚角φ。
  3. 根据权利要求2所述的方法,其特征在于,所述移动终端根据所述空间姿态,确定所述大屏设备的显示屏上的第一位置,包括:
    所述移动终端采用公式一和公式二确定所述第一位置在所述大屏设备的显示屏上的坐标(x′_i,y′_i):
    公式一:x′_i=W/2+W/2×cot(s/2)×tan ψ;
    公式二:y′_i=H/2-W/2×cot(s/2)×tan θ;
    其中,W为所述大屏设备显示屏的宽度,H为所述大屏设备显示屏的高度,s为所述移动终端的灵敏度,ψ和θ分别为所述偏航角和所述俯仰角。
  4. 根据权利要求1-3任一项所述的方法,其特征在于,在所述移动终端通过所述多个运动传感器采集所述移动终端的运动数据之前,所述方法还包括:
    所述移动终端处于第一姿态时,接收用户的第一指令,所述第一指令用于指示所述移动终端与所述大屏设备进行2D位置重对准;
    响应于所述第一指令,所述移动终端向所述大屏设备发送第一信息,所述第一信息用于指示所述大屏设备在预设位置显示所述激光指针;
    其中,所述第一姿态为所述移动终端的任意姿态。
  5. 根据权利要求4所述的方法,其特征在于,所述移动终端根据所述空间姿态,确定所述大屏设备的显示屏上的第一位置,包括:
    所述移动终端以所述第一姿态对应的四元数q_0作为起点,所述空间姿态对应的四元数q_i作为终点,计算所述移动终端由所述第一姿态变换为所述空间姿态时,所述四元数q_i相对所述四元数q_0的相对旋转矩阵C;
    所述移动终端根据初始偏航角、初始俯仰角和所述相对旋转矩阵C,确定第二姿态;所述移动终端根据所述第二姿态,确定所述第一位置;
    其中,所述初始偏航角为所述移动终端处于所述第一姿态时,所述移动终端相对于地面坐标系的偏航角,所述初始俯仰角为所述移动终端处于所述第一姿态时,所述移动终端相对于地面坐标系的俯仰角。
  6. 根据权利要求1-5任一项所述的方法,其特征在于,所述移动终端控制所述大屏设备在所述第一位置做出第一响应,包括:
    所述移动终端向所述大屏设备发送第二信息;
    其中,所述第二信息用于指示所述第一位置,所述第二信息包括控制信息,所述控制信息用于指示所述大屏设备在所述第一位置做出第一响应。
  7. 根据权利要求1-5任一项所述的方法,其特征在于,所述移动终端控制所述大屏设备在所述第一位置做出第一响应,包括:
    所述移动终端向服务设备发送第二信息;
    其中,所述第二信息用于指示所述第一位置,所述第二信息包括控制信息,所述控制信息用于指示所述服务设备控制所述大屏设备在所述第一位置做出所述第一响应。
  8. 根据权利要求1-7任一项所述的方法,其特征在于,在所述移动终端控制所述大屏设备在所述第一位置做出第一响应之前,所述方法还包括:
    所述移动终端接收用户的第一操作;
    所述移动终端控制所述大屏设备在所述第一位置做出第一响应,包括:
    响应于所述第一操作,所述移动终端控制所述大屏设备在所述第一位置做出第一响应。
  9. 根据权利要求8所述的方法,其特征在于,所述第一响应至少包括以下任一种响应:将激光指针切换为激光画笔、翻页、打开文件、划线、播放、停止播放、调节亮度。
  10. 根据权利要求8或9所述的方法,其特征在于,所述第一操作是所述移动终端的显示屏显示人机交互界面时,接收到的用户操作。
  11. 根据权利要求10所述的方法,其特征在于,所述第一操作包括用户在所述人机交互界面的操作;
    其中,所述人机交互界面至少包括以下中的至少一个虚拟按钮:左手/右手人机交互界面切换按钮,指针/画笔切换按钮,2D位置重对准按钮,一键播放/一键退出按钮,个性化设置按钮。
  12. 根据权利要求11所述的方法,其特征在于,所述第一操作还包括:用户在所述人机交互界面空白处的点击/双击/长按操作,用户在所述人机交互界面空白处的触摸滑动,用户按压所述移动终端的音量键的操作或者用户捏握所述移动终端侧边框的操作。
  13. 一种移动终端,其特征在于,所述移动终端包括:传感器模块,所述传感器模块包括多个运动传感器,所述多个运动传感器至少包括加速度传感器、陀螺仪传感器和磁传感器,所述移动终端与大屏设备之间建立了通信连接,所述移动终端用于控制所述大屏设备上显示的激光指针,所述传感器模块用于采集所述移动终端的运动数据;所述移动终端还包括:
    分析模块,用于根据所述运动数据,采用九轴融合算法确定所述移动终端的空间姿态;以及,根据所述移动终端的空间姿态,确定所述大屏设备的显示屏上的第一位置;
    发送模块,用于向所述大屏设备发送控制信息,用于控制所述大屏设备在所述第一位置做出第一响应,所述第一响应包括在所述第一位置显示激光指针。
  14. 根据权利要求13所述的移动终端,其特征在于,所述空间姿态至少用于标识以下信息:所述移动终端相对于地面坐标系的偏航角ψ、俯仰角θ和横滚角φ。
  15. 根据权利要求14所述的移动终端,其特征在于,所述分析模块根据所述运动数据,采用九轴融合算法确定所述移动终端的空间姿态;以及,根据所述移动终端的空间姿态,确定所述大屏设备的显示屏上的第一位置,包括:
    所述分析模块采用公式一和公式二确定所述第一位置在所述大屏设备的显示屏上的坐标(x′_i,y′_i):
    公式一:x′_i=W/2+W/2×cot(s/2)×tan ψ;
    公式二:y′_i=H/2-W/2×cot(s/2)×tan θ;
    其中,W为所述大屏设备显示屏的宽度,H为所述大屏设备显示屏的高度,s为所述移动终端的灵敏度,ψ和θ分别为所述偏航角和所述俯仰角。
  16. 根据权利要求13-15任一项所述的移动终端,其特征在于,所述移动终端还包括:
    接收模块,用于在所述传感器模块采集所述移动终端的运动数据之前,在所述移动终端处于第一姿态时,接收用户的第一指令,所述第一指令用于指示所述移动终端与所述大屏设备进行2D位置重对准;
    所述发送模块还用于,响应于所述第一指令,向所述大屏设备发送第一信息,所述第一信息用于指示所述大屏设备在预设位置显示所述激光指针。
  17. 根据权利要求16所述的移动终端,其特征在于,所述分析模块根据所述空间姿态,确定所述大屏设备的显示屏上的第一位置,包括:
    所述分析模块以所述第一姿态对应的四元数q_0作为起点,所述空间姿态对应的四元数q_i作为终点,计算所述移动终端由所述第一姿态变换为所述空间姿态时,所述四元数q_i相对所述四元数q_0的相对旋转矩阵C;
    所述分析模块根据初始偏航角、初始俯仰角和所述相对旋转矩阵C,确定第二姿态;所述分析模块根据所述第二姿态,确定所述第一位置;
    其中,所述初始偏航角为所述移动终端处于所述第一姿态时,所述移动终端相对于地面坐标系的偏航角,所述初始俯仰角为所述移动终端处于所述第一姿态时,所述移动终端相对于地面坐标系的俯仰角。
  18. 根据权利要求13-17任一项所述的移动终端,其特征在于,所述发送模块向所述大屏设备发送控制信息,包括:
    所述发送模块通过服务设备向所述大屏设备发送所述控制信息,所述控制信息用于指示所述大屏设备接受所述服务设备的控制,在所述第一位置做出所述第一响应。
  19. 根据权利要求16-18任一项所述的移动终端,其特征在于,在所述发送模块向所述大屏设备发送控制信息之前,所述接收模块还用于,
    接收用户的第一操作;
    所述发送模块向所述大屏设备发送控制信息,包括:
    响应于所述第一操作,所述发送模块向所述大屏设备发送控制信息,用于控制所述大屏设备在所述第一位置做出第一响应。
  20. 根据权利要求19所述的移动终端,其特征在于,所述第一响应至少包括以下任一种响应:将激光指针切换为激光画笔、翻页、打开文件、划线、播放、停止播放、调节亮度。
  21. 根据权利要求19或20所述的移动终端,其特征在于,所述第一操作是所述移动终端的显示屏显示人机交互界面时,接收到的用户操作。
  22. 根据权利要求21所述的移动终端,其特征在于,所述第一操作包括用户在所述人机交互界面的操作;
    其中,所述人机交互界面至少包括以下中的至少一个虚拟按钮:左手/右手人机交互界面切换按钮,指针/画笔切换按钮,2D位置重对准按钮,一键播放/一键退出按钮,个性化设置按钮。
  23. 根据权利要求22所述的移动终端,其特征在于,所述第一操作还包括:用户在所述人机交互界面空白处的点击/双击/长按操作,用户在所述人机交互界面空白处的触摸滑动,用户按压所述移动终端的音量键的操作或者用户捏握所述移动终端侧边框的操作。
  24. 一种移动终端,其特征在于,所述移动终端包括:传感器模块,所述传感器模块包括多个运动传感器,所述多个运动传感器至少包括加速度传感器、陀螺仪传感器和磁传感器,所述移动终端与大屏设备之间建立了通信连接,所述移动终端用于控制所述大屏设备上显示的激光指针,所述传感器模块用于采集所述移动终端的运动数据;所述移动终端还包括:
    存储器,用于存储计算机程序代码,所述计算机程序代码包括指令;
    射频单元,用于进行无线电信号的发射和接收;
    处理器,用于执行所述存储器中存储的所述指令实现如权利要求1-12任一项所述的控制大屏设备显示的方法。
  25. 一种第一系统,其特征在于,所述第一系统包括:移动终端和大屏设备,所述移动终端用于控制所述大屏设备,实现如权利要求1-12任一项所述的控制大屏设备显示的方法。
  26. 根据权利要求25所述的系统,其特征在于,所述系统还包括:服务设备,用于实现如权利要求7-12任一项所述的控制大屏设备显示的方法。
  27. 一种芯片系统,所述芯片系统包括处理器、存储器,所述存储器中存储有指令;所述指令被所述处理器执行时,实现如权利要求1-12任一项所述的控制大屏设备显示的方法。
  28. 一种计算机存储介质,所述计算机存储介质上存储有计算机执行指令,所述计算机执行指令被处理电路执行时实现如权利要求1-12任一项所述的控制大屏设备显示的方法。
  29. 一种计算机程序产品,所述计算机程序产品包括程序指令,所述程序指令被执行时,以实现权利要求1-12中任一项所述的控制大屏设备显示的方法。
PCT/CN2020/102191 2019-07-30 2020-07-15 控制大屏设备显示的方法、移动终端及第一系统 WO2021017836A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910693432.8A CN110633018B (zh) 2019-07-30 2019-07-30 控制大屏设备显示的方法、移动终端及第一系统
CN201910693432.8 2019-07-30

Publications (1)

Publication Number Publication Date
WO2021017836A1 true WO2021017836A1 (zh) 2021-02-04

Family

ID=68970283

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/102191 WO2021017836A1 (zh) 2019-07-30 2020-07-15 控制大屏设备显示的方法、移动终端及第一系统

Country Status (2)

Country Link
CN (2) CN113220139B (zh)
WO (1) WO2021017836A1 (zh)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114938468A (zh) * 2022-04-22 2022-08-23 海信视像科技股份有限公司 显示设备、屏幕控制方法及存储介质
CN115016629A (zh) * 2021-11-19 2022-09-06 荣耀终端有限公司 防误触的方法和装置
WO2023125514A1 (zh) * 2021-12-28 2023-07-06 华为技术有限公司 设备控制方法及相关装置

Families Citing this family (10)

Publication number Priority date Publication date Assignee Title
CN113220139B (zh) * 2019-07-30 2022-08-02 荣耀终端有限公司 控制大屏设备显示的方法、移动终端及第一系统
CN111262909A (zh) * 2020-01-09 2020-06-09 中国建设银行股份有限公司 一种大屏信息显示方法、装置、设备和存储介质
CN111356006B (zh) * 2020-03-13 2023-03-17 北京奇艺世纪科技有限公司 视频播放方法、装置、服务器及存储介质
CN111897437A (zh) * 2020-08-19 2020-11-06 腾讯科技(深圳)有限公司 跨终端的交互方法、装置、电子设备以及存储介质
CN112383664B (zh) * 2020-10-15 2021-11-19 华为技术有限公司 一种设备控制方法、第一终端设备、第二终端设备及计算机可读存储介质
CN113141669B (zh) * 2021-04-15 2022-07-22 维沃移动通信有限公司 数据传输方法、发送终端和电子设备
CN113671997A (zh) * 2021-08-17 2021-11-19 深圳市火乐科技发展有限公司 投影设备控制方法、校正方法、遥控装置以及投影设备
CN114339341A (zh) * 2021-12-15 2022-04-12 海信视像科技股份有限公司 显示设备及显示设备的控制方法
CN114095690A (zh) * 2022-01-24 2022-02-25 龙旗电子(惠州)有限公司 演示控制权转换方法、装置、设备、介质及程序产品
CN116048314B (zh) * 2022-08-25 2024-04-09 荣耀终端有限公司 一种光标控制方法、光标控制设备及存储介质

Citations (4)

Publication number Priority date Publication date Assignee Title
CN103167338A (zh) * 2012-10-09 2013-06-19 深圳市金立通信设备有限公司 一种基于移动终端的智能电视输入控制系统及方法
EP2833353A1 (en) * 2012-03-27 2015-02-04 Konica Minolta, Inc. Display processing terminal device, photosensor-equipped unit, and photometric system
CN107153457A (zh) * 2016-03-04 2017-09-12 中兴通讯股份有限公司 投影处理方法及装置
CN110633018A (zh) * 2019-07-30 2019-12-31 华为技术有限公司 控制大屏设备显示的方法、移动终端及第一系统

Family Cites Families (8)

Publication number Priority date Publication date Assignee Title
US9055162B2 (en) * 2011-02-15 2015-06-09 Lg Electronics Inc. Method of transmitting and receiving data, display device and mobile terminal using the same
CN102289305B (zh) * 2011-08-29 2013-12-11 江苏惠通集团有限责任公司 姿态感知设备及其定位方法、鼠标指针的控制方法
CN102902375A (zh) * 2012-09-24 2013-01-30 刘丹 一种基于方位和姿态工作的鼠标应用方法及装置
CN106293404A (zh) * 2015-05-22 2017-01-04 联发科技(新加坡)私人有限公司 多屏幕共享显示画面的方法及通信终端
CN104965608A (zh) * 2015-07-20 2015-10-07 杜昊浓 一种空间鼠标系统
CN105628025B (zh) * 2015-12-31 2018-06-29 中国人民解放军国防科学技术大学 一种恒速偏频/机抖激光陀螺惯导系统导航方法
CN106706003A (zh) * 2017-02-15 2017-05-24 重庆邮电大学 一种基于三轴mems陀螺仪的寻北旋转在线校准方法
CN208027318U (zh) * 2018-04-01 2018-10-30 西北农林科技大学 一种可划线的幻灯片遥控笔


Cited By (4)

Publication number Priority date Publication date Assignee Title
CN115016629A (zh) * 2021-11-19 2022-09-06 荣耀终端有限公司 防误触的方法和装置
CN115016629B (zh) * 2021-11-19 2023-06-06 荣耀终端有限公司 防误触的方法和装置
WO2023125514A1 (zh) * 2021-12-28 2023-07-06 华为技术有限公司 设备控制方法及相关装置
CN114938468A (zh) * 2022-04-22 2022-08-23 海信视像科技股份有限公司 显示设备、屏幕控制方法及存储介质

Also Published As

Publication number Publication date
CN113220139A (zh) 2021-08-06
CN113220139B (zh) 2022-08-02
CN110633018A (zh) 2019-12-31
CN110633018B (zh) 2021-04-09

Similar Documents

Publication Publication Date Title
WO2021017836A1 (zh) 控制大屏设备显示的方法、移动终端及第一系统
WO2021213120A1 (zh) 投屏方法、装置和电子设备
CN110502954B (zh) 视频分析的方法和装置
WO2020168965A1 (zh) 一种具有折叠屏的电子设备的控制方法及电子设备
WO2021052214A1 (zh) 一种手势交互方法、装置及终端设备
US11782554B2 (en) Anti-mistouch method of curved screen and electronic device
WO2021104008A1 (zh) 一种折叠屏的显示方法及相关装置
US20220121413A1 (en) Screen Control Method, Electronic Device, and Storage Medium
EP4020954A1 (en) Method for transmitting information over short distances and electronic devices
CN113518967A (zh) 一种控制屏幕显示的方法和电子设备
WO2021013230A1 (zh) 机器人的控制方法、机器人、终端、服务器及控制系统
WO2021008615A1 (zh) 一种基于折叠屏的交互方法及设备
WO2021063237A1 (zh) 电子设备的控制方法及电子设备
WO2021052279A1 (zh) 一种折叠屏显示方法及电子设备
CN111026314B (zh) 控制显示设备的方法及便携设备
WO2021180089A1 (zh) 界面切换方法、装置和电子设备
WO2021082564A1 (zh) 一种操作提示的方法和电子设备
WO2021104010A1 (zh) 支付方法和电子设备
WO2019100298A1 (zh) 一种拍照方法及终端
WO2021208723A1 (zh) 全屏显示方法、装置和电子设备
WO2021057699A1 (zh) 具有柔性屏幕的电子设备的控制方法及电子设备
CN114579016A (zh) 一种共享输入设备的方法、电子设备及系统
WO2022199102A1 (zh) 图像处理方法及装置
WO2021121036A1 (zh) 一种折叠设备的自定义按键方法、设备及存储介质
WO2021170129A1 (zh) 一种位姿确定方法以及相关设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20848508

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20848508

Country of ref document: EP

Kind code of ref document: A1