CN113220139B - Method for controlling display of large-screen equipment, mobile terminal and first system


Info

Publication number
CN113220139B
Authority
CN
China
Prior art keywords
mobile terminal
screen device
screen
display
display screen
Legal status
Active
Application number
CN202110387350.8A
Other languages
Chinese (zh)
Other versions
CN113220139A (en)
Inventor
张北航
曾佳
张延海
陈运哲
Current Assignee
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Application filed by Honor Device Co Ltd
Priority to CN202110387350.8A
Publication of CN113220139A
Application granted
Publication of CN113220139B
Legal status: Active
Anticipated expiration


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; accessories therefor
    • G06F 3/0346 Pointing devices displaced or positioned by the user, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Abstract

The application discloses a method for controlling the display of a large-screen device, a mobile terminal, and a first system. It relates to the field of computers and enables a user to control various displays on a large-screen device with an existing portable device, improving user experience. An existing mobile terminal (such as a mobile phone) can receive the user's control of its spatial attitude, calculate from that attitude the corresponding position on the display screen of the large-screen device, and, by sending control information to the large-screen device, control the display screen to make responses at that position, such as displaying or moving a laser pointer.

Description

Method for controlling display of large-screen equipment, mobile terminal and first system
Technical Field
The embodiment of the application relates to the field of computers, in particular to a method for controlling display of large-screen equipment through a mobile terminal, the mobile terminal and a first system.
Background
Nowadays, business exhibitions, seminars, large roadshows, and new-product launch events are becoming increasingly popular. In these activities, a presenter is usually required to give a presentation to an audience, for example by playing a slide show (PowerPoint, PPT) or Keynote on a large screen. The presenter uses a laser pen to shine a visible laser point at a position on the large screen to direct the audience's attention to the content being presented.
However, the laser pen offers only basic display functions such as position indication and page turning; it cannot complete operations such as opening a file or underlining. In addition, its battery needs to be replaced frequently, and the beam must not be blocked during use, which makes the laser pen inconvenient to use.
Disclosure of Invention
The embodiment of the application provides a method, a mobile terminal, and a first system for controlling the display of a large-screen device, which can realize control over the various displays on a large-screen device's display screen using an existing portable device, improving user experience.
In order to achieve the above purpose, the embodiment of the present application adopts the following technical solutions:
In a first aspect, a method for controlling a display of a large-screen device is provided. The method is applied to a mobile terminal that includes a plurality of motion sensors, among them at least an acceleration sensor, a gyroscope sensor, and a magnetic sensor; a communication connection is established between the mobile terminal and the large-screen device, and the mobile terminal is used to control a laser pointer displayed on the large-screen device. The method includes: the mobile terminal collects its motion data through the plurality of motion sensors; the mobile terminal determines its spatial attitude from the motion data using a nine-axis fusion algorithm; the mobile terminal determines a first position on the display screen of the large-screen device according to the spatial attitude; and the mobile terminal controls the large-screen device to make a first response at the first position, the first response including displaying a laser pointer at the first position. The spatial attitude identifies at least the following information: the yaw angle ψ, the pitch angle θ, and the roll angle φ of the mobile terminal relative to the ground coordinate system. Before the mobile terminal collects its motion data through the plurality of motion sensors, the method further includes: when the mobile terminal is in a first posture, receiving a first instruction of a user, where the first instruction instructs the mobile terminal to perform 2D position realignment with the large-screen device; and, in response to the first instruction, sending first information to the large-screen device, where the first information instructs the large-screen device to display the laser pointer at a preset position. The first posture is any posture of the mobile terminal.
According to the technical solution provided by the first aspect, a user can use an existing mobile terminal: by controlling the terminal's posture, the terminal calculates from its spatial attitude the corresponding position on the display screen of the large-screen device and then sends control information to the large-screen device, controlling the display screen to make responses such as displaying or moving the laser pointer at that position. The mobile terminal can calculate the position on the display screen of the large-screen device from its collected yaw, pitch, and roll angles. It can also receive a realignment instruction from the user and realign the position of the laser pointer on the display screen of the large-screen device, which improves the accuracy of the displayed pointer position and the user experience.
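The claims do not spell out the nine-axis fusion algorithm itself. Purely as an illustration of the accelerometer-and-magnetometer half of such a fusion (a full implementation would also blend in the integrated gyroscope rate, for example through a complementary or Kalman filter), a minimal Python sketch under one common axis convention might look as follows; the function name and conventions are assumptions, not the patent's code:

    import math

    def attitude_from_acc_mag(acc, mag):
        # Pitch (theta) and roll (phi) from the gravity vector measured
        # by the acceleration sensor; one common axis convention assumed.
        ax, ay, az = acc
        pitch = math.atan2(-ax, math.hypot(ay, az))
        roll = math.atan2(ay, az)
        # Tilt-compensate the magnetic sensor reading, then take yaw (psi).
        mx, my, mz = mag
        mx_h = mx * math.cos(pitch) + mz * math.sin(pitch)
        my_h = (mx * math.sin(roll) * math.sin(pitch)
                + my * math.cos(roll)
                - mz * math.sin(roll) * math.cos(pitch))
        yaw = math.atan2(-my_h, mx_h)
        return yaw, pitch, roll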
In one possible implementation, the determining, by the mobile terminal, of a first position on the display screen of the large-screen device according to the spatial attitude includes: the mobile terminal determines the coordinates (x_i', y_i') of the first position on the display screen of the large-screen device using the following formula:
[formula image not reproduced in the text]
where W is the width of the display screen of the large-screen device and s is the sensitivity of the mobile terminal. With this calculation method, the mobile terminal can determine, from the spatial attitude, the position on the display screen of the large-screen device to which that attitude corresponds.
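Because the formula itself survives only as an image, the exact mapping cannot be reproduced here. The sketch below only illustrates the form such a mapping could take: a linear map in which the angular deviation from the aligned attitude, scaled by the sensitivity s and the screen width W, displaces the pointer from the screen centre. The screen height H, the reference angles, and the signs are assumptions:

    def first_position(yaw, pitch, yaw0, pitch0, W, H, s):
        # Hypothetical linear mapping: angular deviation from the aligned
        # attitude (yaw0, pitch0), scaled by the sensitivity s and the
        # screen width W, displaces the pointer from the screen centre.
        # Only W and s come from the text; the rest is illustrative.
        x = W / 2 - s * W * (yaw - yaw0)
        y = H / 2 - s * W * (pitch - pitch0)
        # Clamp to the display screen of the large-screen device.
        return (min(max(x, 0), W), min(max(y, 0), H))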
In one possible implementation, the determining, by the mobile terminal, of a first position on the display screen of the large-screen device according to the spatial attitude includes: the mobile terminal takes the quaternion q_0 corresponding to the first posture as a starting point and the quaternion q_i corresponding to the spatial attitude as an end point, and calculates the relative rotation matrix C of q_i relative to q_0 when the mobile terminal is transformed from the first posture to the spatial attitude; the mobile terminal determines a second attitude according to the initial yaw angle, the initial pitch angle, and the relative rotation matrix C; and the mobile terminal determines the first position according to the second attitude. The initial yaw angle is the yaw angle of the mobile terminal relative to the ground coordinate system when it is in the first posture, and the initial pitch angle is its pitch angle relative to the ground coordinate system in the first posture. By calculating the quaternion corresponding to the attitude transformation, computing the transformed spatial attitude from that quaternion, and then realigning, the mobile terminal can improve the accuracy of the laser pointer's displayed position and the user experience.
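As a concrete rendering of this quaternion step, the following sketch computes the relative rotation matrix C of q_i relative to q_0 using the standard Hamilton product and the usual quaternion-to-matrix conversion; it is illustrative only, not the patent's implementation:

    import numpy as np

    def quat_mul(p, q):
        # Hamilton product of quaternions given as (w, x, y, z).
        w1, x1, y1, z1 = p
        w2, x2, y2, z2 = q
        return np.array([
            w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
            w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
            w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
            w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
        ])

    def quat_conj(q):
        w, x, y, z = q
        return np.array([w, -x, -y, -z])

    def relative_rotation_matrix(q0, qi):
        # For unit quaternions the inverse equals the conjugate, so the
        # rotation taking the first posture q0 to the current attitude qi
        # is q_rel = qi * q0^-1; C is its rotation-matrix form.
        w, x, y, z = quat_mul(qi, quat_conj(q0))
        return np.array([
            [1 - 2 * (y * y + z * z), 2 * (x * y - w * z), 2 * (x * z + w * y)],
            [2 * (x * y + w * z), 1 - 2 * (x * x + z * z), 2 * (y * z - w * x)],
            [2 * (x * z - w * y), 2 * (y * z + w * x), 1 - 2 * (x * x + y * y)],
        ])

The second attitude is then obtained by applying C to the attitude recorded at realignment, per the initial yaw and pitch angles described above.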
In one possible implementation, the controlling, by the mobile terminal, of the large-screen device to make the first response at the first position includes: the mobile terminal sends second information to the large-screen device, where the second information indicates the first position and includes control information, the control information instructing the large-screen device to make the first response at the first position. The mobile terminal can thus control the large-screen device's response by sending it specific control information together with the position at which the response is expected.
In one possible implementation, the controlling, by the mobile terminal, of the large-screen device to make the first response at the first position includes: the mobile terminal sends the second information to the service device, where the second information indicates the first position and includes control information, the control information instructing the service device to control the large-screen device to make the first response at the first position. The mobile terminal can thus make the large-screen device respond by controlling the service device. This suits different application scenarios, is convenient to use, and improves the user experience.
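Neither implementation fixes a wire format for the second information. A minimal sketch, assuming JSON over an already-established TCP connection (the field names and action strings are invented for illustration):

    import json

    def send_second_information(conn, x, y, action="display_pointer"):
        # The 'second information': the first position plus control
        # information naming the first response. The JSON framing, field
        # names, and action strings are all assumptions.
        message = {"position": {"x": x, "y": y},
                   "control": {"action": action}}
        conn.sendall((json.dumps(message) + "\n").encode("utf-8"))

The service-device variant differs only in the receiver: the same message is sent to the service device, which relays the control to the large-screen device.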
In one possible implementation, before the mobile terminal controls the large-screen device to make the first response at the first position, the method further includes: the mobile terminal receives a first operation of the user; and the controlling of the large-screen device to make the first response at the first position includes: in response to the first operation, the mobile terminal controls the large-screen device to make the first response at the first position. The mobile terminal can receive a preset user operation representing a given control instruction and control the large-screen device to make the corresponding response; this supports various controls over the large-screen device, is convenient to operate, and improves the user experience.
In one possible implementation, the first response includes at least any one of the following responses: switching the laser pointer to a laser brush, turning a page, opening a file, underlining, playing, stopping playback, and adjusting brightness. The mobile terminal can control the large-screen device to make the corresponding response according to the user's operation, supporting various controls over the large-screen device.
In one possible implementation, the first operation is a user operation received while the display screen of the mobile terminal displays a human-computer interaction interface.
In one possible implementation, the first operation is an operation of the user on the human-computer interaction interface, where the human-computer interaction interface includes at least one of the following virtual buttons: a left-hand/right-hand interface switching button, a pointer/brush switching button, a 2D position realignment button, a one-key play/one-key exit button, and a personalized settings button.
In one possible implementation, the first operation further includes: a click/double-click/long-press operation by the user in a blank area of the human-computer interaction interface, a touch slide by the user in a blank area of the interface, a press of a volume key of the mobile terminal, or a squeeze of the side frame of the mobile terminal. A mapping from such preset operations to responses is sketched below.
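The operations and responses above can be tied together with a simple dispatch table; the sketch is illustrative only, since the patent lists the operations and responses but prescribes no encoding:

    # Hypothetical dispatch table from preset first operations on the
    # mobile terminal to first responses on the large-screen device; the
    # operations and responses come from the text above, the string
    # encoding does not.
    FIRST_OPERATION_TO_RESPONSE = {
        "double_click_blank": "open_file",
        "long_press_blank": "switch_pointer_to_brush",
        "touch_slide_blank": "move_pointer",
        "volume_key_press": "page_turn",
        "squeeze_side_frame": "play_or_stop",
    }

    def handle_first_operation(operation):
        # Returns the response the large-screen device should make, or
        # None for operations with no preset meaning.
        return FIRST_OPERATION_TO_RESPONSE.get(operation)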
In a second aspect, a mobile terminal is provided. The mobile terminal includes a sensor module with a plurality of motion sensors, among them at least an acceleration sensor, a gyroscope sensor, and a magnetic sensor; a communication connection is established between the mobile terminal and a large-screen device, and the mobile terminal is configured to control a laser pointer displayed on the large-screen device. The sensor module is used to collect motion data of the mobile terminal. The mobile terminal further includes: an analysis module, used to determine the spatial attitude of the mobile terminal from the motion data using a nine-axis fusion algorithm, and to determine a first position on the display screen of the large-screen device according to the spatial attitude; and a sending module, used to send control information to the large-screen device and control the large-screen device to make a first response at the first position, the first response including displaying a laser pointer at the first position. The spatial attitude identifies at least the following information: the yaw angle ψ, the pitch angle θ, and the roll angle φ of the mobile terminal relative to the ground coordinate system. The mobile terminal further includes a detection module, used to receive, before the sensor module collects the motion data through the plurality of motion sensors and while the mobile terminal is in a first posture, a first instruction of the user, the first instruction instructing the mobile terminal to perform 2D position realignment with the large-screen device. The sending module is further used to send, in response to the first instruction, first information to the large-screen device, the first information instructing the large-screen device to display the laser pointer at a preset position. The first posture is any posture of the mobile terminal.
According to the technical solution provided by the second aspect, a user can use an existing mobile terminal: by controlling the terminal's posture, the terminal calculates from its spatial attitude the corresponding position on the display screen of the large-screen device and then sends control information to the large-screen device, controlling the display screen to make responses such as displaying or moving the laser pointer at that position. The mobile terminal can calculate the position on the display screen of the large-screen device from its collected yaw, pitch, and roll angles. It can also receive a realignment instruction from the user and realign the position of the laser pointer on the display screen of the large-screen device, which improves the accuracy of the displayed pointer position and the user experience.
In one possible implementation, the determining, by the analysis module, of a first position on the display screen of the large-screen device according to the spatial attitude includes: the analysis module determines the coordinates (x_i', y_i') of the first position on the display screen of the large-screen device using the following formula:
[formula image not reproduced in the text]
where W is the width of the display screen of the large-screen device and s is the sensitivity of the mobile terminal. With this calculation method, the mobile terminal can determine, from the spatial attitude, the position on the display screen of the large-screen device to which that attitude corresponds.
In one possible implementation, the determining, by the analysis module, of a first position on the display screen of the large-screen device according to the spatial attitude includes: the analysis module takes the quaternion q_0 corresponding to the first posture as a starting point and the quaternion q_i corresponding to the spatial attitude as an end point, and calculates the relative rotation matrix C of q_i relative to q_0 when the mobile terminal is transformed from the first posture to the spatial attitude; the analysis module determines a second attitude according to the initial yaw angle, the initial pitch angle, and the relative rotation matrix C; and the mobile terminal determines the first position according to the second attitude. The initial yaw angle is the yaw angle of the mobile terminal relative to the ground coordinate system when it is in the first posture, and the initial pitch angle is its pitch angle relative to the ground coordinate system in the first posture. By calculating the quaternion corresponding to the attitude transformation, computing the transformed spatial attitude from that quaternion, and then realigning, the mobile terminal can improve the accuracy of the laser pointer's displayed position and the user experience.
In one possible implementation, the sending, by the sending module, of the control information includes: the sending module sends the control information to the service device, and the control information is used to instruct the large-screen device to make the first response at the first position under the control of the service device. The mobile terminal can thus make the large-screen device respond by controlling the service device. This suits different application scenarios, is convenient to use, and improves the user experience.
In one possible implementation, the mobile terminal further includes a detection module, used to receive a first operation of the user before the sending module sends the control information to the large-screen device; the sending, by the sending module, of the control information to the large-screen device includes: in response to the first operation, the sending module sends the control information to the large-screen device. The mobile terminal can receive a preset user operation representing a given control instruction and control the large-screen device to make the corresponding response; this supports various controls over the large-screen device, is convenient to operate, and improves the user experience.
In one possible implementation, the first response includes at least any one of the following responses: switching the laser pointer to a laser brush, turning a page, opening a file, underlining, playing, stopping playback, and adjusting brightness.
In one possible implementation, the first operation is a user operation received while the display screen of the mobile terminal displays a human-computer interaction interface.
In one possible implementation, the first operation is an operation of the user on the human-computer interaction interface, where the human-computer interaction interface includes at least one of the following virtual buttons: a left-hand/right-hand interface switching button, a pointer/brush switching button, a 2D position realignment button, a one-key play/one-key exit button, and a personalized settings button.
In one possible implementation, the first operation further includes: a click/double-click/long-press operation by the user in a blank area of the human-computer interaction interface, a touch slide by the user in a blank area of the interface, a press of a volume key of the mobile terminal, or a squeeze of the side frame of the mobile terminal.
In a third aspect, a mobile terminal is provided, including: a processor; a memory for storing computer program code, the computer program code including instructions; and a radio frequency unit for transmitting and receiving radio signals. When the processor executes the one or more computer programs stored in the memory, the mobile terminal performs the method of controlling the display of a large-screen device in any of the possible implementations of the first aspect.
In a fourth aspect, a first system is provided, including a mobile terminal, where the mobile terminal is used to control a large-screen device and implements the method of controlling the display of a large-screen device in any one of the possible implementations of the first aspect.
In one possible implementation, the first system further includes: and the service device is used for realizing the method for controlling the display of the large-screen device in any one possible implementation manner in the first aspect.
In a fifth aspect, a chip system is provided, including a processor, a memory, and instructions stored in the memory; when executed by the processor, the instructions implement the method of controlling a large-screen device display in any one of the possible implementations of the first aspect. The chip system may consist of a single chip, or may include a chip together with other discrete devices.
In a sixth aspect, a computer-readable storage medium is provided, which stores computer-executable instructions that, when executed by a processor, implement a method for controlling a display of a large-screen device according to any one of the possible implementations of the first aspect.
In a seventh aspect, a computer program product is provided which, when run on a computer, causes the computer to perform the method of controlling the display of a large-screen device in any one of the possible implementations of the first aspect. For example, the computer may be at least one storage node.
Drawings
Fig. 1A is a first application scenario example of a method for controlling display of a television through a mobile phone according to an embodiment of the present application;
fig. 1B is an application scenario example of a method for controlling a display of a laptop computer through a mobile phone according to an embodiment of the present application;
fig. 2 is a second application scenario example of the method for controlling display of a television through a mobile phone according to the embodiment of the present application;
fig. 3 is an application scenario example of a method for controlling a display of a projection device according to an embodiment of the present application;
fig. 4 is a schematic diagram of a hardware structure of a mobile phone according to an embodiment of the present disclosure;
fig. 5 is a schematic view of a spatial posture of a mobile phone according to an embodiment of the present application;
fig. 6 is a flowchart of a method for controlling display of a large-screen device through a mobile terminal according to an embodiment of the present application;
FIG. 7 is a schematic view of a relative position between a mobile phone and a portable computer according to an embodiment of the present disclosure;
fig. 8 is a schematic diagram illustrating a first position determination method according to an embodiment of the present application;
fig. 9 is a flowchart of a method for controlling a display of a large-screen device according to an embodiment of the present application;
FIG. 10 is an exemplary human-computer interface provided in an embodiment of the present application;
fig. 11 is a flowchart of a method for controlling display of a large-screen device according to an embodiment of the present application;
fig. 12 is a schematic structural diagram of a mobile terminal according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of another mobile terminal according to an embodiment of the present application.
Detailed Description
The embodiment of the application provides a method, an electronic device, and a system for controlling the display of a large-screen device. The electronic device is a mobile terminal or a large-screen device, where the large-screen device includes a display screen. For example, the large-screen device may be a television, a personal computer (PC), a tablet, a netbook, a projection device, or the like. The mobile terminal can directly control the content displayed on the display screen of the large-screen device and its display form. For example, where the large-screen device in the embodiment of the application is a projection device, such as a projector with a curtain, the content displayed on the display screen of the mobile terminal is projected onto the curtain through the projector, and the mobile terminal can directly control the projector to control the content it projects onto the curtain and its display form.
Alternatively, the mobile terminal can control the displayed content and its display form by controlling a service device. For example, a projector is connected to a notebook computer to project the interface displayed on the notebook computer's display screen onto a curtain; the mobile terminal can then control the content and display form that the projector projects onto the curtain by controlling the notebook computer.
It should be noted that the large-screen device in the embodiment of the present application may also be other devices in a human-computer interaction scene, for example, a motion sensing game machine. The embodiment of the application does not limit the specific category and form of the large-screen device.
The mobile terminal in the embodiment of the application may be a handheld device, such as a smartphone, a tablet computer, or a palmtop computer, but may also be a wearable device (e.g., a smartwatch), a portable multimedia player (PMP), a dedicated media player, an augmented reality (AR)/virtual reality (VR) device, or another type of electronic device.
The embodiment of the application discloses a method for controlling the large-screen device, in which the mobile terminal controls the large-screen device either directly or through another device (such as a service device).
Wherein the mobile terminal and the large-screen device establish communication connection. The mobile terminal can control the large-screen device through the communication connection. The communication connection may be a wired connection or a wireless connection.
For example, the mobile terminal, the large-screen device, the service device, and so on may transmit information between one another via wireless transceivers in accordance with a wireless transmission protocol. The transmitted information includes, but is not limited to, the content data to be displayed and control instructions. The wireless transmission protocol may include, but is not limited to, the Bluetooth (BT) transmission protocol or the Wireless Fidelity (WiFi) transmission protocol. The wireless transceiver includes, but is not limited to, a Bluetooth or WiFi transceiver, and information transmission between the mobile terminal and the large-screen device is realized through wireless pairing. Alternatively, the mobile terminal, the large-screen device, and the service device may be connected by wire, for example through a data transmission line, to realize information transmission. The following embodiments take information transmission between the mobile terminal, the large-screen device, and the service device over a wireless transmission protocol as an example; a connection sketch follows.
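As a minimal sketch of such a communication connection, assuming plain TCP sockets over the shared WiFi network (the port number and the division of roles are our assumptions; a real product would follow its pairing protocol as described above):

    import socket

    LARGE_SCREEN_PORT = 52111  # hypothetical port, not from the patent

    def large_screen_listen():
        # Runs on the large-screen (or service) device: accept one
        # connection from the mobile terminal over the shared network.
        srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        srv.bind(("0.0.0.0", LARGE_SCREEN_PORT))
        srv.listen(1)
        conn, _addr = srv.accept()
        return conn

    def mobile_terminal_connect(host):
        # Runs on the mobile terminal: establish the communication
        # connection to the large-screen device at the given address.
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.connect((host, LARGE_SCREEN_PORT))
        return sock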
It should be noted that the embodiment of the application does not limit the specific application scenario of the human-computer interaction. As described in the background, the scenario may be a business exhibition, a seminar, a large roadshow, or a new-product launch. It may also be a motion sensing game scene, such as "watermelon cutting" or "shooting", or a scene such as television teaching or a video conference. As described above, the mobile terminal may control the display of the large-screen device directly or by controlling another device; the application does not limit the specific scene.
Fig. 1A shows a first example application scenario of the method for controlling the display of a television through a mobile phone according to the embodiment of the application. As shown in fig. 1A, the mobile terminal is a mobile phone 100 and the large-screen device is a television 300. The mobile phone 100 can directly project the interface displayed on its display screen to the display screen of the television 300; that is, the display screen of the television 300 can synchronously display the content on the display screen of the mobile phone 100. For example, suppose mobile office software installed on the mobile phone 100 displays a PPT on the display screen of the television 300 through a technology such as "WiFi sharing". The user controls the position of the laser pointer on the display screen of the television 300 through the mobile phone 100 and controls the television 300 to make the corresponding display or effect at that position, such as color marking, line drawing, and check marks.
Alternatively, the mobile phone 100 may synchronously display the content on its display screen on the display screen of a laptop computer (also referred to as a "notebook computer") through technologies such as "WiFi sharing".
As shown in fig. 1B, the mobile terminal is a mobile phone 100 and the large-screen device is a laptop computer (also referred to as a "notebook computer") 200. The laptop computer 200 may serve only as a display device, synchronously displaying the content on the display screen of the mobile phone 100. In a possible embodiment, the laptop computer 200 can also serve as a playing device to play its own content. For example, a motion sensing game application (APP), "watermelon cutting", is installed on the laptop computer 200, and the user uses the mobile phone 100 as a motion sensing mouse. The mobile phone 100 and the laptop computer 200 are both connected to WiFi. The process of the user "cutting a watermelon" can be as follows: the "watermelon cutting" game interface of the laptop computer 200 displays the laser pointer at an initial position (e.g., the centre of the game interface). The user holds the mobile phone 100 in the right hand and slides it in any direction in the air, and the laser pointer slides along the corresponding trajectory. The user moves the laser pointer near watermelon A and continues sliding the right hand through the air in a cutting motion; correspondingly, watermelon A in the game interface displays the simulated visual effect of being cut.
Alternatively, the mobile phone 100 may control the display on the screen of the projection device by operating the portable computer 200.
Fig. 2 shows a second example application scenario of the method for controlling the display of a television through a mobile phone according to the embodiment of the application. As shown in fig. 2, the television 300 is used only as a display device to display the interface shown on the display screen of the laptop computer 200; in this case the television 300 can also be understood as a projection device. For example, in a small conference of 3-4 people, the speaker shows the PPT presentation interface on the display screen of his laptop computer 200 on the display screen of the television 300, and operates the laptop computer 200 through the mobile phone 100 to control the display effects of the PPT presentation interface displayed on the television 300, such as color marking, line drawing, and check marks.
Alternatively, as shown in fig. 3, the projection apparatus includes a projector 400 and a curtain 500. For example, in a large conference of hundreds of people, or at a new-product launch, the presenter projects the PPT presentation interface on the display screen of his laptop computer 200 onto the curtain 500 through the projector 400. The presenter operates the laptop computer 200 through the mobile phone 100 so that, on the display interface projected onto the curtain 500 by the projector 400, the laser pointer's position follows the posture changes of the mobile phone 100. In addition, the mobile phone 100 controls the laptop computer 200 so that the laptop computer 200, through the projector 400, makes the corresponding display, effect, and so on at the corresponding position of the interface projected onto the curtain 500.
In the example of fig. 1B and 2, both the cell phone 100 and the laptop computer 200 may be connected to WiFi. The portable computer 200 and the television 300 in fig. 2 and the projector 400 and the portable computer 200 in fig. 3 may be connected by a data transmission line 600 (e.g., a High Definition Multimedia Interface (HDMI) line). Alternatively, the television 300 and the projector 400 may be connected to WiFi, and receive the displayed content data and control instructions and the like from the laptop computer 200 through WiFi.
Based on the examples in fig. 1A, 1B, 2, and 3, in the method for controlling a large-screen device through a mobile terminal according to the embodiment of the application, the mobile terminal, such as the mobile phone 100, may have the basic functions of a laser pen, such as indication and page turning. For example, the service device or the large-screen device may determine the display position of the laser pointer by mapping the three-dimensional (3D, also called spatial) position of the mobile phone 100 held by the user to a two-dimensional (2D, also called planar) position on the display screen of the large-screen device (e.g., the display screen of the television 300 in fig. 1A or fig. 2, or the curtain 500 in fig. 3), so as to point the audience to the content being discussed; and the mobile phone 100 can send a "page turning" instruction to the service device or the large-screen device. The mobile phone 100 may also function as a wireless mouse. For example, the user may send instructions such as "highlight", "double-click to open", "play", or "underline" to the service device or the large-screen device through the mobile phone 100, so as to control the display corresponding to the laser pointer on the display screen of the large-screen device.
The structure of the mobile terminal in the present application is described below by taking a mobile phone as an example. As shown in fig. 4, the mobile phone 100 may include a processor 410, an external memory interface 420, an internal memory 421, a Universal Serial Bus (USB) interface 430, a charging management module 440, a power management module 441, a battery 442, an antenna 1, an antenna 2, a mobile communication module 450, a wireless communication module 460, an audio module 470, a speaker 470A, a receiver 470B, a microphone 470C, a sensor module 480, a button 490, a motor 491, an indicator 492, a camera 493, a display 494, a Subscriber Identification Module (SIM) card interface 495, and the like. The sensor module 480 may include a pressure sensor 480A, a gyro sensor 480B, a magnetic sensor 480C, an acceleration sensor 480D, a distance sensor 480E, a fingerprint sensor 480F, a touch sensor 480G, an ambient light sensor 480H, and the like.
It is to be understood that the illustrated structure of the embodiment of the present invention does not specifically limit the mobile phone 100. In other embodiments of the present application, the handset 100 may include more or fewer components than shown, or some components may be combined, some components may be separated, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 410 may include one or more processing units. For example: the processor 410 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a flight controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), among others. The different processing units may be separate devices or may be integrated into one or more processors.
A memory may also be provided in processor 410 for storing instructions and data. In some embodiments, the memory in the processor 410 is a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 410. If the processor 410 needs to use the instruction or data again, it can be called directly from the memory. Avoiding repeated accesses reduces the latency of the processor 410, thereby increasing the efficiency of the system.
In some embodiments, processor 410 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc.
The I2C interface is a bi-directional synchronous serial bus that includes a serial data line (SDA) and a Serial Clock Line (SCL). In some embodiments, processor 410 may include multiple sets of I2C buses. The processor 410 may be coupled to the touch sensor 480G, the charger, the flash, the camera 493, etc. through different I2C bus interfaces, wherein the number of the camera 493 is at least one; the camera 493 may also be a 360 ° rotatable camera. For example: the processor 410 may be coupled to the touch sensor 480G through an I2C interface, such that the processor 410 and the touch sensor 480G communicate through an I2C bus interface to implement the touch function of the cell phone 100.
The USB interface 430 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 430 may be used to connect a charger to charge the mobile phone 100, and may also be used to transmit data between the mobile phone 100 and peripheral devices. But also for connecting other electronic devices, such as AR devices, etc.
It should be understood that the connection relationship between the modules according to the embodiment of the present invention is only an exemplary illustration, and does not limit the structure of the mobile phone 100.
The charging management module 440 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 440 may receive charging input from a wired charger via the USB interface 430. In some wireless charging embodiments, the charging management module 440 may receive a wireless charging input through a wireless charging coil of the cell phone 100. The charging management module 440 may also supply power to the electronic device through the power management module 441 while charging the battery 442.
The power management module 441 is used to connect the battery 442, the charging management module 440 and the processor 410. The power management module 441 receives input from the battery 442 and/or the charging management module 440 and provides power to the processor 410, the internal memory 421, the display screen 494, the camera 493, the wireless communication module 460, and the like. The power management module 441 may also be used to monitor parameters such as battery capacity, battery cycle number, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 441 may be disposed in the processor 410. In other embodiments, the power management module 441 and the charging management module 440 may be disposed in the same device.
The wireless communication function of the mobile phone 100 can be implemented by the antenna 1, the antenna 2, the mobile communication module 450, the wireless communication module 460, the modem processor, the baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the handset 100 may be used to cover a single or multiple communications bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 450 may provide a solution including 2G/3G/4G/5G wireless communication applied to the handset 100. The mobile communication module 450 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 450 may receive the electromagnetic wave from the antenna 1, and filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 450 can also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 450 may be disposed in the processor 410. In some embodiments, at least some of the functional blocks of the mobile communication module 450 may be disposed in the same device as at least some of the blocks of the processor 410.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs sound signals through an audio device (not limited to the speaker 470A, the receiver 470B, etc.) or displays images or video through the display screen 494. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be separate from the processor 410, and may be located in the same device as the mobile communication module 450 or other functional modules.
The wireless communication module 460 may provide a solution for wireless communication applied to the mobile phone 100, including Wireless Local Area Networks (WLANs) (e.g., WiFi networks), bluetooth BT, Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), and the like. The wireless communication module 460 may be one or more devices integrating at least one communication processing module. The wireless communication module 460 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering on the electromagnetic wave signal, and transmits the processed signal to the processor 410. The wireless communication module 460 may also receive a signal to be transmitted from the processor 410, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, the antenna 1 of the handset 100 is coupled to the mobile communication module 450 and the antenna 2 is coupled to the wireless communication module 460, such that the handset 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technology may include global system for mobile communications (GSM), general packet radio service (GPRS), code division multiple access (CDMA), wideband code division multiple access (WCDMA), time-division code division multiple access (TD-SCDMA), long term evolution (LTE), BT, GNSS, WLAN, NFC, FM, and/or IR technologies, etc. The GNSS may include a global positioning system (GPS), a global navigation satellite system (GLONASS), a beidou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a satellite based augmentation system (SBAS).
The mobile phone 100 implements a display function through the GPU, the display screen 494, and the application processor. The GPU is an image processing microprocessor connected to a display screen 494 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 410 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 494 is used to display images, videos, and the like. The display screen 494 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the cell phone 100 may include 1 or N display screens 494, N being a positive integer greater than 1.
The mobile phone 100 may implement a shooting function through the ISP, the camera 493, the video codec, the GPU, the display screen 494, the application processor, and the like.
The ISP is used to process the data fed back by the camera 493. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 493.
The camera 493 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the handset 100 may include 1 or N cameras 493, where N is a positive integer greater than 1.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the handset 100 is in frequency bin selection, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The handset 100 may support one or more video codecs. Thus, the handset 100 can play or record video in a variety of encoding formats, such as moving picture experts group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and the like.
The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can realize applications such as intelligent recognition of the mobile phone 100, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 420 may be used to connect an external memory card, such as a Micro SD card, to extend the storage capability of the mobile phone 100. The external memory card communicates with the processor 410 through the external memory interface 420 to implement data storage functions. For example, files such as music, video, etc. are saved in an external memory card.
The internal memory 421 may be used to store computer-executable program code, including instructions. The internal memory 421 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The data storage area may store data (e.g., audio data, a phonebook, etc.) created during use of the handset 100, and the like. In addition, the internal memory 421 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. The processor 410 executes various functional applications of the cellular phone 100 and data processing by executing instructions stored in the internal memory 421 and/or instructions stored in a memory provided in the processor.
The handset 100 may implement audio functions through the audio module 470, the speaker 470A, the receiver 470B, the microphone 470C, the application processor, and the like. Such as music playing, recording, etc.
The audio module 470 is used to convert digital audio information into an analog audio signal output and also used to convert an analog audio input into a digital audio signal. The audio module 470 may also be used to encode and decode an audio signal. In some embodiments, the audio module 470 may be disposed in the processor 410, or some functional modules of the audio module 470 may be disposed in the processor 410.
The speaker 470A, also called a "horn", is used to convert the audio electrical signals into sound signals. The handset 100 may play voice or notify etc. through the speaker 470A.
The receiver 470B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the cellular phone 100 receives a call or voice information, it can receive voice by placing the receiver 470B close to the ear of the person.
The microphone 470C, also called a "mouthpiece", is used to convert sound signals into electrical signals. When making a call or transmitting voice information, the user can input a sound signal into the microphone 470C by speaking with the mouth close to it. The handset 100 may be provided with at least one microphone 470C. In other embodiments, the handset 100 may be provided with two microphones 470C, which, in addition to collecting sound signals, can implement a noise reduction function. In other embodiments, the mobile phone 100 may further include three, four, or more microphones 470C to collect sound signals, reduce noise, identify sound sources, and implement a directional recording function.
The pressure sensor 480A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 480A may determine that the user presses on the bezel of the mobile phone 100, so as to facilitate the mobile phone 100 to respond to the control command corresponding to the operation.
The gyro sensor 480B may be used to determine the attitude during movement of the cell phone 100. In some embodiments, the angular velocity of the cell phone 100 about three axes (i.e., x, y, and z axes) may be determined by the gyro sensor 480B.
The magnetic sensor 480C includes a hall sensor. In some embodiments, physical parameters such as current, position, orientation, etc. may be measured by sensing magnetic field strength with the magnetic sensor 480C.
The acceleration sensor 480D can detect the magnitude of acceleration of the cellular phone 100 in various directions (generally along three axes), and can detect the magnitude and direction of gravity when the handset 100 is stationary. It can also be used to recognize the attitude of the mobile phone 100, and is applied to camera viewfinder switching and the like.
The distance sensor 480E is used to measure distance. The handset 100 may measure distance by infrared or laser. In some embodiments, in a photographing scenario, the cell phone 100 may use the distance sensor 480E to measure distance to achieve fast focusing.
The ambient light sensor 480H is used to sense the ambient light level. The cell phone 100 can adaptively adjust the brightness of the display screen 494 according to the perceived ambient light level. The ambient light sensor 480H may also be used to automatically adjust the white balance when taking a picture. Ambient light sensor 480H may also work in conjunction with a proximity light sensor. In some embodiments, the cell phone 100 may instruct the large-screen device to adjust the brightness of the display screen of the large-screen device according to the ambient light level sensed by the ambient light sensor 480H.
The fingerprint sensor 480F is used to capture a fingerprint. Any type of sensing technology may be employed including, but not limited to, optical, capacitive, piezoelectric, or ultrasonic sensing technologies, etc. The mobile phone 100 can utilize the collected fingerprint characteristics to realize fingerprint unlocking, access application lock, fingerprint photographing and the like.
The touch sensor 480G is also referred to as a "touch device". The touch sensor 480G (also referred to as a touch panel) may be disposed on the display screen 494, and the touch sensor 480G and the display screen 494 form a touch screen, also referred to as a "touch screen". The touch sensor 480G is used to detect a touch operation applied thereto or therearound. The touch sensor can communicate the detected touch operation to the application processor to determine a touch event type. Visual output associated with the touch operation may be provided through the display screen 494. In other embodiments, the touch sensor 480G can be disposed on the surface of the mobile phone 100 at a different position than the display screen 494.
The keys 490 include a power-on key, a volume key, and the like. The keys 490 may be mechanical keys or touch keys. The cellular phone 100 may receive key inputs and generate key signal inputs related to user settings and function control of the cellular phone 100.

The motor 491 may generate a vibration indication. The motor 491 may be used both for incoming-call vibration prompts and for touch vibration feedback. For example, touch operations applied to different applications (e.g., photographing, audio playing) may correspond to different vibration feedback effects, and touch operations applied to different areas of the display screen 494 may also correspond to different vibration feedback effects. Different application scenarios (such as time reminders, received messages, alarm clocks, and games) can also correspond to different vibration feedback effects. The touch vibration feedback effect may also support customization.
The indicator 492 may be an indicator light, and may be used to indicate a charging status, a change in charge level, or a message, a missed call, a notification, etc.
The SIM card interface 495 is used to connect a SIM card. The SIM card can be attached to or detached from the cellular phone 100 by being inserted into or pulled out of the SIM card interface 495. The handset 100 may support 1 or N SIM card interfaces, N being a positive integer greater than 1. The SIM card interface 495 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards can be inserted into the same SIM card interface 495 at the same time; the types of the cards may be the same or different. The SIM card interface 495 may also be compatible with different types of SIM cards, and with an external memory card. The mobile phone 100 interacts with the network through the SIM card to implement functions such as calling and data communication. In some embodiments, the handset 100 employs an eSIM, namely an embedded SIM card, which is embedded in the mobile phone 100 and cannot be separated from it.
The method for controlling the display of the large-screen device provided by the embodiment of the present application is specifically described below with reference to the mobile phone in fig. 4. The methods in the following embodiments may be implemented in a mobile terminal having the above hardware structure or a mobile terminal having a similar structure.
The basic principle of the method for controlling the display of the large-screen device in the embodiment of the application is as follows: the real-time spatial attitude of the mobile phone is obtained through a nine-axis fusion algorithm by utilizing data acquired by an acceleration sensor, a gyroscope sensor and a magnetic sensor, the real-time spatial attitude is mapped to a 2D position on a display screen of large-screen equipment, and the display position of a laser pointer is determined. And controlling the large-screen device to make a certain response at the determined display position of the laser pointer.
The nine-axis fusion algorithm is a method for acquiring the attitude of an object through an acceleration sensor, a gyroscope sensor and a magnetic sensor.
In one possible implementation, an IMU may be integrated in the handset 100. The mobile phone 100 may acquire the attitude by fusing the data of the acceleration sensor, the gyroscope sensor, and the magnetic sensor in an inertial measurement unit (IMU). The IMU is a unit that measures and reports velocity, direction, and gravity through this combination of sensors (the acceleration sensor, the gyroscope sensor, and the magnetic sensor).
In the method for controlling the display of the large-screen device according to the embodiment of the application, the acceleration sensor works by determining the translation direction and translation speed of the mobile phone 100 from the force on the measuring component along each axis; the measurement is expressed as directions (the X, Y, and Z axes) and the magnitude of acceleration in each direction. The gyroscope sensor works by measuring the angle between the vertical axis of the gyroscope rotor in a three-dimensional coordinate system (comprising the X, Y, and Z axes) and the mobile phone 100, calculating the angular velocity, and judging the motion attitude of the object in three-dimensional space from this angle and angular velocity. The magnetic sensor works like a compass and can measure the angles between the mobile phone 100 and the four directions south, east, west, and north. That is, the acceleration sensor detects "how far the mobile phone has moved along the X axis", the gyroscope sensor detects "whether the mobile phone 100 has turned", and the magnetic sensor detects "the mobile phone 100 is moving westward". Since the acceleration sensor, the gyroscope sensor, and the magnetic sensor can each measure motion data along 3 axes, they are generally referred to together as a "nine-axis sensor".
The nine-axis fusion algorithm in the embodiment of the present application makes full use of the features of the above-mentioned "nine-axis sensor" and fuses their data to calculate a more accurate real-time spatial attitude of the mobile phone 100, that is, the spatial coordinates (x_i, y_i, z_i) of the mobile phone 100 in a geographic coordinate system and its attitude angles. The attitude angles reflect the attitude of the handset 100 relative to the ground and include the yaw angle (yaw), pitch angle (pitch), and roll angle (roll) of the handset 100 with respect to a ground coordinate system.
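The embodiment does not spell out the internals of the fusion step, but its effect can be illustrated with a minimal complementary-filter sketch in Python: the accelerometer supplies pitch and roll from the direction of gravity, the tilt-compensated magnetometer supplies yaw, and the gyroscope smooths short-term motion. The body-frame convention, the filter coefficient alpha, and all function names here are illustrative assumptions rather than the patented algorithm (the yaw, pitch, and roll conventions follow fig. 5 below).

```python
import numpy as np

def accel_mag_angles(acc, mag):
    """Attitude from gravity and the magnetic field alone.
    Assumed body frame: x toward the phone head, y to the right, z down;
    a phone lying flat and face-up reads acc ~ (0, 0, +1 g)."""
    acc = np.asarray(acc, dtype=float)
    mag = np.asarray(mag, dtype=float)
    ax, ay, az = acc / np.linalg.norm(acc)
    roll = np.arctan2(ay, az)                    # rotation about x_b
    pitch = np.arctan2(-ax, np.hypot(ay, az))    # rotation about y_b
    # Tilt-compensate the magnetometer, then take the heading.
    mx, my, mz = mag / np.linalg.norm(mag)
    bx = mx * np.cos(pitch) + np.sin(pitch) * (my * np.sin(roll) + mz * np.cos(roll))
    by = my * np.cos(roll) - mz * np.sin(roll)
    yaw = np.arctan2(-by, bx)                    # rotation about z_b
    return np.array([yaw, pitch, roll])

def fuse_step(angles, gyro, acc, mag, dt, alpha=0.98):
    """One complementary-filter step. `angles` and `gyro` are ordered
    (yaw, pitch, roll) and (yaw rate, pitch rate, roll rate), in radians.
    Integrating body rates straight into Euler angles is a small-angle
    simplification; a production filter would fuse quaternions instead."""
    predicted = np.asarray(angles) + np.asarray(gyro) * dt  # short-term: gyro
    reference = accel_mag_angles(acc, mag)                  # long-term: accel/mag
    return alpha * predicted + (1 - alpha) * reference
```

The three sources complement each other: the gyroscope is accurate over short intervals but drifts, while the accelerometer/magnetometer solution is drift-free but noisy; the weighted blend keeps the strengths of both.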
Fig. 5 is a schematic view of the spatial attitude of a mobile phone according to an embodiment of the present disclosure. As shown in fig. 5, the x_b, y_b, and z_b axes together with the origin O form the coordinate system of the handset 100, and the x_g, y_g, and z_g axes together with the origin O form the ground coordinate system. O is the centroid of the handset 100. The x_b axis lies in the plane of symmetry of the handset 100 and points toward the head of the handset, parallel to the body axis of the handset 100; the y_b axis is perpendicular to the symmetry plane of the handset 100 and points to the right side of the handset; the z_b axis is perpendicular to the x_b axis and points below the body of the handset 100. The x_g axis lies in the horizontal plane and points in a chosen direction; the z_g axis is perpendicular to the ground and points toward the center of the earth; the y_g axis is perpendicular to the x_g axis within the horizontal plane, its direction determined by the right-hand rule.
As shown in fig. 5, the yaw angle ψ is the angle between the projection of the handset's x_b axis on the horizontal plane and the x_g axis of the ground coordinate system, positive when the front end of the mobile phone 100 deflects to the right. The pitch angle θ is the angle between the handset's x_b axis and the ground plane (horizontal plane), positive when the front end of the handset deflects upward. The roll angle φ is the angle between the handset's z_b axis and the vertical plane containing its x_b axis, positive when the mobile phone 100 rolls to the right.
Taking the scenario shown in fig. 1B as an example, the method for controlling the display of the large-screen device in the embodiment of the present application may be implemented through S601-S605 in fig. 6.
It is to be understood that, in the embodiment of the present application, the mobile phone 100 may perform some or all of the steps in the embodiment of the present application. These steps or operations are merely examples, and the embodiment of the present application may also perform other operations or variations of these operations. Further, the steps may be performed in an order different from that presented in the embodiments of the application, and possibly not all of the operations in the embodiments of the application need to be performed.
S601, the mobile phone 100 collects the motion data of the mobile phone 100 through the acceleration sensor, the gyroscope sensor and the magnetic sensor.
And S602, determining the spatial attitude of the mobile phone 100 by the mobile phone 100 according to the collected motion data by adopting a nine-axis fusion algorithm.
As described above, the spatial attitude of the handset 100 comprises the spatial coordinates (x_i, y_i, z_i) of the handset 100 in a geographic coordinate system and the yaw angle ψ, pitch angle θ, and roll angle φ of the handset 100 relative to a ground coordinate system.
In one possible implementation, determining the spatial attitude of the mobile phone 100 by using a nine-axis fusion algorithm may include: firstly, the motion data of the mobile phone 100 is calibrated, and then the spatial attitude of the mobile phone under the current motion data is calculated through a fusion algorithm according to the calibrated motion data.
S603, the mobile phone 100 determines the first position according to the spatial posture of the mobile phone 100.
The first position is the position at which the spatial attitude of the mobile phone 100 is mapped onto the display screen of the portable computer 200. The determined first position is the display position of the laser pointer on the display screen of the laptop 200.
Under the condition that the control of the mobile phone 100 is accurate, the display position of the laser pointer on the display screen of the portable computer 200 is the point P in 7A of FIG. 7, whose two-dimensional coordinates on the display screen of the portable computer 200 are (x_i', y_i'). The origin of the coordinates is the upper-left corner of the display screen of the portable computer 200; the x axis runs from left to right along the upper edge of the display screen, and the y axis runs from top to bottom along its left edge.
The mobile phone 100 may determine R from the display width W of the portable computer 200 and the sensitivity s of the mobile phone 100: when the user rotates the mobile phone 100 through an angle of s (unit: degrees), the laser pointer correspondingly moves a distance W on the display screen. In other words, W is the range over which the laser pointer can be steered, and s is the corresponding steering angle of the mobile phone 100.

For example, as shown in fig. 8, when the distance L from the person to the display screen equals R = (W/2) × cot(s/2), s is assumed to be a value comfortable for the person to operate. When L is greater than (W/2) × cot(s/2), rotating the mobile phone 100 through the same angle makes the laser pointer appear to move farther: the controllable space of the mobile phone 100 is smaller (the edge is reached more easily), and the sensitivity perceived by the person is relatively higher. Similarly, when L is less than (W/2) × cot(s/2), rotating the mobile phone 100 through the same angle makes the laser pointer appear to move less far: the controllable space of the mobile phone 100 is larger, and the sensitivity perceived by the person is relatively lower.

In some embodiments, the sensitivity s of the handset 100 is adjustable. Illustratively, when the distance L from the person to the display screen is fixed and L < (W/2) × cot(s/2), s can be decreased, so that the laser pointer moves the same distance for a smaller rotation of the handset. Alternatively, when L is fixed and L > (W/2) × cot(s/2), s can be increased, so that the laser pointer moves a greater distance for the same rotation of the mobile phone 100.
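The text fixes only the direction of this adjustment (decrease s when L < (W/2) × cot(s/2), increase it when L is greater), not a formula. A minimal sketch, assuming simple proportional scaling of s with L:

```python
import math

def adjusted_sensitivity(s_deg, L, W):
    """Scale the sensitivity s with the user's distance L (same units as W).
    Proportional scaling is an assumption; the text only requires s to
    shrink when L < R and grow when L > R, with R = (W/2)*cot(s/2)."""
    R = (W / 2) / math.tan(math.radians(s_deg) / 2)
    return s_deg * (L / R)   # L < R gives a smaller s, L > R a larger one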
Taking L = R as an example, as shown in 7B and 7C of fig. 7, dx = R × tan ψ and dy = R × tan θ. Since the laser pointer starts from the center (W/2, H/2) of the display screen, where H is the height of the display screen, and R = (W/2) × cot(s/2), it can be calculated that:

formula I: x_i' = W/2 + (W/2) × cot(s/2) × tan ψ

formula II: y_i' = H/2 - (W/2) × cot(s/2) × tan θ
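Assuming the reconstruction of formulas I and II above, with the pointer starting at the screen center and y growing downward, the mapping from attitude to pixel position takes only a few lines; the function name and the clamping at the screen edge are illustrative choices:

```python
import math

def pointer_position(yaw, pitch, W, H, s_deg):
    """Map the phone's yaw/pitch (radians) to a 2D point on the screen.
    W, H: display width and height in pixels; s_deg: sensitivity, i.e. the
    rotation in degrees that sweeps the pointer across the full width W.
    The origin is the top-left corner; yaw = pitch = 0 maps to the center."""
    R = (W / 2) / math.tan(math.radians(s_deg) / 2)   # R = (W/2)*cot(s/2)
    x = W / 2 + R * math.tan(yaw)     # rightward deflection moves the pointer right
    y = H / 2 - R * math.tan(pitch)   # upward deflection moves it up (y grows downward)
    # Clamp to the visible screen.
    return min(max(x, 0), W - 1), min(max(y, 0), H - 1)
```

For example, pointer_position(0.0, 0.0, 1920, 1080, 30) returns the screen center (960.0, 540.0).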
S604, the mobile phone 100 sends the second message to the portable computer 200.
Wherein the second information is used for indicating the first position. The second information includes control information. The control information is used to instruct the portable computer 200 to respond at the first location.
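The embodiment specifies what the second information carries (the first position plus control information) but not a wire format. A hypothetical JSON-over-socket encoding might look as follows; every field name is invented for illustration:

```python
import json
import socket

def send_second_info(sock: socket.socket, x: float, y: float,
                     action: str = "show_pointer") -> None:
    """Send the 'second information': the first position plus control
    information. The JSON layout is an assumption, not the patent's protocol."""
    msg = {"type": "second_info",
           "position": {"x": x, "y": y},
           "action": action}
    sock.sendall((json.dumps(msg) + "\n").encode("utf-8"))

# Usage: after computing (x, y) with pointer_position(), the phone side could
# call send_second_info(conn, x, y, action="move_pointer").
```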
Alternatively, the mobile phone 100 may control the display device through a service device, as in the case where the mobile phone 100 in fig. 2 controls the display of the television 300 through the portable computer 200, or where the mobile phone 100 in fig. 3 projects the display onto the curtain through the portable computer 200 and the projection device. In that case, in S604 the mobile phone 100 sends the second message to the portable computer 200 to instruct the portable computer 200 to control the television 300 to make a first response at the first position, or to instruct the portable computer 200 to control the projector 400 to make a first response at the first position on the curtain 500.
S605, the portable computer 200 makes a first response at the first position.
In some possible embodiments, the control information may be used to instruct, but is not limited to instructing, the laptop 200 to make any of the following first responses: invoking the mouse pointer to click on display content (e.g., a video play button) at the first position, invoking the mouse pointer to double-click on display content (e.g., a folder) at the first position, invoking the mouse pointer to perform a sliding operation (e.g., dragging a video playback progress button) or turn pages, invoking the mouse pointer to draw a line at the first position, and the like.
In some possible embodiments, the control information may also be used to instruct the laptop 200 to switch the function of the mouse pointer from the laser pointer function to the laser brush function. For example, by switching the laser pointer to a laser brush, operations such as scribing, drawing, writing, and the like are completed.
In one possible implementation, the user may instruct the laptop 200 to switch the laser pointer to the laser brush by performing a preset operation on the mobile phone 100. For example: the preset operation is to slide or press a certain position on the touch screen of the mobile phone 100, and simultaneously press "+", "-" volume keys (1010 in fig. 10), press the volume keys for a long time, pinch the side frame of the mobile phone 100 for a long time, and the like.
In one possible implementation, the user may instruct the laptop 200 to switch the laser pointer to the laser brush by performing a corresponding operation on the human-computer interface of the mobile phone 100. For example: clicking a pointer/brush switching button (e.g. 1007 in fig. 10) on the man-machine interaction interface, or making a preset operation (e.g. double-click, sliding, etc.) on a touch pad (e.g. 1009 in fig. 10) of the man-machine interaction interface.
In some embodiments, the human-machine interface may be a basic service function in the mobile phone 100, and may also be an Application (APP) installed in the mobile phone 100. For example, an "air mouse" APP may be installed in the mobile phone 100, and the user may complete the above-mentioned custom setting in the APP interface. The "air mouse" APP is used to enable the mobile phone 100 to implement various functions of a conventional mouse, but does not need to work on a fixed location or a fixed platform.
In some embodiments, as shown in fig. 9, before S604, that is, before the mobile phone 100 sends the second message to the laptop 200, the method for controlling the display of the large-screen device according to the embodiment of the present application may further include:
S606, the mobile phone 100 receives a first operation of the user.
In this case, S604 in fig. 6 actually becomes: the cellular phone 100 transmits the second information to the portable computer 200 in response to the first operation, i.e., S607 in fig. 9.

The sending of the second information to the portable computer 200 by the mobile phone 100 according to the first operation may include: the mobile phone 100 determines the control instruction corresponding to the first operation, and then transmits the second information to the portable computer 200, where the second information comprises the control instruction corresponding to the first operation.
In a possible implementation manner, the correspondence between different operations and different control instructions may be set by a user in a customized manner. For example, a user can set the corresponding relationship between different operations and control instructions in a customized manner on the human-computer interaction interface.
The first operation may be an operation by the user on the human-computer interaction interface of the mobile phone 100, for example: clicking a virtual button on the human-computer interaction interface, a click/double-click/long-press operation in a blank area of the human-computer interaction interface, or a touch sliding operation. The mobile phone 100 may determine the control instruction corresponding to the specific first operation according to the correspondence between different operations and different control instructions set by the user. The above-described custom setting can be made, for example, by clicking 1008 in fig. 10.
Alternatively, the first operation may also be an operation in which the user presses a volume key (e.g., 1010 in fig. 10) of the cellular phone 100, or an operation in which the user pinches the side frame of the cellular phone 100. For example, the mobile phone 100 detects through the pressure sensor that a certain position of the side frame is pressed with a force greater than F0 for a duration greater than T0, and determines the corresponding control command according to the detected condition.
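Such a squeeze gesture reduces to a threshold test over recent pressure samples. A minimal sketch, assuming timestamped force readings and treating F0 and T0 as configurable thresholds:

```python
def detect_squeeze(samples, F0, T0):
    """samples: list of (timestamp_s, force) readings from the side-frame
    pressure sensor, oldest first. Returns True once the force has stayed
    above F0 continuously for longer than T0 seconds."""
    start = None
    for t, f in samples:
        if f > F0:
            if start is None:
                start = t            # squeeze began here
            if t - start > T0:
                return True
        else:
            start = None             # force dropped; reset the timer
    return False
```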
In some embodiments, the human-machine interface may include at least one virtual button of: a left/right hand interface switch button (e.g., 1006 in fig. 10), a pointer/brush switch button (e.g., 1007 in fig. 10), a one-key play/one-key exit button (e.g., 1001 in fig. 10), and a personalization setting button (e.g., 1008 in fig. 10).
Based on the human-computer interaction interface, the basic functions of a mouse can be implemented, for example: one-key play/one-key exit, sliding to turn pages, drawing lines, confirming, playing, opening files, writing, and the like. Alternatively, the related configuration of the mouse can be performed through the human-computer interaction interface, for example: the laser pointer display style is configured by clicking 1008 in fig. 10, the laser pointer manipulation sensitivity is configured by clicking 1005 in fig. 10, the interface layout is configured by clicking 1004 in fig. 10, and so on.
In some embodiments, as shown in fig. 11, before S601, when the mobile phone 100 is in the first posture, the method for controlling the display of the large-screen device of the present application may further include:
S608, in response to receiving the first instruction from the user, the mobile phone 100 sends the first information to the portable computer 200.
The first information is used for indicating the portable computer 200 to display the laser pointer at a preset position.
For example: the predetermined position is the center position of the display screen of the portable computer 200. The first information includes a 2D position realignment command instructing the portable computer 200 to control the laser pointer to be displayed at a central position of the display screen of the portable computer 200 according to the 2D position realignment command.
In one possible implementation, as shown in fig. 10, the human-computer interface may further include a 2D position realignment button (1002 in fig. 10), which the user may click on by the mobile phone 100 to instruct the laptop 200 to perform 2D position realignment.
S609, the portable computer 200 displays the laser pointer at the preset position according to the first information.
For example, after the 2D position realignment of the portable computer 200, the position of the display screen to which the mobile phone 100 currently points is the center of the display screen of the portable computer 200. The current attitude of the mobile phone 100 (i.e., the first attitude) is taken as the initial attitude, and the yaw angle and pitch angle of the initial attitude can be considered to be 0.
In this case, in the subsequent movement process of the mobile phone 100, after the attitude angle of the mobile phone 100 is determined by using the nine-axis fusion algorithm, the first position is determined according to the attitude angle of the mobile phone 100, that is, S603 can be implemented by the following steps:
step 1: quaternion q corresponding to the initial attitude of the mobile phone 100 0 As a starting point, the current attitude quaternion q i As an end point, the quaternion q is calculated when the mobile phone 100 is transformed from the initial attitude to the spatial attitude i Relative quaternion q 0 Relative rotation matrix C.
Specifically, the mobile phone 100 uses the quaternion q corresponding to the result of the current attitude nine-axis fusion algorithm 0 Quaternion q corresponding to the result of the initial attitude nine-axis fusion algorithm i And determining a rotation quaternion, and converting the rotation quaternion into a relative rotation matrix C.
Step 2: the handset 100 determines the second attitude according to the initial yaw angle, the initial pitch angle, and the relative rotation matrix C.
The initial yaw angle is a yaw angle of the mobile phone 100 relative to a ground coordinate system when the mobile phone 100 is in an initial attitude, and the initial pitch angle is a pitch angle of the mobile phone 100 relative to the ground coordinate system when the mobile phone 100 is in the initial attitude.
Step 3: the handset 100 determines the first position based on the second attitude.
By this method, the initial attitude corresponds to the center of the display screen of the portable computer 200, and the 3D spatial attitude is projected onto the 2D display screen to obtain the display position of the laser pointer, yielding laser pointer control with higher accuracy and sensitivity.
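Steps 1 to 3 can be sketched with textbook quaternion algebra. Because the initial yaw and pitch are taken as 0 after the 2D position realignment (S608-S609), the yaw and pitch extracted from the relative rotation can feed formulas I and II directly. The (w, x, y, z) component order and the ZYX Euler extraction below are conventional choices, not something the embodiment mandates:

```python
import numpy as np

def quat_conj(q):
    w, x, y, z = q
    return np.array([w, -x, -y, -z])

def quat_mul(a, b):
    """Hamilton product of two (w, x, y, z) quaternions."""
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return np.array([
        aw * bw - ax * bx - ay * by - az * bz,
        aw * bx + ax * bw + ay * bz - az * by,
        aw * by - ax * bz + ay * bw + az * bx,
        aw * bz + ax * by - ay * bx + az * bw,
    ])

def relative_yaw_pitch(q0, qi):
    """Rotation taking the initial attitude q0 to the current attitude qi,
    reduced to the yaw/pitch offsets that steer the pointer.
    Both inputs are unit quaternions in (w, x, y, z) order."""
    w, x, y, z = quat_mul(qi, quat_conj(q0))   # relative rotation quaternion
    yaw = np.arctan2(2 * (w * z + x * y), 1 - 2 * (y * y + z * z))
    pitch = np.arcsin(np.clip(2 * (w * y - x * z), -1.0, 1.0))
    return yaw, pitch
```

The relative rotation computed here plays the role of the relative rotation matrix C; converting the quaternion to a matrix first, as the text describes, yields the same yaw and pitch.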
In a possible implementation manner, a left/right-hand grip recognition algorithm can be integrated in the mobile phone 100, so that the human-computer interaction interface of the mobile phone 100 adapts its layout to the left or right hand, improving the user operation experience. The user may also switch the left/right-hand interface by clicking 1006 in fig. 10.
It is understood that the mobile terminal (e.g., the handset 100) includes corresponding hardware structures and/or software modules for performing the respective functions in order to realize the functions of any of the above-described embodiments. Those of skill in the art would readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the application can divide the functional modules of the mobile terminal, for example, each functional module can be divided corresponding to each function, or two or more functions can be integrated into one processing module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. It should be noted that, in the embodiment of the present application, the division of the module is schematic, and is only one logic function division, and there may be another division manner in actual implementation.
For example, in a case that each functional module is divided in an integrated manner, as shown in fig. 12, a schematic structural diagram of a mobile terminal provided in the embodiment of the present application is shown. The mobile terminal may include a sensor module 1210, an analysis module 1220, and a transmission module 1230.
The sensor module 1210 may be configured to support the mobile terminal to perform the step S601, the analysis module 1220 may be configured to support the mobile terminal to perform the steps S602 and S603, and the sending module 1230 is configured to support the mobile terminal to perform the steps S604, S607, and S608, and/or other processes for the techniques described herein.
It should be noted that for all relevant details of the steps in the above method embodiment, reference may be made to the functional description of the corresponding functional module; details are not repeated here. In one possible configuration, as shown in fig. 13, the mobile terminal may further include a detection module 1240 for enabling the mobile terminal to perform step S606 described above, and/or other processes for the techniques described herein.
The analysis module 1220 may be the processor 410 shown in fig. 4. It may implement or execute the various illustrative logical blocks, modules, and circuits described in connection with this disclosure. The processor may also be a combination of computing components, e.g., a combination of one or more microprocessors, a combination of a digital signal processor (DSP) and a microprocessor, or the like.
It should be noted that the mobile terminal may further include a radio frequency circuit. Specifically, the mobile terminal may receive and transmit wireless signals through the radio frequency circuit. Typically, the radio frequency circuitry includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency circuitry may also communicate with other devices via wireless communication. The wireless communication may use any communication standard or protocol including, but not limited to, global system for mobile communications, general packet radio service, code division multiple access, wideband code division multiple access, long term evolution, email, short message service, and the like.
In an alternative, when implemented using software, the data transfer may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present application are implemented in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or other programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example, from one website, computer, server, or data center to another website, computer, server, or data center via wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device, such as a server or data center, that includes one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, hard disk, or magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid state disk (SSD)), among others.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware or in software instructions executed by a processor. The software instructions may consist of corresponding software modules that may be stored in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be integral to the processor. The processor and the storage medium may reside in an ASIC. In addition, the ASIC may reside in a detection apparatus. Of course, the processor and the storage medium may also reside as discrete components in the detection apparatus.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions.
In the several embodiments provided in the present application, it should be understood that the disclosed user equipment and method may be implemented in other manners. For example, the above-described device embodiments are merely illustrative, and for example, the division of the modules or units is only one logical functional division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or may be integrated into another device, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may be one physical unit or a plurality of physical units, that is, may be located in one place, or may be distributed in a plurality of different places. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a readable storage medium. Based on such understanding, the technical solutions of the embodiments of the present application, in essence or in the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for enabling a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (24)

1. A method for controlling the display of a large-screen device is applied to a mobile terminal, the mobile terminal comprises a plurality of motion sensors, the motion sensors at least comprise an acceleration sensor, a gyroscope sensor and a magnetic sensor, a communication connection is established between the mobile terminal and the large-screen device, and the mobile terminal is used for controlling a laser pointer displayed on the large-screen device, and the method comprises the following steps:
the mobile terminal acquires motion data of the mobile terminal through the plurality of motion sensors;
the mobile terminal determines the space attitude of the mobile terminal by adopting a nine-axis fusion algorithm according to the motion data;
the mobile terminal determines a first position on a display screen of the large-screen equipment according to the space posture of the mobile terminal;
the mobile terminal controls the large-screen equipment to make a first response at the first position, wherein the first response comprises displaying a laser pointer at the first position;
the spatial pose is used to identify at least the following information: a yaw angle ψ, a pitch angle θ, and a roll angle φ of the mobile terminal relative to a ground coordinate system;
before the mobile terminal collects the motion data of the mobile terminal through the plurality of motion sensors, the method further comprises:
when the mobile terminal is in a first posture, receiving a first instruction of a user, wherein the first instruction is used for indicating the mobile terminal to carry out 2D position realignment with the large-screen equipment;
responding to the first instruction, the mobile terminal sends first information to the large-screen device, wherein the first information is used for indicating the large-screen device to display the laser pointer at a preset position;
the first posture is any posture of the mobile terminal;
the method further comprises the following steps: determining a distance L between the mobile terminal and a display screen of the large-screen device; adjusting the sensitivity of the mobile terminal according to the relation between the distance L and the sensitivity s of the mobile terminal and the width W of the display screen of the large-screen device; wherein the adjusted sensitivity is less than the sensitivity s when L < W/2 × cot (s/2); when L is more than W/2 × cot (s/2), the adjusted sensitivity is more than the sensitivity s; and when the angle of the mobile terminal is rotated by s degrees, the distance of the laser pointer moving on the display screen is the width W of the display screen.
2. The method of claim 1, wherein the mobile terminal determines a first location on a display screen of the large-screen device according to the spatial gesture, and comprises:
the mobile terminal determines coordinates (x ') of the first position on a display screen of the large-screen device by adopting a formula I and a formula II' i ,y′ i ):
Figure FDA0003606446730000011
Figure FDA0003606446730000012
And W is the width of the display screen of the large-screen equipment, and s is the sensitivity of the mobile terminal.
3. The method of claim 1, wherein the mobile terminal determines a first location on a display screen of the large-screen device according to the spatial gesture, and comprises:
the mobile terminal uses the quaternion q corresponding to the first posture 0 As a starting point, the quaternion q corresponding to the space attitude i As an end point, calculating the quaternion q when the mobile terminal is transformed from the first attitude to the spatial attitude i With respect to said quaternion q 0 A relative rotation matrix C;
the mobile terminal determines a second attitude according to the initial yaw angle, the initial pitch angle and the relative rotation matrix C; the mobile terminal determines the first position according to the second posture;
the initial yaw angle is a yaw angle of the mobile terminal relative to a ground coordinate system when the mobile terminal is in the first posture, and the initial pitch angle is a pitch angle of the mobile terminal relative to the ground coordinate system when the mobile terminal is in the first posture.
4. The method according to any one of claims 1-3, wherein the mobile terminal controls the large screen device to make a first response at the first position, comprising:
the mobile terminal sends second information to the large-screen equipment;
the second information is used for indicating the first position, the second information comprises control information, and the control information is used for indicating the large-screen device to make a first response at the first position.
5. The method according to any one of claims 1-3, wherein the mobile terminal controls the large screen device to make a first response at the first position, comprising:
the mobile terminal sends second information to the service equipment;
the second information is used for indicating the first position, the second information comprises control information, and the control information is used for indicating the service equipment to control the large-screen equipment to make the first response at the first position.
6. The method according to any of claims 1-5, wherein before the mobile terminal controls the large screen device to make the first response at the first position, the method further comprises:
the mobile terminal receives a first operation of a user;
the mobile terminal controls the large-screen device to make a first response at the first position, and the method comprises the following steps:
and responding to the first operation, and controlling the large-screen equipment to make a first response at the first position by the mobile terminal.
7. The method of claim 6, wherein the first response comprises at least any one of: switching the laser pointer into a laser painting brush, turning pages, opening a file, scribing, playing, stopping playing and adjusting the brightness.
8. The method according to claim 6 or 7, wherein the first operation is a user operation received when a display screen of the mobile terminal displays a human-computer interaction interface.
9. The method according to claim 8, wherein the first operation is a user operation on the human-computer interaction interface;
wherein the human-computer interaction interface at least comprises at least one virtual button selected from the following: a left-hand/right-hand man-machine interaction interface switching button, a pointer/brush pen switching button, a 2D position realignment button, a one-key play/one-key exit button, and a personalized setting button.
10. The method of claim 9, wherein the first operations further comprise: clicking/double-clicking/long-pressing operation of a user in the blank of the human-computer interaction interface, touch sliding of the user in the blank of the human-computer interaction interface, operation of pressing a volume key of the mobile terminal by the user or operation of pinching and holding a side frame of the mobile terminal by the user.
11. A mobile terminal, characterized in that the mobile terminal comprises: the sensor module comprises a plurality of motion sensors, the plurality of motion sensors at least comprise an acceleration sensor, a gyroscope sensor and a magnetic sensor, communication connection is established between the mobile terminal and the large-screen device, the mobile terminal is used for controlling a laser pointer displayed on the large-screen device, and the sensor module is used for acquiring motion data of the mobile terminal; the mobile terminal further includes:
the analysis module is used for determining the space attitude of the mobile terminal by adopting a nine-axis fusion algorithm according to the motion data; determining a first position on a display screen of the large-screen equipment according to the space posture of the mobile terminal;
a sending module, configured to send control information to the large-screen device, and to control the large-screen device to make a first response at the first position, where the first response includes displaying a laser pointer at the first position;
the spatial pose is used to identify at least the following information: a yaw angle ψ, a pitch angle θ, and a roll angle φ of the mobile terminal relative to a ground coordinate system;
the mobile terminal further includes:
the detection module is used for receiving a first instruction of a user when the mobile terminal is in a first posture before the sensor module collects motion data of the mobile terminal, wherein the first instruction is used for indicating the mobile terminal to be subjected to 2D position realignment with the large-screen device;
the sending module is further configured to send, in response to the first instruction, first information to the large-screen device, where the first information is used to instruct the large-screen device to display the laser pointer at a preset position;
the analysis module is further used for determining a distance L between the mobile terminal and a display screen of the large-screen device; adjusting the sensitivity of the mobile terminal according to the relation between the distance L and the sensitivity s of the mobile terminal and the width W of the display screen of the large-screen device; wherein the adjusted sensitivity is less than the sensitivity s when L < W/2 × cot (s/2); when L is more than W/2 × cot (s/2), the adjusted sensitivity is more than the sensitivity s; and when the angle of the mobile terminal is rotated by s degrees, the distance of the laser pointer moving on the display screen is the width W of the display screen.
12. The mobile terminal of claim 11, wherein the analysis module determines the spatial pose of the mobile terminal according to the motion data by using a nine-axis fusion algorithm; and determining a first position on a display screen of the large-screen device according to the spatial attitude of the mobile terminal, including:
the analysis module determines coordinates (x ') of the first position on a display screen of the large-screen device by adopting a formula I and a formula II' i ,y′ i ):
Figure FDA0003606446730000031
Figure FDA0003606446730000032
And W is the width of the display screen of the large-screen equipment, and s is the sensitivity of the mobile terminal.
13. The mobile terminal of claim 11, wherein the analysis module determines a first location on a display screen of the large screen device according to the spatial gesture, comprising:
the analysis module uses the quaternion q corresponding to the first posture 0 As a starting point, the quaternion q corresponding to the space attitude i As an end point, calculating the quaternion q when the mobile terminal is transformed from the first attitude to the spatial attitude i With respect to said quaternion q 0 A relative rotation matrix C;
the analysis module determines a second attitude according to the initial yaw angle, the initial pitch angle and the relative rotation matrix C; the mobile terminal determines the first position according to the second posture;
the initial yaw angle is a yaw angle of the mobile terminal relative to a ground coordinate system when the mobile terminal is in the first posture, and the initial pitch angle is a pitch angle of the mobile terminal relative to the ground coordinate system when the mobile terminal is in the first posture.
14. The mobile terminal according to any of claims 11-13, wherein the sending module sends control information to the large screen device, including:
the sending module sends the control information to the large-screen device through the service device, wherein the control information is used for indicating the large-screen device to receive the control of the service device and making the first response at the first position.
15. The mobile terminal of any of claims 11-14, wherein the detection module is further configured to, prior to the sending module sending control information to the large-screen device,
receiving a first operation of a user;
the sending module sends control information to the large-screen device, and the control information comprises the following steps:
in response to the first operation, the sending module sends control information to the large-screen device, so as to control the large-screen device to make a first response at the first position.
16. The mobile terminal of claim 15, wherein the first response comprises at least one of: switching the laser pointer into a laser painting brush, turning pages, opening a file, scribing, playing, stopping playing and adjusting the brightness.
17. The mobile terminal according to claim 15 or 16, wherein the first operation is a user operation received when a display screen of the mobile terminal displays a human-computer interaction interface.
18. The mobile terminal according to claim 17, wherein the first operation is a user operation on the human-computer interaction interface;
wherein the human-computer interaction interface at least comprises at least one virtual button selected from the following: a left-hand/right-hand man-machine interaction interface switching button, a pointer/brush pen switching button, a 2D position realignment button, a one-key play/one-key exit button, and a personalized setting button.
19. The mobile terminal of claim 18, wherein the first operations further comprise: clicking/double-clicking/long-pressing operation of a user in the blank of the human-computer interaction interface, touch sliding of the user in the blank of the human-computer interaction interface, operation of pressing a volume key of the mobile terminal by the user or operation of pinching and holding a side frame of the mobile terminal by the user.
20. A mobile terminal, characterized in that the mobile terminal comprises: the sensor module comprises a plurality of motion sensors, the plurality of motion sensors at least comprise an acceleration sensor, a gyroscope sensor and a magnetic sensor, communication connection is established between the mobile terminal and the large-screen device, the mobile terminal is used for controlling a laser pointer displayed on the large-screen device, and the sensor module is used for acquiring motion data of the mobile terminal; the mobile terminal further includes:
a memory for storing computer program code, the computer program code comprising instructions;
a radio frequency unit for transmitting and receiving radio signals;
a processor for executing the instructions stored in the memory to implement the method of controlling a large screen device display of any one of claims 1-10.
21. A first system, characterized in that the first system comprises: a mobile terminal and a large-screen device, the mobile terminal is used for controlling the large-screen device, and the method for controlling the display of the large-screen device according to any one of claims 1-10 is realized.
22. The system of claim 21, further comprising: service device for implementing the method of controlling a large screen device display according to any of claims 5-10.
23. A chip system, comprising a processor, a memory, wherein the memory has instructions stored therein; the instructions, when executed by the processor, implement a method of controlling a large screen device display as claimed in any one of claims 1 to 10.
24. A computer storage medium having computer executable instructions stored thereon, which when executed by processing circuitry implement a method of controlling a large screen device display as claimed in any one of claims 1 to 10.
CN202110387350.8A 2019-07-30 2019-07-30 Method for controlling display of large-screen equipment, mobile terminal and first system Active CN113220139B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110387350.8A CN113220139B (en) 2019-07-30 2019-07-30 Method for controlling display of large-screen equipment, mobile terminal and first system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201910693432.8A CN110633018B (en) 2019-07-30 2019-07-30 Method for controlling display of large-screen equipment, mobile terminal and first system
CN202110387350.8A CN113220139B (en) 2019-07-30 2019-07-30 Method for controlling display of large-screen equipment, mobile terminal and first system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201910693432.8A Division CN110633018B (en) 2019-07-30 2019-07-30 Method for controlling display of large-screen equipment, mobile terminal and first system

Publications (2)

Publication Number Publication Date
CN113220139A CN113220139A (en) 2021-08-06
CN113220139B true CN113220139B (en) 2022-08-02

Family

ID=68970283

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110387350.8A Active CN113220139B (en) 2019-07-30 2019-07-30 Method for controlling display of large-screen equipment, mobile terminal and first system
CN201910693432.8A Active CN110633018B (en) 2019-07-30 2019-07-30 Method for controlling display of large-screen equipment, mobile terminal and first system

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201910693432.8A Active CN110633018B (en) 2019-07-30 2019-07-30 Method for controlling display of large-screen equipment, mobile terminal and first system

Country Status (2)

Country Link
CN (2) CN113220139B (en)
WO (1) WO2021017836A1 (en)

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113220139B (en) * 2019-07-30 2022-08-02 荣耀终端有限公司 Method for controlling display of large-screen equipment, mobile terminal and first system
CN111262909A (en) * 2020-01-09 2020-06-09 中国建设银行股份有限公司 Large-screen information display method, device, equipment and storage medium
CN111356006B (en) * 2020-03-13 2023-03-17 北京奇艺世纪科技有限公司 Video playing method, device, server and storage medium
CN111897437A (en) * 2020-08-19 2020-11-06 腾讯科技(深圳)有限公司 Cross-terminal interaction method and device, electronic equipment and storage medium
CN112383664B (en) * 2020-10-15 2021-11-19 华为技术有限公司 Device control method, first terminal device, second terminal device and computer readable storage medium
CN113141669B (en) * 2021-04-15 2022-07-22 维沃移动通信有限公司 Data transmission method, sending terminal and electronic equipment
CN113671997A (en) * 2021-08-17 2021-11-19 深圳市火乐科技发展有限公司 Projection equipment control method, correction method, remote control device and projection equipment
CN114339341A (en) * 2021-12-15 2022-04-12 海信视像科技股份有限公司 Display apparatus and control method of display apparatus
CN115016629B (en) * 2021-11-19 2023-06-06 荣耀终端有限公司 Method and device for preventing false touch
CN116360583A (en) * 2021-12-28 2023-06-30 华为技术有限公司 Equipment control method and related device
CN114095690A (en) * 2022-01-24 2022-02-25 龙旗电子(惠州)有限公司 Demonstration control right conversion method, device, equipment, medium and program product
CN114938468A (en) * 2022-04-22 2022-08-23 海信视像科技股份有限公司 Display device, screen control method, and storage medium
CN116048314B (en) * 2022-08-25 2024-04-09 荣耀终端有限公司 Cursor control method, cursor control device and storage medium

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102289305A (en) * 2011-08-29 2011-12-21 江苏惠通集团有限责任公司 Attitude sensing equipment and positioning method thereof and method for controlling mouse pointer
CN102902375A (en) * 2012-09-24 2013-01-30 刘丹 Application method and application device for mouse working based on direction and posture
CN104965608A (en) * 2015-07-20 2015-10-07 杜昊浓 Space mouse system
CN208027318U (en) * 2018-04-01 2018-10-30 西北农林科技大学 A kind of lantern slide remote control pen crossed

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9055162B2 (en) * 2011-02-15 2015-06-09 Lg Electronics Inc. Method of transmitting and receiving data, display device and mobile terminal using the same
JP5440746B1 (en) * 2012-03-27 2014-03-12 コニカミノルタ株式会社 Display processing terminal device, optical sensor mounting unit, and photometric system
CN103167338A (en) * 2012-10-09 2013-06-19 深圳市金立通信设备有限公司 System and method for input control of smart television based on mobile terminal
CN106293404A (en) * 2015-05-22 2017-01-04 联发科技(新加坡)私人有限公司 Method for multi-screen sharing of a display picture and communication terminal
CN105628025B (en) * 2015-12-31 2018-06-29 中国人民解放军国防科学技术大学 Navigation method for a constant-rate offset-frequency/mechanically-dithered laser gyro inertial navigation system
CN107153457A (en) * 2016-03-04 2017-09-12 中兴通讯股份有限公司 Projection processing method and device
CN106706003A (en) * 2017-02-15 2017-05-24 重庆邮电大学 Online calibration method for north-seeking rotation based on a triaxial MEMS (Micro-Electromechanical System) gyroscope
CN113220139B (en) * 2019-07-30 2022-08-02 荣耀终端有限公司 Method for controlling display of large-screen equipment, mobile terminal and first system

Also Published As

Publication number Publication date
WO2021017836A1 (en) 2021-02-04
CN110633018B (en) 2021-04-09
CN113220139A (en) 2021-08-06
CN110633018A (en) 2019-12-31

Similar Documents

Publication Publication Date Title
CN113220139B (en) Method for controlling display of large-screen equipment, mobile terminal and first system
CN110502954B (en) Video analysis method and device
US20200327692A1 (en) Position and pose determining method, apparatus, smart device, and storage medium
CN111026314B (en) Method for controlling display device and portable device
WO2022134632A1 (en) Work processing method and apparatus
CN112581358B (en) Training method of image processing model, image processing method and device
CN112581571B (en) Control method and device for virtual image model, electronic equipment and storage medium
WO2022028241A1 (en) Preview cover generation method and electronic device
CN114579016A (en) Method for sharing input equipment, electronic equipment and system
KR20140136349A (en) Apparatus for saving conversation and method thereof
WO2021027890A1 (en) License plate image generation method and device, and computer storage medium
CN111027490A (en) Face attribute recognition method and device and storage medium
CN110991445B (en) Vertical text recognition method, device, equipment and medium
WO2022199102A1 (en) Image processing method and device
CN106412810B (en) Data transmission method and device
CN111385525B (en) Video monitoring method, device, terminal and system
CN110349527B (en) Virtual reality display method, device and system and storage medium
CN113391775A (en) Man-machine interaction method and equipment
CN110163192B (en) Character recognition method, device and readable medium
KR102166719B1 (en) Apparatus and method for information exchange
CN111982293B (en) Body temperature measuring method and device, electronic equipment and storage medium
CN114092655A (en) Map construction method, device, equipment and storage medium
KR101566113B1 (en) Watch-type mobile terminal and method for controlling the same
CN110443841B (en) Method, device and system for measuring ground depth
KR102410211B1 (en) Mobile device, display device and method for controlling each of the devices

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant